CN107786802A - UAV photographing method and device - Google Patents
UAV photographing method and device
- Publication number
- CN107786802A (application CN201610744128.8A)
- Authority
- CN
- China
- Prior art keywords
- point
- UAV
- photographed
- shooting
- position information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/66—Remote control of cameras or camera parts, e.g. by remote control devices
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D3/00—Control of position or direction
- G05D3/12—Control of position or direction using feedback
Abstract
The present application proposes a UAV photographing method and device. The method includes: controlling the UAV to fly to a shooting point according to received position information of the shooting point; according to received position information of a subject point, aligning the nose of the UAV toward the subject point and aligning the UAV's gimbal camera toward the subject point; and controlling the UAV to photograph the subject point. The application thereby achieves clear and accurate photographing of a target object by the UAV.
Description
Technical field
The present application relates to the technical field of unmanned aerial vehicles (UAVs), and in particular to a UAV photographing method and device.
Background technology
Unmanned robots, such as UAVs, offer many advantages: small size, low cost, high maneuverability, ease of use, and relatively low environmental requirements. Since their inception, unmanned robots have advanced steadily with the overall level of science and technology and have gradually found wide use in military, civilian, police, and other fields. Tasks they perform include target reconnaissance, tracking and surveillance, target strikes, damage assessment, disaster relief, personnel search and rescue, and terrain surveying.
Summary of the invention
Embodiments of the present application provide a UAV photographing method and device, so as to achieve clear and accurate photographing of a target object by the UAV.
The technical solution of the present application is realized as follows:
A UAV photographing method, the method including:
controlling the UAV to fly to a shooting point according to received position information of the shooting point;
according to received position information of a subject point, aligning the nose of the UAV toward the subject point, and aligning the UAV's gimbal camera toward the subject point;
controlling the UAV's gimbal camera to photograph the subject point.
Before the controlling of the UAV to fly to the shooting point according to the received position information of the shooting point, the method further includes:
receiving a fly-to-designated-place instruction sent by a user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the nose of the UAV is aligned toward the subject point according to the received position information of the subject point, the method further includes:
receiving a shooting alignment instruction sent by the user terminal, the instruction carrying the position information of the subject point.
After the UAV's gimbal camera is aligned toward the subject point and before the gimbal camera is controlled to photograph the subject point, the method further includes:
receiving a shooting instruction sent by the user terminal.
The aligning of the nose of the UAV toward the subject point according to the received position information of the subject point includes:
calculating, from the current position information of the UAV and the position information of the subject point, the direction angle of the line from the UAV to the subject point; obtaining the current direction angle of the UAV's nose; calculating, from the nose's current direction angle and the direction angle of the line from the UAV to the subject point, the included angle between the nose and that line; and rotating the UAV's nose by the included angle so that the nose points toward the subject point;
or, the controlling of the UAV to fly to the shooting point includes: controlling the UAV to fly to the shooting point in a straight line; and the aligning of the nose of the UAV toward the subject point includes: rotating the UAV's nose by 180 degrees, so that the nose points toward the subject point, i.e., toward the takeoff point.
Before the controlling of the UAV to fly to the shooting point according to the received position information of the shooting point, the method further includes:
receiving a shooting instruction sent by the user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the nose of the UAV is aligned toward the subject point according to the received position information of the subject point, the method further includes:
requesting the position information of the subject point from the user terminal.
A UAV photographing device, the device being located in the flight controller of the UAV or in a ground control terminal, the device including:
a first module: controlling the UAV to fly to a shooting point according to received position information of the shooting point;
a second module: according to received position information of a subject point, aligning the nose of the UAV toward the subject point, aligning the UAV's gimbal camera toward the subject point, and controlling the gimbal camera to photograph the subject point.
Before the first module controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a fly-to-designated-place instruction sent by a user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
receives a shooting alignment instruction sent by the user terminal, the instruction carrying the position information of the subject point.
After the second module aligns the UAV's gimbal camera toward the subject point and before it controls the gimbal camera to photograph the subject point, it further:
receives a shooting instruction sent by the user terminal.
The second module's aligning of the nose of the UAV toward the subject point according to the received position information of the subject point includes:
calculating, from the current position information of the UAV and the position information of the subject point, the direction angle of the line from the UAV to the subject point; obtaining the current direction angle of the UAV's nose; calculating, from the nose's current direction angle and the direction angle of the line from the UAV to the subject point, the included angle between the nose and that line; and rotating the UAV's nose by the included angle so that the nose points toward the subject point;
or, the first module's controlling of the UAV to fly to the shooting point includes: controlling the UAV to fly to the shooting point in a straight line; and the second module's aligning of the nose of the UAV toward the subject point includes: rotating the UAV's nose by 180 degrees, so that the nose points toward the subject point, i.e., toward the takeoff point.
Before the first module controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a shooting instruction sent by the user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
requests the position information of the subject point from the user terminal.
It can be seen that, by controlling the UAV to fly to a designated place and starting to shoot only after both the UAV's nose and gimbal camera are aligned with the subject point, the present application achieves clear and accurate photographing of the target object by the UAV.
Brief description of the drawings
Fig. 1 is a flowchart of the UAV photographing method provided by one embodiment of the application;
Fig. 2 is a flowchart of the UAV photographing method provided by another embodiment of the application;
Fig. 3 is a flowchart of the UAV photographing method provided by yet another embodiment of the application;
Fig. 4 is a schematic diagram of the composition of the UAV photographing device provided by an embodiment of the application.
Detailed description of the embodiments
The present invention is described in further detail below with reference to the accompanying drawings and specific embodiments.
Fig. 1 is a flowchart of the UAV photographing method provided by one embodiment of the application; its specific steps are as follows:
Step 101: Control the UAV to fly to a shooting point according to received position information of the shooting point.
Step 102: According to received position information of a subject point, align the nose of the UAV toward the subject point, and align the UAV's gimbal camera toward the subject point.
Step 103: Control the UAV's gimbal camera to photograph the subject point.
In specific applications, the embodiments of the present application can be used for the UAV to photograph arbitrary objects, and are particularly suitable for photographing a user on the ground.
Fig. 2 is a flowchart of the UAV photographing method provided by another embodiment of the application; its specific steps are as follows:
Step 201: When the user decides to have the UAV take a photo, the user inputs a "UAV manual shooting" instruction on the user terminal. The user terminal determines the position of the UAV's shooting point from its own current position and the preset relative position between the shooting point and the subject point, and sends a fly-to-designated-place instruction to the UAV, the instruction carrying the position information of the shooting point.
The shooting point of the UAV is the position of the UAV when it photographs the user; it can be represented by GPS coordinates: longitude, latitude, and altitude.
The subject point is the user's position; it can likewise be represented by GPS coordinates (longitude, latitude, altitude) and can be measured by the GPS module inside the user terminal.
The preset relative position between the shooting point and the subject point is, for example, the longitude, latitude, and altitude of the shooting point relative to the subject point.
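The derivation of the shooting point in step 201 can be sketched as follows. This is a minimal illustration, assuming the preset relative position is stored directly as (longitude, latitude, altitude) offsets; the function and variable names are not from the patent.

```python
# Sketch: deriving the shooting point from the user terminal's GPS fix
# (the subject point) and a preset relative offset, as in step 201.
# All names and the offset values below are illustrative assumptions.

def shooting_point(subject, offset):
    """subject and offset are (longitude, latitude, altitude) tuples;
    the shooting point is the subject point shifted by the preset offset."""
    lon, lat, alt = subject
    d_lon, d_lat, d_alt = offset
    return (lon + d_lon, lat + d_lat, alt + d_alt)

user_fix = (116.40, 39.90, 50.0)   # subject point from the terminal's GPS module
preset = (0.0001, 0.0, 10.0)       # e.g. slightly east of and 10 m above the user
print(shooting_point(user_fix, preset))
```

The terminal would then place this computed triple into the fly-to-designated-place instruction.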
Step 202: The UAV receives the fly-to-designated-place instruction and flies to the shooting point according to the position information of the shooting point carried in the instruction.
The fly-to-designated-place instruction may specifically be a hover-at-designated-place instruction, in which case the UAV hovers at the shooting point after flying there.
Step 203: The user terminal obtains its own current position information, takes this position information as the current position information of the subject point, and sends a shooting alignment instruction to the UAV, the instruction carrying the current position information of the subject point.
The user terminal can obtain its current position from its own GPS module; the position information includes longitude, latitude, and altitude.
Step 204: The UAV receives the shooting alignment instruction and obtains the current direction angle Q1 of its nose; meanwhile, it obtains its own current position information and, from its current position and the current position of the subject point, calculates the direction angle Q2 of the line from the UAV to the subject point.
The UAV can obtain the nose's current direction angle Q1 from its own angle sensor.
The UAV can obtain its own current position from its own GPS module. This position is three-dimensional GPS information (longitude, latitude, altitude), as is the position of the subject point; from the two sets of three-dimensional information, the direction angle Q2 of the line from the UAV to the subject point can be calculated.
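The Q2 computation of step 204 can be sketched as follows. The patent does not specify a formula, so the standard great-circle initial-bearing calculation between two GPS fixes is assumed here.

```python
# Sketch of step 204's Q2: the direction angle (bearing) of the line
# from the UAV to the subject point, computed from two GPS fixes using
# the standard great-circle initial-bearing formula (an assumption;
# the patent only says Q2 "can be calculated" from the two positions).
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial bearing from point 1 (UAV) to point 2 (subject point),
    in degrees clockwise from true north, in [0, 360)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return math.degrees(math.atan2(y, x)) % 360.0

# A subject point due east of the UAV gives a bearing of about 90 degrees.
print(round(bearing_deg(39.90, 116.40, 39.90, 116.41), 1))
```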
Step 205: From Q1 and Q2, the UAV calculates the included angle between the nose and the line from the UAV to the subject point, and rotates the nose by that angle so that the nose points toward the subject point.
Step 206: The UAV obtains the angle between the gimbal camera and the nose and controls the gimbal to rotate so that the gimbal camera points along the nose direction, i.e., toward the subject point.
The UAV can obtain the angle between the gimbal camera and the nose from its own gimbal gyroscope.
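The rotation in step 205 amounts to a signed angle difference between Q1 and Q2. A minimal sketch, assuming a shortest-turn convention (the patent only says to rotate the nose by the included angle):

```python
# Sketch of step 205: the signed rotation that takes the nose's current
# direction angle Q1 onto the UAV-to-subject direction angle Q2.
# Wrapping the difference into (-180, 180] keeps the turn as short as
# possible; this convention is an assumed refinement, not in the patent.

def yaw_correction(q1_deg, q2_deg):
    """Signed angle in degrees to rotate the nose from Q1 to Q2."""
    return (q2_deg - q1_deg + 180.0) % 360.0 - 180.0

print(yaw_correction(350.0, 10.0))   # shortest turn crosses north: +20.0
```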
In practical applications, when the altitude of the UAV above the subject point is less than a preset altitude threshold, it suffices to adjust only the roll angle of the gimbal so that the gimbal camera points along the nose direction; since the UAV is close to the subject point, the gimbal camera can then be considered aimed at the subject point.
When the altitude of the UAV above the subject point is greater than or equal to the preset altitude threshold, the roll angle of the gimbal is adjusted first so that the gimbal camera points along the nose direction, and the pitch angle of the gimbal is then adjusted so that the gimbal camera points toward the subject point. Of course, in practical applications, the altitude of the UAV above the subject point may also be ignored, and the gimbal roll and pitch angles always adjusted so that the gimbal camera is aimed at the subject point.
Here, the pitch angle of the gimbal = arccos(altitude of the UAV above the subject point / distance between the UAV and the subject point), where arccos is the inverse cosine operator. When the subject point is on the ground, the altitude of the UAV above the subject point is simply the altitude of the UAV.
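The gimbal pitch relation above, pitch = arccos(height above the subject point / UAV-to-subject distance), can be sketched directly. Units (meters, degrees) and the input validation are assumptions; the patent only states the arccos relation.

```python
# Sketch of the patent's gimbal pitch formula:
# pitch = arccos(height above the subject point / slant distance).
import math

def gimbal_pitch_deg(height_m, distance_m):
    """Gimbal pitch angle in degrees per the patent's arccos relation."""
    if not 0.0 <= height_m <= distance_m:
        raise ValueError("height must be between 0 and the slant distance")
    return math.degrees(math.acos(height_m / distance_m))

# UAV 30 m above a ground-level subject at 60 m slant distance: arccos(0.5) = 60 degrees.
print(round(gimbal_pitch_deg(30.0, 60.0), 1))
```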
Step 207: When the user confirms that shooting is needed, the user inputs a shooting instruction into the user terminal; the user terminal sends the shooting instruction to the UAV, and the UAV controls the gimbal camera to start shooting.
When the user sees, in real time, the picture forwarded from the gimbal camera to the user terminal and confirms that shooting can begin, the user inputs the shooting instruction into the user terminal.
In practical applications, steps 202 to 205 may be replaced by the following steps:
Step 202: The UAV receives the fly-to-designated-place instruction and flies to the shooting point in a straight line according to the position information of the shooting point carried in the instruction.
Steps 203 to 205: After the UAV reaches the shooting point, it rotates its nose by 180 degrees, so that the nose points toward the subject point (i.e., the user's position).
Because in step 202 the UAV flies from the user to the shooting point in a straight line, the UAV can aim its nose at the user simply by rotating 180 degrees after reaching the shooting point.
Fig. 3 is a flowchart of the UAV photographing method provided by yet another embodiment of the application; its specific steps are as follows:
Step 301: When the user decides to have the UAV take a photo, the user inputs a "UAV automatic shooting" instruction on the user terminal. The user terminal determines the position of the UAV's shooting point from its own current position and the preset relative position between the shooting point and the subject point, and sends a shooting instruction to the UAV, the instruction carrying the position information of the shooting point.
Step 302: The UAV receives the shooting instruction, flies to the shooting point according to the position information of the shooting point carried in the instruction, and sends a subject-point position request to the user terminal.
After the UAV flies to the shooting point, it hovers at the shooting point.
Step 303: The user terminal receives the subject-point position request, obtains its own current position information, takes this position information as the current position information of the subject point, and returns a subject-point position response to the UAV, the response carrying the current position information of the subject point.
Step 304: The UAV receives the subject-point position response, reads the current position information of the subject point from the response, obtains the current direction angle Q1 of its nose, and, from its own current position information and the current position of the subject point, calculates the direction angle Q2 of the line from the UAV to the subject point.
Step 305: From Q1 and Q2, the UAV calculates the included angle between the nose and the line from the UAV to the subject point, and rotates the nose by that angle so that the nose points toward the subject point.
Step 306: The UAV obtains the angle between the gimbal camera and the nose and controls the gimbal to rotate so that the gimbal camera points along the nose direction, i.e., toward the subject point.
Step 307: The UAV controls the gimbal camera to start shooting.
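The automatic-shooting flow of steps 301 to 307 can be sketched end to end as follows. This is a single-process illustration under assumed names; a real deployment would exchange these messages over the UAV's control link, and the bearing and yaw formulas are standard choices the patent does not prescribe.

```python
# Minimal end-to-end sketch of the automatic-shooting flow (steps 301-307).
# The in-process "terminal", field names, and formulas are assumptions.
import math

def run_auto_shoot(terminal_fix, preset_offset, nose_q1):
    # Step 301: terminal derives the shooting point from its own fix + offset.
    lon, lat, alt = terminal_fix
    d_lon, d_lat, d_alt = preset_offset
    uav_pos = (lon + d_lon, lat + d_lat, alt + d_alt)  # step 302: UAV flies here

    # Step 303: terminal answers the position request with its own fix.
    subject = terminal_fix

    # Step 304: bearing Q2 from the UAV to the subject point.
    phi1, phi2 = math.radians(uav_pos[1]), math.radians(subject[1])
    dlon = math.radians(subject[0] - uav_pos[0])
    y = math.sin(dlon) * math.cos(phi2)
    x = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    q2 = math.degrees(math.atan2(y, x)) % 360.0

    # Step 305: rotate the nose by the signed Q1-to-Q2 difference.
    yaw = (q2 - nose_q1 + 180.0) % 360.0 - 180.0

    # Steps 306-307: gimbal camera aligned with the nose, then shoot.
    return {"uav_pos": uav_pos, "q2": q2, "yaw": yaw, "shot": True}

result = run_auto_shoot((116.40, 39.90, 50.0), (0.001, 0.0, 10.0), nose_q1=0.0)
```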
Fig. 4 is a schematic diagram of the composition of the UAV photographing device provided by an embodiment of the application. The device mainly includes a first module 41 and a second module 42, wherein:
First module 41: controls the UAV to fly to a shooting point according to received position information of the shooting point, and sends an arrived-at-shooting-point notification to the second module 42.
Second module 42: upon receiving the arrived-at-shooting-point notification sent by the first module 41, aligns the nose of the UAV toward the subject point according to received position information of the subject point, aligns the UAV's gimbal camera toward the subject point, and controls the gimbal camera to photograph the subject point.
In one embodiment, before the first module 41 controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a fly-to-designated-place instruction sent by a user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module 42 aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
receives a shooting alignment instruction sent by the user terminal, the instruction carrying the position information of the subject point.
In one embodiment, after the second module 42 aligns the UAV's gimbal camera toward the subject point and before it controls the gimbal camera to photograph the subject point, it further:
receives a shooting instruction sent by the user terminal.
In one embodiment, the second module 42's aligning of the nose of the UAV toward the subject point according to the received position information of the subject point includes:
calculating, from the current position information of the UAV and the position information of the subject point, the direction angle of the line from the UAV to the subject point;
obtaining the current direction angle of the UAV's nose;
calculating, from the nose's current direction angle and the direction angle of the line from the UAV to the subject point, the included angle between the nose and that line, and rotating the UAV's nose by the included angle so that the nose points toward the subject point.
In one embodiment, the first module 41's controlling of the UAV to fly to the shooting point includes: controlling the UAV to fly to the shooting point in a straight line; and the second module 42's aligning of the nose of the UAV toward the subject point includes: rotating the UAV's nose by 180 degrees, so that the nose points toward the subject point, i.e., toward the takeoff point.
In one embodiment, before the first module 41 controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a shooting instruction sent by the user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module 42 aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
requests the position information of the subject point from the user terminal.
The above device may be located in the flight controller of the UAV or in a ground control terminal.
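The division of labor between the two modules of Fig. 4 can be sketched structurally as follows. The class and method names, and the stub UAV interface, are illustrative assumptions rather than the patent's API; the patent only specifies which responsibilities each module carries.

```python
# Structural sketch of the two-module device of Fig. 4 (names assumed).

class StubUAV:
    """Records the control calls so the module sequence can be inspected."""
    def __init__(self):
        self.log = []
    def fly_to(self, p):
        self.log.append(("fly_to", p))
    def align_nose(self, p):
        self.log.append(("align_nose", p))
    def align_gimbal_camera(self, p):
        self.log.append(("align_camera", p))
    def shoot(self):
        self.log.append(("shoot", None))
        return "photo"

class FirstModule:
    def __init__(self, uav):
        self.uav = uav
    def handle_shooting_point(self, shooting_point):
        """Fly the UAV to the shooting point, then notify the second module."""
        self.uav.fly_to(shooting_point)
        return "arrived"

class SecondModule:
    def __init__(self, uav):
        self.uav = uav
    def handle_subject_point(self, subject_point):
        """Align nose and gimbal camera with the subject point, then shoot."""
        self.uav.align_nose(subject_point)
        self.uav.align_gimbal_camera(subject_point)
        return self.uav.shoot()

uav = StubUAV()
FirstModule(uav).handle_shooting_point((116.40, 39.90, 60.0))
SecondModule(uav).handle_subject_point((116.40, 39.90, 50.0))
print([event for event, _ in uav.log])
```

Either class could equally be hosted in the flight controller or in the ground control terminal, as the device description allows.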
The foregoing are only preferred embodiments of the present application and are not intended to limit the application; any modification, equivalent substitution, improvement, and the like made within the spirit and principles of the application shall be included within the scope of protection of the application.
Claims (10)
1. A UAV photographing method, characterized in that the method includes:
controlling the UAV to fly to a shooting point according to received position information of the shooting point;
according to received position information of a subject point, aligning the nose of the UAV toward the subject point, and aligning the UAV's gimbal camera toward the subject point;
controlling the UAV's gimbal camera to photograph the subject point.
2. The method according to claim 1, characterized in that
before the controlling of the UAV to fly to the shooting point according to the received position information of the shooting point, the method further includes:
receiving a fly-to-designated-place instruction sent by a user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the nose of the UAV is aligned toward the subject point according to the received position information of the subject point, the method further includes:
receiving a shooting alignment instruction sent by the user terminal, the instruction carrying the position information of the subject point.
3. The method according to claim 2, characterized in that after the UAV's gimbal camera is aligned toward the subject point and before the gimbal camera is controlled to photograph the subject point, the method further includes:
receiving a shooting instruction sent by the user terminal.
4. The method according to claim 1, characterized in that the aligning of the nose of the UAV toward the subject point according to the received position information of the subject point includes:
calculating, from the current position information of the UAV and the position information of the subject point, the direction angle of the line from the UAV to the subject point; obtaining the current direction angle of the UAV's nose; calculating, from the nose's current direction angle and the direction angle of the line from the UAV to the subject point, the included angle between the nose and that line; and rotating the UAV's nose by the included angle so that the nose points toward the subject point;
or, the controlling of the UAV to fly to the shooting point includes: controlling the UAV to fly to the shooting point in a straight line; and the aligning of the nose of the UAV toward the subject point includes: rotating the UAV's nose by 180 degrees, so that the nose points toward the subject point, i.e., toward the takeoff point.
5. The method according to claim 1, characterized in that before the controlling of the UAV to fly to the shooting point according to the received position information of the shooting point, the method further includes:
receiving a shooting instruction sent by the user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the nose of the UAV is aligned toward the subject point according to the received position information of the subject point, the method further includes:
requesting the position information of the subject point from the user terminal.
6. A UAV photographing device, characterized in that the device is located in the flight controller of the UAV or in a ground control terminal, the device including:
a first module: controlling the UAV to fly to a shooting point according to received position information of the shooting point;
a second module: according to received position information of a subject point, aligning the nose of the UAV toward the subject point, aligning the UAV's gimbal camera toward the subject point, and controlling the gimbal camera to photograph the subject point.
7. The device according to claim 6, characterized in that
before the first module controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a fly-to-designated-place instruction sent by a user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
receives a shooting alignment instruction sent by the user terminal, the instruction carrying the position information of the subject point.
8. The device according to claim 7, characterized in that after the second module aligns the UAV's gimbal camera toward the subject point and before it controls the gimbal camera to photograph the subject point, it further:
receives a shooting instruction sent by the user terminal.
9. The device according to claim 6, characterized in that the second module's aligning of the nose of the UAV toward the subject point according to the received position information of the subject point includes: calculating, from the current position information of the UAV and the position information of the subject point, the direction angle of the line from the UAV to the subject point; obtaining the current direction angle of the UAV's nose; calculating, from the nose's current direction angle and the direction angle of the line from the UAV to the subject point, the included angle between the nose and that line; and rotating the UAV's nose by the included angle so that the nose points toward the subject point;
or, the first module's controlling of the UAV to fly to the shooting point includes: controlling the UAV to fly to the shooting point in a straight line; and the second module's aligning of the nose of the UAV toward the subject point includes: rotating the UAV's nose by 180 degrees, so that the nose points toward the subject point, i.e., toward the takeoff point.
10. The device according to claim 6, characterized in that before the first module controls the UAV to fly to the shooting point according to the received position information of the shooting point, it further:
receives a shooting instruction sent by the user terminal, the instruction carrying the position information of the shooting point;
and after the UAV is controlled to fly to the shooting point, before the second module aligns the nose of the UAV toward the subject point according to the received position information of the subject point, it further:
requests the position information of the subject point from the user terminal.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610744128.8A CN107786802A (en) | 2016-08-26 | 2016-08-26 | Unmanned plane image pickup method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610744128.8A CN107786802A (en) | 2016-08-26 | 2016-08-26 | Unmanned plane image pickup method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN107786802A true CN107786802A (en) | 2018-03-09 |
Family
ID=61440981
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610744128.8A Pending CN107786802A (en) | 2016-08-26 | 2016-08-26 | Unmanned plane image pickup method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN107786802A (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109660721A (en) * | 2018-12-14 | 2019-04-19 | 上海扩博智能技术有限公司 | Unmanned plane during flying shooting quality optimization method, system, equipment and storage medium |
CN111316187A (en) * | 2019-04-29 | 2020-06-19 | 深圳市大疆创新科技有限公司 | Cloud deck control method, cloud deck and shooting device |
CN113625503A (en) * | 2018-07-18 | 2021-11-09 | 深圳市大疆创新科技有限公司 | Image shooting method and unmanned aerial vehicle |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103188431A (en) * | 2011-12-27 | 2013-07-03 | 鸿富锦精密工业(深圳)有限公司 | System and method for controlling unmanned aerial vehicle to conduct image acquisition |
CN104670512A (en) * | 2015-02-13 | 2015-06-03 | 徐鹏 | Multi-lens aerial shoot cloud deck |
CN104917966A (en) * | 2015-05-28 | 2015-09-16 | 小米科技有限责任公司 | Flight shooting method and device |
CN105554381A (en) * | 2015-12-11 | 2016-05-04 | 上海斐讯数据通信技术有限公司 | Picture taking control method, system, electronic equipment and aircraft |
History

- 2016-08-26: Application CN201610744128.8A filed in China (CN); published as CN107786802A; status: Pending
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11733692B2 (en) | Systems and methods for controlling an unmanned aerial vehicle | |
CN105487552B (en) | The method and device of unmanned plane track up | |
US11454964B2 (en) | Systems and methods for adjusting flight control of an unmanned aerial vehicle | |
JP6390013B2 (en) | Control method for small unmanned aerial vehicles | |
JP5775632B2 (en) | Aircraft flight control system | |
WO2018098704A1 (en) | Control method, apparatus, and system, unmanned aerial vehicle, and mobile platform | |
CN107918397A (en) | The autonomous camera system of unmanned plane mobile image kept with target following and shooting angle | |
CN110333735B (en) | System and method for realizing unmanned aerial vehicle water and land secondary positioning | |
CN108021145A (en) | The autonomous camera system of unmanned plane mobile image kept with target following and shooting angle | |
WO2018076895A1 (en) | Method, device, and system for controlling flying of slave unmanned aerial vehicle based on master unmanned aerial vehicle | |
CN106094876A (en) | A kind of unmanned plane target locking system and method thereof | |
CN105739544B (en) | Course following method and device of holder | |
CN107786802A (en) | Unmanned plane image pickup method and device | |
WO2020233682A1 (en) | Autonomous circling photographing method and apparatus and unmanned aerial vehicle | |
JP2019032234A (en) | Display device | |
JP2020170213A (en) | Drone-work support system and drone-work support method | |
CN204287973U (en) | flight camera | |
CN104049267A (en) | Forest fire point positioning method based on GPS and microwave distance measurement | |
CN108227749A (en) | Unmanned plane and its tracing system | |
CN107783551A (en) | The method and device that control unmanned plane follows | |
CN107357316A (en) | A kind of method for controlling UAV locking specific region to obtain photo | |
CN107783542A (en) | The control method and control system of unmanned plane | |
CN109900238A (en) | Measurement method, device and the computer readable storage medium at antenna for base station angle | |
WO2022094808A1 (en) | Photographing control method and apparatus, unmanned aerial vehicle, device, and readable storage medium | |
CN107783550A (en) | The method and device that control unmanned plane makes a return voyage |
Legal Events

Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20180309 |