CN109121434A - UAV interactive camera system and method - Google Patents

UAV interactive camera system and method

Info

Publication number: CN109121434A
Application number: CN201780000407.6A
Authority: CN (China)
Prior art keywords: UAV, user, camera assembly, camera, shooting
Legal status: Granted
Other languages: Chinese (zh)
Other versions: CN109121434B (en)
Inventor
张景嵩
张凌
戴志宏
Current Assignee: IAC Nanjing Technology Co., Ltd.; IAC Nanchang Technology Co., Ltd.; Inventec Appliances Shanghai Corp.; Inventec Appliances Pudong Corp.; Inventec Appliances Corp.
Original Assignee: IAC Nanjing Technology Co., Ltd.; IAC Nanchang Technology Co., Ltd.
Application filed by IAC Nanjing Technology Co., Ltd. and IAC Nanchang Technology Co., Ltd.
Publication of CN109121434A (application); application granted; publication of CN109121434B (grant)
Legal status: Active

Classifications

    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 3/00: Control of position or direction
    • G05D 3/12: Control of position or direction using feedback
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The present invention provides a UAV interactive camera system and method. The system comprises a UAV, a camera assembly and a control assembly, one end of the camera assembly being rotatably connected to a side of the UAV. The control assembly includes a control instruction library, an image processing module, an instruction determination module and an instruction execution module; the instruction execution module controls the UAV and/or the camera assembly according to the control instruction found by the lookup. The invention provides a technical solution for interactive shooting based on a UAV: the camera assembly automatically captures images, the control assembly analyzes them to obtain a pending user action feature, and the control instruction intended by the user is interpreted from that feature. The user can therefore control the flight of the UAV and the shooting of the camera assembly directly through body movements, so that the desired shot can be obtained easily in any situation and the user experience is improved.

Description

UAV interactive camera system and method

Technical field
The present invention relates to the technical field of UAV control, and more particularly to a UAV interactive camera system and method in which a user controls the UAV directly through body movements.
Background art
A UAV (unmanned aerial vehicle) is an aircraft without a human pilot on board, operated by radio remote control or by an on-board program. Owing to the rapid development of UAV technology in recent years, UAVs have been widely used in many fields.
Existing UAV photography falls broadly into two categories, commercial aerial photography and personal entertainment self-shooting, both of which are currently operated through a remote controller or an application on a handheld mobile device. When using a UAV for personal self-shooting, however, the user has to handle the UAV and the remote controller at the same time, which is inconvenient. For example, when shooting a group photo the user often needs to watch the application screen on the handheld mobile device and therefore cannot present a clear face to the camera; or, when shooting a jump shot, the user cannot make a satisfactory movement because the remote controller is in hand, which degrades the shooting result.
Furthermore, because of constraints such as the UAV's own weight and size, a miniaturized self-shooting UAV usually has a small battery capacity and a short endurance, which spoils the shooting experience and fails to meet the needs of today's users.
Summary of the invention
In view of the problems in the prior art, the object of the present invention is to provide a UAV interactive camera system and method with which the user can control the flight of the UAV and the shooting of the camera assembly directly through body movements, thereby realizing the shooting function and improving the shooting result.
An embodiment of the present invention provides a UAV interactive camera system. The system comprises a UAV, a camera assembly and a control assembly, one end of the camera assembly being rotatably connected to a side of the UAV. The control assembly includes:
a control instruction library for storing preset mapping relations between various user action features and various control instructions, the control instructions including UAV control instructions and/or camera assembly control instructions;
an image processing module for processing the images captured by the camera assembly to obtain a pending user action feature in the captured images;
an instruction determination module for looking up, in the control instruction library, the control instruction corresponding to the pending user action feature; and
an instruction execution module for controlling the UAV and/or the camera assembly according to the control instruction found by the lookup.
Optionally, the camera assembly includes a camera device and a camera bracket, the camera device being mounted in the camera bracket and one end of the camera bracket being rotatably connected to a side of the UAV;
the system further includes a display device, detachably or fixedly attached to the other end of the camera bracket.
Optionally, the display device includes an array display screen and a first display control unit; the first display control unit obtains the images captured by the camera device and shows them on the array display screen.
Optionally, the display device includes a dot-matrix display and a second display control unit; the second display control unit obtains the control instruction found by the instruction determination module and controls the dot-matrix display to show user prompt information associated with that control instruction.
Optionally, one end of the camera bracket is formed as a protrusion, and a side of the UAV is provided with a groove matching the shape of the protrusion; the protrusion of the camera bracket is embedded in the groove of the UAV;
the lower surface of the UAV is a flat surface that includes a camera bracket contact area, the two side walls of the groove are perpendicular to the lower surface of the UAV, and the protrusion of the camera bracket can rotate in the groove, so that the camera bracket can rotate within the angular range between being perpendicular to the lower surface of the UAV and lying flat against the camera bracket contact area.
Optionally, the lower surface of the UAV further includes an energy storage device contact area, which does not intersect the camera bracket contact area;
the system further includes an energy storage device, detachably or fixedly attached to the lower surface of the UAV so as to fit against the energy storage device contact area.
Optionally, the camera bracket includes a first support arm, a second support arm and a third support arm; one side of the first support arm is connected to the protrusion and the other side of the first support arm is provided with a first slot; one end of the second support arm and one end of the third support arm are connected to the two ends of the first support arm respectively, the second support arm and the third support arm each being perpendicular to the first support arm; the other end of the second support arm is provided with a second slot, and the other end of the third support arm is provided with a third slot;
one side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
Optionally, the system further includes a voice acquisition device for obtaining the user's voice data;
the control instruction library is further used for storing preset mapping relations between various voice keywords and various control instructions;
the control assembly further includes a speech processing module for extracting the voice keywords contained in the user's voice data;
the instruction determination module is further used for looking up, in the control instruction library, the control instruction corresponding to the extracted voice keyword.
Optionally, the speech processing module is further used for obtaining the user's voiceprint feature from the user's voice data and judging whether it matches a pre-stored authorized voiceprint feature;
if the user's voiceprint feature matches a preset authorized voiceprint feature, the instruction determination module extracts the voice keywords contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted voice keyword;
if the user's voiceprint feature does not match a preset authorized voiceprint feature, the instruction determination module ignores the voice data and does not extract voice keywords.
Optionally, the image processing module is further used for obtaining the user's physiological features from the images captured by the camera assembly and judging whether they match pre-stored designated physiological features;
if the user's physiological features match the pre-stored designated physiological features, the instruction determination module looks up the corresponding control instruction in the control instruction library according to the pending user action feature;
if the user's physiological features do not match the pre-stored designated physiological features, the instruction determination module ignores the pending user action feature and does not look up a control instruction.
Optionally, the UAV control instructions include at least one of a UAV translation instruction, a UAV rotation instruction, a UAV power-on instruction and a UAV shutdown instruction; the camera assembly control instructions include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction and a shooting stop instruction.
Optionally, the control instructions further include:
a first mode selection instruction, indicating that the control assembly enters a first mode in which the instruction determination module looks up, in the control instruction library, the UAV control instruction corresponding to the user action feature and controls the UAV according to the UAV control instruction found;
a second mode selection instruction, indicating that the control assembly enters a second mode in which the instruction determination module looks up, in the control instruction library, the camera assembly control instruction corresponding to the user action feature and controls the camera assembly according to the camera assembly control instruction found.
Optionally, the control instructions further include:
a panorama mode selection instruction, indicating that the control assembly enters a panorama mode in which the instruction execution module controls the UAV to move continuously at a preset speed within the angular range (0, α), where α is the preset maximum panorama angle.
Optionally, in the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the camera assembly detects the position of the user;
taking the position of the user as the starting point, the UAV rotates by α/n to one side within the same horizontal plane, where n is a first preset partition value and n > 1;
the camera assembly starts shooting, and the UAV rotates by α toward the other side at the preset speed, at a uniform rate, within the same horizontal plane;
after the UAV stops rotating, the camera assembly stops shooting.
Optionally, in the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and generates a first sector with the anchor point as the center, L/m as the radius and α as the central angle, such that the scene to be captured lies on the arc of the first sector, where m is a second preset partition value and m > 1;
the instruction execution module generates a second sector opposite to the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, with radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the UAV moves along the arc of the second sector from one end of the arc to the other;
after the UAV reaches the other end of the arc of the second sector, the camera assembly stops shooting.
Optionally, in the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and generates a first isosceles triangle with the anchor point as the apex, L/m as the length of each leg and α as the apex angle, such that the scene to be captured lies on the base of the first isosceles triangle, where m is a second preset partition value and m > 1;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, with leg length (m-1)L/m and apex angle α;
the camera assembly starts shooting, and the UAV moves along the base of the second isosceles triangle from one end of the base to the other;
after the UAV reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
Optionally, the control instructions further include:
a third mode selection instruction, indicating that the control assembly enters a third mode in which the instruction execution module controls the camera assembly to shoot after a preset waiting time.
Optionally, the control instructions further include:
a fourth mode selection instruction, indicating that the control assembly enters a fourth mode in which the instruction execution module detects the position of the user through the camera assembly and automatically controls the UAV and the camera assembly to move according to the user's position, so that the camera assembly keeps shooting the user.
Optionally, in the fourth mode the instruction execution module obtains the acceleration of the change of the user's position and, when this acceleration exceeds a preset acceleration threshold, issues an alarm signal to the outside.
Optionally, at least one distance sensor is further provided on the UAV, and the control assembly further includes an obstacle computing module for obtaining the obstacle detection data of the distance sensor;
when the pending control instruction includes a UAV movement instruction and the obstacle computing module judges that the distance between the UAV and an obstacle in the direction of movement of the UAV movement instruction is less than a preset safety threshold, the UAV movement instruction is cancelled and a proximity warning signal is issued to the outside.
The present invention also provides a UAV interactive shooting method using the above UAV interactive camera system, the method comprising the following steps:
the camera assembly captures images;
the image processing module processes the images captured by the camera assembly to obtain a pending user action feature in the captured images;
the instruction determination module looks up, in the control instruction library, the control instruction corresponding to the pending user action feature; and
the instruction execution module controls the UAV and/or the camera assembly according to the control instruction found by the lookup.
Optionally, the control instructions further include a panorama mode selection instruction, indicating that the control assembly enters a panorama mode in which the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph with the following steps:
the camera assembly detects the position of the user;
taking the position of the user as the starting point, the UAV rotates by α/n to one side within the same horizontal plane, where n is a first preset partition value, n > 1 and α is the preset maximum panorama angle;
the camera assembly starts shooting, and the UAV rotates by α toward the other side at the preset speed, at a uniform rate, within the same horizontal plane;
after the UAV stops rotating, the camera assembly stops shooting.
Optionally, the control instructions further include a panorama mode selection instruction, indicating that the control assembly enters a panorama mode in which the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and generates a first sector with the anchor point as the center, L/m as the radius and α as the central angle, such that the scene to be captured lies on the arc of the first sector, where m is a second preset partition value, m > 1 and α is the preset maximum panorama angle;
the instruction execution module generates a second sector opposite to the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, with radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the UAV moves along the arc of the second sector from one end of the arc to the other;
after the UAV reaches the other end of the arc of the second sector, the camera assembly stops shooting.
Optionally, the control instructions further include a panorama mode selection instruction, indicating that the control assembly enters a panorama mode in which the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and generates a first isosceles triangle with the anchor point as the apex, L/m as the length of each leg and α as the apex angle, such that the scene to be captured lies on the base of the first isosceles triangle, where m is a second preset partition value, m > 1 and α is the preset maximum panorama angle;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, with leg length (m-1)L/m and apex angle α;
the camera assembly starts shooting, and the UAV moves along the base of the second isosceles triangle from one end of the base to the other;
after the UAV reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
The UAV interactive camera system and method provided by the present invention have the following advantages:
The present invention provides a technical solution in which the user exercises control directly through body movements. The camera assembly automatically captures images, the control assembly analyzes them to obtain a pending user action feature, and the control instruction intended by the user is interpreted from that feature. The user can therefore control the flight of the UAV and the shooting of the camera assembly directly through body movements, so that the desired shot can be obtained easily in any situation and the user experience is improved.
Brief description of the drawings
Other features, objects and advantages of the invention will become more apparent upon reading the following detailed description of non-limiting embodiments with reference to the drawings.
Fig. 1 is a structural block diagram of the UAV interactive camera system of one embodiment of the invention;
Fig. 2 is a structural schematic diagram of the UAV interactive camera system of one embodiment of the invention using an array display screen;
Fig. 3 is a structural schematic diagram of the UAV interactive camera system of one embodiment of the invention using a dot-matrix display;
Fig. 4 is a schematic diagram of adjusting the UAV position in one embodiment of the invention;
Fig. 5 is a schematic diagram of adjusting the camera assembly angle in one embodiment of the invention;
Figs. 6-7 are schematic diagrams of gesture control in one embodiment of the invention;
Fig. 8 is a structural schematic diagram of one embodiment of the invention using an external display device;
Fig. 9 is a structural schematic diagram of one embodiment of the invention with the display device folded away;
Fig. 10 is a schematic bottom view of the UAV of one embodiment of the invention when not in use;
Fig. 11 is a structural schematic diagram of the energy storage device of one embodiment of the invention;
Fig. 12 is a status diagram of the UAV of one embodiment of the invention while charging;
Fig. 13 is a flow chart of the UAV charging process of one embodiment of the invention;
Fig. 14 is a schematic diagram of controlling the UAV position by voice in one embodiment of the invention;
Fig. 15 is a structural schematic diagram of the UAV interactive camera system of one embodiment of the invention with voice control added;
Fig. 16 is a flow chart of user voiceprint verification in one embodiment of the invention;
Fig. 17 is a flow chart of user physiological feature verification in one embodiment of the invention;
Figs. 18-20 are flow charts of the UAV interactive shooting method of one embodiment of the invention;
Fig. 21 is a flow chart of panorama shooting in one embodiment of the invention;
Fig. 22 is a schematic diagram of the UAV rotating during panorama shooting in one embodiment of the invention;
Fig. 23 is a schematic diagram of the UAV moving along an arc trajectory during panorama shooting in one embodiment of the invention;
Fig. 24 is a schematic diagram of the UAV moving along a straight trajectory during panorama shooting in one embodiment of the invention;
Fig. 25 is a flow chart of the UAV automatically tracking the user's position in one embodiment of the invention;
Fig. 26 is a flow chart of automatic obstacle avoidance by the UAV in one embodiment of the invention.
Specific embodiments
Example embodiments will now be described more fully with reference to the drawings. Example embodiments can, however, be implemented in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concept of the example embodiments to those skilled in the art. The same reference numerals denote the same or similar structures in the figures, and their repeated description will be omitted.
As shown in Fig. 1, an embodiment of the present invention provides a UAV interactive camera system. The system comprises a UAV 200, a camera assembly 300 and a control assembly 100, one end of the camera assembly 300 being rotatably connected to a side of the UAV 200. The control assembly 100 includes: a control instruction library 110 for storing preset mapping relations between various user action features and various control instructions, the control instructions including UAV control instructions and/or camera assembly control instructions; an image processing module 120 for processing the images captured by the camera assembly 300 to obtain a pending user action feature in the captured images; an instruction determination module 130 for looking up, in the control instruction library, the control instruction corresponding to the pending user action feature; and an instruction execution module 140 for controlling the UAV 200 and/or the camera assembly 300 according to the control instruction found by the lookup.
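For illustration only, the minimal Python sketch below models one control cycle of this module chain (capture, action-feature extraction, instruction lookup, execution). The class and function names are assumptions chosen for the sketch; the patent does not prescribe any particular implementation.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class ControlInstruction:
    target: str        # "uav" or "camera"
    command: str       # e.g. "translate_left", "start_shooting"

class ControlInstructionLibrary:
    """Maps user action features (e.g. gesture labels) to control instructions."""
    def __init__(self, mapping: dict[str, ControlInstruction]):
        self._mapping = mapping

    def lookup(self, action_feature: str) -> Optional[ControlInstruction]:
        return self._mapping.get(action_feature)

def control_cycle(capture_image: Callable[[], bytes],
                  extract_action_feature: Callable[[bytes], Optional[str]],
                  library: ControlInstructionLibrary,
                  execute: Callable[[ControlInstruction], None]) -> None:
    """One pass of the capture -> analyze -> lookup -> execute loop."""
    image = capture_image()                      # camera assembly 300
    feature = extract_action_feature(image)      # image processing module 120
    if feature is None:
        return                                   # no pending user action
    instruction = library.lookup(feature)        # instruction determination module 130
    if instruction is not None:
        execute(instruction)                     # instruction execution module 140
```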
Here the user action feature is preferably a gesture of the user, and different gestures yield different control instructions. In practice, however, other user action features can also be used, such as the user's gaze, a nod, a shake of the head or laughter; for example, shooting can be triggered when a picture of the user laughing is captured, so that the user's smile is captured automatically. The following embodiments are described mainly in terms of gesture control, but it should be understood that using other user action features also falls within the scope of protection of the present invention.
Fig. 2 is a structural schematic diagram of the UAV interactive camera system of one embodiment of the invention. A UAV 200 is shown, with a camera assembly 300 rotatably mounted on one side. The camera assembly 300 includes a camera device 320 and a camera bracket 310; the camera device 320 is mounted in the camera bracket 310, and one end of the camera bracket 310 is rotatably connected to the side of the UAV 200. The system may further include a display device 330, detachably or fixedly attached to the other end of the camera bracket 310.
For convenience of controlling the UAV 200 and/or the camera assembly 300, the control assembly 100 may be arranged inside the UAV 200, on the surface of the UAV 200, or elsewhere, all of which fall within the scope of protection of the present invention. The instruction execution module 140 may communicate directly with the controller of the UAV 200 and may also communicate wirelessly with the camera assembly 300, so as to transmit control instructions and receive feedback.
The display device 330 can show whatever content the user needs to check; Fig. 2 and Fig. 3 give two arrangements of the display device 330.
The display device 330 shown in Fig. 2 includes an array display screen and a first display control unit; the first display control unit obtains the images captured by the camera device 320 and shows them on the array display screen. The array display screen may be, but is not limited to, a color LCD screen, and the user can watch the self-shooting picture on it in real time.
The display device 330 shown in Fig. 3 includes a dot-matrix display and a second display control unit; the second display control unit obtains the control instruction found by the instruction determination module 130 and controls the dot-matrix display to show user prompt information associated with that control instruction. The dot-matrix display may be, but is not limited to, a dot-matrix LED screen, and the user can prepare and shoot according to the lighting pattern of the LEDs.
For example, the user prompt information may be a self-shooting countdown: for a five-second countdown, the dot-matrix display shows 5, 4, 3, 2, 1 in turn, and the user can get ready according to the countdown. The user prompt information may also indicate the current shooting mode; for example, a displayed 2 indicates that the system is currently in the second mode.
With the UAV interactive camera system of the invention, the camera assembly automatically captures images, the control assembly analyzes them to obtain a pending user action feature, and the control instruction intended by the user is interpreted from that feature, so that the user can control the UAV 200 and/or the camera assembly 300.
When controlling the UAV 200 and/or the camera assembly 300, the UAV control instructions may include at least one of a UAV translation instruction, a UAV rotation instruction, a UAV power-on instruction and a UAV shutdown instruction; the camera assembly control instructions may include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction and a shooting stop instruction. The adjustable shooting parameters may include the focus, fill light, image size and so on used when shooting.
Fig. 4 is a schematic diagram of adjusting the position of the UAV 200 in one embodiment of the invention. The UAV position may be adjusted with the following steps:
a. after the UAV 200 starts and takes off, it hovers at an initial position;
b. the user 400 checks the self-shooting angle on the display device 330 and finds that the portrait sits too far to the left on the display device 330 (the portrait shown in dashed lines in Fig. 4); the user 400 controls the UAV by gesture (from the dashed-line hand state of the user 400 to the solid-line state in Fig. 4) to move it to the left until the portrait is in the middle of the picture (the portrait shown in solid lines in Fig. 4);
c. once the shooting conditions are satisfied, the user 400 triggers the shot by gesture.
Fig. 5 is a schematic diagram of adjusting the angle of the camera assembly 300 in one embodiment of the invention. The camera assembly 300 may be adjusted with the following steps:
a. after the UAV 200 starts and takes off, it hovers at an initial position;
b. the user 400 checks the self-shooting angle on the display device 330 and finds that the UAV 200 is too high, so the portrait sits too low (the portrait shown in dashed lines in Fig. 5); the user controls the camera assembly 300 by gesture (from the dashed-line hand state of the user 400 to the solid-line state) to pitch downward, driving the camera device 320 to turn downward until the portrait is in the middle of the picture (the portrait shown in solid lines in Fig. 5);
c. once the shooting conditions are satisfied, the user 400 triggers the shot by gesture.
In addition, the way of controlling the UAV 200 and the camera assembly 300 can be chosen flexibly; for example, when the portrait sits too low in Fig. 5, the adjustment can also be made by lowering the UAV 200 so that the portrait moves to the middle of the picture.
Specifically, different preset gesture instructions can be used to distinguish the adjustment of the UAV 200 from that of the camera assembly 300. That is, once a gesture is recognized, it is known whether the control target of that gesture is the UAV 200 or the camera assembly 300, and also what movement of the UAV 200 or the camera assembly 300 the gesture specifically commands.
Fig. 6 and Fig. 7 show a mapping relation between user action features and control instructions. The control instructions corresponding to a number of different gestures are listed in Table 1 below.
Table 1: Mapping between gestures and control instructions
Fig. 6 and Fig. 7 give only one example of gesture control. In practice, the user can also customize the mapping between different gestures and different control instructions, changing it to gestures that suit the user's own habits. Other action features can also be added; for example, a nod of the user confirms the shot, a shake of the head deletes the previous captured image, and so on.
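As a concrete illustration of such a customizable mapping, the sketch below builds a gesture-to-instruction table of the kind Table 1 describes. The specific gesture labels and command names are assumptions made for the sketch; the actual gestures are those of Figs. 6-7 and the user's own configuration.

```python
# Hypothetical gesture labels and command names; the real gesture set is
# defined by Figs. 6-7 and may be customized by the user.
DEFAULT_GESTURE_MAP: dict[str, tuple[str, str]] = {
    # gesture label          (target,   command)
    "palm_move_left":        ("uav",    "translate_left"),
    "palm_move_right":       ("uav",    "translate_right"),
    "palm_move_down":        ("uav",    "descend"),
    "fist":                  ("camera", "start_shooting"),
    "open_palm_hold":        ("camera", "stop_shooting"),
    "nod":                   ("camera", "confirm_shot"),
    "head_shake":            ("camera", "delete_last_image"),
}

def rebind_gesture(mapping, gesture, target, command):
    """Return a copy of the mapping with one gesture rebound to a new instruction."""
    updated = dict(mapping)
    updated[gesture] = (target, command)
    return updated
```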
Fig. 8 and Fig. 9 give two arrangements of the camera assembly 300. As shown in Fig. 8, the camera assembly 300 uses an external display device 340, which may be the user's mobile terminal; the external display device 340 can communicate with the control assembly 100 wirelessly or through a data cable such as USB. One end of the camera bracket 310 is formed as a protrusion 311, and the side of the UAV 200 is provided with a groove 210 matching the shape of the protrusion; the protrusion 311 of the camera bracket is embedded in the groove 210 of the UAV.
In the arrangement of Fig. 8, the camera bracket 310 includes a first support arm 312, a second support arm 313 and a third support arm 314. One side of the first support arm 312 is connected to the protrusion 311, and the other side of the first support arm 312 is provided with a first slot; one end of the second support arm 313 and one end of the third support arm 314 are connected to the two ends of the first support arm 312 respectively, the second support arm 313 and the third support arm 314 each being perpendicular to the first support arm 312; the other end of the second support arm 313 is provided with a second slot, and the other end of the third support arm 314 is provided with a third slot. The external display device 340 can be placed in the camera bracket 310 with its upper end inserted into the first slot and its lower end inserted into the second and third slots, forming a connection between the external display device 340 and the camera bracket 310 that is both stable and easy to attach and detach.
In the embodiment of Fig. 9, the display device 330 is a built-in display device. Likewise, in this arrangement the camera bracket 310 rotates through the fit between the protrusion 311 and the groove 210, and the display device 330 rotates together with the camera bracket 310. The lower surface of the UAV 200 is a flat surface that includes a camera bracket contact area 220, and the two side walls of the groove 210 of the UAV 200 are perpendicular to the lower surface of the UAV 200, so the protrusion 311 can rotate up and down in the groove 210 and the camera bracket 310 can rotate within the angular range between being perpendicular to the lower surface of the UAV 200 and lying flat against the camera bracket contact area 220. As described above, during use the camera assembly 300 can be adjusted within the required angular range to obtain a better shot; when use is finished, or when the UAV battery is exhausted and the UAV cannot be used, the camera bracket 310 can be folded flat against the camera bracket contact area 220 for convenient carrying.
In addition, since the UAV 200 is generally small, its battery capacity is small and its endurance is short. To overcome this problem, an embodiment of the invention further provides a convenient way of charging. As shown in Figs. 10-12, the lower surface of the UAV 200 further includes an energy storage device contact area 230, which does not intersect the camera bracket contact area 220; the system further includes an energy storage device 500, detachably or fixedly attached to the lower surface of the UAV 200 so as to fit against the energy storage device contact area.
Fig. 13 is a flow chart of charging with the structure of this embodiment. If the display screen is an external display screen, the connection between the external display screen and the UAV is disconnected first; the external display screen can then be removed, or left on the camera bracket 310 and folded away together with it. If the energy storage device 500 is then attached, charging starts; otherwise the UAV simply shuts down. To keep the flying load small, the energy storage device 500 is removed while the UAV 200 still has power and is in use; when the UAV 200 is not in use or its battery is exhausted, the camera bracket 310 can be folded and the energy storage device 500 attached to the energy storage device contact area. The energy storage device 500 is connected to the rechargeable battery of the UAV and performs the charging. When folded for charging, the UAV 200 occupies a smaller volume and is easy to carry, and it can be used again after charging is complete.
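The shutdown-and-charge decision of Fig. 13 can be sketched as the small routine below; the device-interface names passed in as callbacks are placeholders for this sketch only.

```python
def power_down_sequence(display_is_external: bool,
                        disconnect_external_display,
                        energy_storage_attached: bool,
                        start_charging,
                        shut_down) -> None:
    """Follow the Fig. 13 flow: detach or park the external display, then charge or shut down."""
    if display_is_external:
        disconnect_external_display()    # the screen may be removed or folded with the bracket
    if energy_storage_attached:
        start_charging()                 # energy storage device 500 feeds the UAV battery
    else:
        shut_down()
```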
As shown in Figs. 14 and 15, an embodiment of the invention may further include a voice acquisition device 600 for obtaining the user's voice data. The control instruction library 110 is further used for storing preset mapping relations between various voice keywords and various control instructions; the control assembly 100 further includes a speech processing module 150 for extracting the voice keywords contained in the user's voice data; and the instruction determination module 130 is further used for looking up, in the control instruction library, the control instruction corresponding to the extracted voice keyword.
With the voice acquisition device 600, this embodiment also lets the user control the shooting by voice. For example, the keyword "power on" may be set to switch on the camera assembly 300: when the words "power on" are detected in the user's voice data, the camera assembly 300 is switched on automatically; or, when "UAV" and "move left" are detected in the user's voice data, the UAV is automatically controlled to move to the left. Voice control is convenient, is not constrained by other conditions, does not affect the user's shooting pose, and is suitable for any occasion.
Further, as shown in Fig. 16, when the user uses voice control, the control assembly 100 may also pick up other people's voices or ambient noise, so different voices need to be distinguished. The speech processing module is therefore further used for obtaining the user's voiceprint feature from the user's voice data and judging whether it matches a pre-stored authorized voiceprint feature.
If the user's voiceprint feature matches a preset authorized voiceprint feature, the voice data is that of the designated user and control may be executed according to it, so the instruction determination module extracts the voice keywords contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted keyword. If the user's voiceprint feature does not match a preset authorized voiceprint feature, the voice data belongs to a non-designated person and is filtered out, i.e. the instruction determination module ignores it and does not extract voice keywords.
Similarly, as shown in Fig. 17, the camera assembly 300 may also capture the action features of other, non-designated people. To avoid confusion, the image processing module is further used for obtaining the user's physiological features from the images captured by the camera assembly and judging whether they match pre-stored designated physiological features.
If the user's physiological features match the pre-stored designated physiological features, the captured action belongs to the designated user, and the instruction determination module looks up the corresponding control instruction in the control instruction library according to the pending user action feature. If the user's physiological features do not match the pre-stored designated physiological features, the instruction determination module ignores the pending user action feature and does not look up a control instruction.
Here the physiological features of the user may be the user's facial contour, hair color, hair length, skin color, lip color and so on, and several physiological features can be combined for more accurate discrimination; all such variants fall within the scope of protection of the present invention.
As shown in Fig. 18, an embodiment of the invention also provides a UAV interactive shooting method using the UAV interactive camera system described above, comprising the following steps:
S100: the camera assembly captures images;
S200: the image processing module processes the images captured by the camera assembly to obtain a pending user action feature in the captured images;
S300: the instruction determination module looks up, in the control instruction library, the control instruction corresponding to the pending user action feature; and
S400: the instruction execution module controls the UAV and/or the camera assembly according to the control instruction found by the lookup.
When the control instruction may be a UAV control instruction, a camera assembly control instruction or some other valid instruction, the decision process may follow the flow shown in Fig. 19, performing the judgments and the control in turn, although it is not limited to that order. Judging first whether the instruction is a camera assembly control instruction and then whether it is a UAV control instruction, and similar variations, also fall within the scope of protection of the present invention.
Fig. 20 shows a specific embodiment of the UAV interactive shooting method. The display screen type is determined first; if it is an external display screen, the control assembly must first be connected to the external display screen wirelessly, to prepare for the control that follows. The corresponding control instruction is then looked up according to the correspondence between gestures and control instructions, and the control is executed. As stated above, the action features of the present invention are not limited to gestures; different movements of other body parts can also achieve the object of the invention.
As described above, UAV control instructions and camera assembly control instructions can be distinguished by different action features. Alternatively, they can be distinguished by different control modes. For example, the control instructions may further include a first mode selection instruction and a second mode selection instruction, indicating that the control assembly enters a first mode and a second mode respectively.
After entering the first mode, any subsequently received user action feature is by default taken to be directed at a UAV control instruction: the instruction determination module looks up, in the control instruction library, the UAV control instruction corresponding to the user action feature and controls the UAV according to the UAV control instruction found, and no camera assembly control instruction is executed. After entering the second mode, any subsequently received user action feature is by default taken to be directed at a camera assembly control instruction: the instruction determination module looks up, in the control instruction library, the camera assembly control instruction corresponding to the user action feature and controls the camera assembly according to the camera assembly control instruction found, and no UAV control instruction is executed.
In this way the number of action features the user has to set can be reduced. For example, the same gesture of spreading the palm and moving it downward means, in the first mode, that the UAV is to move down and, in the second mode, that the camera assembly is to pitch downward. This is only one specific example and does not limit the scope of protection of the present invention.
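The mode-dependent interpretation of a single gesture can be sketched as follows; the mode names and the gesture label are placeholders chosen for illustration.

```python
from enum import Enum, auto
from typing import Optional

class ControlMode(Enum):
    FIRST = auto()    # gestures are interpreted as UAV control instructions
    SECOND = auto()   # gestures are interpreted as camera assembly control instructions

def interpret_gesture(mode: ControlMode, gesture: str) -> Optional[tuple[str, str]]:
    """Map the same gesture to different targets depending on the active mode."""
    if gesture == "palm_spread_move_down":      # hypothetical label
        if mode is ControlMode.FIRST:
            return ("uav", "move_down")
        if mode is ControlMode.SECOND:
            return ("camera", "pitch_down")
    return None
```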
Further, because of the stability and controllability of the UAV in flight, shooting with the UAV has some irreplaceable advantages over hand-held shooting; for example, the UAV can take photographs with less shake, so the stabilization requirements on the camera device are lower. When panning a panoramic photograph by hand, the user often cannot obtain an ideal result because of shake or other interference, a problem the UAV can overcome.
As shown in Figs. 21 and 22, the control instructions further include a panorama mode selection instruction, indicating that the control assembly enters a panorama mode in which the instruction execution module controls the UAV to rotate continuously at a preset speed within the angular range (0, α), where α is the preset maximum panorama angle. Optionally, in the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the camera assembly detects the position of the user 400;
taking the position of the user 400 as the starting point, the UAV 200 rotates by α/n to one side within the same horizontal plane; this is the positioning stage of the UAV and no shooting takes place during it, and n is a first preset partition value;
the camera assembly starts shooting, and the UAV 200 rotates by α toward the other side at the preset speed, at a uniform rate, within the same horizontal plane, so as to obtain a panoramic photograph covering an angle α with the user at the designated position in it;
after the UAV 200 stops rotating, the camera assembly stops shooting.
When n is 2, the user ends up at the center of the panoramic photograph. In practice the angle α can be set as required, and the user's position within the panoramic photograph can also be adjusted; for example, to place the user toward the left, the UAV can first be rotated by α/4 to one side. The shooting style is therefore very flexible, the success rate of panoramic photographs is high, and the resulting photographs look better.
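The yaw angles involved can be computed as in the sketch below, where the function name and the example values are assumptions for illustration.

```python
def rotation_panorama_plan(alpha_deg: float, n: float = 2.0):
    """Return (pre_rotation, sweep) yaw angles in degrees for the rotating panorama.

    The UAV first yaws by alpha/n to one side without shooting, then sweeps back
    through alpha at a uniform rate while the camera records. With n = 2 the
    user ends up at the center of the panorama.
    """
    if n <= 1:
        raise ValueError("n must be greater than 1")
    pre_rotation = alpha_deg / n          # positioning stage, no shooting
    sweep = alpha_deg                     # shooting stage, uniform angular speed
    return pre_rotation, sweep

# Example: a 120-degree panorama with the user centered.
print(rotation_panorama_plan(120.0))      # (60.0, 120.0)
```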
Fig. 23 shows another way of panorama shooting. In the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dashed line joining the user 400 and the UAV 200 in the figure;
the instruction execution module selects an anchor point between the camera assembly and the user 400 and generates a first sector 701 with the anchor point as the center, L/m as the radius and α as the central angle, such that the scene to be captured lies on the arc of the first sector 701, where m is a second preset partition value, m > 1 and α is the preset maximum panorama angle;
the instruction execution module generates a second sector 702 opposite to the first sector 701, the two sides of the second sector 702 being the reverse extensions of the two sides of the first sector 701, with radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the UAV 200 moves along the arc of the second sector 702 from one end of the arc to the other;
after the UAV 200 reaches the other end of the arc of the second sector 702, the camera assembly stops shooting.
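A sketch of this arc-trajectory geometry is given below: it places the anchor point on the line from the UAV to the user, at distance (m-1)L/m from the UAV, and samples waypoints on the arc of the second sector. The 2D coordinate convention, function name and default values are assumptions for the sketch.

```python
import math

def arc_panorama_waypoints(uav_xy, user_xy, alpha_deg, m=3.0, samples=20):
    """Waypoints on the arc of the second sector for the arc-trajectory panorama.

    The anchor point lies between the camera and the user at L/m from the user
    (hence (m-1)L/m from the UAV); the UAV sweeps an arc of radius (m-1)L/m and
    central angle alpha around that anchor while facing the scene.
    """
    if samples < 2:
        raise ValueError("samples must be at least 2")
    ux, uy = uav_xy
    px, py = user_xy
    L = math.hypot(px - ux, py - uy)
    radius = (m - 1.0) * L / m
    # Anchor point: on the segment UAV -> user, 'radius' away from the UAV.
    t = radius / L
    ax, ay = ux + t * (px - ux), uy + t * (py - uy)
    base_angle = math.atan2(uy - ay, ux - ax)        # direction anchor -> UAV
    half = math.radians(alpha_deg) / 2.0
    angles = [base_angle - half + i * (2 * half) / (samples - 1) for i in range(samples)]
    return [(ax + radius * math.cos(a), ay + radius * math.sin(a)) for a in angles]
```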
Fig. 24 shows yet another way of panorama shooting. In the panorama mode the instruction execution module controls the UAV and the camera assembly to shoot a panoramic photograph as follows:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dashed line joining the user 400 and the UAV 200 in the figure;
the instruction execution module selects an anchor point between the camera assembly and the user 400 and generates a first isosceles triangle 703 with the anchor point as the apex, L/m as the length of each leg and α as the apex angle, such that the scene to be captured lies on the base of the first isosceles triangle 703, where m is a second preset partition value, m > 1 and α is the preset maximum panorama angle;
the instruction execution module generates a second isosceles triangle 704 opposite to the first isosceles triangle 703, the two legs of the second isosceles triangle 704 being the reverse extensions of the two legs of the first isosceles triangle 703, with leg length (m-1)L/m and apex angle α;
the camera assembly starts shooting, and the UAV 200 moves along the base of the second isosceles triangle 704 from one end of the base to the other;
after the UAV 200 reaches the other end of the base of the second isosceles triangle 704, the camera assembly stops shooting.
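The two endpoints of that straight trajectory (the base of the second isosceles triangle) can be computed as below; as before, the coordinate convention, function name and defaults are assumptions for the sketch.

```python
import math

def straight_panorama_endpoints(uav_xy, user_xy, alpha_deg, m=3.0):
    """Endpoints of the base of the second isosceles triangle for the straight-line panorama.

    The apex (anchor) sits on the UAV-user line at (m-1)L/m from the UAV; the UAV
    flies along a straight base of half-width (m-1)L/m * sin(alpha/2), perpendicular
    to that line and at distance (m-1)L/m * cos(alpha/2) from the apex.
    """
    ux, uy = uav_xy
    px, py = user_xy
    L = math.hypot(px - ux, py - uy)
    leg = (m - 1.0) * L / m
    half = math.radians(alpha_deg) / 2.0
    # Unit vector from the user toward the UAV, and its perpendicular.
    dx, dy = (ux - px) / L, (uy - py) / L
    perp = (-dy, dx)
    # Apex of both triangles, between the camera assembly and the user.
    ax, ay = ux - leg * dx, uy - leg * dy
    # Midpoint of the base of the second triangle, on the UAV side of the apex.
    mx, my = ax + leg * math.cos(half) * dx, ay + leg * math.cos(half) * dy
    half_base = leg * math.sin(half)
    start = (mx + half_base * perp[0], my + half_base * perp[1])
    end = (mx - half_base * perp[0], my - half_base * perp[1])
    return start, end
```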
The shooting trajectories in Fig. 23 and Fig. 24 can be chosen as required; the panoramic photograph can be produced by continuous shooting, or a panoramic picture can be stitched from several photographs. Different choices of m and α give different coverages, which adds flexibility. The UAV moves along the computed target trajectory so that the camera assembly obtains different shooting positions and shooting angles.
When shooting with the UAV, a timer is sometimes wanted; for example, a shooting countdown can be set. The control instructions may therefore further include a third mode selection instruction, indicating that the control assembly enters a third mode in which the instruction execution module controls the camera assembly to shoot after a preset waiting time. During the countdown, the remaining time can be shown on the display device or indicated by indicator lights or a prompt tone.
As shown in Fig. 25, the UAV of the invention can also track the user automatically while shooting. The control instructions may further include a fourth mode selection instruction, indicating that the control assembly enters a fourth mode in which the instruction execution module detects the position of the user through the camera assembly and automatically controls the UAV and the camera assembly to move according to the user's position, so that the camera assembly keeps shooting the user. Shooting thus follows the user automatically, and the user stays within the field of view at all times.
Optionally, in the fourth mode the instruction execution module can also obtain the acceleration of the change of the user's position and, when this acceleration exceeds a preset acceleration threshold, issue an alarm signal to the outside. On the one hand, when the camera assembly cannot capture the user's position, the alarm reminds the user to move back into the camera assembly's field of view; on the other hand, fall detection can be implemented: if the user falls accidentally or faints, an alarm signal is issued automatically, and if the user does not cancel it within a certain time, the mobile terminals of other users associated with the user can be notified or an ambulance called. High-quality shooting is thus provided while the user's safety during use is also protected.
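The acceleration check can be sketched as follows, estimating the acceleration from three consecutive position samples by finite differences; the sampling interval, threshold value and callback name are assumptions.

```python
import math

def position_acceleration(p0, p1, p2, dt):
    """Magnitude of the user's position-change acceleration from three samples spaced dt apart."""
    ax = (p2[0] - 2 * p1[0] + p0[0]) / (dt * dt)
    ay = (p2[1] - 2 * p1[1] + p0[1]) / (dt * dt)
    return math.hypot(ax, ay)

def check_fall(p0, p1, p2, dt, threshold, raise_alarm):
    """Issue an external alarm when the acceleration exceeds the preset threshold."""
    if position_acceleration(p0, p1, p2, dt) > threshold:
        raise_alarm()          # e.g. sound or light alarm; escalate if not cancelled in time
        return True
    return False
```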
As shown in Figure 26, further, considering that when the user controls the movement of the unmanned plane it may hit other obstacles because of misjudged distance or misoperation, in order to ensure the safety of the unmanned plane itself, at least one range sensor is also provided on the unmanned plane, and the control assembly further includes an obstacle computing module, which is used to obtain the obstacle detection data of the range sensor;
When the pending control instruction includes an unmanned plane move instruction, and the obstacle computing module judges that the distance between the unmanned plane and an obstacle in the moving direction of the unmanned plane move instruction is less than a preset safety threshold, it cancels the unmanned plane move instruction and issues a limit alert signal to the outside. That is, after the obstacle computing module learns through the range sensor which nearby obstacles might be hit, it predicts, from the direction indicated by the control instruction, whether the unmanned plane would hit an obstacle if the unmanned plane move instruction were executed; if so, it does not execute the unmanned plane move instruction and reminds the user that the distance is already below the limit value and there is a danger of hitting an obstacle.
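The obstacle check can be sketched as follows; the sensor reading format (a bearing and a distance per range sensor) and the 1 m safety threshold are assumptions for illustration:

```python
import math

def vet_move_command(move_dir_xy, obstacle_readings, safe_threshold_m=1.0):
    """Decide whether an unmanned plane move instruction is safe to execute.

    move_dir_xy: unit 2-D vector of the commanded motion.
    obstacle_readings: iterable of (bearing_deg, distance_m) from the range sensors.
    Returns (execute, alert): whether to execute the move and whether to warn the user.
    """
    for bearing_deg, distance_m in obstacle_readings:
        b = math.radians(bearing_deg)
        obstacle_dir = (math.cos(b), math.sin(b))
        # Dot product > 0.7 keeps only obstacles roughly ahead of the commanded motion.
        ahead = move_dir_xy[0] * obstacle_dir[0] + move_dir_xy[1] * obstacle_dir[1]
        if ahead > 0.7 and distance_m < safe_threshold_m:
            return False, True        # cancel the move instruction, raise the limit alert
    return True, False                # safe to execute the move instruction
```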
This embodiment is especially suitable for indoor shooting. Indoor space is constrained by walls and the ceiling and contains many other obstacles such as furniture and ornaments; with reliable calculation and risk prediction, this approach guarantees the safety of the unmanned plane when shooting indoors. It is equally applicable to outdoor shooting: in open outdoor areas the unmanned plane may move quickly and the user may fail to foresee dangers in time, and this approach then guarantees the stability and reliability of the interactive shooting process.
Compared with the prior art, the present invention provides a technical solution in which the user controls the system directly through motions: the camera assembly automatically obtains shooting images, the control assembly automatically analyzes them to obtain the pending user action feature, and the control instruction required by the user is interpreted from that feature. The user can therefore directly perform flight control of the unmanned plane and shooting control of the camera assembly through motions, so as to realize the shooting function, easily obtain the desired shots in all cases, and improve the user experience.
The above content is a further detailed description of the present invention in conjunction with specific preferred embodiments, and it cannot be concluded that the specific implementation of the present invention is limited to these descriptions. For those of ordinary skill in the art to which the present invention belongs, a number of simple deductions or replacements can also be made without departing from the inventive concept, and all of these shall be regarded as falling within the protection scope of the present invention.

Claims (24)

  1. A kind of unmanned plane interaction camera system, which is characterized in that the system comprises unmanned plane, camera assembly and control assembly, one end of the camera assembly is rotatably connected to the side of the unmanned plane; wherein the control assembly includes:
    Control instruction library, for storing the mapping relations of preset various user action features and various control instructions, the control instruction includes unmanned aerial vehicle (UAV) control instruction and/or camera assembly control instruction;
    Image processing module is handled for the shooting image to the camera assembly, to obtain user action feature pending in the shooting image;
    Determination module is instructed, for according to the pending user action feature, searching corresponding control instruction in the control instruction library;And
    Instruction execution module, the control instruction for being obtained according to lookup control the unmanned plane and/or the camera assembly.
  2. Unmanned plane interaction camera system according to claim 1, it is characterized in that, the camera assembly includes picture pick-up device and camera shooting bracket, and the picture pick-up device is set in the camera shooting bracket, and one end of the camera shooting bracket is rotatably connected to the side of the unmanned plane;
    The system also includes a display equipment, the display equipment is detachable or is fixably attached to the other end of the camera shooting bracket.
  3. Unmanned plane interaction camera system according to claim 2, which is characterized in that the display equipment includes array display screen and the first display control unit;First display control unit obtains the shooting image of the picture pick-up device, and is shown by the array display screen.
  4. Unmanned plane interaction camera system according to claim 2, which is characterized in that the display equipment includes a dot matrix display and a second display control unit; the second display control unit obtains the control instruction searched by the instruction determination module, and controls the dot matrix display to show user prompt information associated with the control instruction obtained by the search.
  5. Unmanned plane interaction camera system according to claim 2, which is characterized in that one end of the camera shooting bracket is set as convex block, and the side of the unmanned plane is provided with a groove being adapted with the convex block shape;The convex block of the camera shooting bracket is embedded in the groove of the unmanned plane;
    The lower surface of the unmanned plane is a flat surface and includes a camera shooting bracket corresponding area; the two sides of the groove of the unmanned plane are perpendicular to the lower surface of the unmanned plane, and the convex block of the camera shooting bracket can rotate in the groove of the unmanned plane, so that the camera shooting bracket can rotate within the angular range from being perpendicular to the lower surface of the unmanned plane to being fitted against the camera shooting bracket corresponding area of the lower surface.
  6. Unmanned plane interaction camera system according to claim 5, which is characterized in that the lower surface of the unmanned plane further includes an electric energy storage device corresponding area, and the electric energy storage device corresponding area does not intersect the camera shooting bracket corresponding area;
    The system also includes an electric energy storage device, which is detachably or fixedly attached to the lower surface of the unmanned plane and fits against the electric energy storage device corresponding area.
  7. Unmanned plane interaction camera system according to claim 5, it is characterized in that, the camera shooting bracket includes first support arm, second support arm and third support arm, the side of the first support arm is connected to the convex block, and the other side of the first support arm is provided with one first slot, one end of the second support arm and one end of third support arm are respectively connected to the both ends of the first support arm, and the second support arm and third support arm are each perpendicular to the first support arm, the other end of the second support arm is provided with one second slot, the other end of the third support arm is provided with a third slot;
    The side of the display equipment is inserted into first slot, and the other side of the display equipment is inserted into second slot and the third slot.
  8. Unmanned plane interaction camera system according to claim 1, which is characterized in that further include that voice obtains equipment, the voice obtains the voice data that equipment is used to obtain user;
    The control instruction library is also used to store the mapping relations of preset various voice keywords and various control instructions;
    The control assembly further includes a speech processing module, and the speech processing module is used to extract the voice keywords included in the voice data of the user;
    Described instruction determination module is also used to search corresponding control instruction in the control instruction library according to the voice keyword of extraction.
  9. Unmanned plane interaction camera system according to claim 8, which is characterized in that the speech processing module is also used to obtain the vocal print feature of user in the voice data of the user, and judges whether the vocal print feature of the user is to prestore specified vocal print feature;
    If the vocal print feature of the user is default permission vocal print feature, then described instruction determination module extracts the voice keyword for including in the voice data of the user, and searches corresponding control instruction in the control instruction library according to the voice keyword of extraction;
    If the vocal print feature of the user is not default permission vocal print feature, described instruction determination module ignores the vocal print feature of the user, without extracting the processing of voice keyword.
  10. Unmanned plane interaction camera system according to claim 1, which is characterized in that described image processing module is also used to obtain the physiological characteristic of user in the shooting image of the camera assembly, and judges whether the physiological characteristic of user is to prestore specified physiological characteristic;
    If the physiological characteristic of the user is to prestore specified physiological characteristic, described instruction determination module searches corresponding control instruction according to the pending user action feature in the control instruction library;
    If the physiological characteristic of the user is not to prestore specified physiological characteristic, described instruction determination module ignores the pending user action feature, without searching control instruction processing.
  11. Unmanned plane according to claim 1 interaction camera system, which is characterized in that the unmanned aerial vehicle (UAV) control instruction includes at least one of unmanned plane translation instruction, unmanned plane rotation command, unmanned plane power-on instruction and unmanned plane shutdown command;The camera assembly control instruction includes at least one of camera assembly rotation command, acquisition parameters adjustment instruction, shooting sign on and shooting halt instruction.
  12. Unmanned plane interaction camera system according to claim 1, which is characterized in that, the control instruction further include:
    First mode selection instruction, indicate that the control assembly enters first mode, in the first mode, described instruction determination module is according to the user action feature, corresponding unmanned aerial vehicle (UAV) control instruction is searched in the control instruction library, and controls the unmanned plane according to the instruction of obtained unmanned aerial vehicle (UAV) control is searched;
    Second mode selection instruction, indicate that the control assembly enters second mode, in the second mode, described instruction determination module is according to the user action feature, corresponding camera assembly control instruction is searched in the control instruction library, and controls the camera assembly according to obtained camera assembly control instruction is searched.
  13. Unmanned plane interaction camera system according to claim 1, which is characterized in that the control instruction further include:
    Panning mode selection instruction, indicate that the control assembly enters panning mode, under the panning mode, described instruction execution module controls the unmanned plane to move continuously at a pre-set velocity within the angular range (0, α), where α is the default pan-shot maximum angle.
  14. Unmanned plane interaction camera system according to claim 13, which is characterized in that, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting as follows:
    The position of the camera assembly detection user;
    The unmanned plane rotates α/n to side in same level using the position of user as starting point, and wherein n is the first default partition value, and n > 1;
    The camera assembly starts to shoot, and the unmanned plane at the uniform velocity rotates α to the other side with pre-set velocity in same level;
    After the unmanned plane stops operating, the camera assembly stops shooting.
  15. Unmanned plane interaction camera system according to claim 13, which is characterized in that, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting as follows:
    Described instruction execution module calculates the distance between the camera assembly and user L;
    Described instruction execution module selects an anchor point between the camera assembly and the user, and, taking the anchor point as the centre of the circle and L/m as the radius, generates a first sector with angle α, such that the subject to be photographed is located on the arc of the first sector, wherein m is the second default partition value and m > 1;
    Described instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the reverse extension lines of the two sides of the first sector, and the radius of the second sector is (m-1)L/m with angle α;
    The camera assembly starts to shoot, and the unmanned plane is moved to the other end of the circular arc from one end of the described second fan-shaped circular arc along the track of the circular arc;
    After the unmanned plane is moved to the other end of the described second fan-shaped circular arc, the camera assembly stops shooting.
  16. Unmanned plane interaction camera system according to claim 13, which is characterized in that, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting as follows:
    Described instruction execution module calculates the distance between the camera assembly and user L;
    Described instruction execution module selects an anchor point between the camera assembly and the user, and, taking the anchor point as the apex and L/m as the leg length, generates a first isosceles triangle whose apex angle is α, such that the subject to be photographed is located on the base of the first isosceles triangle, wherein m is the second default partition value and m > 1;
    Described instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two legs of the second isosceles triangle are respectively the reverse extension lines of the two legs of the first isosceles triangle, and the leg length of the second isosceles triangle is (m-1)L/m with apex angle α;
    The camera assembly starts to shoot, and the unmanned plane is moved to the other end on the bottom edge from the one end on the bottom edge of second isosceles triangle along the track on the bottom edge;
    After the unmanned plane is moved to the other end on the bottom edge of second isosceles triangle, the camera assembly stops shooting.
  17. Unmanned plane interaction camera system according to claim 1, which is characterized in that the control instruction further include:
    The third mode selection instruction indicates that the control assembly enters the third mode, and under the third mode, described instruction execution module controls the camera assembly to shoot after the default waiting time.
  18. Unmanned plane interaction camera system according to claim 1, which is characterized in that the control instruction further include:
    Fourth mode selection instruction, indicate that the control assembly enters fourth mode, under the fourth mode, described instruction execution module detects the position of the user by the camera assembly, and automatically controls the unmanned plane and the camera assembly to move according to the position of the user, so that the camera assembly persistently shoots the user.
  19. Unmanned plane interaction camera system according to claim 18, it is characterized in that, under the fourth mode, described instruction execution module obtains the change in location acceleration of the user, and when the change in location acceleration of the user exceeds the predetermined acceleration threshold value, described instruction execution module issues an alarm signal to outside.
  20. Unmanned plane interaction camera system according to claim 1, it is characterized in that, at least one range sensor is additionally provided on the unmanned plane, the control assembly further includes obstacle computing module, and the obstacle computing module is used to obtain the obstacle detection data of the range sensor;
    When the pending control instruction includes an unmanned plane move instruction, and the obstacle computing module judges that the distance between the unmanned plane and an obstacle in the moving direction of the unmanned plane move instruction is less than the default secure threshold, the obstacle computing module cancels the unmanned plane move instruction and issues a limit value alerting signal to outside.
  21. A kind of unmanned plane interaction image pickup method, which is characterized in that using the interaction camera system of unmanned plane described in any one of claims 1 to 20, described method includes following steps:
    The camera assembly obtains shooting image;
    Described image processing module handles the shooting image of the camera assembly, to obtain the pending user action feature in the shooting image;
    Described instruction determination module searches corresponding control instruction according to the pending user action feature in the control instruction library;And
    Described instruction execution module controls the unmanned plane and/or the camera assembly according to obtained control instruction is searched.
  22. Unmanned plane interaction image pickup method according to claim 21, it is characterized in that, the control instruction further includes panning mode selection instruction, indicate that the control assembly enters panning mode, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting with the following steps:
    The position of the camera assembly detection user;
    The unmanned plane rotates α/n to side in same level using the position of user as starting point, and wherein n is the first default partition value, and n > 1, α are default pan-shot maximum angle;
    The camera assembly starts to shoot, and the unmanned plane at the uniform velocity rotates α to the other side with pre-set velocity in same level;
    After the unmanned plane stops operating, the camera assembly stops shooting.
  23. Unmanned plane interaction image pickup method according to claim 21, it is characterized in that, the control instruction further includes panning mode selection instruction, indicate that the control assembly enters panning mode, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting as follows:
    Described instruction execution module calculates the distance between the camera assembly and user L;
    Described instruction execution module selects an anchor point between the camera assembly and the user, and, taking the anchor point as the centre of the circle and L/m as the radius, generates a first sector with angle α, such that the subject to be photographed is located on the arc of the first sector, wherein m is the second default partition value, m > 1, and α is the default pan-shot maximum angle;
    Described instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the reverse extension lines of the two sides of the first sector, and the radius of the second sector is (m-1)L/m with angle α;
    The camera assembly starts to shoot, and the unmanned plane is moved to the other end of the circular arc from one end of the described second fan-shaped circular arc along the track of the circular arc;
    After the unmanned plane is moved to the other end of the described second fan-shaped circular arc, the camera assembly stops shooting.
  24. Unmanned plane interaction image pickup method according to claim 21, it is characterized in that, the control instruction further includes panning mode selection instruction, indicate that the control assembly enters panning mode, under the panning mode, described instruction execution module controls the unmanned plane and the camera assembly to carry out panoramic photograph shooting as follows:
    Described instruction execution module calculates the distance between the camera assembly and user L;
    Described instruction execution module selects an anchor point between the camera assembly and the user, and, taking the anchor point as the apex and L/m as the leg length, generates a first isosceles triangle whose apex angle is α, such that the subject to be photographed is located on the base of the first isosceles triangle, wherein m is the second default partition value, m > 1, and α is the default pan-shot maximum angle;
    Described instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two legs of the second isosceles triangle are respectively the reverse extension lines of the two legs of the first isosceles triangle, and the leg length of the second isosceles triangle is (m-1)L/m with apex angle α;
    The camera assembly starts to shoot, and the unmanned plane is moved to the other end on the bottom edge from the one end on the bottom edge of second isosceles triangle along the track on the bottom edge;
    After the unmanned plane is moved to the other end on the bottom edge of second isosceles triangle, the camera assembly stops shooting.
CN201780000407.6A 2017-04-17 2017-04-17 Unmanned aerial vehicle interactive shooting system and method Active CN109121434B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/080738 WO2018191840A1 (en) 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN109121434A true CN109121434A (en) 2019-01-01
CN109121434B CN109121434B (en) 2021-07-27

Family

ID=63855487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780000407.6A Active CN109121434B (en) 2017-04-17 2017-04-17 Unmanned aerial vehicle interactive shooting system and method

Country Status (3)

Country Link
CN (1) CN109121434B (en)
TW (1) TWI696122B (en)
WO (1) WO2018191840A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112019744A (en) * 2020-08-27 2020-12-01 新石器慧义知行智驰(北京)科技有限公司 Photographing method, device, equipment and medium
TWI768630B (en) * 2020-12-29 2022-06-21 財團法人工業技術研究院 Movable photographing system and photography composition control method
US11445121B2 (en) 2020-12-29 2022-09-13 Industrial Technology Research Institute Movable photographing system and photography composition control method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN104865856A (en) * 2015-03-30 2015-08-26 成都好飞机器人科技有限公司 Voice control method for unmanned aerial vehicle
CN105338238A (en) * 2014-08-08 2016-02-17 联想(北京)有限公司 Photographing method and electronic device
CN105391939A (en) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 Unmanned aerial vehicle shooting control method, device, unmanned aerial vehicle shooting method and unmanned aerial vehicle
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
US9471059B1 (en) * 2015-02-17 2016-10-18 Amazon Technologies, Inc. Unmanned aerial vehicle assistant
CN106143870A (en) * 2015-07-28 2016-11-23 英华达(上海)科技有限公司 Unmanned vehicle
CN106227231A (en) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 The control method of unmanned plane, body feeling interaction device and unmanned plane

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678506B2 (en) * 2014-06-19 2017-06-13 Skydio, Inc. Magic wand interface and other user interaction paradigms for a flying digital assistant
US20160101856A1 (en) * 2014-06-23 2016-04-14 Nixie Labs, Inc. Wearable unmanned aerial vehicles, and associated systems and methods
JP6470112B2 (en) * 2015-06-01 2019-02-13 日本電信電話株式会社 Mobile device operation terminal, mobile device operation method, and mobile device operation program
CN105138126B (en) * 2015-08-26 2018-04-13 小米科技有限责任公司 Filming control method and device, the electronic equipment of unmanned plane
CN105607740A (en) * 2015-12-29 2016-05-25 清华大学深圳研究生院 Unmanned aerial vehicle control method and device based on computer vision
CN106200679B (en) * 2016-09-21 2019-01-29 中国人民解放军国防科学技术大学 Single operation person's multiple no-manned plane mixing Active Control Method based on multi-modal natural interaction
CN106444843B (en) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 Unmanned plane relative bearing control method and device

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105338238A (en) * 2014-08-08 2016-02-17 联想(北京)有限公司 Photographing method and electronic device
US9471059B1 (en) * 2015-02-17 2016-10-18 Amazon Technologies, Inc. Unmanned aerial vehicle assistant
CN104865856A (en) * 2015-03-30 2015-08-26 成都好飞机器人科技有限公司 Voice control method for unmanned aerial vehicle
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN106143870A (en) * 2015-07-28 2016-11-23 英华达(上海)科技有限公司 Unmanned vehicle
CN105391939A (en) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 Unmanned aerial vehicle shooting control method, device, unmanned aerial vehicle shooting method and unmanned aerial vehicle
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
CN106227231A (en) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 The control method of unmanned plane, body feeling interaction device and unmanned plane

Also Published As

Publication number Publication date
TWI696122B (en) 2020-06-11
WO2018191840A1 (en) 2018-10-25
CN109121434B (en) 2021-07-27
TW201839663A (en) 2018-11-01

Similar Documents

Publication Publication Date Title
CN110692027B (en) System and method for providing easy-to-use release and automatic positioning of drone applications
CN107087427B (en) Control method, device and the equipment and aircraft of aircraft
US11340606B2 (en) System and method for controller-free user drone interaction
CN113038016B (en) Unmanned aerial vehicle image acquisition method and unmanned aerial vehicle
CN110494360B (en) System and method for providing autonomous photography and photography
CN105242685B (en) A kind of accompanying flying unmanned plane system and method
WO2018209702A1 (en) Method for controlling unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
CN108062106A (en) Unmanned vehicle and the method for using unmanned vehicle shooting object
WO2015013979A1 (en) Remote control method and terminal
JP6696118B2 (en) Electronics
CN205353774U (en) Accompany unmanned aerial vehicle system of taking photo by plane of shooing aircraft
CN109032188A (en) Flight instruments and system
US20160124435A1 (en) 3d scanning and imaging method utilizing a self-actuating compact unmanned aerial device
US20210112194A1 (en) Method and device for taking group photo
WO2021127888A1 (en) Control method, smart glasses, mobile platform, gimbal, control system, and computer-readable storage medium
CN109121434A (en) Unmanned plane interaction camera system and method
WO2022082440A1 (en) Method, apparatus and system for determining target following strategy, and device and storage medium
CN110035220A (en) Apparatus control system and method for photography
WO2016068354A1 (en) Unmanned aerial vehicle, automatic target photographing device and method
US11434002B1 (en) Personal drone assistant
US20230033760A1 (en) Aerial Camera Device, Systems, and Methods
WO2022188151A1 (en) Image photographing method, control apparatus, movable platform, and computer storage medium
KR101599149B1 (en) An imaging device with automatic tracing for the object
CN107848622A (en) External microphone for unmanned vehicle

Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant