CN109121434B - Unmanned aerial vehicle interactive shooting system and method


Info

Publication number
CN109121434B
CN109121434B (application CN201780000407.6A)
Authority
CN
China
Prior art keywords
unmanned aerial vehicle
shooting
user
instruction
Prior art date
Legal status
Active
Application number
CN201780000407.6A
Other languages
Chinese (zh)
Other versions
CN109121434A (en)
Inventor
张景嵩
张凌
戴志宏
Current Assignee
Inverda Shanghai Electronics Co ltd
Inventec Appliances Shanghai Corp
Original Assignee
Inverda Shanghai Electronics Co ltd
Inventec Appliances Shanghai Corp
Priority date
Filing date
Publication date
Application filed by Inverda Shanghai Electronics Co ltd and Inventec Appliances Shanghai Corp
Publication of CN109121434A
Application granted
Publication of CN109121434B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

The invention provides an unmanned aerial vehicle interactive shooting system and method. The system comprises an unmanned aerial vehicle, a camera assembly, and a control assembly, with one end of the camera assembly rotatably connected to one side of the unmanned aerial vehicle. The control assembly comprises a control instruction library, an image processing module, an instruction judging module, and an instruction execution module; the instruction execution module controls the unmanned aerial vehicle and/or the camera assembly according to the found control instruction. In this scheme the camera assembly automatically acquires a captured image, the control assembly automatically analyzes the image to obtain the user action feature to be executed, and the control instruction the user requires is read according to that feature. The user can therefore control the flight of the unmanned aerial vehicle and the shooting of the camera assembly directly through body movements, so that satisfactory shots can easily be obtained on any occasion and the user experience is improved.

Description

Unmanned aerial vehicle interactive shooting system and method
Technical Field
The invention relates to the technical field of unmanned aerial vehicle control, and in particular to an unmanned aerial vehicle interactive shooting system and method that the user controls directly through body movements.
Background
Unmanned aerial vehicles, i.e., drones, are aircraft without a pilot on board, controlled by radio remote control equipment and on-board programs. Owing to the rapid development of drone technology in recent years, drones are now widely used in many fields.
Existing drone photography falls into two categories, commercial aerial photography and personal self-portrait entertainment, and both are currently controlled through a remote controller or an application on a handheld mobile device. When using a drone for personal self-portraits, however, the user must attend to both the drone and the remote controller at the same time, which is inconvenient. For example, when taking a group photo at a party, the user often has to watch the application screen on the handheld device, so a clear view of the user's face cannot be captured; or, when shooting a jumping pose, the remote controller in the user's hand prevents a satisfactory action shot. In both cases the shooting result suffers.
In addition, because of constraints on a drone's weight and size, miniaturized self-portrait drones usually have small batteries and short endurance, which limits the enjoyment of shooting and fails to meet current users' needs.
Disclosure of Invention
Aiming at the problems in the prior art, the invention provides an unmanned aerial vehicle interactive shooting system and method in which the user controls the flight of the unmanned aerial vehicle and the shooting of the camera assembly directly through body movements, thereby realizing the shooting function and improving the shooting result.
The embodiment of the invention provides an unmanned aerial vehicle interactive shooting system comprising an unmanned aerial vehicle, a camera assembly, and a control assembly, one end of the camera assembly being rotatably connected to one side of the unmanned aerial vehicle; the control assembly comprises:
a control instruction library for storing mappings between preset user action features and control instructions, where the control instructions comprise unmanned aerial vehicle control instructions and/or camera assembly control instructions;
an image processing module for processing an image captured by the camera assembly to obtain the user action feature to be executed in the captured image;
an instruction judging module for looking up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and
an instruction execution module for controlling the unmanned aerial vehicle and/or the camera assembly according to the found control instruction.
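The pipeline these four modules form (map an observed action feature to a stored control instruction, then execute it) can be sketched as follows. This is a minimal illustrative sketch, not the patented implementation: the gesture names, instruction names, and the pre-labeled frame are all assumptions introduced here for illustration.

```python
# Illustrative sketch of the control pipeline above.
# Gesture names and instruction names are hypothetical examples.

# Control instruction library: maps user action features to
# drone and/or camera control instructions.
CONTROL_INSTRUCTION_LIBRARY = {
    "palm_open": "drone_takeoff",
    "fist": "drone_land",
    "thumbs_up": "camera_start_shooting",
    "wave_left": "drone_translate_left",
}

def image_processing_module(captured_frame):
    """Extract the pending user action feature from a captured frame.
    A real system would run gesture recognition here; this stub
    assumes the frame is already labeled."""
    return captured_frame.get("action_feature")

def instruction_judging_module(action_feature):
    """Look up the control instruction for an action feature."""
    return CONTROL_INSTRUCTION_LIBRARY.get(action_feature)

def instruction_execution_module(instruction):
    """Dispatch the found instruction to the drone and/or camera."""
    if instruction is None:
        return "no_op"  # unrecognized gesture: do nothing
    return f"executing:{instruction}"

frame = {"action_feature": "thumbs_up"}
feature = image_processing_module(frame)
result = instruction_execution_module(instruction_judging_module(feature))
```

An unrecognized gesture simply falls through to a no-op, which matches the later "ignore and do not look up" behavior the patent describes for unverified inputs.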
Optionally, the camera assembly comprises a camera device and a camera bracket; the camera device is mounted in the camera bracket, and one end of the bracket is rotatably connected to one side of the unmanned aerial vehicle;
the system further comprises a display device detachably or fixedly mounted at the other end of the camera bracket.
Optionally, the display device includes an array display screen and a first display control unit; the first display control unit acquires a shot image of the camera equipment and displays the shot image through the array type display screen.
Optionally, the display device includes a dot-matrix display screen and a second display control unit; and the second display control unit acquires the control instruction searched by the instruction judgment module and controls the dot-matrix display screen to display user prompt information associated with the searched control instruction.
Optionally, one end of the camera bracket carries a projection, and one side of the unmanned aerial vehicle has a groove matching the shape of the projection; the projection of the camera bracket is embedded in the groove of the unmanned aerial vehicle;
the lower surface of the unmanned aerial vehicle is a plane and includes a region corresponding to the camera bracket; the two side faces of the groove are perpendicular to the lower surface, and the projection of the camera bracket can rotate within the groove, so that the camera bracket can rotate through the angular range between being perpendicular to the lower surface and lying flat against the camera bracket region.
Optionally, the lower surface of the unmanned aerial vehicle further includes a region corresponding to a power storage device, and this region does not overlap the camera bracket region;
the system further comprises a power storage device that can be detachably or fixedly mounted on the lower surface of the unmanned aerial vehicle, fitting against the power storage device region.
Optionally, the camera bracket includes a first support arm, a second support arm, and a third support arm. One side of the first support arm is connected to the projection and the other side carries a first slot; one end each of the second and third support arms is connected to the two ends of the first support arm, both perpendicular to it; the other end of the second support arm carries a second slot, and the other end of the third support arm carries a third slot;
one side of the display device is inserted into the first slot, and the other side is inserted into the second and third slots.
Optionally, the system further comprises a voice acquiring device, wherein the voice acquiring device is used for acquiring voice data of the user;
the control instruction library is also used for storing the mapping relation between various preset voice keywords and various control instructions;
the control component also comprises a voice processing module, and the voice processing module is used for extracting voice keywords included in the voice data of the user;
the instruction judging module is also used for searching the corresponding control instruction in the control instruction library according to the extracted voice keyword.
Optionally, the voice processing module is further configured to obtain the user's voiceprint feature from the voice data and judge whether it matches a pre-stored allowed voiceprint feature;
if the user's voiceprint feature matches an allowed voiceprint feature, the instruction judging module extracts the voice keywords included in the voice data and looks up the corresponding control instruction in the control instruction library according to the extracted keywords;
if the user's voiceprint feature does not match an allowed voiceprint feature, the instruction judging module ignores the voice data and does not extract voice keywords.
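The voiceprint gate described above can be sketched as a simple filter in front of keyword lookup. The matching function, the stored voiceprint identifiers, and the keyword vocabulary are all assumptions for illustration; a real system would compare extracted voiceprint embeddings, not string labels.

```python
# Sketch of the voiceprint gate: only voice data whose voiceprint
# matches a pre-stored allowed voiceprint is parsed for keywords.
# Voiceprint labels and keywords are hypothetical examples.

ALLOWED_VOICEPRINTS = {"owner_voiceprint"}
KEYWORD_TO_INSTRUCTION = {
    "ascend": "drone_ascend",
    "shoot": "camera_start_shooting",
}

def process_voice(voiceprint, words):
    """Return the control instruction for an utterance, or None when
    the speaker's voiceprint is not an allowed voiceprint."""
    if voiceprint not in ALLOWED_VOICEPRINTS:
        return None  # ignore: speaker not authorized, no keyword extraction
    for word in words:
        if word in KEYWORD_TO_INSTRUCTION:
            return KEYWORD_TO_INSTRUCTION[word]
    return None  # authorized speaker, but no known keyword

cmd = process_voice("owner_voiceprint", ["please", "ascend"])
ignored = process_voice("stranger_voiceprint", ["ascend"])
```

Note the ordering: the gate rejects the utterance before any keyword extraction, which is exactly the "ignore and do not extract" behavior claimed above.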
Optionally, the image processing module is further configured to acquire a physiological characteristic of the user in the captured image of the image capturing component, and determine whether the physiological characteristic of the user is a pre-stored specified physiological characteristic;
if the physiological characteristics of the user are pre-stored specified physiological characteristics, the instruction judging module searches the corresponding control instruction in the control instruction library according to the user action characteristics to be executed;
and if the physiological characteristics of the user are not pre-stored specified physiological characteristics, the instruction judging module ignores the user action characteristics to be executed and does not perform searching control instruction processing.
Optionally, the unmanned aerial vehicle control instruction includes at least one of an unmanned aerial vehicle translation instruction, an unmanned aerial vehicle rotation instruction, an unmanned aerial vehicle startup instruction, and an unmanned aerial vehicle shutdown instruction; the camera shooting assembly control instruction comprises at least one of a camera shooting assembly rotating instruction, a shooting parameter adjusting instruction, a shooting starting instruction and a shooting stopping instruction.
Optionally, the control instruction further includes:
a first mode selection instruction, which indicates the control component to enter a first mode, wherein in the first mode, the instruction judgment module searches the corresponding unmanned aerial vehicle control instruction in the control instruction library according to the user action characteristic, and controls the unmanned aerial vehicle according to the searched unmanned aerial vehicle control instruction;
and the second mode selection instruction indicates the control assembly to enter a second mode, and in the second mode, the instruction judgment module searches the corresponding camera assembly control instruction in the control instruction library according to the user action characteristic and controls the camera assembly according to the searched camera assembly control instruction.
Optionally, the control instruction further includes:
and a panoramic mode selection instruction for indicating the control component to enter a panoramic mode, wherein in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle to continuously move within an angle range of (0, alpha) at a preset speed, and alpha is a preset panoramic shooting maximum angle.
Optionally, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera assembly to take panoramic photos as follows:
the camera shooting component detects the position of a user;
the unmanned aerial vehicle rotates alpha/n to one side in the same horizontal plane by taking the position of a user as a starting point, wherein n is a first preset segmentation value, and n is greater than 1;
the camera assembly starts shooting, and the unmanned aerial vehicle rotates through alpha toward the other side at a constant preset speed in the same horizontal plane;
after the unmanned aerial vehicle stops rotating, the camera shooting assembly stops shooting.
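The rotation-based panorama above can be sketched numerically: pre-rotate alpha/n to one side of the user's bearing, then sweep the full angle alpha at constant speed while recording. All concrete values (alpha, n, the step count, the user's bearing) are assumed examples; a real controller would command flight hardware rather than return a list of headings.

```python
# Sketch of the rotation-based panorama: starting from the user's
# bearing, pre-rotate alpha/n to one side, then sweep through the
# full angle alpha at constant angular speed while the camera records.
# alpha, n, and steps are assumed example values.

def panorama_sweep(user_bearing_deg, alpha_deg, n, steps=8):
    """Return the headings the drone passes through during the sweep.

    The sweep begins alpha/n to one side of the user's bearing
    (the pre-rotation step) and covers alpha degrees in total.
    """
    start = user_bearing_deg - alpha_deg / n  # pre-rotate to one side
    return [start + alpha_deg * k / steps for k in range(steps + 1)]

# With n = 2 the sweep is symmetric about the user's bearing:
headings = panorama_sweep(user_bearing_deg=90.0, alpha_deg=180.0, n=2, steps=4)
```

With n = 2, the pre-rotation of alpha/2 centers the sweep on the user, which is presumably why the first preset segmentation value n is constrained to be greater than 1.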
Optionally, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera assembly to take panoramic photos as follows:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, generates a first sector with an angle alpha by taking the positioning point as a circle center and L/m as a radius, and places a to-be-shot object on an arc of the first sector, wherein m is a second preset segmentation value, and m is greater than 1;
the instruction execution module generates a second fan shape opposite to the first fan shape, two side edges of the second fan shape are reverse extension lines of two side edges of the first fan shape respectively, the radius of the second fan shape is (m-1) L/m, and the angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the second fan-shaped arc to the other end of the arc along the track of the arc;
after the unmanned aerial vehicle has moved to the other end of the second sector's arc, the camera assembly stops shooting.
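The sector construction above can be made concrete with coordinates: an anchor point on the camera-user axis, a first sector of radius L/m whose arc holds the subject, and a mirrored second sector of radius (m-1)L/m whose arc the drone flies along. The coordinate frame, step count, and all numeric values are assumptions for illustration.

```python
import math

# Sketch of the second-sector flight arc. The second sector opens
# opposite the first, so its bisector points away from the user
# (at angle user_angle_rad + pi from the anchor). L, m, and alpha
# are assumed example values.

def drone_arc_waypoints(anchor_xy, user_angle_rad, L, m, alpha_rad, steps=6):
    """Waypoints on the second sector's arc, radius (m-1)*L/m."""
    ax, ay = anchor_xy
    r2 = (m - 1) * L / m                 # second sector's radius
    center_dir = user_angle_rad + math.pi  # bisector, away from the user
    angles = [center_dir - alpha_rad / 2 + alpha_rad * k / steps
              for k in range(steps + 1)]
    return [(ax + r2 * math.cos(a), ay + r2 * math.sin(a)) for a in angles]

# User 6 m away along the +x axis, m = 3, 90-degree panorama:
pts = drone_arc_waypoints((0.0, 0.0), user_angle_rad=0.0,
                          L=6.0, m=3, alpha_rad=math.pi / 2)
# every waypoint lies at distance (m-1)*L/m = 4.0 from the anchor
```

Because both sectors share the anchor as center and the same angle alpha, the camera stays aimed through the anchor at the subject's arc for the whole flight, which is what makes the stitched frames form a panorama.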
Optionally, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera assembly to take panoramic photos as follows:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, generates a first isosceles triangle with a vertex angle of alpha by taking the positioning point as a vertex and L/m as the length of a waist, and locates a to-be-shot object on the bottom edge of the first isosceles triangle, wherein m is a second preset segmentation value, and m is greater than 1;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two waists of the second isosceles triangle are reverse extension lines of the two waists of the first isosceles triangle respectively, the length of the waist of the second isosceles triangle is (m-1) L/m, and the vertex angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the bottom edge of the second isosceles triangle to the other end of the bottom edge along the track of the bottom edge;
after the unmanned aerial vehicle moves to the other end of the bottom edge of the second isosceles triangle, the camera shooting assembly stops shooting.
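The isosceles-triangle variant above replaces the arc with a straight track: the drone flies the base of the second triangle, whose apex is the anchor and whose legs of length (m-1)L/m open away from the user. The endpoint math below is an illustrative sketch under that reading; the coordinate frame and numeric values are assumptions.

```python
import math

# Sketch of the triangle variant: the drone flies the straight base
# of the second isosceles triangle. The base endpoints sit at angles
# user_angle + pi +/- alpha/2 from the anchor, at leg distance
# (m-1)*L/m. L, m, and alpha are assumed example values.

def drone_line_waypoints(anchor_xy, user_angle_rad, L, m, alpha_rad, steps=4):
    """Evenly spaced points along the second triangle's base."""
    ax, ay = anchor_xy
    leg = (m - 1) * L / m                       # leg length of second triangle
    a0 = user_angle_rad + math.pi - alpha_rad / 2
    a1 = user_angle_rad + math.pi + alpha_rad / 2
    x0, y0 = ax + leg * math.cos(a0), ay + leg * math.sin(a0)
    x1, y1 = ax + leg * math.cos(a1), ay + leg * math.sin(a1)
    # linear interpolation between the two base endpoints
    return [(x0 + (x1 - x0) * k / steps, y0 + (y1 - y0) * k / steps)
            for k in range(steps + 1)]

track = drone_line_waypoints((0.0, 0.0), user_angle_rad=0.0,
                             L=6.0, m=3, alpha_rad=math.pi / 3)
```

A straight track is simpler for the flight controller than an arc, at the cost of the camera-to-anchor distance varying slightly along the base; that appears to be the trade-off between this variant and the sector variant.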
Optionally, the control instruction further includes:
and the third mode selection instruction instructs the control assembly to enter a third mode, and in the third mode, the instruction execution module controls the camera assembly to shoot after preset waiting time.
Optionally, the control instruction further includes:
and the fourth mode selection instruction indicates the control assembly to enter a fourth mode, and in the fourth mode, the instruction execution module detects the position of the user through the camera assembly and controls the unmanned aerial vehicle and the camera assembly to automatically move according to the position of the user so that the camera assembly continuously shoots the user.
Optionally, in the fourth mode, the instruction execution module obtains a position change acceleration of the user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
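The fourth-mode alarm above can be sketched by estimating the user's position-change acceleration from successive position samples with finite differences. The threshold value, sample spacing, and one-dimensional positions are assumed examples.

```python
# Sketch of the tracking-mode alarm: estimate the user's acceleration
# from three successive position samples and raise an alarm when it
# exceeds a preset threshold. The threshold is a hypothetical value.

ACCEL_THRESHOLD = 8.0  # m/s^2, assumed example preset

def user_acceleration_alarm(positions, dt):
    """positions: three successive 1-D position samples spaced dt apart.
    Returns True if the finite-difference acceleration estimate
    exceeds the preset threshold."""
    v0 = (positions[1] - positions[0]) / dt   # velocity over first interval
    v1 = (positions[2] - positions[1]) / dt   # velocity over second interval
    accel = abs(v1 - v0) / dt                 # acceleration estimate
    return accel > ACCEL_THRESHOLD

# A sudden sprint or fall shows up as a sharp jump between samples:
alarm = user_acceleration_alarm([0.0, 0.1, 1.2], dt=0.1)
steady = user_acceleration_alarm([0.0, 0.1, 0.2], dt=0.1)
```

Such a check could let the system flag a fall or a sudden sprint while continuously tracking the user, consistent with the alarm behavior described above.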
Optionally, the unmanned aerial vehicle is further provided with at least one distance sensor, and the control assembly further comprises an obstacle calculation module, wherein the obstacle calculation module is used for acquiring obstacle detection data of the distance sensor;
the control command to be executed comprises an unmanned aerial vehicle moving command, and the obstacle calculation module judges that obstacles in the moving direction in the unmanned aerial vehicle moving command and the distance between unmanned aerial vehicles are smaller than a preset safety threshold value, cancels the unmanned aerial vehicle moving command and sends out limit value reminding signals to the outside.
The invention also provides an unmanned aerial vehicle interactive shooting method, which adopts the unmanned aerial vehicle interactive shooting system, and the method comprises the following steps:
the camera shooting component acquires a shot image;
the image processing module processes the shot image of the camera shooting assembly to acquire the user action characteristics to be executed in the shot image;
the instruction judging module searches the corresponding control instruction in the control instruction library according to the user action characteristic to be executed; and
and the instruction execution module controls the unmanned aerial vehicle and/or the camera shooting assembly according to the searched control instruction.
Optionally, the control instruction further includes a panoramic mode selection instruction, which instructs the control component to enter a panoramic mode, and in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera component to take panoramic photos in the following steps:
the camera shooting component detects the position of a user;
the unmanned aerial vehicle rotates alpha/n to one side in the same horizontal plane by taking the position of a user as a starting point, wherein n is a first preset segmentation value, n is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the camera assembly starts shooting, and the unmanned aerial vehicle rotates through alpha toward the other side at a constant preset speed in the same horizontal plane;
after the unmanned aerial vehicle stops rotating, the camera shooting assembly stops shooting.
Optionally, the control instruction further includes a panoramic mode selection instruction, which instructs the control component to enter a panoramic mode, and in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera component to take panoramic photos in the following manner:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, generates a first sector with an angle alpha by taking the positioning point as a circle center and L/m as a radius, and places a to-be-shot object on a circular arc of the first sector, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second fan shape opposite to the first fan shape, two side edges of the second fan shape are reverse extension lines of two side edges of the first fan shape respectively, the radius of the second fan shape is (m-1) L/m, and the angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the second fan-shaped arc to the other end of the arc along the track of the arc;
after the unmanned aerial vehicle has moved to the other end of the second sector's arc, the camera assembly stops shooting.
Optionally, the control instruction further includes a panoramic mode selection instruction, which instructs the control component to enter a panoramic mode, and in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera component to take panoramic photos in the following manner:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, generates a first isosceles triangle with a vertex angle of alpha by taking the positioning point as a vertex and L/m as the length of a waist, and a to-be-shot object is positioned at the bottom of the first isosceles triangle, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two waists of the second isosceles triangle are reverse extension lines of the two waists of the first isosceles triangle respectively, the length of the waist of the second isosceles triangle is (m-1) L/m, and the vertex angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the bottom edge of the second isosceles triangle to the other end of the bottom edge along the track of the bottom edge;
after the unmanned aerial vehicle moves to the other end of the bottom edge of the second isosceles triangle, the camera shooting assembly stops shooting.
The unmanned aerial vehicle interactive shooting system and the unmanned aerial vehicle interactive shooting method have the following advantages:
the invention provides a technical scheme that a user can directly control through actions, a camera shooting component automatically acquires a shot image, a control component automatically analyzes the shot image to obtain the action characteristics of the user to be executed, and a control instruction required by the user is read according to the action characteristics of the user to be executed, so that the user can directly perform flight control on an unmanned aerial vehicle and perform shooting control on the camera shooting component through actions, thereby realizing a shooting function, easily realizing shooting meeting requirements in any occasions, and improving user experience.
Drawings
Other features, objects and advantages of the present invention will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, with reference to the accompanying drawings.
Fig. 1 is a block diagram of an unmanned aerial vehicle interactive shooting system according to an embodiment of the present invention;
fig. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive shooting system using an array display screen according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of an unmanned aerial vehicle interactive shooting system using a dot matrix display screen according to an embodiment of the present invention;
fig. 4 is a schematic diagram of adjusting the position of a drone according to an embodiment of the present invention;
FIG. 5 is a schematic view of an embodiment of the present invention for adjusting the angle of a camera module;
FIGS. 6-7 are schematic diagrams of gesture control according to an embodiment of the invention;
FIG. 8 is a schematic structural diagram of an external display device according to an embodiment of the present invention;
FIG. 9 is a schematic structural diagram of a display device according to an embodiment of the present invention when it is folded;
fig. 10 is a bottom schematic view of a drone of an embodiment of the present invention when not in use;
fig. 11 is a schematic structural view of an electric storage device according to an embodiment of the present invention;
fig. 12 is a schematic diagram of the charging state of the drone according to an embodiment of the invention;
fig. 13 is a flow chart of a charging process for a drone of an embodiment of the present invention;
FIG. 14 is a schematic diagram of the control of the position of a drone by voice in accordance with an embodiment of the present invention;
fig. 15 is a schematic structural diagram of an unmanned aerial vehicle interactive shooting system with voice control added according to an embodiment of the present invention;
FIG. 16 is a flow diagram of user voiceprint authentication in accordance with one embodiment of the invention;
FIG. 17 is a flow chart of user physiological characteristic verification according to an embodiment of the present invention;
FIGS. 18-20 are flowcharts of an unmanned aerial vehicle interactive shooting method according to an embodiment of the present invention;
fig. 21 is a flowchart of panorama shooting according to an embodiment of the present invention;
fig. 22 is a schematic view of rotation of the drone during panoramic photography in accordance with an embodiment of the present invention;
fig. 23 is a schematic diagram of the unmanned aerial vehicle moving along an arc track during panoramic shooting according to an embodiment of the present invention;
fig. 24 is a schematic diagram of the unmanned aerial vehicle moving along a straight track in panoramic photographing according to an embodiment of the present invention;
fig. 25 is a flow chart of an unmanned aerial vehicle automatically tracking a user's location in accordance with an embodiment of the invention;
fig. 26 is a flow chart of automatic obstacle avoidance for an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and thus their repetitive description will be omitted.
As shown in fig. 1, an embodiment of the present invention provides an unmanned aerial vehicle interactive shooting system, where the system includes an unmanned aerial vehicle 200, a camera assembly 300, and a control assembly 100, and one end of the camera assembly 300 is rotatably connected to one side of the unmanned aerial vehicle 200; wherein the control assembly 100 comprises: the control instruction library 110 is used for storing mapping relations between various preset user action characteristics and various control instructions, wherein the control instructions comprise unmanned aerial vehicle control instructions and/or camera shooting assembly control instructions; the image processing module 120 is configured to process a captured image of the camera assembly 300 to obtain a user action feature to be executed in the captured image; the instruction determining module 130 is configured to search, according to the user action feature to be executed, a corresponding control instruction in the control instruction library; and an instruction execution module 140, configured to control the drone 200 and/or the camera module 300 according to the found control instruction.
The user action feature is preferably a gesture: different gestures map to different control instructions. In practice other action features may also be used, such as eye movement, nodding, head shaking, or smiling; for example, a photo may be taken automatically whenever the camera captures the user smiling. The following embodiments describe gesture-based control, but it should be understood that other user action features also fall within the scope of the invention.
Fig. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive shooting system according to an embodiment of the present invention. The unmanned aerial vehicle 200 is shown, a camera shooting assembly 300 is rotatably mounted on one side of the unmanned aerial vehicle 200, the camera shooting assembly 300 comprises a camera shooting device 320 and a camera shooting bracket 310, the camera shooting device 320 is arranged in the camera shooting bracket 310, and one end of the camera shooting bracket 310 is rotatably connected to one side of the unmanned aerial vehicle 200; further, the system may further include a display device 330, and the display device 330 may be detachably or fixedly mounted at the other end of the camera support 310.
In order to facilitate the control component 100 to control the drone 200 and/or the camera component 300, the control component 100 may be disposed inside the drone 200, or on the surface of the drone 200, or at other positions, and all of them are within the scope of the present invention. The instruction execution module 140 may directly communicate with the controller of the drone 200, or wirelessly communicate with the camera module 300, so as to transmit and feed back control instructions.
The display device 330 can display the content for the user to view according to the requirement, and two setting modes of the display device 330 are shown in fig. 2 and fig. 3.
The display device 330 shown in fig. 2 includes an array display screen and a first display control unit therein; the first display control unit acquires a shot image of the image pickup apparatus 320 and displays the shot image through the array display screen. The array display screen may include, but is not limited to, a color LCD screen through which a user may view a self-portrait picture in real time.
The display device 330 shown in fig. 3 includes a dot matrix display screen and a second display control unit therein; the second display control unit obtains the control instruction found by the instruction determination module 130, and controls the dot-matrix display screen to display the user prompt information associated with the found control instruction. The dot matrix display screen may include, but is not limited to, a dot matrix LED screen, and a user may prepare for self-timer shooting and photograph through a lamp arrangement of LEDs.
For example, the user prompt information may be a self-timer countdown: when shooting starts in five seconds, the dot-matrix display screen displays 5, 4, 3, 2 and 1 in sequence, and the user can prepare for the self-timer according to the countdown. The user prompt may also indicate the current shooting mode, for example, when the screen displays 2, the system is currently in the second mode, and so on.
By adopting the unmanned aerial vehicle interactive shooting system, the shooting component automatically acquires the shot image, the control component automatically analyzes the shot image to obtain the user action characteristics to be executed, and the control instruction required by the user is interpreted according to the user action characteristics to be executed, so that the user can realize the control of the unmanned aerial vehicle 200 and/or the shooting component 300.
When controlling the drone 200 and/or the camera assembly 300, the drone control instruction may include at least one of a drone translation instruction, a drone rotation instruction, a drone startup instruction, and a drone shutdown instruction; the camera assembly control instructions may include at least one of camera assembly rotation instructions, shooting parameter adjustment instructions, shooting start instructions, and shooting stop instructions. The shooting parameters that can be adjusted here may include focusing, fill-in light, image size, and the like at the time of shooting.
Fig. 4 is a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention. The following steps can be adopted for specifically adjusting the position of the unmanned aerial vehicle:
a. when the unmanned aerial vehicle 200 starts and takes off, hovering at an initial position;
b. the user 400 observes the self-timer angle from the display device 330, finds that the portrait is at a left position (the portrait shown by a dotted line in fig. 4) in the display device 330, and the user 400 controls the unmanned aerial vehicle to move to the left through gestures (from a state of a dotted line of the hand of the user 400 in fig. 4 to a state of a solid line) until the portrait is at a picture center position (the portrait shown by a solid line in fig. 4);
c. when the photographing conditions are met, the user 400 performs photographing through gesture control.
Fig. 5 is a schematic diagram illustrating an angle adjustment of the camera module 300 according to an embodiment of the invention. Specifically, the following steps may be employed to adjust the camera module 300:
a. after the unmanned aerial vehicle 200 is started and takes off, hovering at an initial position;
b. the user 400 observes the self-timer angle from the display device 330, finds that the unmanned aerial vehicle 200 is too high and the portrait is located at a lower position in the picture (the portrait shown by a dotted line in fig. 5), and controls the camera assembly 300 to turn downwards through a gesture (from the dotted-line state of the hand of the user 400 to the solid-line state in fig. 5), so as to drive the image pickup apparatus 320 to turn downwards until the portrait is at the picture center position (the portrait shown by the solid line in fig. 5);
c. when the photographing conditions are met, the user 400 performs photographing through gesture control.
In addition, the mode of controlling the drone 200 and the camera assembly 300 may also be flexibly selected, for example, in fig. 5, when the portrait is in a downward position, the adjustment may also be performed in a mode of lowering the height of the drone 200 so that the portrait is in a picture centered position.
Specifically, the adjustment modes of the drone 200 and the camera module 300 may be distinguished by using different preset gesture instructions. That is, when a gesture is known, it can be known whether the gesture specifically controls the drone 200 or the camera assembly 300, and it can also be known what the gesture specifically controls the action of the drone 200 or the camera assembly 300.
Fig. 6 and fig. 7 show a mapping relationship between user action characteristics and control instructions. The control instructions corresponding to different gestures are listed in table 1 below.
TABLE 1 gesture and control instruction mapping table
[Table 1 appears as an image in the original publication; its entries are not reproducible from the text.]
An example of only one type of gesture control is given in fig. 6 and 7. In practical application, a user can also customize the mapping relation between various gestures and different control instructions and modify the gestures into gestures according with the use habits of the user. And other motion features may also be added, e.g., a user nodding his head to indicate confirmation of a shot, a user shaking his head to indicate deletion of a previously shot image, etc.
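The user-customizable mapping described in the preceding paragraph could look like the following sketch; the gesture names and instruction strings are hypothetical:

```python
DEFAULT_MAPPING = {
    "palm_move_left": "DRONE_TRANSLATE_LEFT",
    "palm_move_down": "DRONE_TRANSLATE_DOWN",
}

def remap(mapping, action_feature, instruction):
    """Return a copy of the mapping with one entry overridden or added,
    leaving the default library untouched."""
    updated = dict(mapping)
    updated[action_feature] = instruction
    return updated

# Add further action features, e.g. a nod to confirm a shot and
# a head shake to delete the previously shot image.
custom = remap(DEFAULT_MAPPING, "nod", "CONFIRM_SHOT")
custom = remap(custom, "head_shake", "DELETE_LAST_IMAGE")
```

Copy-on-write remapping keeps the preset library intact while letting each user install gestures matching their own habits.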
Two arrangements of the camera assembly 300 are shown in fig. 8 and fig. 9. As shown in fig. 8, the camera assembly 300 employs an external display device 340; the external display device 340 may be the user's mobile terminal, and the external display device 340 and the control assembly 100 may communicate through wireless communication or a data line such as USB. One end of the camera bracket 310 is provided with a protrusion 311, and one side of the unmanned aerial vehicle 200 is provided with a groove 210 matching the protrusion in shape; the protrusion 311 of the camera bracket is embedded in the groove 210 of the unmanned aerial vehicle.
In the setting manner shown in fig. 8, the camera bracket 310 includes a first arm 312, a second arm 313 and a third arm 314, one side of the first arm 312 is connected to the protrusion 311, and a first slot is disposed on the other side of the first arm 312, one end of the second arm 313 and one end of the third arm 314 are respectively connected to two ends of the first arm 312, the second arm 313 and the third arm 314 are perpendicular to the first arm 312, a second slot is disposed on the other end of the second arm 313, and a third slot is disposed on the other end of the third arm 314. The external display device 340 may be placed in the camera bracket 310, the upper end of the external display device 340 is inserted into the first slot, and the lower end of the external display device 340 is inserted into the second slot and the third slot, so that a stable and conveniently detachable connection between the external display device 340 and the camera bracket 310 is formed.
In the embodiment of fig. 9, the display device 330 is built in. Likewise, in this arrangement, the camera bracket 310 rotates through the engagement of the protrusion 311 with the groove 210, and the display device 330 rotates together with the camera bracket 310. The lower surface of the unmanned aerial vehicle 200 is a plane and includes a camera bracket corresponding area; the two side faces of the groove 210 are perpendicular to the lower surface of the unmanned aerial vehicle 200, so that the protrusion 311 can rise, descend and rotate in the groove 210, allowing the camera bracket 310 to rotate within the angle range between being perpendicular to the lower surface of the unmanned aerial vehicle 200 and lying flat against the camera bracket corresponding area. As described above, during use the camera assembly 300 can be adjusted within the desired angle range to obtain a better shooting effect. When use is finished, or when the battery is exhausted and the drone cannot be used, the camera bracket 310 can be folded flat against the camera bracket corresponding area for convenient carrying.
In addition, because the unmanned aerial vehicle 200 is generally small in size, with a small battery capacity and a short endurance time, the embodiment of the invention further provides a convenient charging mode to overcome this problem. As shown in fig. 10 to 12, the lower surface of the unmanned aerial vehicle 200 further includes an electric storage device corresponding area, which does not intersect the camera bracket corresponding area; the system further comprises a power storage device 500, which is detachably or fixedly installed on the lower surface of the unmanned aerial vehicle 200 and fits against the electric storage device corresponding area.
Fig. 13 is a charging flowchart for this structure. When the display screen is an external one, the connection between the external display screen and the unmanned aerial vehicle is first disconnected; the external display screen may be detached, or left on the camera bracket 310 and folded together with it. If the power storage device 500 is then inserted, charging starts; otherwise the system simply powers off. To keep the load small in flight, the power storage device 500 is detached when the drone 200 is powered on and in use; when the drone 200 is not in use or its power is exhausted, the camera bracket 310 can be folded and the power storage device 500 installed in the electric storage device corresponding area, where it charges the drone's rechargeable battery. In the folded charging state the drone 200 occupies a small volume and is easy to carry, and it can be used again once charging is complete.
As shown in fig. 14 and fig. 15, an embodiment of the present invention may further include a voice acquiring apparatus 600, where the voice acquiring apparatus 600 is configured to acquire voice data of a user; the control instruction library 110 is further configured to store mapping relationships between preset various voice keywords and various control instructions; the control component 100 further includes a voice processing module 150, where the voice processing module 150 is configured to extract a voice keyword included in the voice data of the user; the instruction determination module 130 is further configured to search the corresponding control instruction in the control instruction library according to the extracted voice keyword.
By providing the voice acquiring apparatus 600, this embodiment can also realize shooting control by the user through voice. For example, the keyword "power on" is set to turn on the camera shooting assembly 300, and when it is detected that the word "power on" exists in the voice data of the user, the camera shooting assembly 300 is automatically turned on, or when it is detected that "unmanned aerial vehicle" and "left movement" exist in the voice data of the user, the unmanned aerial vehicle is automatically controlled to move left. The voice control is more convenient and fast, is not restricted by other conditions, and can be applied to any occasions without influencing the shooting effect of the user.
Further, as shown in fig. 16, it is necessary to distinguish different sounds considering that the control assembly 100 may receive sounds of other outside people or noises in the environment when the user uses the voice control. The voice processing module is also used for acquiring the voiceprint characteristics of the user in the voice data of the user and judging whether the voiceprint characteristics of the user are the pre-stored specified voiceprint characteristics;
If the voiceprint feature matches a preset allowed voiceprint feature, the voice data is that of the specified user and control can be executed according to it: the instruction judging module extracts the voice keywords included in the user's voice data and searches the control instruction library for the corresponding control instruction. If the voiceprint feature does not match a preset allowed voiceprint feature, the voice data is not that of the specified user and must be screened out; that is, the instruction judging module ignores it and does not extract voice keywords.
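The voiceprint screening and keyword lookup described above can be sketched as follows; the voiceprint identifiers and keyword library are illustrative assumptions:

```python
ALLOWED_VOICEPRINTS = {"specified_user_voiceprint"}

VOICE_KEYWORD_LIBRARY = {
    "power on": "CAMERA_POWER_ON",
    "move left": "DRONE_TRANSLATE_LEFT",
}

def process_voice(voiceprint, transcript):
    """Screen out speech from non-allowed speakers; otherwise search the
    keyword library for a matching control instruction."""
    if voiceprint not in ALLOWED_VOICEPRINTS:
        return None  # not the specified user: ignore, extract no keywords
    for keyword, instruction in VOICE_KEYWORD_LIBRARY.items():
        if keyword in transcript:
            return instruction
    return None
```

Checking the voiceprint before keyword extraction means bystander speech and environmental noise never reach the instruction lookup at all.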
Similarly, as shown in fig. 17, the image capturing assembly 300 may further acquire action characteristics of other people than the specified user, and in order to avoid confusion, the image processing module is further configured to acquire physiological characteristics of the user in the captured image of the image capturing assembly and determine whether the physiological characteristics of the user are pre-stored specified physiological characteristics;
if the physiological characteristics of the user are pre-stored specified physiological characteristics, the obtained action of the specified user is indicated, and the instruction judging module searches the corresponding control instruction in the control instruction library according to the action characteristics of the user to be executed; and if the physiological characteristics of the user are not pre-stored specified physiological characteristics, the instruction judging module ignores the user action characteristics to be executed and does not perform searching control instruction processing.
Here, the obtaining of the physiological characteristics of the user may refer to the facial contour of the user, the hair color of the user, the hair length, the skin color and the lip color of the user, and may also be a combination of multiple physiological characteristics for more accurate identification, and the like, and all of them are within the scope of the present invention.
As shown in fig. 18, an embodiment of the present invention further provides an unmanned aerial vehicle interactive shooting method, where the unmanned aerial vehicle interactive shooting system is adopted, and the method includes the following steps:
s100: the camera shooting component acquires a shot image;
s200: the image processing module processes the shot image of the camera shooting assembly to acquire the user action characteristics to be executed in the shot image;
s300: the instruction judging module searches the corresponding control instruction in the control instruction library according to the user action characteristic to be executed; and
s400: and the instruction execution module controls the unmanned aerial vehicle and/or the camera shooting assembly according to the searched control instruction.
When the control instruction may be an unmanned aerial vehicle control instruction, a camera assembly control instruction or another effective instruction, the determination process may follow the flow shown in fig. 19 to judge and control in sequence, but is not limited to this manner. Other orders are possible, for example first judging whether the instruction is a camera assembly control instruction and then whether it is an unmanned aerial vehicle control instruction, and all such variants belong to the protection scope of the invention.
As shown in fig. 20, a specific embodiment of the unmanned aerial vehicle interactive shooting method is shown. The type of the display screen is judged firstly, and if the display screen is an external display screen, the control assembly is connected with the external display screen in a wireless communication mode, so that preparation is made for subsequent control. And then searching a corresponding control instruction according to the corresponding relation between the gesture and the control instruction, and executing control. As described above, the motion characteristics of the present invention are not limited to gestures, and the object of the present invention can be achieved by different motions of other body parts.
As described above, in order to distinguish between drone control instructions and camera assembly control instructions, different action characteristics may be used. Alternatively, different control modes can be adopted. For example, the control instructions may further include a first mode selection instruction and a second mode selection instruction, instructing the control assembly to enter a first mode and a second mode, respectively.
After entering the first mode, received user action characteristics are by default treated as drone control instructions: the instruction judging module searches the control instruction library for the corresponding unmanned aerial vehicle control instruction according to the user action characteristic, controls the unmanned aerial vehicle accordingly, and no longer executes camera assembly control instructions. After entering the second mode, received user action characteristics are by default treated as camera assembly control instructions: the instruction judging module searches the control instruction library for the corresponding camera assembly control instruction according to the user action characteristic, controls the camera assembly accordingly, and no longer executes unmanned aerial vehicle control instructions.
In this way, the number of action features the user must set can be reduced. For example, with the same gesture of an unfolded palm moving downwards, the drone is controlled to move downwards in the first mode, while the camera assembly is controlled to turn downwards in the second mode. Only one specific embodiment is given here, without limiting the protection scope of the invention.
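The mode-dependent interpretation of a single gesture can be sketched as follows; the library contents and mode names are illustrative assumptions:

```python
# One gesture, two meanings: the active mode selects which library is consulted.
DRONE_LIBRARY = {"palm_move_down": "DRONE_MOVE_DOWN"}
CAMERA_LIBRARY = {"palm_move_down": "CAMERA_TILT_DOWN"}

def resolve(mode, gesture):
    """In the first mode only drone instructions are looked up;
    in the second mode only camera assembly instructions."""
    library = DRONE_LIBRARY if mode == "first" else CAMERA_LIBRARY
    return library.get(gesture)
```

The same downward-palm gesture thus resolves to a drone movement in the first mode and a camera tilt in the second, halving the gesture vocabulary the user must learn.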
Further, because of the stability and controllability of the unmanned aerial vehicle in flight, it has some irreplaceable advantages over hand-held shooting: for example, the drone can take pictures with little shake, so the anti-shake requirement on the camera equipment is lower. When a user holds a camera to take a panoramic picture, shake or other interference often prevents an ideal panoramic photo; a drone can overcome this problem.
As shown in fig. 21 and 22, the control instruction further includes a panoramic mode selection instruction, which instructs the control component to enter a panoramic mode, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle to continuously rotate within an angle range of (0, α) at a preset speed, and α is a preset maximum panoramic shooting angle. Optionally, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera assembly to take panoramic photos as follows:
the camera assembly detects the position of the user 400;
the unmanned aerial vehicle 200 rotates by α/n to one side in the same horizontal plane, taking the position of the user 400 as the starting point; this is the unmanned aerial vehicle positioning stage, during which no shooting is performed, and n is a first preset segmentation value;
the camera assembly starts shooting, and the unmanned aerial vehicle 200 rotates through the angle α to the other side at a constant preset speed in the same horizontal plane, obtaining a panoramic picture of angle α in which the user is located at the specified position;
after the unmanned aerial vehicle 200 stops rotating, the camera shooting assembly stops shooting.
When n is 2, the user is positioned at the center of the panoramic photo. In practical application, the angle α can be set as required, and the position of the user in the panoramic photo can also be adjusted: for example, to place the user towards the left, the drone can first rotate α/4 to one side, and so on. The shooting mode is thus very flexible, the success rate of panoramic shooting is high, and the resulting photos are better.
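The positioning-then-sweep plan above reduces to simple arithmetic; the following sketch assumes angles in degrees:

```python
def panorama_rotation_plan(alpha_deg, n):
    """Return (pre_rotation, sweep) for the panoramic mode: rotate
    alpha/n to one side without shooting (positioning stage), then
    sweep through alpha to the other side while shooting. With n = 2
    the user ends up centred in the panorama."""
    pre_rotation = alpha_deg / n
    return pre_rotation, alpha_deg

pre, sweep = panorama_rotation_plan(180.0, 2)   # user centred
pre_left, _ = panorama_rotation_plan(180.0, 4)  # user towards the left
```

Choosing a larger n shifts the user away from the panorama's centre, as in the α/4 example above.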
As shown in fig. 23, in another panoramic shooting mode, in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera shooting assembly to take panoramic photos as follows:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dotted line connecting the user 400 and the drone 200 in the figure;
the instruction execution module selects a positioning point between the camera shooting assembly and the user 400, generates a first sector 701 with an angle alpha by taking the positioning point as a circle center and L/m as a radius, and places the object to be shot on an arc of the first sector 701, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second sector 702 opposite to the first sector 701, two side edges of the second sector 702 are respectively reverse extension lines of two side edges of the first sector 701, a radius of the second sector 702 is (m-1) L/m, and an angle is alpha;
the camera module starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 to the other end of the arc along the trajectory of the arc;
after the unmanned aerial vehicle 200 moves to the other end of the arc of the second sector 702, the camera shooting assembly stops shooting.
As shown in fig. 24, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to take panoramic photographs in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dotted line connecting the user 400 and the drone 200 in the figure;
the instruction execution module selects a positioning point between the camera shooting assembly and the user 400, generates a first isosceles triangle 703 with a vertex angle of alpha by taking the positioning point as a vertex and taking L/m as the length of a waist, and a to-be-shot object is positioned on the bottom edge of the first isosceles triangle 703, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second isosceles triangle 704 opposite to the first isosceles triangle 703, wherein two waists of the second isosceles triangle 704 are reverse extension lines of the two waists of the first isosceles triangle 703 respectively, the length of the waist of the second isosceles triangle 704 is (m-1) L/m, and the vertex angle is alpha;
the camera shooting component starts shooting, and the unmanned aerial vehicle 200 moves from one end of the bottom side of the second isosceles triangle 704 to the other end of the bottom side along the track of the bottom side;
after the unmanned aerial vehicle 200 moves to the other end of the bottom side of the second isosceles triangle 704, the camera shooting assembly stops shooting.
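The geometry shared by the sector-based and triangle-based trajectories above can be sketched as follows; the coordinate frame, anchor point and heading are illustrative assumptions:

```python
import math

def panorama_arc(L, m, alpha_deg, anchor=(0.0, 0.0), heading_deg=0.0):
    """Geometry of the two opposing sectors: the positioning point (anchor)
    lies at distance L/m from the subject along the camera axis, and the
    drone flies the arc of the second sector, whose radius is (m-1)*L/m."""
    r2 = (m - 1) * L / m
    half = math.radians(alpha_deg) / 2.0
    # The second sector opens away from the subject (heading + 180 degrees).
    back = math.radians(heading_deg) + math.pi
    p_start = (anchor[0] + r2 * math.cos(back - half),
               anchor[1] + r2 * math.sin(back - half))
    p_end = (anchor[0] + r2 * math.cos(back + half),
             anchor[1] + r2 * math.sin(back + half))
    return r2, p_start, p_end

# With L = 6, m = 3, alpha = 120 degrees, the flight arc has radius 4.
r2, start, end = panorama_arc(L=6.0, m=3, alpha_deg=120.0)
```

For the isosceles-triangle variant of fig. 24, the same two endpoints apply; the drone simply flies the straight base between them instead of the arc.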
The shooting trajectories in fig. 23 and fig. 24 can be selected as required: a panoramic photo may be formed by continuous shooting, or a plurality of photos may be taken and combined into one panoramic photo. Different choices of m and α yield different shooting ranges, making the method flexible. The drone moves along the pre-computed trajectory so that the camera assembly obtains different shooting positions and shooting angles.
When the unmanned aerial vehicle is used for shooting, a certain preparation time is sometimes needed, for example, a shooting countdown can be set, that is, the control instruction can further include a third mode selection instruction for instructing the control component to enter a third mode, and in the third mode, the instruction execution module controls the camera shooting component to shoot after a preset waiting time. In the countdown process, the countdown time can be displayed through the display device, and the remaining preparation time can also be indicated through other display lamps or prompt tones.
As shown in fig. 25, the drone of the present invention can also implement an automatic user-tracking shooting function. The control instructions may further include a fourth mode selection instruction, instructing the control assembly to enter a fourth mode. In the fourth mode, the instruction execution module detects the position of the user through the camera assembly and controls the drone and the camera assembly to move automatically according to that position, so that the camera assembly continuously shoots the user; shooting thus tracks the user automatically, ensuring that the user always remains within the shooting range.
Optionally, in the fourth mode, the instruction execution module may further obtain the acceleration of the user's position change and, when it exceeds a preset acceleration threshold, send out an alarm signal. This serves two purposes. First, when the camera assembly cannot capture the user's position, the alarm prompts the user to move back into the camera assembly's shooting range. Second, it enables fall detection: when the user falls accidentally or feels unwell and dizzy, an alarm signal is sent automatically, and if the user does not cancel it within a certain time, the mobile terminals of other users associated with the user can be notified or an emergency call dialed. High-quality shooting is thus provided while the user's safety during use is ensured.
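The acceleration-threshold alarm described above can be sketched as a simple check with a cancellation window; the threshold value, timeout and return strings are illustrative:

```python
def check_alarm(position_accel, threshold):
    """Raise an alarm when the user's position-change acceleration
    exceeds the preset threshold (possible fall or lost tracking)."""
    return "ALARM" if position_accel > threshold else "OK"

def escalate(alarm_state, seconds_since_alarm, cancel_timeout=30):
    """If an alarm is not cancelled within the timeout, escalate to
    notifying associated contacts or dialing an emergency call."""
    if alarm_state == "ALARM" and seconds_since_alarm >= cancel_timeout:
        return "NOTIFY_CONTACTS"
    return alarm_state
```

Keeping the threshold check and the escalation step separate lets the user cancel a false alarm before any external notification goes out.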
As shown in fig. 26, in order to further ensure the safety of the drone itself, considering that the user may misjudge distances or mis-operate while controlling the drone's actions and cause a collision with an obstacle, at least one distance sensor may be disposed on the drone, and the control assembly further includes an obstacle calculation module for acquiring the obstacle detection data of the distance sensor;
when the control instruction to be executed is an unmanned aerial vehicle movement instruction and the obstacle calculation module judges that the distance between the unmanned aerial vehicle and an obstacle in the direction of movement is smaller than a preset safety threshold, it cancels the movement instruction and sends out a limit warning signal. That is, after learning of nearby obstacles through the distance sensor, the obstacle calculation module predicts, according to the direction indicated by the control instruction, whether executing the movement instruction would cause a collision; if so, the movement instruction is not executed, and the user is reminded that the distance has fallen below the limit and a collision would occur.
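The obstacle check described above can be sketched as follows; the direction keys, distances and threshold are illustrative assumptions:

```python
def safe_to_move(obstacle_distances, direction, safety_threshold):
    """Cancel a drone movement instruction when the nearest obstacle in the
    commanded direction is closer than the preset safety threshold.
    obstacle_distances maps directions to sensed distances in metres."""
    distance = obstacle_distances.get(direction)
    if distance is not None and distance < safety_threshold:
        return False, "LIMIT_WARNING"  # cancel instruction, remind the user
    return True, None

# A wall 0.4 m to the left blocks a leftward move; forward remains clear.
ok_left, warn_left = safe_to_move({"left": 0.4, "forward": 3.0}, "left", 1.0)
ok_fwd, warn_fwd = safe_to_move({"left": 0.4, "forward": 3.0}, "forward", 1.0)
```

Running this prediction before, rather than after, executing the movement instruction is what makes the scheme suitable for cluttered indoor spaces.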
This embodiment is particularly suitable for indoor shooting. Indoor space is constrained by walls and ceilings and contains many other obstacles such as furniture and decorations; with reliable calculation and danger prediction, this approach ensures the safety of the drone when shooting indoors. It likewise applies to outdoor shooting: in open outdoor spaces the drone may move quickly and the user cannot always foresee danger in time, and this approach then ensures the stability and reliability of the interactive shooting process.
Compared with the prior art, the invention provides a technical scheme in which the user controls directly through actions: the camera assembly automatically acquires a shot image, the control assembly automatically analyzes it to obtain the user action characteristic to be executed and interprets the control instruction the user requires, so that the user can directly perform flight control of the unmanned aerial vehicle and shooting control of the camera assembly through actions. A shot meeting the user's requirements can thus be easily obtained on any occasion, improving user experience.
The foregoing is a more detailed description of the invention in connection with specific preferred embodiments and it is not intended that the invention be limited to these specific details. For those skilled in the art to which the invention pertains, several simple deductions or substitutions can be made without departing from the spirit of the invention, and all shall be considered as belonging to the protection scope of the invention.

Claims (17)

1. An unmanned aerial vehicle interactive shooting system is characterized by comprising an unmanned aerial vehicle, a camera shooting assembly and a control assembly, wherein one end of the camera shooting assembly is rotatably connected to one side of the unmanned aerial vehicle; wherein the control assembly comprises:
the control instruction library is used for storing mapping relations between various preset user action characteristics and various control instructions, and the control instructions comprise unmanned aerial vehicle control instructions and/or camera shooting assembly control instructions;
the image processing module is used for processing the shot image of the camera shooting assembly so as to obtain the user action characteristics to be executed in the shot image;
the instruction judging module is used for searching the corresponding control instruction in the control instruction library according to the user action characteristic to be executed; and
the command execution module is used for controlling the unmanned aerial vehicle and/or the camera shooting assembly according to the searched control command;
the control instructions further comprise:
a panoramic mode selection instruction, which indicates the control component to enter a panoramic mode, wherein in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle to continuously move within an angle range of (0, alpha) at a preset speed, and alpha is a preset panoramic shooting maximum angle;
in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera shooting assembly to take panoramic photos in the following modes:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, and generates a first sector with an angle alpha by taking the positioning point as a circle center and L/m as a radius, so that the object to be shot is located on the arc of the first sector, wherein m is a second preset segmentation value, and m is greater than 1;
the instruction execution module generates a second fan shape opposite to the first fan shape, two side edges of the second fan shape are reverse extension lines of two side edges of the first fan shape respectively, the radius of the second fan shape is (m-1) L/m, and the angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the second fan-shaped arc to the other end of the arc along the track of the arc;
after the unmanned aerial vehicle moves to the other end of the second fan-shaped arc, the camera shooting assembly stops shooting; or,
in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera shooting assembly to take panoramic photos in the following modes:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, and generates a first isosceles triangle with a vertex angle of alpha by taking the positioning point as the vertex and L/m as the length of each waist, so that the object to be shot is located on the bottom edge of the first isosceles triangle, wherein m is a second preset segmentation value, and m is greater than 1;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two waists of the second isosceles triangle are reverse extension lines of the two waists of the first isosceles triangle respectively, the length of the waist of the second isosceles triangle is (m-1) L/m, and the vertex angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the bottom edge of the second isosceles triangle to the other end of the bottom edge along the track of the bottom edge;
after the unmanned aerial vehicle moves to the other end of the bottom edge of the second isosceles triangle, the camera shooting assembly stops shooting.
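The sector-based panoramic path recited in claim 1 can be sketched in code. The following is an illustrative reconstruction under stated assumptions, not the patented implementation: function and variable names are invented, positions are simplified to 2-D coordinates, and the path is emitted as discrete waypoints on the second sector's arc.

```python
import math

def panoramic_arc_waypoints(subject, camera, m, alpha, steps=20):
    """Sketch of the sector geometry in claim 1 (illustrative only).

    subject, camera: 2-D (x, y) positions of the object to be shot and
    the camera shooting assembly; m: segmentation value (> 1); alpha:
    maximum panoramic angle in radians; steps: waypoint count (>= 2).
    """
    if m <= 1:
        raise ValueError("m must be greater than 1")
    # Distance L between the camera shooting assembly and the user.
    L = math.dist(camera, subject)
    # Positioning point: on the segment between them, at L/m from the
    # subject, so the subject lies on the first sector's arc (radius L/m).
    t = 1.0 / m
    cx = subject[0] + t * (camera[0] - subject[0])
    cy = subject[1] + t * (camera[1] - subject[1])
    # The second sector shares the center, opens toward the camera
    # (its sides are reverse extensions of the first sector's sides),
    # and has radius (m - 1)L/m.
    r2 = (m - 1) * L / m
    base = math.atan2(camera[1] - cy, camera[0] - cx)
    # The drone sweeps the arc from one end (-alpha/2) to the other
    # (+alpha/2) while the camera shooting assembly records.
    return [
        (cx + r2 * math.cos(base + a), cy + r2 * math.sin(base + a))
        for a in (-alpha / 2 + alpha * i / (steps - 1) for i in range(steps))
    ]
```

Note that the two radii sum to L (L/m + (m-1)L/m = L), so the drone's starting distance from the subject through the positioning point equals the measured camera-user distance, which is what makes the construction consistent.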
2. The unmanned aerial vehicle interactive shooting system of claim 1, wherein the camera assembly comprises a camera device and a camera bracket, the camera device is disposed in the camera bracket, and one end of the camera bracket is rotatably connected to one side of the unmanned aerial vehicle;
the system also comprises a display device which is detachably or fixedly arranged at the other end of the camera bracket.
3. The unmanned aerial vehicle interactive shooting system of claim 2, wherein the display device comprises an array display screen and a first display control unit; the first display control unit acquires a shot image of the camera equipment and displays the shot image through the array type display screen.
4. The unmanned aerial vehicle interactive shooting system of claim 2, wherein the display device comprises a dot matrix display screen and a second display control unit; and the second display control unit acquires the control instruction searched by the instruction judgment module and controls the dot-matrix display screen to display user prompt information associated with the searched control instruction.
5. The unmanned aerial vehicle interactive shooting system of claim 2, wherein one end of the camera bracket is provided with a projection, and one side of the unmanned aerial vehicle is provided with a groove adapted to the shape of the projection; the projection of the camera bracket is embedded in the groove of the unmanned aerial vehicle;
the lower surface of the unmanned aerial vehicle is a plane and includes a camera bracket corresponding area; the two side faces of the groove of the unmanned aerial vehicle are perpendicular to the lower surface of the unmanned aerial vehicle, and the projection of the camera bracket is rotatable in the groove of the unmanned aerial vehicle, so that the camera bracket can rotate within an angular range between being perpendicular to the lower surface of the unmanned aerial vehicle and being attached to the camera bracket corresponding area.
6. The unmanned aerial vehicle interactive shooting system of claim 5, wherein the lower surface of the unmanned aerial vehicle further comprises an electricity storage device corresponding area, and the electricity storage device corresponding area does not intersect the camera bracket corresponding area;
the system further comprises an electricity storage device, which is detachably or fixedly mounted on the lower surface of the unmanned aerial vehicle and attached to the electricity storage device corresponding area.
7. The unmanned aerial vehicle interactive shooting system of claim 5, wherein the camera support comprises a first support arm, a second support arm and a third support arm, one side of the first support arm is connected to the projection, a first slot is formed in the other side of the first support arm, one end of the second support arm and one end of the third support arm are respectively connected to two ends of the first support arm, the second support arm and the third support arm are perpendicular to the first support arm, a second slot is formed in the other end of the second support arm, and a third slot is formed in the other end of the third support arm;
one side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
8. The unmanned aerial vehicle interactive shooting system of claim 1, further comprising a voice acquisition device for acquiring voice data of a user;
the control instruction library is also used for storing the mapping relation between various preset voice keywords and various control instructions;
the control component also comprises a voice processing module, and the voice processing module is used for extracting voice keywords included in the voice data of the user;
the instruction judging module is also used for searching the corresponding control instruction in the control instruction library according to the extracted voice keyword.
9. The unmanned aerial vehicle interactive shooting system of claim 8, wherein the voice processing module is further configured to obtain a voiceprint feature of the user from the voice data of the user, and to determine whether the voiceprint feature of the user is a pre-stored specified voiceprint feature;
if the voiceprint feature of the user is a pre-stored specified voiceprint feature, the instruction judging module extracts a voice keyword included in the voice data of the user and searches the corresponding control instruction in the control instruction library according to the extracted voice keyword;
and if the voiceprint feature of the user is not a pre-stored specified voiceprint feature, the instruction judging module ignores the voice data of the user and does not extract a voice keyword.
10. The unmanned aerial vehicle interactive shooting system of claim 1, wherein the image processing module is further configured to obtain physiological characteristics of a user in a shot image of the camera assembly, and determine whether the physiological characteristics of the user are pre-stored designated physiological characteristics;
if the physiological characteristics of the user are pre-stored specified physiological characteristics, the instruction judging module searches the corresponding control instruction in the control instruction library according to the user action characteristics to be executed;
and if the physiological characteristics of the user are not pre-stored specified physiological characteristics, the instruction judging module ignores the user action characteristics to be executed and does not perform searching control instruction processing.
11. The unmanned aerial vehicle interactive shooting system of claim 1, wherein the unmanned aerial vehicle control instructions comprise at least one of unmanned aerial vehicle translation instructions, unmanned aerial vehicle rotation instructions, unmanned aerial vehicle startup instructions, and unmanned aerial vehicle shutdown instructions; the camera shooting assembly control instructions comprise at least one of a camera shooting assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction and a shooting stop instruction.
12. The interactive unmanned aerial vehicle shooting system of claim 1, wherein the control instructions further comprise:
a first mode selection instruction, which indicates the control component to enter a first mode, wherein in the first mode, the instruction judgment module searches the corresponding unmanned aerial vehicle control instruction in the control instruction library according to the user action characteristic, and controls the unmanned aerial vehicle according to the searched unmanned aerial vehicle control instruction;
and the second mode selection instruction indicates the control assembly to enter a second mode, and in the second mode, the instruction judgment module searches the corresponding camera assembly control instruction in the control instruction library according to the user action characteristic and controls the camera assembly according to the searched camera assembly control instruction.
13. The interactive unmanned aerial vehicle shooting system of claim 1, wherein the control instructions further comprise:
and the third mode selection instruction instructs the control assembly to enter a third mode, and in the third mode, the instruction execution module controls the camera assembly to shoot after preset waiting time.
14. The interactive unmanned aerial vehicle shooting system of claim 1, wherein the control instructions further comprise:
and the fourth mode selection instruction indicates the control assembly to enter a fourth mode, and in the fourth mode, the instruction execution module detects the position of the user through the camera assembly and controls the unmanned aerial vehicle and the camera assembly to automatically move according to the position of the user so that the camera assembly continuously shoots the user.
15. The unmanned aerial vehicle interactive shooting system of claim 14, wherein in the fourth mode, the instruction execution module obtains a position change acceleration of a user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
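The acceleration check in claim 15 can be sketched as follows. This is a hypothetical one-dimensional simplification: the function name, the three-sample finite-difference estimate, and the fixed sampling interval are all assumptions, not details from the patent.

```python
def exceeds_acceleration_threshold(p0, p1, p2, dt, threshold):
    """Estimate the user's position-change acceleration from three
    consecutive 1-D position samples taken dt apart, and report
    whether it exceeds the preset threshold (alarm condition)."""
    v1 = (p1 - p0) / dt          # velocity over the first interval
    v2 = (p2 - p1) / dt          # velocity over the second interval
    accel = abs(v2 - v1) / dt    # magnitude of the acceleration
    return accel > threshold     # True -> send an alarm signal
```

In the fourth mode, the instruction execution module would evaluate such a condition on each new position fix of the tracked user and emit the external alarm when it returns True.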
16. The unmanned aerial vehicle interactive shooting system of claim 1, wherein the unmanned aerial vehicle is further provided with at least one distance sensor, the control assembly further comprises an obstacle calculation module, and the obstacle calculation module is used for acquiring obstacle detection data of the distance sensor;
the control instruction to be executed comprises an unmanned aerial vehicle movement instruction; when the obstacle calculation module judges that the distance between the unmanned aerial vehicle and an obstacle in the movement direction of the unmanned aerial vehicle movement instruction is smaller than a preset safety threshold, the obstacle calculation module cancels the unmanned aerial vehicle movement instruction and sends a limit reminding signal to the outside.
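The obstacle-vetting behavior of claim 16 reduces to a simple gate on the measured distance. The sketch below is illustrative; the function name and the tuple return convention are assumptions.

```python
def vet_move_command(obstacle_distance, safety_threshold):
    """Decide whether a movement instruction may execute, given the
    distance-sensor reading toward the movement direction.

    Returns (execute, alarm): when the obstacle is closer than the
    safety threshold, the movement is cancelled and a limit reminding
    signal is raised; otherwise the movement proceeds silently."""
    if obstacle_distance < safety_threshold:
        return False, True   # cancel movement, send limit reminder
    return True, False       # safe to execute the movement
```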
17. An unmanned aerial vehicle interactive shooting method, characterized in that the unmanned aerial vehicle interactive shooting system of any one of claims 1 to 16 is adopted, and the method comprises the following steps:
the camera shooting component acquires a shot image;
the image processing module processes the shot image of the camera shooting assembly to acquire the user action characteristics to be executed in the shot image;
the instruction judging module searches the corresponding control instruction in the control instruction library according to the user action characteristic to be executed; and
the command execution module controls the unmanned aerial vehicle and/or the camera shooting assembly according to the searched control command;
the control instruction further comprises a panoramic mode selection instruction which indicates the control assembly to enter a panoramic mode, and in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera assembly to take panoramic photos in the following mode:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, and generates a first sector with an angle alpha by taking the positioning point as a circle center and L/m as a radius, so that the object to be shot is located on the arc of the first sector, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second fan shape opposite to the first fan shape, two side edges of the second fan shape are reverse extension lines of two side edges of the first fan shape respectively, the radius of the second fan shape is (m-1) L/m, and the angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the second fan-shaped arc to the other end of the arc along the track of the arc;
after the unmanned aerial vehicle moves to the other end of the second fan-shaped arc, the camera shooting assembly stops shooting; or,
in the panoramic mode, the instruction execution module controls the unmanned aerial vehicle and the camera shooting assembly to take panoramic photos in the following modes:
the instruction execution module calculates the distance L between the camera shooting assembly and a user;
the instruction execution module selects a positioning point between the camera shooting assembly and a user, and generates a first isosceles triangle with a vertex angle of alpha by taking the positioning point as the vertex and L/m as the length of each waist, so that the object to be shot is located on the bottom edge of the first isosceles triangle, wherein m is a second preset segmentation value, m is greater than 1, and alpha is a preset maximum panoramic shooting angle;
the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle, the two waists of the second isosceles triangle are reverse extension lines of the two waists of the first isosceles triangle respectively, the length of the waist of the second isosceles triangle is (m-1) L/m, and the vertex angle is alpha;
the camera shooting assembly starts shooting, and the unmanned aerial vehicle moves from one end of the bottom edge of the second isosceles triangle to the other end of the bottom edge along the track of the bottom edge;
after the unmanned aerial vehicle moves to the other end of the bottom edge of the second isosceles triangle, the camera shooting assembly stops shooting.
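The core lookup step shared by the system of claim 1 and the method of claim 17 — mapping a recognized user action feature to a control instruction via the control instruction library — can be sketched as a dictionary lookup. The gesture names and instruction identifiers below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical control instruction library: preset user action
# features mapped to unmanned aerial vehicle / camera shooting
# assembly control instructions.
CONTROL_INSTRUCTION_LIBRARY = {
    "wave_left": "UAV_TRANSLATE_LEFT",
    "wave_right": "UAV_TRANSLATE_RIGHT",
    "arms_crossed": "UAV_SHUTDOWN",
    "palm_open": "START_SHOOTING",
    "fist": "STOP_SHOOTING",
}

def find_control_instruction(action_feature):
    """Search the library for the control instruction matching the
    user action feature to be executed; return None when no mapping
    exists, in which case the feature is ignored (cf. claim 10)."""
    return CONTROL_INSTRUCTION_LIBRARY.get(action_feature)
```

The instruction judging module would call such a lookup on every action feature extracted by the image processing module, and the instruction execution module would then act only on non-None results.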
CN201780000407.6A 2017-04-17 2017-04-17 Unmanned aerial vehicle interactive shooting system and method Active CN109121434B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/080738 WO2018191840A1 (en) 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN109121434A CN109121434A (en) 2019-01-01
CN109121434B true CN109121434B (en) 2021-07-27

Family

ID=63855487

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780000407.6A Active CN109121434B (en) 2017-04-17 2017-04-17 Unmanned aerial vehicle interactive shooting system and method

Country Status (3)

Country Link
CN (1) CN109121434B (en)
TW (1) TWI696122B (en)
WO (1) WO2018191840A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112019744A (en) * 2020-08-27 2020-12-01 新石器慧义知行智驰(北京)科技有限公司 Photographing method, device, equipment and medium
US11445121B2 (en) 2020-12-29 2022-09-13 Industrial Technology Research Institute Movable photographing system and photography composition control method
TWI768630B (en) * 2020-12-29 2022-06-21 財團法人工業技術研究院 Movable photographing system and photography composition control method

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN104865856A (en) * 2015-03-30 2015-08-26 成都好飞机器人科技有限公司 Voice control method for unmanned aerial vehicle
CN105338238A (en) * 2014-08-08 2016-02-17 联想(北京)有限公司 Photographing method and electronic device
CN105391939A (en) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 Unmanned aerial vehicle shooting control method, device, unmanned aerial vehicle shooting method and unmanned aerial vehicle
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
US9471059B1 (en) * 2015-02-17 2016-10-18 Amazon Technologies, Inc. Unmanned aerial vehicle assistant
CN106143870A (en) * 2015-07-28 2016-11-23 英华达(上海)科技有限公司 Unmanned vehicle
CN106227231A (en) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 The control method of unmanned plane, body feeling interaction device and unmanned plane

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678506B2 (en) * 2014-06-19 2017-06-13 Skydio, Inc. Magic wand interface and other user interaction paradigms for a flying digital assistant
US9599992B2 (en) * 2014-06-23 2017-03-21 Nixie Labs, Inc. Launch-controlled unmanned aerial vehicles, and associated systems and methods
JP6470112B2 (en) * 2015-06-01 2019-02-13 日本電信電話株式会社 Mobile device operation terminal, mobile device operation method, and mobile device operation program
CN105138126B (en) * 2015-08-26 2018-04-13 小米科技有限责任公司 Filming control method and device, the electronic equipment of unmanned plane
CN105607740A (en) * 2015-12-29 2016-05-25 清华大学深圳研究生院 Unmanned aerial vehicle control method and device based on computer vision
CN106200679B (en) * 2016-09-21 2019-01-29 中国人民解放军国防科学技术大学 Single operation person's multiple no-manned plane mixing Active Control Method based on multi-modal natural interaction
CN106444843B (en) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 Unmanned plane relative bearing control method and device

Also Published As

Publication number Publication date
WO2018191840A1 (en) 2018-10-25
CN109121434A (en) 2019-01-01
TW201839663A (en) 2018-11-01
TWI696122B (en) 2020-06-11

Similar Documents

Publication Publication Date Title
US11729487B2 (en) Image pickup apparatus and control method therefor
US20200014848A1 (en) Autonomous media capturing
US11188101B2 (en) Method for controlling aircraft, device, and aircraft
JP6696118B2 (en) Electronics
US10021286B2 (en) Positioning apparatus for photographic and video imaging and recording and system utilizing the same
CN113038016B (en) Unmanned aerial vehicle image acquisition method and unmanned aerial vehicle
WO2018209702A1 (en) Method for controlling unmanned aerial vehicle, unmanned aerial vehicle and machine-readable storage medium
US11962896B2 (en) Positioning apparatus for photographic and video imaging and recording and system utilizing the same
CN106131413B (en) Shooting equipment and control method thereof
CN109121434B (en) Unmanned aerial vehicle interactive shooting system and method
US20190014242A1 (en) Wearable video camera medallion with circular display
US20160292886A1 (en) Apparatus and method for photographing people using a movable remote device
KR102670994B1 (en) Unmanned Aerial Vehicle and the Method for controlling thereof
US20160124435A1 (en) 3d scanning and imaging method utilizing a self-actuating compact unmanned aerial device
US20230362472A1 (en) Image pickup apparatus and control method therefor
CN109981944A (en) Electronic device and its control method
CN109625303B (en) Unmanned aerial vehicle for photography and control method thereof
WO2019104681A1 (en) Image capture method and device
JP7551082B2 (en) Camera stabilizer
CN110035220B (en) Device control system and method for photography
CN110337806A (en) Group picture image pickup method and device
CN110557560A (en) image pickup apparatus, control method thereof, and storage medium
CN111512625B (en) Image pickup apparatus, control method thereof, and storage medium
CN110177201B (en) Underwater control method of camera and wearable device
WO2022000138A1 (en) Photographing control method and apparatus, and gimbal and photographing system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant