WO2018191840A1 - Interactive photographing system and method for unmanned aerial vehicle - Google Patents

Interactive photographing system and method for unmanned aerial vehicle

Info

Publication number
WO2018191840A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
user
instruction
control
camera
Prior art date
Application number
PCT/CN2017/080738
Other languages
French (fr)
Chinese (zh)
Inventor
张景嵩
张凌
戴志宏
Original Assignee
英华达(上海)科技有限公司
英华达(上海)电子有限公司
英华达股份有限公司
Priority date
Filing date
Publication date
Application filed by 英华达(上海)科技有限公司, 英华达(上海)电子有限公司, 英华达股份有限公司
Priority to CN201780000407.6A (CN109121434B)
Priority to PCT/CN2017/080738
Priority to TW107111546A (TWI696122B)
Publication of WO2018191840A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00 Control of position or direction
    • G05D3/12 Control of position or direction using feedback
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The invention relates to the technical field of drone control, and in particular to a drone interactive shooting system and method in which the user exercises direct control through body actions.
  • Unmanned aerial vehicles, or drones, are unmanned aircraft operated by radio remote-control equipment and onboard program control.
  • Existing UAV shooting can be divided into two areas: commercial aerial photography and personal entertainment self-portraits. Both are currently operated with a remote controller or with an application on a handheld mobile device.
  • When using a drone for personal entertainment selfies, the user often has to attend to both the drone and the remote controller at the same time, which is inconvenient to operate.
  • For example, when shooting a group photo at a party, the user often cannot watch the application screen on the handheld mobile device, so a clear shot of the face cannot be captured; or, when shooting a jumping self-portrait, the remote controller held in the hand prevents the user from performing the desired action, which degrades the shooting result.
  • In addition, owing to the drone's own weight and size, miniaturized selfie drones tend to carry little battery capacity and have short endurance, which limits the fun of shooting and cannot meet current users' needs.
  • The object of the present invention is to provide a UAV interactive shooting system and method with which the user can directly control the flight of the drone and the shooting of the camera assembly through body actions, thereby realizing the shooting function and improving the shooting result.
  • An embodiment of the present invention provides a UAV interactive photographing system, the system including a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably coupled to one side of the drone;
  • the control component includes:
  • a control instruction library configured to store preset mappings between various user action features and various control commands, where a control command includes a drone control command and/or a camera component control command;
  • An image processing module configured to process a captured image of the camera component to acquire a user action feature to be executed in the captured image
  • An instruction determining module configured to search for a corresponding control instruction in the control instruction library according to the user action feature to be executed
  • an instruction execution module configured to control the drone and/or the camera assembly according to the obtained control instruction.
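By way of illustration only, the pipeline formed by these four modules could be sketched as follows. The gesture labels, command names, and the `execute` interface are hypothetical and do not appear in the application; this is a minimal sketch, not the claimed implementation.

```python
# Hypothetical sketch of the control-component pipeline: a preset library maps
# recognised user-action features to drone and/or camera commands.
CONTROL_LIBRARY = {
    "palm_left": ("drone",  "translate_left"),
    "palm_down": ("drone",  "descend"),
    "fist":      ("camera", "start_shooting"),
    "open_hand": ("camera", "stop_shooting"),
}

def image_processing(frame):
    """Stand-in for the image processing module: return the action feature
    detected in the captured frame, or None (real recognition is not shown)."""
    return frame.get("detected_gesture")

def instruction_determination(feature):
    """Look up the control command that corresponds to the action feature."""
    return CONTROL_LIBRARY.get(feature)

def instruction_execution(command, drone, camera):
    """Dispatch the found command to the drone or the camera assembly."""
    target, action = command
    (drone if target == "drone" else camera).execute(action)

def on_new_frame(frame, drone, camera):
    feature = image_processing(frame)
    command = instruction_determination(feature) if feature else None
    if command:
        instruction_execution(command, drone, camera)
```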
  • the camera assembly includes an imaging device and a camera bracket; the imaging device is disposed in the camera bracket, and one end of the camera bracket is rotatably connected to one side of the drone;
  • the system also includes a display device detachably or fixedly mounted to the other end of the camera mount.
  • the display device comprises an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device and displays through the array display screen.
  • the display device includes a dot-matrix display screen and a second display control unit; the second display control unit acquires the control command found by the instruction determining module and controls the dot-matrix display screen to show user prompt information associated with that command.
  • one end of the camera bracket is formed as a bump, and one side of the drone is provided with a groove matching the shape of the bump; the bump of the camera bracket is embedded in the groove of the drone;
  • the lower surface of the drone is a plane and includes a camera-bracket area; the two side faces of the groove of the drone are perpendicular to the lower surface of the drone; and
  • the bump of the camera bracket is rotatable in the groove of the drone, so that the camera bracket can rotate through the angular range between being perpendicular to the lower surface of the drone and lying flat against the camera-bracket area.
  • the lower surface of the drone further includes a power-storage-device area that does not overlap the camera-bracket area;
  • the system further includes a power storage device detachably or fixedly mounted on the lower surface of the drone, the power storage device fitting against the power-storage-device area.
  • the camera bracket includes a first arm, a second arm, and a third arm; one side of the first arm is connected to the bump and a first slot is disposed on the other side of the first arm; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, with the second arm and the third arm both perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot;
  • One side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
  • the system further includes a voice acquiring device, where the voice acquiring device is configured to acquire voice data of the user;
  • the control instruction library is further configured to store a mapping relationship between preset various voice keywords and various control instructions
  • the control component further includes a voice processing module, where the voice processing module is configured to extract a voice keyword included in the voice data of the user;
  • the instruction determining module is further configured to search for a corresponding control instruction in the control instruction library according to the extracted voice keyword.
  • the voice processing module is further configured to acquire a voiceprint feature of the user in the voice data of the user, and determine whether the voiceprint feature of the user is a pre-stored specified voiceprint feature;
  • if the voiceprint feature of the user matches a preset allowed voiceprint feature, the instruction determining module extracts the voice keywords included in the user's voice data and searches the control instruction library for the corresponding control command according to the extracted keywords;
  • if it does not match, the instruction determining module ignores the voice data and does not perform keyword extraction.
  • the image processing module is further configured to acquire a physiological feature of the user in the captured image of the camera component, and determine whether the physiological feature of the user is a pre-stored designated physiological feature;
  • if the physiological feature of the user matches a pre-stored designated physiological feature, the instruction determining module searches the control instruction library for the corresponding control instruction according to the user action feature to be executed;
  • if it does not match, the instruction determining module ignores the user action feature to be executed and does not search for a control instruction.
  • the drone control instruction includes at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction;
  • the camera component control instruction includes at least one of a camera component rotation command, a shooting parameter adjustment command, a shooting start command, and a shooting stop command.
  • the control instruction further includes a first mode selection instruction and a second mode selection instruction;
  • in the first mode, the instruction determining module searches the control instruction library for the corresponding drone control command according to the user action feature and controls the drone according to the command found;
  • in the second mode, the instruction determining module searches the control instruction library for the corresponding camera component control command according to the user action feature and controls the camera component according to the command found.
  • the control instruction further includes:
  • a panoramic mode selection instruction instructing the control component to enter a panoramic mode, in which the instruction execution module controls the drone to move continuously within an angle range of (0, θ) at a preset speed, where θ is the preset maximum panoramic shooting angle.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the camera assembly detects a location of the user
  • the drone starts with a user's position as a starting point, and rotates ⁇ /n to one side in the same horizontal plane, where n is a first preset split value, and n>1;
  • the camera assembly starts shooting, and the drone rotates θ to the other side at a preset speed in the same horizontal plane;
  • the camera assembly stops shooting.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera component and a user
  • the instruction execution module selects a positioning point between the camera component and the user and, with the positioning point as the center, generates a first sector with angle θ and radius L/m, such that the subject to be photographed is located on the arc of the first sector, where m is a second preset segmentation value and m>1;
  • the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the radius of the second sector is (m-1)L/m with angle θ;
  • the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
  • the camera assembly stops shooting.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera component and a user
  • the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the vertex, generates a first isosceles triangle with apex angle θ and waist length L/m, such that the subject to be photographed is located on the bottom edge of the first isosceles triangle, where m is a second preset segmentation value and m>1;
  • the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, the waist length of the second isosceles triangle is (m-1)L/m, and its apex angle is θ;
  • the camera assembly starts to shoot, and the drone moves from one end of the bottom edge of the second isosceles triangle along the trajectory of the bottom edge to the other end of the bottom edge;
  • the camera assembly stops shooting.
  • control instruction further includes:
  • a third mode selection instruction instructing the control component to enter a third mode, in the third mode, the instruction execution module controls the camera component to perform shooting after a preset waiting time.
  • the control instruction further includes:
  • a fourth mode selection instruction instructing the control component to enter a fourth mode;
  • in the fourth mode, the instruction execution module detects the position of the user through the camera component and controls the drone and the camera component to move automatically according to the user's position, so that the camera assembly keeps the user in frame.
  • the instruction execution module acquires a position change acceleration of the user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
  • the UAV is further provided with at least one distance sensor
  • the control component further includes an obstacle calculation module
  • the obstacle calculation module is configured to acquire obstacle detection data of the distance sensor
  • when the control instruction to be executed includes a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the commanded moving direction is less than a preset safety threshold, the drone movement instruction is cancelled and a limit reminder signal is issued to the outside.
  • the invention also provides a method for interactively photographing a drone, which adopts the UAV interactive photographing system, and the method comprises the following steps:
  • the camera assembly acquires a captured image
  • the image processing module processes the captured image of the camera component to acquire a user action feature to be executed in the captured image
  • the instruction determining module searches for a corresponding control instruction in the control instruction library according to the user action feature to be executed;
  • the instruction execution module controls the drone and/or the camera assembly according to the obtained control command.
  • the control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone and the camera component to take a panoramic photograph through the following steps:
  • the camera assembly detects a location of the user
  • the drone starts with a user's position as a starting point, and rotates ⁇ /n to one side in the same horizontal plane, where n is a first preset split value, and n>1, ⁇ is a preset panoramic shooting maximum angle;
  • the camera assembly starts shooting, and the drone rotates θ to the other side at a preset speed in the same horizontal plane;
  • the camera assembly stops shooting.
  • the control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone and the camera component to take a panoramic photograph in the following manner:
  • the instruction execution module calculates a distance L between the camera component and a user
  • the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the center, generates a first sector with angle θ and radius L/m, such that the subject to be photographed is located on the arc of the first sector, where m is the second preset segmentation value, m>1, and θ is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the radius of the second sector is (m-1)L/m with angle θ;
  • the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
  • the camera assembly stops shooting.
  • the control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone and the camera component to take a panoramic photograph in the following manner:
  • the instruction execution module calculates a distance L between the camera component and a user
  • the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the vertex, generates a first isosceles triangle with apex angle θ and waist length L/m, such that the subject to be photographed is located on the bottom edge of the first isosceles triangle, where m is a second preset segmentation value, m>1, and θ is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, the waist length of the second isosceles triangle is (m-1)L/m, and its apex angle is θ;
  • the camera assembly starts to shoot, and the drone moves from one end of the bottom edge of the second isosceles triangle along the trajectory of the bottom edge to the other end of the bottom edge;
  • the camera assembly stops shooting.
  • The invention provides a technical solution in which the user exercises control directly through body actions: the camera component automatically acquires the captured image, the control component automatically analyzes it to obtain the user action feature to be executed, and the control command required by the user is interpreted from that action feature.
  • The user can thus directly control the flight of the drone and the shooting of the camera component, realizing the shooting function, easily obtaining shots that meet the demand on any occasion, and improving the user experience.
  • FIG. 1 is a block diagram showing the structure of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive shooting system using an array display screen according to an embodiment of the present invention;
  • FIG. 3 is a schematic structural diagram of a UAV interactive photographing system using a dot matrix display screen according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of adjusting the position of a drone according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of adjusting an angle of a camera assembly according to an embodiment of the invention.
  • 6-7 are schematic diagrams of gesture control according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of an external display device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a display device when it is stowed according to an embodiment of the present invention.
  • FIG. 10 is a schematic bottom view of the unmanned aerial vehicle according to an embodiment of the present invention when not in use;
  • FIG. 11 is a schematic structural diagram of an electrical storage device according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram showing a state of a drone when charging according to an embodiment of the present invention.
  • FIG. 13 is a flow chart showing a charging process of a drone according to an embodiment of the present invention.
  • FIG. 14 is a schematic diagram of controlling the position of a drone by voice according to an embodiment of the present invention.
  • FIG. 15 is a schematic structural diagram of a UAV interactive shooting system with added voice control according to an embodiment of the present invention;
  • 16 is a flow chart of user voiceprint verification according to an embodiment of the present invention.
  • 17 is a flow chart of user physiological feature verification according to an embodiment of the present invention.
  • 18 to 20 are flowcharts of a method for interactively capturing a drone according to an embodiment of the present invention.
  • 21 is a flow chart of panoramic shooting according to an embodiment of the present invention.
  • Figure 22 is a schematic view showing the rotation of the drone during panoramic shooting according to an embodiment of the present invention.
  • FIG. 23 is a schematic diagram of a drone moving along a circular arc path during panoramic shooting according to an embodiment of the present invention.
  • 24 is a schematic diagram of a drone moving along a linear trajectory during panoramic shooting according to an embodiment of the present invention.
  • 25 is a flow chart of automatically tracking a user's position by a drone according to an embodiment of the present invention.
  • Figure 26 is a flow chart showing the automatic obstacle avoidance of the drone according to an embodiment of the present invention.
  • an embodiment of the present invention provides a UAV interactive photographing system, which includes a drone 200, a camera assembly 300, and a control assembly 100.
  • One end of the camera assembly 300 is rotatably coupled to the One side of the drone 200;
  • The control component 100 includes: a control instruction library 110 for storing preset mappings between various user action features and various control commands, where a control command includes a drone control command and/or a camera component control command; an image processing module 120, configured to process the captured image of the camera assembly 300 to acquire the user action feature to be executed in the captured image; an instruction determining module 130, configured to search the control instruction library for the corresponding control command according to the user action feature to be executed; and an instruction execution module 140, configured to control the drone 200 and/or the camera assembly 300 according to the control command found.
  • The user action feature here is preferably a gesture of the user; that is, different gestures can be used to obtain different control commands.
  • Other user action features are also available, such as the user's eye movements, nodding, head-shaking, or laughter; for example, the system can be set to take a picture when the user laughs, so that the user's smile is captured automatically, and so on.
  • The following embodiments mostly describe control based on user gestures; it will be appreciated, however, that the use of other user action features also falls within the scope of the present invention.
  • FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention.
  • The system includes a drone 200, one side of which is rotatably fitted with a camera assembly 300. The camera assembly 300 includes an imaging device 320 and a camera bracket 310; the imaging device 320 is disposed in the camera bracket 310, and one end of the camera bracket 310 is rotatably connected to one side of the drone 200. Further,
  • the system can also include a display device 330 that is detachably or fixedly mounted to the other end of the camera mount 310.
  • The control component 100 may be disposed inside the drone 200 or on the surface of the drone 200; any other placement likewise falls within the scope of the invention.
  • the instruction execution module 140 can directly communicate with the controller of the drone 200, or can perform wireless communication with the camera assembly 300, thereby implementing delivery and feedback of control commands.
  • The display device 330 can display content for the user to view as needed; two arrangements of the display device 330 are given in FIG. 2 and FIG. 3.
  • the display device 330 shown in FIG. 2 includes an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device 320 and displays it through the array display screen.
  • the array display can include, but is not limited to, a color LCD screen, and the user can view the self-timer picture in real time through the display.
  • The display device 330 shown in FIG. 3 includes a dot-matrix display screen and a second display control unit; the second display control unit acquires the control command found by the instruction determining module 130 and controls the dot-matrix display screen to show user prompt information associated with that command.
  • The dot-matrix display screen may include, but is not limited to, a dot-matrix LED screen; the user can prepare for and carry out the self-portrait shot according to the numbers shown on the LED array.
  • The user prompt information may be a self-portrait countdown: for example, with a five-second countdown before shooting, the dot-matrix display shows 5, 4, 3, 2, and 1 in sequence, and the user can get ready according to the countdown. The prompt information can also indicate which shooting mode is currently active; for example, displaying 2 means the system is currently in the second mode, and so on.
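As a small illustration of the countdown prompt only: the five-second count and the `show` method on the display unit are assumptions, not part of the application.

```python
import time

def show_countdown(dot_matrix_display, seconds=5):
    """Sketch of the second display control unit's countdown: show 5, 4, 3, 2, 1
    on the dot-matrix screen, one number per second, before the shot is taken."""
    for remaining in range(seconds, 0, -1):
        dot_matrix_display.show(str(remaining))  # assumed display interface
        time.sleep(1)
```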
  • In this way the camera component automatically acquires the captured image, the control component automatically analyzes it to obtain the user action feature to be executed and interprets the control command the user requires, and the user thereby completes control of the drone 200 and/or the camera assembly 300.
  • The drone control command may include at least one of a drone panning command, a drone rotation command, a drone power-on command, and a drone power-off command.
  • the camera component control command may include at least one of a camera component rotation command, a shooting parameter adjustment command, a shooting start command, and a shooting stop command.
  • the shooting parameters that can be adjusted here can include focus, fill light, image size, and so on.
  • FIG. 4 is a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention. The position of the drone can be adjusted through the following steps:
  • The user 400 observes the self-portrait framing on the display device 330 and finds that the portrait sits too far to the left (the portrait shown in broken lines in FIG. 4); with a gesture (the change of the user 400 from the dotted-line state to the solid-line state in FIG. 4), the user controls the drone to move to the left until the portrait is in the center of the frame (the portrait shown in solid lines in FIG. 4);
  • After the shooting conditions are met, the user 400 triggers the shot by gesture control.
  • FIG. 5 is a schematic diagram of adjusting the angle of the camera assembly 300 according to an embodiment of the invention. The camera assembly 300 can be adjusted through the following steps:
  • The user 400 observes the self-portrait framing on the display device 330 and finds that the drone 200 is too high, so the portrait sits too low (the portrait shown in dotted lines in FIG. 5); with a gesture (the change of the user 400's hand from the dotted-line state to the solid-line state), the user controls the camera assembly 300 to tilt downward, which drives the imaging device 320 to tilt downward, until the portrait is in the center of the frame (the portrait shown in solid lines in FIG. 5);
  • After the shooting conditions are met, the user 400 triggers the shot by gesture control.
  • The manner of controlling the drone 200 and the camera assembly 300 can also be chosen flexibly; the adjustment can likewise be made by lowering the drone 200 until the portrait is in the middle of the frame.
  • Different preset gesture commands can be used to distinguish the ways in which the drone 200 and the camera assembly 300 are adjusted; that is, from a recognized gesture it can be determined whether the object of control is the drone 200 or the camera assembly 300, and which specific action of that object the gesture controls.
  • Examples of gesture control are given in FIG. 6 and FIG. 7.
  • The user can also customize the mapping between different gestures and different control commands, modifying them to gestures that suit his or her own habits.
  • Other action features can also be added. For example, the user nods to confirm the shooting, the user shakes the head to delete the previous captured image, and so on.
  • In a further embodiment, the camera assembly 300 uses an external display device 340.
  • the external display device 340 can further be a user's mobile terminal.
  • The external display device 340 and the control component 100 can communicate wirelessly or via a USB or similar data cable.
  • One end of the camera bracket 310 is formed as a bump 311, and one side of the drone 200 is provided with a groove 210 matching the shape of the bump; the bump 311 of the camera bracket is embedded in the groove 210 of the drone.
  • the camera bracket 310 includes a first arm 312 , a second arm 313 , and a third arm 314 .
  • One side of the first arm 312 is connected to the bump 311 .
  • a first slot is disposed on the other side of the first arm 312, and one end of the second arm 313 and one end of the third arm 314 are respectively connected to two ends of the first arm 312.
  • the second arm 313 and the third arm 314 are both perpendicular to the first arm 312, and the other end of the second arm 313 is provided with a second slot, the third arm 314 The other end is provided with a third slot.
  • the external display device 340 can be placed in the camera holder 310, the upper end of the external display device 340 is inserted into the first slot, and the lower end of the external display device 340 is inserted into the second slot and the third slot. Thereby, a stable and convenient connection between the external display device 340 and the imaging stand 310 is formed.
  • In other embodiments, the display device 330 is a built-in display device.
  • The camera bracket 310 is rotated through the cooperation of the bump 311 and the groove 210, and the display device 330 rotates together with the camera bracket 310.
  • The lower surface of the drone 200 is a plane and includes a camera-bracket area 220; the two side faces of the recess 210 of the drone 200 are perpendicular to the lower surface of the drone 200.
  • the camera assembly 300 can be adjusted within a desired range of angles to achieve better shooting results.
  • When not in use, the camera bracket 310 can be folded into the camera-bracket area 220 to facilitate storage and carrying.
  • the embodiment of the present invention further provides a convenient charging mode.
  • the lower surface of the unmanned aerial vehicle 200 further includes a power storage device corresponding area 230, and the corresponding area of the power storage device does not intersect with the camera support corresponding area 220; the system further includes a power storage device 500, The power storage device 500 is detachably or fixedly mounted on a lower surface of the drone 200, and the power storage device 500 is attached to the corresponding region of the power storage device.
  • For charging, the connection between the external display screen and the drone is first disconnected; the external display screen can be removed, or it can be left on the camera bracket 310 and folded away together with it. If the power storage device 500 is inserted at this point, charging starts; otherwise the drone is simply powered off.
  • the power storage device 500 is installed in the corresponding area of the power storage device.
  • The power storage device 500 is then connected to the rechargeable battery of the drone to perform the charging operation.
  • The embodiment of the present invention may further include a voice acquiring device 600 configured to acquire the user's voice data; the control instruction library 110 is further configured to store preset mappings between various voice keywords and various control commands; the control component 100 further includes a voice processing module 150 configured to extract the voice keywords included in the user's voice data; and the instruction determining module 130 is further configured to search the control instruction library for the corresponding control command according to the extracted voice keywords.
  • This embodiment can therefore also implement shooting control by voice. For example, if the keyword "power on" is set to turn on the camera component 300, then when "power on" is detected in the user's voice data, the camera component 300 is turned on automatically; or, when the keywords "drone" and "move left" are detected in the user's voice data, the drone is automatically controlled to move to the left. Voice control is convenient, is not restricted by other conditions, and can be used on any occasion without affecting the shooting result.
  • In practice the control component 100 may pick up other people's voices or environmental noise, so different sounds need to be distinguished. That is, the voice processing module is further configured to acquire the user's voiceprint feature from the voice data and to judge whether it is a pre-stored designated voiceprint feature;
  • if the user's voiceprint feature matches a preset allowed voiceprint feature, the instruction determining module extracts the voice keywords included in the user's voice data and searches the control instruction library for the corresponding control command according to the extracted keywords; if it does not match, the voice data is not that of the designated user and must be screened out, that is, the instruction determining module ignores it and does not perform keyword extraction.
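A minimal sketch of this voiceprint gate, assuming placeholder callables for speaker verification and keyword spotting; the keyword table and all names are illustrative rather than part of the application.

```python
# Illustrative keyword-to-command table; real entries would come from the
# control instruction library.
VOICE_LIBRARY = {
    "power on":  ("camera", "power_on"),
    "move left": ("drone",  "translate_left"),
}

def handle_voice(voice_data, is_designated_user, extract_keywords):
    """Process voice data only when its voiceprint matches the designated user;
    otherwise ignore it and do not extract keywords."""
    if not is_designated_user(voice_data):   # voiceprint check (placeholder)
        return None
    for keyword in extract_keywords(voice_data):
        if keyword in VOICE_LIBRARY:
            return VOICE_LIBRARY[keyword]
    return None
```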
  • Similarly, the camera component 300 may also capture the action features of people other than the designated user.
  • The image processing module is therefore further configured to acquire the user's physiological features from the captured image of the camera component and to judge whether they are pre-stored designated physiological features;
  • if they are, the instruction determining module searches the control instruction library for the corresponding control instruction according to the user action feature to be executed; if they are not, the instruction determining module ignores the user action feature to be executed and does not search for a control instruction.
  • The physiological features of the user may refer to the user's facial features, hair color and length, skin color, lip color, and so on, or to a combination of several physiological features for more accurate identification; all such variants fall within the scope of protection of the present invention.
  • an embodiment of the present invention further provides a method for interactively capturing a UAV, which adopts the UAV interactive photographing system, and the method includes the following steps:
  • the image processing module processes the captured image of the camera component to acquire a user action feature to be executed in the captured image
  • the instruction execution module controls the drone and/or the camera component according to the obtained control instruction.
  • The determination process may follow the flow shown in FIG. 19, performing the determinations and control in sequence, but is not limited to this manner: for example, first determining whether the command is a camera component control command and then determining whether it is a drone control command also falls within the scope of protection of the present invention.
  • A specific embodiment of the UAV interactive photographing method is also shown. First, the type of display is determined; if an external display is used, the control component must first be connected to the external display through wireless communication in preparation for subsequent control. Then, according to the correspondence between gestures and control commands, the corresponding control command is looked up and the control is executed.
  • The action features of the present invention are not limited to gestures; different actions of other body parts can also achieve the object of the present invention.
  • control instruction may further include a first mode selection instruction and a second mode selection instruction respectively instructing the control component to enter the first mode and the second mode.
  • After entering the first mode, a received user action feature defaults to a drone control command; that is, the instruction determining module searches the control instruction library for the corresponding drone control command according to the user action feature and controls the drone according to the command found, and camera component control instructions are not executed. After entering the second mode, a received user action feature defaults to a camera component control command; that is, the instruction determining module searches the control instruction library for the corresponding camera component control command according to the user action feature and controls the camera component according to the command found, and drone control instructions are not executed.
  • For example, take the same gesture of spreading the palm and moving it downward: in the first mode it means that the drone is controlled to move downward, while in the second mode it means that the camera assembly is tilted downward. Only one specific example is given here, and the scope of protection of the present invention is not limited thereto.
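To illustrate how one gesture can map to different commands depending on the selected mode, a sketch with assumed gesture and command names (none of them appear in the application):

```python
# In the first mode an action feature defaults to a drone command; in the
# second mode the same feature defaults to a camera-assembly command.
MODE_LIBRARIES = {
    "first_mode":  {"palm_down": ("drone",  "move_down")},
    "second_mode": {"palm_down": ("camera", "tilt_down")},
}

def resolve(mode, action_feature):
    """Return the command that the action feature maps to in the current mode."""
    return MODE_LIBRARIES.get(mode, {}).get(action_feature)

print(resolve("first_mode", "palm_down"))   # ('drone', 'move_down')
print(resolve("second_mode", "palm_down"))  # ('camera', 'tilt_down')
```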
  • Owing to the smoothness and controllability of the drone in flight, it has some irreplaceable advantages over a camera held in the user's hand. For example, the drone can take photos with less shake, so the anti-shake requirements on the imaging device are lower. When a user takes a panoramic photo with a camera in hand, the ideal panorama is often lost because of shake or other factors; this problem can be overcome with a drone.
  • The control instruction therefore further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone to move continuously within an angle range of (0, θ) at a preset speed, where θ is the preset maximum panoramic shooting angle.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the camera assembly detects a location of the user 400
  • the drone 200 takes the position of the user 400 as the starting point and rotates θ/n to one side in the same horizontal plane;
  • this stage is the positioning stage of the drone, in which no shooting is performed, where n is the first preset split value and n>1;
  • the camera assembly starts shooting, and the drone 200 rotates θ to the other side at a preset speed in the same horizontal plane, thereby obtaining a panoramic photo spanning the angle θ with the user located at the designated position within it;
  • the camera assembly stops shooting.
  • With n = 2, for example, the user is placed in the center of the panoramic photo.
  • The angle θ can be set as needed, and the user's position within the panoramic photo can also be adjusted: for example, to place the user toward the left, the drone can first rotate θ/4 to one side, and so on. This shooting mode is very flexible, the success rate of panoramic shooting is higher, and the resulting photos are better.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
  • the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the center, generates a first sector 701 with angle θ and radius L/m, such that the subject to be photographed is located on the arc of the first sector, where m is the second preset segmentation value, m>1, and θ is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second sector 702 opposite to the first sector 701; the two sides of the second sector 702 are respectively the opposite extension lines of the two sides of the first sector 701, and the radius of the second sector 702 is (m-1)L/m with angle θ;
  • the camera assembly starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 along the trajectory of the arc to the other end of the arc;
  • the camera assembly stops shooting.
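A geometric sketch of the arc trajectory of FIG. 23. Placing the user at the origin and the positioning point on the +x axis at distance L/m from the user is an assumption made for illustration, as is the waypoint count.

```python
import math

def arc_trajectory(L, m, theta_deg, points=20):
    """Waypoints for the arc-shaped panorama path (FIG. 23). The positioning
    point lies between the user and the camera at distance L/m from the user;
    the drone flies the opposite arc of radius (m-1)L/m centered on that point.
    Frame (assumed): user at the origin, positioning point on the +x axis."""
    assert m > 1 and theta_deg > 0
    r_user = L / m                      # radius of the first sector
    r_drone = (m - 1) * L / m           # radius of the second sector
    cx = r_user                         # positioning point (user lies on first arc)
    half = math.radians(theta_deg) / 2
    waypoints = []
    for i in range(points + 1):
        # sweep the second sector's arc, opening away from the user
        a = -half + i * (2 * half) / points
        waypoints.append((cx + r_drone * math.cos(a), r_drone * math.sin(a)))
    return waypoints
```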
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
  • the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the vertex, generates a first isosceles triangle 703 with apex angle θ and waist length L/m, such that the subject to be photographed is located on the bottom edge of the first isosceles triangle 703, where m is a second preset segmentation value, m>1, and θ is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second isosceles triangle 704 opposite to the first isosceles triangle 703; the two waists of the second isosceles triangle 704 are respectively the opposite extension lines of the two waists of the first isosceles triangle 703, the waist length of the second isosceles triangle 704 is (m-1)L/m, and its apex angle is θ;
  • the camera assembly starts shooting, and the drone 200 moves from one end of the bottom edge of the second isosceles triangle 704 along the trajectory of the bottom edge to the other end of the bottom edge;
  • the camera assembly stops shooting.
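A corresponding sketch for the straight-line trajectory of FIG. 24, under the same assumed coordinate frame (user at the origin, positioning point on the +x axis at L/m); only the base of the second isosceles triangle is computed.

```python
import math

def line_trajectory(L, m, theta_deg, points=20):
    """Waypoints for the straight-line panorama path (FIG. 24). The second
    isosceles triangle has its apex at the positioning point, waist length
    (m-1)L/m and apex angle theta; the drone moves along its base."""
    assert m > 1 and 0 < theta_deg < 180
    waist = (m - 1) * L / m
    cx = L / m                          # positioning point (apex of both triangles)
    half = math.radians(theta_deg) / 2
    # Endpoints of the second triangle's base, opening away from the user.
    x = cx + waist * math.cos(half)
    y = waist * math.sin(half)
    return [(x, -y + i * (2 * y) / points) for i in range(points + 1)]
```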
  • The shooting trajectories in FIG. 23 and FIG. 24 can be selected as needed, forming the panorama either by continuous shooting or by stitching several photos into one panoramic photo; different choices of m and θ give different shooting ranges and greater flexibility.
  • the drone can move according to the calculated preset trajectory, so that the camera component acquires different shooting positions and shooting angles.
  • In addition, a shooting countdown may be set; that is, the control instruction may further include a third mode selection instruction instructing the control component to enter the third mode, in which the instruction execution module controls the camera component to take the shot after a preset waiting time.
  • the countdown time can be displayed by the display device, or the remaining preparation time can be indicated by other display lights or prompts.
  • the drone of the present invention can also realize a user automatic tracking shooting function.
  • the control instruction may further include a fourth mode selection instruction instructing the control component to enter a fourth mode, in the fourth mode, the instruction execution module detects a position of the user through the camera component, and controls the The drone and the camera assembly automatically move according to the position of the user, so that the camera assembly continuously captures the user. This enables automatic tracking of user shots, ensuring that the user is always within range.
  • the instruction execution module may further acquire a position change acceleration of the user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
  • Similarly, when the camera component cannot capture the user's position, the instruction execution module also sends an alarm signal to the outside.
  • On one hand, the alarm alerts the user, who can then actively move back into the shooting range of the camera component; on the other hand, fall detection can also be realized.
  • In such a case, the alarm signal can be sent to the outside automatically. If the user does not cancel the alarm within a certain period of time, the mobile terminals of other users associated with the user, or an emergency telephone number, can be further notified. This ensures the user's safety during use while providing high-quality shooting.
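A sketch of the acceleration check behind the alarm. The finite-difference estimate from three sampled positions, the sampling interval, and the alarm callback are all assumptions for illustration.

```python
def position_change_acceleration(positions, dt):
    """Estimate the user's position-change acceleration from three successive
    sampled positions (x, y) taken dt seconds apart (finite differences)."""
    (x0, y0), (x1, y1), (x2, y2) = positions[-3:]
    v1 = ((x1 - x0) / dt, (y1 - y0) / dt)
    v2 = ((x2 - x1) / dt, (y2 - y1) / dt)
    ax, ay = (v2[0] - v1[0]) / dt, (v2[1] - v1[1]) / dt
    return (ax ** 2 + ay ** 2) ** 0.5

def check_fall(positions, dt, threshold, alarm):
    """Raise the external alarm when the acceleration exceeds the preset
    threshold (e.g. a sudden change suggesting a fall)."""
    if position_change_acceleration(positions, dt) > threshold:
        alarm("position-change acceleration exceeded threshold")
```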
  • At least one distance sensor may be disposed on the drone, the control component further includes an obstacle calculation module, and the obstacle calculation module is configured to acquire obstacle detection data of the distance sensor;
  • the to-be-executed control instruction includes a drone movement instruction
  • when the obstacle calculation module determines that the distance between the drone and an obstacle in the commanded moving direction is less than the preset safety threshold, the drone movement instruction is cancelled and a limit reminder signal is issued to the outside. That is, after the distance sensor detects an obstacle, the obstacle calculation module predicts, from the direction indicated by the pending control command, whether executing the drone movement instruction could cause a collision; if so, the movement instruction is not executed, and the user is reminded that the distance is already below the limit and there is a danger of hitting the obstacle.
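A minimal sketch of this decision rule, with an assumed distance reading and safety threshold; the sensor interface itself is not shown.

```python
def vet_move_command(obstacle_distance_m, safety_threshold_m):
    """If the obstacle detected in the commanded moving direction is closer than
    the preset safety threshold, cancel the movement and signal the limit."""
    if obstacle_distance_m is not None and obstacle_distance_m < safety_threshold_m:
        return {"execute": False, "signal": "limit_reminder"}
    return {"execute": True, "signal": None}

# Example: an obstacle 0.4 m ahead against a 0.5 m threshold cancels the move.
print(vet_move_command(0.4, 0.5))  # {'execute': False, 'signal': 'limit_reminder'}
```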
  • This embodiment is particularly suitable for shooting indoors: given the constraints of walls and ceilings and the many obstacles such as furniture and furnishings, reliable calculation and danger prediction keep the drone safe indoors. It can likewise be applied to outdoor shooting: in open space the drone may move faster and the user may not foresee an approaching danger, so this approach helps ensure the stability and reliability of the drone interactive shooting process.
  • In summary, the present invention provides a technical solution in which the user exercises control directly through body actions: the camera component automatically acquires the captured image, the control component automatically analyzes it to obtain the user action feature to be executed and interprets the control command the user requires, and the user can thus directly control the flight of the drone and the shooting of the camera component, realizing the shooting function, easily obtaining shots that meet the demand on any occasion, and improving the user experience.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Studio Devices (AREA)

Abstract

Provided are an interactive photographing system and method for an unmanned aerial vehicle. The system comprises an unmanned aerial vehicle, a camera component, and a control component. One end of the camera component is rotatably connected to one side of the unmanned aerial vehicle. The control component comprises a control instruction library, an image processing module, an instruction determining module, and an instruction execution module. The instruction execution module controls the unmanned aerial vehicle and/or the camera component according to a found control instruction. Provided is a technical solution for implementing interactive photographing based on an unmanned aerial vehicle: a captured image is automatically obtained by the camera component and automatically analyzed by the control component to obtain a user action feature to be executed; a control instruction desired by the user is interpreted according to the user action feature to be executed; thus, the user can directly perform flight control on the unmanned aerial vehicle and photographing control on the camera component by means of an action, so as to achieve a photographing function. The present invention can easily implement photographing that satisfies requirements on any occasion, improving user experience.

Description

UAV interactive shooting system and method

Technical field

The invention relates to the technical field of drone control, and in particular to a drone interactive shooting system and method in which the user exercises direct control through body actions.

Background art

Unmanned aerial vehicles, or drones, are unmanned aircraft operated by radio remote-control equipment and onboard program control. Owing to the rapid development of drone technology in recent years, drones have been widely applied in many fields.

Existing UAV shooting can be divided into two areas: commercial aerial photography and personal entertainment self-portraits, both currently operated with a remote controller or with an application on a handheld mobile device. However, when using a drone for personal entertainment selfies, the user often has to attend to both the drone and the remote controller at the same time, which is inconvenient. For example, when shooting a group photo at a party, the user often cannot watch the application screen on the handheld mobile device, so a clear shot of the face cannot be captured; or, when shooting a jumping self-portrait, the remote controller held in the hand prevents the user from performing the desired action, which degrades the shooting result.

In addition, considering the drone's own weight and size, miniaturized selfie drones tend to carry little battery capacity and have short endurance, which limits the fun of shooting and cannot meet current users' needs.

Summary of the invention

Aiming at the problems in the prior art, the object of the present invention is to provide a UAV interactive shooting system and method with which the user can directly control the flight of the drone and the shooting of the camera assembly through body actions, thereby realizing the shooting function and improving the shooting result.
An embodiment of the present invention provides a UAV interactive photographing system. The system includes a drone, a camera assembly, and a control component, one end of the camera assembly being rotatably connected to one side of the drone; the control component includes:

a control instruction library, configured to store preset mappings between various user action features and various control commands, where a control command includes a drone control command and/or a camera assembly control command;

an image processing module, configured to process the captured image of the camera assembly to acquire the user action feature to be executed in the captured image;

an instruction determining module, configured to search the control instruction library for the corresponding control command according to the user action feature to be executed; and

an instruction execution module, configured to control the drone and/or the camera assembly according to the control command found.
可选地,所述摄像组件包括摄像设备和摄像支架,所述摄像设备设置于所述摄像 支架中,且所述摄像支架的一端可转动地连接至所述无人机的一侧;Optionally, the camera assembly includes an imaging device and a camera bracket, and the camera device is disposed in the camera In the bracket, and one end of the camera bracket is rotatably connected to one side of the drone;
所述系统还包括一显示设备,所述显示设备可拆卸或固定地安装于所述摄像支架的另一端。The system also includes a display device detachably or fixedly mounted to the other end of the camera mount.
可选地,所述显示设备包括阵列式显示屏和第一显示控制单元;所述第一显示控制单元获取所述摄像设备的拍摄图像,并通过所述阵列式显示屏进行显示。Optionally, the display device comprises an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device and displays through the array display screen.
可选地,所述显示设备包括点阵式显示屏和第二显示控制单元;所述第二显示控制单元获取所述指令判定模块查找得到的控制指令,并控制所述点阵式显示屏显示与所述查找得到的控制指令相关联的用户提示信息。Optionally, the display device includes a dot matrix display screen and a second display control unit; the second display control unit acquires a control command obtained by the instruction determination module, and controls the dot matrix display User prompt information associated with the search control command obtained.
Optionally, one end of the camera bracket is formed as a projection, and one side of the drone is provided with a groove matching the shape of the projection; the projection of the camera bracket is embedded in the groove of the drone.
The lower surface of the drone is a plane and includes a camera bracket matching area; the two side faces of the groove of the drone are perpendicular to the lower surface of the drone, and the projection of the camera bracket is rotatable within the groove, so that the camera bracket can rotate within an angle range from being perpendicular to the lower surface of the drone to lying flat against the camera bracket matching area.
Optionally, the lower surface of the drone further includes a power storage device matching area, and the power storage device matching area does not overlap the camera bracket matching area.
The system further includes a power storage device detachably or fixedly mounted on the lower surface of the drone, the power storage device fitting against the power storage device matching area.
Optionally, the camera bracket includes a first arm, a second arm, and a third arm; one side of the first arm is connected to the projection, and the other side of the first arm is provided with a first slot; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, and both the second arm and the third arm are perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot.
One side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
Optionally, the system further includes a voice acquisition device configured to acquire voice data of the user.
The control instruction library is further configured to store preset mappings between various voice keywords and various control instructions.
The control assembly further includes a voice processing module configured to extract the voice keywords contained in the user's voice data.
The instruction determination module is further configured to look up the corresponding control instruction in the control instruction library according to the extracted voice keyword.
Optionally, the voice processing module is further configured to acquire the user's voiceprint feature from the user's voice data and determine whether the user's voiceprint feature is a pre-stored designated voiceprint feature.
If the user's voiceprint feature is a preset permitted voiceprint feature, the instruction determination module extracts the voice keywords contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted voice keyword.
If the user's voiceprint feature is not a preset permitted voiceprint feature, the instruction determination module ignores the user's voiceprint feature and does not perform voice keyword extraction.
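For illustration only, the sketch below shows one way the voiceprint gate and keyword lookup described above could be combined; the keyword list, the cosine-similarity comparison, and the threshold value are assumptions, not taken from the disclosure.

```python
VOICE_KEYWORDS = {"power on": "camera_on", "move left": "drone_move_left"}
SIMILARITY_THRESHOLD = 0.8  # assumed value

def voiceprint_matches(voice_embedding, stored_embedding) -> bool:
    """Compare the speaker's voiceprint feature with the pre-stored designated one."""
    dot = sum(a * b for a, b in zip(voice_embedding, stored_embedding))
    norm = (sum(a * a for a in voice_embedding) ** 0.5) * \
           (sum(b * b for b in stored_embedding) ** 0.5)
    return norm > 0 and dot / norm >= SIMILARITY_THRESHOLD

def process_voice(transcript: str, voice_embedding, stored_embedding):
    """Return the command mapped to a keyword, or None if the speaker is not the designated user."""
    if not voiceprint_matches(voice_embedding, stored_embedding):
        return None  # ignore voices that do not match the designated user
    for keyword, command in VOICE_KEYWORDS.items():
        if keyword in transcript.lower():
            return command
    return None
```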
Optionally, the image processing module is further configured to acquire physiological features of the user from the captured image of the camera assembly and determine whether the user's physiological features are pre-stored designated physiological features.
If the user's physiological features are the pre-stored designated physiological features, the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed.
If the user's physiological features are not the pre-stored designated physiological features, the instruction determination module ignores the user action feature to be executed and does not perform the control instruction lookup.
Optionally, the drone control instructions include at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction; the camera assembly control instructions include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction.
Optionally, the control instructions further include:
a first mode selection instruction, which instructs the control assembly to enter a first mode; in the first mode, the instruction determination module looks up the corresponding drone control instruction in the control instruction library according to the user action feature and controls the drone according to the drone control instruction found; and
a second mode selection instruction, which instructs the control assembly to enter a second mode; in the second mode, the instruction determination module looks up the corresponding camera assembly control instruction in the control instruction library according to the user action feature and controls the camera assembly according to the camera assembly control instruction found.
Optionally, the control instructions further include:
a panorama mode selection instruction, which instructs the control assembly to enter a panorama mode; in the panorama mode, the instruction execution module controls the drone to move continuously within an angle range of (0, α) at a preset speed, where α is the preset maximum panorama shooting angle.
Optionally, in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the camera assembly detects the position of the user;
the drone, taking the user's position as the starting point, rotates by α/n to one side within the same horizontal plane, where n is a first preset division value and n > 1;
the camera assembly starts shooting, and the drone rotates by α toward the other side at a constant preset speed within the same horizontal plane; and
after the drone stops rotating, the camera assembly stops shooting.
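As a minimal illustrative sketch of the rotating panorama sequence above (the blocking `rotate`, `start_shooting`, and `stop_shooting` calls are assumed interfaces, not part of the disclosure):

```python
def rotating_panorama(drone, camera, alpha_deg: float, n: float = 2.0) -> None:
    """Offset by alpha/n to one side of the user, then sweep alpha across while recording."""
    drone.rotate(-alpha_deg / n)    # positioning stage: no shooting yet
    camera.start_shooting()
    drone.rotate(alpha_deg)         # constant-speed sweep toward the other side (assumed to block until done)
    camera.stop_shooting()          # stop once the drone has finished rotating
```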
Optionally, in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the center, generates a first sector with radius L/m and central angle α, the subject to be photographed lying on the arc of the first sector, where m is a second preset division value and m > 1;
the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the second sector having radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of that arc to the other end of the arc; and
after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
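A hedged 2-D geometric sketch of the arc trajectory described above: the anchor is placed on the user-drone line, L/m from the user, and waypoints are generated along the opposite arc of radius (m-1)L/m. The coordinate frame, waypoint count, and anchor placement are assumptions for illustration only.

```python
import math

def arc_waypoints(user_xy, drone_xy, alpha_deg: float, m: float = 3.0, steps: int = 20):
    """Waypoints for the drone along the second sector's arc, spanning alpha degrees."""
    ux, uy = user_xy
    dx, dy = drone_xy
    L = math.hypot(dx - ux, dy - uy)
    ex, ey = (dx - ux) / L, (dy - uy) / L              # unit vector from user toward drone
    ax, ay = ux + ex * (L / m), uy + ey * (L / m)      # anchor point, L/m away from the user
    R = (m - 1) * L / m                                # radius of the second sector
    base = math.atan2(dy - ay, dx - ax)                # direction from anchor toward the drone side
    half = math.radians(alpha_deg) / 2.0
    step = math.radians(alpha_deg) / (steps - 1)
    return [(ax + R * math.cos(base - half + i * step),
             ay + R * math.sin(base - half + i * step))
            for i in range(steps)]
```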
Optionally, in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the apex and L/m as the length of the legs, generates a first isosceles triangle with apex angle α, the subject to be photographed lying on the base of the first isosceles triangle, where m is a second preset division value and m > 1;
the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, the legs of the second isosceles triangle having length (m-1)L/m and the apex angle being α;
the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the trajectory of the base to the other end of the base; and
after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
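For illustration, the straight-line variant above reduces to computing the two vertices of the second triangle's base and flying between them while shooting; the 2-D coordinates and helper names below are assumptions, not the disclosed implementation.

```python
import math

def line_endpoints(user_xy, drone_xy, alpha_deg: float, m: float = 3.0):
    """Endpoints of the second isosceles triangle's base; the drone flies from p1 to p2."""
    ux, uy = user_xy
    dx, dy = drone_xy
    L = math.hypot(dx - ux, dy - uy)
    ex, ey = (dx - ux) / L, (dy - uy) / L              # unit vector from user toward drone
    ax, ay = ux + ex * (L / m), uy + ey * (L / m)      # apex: anchor point L/m from the user
    leg = (m - 1) * L / m                              # leg length of the second triangle
    base = math.atan2(dy - ay, dx - ax)                # axis from apex toward the drone side
    half = math.radians(alpha_deg) / 2.0
    p1 = (ax + leg * math.cos(base - half), ay + leg * math.sin(base - half))
    p2 = (ax + leg * math.cos(base + half), ay + leg * math.sin(base + half))
    return p1, p2
```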
Optionally, the control instructions further include:
a third mode selection instruction, which instructs the control assembly to enter a third mode; in the third mode, the instruction execution module controls the camera assembly to take the shot after a preset waiting time.
Optionally, the control instructions further include:
a fourth mode selection instruction, which instructs the control assembly to enter a fourth mode; in the fourth mode, the instruction execution module detects the position of the user through the camera assembly and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user.
Optionally, in the fourth mode, the instruction execution module acquires the acceleration of the change in the user's position, and when that acceleration exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
Optionally, the drone is further provided with at least one distance sensor, and the control assembly further includes an obstacle calculation module configured to acquire obstacle detection data from the distance sensor.
When the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the movement direction of the drone movement instruction is less than a preset safety threshold, the obstacle calculation module cancels the drone movement instruction and sends a limit warning signal to the outside.
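A minimal sketch of the pre-move safety check described above; the direction labels, sensor data layout, and threshold value are assumed for illustration.

```python
SAFETY_THRESHOLD_M = 1.0  # assumed safety threshold in metres

def check_move(move_direction, sensor_readings):
    """Return True if the move may be executed, False if it must be cancelled."""
    distance = sensor_readings.get(move_direction, float("inf"))
    if distance < SAFETY_THRESHOLD_M:
        print("limit warning: obstacle %.2f m away in direction %s" % (distance, move_direction))
        return False
    return True

# Example readings from the distance sensors, keyed by movement direction.
readings = {"left": 0.6, "right": 3.2, "forward": 5.0}
assert check_move("left", readings) is False   # too close: cancel and warn
assert check_move("right", readings) is True   # safe: execute the move
```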
The present invention further provides a drone interactive photographing method using the drone interactive photographing system described above, the method including the following steps:
the camera assembly acquires a captured image;
the image processing module processes the captured image of the camera assembly to obtain the user action feature to be executed in the captured image;
the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and
the instruction execution module controls the drone and/or the camera assembly according to the control instruction found by the lookup.
Optionally, the control instructions further include a panorama mode selection instruction, which instructs the control assembly to enter a panorama mode; in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph through the following steps:
the camera assembly detects the position of the user;
the drone, taking the user's position as the starting point, rotates by α/n to one side within the same horizontal plane, where n is a first preset division value, n > 1, and α is the preset maximum panorama shooting angle;
the camera assembly starts shooting, and the drone rotates by α toward the other side at a constant preset speed within the same horizontal plane; and
after the drone stops rotating, the camera assembly stops shooting.
Optionally, the control instructions further include a panorama mode selection instruction, which instructs the control assembly to enter a panorama mode; in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the center, generates a first sector with radius L/m and central angle α, the subject to be photographed lying on the arc of the first sector, where m is a second preset division value, m > 1, and α is the preset maximum panorama shooting angle;
the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the second sector having radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of that arc to the other end of the arc; and
after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
Optionally, the control instructions further include a panorama mode selection instruction, which instructs the control assembly to enter a panorama mode; in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user;
the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the apex and L/m as the length of the legs, generates a first isosceles triangle with apex angle α, the subject to be photographed lying on the base of the first isosceles triangle, where m is a second preset division value, m > 1, and α is the preset maximum panorama shooting angle;
the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, the legs of the second isosceles triangle having length (m-1)L/m and the apex angle being α;
the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the trajectory of the base to the other end of the base; and
after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
The drone interactive photographing system and method provided by the present invention have the following advantages:
The present invention provides a technical solution in which the user exercises direct control through body movements. The camera assembly automatically acquires captured images, the control assembly automatically analyzes them to obtain the user action feature to be executed, and the control instruction the user needs is interpreted from that feature. The user can thus directly control the flight of the drone and the shooting of the camera assembly through movements, realizing the photographing function, easily obtaining the desired shots on any occasion, and improving the user experience.
BRIEF DESCRIPTION OF THE DRAWINGS
Other features, objects, and advantages of the present invention will become more apparent from the detailed description of the non-limiting embodiments made with reference to the following drawings.
Fig. 1 is a structural block diagram of a drone interactive photographing system according to an embodiment of the present invention;
Fig. 2 is a schematic structural diagram of a drone interactive photographing system using an array display screen according to an embodiment of the present invention;
Fig. 3 is a schematic structural diagram of a drone interactive photographing system using a dot-matrix display screen according to an embodiment of the present invention;
Fig. 4 is a schematic diagram of adjusting the position of the drone according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of adjusting the angle of the camera assembly according to an embodiment of the present invention;
Figs. 6-7 are schematic diagrams of gesture control according to an embodiment of the present invention;
Fig. 8 is a schematic structural diagram using an external display device according to an embodiment of the present invention;
Fig. 9 is a schematic structural diagram of the display device when folded away according to an embodiment of the present invention;
Fig. 10 is a schematic bottom view of the drone when not in use according to an embodiment of the present invention;
Fig. 11 is a schematic structural diagram of the power storage device according to an embodiment of the present invention;
Fig. 12 is a schematic diagram of the state of the drone during charging according to an embodiment of the present invention;
Fig. 13 is a flowchart of the charging process of the drone according to an embodiment of the present invention;
Fig. 14 is a schematic diagram of controlling the position of the drone by voice according to an embodiment of the present invention;
Fig. 15 is a schematic structural diagram of a drone interactive photographing system with added voice control according to an embodiment of the present invention;
Fig. 16 is a flowchart of user voiceprint verification according to an embodiment of the present invention;
Fig. 17 is a flowchart of user physiological feature verification according to an embodiment of the present invention;
Figs. 18-20 are flowcharts of a drone interactive photographing method according to an embodiment of the present invention;
Fig. 21 is a flowchart of panoramic shooting according to an embodiment of the present invention;
Fig. 22 is a schematic diagram of the rotation of the drone during panoramic shooting according to an embodiment of the present invention;
Fig. 23 is a schematic diagram of the drone moving along an arc trajectory during panoramic shooting according to an embodiment of the present invention;
Fig. 24 is a schematic diagram of the drone moving along a straight trajectory during panoramic shooting according to an embodiment of the present invention;
Fig. 25 is a flowchart of the drone automatically tracking the user's position according to an embodiment of the present invention;
Fig. 26 is a flowchart of automatic obstacle avoidance of the drone according to an embodiment of the present invention.
DETAILED DESCRIPTION
Example embodiments will now be described more fully with reference to the accompanying drawings. The example embodiments can, however, be embodied in many forms and should not be construed as limited to the embodiments set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and repeated description of them will be omitted.
As shown in Fig. 1, an embodiment of the present invention provides a drone interactive photographing system. The system includes a drone 200, a camera assembly 300, and a control assembly 100, one end of the camera assembly 300 being rotatably connected to one side of the drone 200. The control assembly 100 includes: a control instruction library 110 configured to store preset mappings between various user action features and various control instructions, the control instructions including drone control instructions and/or camera assembly control instructions; an image processing module 120 configured to process a captured image of the camera assembly 300 to obtain a user action feature to be executed in the captured image; an instruction determination module 130 configured to look up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and an instruction execution module 140 configured to control the drone 200 and/or the camera assembly 300 according to the control instruction found by the lookup.
The user action feature here is preferably a gesture of the user, that is, different gestures produce different control instructions. In practical applications, other user action features may also be used, such as the user's gaze, a nod, a head shake, or a laugh; for example, the system may be set to take a shot the moment it captures a picture of the user laughing, thereby automatically capturing the user's smile, and so on. The embodiments below are mostly described in terms of gesture control; it should be understood, however, that using other user action features also falls within the scope of protection of the present invention.
Fig. 2 is a schematic structural diagram of a drone interactive photographing system according to an embodiment of the present invention. It shows a drone 200, on one side of which a camera assembly 300 is rotatably mounted. The camera assembly 300 includes an imaging device 320 and a camera bracket 310; the imaging device 320 is disposed in the camera bracket 310, and one end of the camera bracket 310 is rotatably connected to one side of the drone 200. Further, the system may also include a display device 330 detachably or fixedly mounted to the other end of the camera bracket 310.
To facilitate control of the drone 200 and/or the camera assembly 300 by the control assembly 100, the control assembly 100 may be arranged inside the drone 200, on the surface of the drone 200, or at other locations, all of which fall within the scope of protection of the present invention. The instruction execution module 140 may communicate directly with the controller of the drone 200, and may also communicate wirelessly with the camera assembly 300, so as to transmit control instructions and receive feedback.
The display device 330 can display content for the user to view as required; two arrangements of the display device 330 are shown in Figs. 2 and 3.
The display device 330 shown in Fig. 2 includes an array display screen and a first display control unit; the first display control unit acquires the captured image of the imaging device 320 and displays it on the array display screen. The array display screen may include, but is not limited to, a color LCD screen, and the user can observe the selfie picture on it in real time.
The display device 330 shown in Fig. 3 includes a dot-matrix display screen and a second display control unit; the second display control unit acquires the control instruction found by the instruction determination module 130 and controls the dot-matrix display screen to display user prompt information associated with that control instruction. The dot-matrix display screen may include, but is not limited to, a dot-matrix LED screen, and the user can prepare for and take the selfie according to the arrangement of the lit LEDs.
For example, the user prompt information may be a selfie countdown: when shooting starts after a five-second countdown, the dot-matrix display screen displays 5, 4, 3, 2, 1 in sequence, and the user can get ready for the selfie according to the countdown. The user prompt information may also indicate which shooting mode is currently active; for example, displaying 2 indicates that the second mode is currently active, and so on.
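Purely as an illustrative sketch of driving such prompts (the display interface and timing are assumptions, not the disclosed second display control unit):

```python
import time

def show_countdown(display, seconds: int = 5) -> None:
    """Show the selfie countdown on the dot-matrix screen, one digit per second."""
    for remaining in range(seconds, 0, -1):
        display.show(str(remaining))   # e.g. LED pattern for "5", "4", ...
        time.sleep(1)

def show_mode(display, mode: int) -> None:
    """Show the currently active shooting mode, e.g. "2" while in the second mode."""
    display.show(str(mode))
```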
With the drone interactive photographing system of the present invention, the camera assembly automatically acquires captured images, the control assembly automatically analyzes them to obtain the user action feature to be executed, and the control instruction the user needs is interpreted from that feature, so that the user can control the drone 200 and/or the camera assembly 300.
When controlling the drone 200 and/or the camera assembly 300, the drone control instructions may include at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction; the camera assembly control instructions may include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction. The shooting parameters that can be adjusted here may include focus, fill light, image size, and so on.
Fig. 4 is a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention. The position of the drone may be adjusted through the following steps:
a. after the drone 200 starts and takes off, it hovers at an initial position;
b. the user 400 observes the selfie angle on the display device 330 and finds that the portrait is to the left of center on the display device 330 (the portrait shown in dashed lines in Fig. 4); the user 400 controls the drone by gesture (from the dashed-line hand position of the user 400 in Fig. 4 to the solid-line position) to move to the left until the portrait is centered in the picture (the portrait shown in solid lines in Fig. 4);
c. once the shooting conditions are met, the user 400 triggers the shot by gesture control.
Fig. 5 is a schematic diagram of adjusting the angle of the camera assembly 300 according to an embodiment of the present invention. The camera assembly 300 may be adjusted through the following steps:
a. after the drone 200 starts and takes off, it hovers at an initial position;
b. the user 400 observes the selfie angle on the display device 330 and finds that the drone 200 is too high and the portrait is below center (the portrait shown in dashed lines in Fig. 5); the user controls the camera assembly 300 by gesture (from the dashed-line hand position of the user 400 in Fig. 4 to the solid-line position) to tilt downward, which drives the imaging device 320 to tilt downward until the portrait is centered in the picture (the portrait shown in solid lines in Fig. 5);
c. once the shooting conditions are met, the user 400 triggers the shot by gesture control.
In addition, the way the drone 200 and the camera assembly 300 are controlled can be chosen flexibly. For example, in Fig. 5, when the portrait is below center, the adjustment can also be made by lowering the height of the drone 200 so that the portrait is centered in the picture.
Specifically, the adjustments to the drone 200 and the camera assembly 300 can be distinguished by different preset gesture instructions. That is, once a gesture is recognized, it is known whether the object the gesture controls is the drone 200 or the camera assembly 300, and which specific action of the drone 200 or the camera assembly 300 the gesture controls.
Figs. 6 and 7 show a mapping between user action features and control instructions. The control instructions corresponding to the various gestures are shown in Table 1 below.
Table 1. Gesture-to-control-instruction mapping table
(Table 1 is published as images PCTCN2017080738-appb-000001 and PCTCN2017080738-appb-000002; its entries are not reproduced in the text.)
Figs. 6 and 7 give only one example of gesture control. In practical applications, the user may also customize the mappings between different gestures and different control instructions, modifying them into gestures that match the user's own habits. Other action features may also be added; for example, a nod by the user confirms the shot, a head shake deletes the previously captured image, and so on.
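Since Table 1 is only available as images in the published document, the entries below are illustrative placeholders showing how a user-editable gesture-to-command mapping of the kind just described could be stored and overridden; they are not the patent's actual table.

```python
DEFAULT_GESTURE_MAP = {
    "palm_left":   ("drone", "move_left"),
    "palm_right":  ("drone", "move_right"),
    "palm_down":   ("camera", "tilt_down"),
    "fist":        ("camera", "start_shooting"),
    "nod":         ("camera", "confirm_shot"),
    "head_shake":  ("camera", "delete_last_image"),
}

def customize(gesture_map: dict, gesture: str, target: str, action: str) -> dict:
    """Let the user remap a gesture to a different command without touching the defaults."""
    updated = dict(gesture_map)
    updated[gesture] = (target, action)
    return updated

# Example: the user prefers "palm_down" to lower the drone instead of tilting the camera.
user_map = customize(DEFAULT_GESTURE_MAP, "palm_down", "drone", "descend")
```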
Figs. 8 and 9 show two arrangements of the camera assembly 300. As shown in Fig. 8, the camera assembly 300 uses an external display device 340, which may further be the user's mobile terminal; the external display device 340 and the control assembly 100 may communicate wirelessly or through a data cable such as USB. One end of the camera bracket 310 is formed as a projection 311, and one side of the drone 200 is provided with a groove 210 matching the shape of the projection; the projection 311 of the camera bracket is embedded in the groove 210 of the drone.
In the arrangement of Fig. 8, the camera bracket 310 includes a first arm 312, a second arm 313, and a third arm 314. One side of the first arm 312 is connected to the projection 311, and the other side of the first arm 312 is provided with a first slot; one end of the second arm 313 and one end of the third arm 314 are respectively connected to the two ends of the first arm 312, and both the second arm 313 and the third arm 314 are perpendicular to the first arm 312; the other end of the second arm 313 is provided with a second slot, and the other end of the third arm 314 is provided with a third slot. The external display device 340 can be placed in the camera bracket 310, with the upper end of the external display device 340 inserted into the first slot and its lower end inserted into the second slot and the third slot, thereby forming a connection between the external display device 340 and the camera bracket 310 that is both stable and easy to attach and detach.
In the embodiment of Fig. 9, the display device 330 is a built-in display device 330. Likewise, in this arrangement the camera bracket 310 rotates through the engagement of the projection 311 with the groove 210, and the display device 330 rotates together with the camera bracket 310. The lower surface of the drone 200 is a plane and includes a camera bracket matching area 220; the two side faces of the groove 210 of the drone 200 are perpendicular to the lower surface of the drone 200, so that the projection 311 can rotate up and down in the groove 210, allowing the camera bracket 310 to rotate within an angle range from being perpendicular to the lower surface of the drone 200 to lying flat against the camera bracket matching area 220. As described above, during use the camera assembly 300 can be adjusted within the required angle range to obtain better shooting results; when it is no longer in use, or when the drone's battery is exhausted, the camera bracket 310 can be folded flat against the camera bracket matching area 220, making the device convenient to fold up and carry.
In addition, since the drone 200 is generally small in size, its battery capacity is also small and its endurance is short. To overcome this problem, an embodiment of the present invention further provides a convenient charging arrangement. As shown in Figs. 10 to 12, the lower surface of the drone 200 further includes a power storage device matching area 230, and the power storage device matching area does not overlap the camera bracket matching area 220; the system further includes a power storage device 500 detachably or fixedly mounted on the lower surface of the drone 200, the power storage device 500 fitting against the power storage device matching area.
Fig. 13 is a flowchart of charging with the structure of this embodiment. When the display screen is an external display screen, the connection between the external display screen and the drone is first disconnected; the external display screen may be removed, or it may be left on the camera bracket 310 and folded together with it. If the power storage device 500 has been inserted at this point, charging begins; otherwise the drone simply powers off. To keep the drone 200 lightly loaded in flight, the power storage device 500 is removed while the drone 200 has power and is in use; when the drone 200 is not in use or its battery is exhausted, the camera bracket 310 can be folded and the power storage device 500 mounted on the power storage device matching area. The power storage device 500 is connected to the rechargeable battery of the drone to carry out charging. When folded for charging, the drone 200 takes up a smaller volume and is easy to carry, and it can be used again once charging is complete.
As shown in Figs. 14 and 15, an embodiment of the present invention may further include a voice acquisition device 600 configured to acquire voice data of the user; the control instruction library 110 is further configured to store preset mappings between various voice keywords and various control instructions; the control assembly 100 further includes a voice processing module 150 configured to extract the voice keywords contained in the user's voice data; and the instruction determination module 130 is further configured to look up the corresponding control instruction in the control instruction library according to the extracted voice keyword.
By providing the voice acquisition device 600, this embodiment also enables the user to control shooting by voice. For example, the keyword "power on" may be set to turn on the camera assembly 300: when the word "power on" is detected in the user's voice data, the camera assembly 300 is turned on automatically; or, when "drone" and "move left" are detected in the user's voice data, the drone is automatically controlled to move to the left. Voice control is convenient, is not constrained by other conditions, and can be applied on any occasion without affecting the user's shooting results.
Further, as shown in Fig. 16, considering that when the user is using voice control the control assembly 100 may pick up the voices of other people or noise in the environment, different voices also need to be distinguished. The voice processing module is therefore further configured to acquire the user's voiceprint feature from the user's voice data and determine whether the user's voiceprint feature is a pre-stored designated voiceprint feature.
If the user's voiceprint feature is a preset permitted voiceprint feature, the voice data belongs to the designated user and control may be carried out according to it: the instruction determination module extracts the voice keywords contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted keyword. If the user's voiceprint feature is not a preset permitted voiceprint feature, the voice data does not belong to the designated user and needs to be filtered out: the instruction determination module ignores the user's voiceprint feature and does not perform voice keyword extraction.
Likewise, as shown in Fig. 17, the camera assembly 300 may also capture the action features of people other than the designated user. To avoid confusion, the image processing module is further configured to acquire physiological features of the user from the captured image of the camera assembly and determine whether the user's physiological features are pre-stored designated physiological features.
If the user's physiological features are the pre-stored designated physiological features, the captured action is that of the designated user, and the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed; if the user's physiological features are not the pre-stored designated physiological features, the instruction determination module ignores the user action feature to be executed and does not perform the control instruction lookup.
Here, acquiring the user's physiological features may refer to the contours of the user's facial features, the user's hair color, hair length, skin color, lip color, and so on; several physiological features may also be combined for more accurate identification. All such variations fall within the scope of protection of the present invention.
As shown in Fig. 18, an embodiment of the present invention further provides a drone interactive photographing method using the drone interactive photographing system described above, the method including the following steps:
S100: the camera assembly acquires a captured image;
S200: the image processing module processes the captured image of the camera assembly to obtain the user action feature to be executed in the captured image;
S300: the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and
S400: the instruction execution module controls the drone and/or the camera assembly according to the control instruction found by the lookup.
When the control instruction includes a drone control instruction, a camera assembly control instruction, or another valid instruction, the determination process may follow the flow shown in Fig. 19, performing the determinations and the corresponding control in sequence; however, the invention is not limited to this order. Other orderings, such as first determining whether the instruction is a camera assembly control instruction and then determining whether it is a drone control instruction, also fall within the scope of protection of the present invention.
Fig. 20 shows an implementation of a specific drone interactive photographing method. First, the type of display screen is determined; if it is an external display screen, the control assembly must first be connected to the external display screen by wireless communication in preparation for the subsequent control. Then the corresponding control instruction is looked up according to the correspondence between gestures and control instructions, and the control is executed. As noted above, the action features of the present invention are not limited to gestures; different movements of other body parts can also achieve the object of the present invention.
As noted above, drone control instructions and camera assembly control instructions can be distinguished by different action features. They can also be handled through different control modes. For example, the control instructions may further include a first mode selection instruction and a second mode selection instruction, which instruct the control assembly to enter the first mode and the second mode respectively.
After the first mode is entered, subsequently received user action features are interpreted by default as drone control instructions: the instruction determination module looks up the corresponding drone control instruction in the control instruction library according to the user action feature and controls the drone according to the drone control instruction found, and camera assembly control instructions are not executed. After the second mode is entered, subsequently received user action features are interpreted by default as camera assembly control instructions: the instruction determination module looks up the corresponding camera assembly control instruction in the control instruction library according to the user action feature and controls the camera assembly according to the camera assembly control instruction found, and drone control instructions are not executed.
In this way, the number of action features the user has to set can be reduced. For example, the same gesture of spreading the palm and moving it downward means controlling the drone to move downward in the first mode, and means controlling the camera assembly to tilt downward in the second mode. Only one specific implementation is given here, and the scope of protection of the present invention is not limited thereto.
Further, because of the stability and controllability of a drone in flight, it has some irreplaceable advantages over a hand-held camera. For example, a drone can take photographs with less shake, placing lower demands on the anti-shake performance of the imaging device. When taking panoramic photographs with a hand-held camera, users often fail to obtain an ideal panorama because of shake or other interference; this problem can be overcome by a drone.
As shown in Figs. 21 and 22, the control instructions further include a panorama mode selection instruction, which instructs the control assembly to enter a panorama mode; in the panorama mode, the instruction execution module controls the drone to rotate continuously within an angle range of (0, α) at a preset speed, where α is the preset maximum panorama shooting angle. Optionally, in the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the camera assembly detects the position of the user 400;
the drone 200, taking the position of the user 400 as the starting point, rotates by α/n to one side within the same horizontal plane; this is the positioning stage of the drone, during which no shooting takes place, and n is the first preset division value;
the camera assembly starts shooting, and the drone 200 rotates by α toward the other side at a constant preset speed within the same horizontal plane, thereby obtaining a panoramic photograph covering an angle of α with the user at a designated position within the panorama; and
after the drone 200 stops rotating, the camera assembly stops shooting.
When n is 2, the user is placed in the center of the panoramic photograph. In practical applications, the angle α can be set as required, and the position of the user within the panorama can also be adjusted: for example, to place the user to the left of center, the drone may first be rotated by α/4 to one side, and so on. This shooting approach is very flexible, the success rate for panoramic photographs is high, and the resulting photographs are better.
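The following tiny helper, given only as an illustration, restates the pre-positioning rule just discussed (n = 2, i.e. an offset of α/2, centres the user; an offset of α/4 places the user toward one edge); the sign and direction conventions are assumptions.

```python
def initial_offset_deg(alpha_deg: float, user_fraction: float = 0.5) -> float:
    """Angle to pre-rotate before the sweep so the user appears at `user_fraction`
    across the panorama (0.0 = one edge, 0.5 = centre, 1.0 = the other edge)."""
    return alpha_deg * user_fraction

print(initial_offset_deg(180, 0.5))   # 90.0 -> alpha/2, user centred
print(initial_offset_deg(180, 0.25))  # 45.0 -> alpha/4, user toward the left of centre
```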
Fig. 23 shows another way of taking a panorama. In the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e., the distance shown by the dashed line connecting the user 400 and the drone 200 in the figure;
the instruction execution module selects an anchor point between the camera assembly and the user 400 and, taking the anchor point as the center, generates a first sector 701 with radius L/m and central angle α, the subject to be photographed lying on the arc of the first sector 701, where m is a second preset division value, m > 1, and α is the preset maximum panorama shooting angle;
the instruction execution module generates a second sector 702 opposite the first sector 701, the two sides of the second sector 702 being the reverse extensions of the two sides of the first sector 701, the second sector 702 having radius (m-1)L/m and central angle α;
the camera assembly starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 along the trajectory of that arc to the other end of the arc; and
after the drone 200 reaches the other end of the arc of the second sector 702, the camera assembly stops shooting.
Fig. 24 shows yet another way of taking a panorama. In the panorama mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e., the distance shown by the dashed line connecting the user 400 and the drone 200 in the figure;
the instruction execution module selects an anchor point between the camera assembly and the user 400 and, taking the anchor point as the apex and L/m as the length of the legs, generates a first isosceles triangle 703 with apex angle α, the subject to be photographed lying on the base of the first isosceles triangle 703, where m is a second preset division value, m > 1, and α is the preset maximum panorama shooting angle;
the instruction execution module generates a second isosceles triangle 704 opposite the first isosceles triangle 703, the two legs of the second isosceles triangle 704 being the reverse extensions of the two legs of the first isosceles triangle 703, the legs of the second isosceles triangle 704 having length (m-1)L/m and the apex angle being α;
the camera assembly starts shooting, and the drone 200 moves from one end of the base of the second isosceles triangle 704 along the trajectory of the base to the other end of the base; and
after the drone 200 reaches the other end of the base of the second isosceles triangle 704, the camera assembly stops shooting.
The shooting trajectories in Figs. 23 and 24 can be chosen as required; the panorama can be formed by continuous shooting, or by taking multiple photographs and stitching them into a single panoramic photograph. Different choices of m and α give different shooting ranges, offering greater flexibility. The drone can move along the calculated preset trajectory so that the camera assembly obtains different shooting positions and shooting angles.
在使用无人机进行拍摄时,有时需要一定的准备时间,例如可以设置拍摄倒计时,即所述控制指令还可以包括第三模式选择指令,指示所述控制组件进入第三模式,在所述第三模式下,所述指令执行模块控制所述摄像组件在预设等待时间后进行拍摄。倒计时过程中,可以通过显示设备显示倒计时时间,也可以通过其他显示灯或提示音指示剩余准备时间。When shooting with a drone, sometimes a certain preparation time is required, for example, a shooting countdown may be set, that is, the control command may further include a third mode selection instruction indicating that the control component enters the third mode, in the In the three mode, the instruction execution module controls the camera component to perform shooting after a preset waiting time. During the countdown process, the countdown time can be displayed by the display device, or the remaining preparation time can be indicated by other display lights or prompts.
As shown in FIG. 25, the drone of the present invention can also provide an automatic user-tracking shooting function. The control instructions may further include a fourth mode selection instruction, which instructs the control assembly to enter a fourth mode. In the fourth mode, the instruction execution module detects the user's position through the camera assembly and controls the drone and the camera assembly to move automatically according to that position, so that the camera assembly keeps shooting the user, thereby tracking the user automatically and ensuring that the user always stays within the shooting range.
Optionally, in the fourth mode the instruction execution module may also obtain the acceleration of the user's position change; when this acceleration exceeds a preset acceleration threshold, the instruction execution module issues an external alarm signal. In this way, on the one hand, when the camera assembly cannot capture the user's position, the alarm prompts the user to move back into the shooting range of the camera assembly; on the other hand, fall detection can also be achieved: when the user falls accidentally or faints, an alarm signal is sent out automatically, and if the user does not cancel it within a certain time, the mobile terminals of other users associated with the user can be notified or an emergency call placed. This provides the user with high-quality shooting while also ensuring the user's safety during use.
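As a rough illustration of the fall-detection idea (an assumed sketch, not part of the disclosure), the position-change acceleration can be estimated from three consecutive position samples with a second finite difference and compared with the preset threshold; the sampling scheme, function names and the raise_alarm callback are assumptions.

```python
def position_change_acceleration(p0, p1, p2, dt):
    """Magnitude of the user's position-change acceleration from three consecutive
    (x, y) position samples taken dt seconds apart (second finite difference)."""
    ax = (p2[0] - 2 * p1[0] + p0[0]) / (dt * dt)
    ay = (p2[1] - 2 * p1[1] + p0[1]) / (dt * dt)
    return (ax * ax + ay * ay) ** 0.5

def check_user_fall(p0, p1, p2, dt, accel_threshold, raise_alarm):
    """Issue the external alarm when the acceleration exceeds the preset threshold."""
    if position_change_acceleration(p0, p1, p2, dt) > accel_threshold:
        raise_alarm()   # e.g. notify associated mobile terminals if not cancelled in time
        return True
    return False
```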
As shown in FIG. 26, further, considering that when controlling the drone's actions the user may misjudge distances or operate incorrectly and cause the drone to hit an obstacle, in order to protect the drone itself, at least one distance sensor may be provided on the drone, and the control assembly may further include an obstacle calculation module configured to acquire the obstacle detection data of the distance sensor;
When the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the movement direction of that instruction is less than a preset safety threshold, the movement instruction is cancelled and a limit reminder signal is issued externally. That is, once an obstacle that the drone might hit is known from the distance sensor, the obstacle calculation module predicts, from the direction indicated by the control instruction, whether executing the movement instruction could cause a collision; if so, the movement instruction is not executed and the user is reminded that the distance is already below the limit and there is a risk of hitting the obstacle.
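The obstacle check can be summarized by the following illustrative sketch (not part of the disclosure); the sensor and drone interfaces used here (sensor.direction, sensor.read_distance, drone.move, drone.notify_limit_reached) are hypothetical names, not an actual API.

```python
def execute_move(move_cmd, distance_sensors, safety_threshold, drone):
    """Refuse a movement instruction whose direction points at an obstacle that is
    closer than the preset safety threshold (illustrative sketch)."""
    for sensor in distance_sensors:
        if sensor.direction == move_cmd.direction and \
                sensor.read_distance() < safety_threshold:
            drone.notify_limit_reached()      # limit reminder signal to the user
            return False                      # cancel the drone movement instruction
    drone.move(move_cmd)                      # no obstacle within the threshold: execute
    return True
```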
This embodiment is particularly suitable for indoor shooting. Indoors, the drone is constrained by walls and ceilings and surrounded by obstacles such as furniture and furnishings; with reliable calculation and danger prediction, this approach ensures the safety of the drone when shooting indoors. It is equally applicable to outdoor shooting: in open outdoor spaces the drone may move quickly and the user cannot easily foresee approaching dangers, so this approach ensures the stability and reliability of the interactive shooting process.
Compared with the prior art, the present invention provides a technical solution in which the user exercises direct control through actions: the camera assembly automatically acquires captured images, the control assembly automatically analyzes them to obtain the user action features to be executed, and the control instructions required by the user are interpreted from those features. The user can thus directly perform flight control of the drone and shooting control of the camera assembly through actions, realizing the shooting function and easily obtaining the desired shots on any occasion, which improves the user experience.
The above is a further detailed description of the present invention in combination with specific preferred embodiments, and the specific implementation of the present invention shall not be considered limited to these descriptions. Those of ordinary skill in the art to which the present invention belongs may make several simple deductions or substitutions without departing from the concept of the present invention, and these shall all be regarded as falling within the protection scope of the present invention.

Claims (24)

  1. A UAV interactive photographing system, characterized in that the system comprises a drone, a camera assembly and a control assembly, one end of the camera assembly being rotatably connected to one side of the drone; wherein the control assembly comprises:
    a control instruction library, configured to store preset mapping relationships between various user action features and various control instructions, the control instructions comprising drone control instructions and/or camera assembly control instructions;
    an image processing module, configured to process an image captured by the camera assembly to obtain the user action feature to be executed in the captured image;
    an instruction determination module, configured to look up, in the control instruction library, the control instruction corresponding to the user action feature to be executed; and
    an instruction execution module, configured to control the drone and/or the camera assembly according to the control instruction found by the lookup.
  2. The UAV interactive photographing system according to claim 1, characterized in that the camera assembly comprises a camera device and a camera bracket, the camera device being disposed in the camera bracket, and one end of the camera bracket being rotatably connected to one side of the drone;
    the system further comprises a display device, the display device being detachably or fixedly mounted to the other end of the camera bracket.
  3. The UAV interactive photographing system according to claim 2, characterized in that the display device comprises an array display screen and a first display control unit; the first display control unit acquires the image captured by the camera device and displays it on the array display screen.
  4. The UAV interactive photographing system according to claim 2, characterized in that the display device comprises a dot-matrix display screen and a second display control unit; the second display control unit acquires the control instruction found by the instruction determination module and controls the dot-matrix display screen to display user prompt information associated with that control instruction.
  5. The UAV interactive photographing system according to claim 2, characterized in that one end of the camera bracket is formed as a protrusion, and one side of the drone is provided with a groove matching the shape of the protrusion; the protrusion of the camera bracket is embedded in the groove of the drone;
    the lower surface of the drone is a plane and includes a camera bracket corresponding area, the two side faces of the groove of the drone are perpendicular to the lower surface of the drone, and the protrusion of the camera bracket is rotatable in the groove of the drone, so that the camera bracket can rotate within an angular range from being perpendicular to the lower surface of the drone to lying flat against the camera bracket corresponding area.
  6. The UAV interactive photographing system according to claim 5, characterized in that the lower surface of the drone further includes a power storage device corresponding area, and the power storage device corresponding area does not overlap the camera bracket corresponding area;
    the system further comprises a power storage device, the power storage device being detachably or fixedly mounted to the lower surface of the drone and fitting against the power storage device corresponding area.
  7. The UAV interactive photographing system according to claim 5, characterized in that the camera bracket comprises a first arm, a second arm and a third arm; one side of the first arm is connected to the protrusion, and the other side of the first arm is provided with a first slot; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, the second arm and the third arm both being perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot;
    one side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
  8. The UAV interactive photographing system according to claim 1, characterized by further comprising a voice acquisition device configured to acquire the user's voice data;
    the control instruction library is further configured to store preset mapping relationships between various voice keywords and various control instructions;
    the control assembly further comprises a voice processing module configured to extract the voice keywords contained in the user's voice data;
    the instruction determination module is further configured to look up, in the control instruction library, the control instruction corresponding to the extracted voice keyword.
  9. The UAV interactive photographing system according to claim 8, characterized in that the voice processing module is further configured to acquire the user's voiceprint feature from the user's voice data and to determine whether the user's voiceprint feature is a pre-stored designated voiceprint feature;
    if the user's voiceprint feature is a preset permitted voiceprint feature, the instruction determination module extracts the voice keyword contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted voice keyword;
    if the user's voiceprint feature is not a preset permitted voiceprint feature, the instruction determination module ignores the user's voiceprint feature and does not perform voice keyword extraction.
  10. The UAV interactive photographing system according to claim 1, characterized in that the image processing module is further configured to acquire the user's physiological feature from the image captured by the camera assembly and to determine whether the user's physiological feature is a pre-stored designated physiological feature;
    if the user's physiological feature is a pre-stored designated physiological feature, the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed;
    if the user's physiological feature is not a pre-stored designated physiological feature, the instruction determination module ignores the user action feature to be executed and does not perform the control instruction lookup.
  11. The UAV interactive photographing system according to claim 1, characterized in that the drone control instructions include at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction and a drone power-off instruction; the camera assembly control instructions include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction and a shooting stop instruction.
  12. The UAV interactive photographing system according to claim 1, characterized in that the control instructions further comprise:
    a first mode selection instruction, instructing the control assembly to enter a first mode, in which the instruction determination module looks up the corresponding drone control instruction in the control instruction library according to the user action feature and controls the drone according to the drone control instruction found;
    a second mode selection instruction, instructing the control assembly to enter a second mode, in which the instruction determination module looks up the corresponding camera assembly control instruction in the control instruction library according to the user action feature and controls the camera assembly according to the camera assembly control instruction found.
  13. The UAV interactive photographing system according to claim 1, characterized in that the control instructions further comprise:
    a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone to move continuously at a preset speed within an angle range of (0, α), where α is the preset maximum panoramic shooting angle.
  14. The UAV interactive photographing system according to claim 13, characterized in that, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
    the camera assembly detects the user's position;
    the drone, taking the user's position as the starting point, rotates by α/n to one side in the same horizontal plane, where n is a first preset segmentation value and n > 1;
    the camera assembly starts shooting, and the drone rotates by α toward the other side at a uniform preset speed in the same horizontal plane;
    after the drone stops rotating, the camera assembly stops shooting.
  15. The UAV interactive photographing system according to claim 13, characterized in that, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects a positioning point between the camera assembly and the user and, taking the positioning point as the center, generates a first sector with a radius of L/m and a central angle of α, such that the object to be photographed lies on the arc of the first sector, where m is a second preset segmentation value and m > 1;
    the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being respectively the reverse extensions of the two sides of the first sector, the second sector having a radius of (m-1)L/m and a central angle of α;
    the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of that arc to its other end;
    after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
  16. The UAV interactive photographing system according to claim 13, characterized in that, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects a positioning point between the camera assembly and the user and, taking the positioning point as the apex and L/m as the length of each leg, generates a first isosceles triangle with an apex angle of α, such that the object to be photographed lies on the base of the first isosceles triangle, where m is a second preset segmentation value and m > 1;
    the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being respectively the reverse extensions of the two legs of the first isosceles triangle, the second isosceles triangle having legs of length (m-1)L/m and an apex angle of α;
    the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to its other end;
    after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
  17. The UAV interactive photographing system according to claim 1, characterized in that the control instructions further comprise:
    a third mode selection instruction, instructing the control assembly to enter a third mode, in which the instruction execution module controls the camera assembly to shoot after a preset waiting time.
  18. The UAV interactive photographing system according to claim 1, characterized in that the control instructions further comprise:
    a fourth mode selection instruction, instructing the control assembly to enter a fourth mode, in which the instruction execution module detects the user's position through the camera assembly and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user.
  19. The UAV interactive photographing system according to claim 18, characterized in that, in the fourth mode, the instruction execution module acquires the acceleration of the user's position change, and when the acceleration of the user's position change exceeds a preset acceleration threshold, the instruction execution module issues an external alarm signal.
  20. The UAV interactive photographing system according to claim 1, characterized in that at least one distance sensor is further provided on the drone, and the control assembly further comprises an obstacle calculation module configured to acquire the obstacle detection data of the distance sensor;
    when the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the movement direction of the drone movement instruction is less than a preset safety threshold, the drone movement instruction is cancelled and a limit reminder signal is issued externally.
  21. A UAV interactive photographing method, characterized in that the UAV interactive photographing system according to any one of claims 1 to 20 is used, the method comprising the following steps:
    the camera assembly acquires a captured image;
    the image processing module processes the image captured by the camera assembly to obtain the user action feature to be executed in the captured image;
    the instruction determination module looks up, in the control instruction library, the control instruction corresponding to the user action feature to be executed; and
    the instruction execution module controls the drone and/or the camera assembly according to the control instruction found by the lookup.
  22. The UAV interactive photographing method according to claim 21, characterized in that the control instructions further comprise a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to take a panoramic photograph through the following steps:
    the camera assembly detects the user's position;
    the drone, taking the user's position as the starting point, rotates by α/n to one side in the same horizontal plane, where n is a first preset segmentation value, n > 1, and α is the preset maximum panoramic shooting angle;
    the camera assembly starts shooting, and the drone rotates by α toward the other side at a uniform preset speed in the same horizontal plane;
    after the drone stops rotating, the camera assembly stops shooting.
  23. The UAV interactive photographing method according to claim 21, characterized in that the control instructions further comprise a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects a positioning point between the camera assembly and the user and, taking the positioning point as the center, generates a first sector with a radius of L/m and a central angle of α, such that the object to be photographed lies on the arc of the first sector, where m is a second preset segmentation value, m > 1, and α is the preset maximum panoramic shooting angle;
    the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being respectively the reverse extensions of the two sides of the first sector, the second sector having a radius of (m-1)L/m and a central angle of α;
    the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of that arc to its other end;
    after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
  24. The UAV interactive photographing method according to claim 21, characterized in that the control instructions further comprise a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to take a panoramic photograph in the following manner:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects a positioning point between the camera assembly and the user and, taking the positioning point as the apex and L/m as the length of each leg, generates a first isosceles triangle with an apex angle of α, such that the object to be photographed lies on the base of the first isosceles triangle, where m is a second preset segmentation value, m > 1, and α is the preset maximum panoramic shooting angle;
    the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being respectively the reverse extensions of the two legs of the first isosceles triangle, the second isosceles triangle having legs of length (m-1)L/m and an apex angle of α;
    the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to its other end;
    after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
PCT/CN2017/080738 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle WO2018191840A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201780000407.6A CN109121434B (en) 2017-04-17 2017-04-17 Unmanned aerial vehicle interactive shooting system and method
PCT/CN2017/080738 WO2018191840A1 (en) 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle
TW107111546A TWI696122B (en) 2017-04-17 2018-04-02 Interactive photographic system and method for unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/080738 WO2018191840A1 (en) 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2018191840A1 true WO2018191840A1 (en) 2018-10-25

Family

ID=63855487

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/080738 WO2018191840A1 (en) 2017-04-17 2017-04-17 Interactive photographing system and method for unmanned aerial vehicle

Country Status (3)

Country Link
CN (1) CN109121434B (en)
TW (1) TWI696122B (en)
WO (1) WO2018191840A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112019744A (en) * 2020-08-27 2020-12-01 新石器慧义知行智驰(北京)科技有限公司 Photographing method, device, equipment and medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI768630B (en) * 2020-12-29 2022-06-21 財團法人工業技術研究院 Movable photographing system and photography composition control method
US11445121B2 (en) 2020-12-29 2022-09-13 Industrial Technology Research Institute Movable photographing system and photography composition control method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (en) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 Unmanned aerial vehicle capable of indentifying gesture and identifying method thereof
CN105138126A (en) * 2015-08-26 2015-12-09 小米科技有限责任公司 Unmanned aerial vehicle shooting control method and device and electronic device
WO2015200209A1 (en) * 2014-06-23 2015-12-30 Nixie Labs, Inc. Wearable unmanned aerial vehicles, launch- controlled unmanned aerial vehicles, and associated systems and methods
CN105391939A (en) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 Unmanned aerial vehicle shooting control method, device, unmanned aerial vehicle shooting method and unmanned aerial vehicle
CN105607740A (en) * 2015-12-29 2016-05-25 清华大学深圳研究生院 Unmanned aerial vehicle control method and device based on computer vision
CN105847684A (en) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 Unmanned aerial vehicle
JP2016225872A (en) * 2015-06-01 2016-12-28 日本電信電話株式会社 Travel apparatus operation terminal, travel apparatus operation method and travel apparatus operation program

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678506B2 (en) * 2014-06-19 2017-06-13 Skydio, Inc. Magic wand interface and other user interaction paradigms for a flying digital assistant
CN105338238B (en) * 2014-08-08 2019-04-23 联想(北京)有限公司 A kind of photographic method and electronic equipment
US9471059B1 (en) * 2015-02-17 2016-10-18 Amazon Technologies, Inc. Unmanned aerial vehicle assistant
CN108334109B (en) * 2015-03-30 2021-02-12 绵阳硅基智能科技有限公司 Voice control device
CN106155080B (en) * 2015-07-28 2020-04-10 英华达(上海)科技有限公司 Unmanned plane
CN105677300A (en) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 Gesture identification based unmanned aerial vehicle control method and system as well as unmanned aerial vehicle
CN106227231A (en) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 The control method of unmanned plane, body feeling interaction device and unmanned plane
CN106200679B (en) * 2016-09-21 2019-01-29 中国人民解放军国防科学技术大学 Single operation person's multiple no-manned plane mixing Active Control Method based on multi-modal natural interaction
CN106444843B (en) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 Unmanned plane relative bearing control method and device


Also Published As

Publication number Publication date
CN109121434A (en) 2019-01-01
TW201839663A (en) 2018-11-01
TWI696122B (en) 2020-06-11
CN109121434B (en) 2021-07-27


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906581

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906581

Country of ref document: EP

Kind code of ref document: A1