WO2022094860A1 - Method and device for controlling an unmanned aerial vehicle, unmanned aerial vehicle, and computer-readable storage medium - Google Patents


Info

Publication number
WO2022094860A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
target
flight trajectory
mode
processor
Application number
PCT/CN2020/126757
Other languages
English (en)
Chinese (zh)
Inventor
李博文
唐梓清
Original Assignee
深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2020/126757 priority Critical patent/WO2022094860A1/fr
Priority to CN202080071047.0A priority patent/CN114585985A/zh
Publication of WO2022094860A1 publication Critical patent/WO2022094860A1/fr

Classifications

    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05D - SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 - Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 - Simultaneous control of position or course in three dimensions
    • G05D1/101 - Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104 - Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircraft, e.g. formation flying

Definitions

  • The present application relates to the technical field of unmanned aerial vehicles, and in particular to a method and device for controlling an unmanned aerial vehicle, an unmanned aerial vehicle, and a computer-readable storage medium.
  • The drone can be equipped with a photographing device (camera), through which video can be shot from the drone's perspective.
  • To use a drone to shoot video, the user first needs to connect the drone to a control device. After the connection succeeds, the user can fly the drone through the control device, and must also control the shooting angle of the photographing device during the flight. It can be seen that shooting video with a drone requires the user to perform various operations, such as connecting and operating control equipment, which is inconvenient for the user.
  • The embodiments of the present application provide a method and device for controlling an unmanned aerial vehicle, an unmanned aerial vehicle, and a computer-readable storage medium. One purpose is to solve the technical problem that troublesome steps, such as connecting a control device, must be performed when using an unmanned aerial vehicle to shoot video.
  • A first aspect of the embodiments of the present application provides a method for controlling an unmanned aerial vehicle on which a photographing device is mounted, the method including:
  • in a preset take-off mode, when the image captured by the photographing device includes a human face, controlling the UAV to start the power system so that the UAV takes off;
  • after the UAV is controlled to take off, controlling the UAV to fly along the target flight trajectory, and controlling the photographing device to shoot during the flight with the subject corresponding to the face as the main subject.
  • A second aspect of the embodiments of the present application provides a control device for an unmanned aerial vehicle on which a photographing device is mounted, the control device including a processor and a memory storing a computer program, the processor implementing the following steps when executing the computer program:
  • in a preset take-off mode, when the image captured by the photographing device includes a human face, controlling the UAV to start the power system so that the UAV takes off;
  • after the UAV is controlled to take off, controlling the UAV to fly along the target flight trajectory, and controlling the photographing device to shoot during the flight with the subject corresponding to the face as the main subject.
  • A third aspect of the embodiments of the present application provides an unmanned aerial vehicle, including:
  • a body provided with a physical button;
  • a drive device connected to the body for powering the UAV;
  • a gimbal connected to the body;
  • a processor and a memory storing a computer program, the processor implementing the following steps when executing the computer program:
  • in a preset take-off mode, when the image captured by the photographing device includes a human face, controlling the UAV to start the power system so that the UAV takes off;
  • after the UAV is controlled to take off, controlling the UAV to fly along the target flight trajectory, and controlling the photographing device to shoot during the flight with the subject corresponding to the face as the main subject.
  • A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the drone control method provided by the embodiments of the present application.
  • By operating the physical button, the UAV can directly enter the preset take-off mode.
  • In this take-off mode, if the UAV captures a human face, it can automatically take off and fly along the target flight trajectory, and can automatically shoot during the flight with the subject corresponding to the captured face as the main subject.
  • The method provided by the embodiments of the present application therefore greatly simplifies the operations needed to shoot video with a drone: the user does not need to connect the drone to a control device, and only needs to operate the physical button on the drone to trigger automatic flight and automatic shooting. The whole process is fast and simple, which can greatly increase users' enthusiasm for shooting videos with drones.
  • FIG. 1 is a schematic diagram of a scene in which a user uses a drone to take a selfie according to an embodiment of the present application.
  • FIG. 2 is a flowchart of a method for controlling an unmanned aerial vehicle provided by an embodiment of the present application.
  • FIG. 3 is a schematic diagram of a scene in which a drone is used for face recognition according to an embodiment of the present application.
  • FIG. 4 is a schematic diagram of a scene in which a drone recognizes a user's forward push gesture provided by an embodiment of the present application.
  • FIG. 5 is a schematic diagram of a scene where a drone is landed on a palm according to an embodiment of the present application.
  • FIG. 6 is a schematic structural diagram of a drone control device provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an unmanned aerial vehicle provided by an embodiment of the present application.
  • Drones are usually equipped with a photographing device, which may also be called a camera. Due to the unique shooting angle and wide field of view of drones, many users like to take selfies with drones, recording themselves and their surroundings in the video.
  • FIG. 1 is a schematic diagram of a scene in which a user uses a drone to take a selfie according to an embodiment of the present application.
  • In the scene shown in FIG. 1, the user controls the drone through a control device and needs to keep the camera on the drone facing him during the flight to ensure that he stays in the shot.
  • After shooting, the user needs the drone to transmit the original video to the control device, so that the original video can be edited on the control device to generate a shareable video or short film.
  • The control device may be a remote controller equipped with a display screen, a combination of a remote controller and a terminal device (such as a mobile phone, tablet, or computer), or a combination of a remote controller and flight glasses.
  • FIG. 2 is a flowchart of a method for controlling an unmanned aerial vehicle provided by an embodiment of the present application. The method may include the following steps:
  • The shooting object is photographed, and a video is recorded or pictures are stored.
  • The drone can be provided with a physical button, through which the user can make the drone enter a preset take-off mode.
  • the preset take-off mode may also be called a paper airplane mode in one example.
  • physical buttons are more convenient to operate and more reliable to trigger.
  • the physical button may be a push button, and the user can make the drone enter a preset takeoff mode by pressing the physical button.
  • the physical button may be a toggle button, and the user can make the drone enter a preset takeoff mode by toggling the physical button.
  • the physical button may be a touch-sensitive button, and the user can make the drone enter a preset take-off mode by touching the physical button.
  • the drone can directly enter the preset take-off mode.
  • the so-called direct entry means that the drone does not need to establish a connection with the control device before entering the preset take-off mode, and establishing a connection with the control device is an option rather than a necessity. Therefore, in one example, the user does not need to use a mobile phone, flight glasses, remote control, etc. during the entire shooting process, and only needs to operate the physical buttons on the drone.
  • the drone's camera unit can start capturing images.
  • the UAV may include a gimbal, and the photographing device may be mounted on the gimbal of the UAV.
  • The gimbal of the UAV can be controlled to move to a target attitude, so that the photographing device mounted on the gimbal can shoot at a target angle.
  • FIG. 3 is a schematic diagram of a scene of using a drone for face recognition according to an embodiment of the present application. As shown in FIG. 3, the drone can be a small drone, and the user can hold it in the palm of the hand.
  • The photographing device of the drone can be driven by the gimbal to face straight ahead; the user can then raise the drone so that the camera is aimed at his face, and face recognition can begin.
  • Alternatively, the gimbal can be controlled so that the photographing device shoots obliquely upward from below; the user then does not need to lift the drone, since the user's face is captured while the drone shoots upward.
  • After the face is successfully recognized, the drone can take off automatically and fly along the target flight trajectory. During the flight, the photographing device can keep pointing at the subject corresponding to the recognized face (the user), so that the main subject of the captured video is the user himself, yielding the drone selfie video the user wants.
  • The target flight trajectory can be associated with the preset take-off mode.
  • the relationship between the target flight trajectory and the take-off mode may be preset by the user.
  • For example, the user can select one of various flight trajectories as the target flight trajectory in advance through an application installed on the control device, so that after the drone enters the take-off mode and completes face recognition, it can automatically fly along the target flight trajectory.
  • the target flight trajectory may also be determined according to the user's gesture after the drone takes off.
  • The correspondence between different gestures and flight trajectories can be configured in the drone in advance. After the drone recognizes the face, it can automatically take off and hover in front of the subject. The user can then make the gesture corresponding to the desired flight trajectory; the drone's photographing device captures an image or image sequence containing the gesture, the gesture is recognized from the image, and the flight trajectory corresponding to the recognized gesture is determined as the target flight trajectory.
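The gesture-to-trajectory selection described above amounts to a preconfigured lookup. The following Python sketch is illustrative only and not part of the disclosure; the gesture labels, trajectory names, and the hover fallback are assumptions.

```python
# Hypothetical mapping from recognized gestures to preset flight trajectories.
# Labels are illustrative, not taken from the patent text.
GESTURE_TO_TRAJECTORY = {
    "palm_up": "sky_rocketing",
    "finger_circle": "circling",
    "finger_tornado": "spiral",
    "palm_push_forward": "receding",
}

def select_trajectory(recognized_gesture, default="hover"):
    """Map a recognized user gesture to a flight trajectory.

    Unknown gestures fall back to hovering, mirroring the drone
    waiting in front of the subject until a valid gesture is made.
    """
    return GESTURE_TO_TRAJECTORY.get(recognized_gesture, default)
```

For example, `select_trajectory("finger_circle")` returns `"circling"`, while an unrecognized gesture leaves the drone hovering.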
  • the target flight trajectory can be any of a variety of flight trajectories.
  • The flight trajectories may include a gradually receding flight trajectory, a sky-rocketing flight trajectory, a circling flight trajectory, a spiral flight trajectory, and the like.
  • When the drone flies along the gradually receding flight trajectory, it gradually moves away from the subject in a diagonally upward direction, and the field of view in the captured picture gradually expands.
  • When the UAV flies along the sky-rocketing flight trajectory, it rises vertically at a relatively fast speed.
  • the drone can fly around the subject without changing the flying height.
  • When the drone flies along the spiral flight trajectory, it gradually increases its flying height while flying around the subject, resulting in a spiral rise effect.
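As an illustration of the spiral trajectory just described, the following Python sketch generates waypoints that circle the subject at a fixed radius while the height increases linearly. The function name, parameters, and values are assumptions for demonstration, not from the disclosure.

```python
import math

def spiral_waypoints(center, radius, start_height, climb_per_rev,
                     turns, points_per_turn=36):
    """Generate (x, y, z) waypoints for a spiral flight trajectory:
    circle the subject at a fixed radius while climbing linearly,
    producing the 'spiral rise' effect described in the text."""
    cx, cy = center
    total = turns * points_per_turn
    waypoints = []
    for i in range(total + 1):
        angle = 2 * math.pi * i / points_per_turn
        z = start_height + climb_per_rev * i / points_per_turn
        waypoints.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle),
                          z))
    return waypoints
```

With 2 turns climbing 2 m per revolution from 1 m, the final waypoint sits at 5 m altitude, directly above the starting point of the circle.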
  • Since the target flight trajectory can be determined according to the user's gesture, different gestures can be preset for different flight trajectories.
  • the gesture may be a dynamic gesture.
  • the gesture corresponding to the sky-rocketing flight trajectory may be a palm raised upward;
  • the gesture corresponding to the circling flight trajectory may be a circle drawn with a finger;
  • the gesture corresponding to the spiral flight trajectory may be a finger pointing upward and drawing a tornado shape;
  • the gesture corresponding to the gradually receding flight trajectory may be a forward push of the palm.
  • That is, the user controls the drone to fly along the gradually receding flight trajectory by making a forward push gesture.
  • The target flight speed of the drone when flying along the target flight trajectory can also be determined according to the movement speed of the user's gesture.
  • The movement speed of the user's gesture can be calculated from the image sequence containing the gesture captured by the photographing device. There are various specific calculation methods: for example, the speed can be calculated from the optical-flow information between frames, from the shutter speed of the photographing device together with the motion blur in the image, or, for some specific gestures, from the change in depth information.
  • The user can make a forward push gesture in front of the camera.
  • When making the forward push gesture, if the user wants the drone to fly faster, he can push the palm forward at a faster speed; if he wants the drone to fly slower, he can push the palm forward more slowly.
  • The drone can obtain the depth information of the user's palm through a depth sensor, determine the forward speed of the push gesture according to the change in that depth information, and then determine, according to the forward speed, the target flight speed of the UAV when flying along the gradually receding flight trajectory.
  • the depth sensor can be implemented in multiple ways, such as a binocular vision sensor, a TOF sensor, and so on.
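The depth-based speed estimation described above can be sketched as follows. This is an illustrative Python sketch under stated assumptions; the sampling interval, gain, and speed limits are not values from the disclosure.

```python
def push_forward_speed(depth_samples, dt):
    """Estimate the forward speed of a palm-push gesture (m/s) from
    successive palm depth readings (metres) taken dt seconds apart.
    A shrinking depth means the palm is moving toward the camera."""
    speeds = [(depth_samples[i] - depth_samples[i + 1]) / dt
              for i in range(len(depth_samples) - 1)]
    return sum(speeds) / len(speeds)

def target_flight_speed(push_speed, gain=2.0, v_min=0.5, v_max=8.0):
    """Scale the gesture speed into a flight speed for the gradually
    receding trajectory, clamped to safe limits (gain and limits are
    illustrative assumptions)."""
    return max(v_min, min(v_max, gain * push_speed))
```

A palm closing from 1.0 m to 0.8 m over two 0.1 s intervals gives a push speed of 1 m/s, which the illustrative gain maps to a 2 m/s flight speed.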
  • the drone can be provided with multiple shooting devices corresponding to different orientations.
  • For example, the drone can be equipped with photographing devices facing any of the front, rear, left, right, upward, and downward directions. Scene images in different directions can be obtained through the different photographing devices, the current scene type can be determined according to the scene images, and a target flight trajectory suited to the scene type can then be determined.
  • For example, if shooting in an open outdoor area, it can be determined from the scene images captured by the multiple photographing devices that the target flight trajectory suited to the current scene is the sky-rocketing flight trajectory; if shooting indoors, it can be determined that the target flight trajectory suited to the current scene is the circling flight trajectory.
  • By operating the physical button, the UAV can directly enter the preset take-off mode.
  • In this take-off mode, if the UAV captures a human face, it can automatically take off and fly along the target flight trajectory, and can automatically shoot during the flight with the subject corresponding to the captured face as the main subject.
  • The method provided by the embodiments of the present application therefore greatly simplifies the operations needed to shoot video with a drone: the user does not need to connect the drone to a control device, and only needs to operate the physical button on the drone to trigger automatic flight and automatic shooting. The whole process is fast and simple, which can greatly increase users' enthusiasm for shooting videos with drones.
  • After the drone completes the shooting while flying along the target flight trajectory, it can automatically generate a film from the captured video, thereby realizing automatic editing.
  • A film template can be pre-stored in the drone and managed through the application on the control device. The template can include music material, video special effects, transition effects, text material, picture material, and other content. By combining the shot video with the template, a watchable film corresponding to the shot video can be generated quickly, which is convenient for users to share.
  • the drone can also be automatically controlled to return home after the shooting is completed.
  • the drone can be a small drone, and the drone can take off from the palm of the user, or land on the palm of the user to complete the return flight.
  • the drone can be controlled to return to the target position.
  • the target position can be a specified height above the take-off position, for example, it can be 0.5m above the take-off position.
  • the take-off position can be the position recorded by the drone when it took off. After the drone returns to the target position, the landing can be started.
  • FIG. 5 is a schematic diagram of a scene where a drone lands on a palm according to an embodiment of the present application.
  • palm detection can be performed on the area below the drone, and when the target palm is detected, the drone can be controlled to land on the target palm.
  • Palm detection can be realized through image recognition technology. Specifically, a camera can be mounted under the drone; images are taken by this downward-facing camera, palm recognition is performed on the captured images, and the target palm can thereby be detected.
  • The user may raise the palm while the drone is landing, hoping to catch the drone sooner, while the blades of the drone are still rotating at high speed.
  • As the distance between the drone and the user's palm shrinks rapidly, the risk of the user being cut by the blades increases. Therefore, for safety, in one embodiment, the distance between the drone and the target palm can be obtained through a distance sensor.
  • When the shortening speed of that distance exceeds the drone's landing speed by more than a preset threshold, it can be determined that the user is raising the palm.
  • In that case, the blade speed can be increased to slow the shortening of the distance between the drone and the target palm.
  • The increase in blade speed can also, to a certain extent, remind the user to slow down: haste makes waste. Of course, if the distance between the drone and the target palm is zero or close to zero, the blade motors can be braked quickly so that the drone stops promptly.
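The palm-landing logic described above (ease off when the palm is being raised, brake the motors once the palm is reached) can be summarized in a small decision function. This is an illustrative sketch; the threshold and brake-distance values are assumptions, not figures from the disclosure.

```python
def landing_command(closing_rate, descent_speed, palm_distance,
                    raise_margin=0.2, brake_distance=0.05):
    """Decide the landing action from the measured closing rate between
    drone and target palm (m/s, positive = gap shrinking), the drone's
    own descent speed (m/s), and the remaining distance (m).

    If the gap is closing noticeably faster than the drone is
    descending, the user must be raising the palm, so slow down.
    """
    if palm_distance <= brake_distance:
        return "brake_motors"      # palm reached: stop the blades quickly
    if closing_rate - descent_speed > raise_margin:
        return "slow_descent"      # user is raising the palm: ease off
    return "continue_descent"
```

For instance, a 1.0 m/s closing rate against a 0.5 m/s descent exceeds the illustrative margin, so the drone slows its descent.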
  • ambient lights may be provided on the arms and/or fuselage of the drone.
  • The ambient light can have different lighting modes and colors. The lighting modes can include breathing, constant light, rainbow, throttle (the brightness of the light is linked to how far the drone's throttle stick is pushed), fast flash, slow flash, flowing water (multiple ambient lights light up in a certain order), and other modes. The colors can include white, red, orange, yellow, green, rainbow (gradual transitions between multiple colors), and other colors.
  • The user can set the lighting color or lighting mode of the ambient light on the drone through the control device.
  • The control device can establish communication with the drone, and the control device can include a terminal device or flight glasses.
  • The user can set the lighting color and/or lighting mode of the ambient light through an application installed on the terminal device or the flight glasses. The terminal device or flight glasses then send the color information corresponding to the selected color and/or the mode information corresponding to the selected mode to the drone, and according to that information the drone controls the ambient light to switch to the corresponding color and/or mode.
  • The drone can perform face recognition on the captured images after entering the preset take-off mode, but face recognition takes a certain amount of time and may not succeed in some cases, for example if the user is shaking or the environment is too dark. Therefore, in one embodiment, the progress of face recognition may be indicated to the user through the ambient light.
  • Before recognition is completed, the ambient light can be controlled to emit light in a first color and/or first mode, for example orange and/or continuously flashing; after recognition is completed, the ambient light can be controlled to emit light in a second color and/or second mode, for example green and/or always on.
  • During palm take-off, the user can also be prompted through the ambient light. The blades of the drone need to reach a certain rotational speed (which can be called the target rotational speed) before the drone can take off, so the user needs to keep holding the drone up until the blade speed reaches the target. Therefore, in one embodiment, the state of the blade speed can be indicated through the ambient light: before the blade speed reaches the target rotational speed, the ambient light can be controlled to emit light in a third color or third mode, such as flashing red.
  • After the blade speed reaches the target rotational speed, the ambient light can be controlled to emit light in a fourth color or fourth mode, such as steady green. This change in the ambient light helps the user determine when the drone no longer needs to be held, preventing the drone from falling due to insufficient rotational speed.
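The blade spin-up prompt described above amounts to a two-state indicator. A minimal sketch, assuming illustrative colors and an arbitrary target RPM figure:

```python
def spinup_light(blade_rpm, target_rpm=5000):
    """Ambient-light prompt during hand take-off: flash red until the
    blades reach take-off speed, then hold green so the user knows the
    drone can be released. Colors and the RPM figure are illustrative."""
    if blade_rpm < target_rpm:
        return ("red", "flashing")
    return ("green", "steady")
```

The same pattern extends naturally to the other prompts in the text (face-recognition progress, levelness), each being a small state-to-light mapping.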
  • An attitude sensor may be configured in the UAV; the current attitude of the UAV can be obtained through the attitude sensor to determine whether the UAV is being held level.
  • If the UAV is not level, the ambient light can be controlled to emit light in a fifth color or fifth mode; once the UAV is level, the ambient light can be controlled to emit light in a sixth color or sixth mode. This change in the ambient light can assist the user in adjusting the palm so that the drone can take off safely.
  • the breathing frequency of the ambient light can be determined according to the remaining power of the drone. In one example, if the remaining power of the drone is lower, the breathing rate of the ambient light can be higher, and at the same time, the brightness of the ambient light can be reduced, so that an obvious signal of insufficient power can be sent to the user.
  • the ambient light has various optional lighting colors and optional lighting modes, which makes the drone itself have a certain ornamental value.
  • the ambient light of the drone can work in a breathing mode, and the breathing frequency of the ambient light can match the rhythm of the music, bringing a sense of visual and auditory unity to the user.
  • the drone can obtain the rhythm information of the target music, so that the breathing frequency of the ambient light can be controlled according to the rhythm information, so that the moment when the ambient light is on matches the rhythm point of the target music.
  • The rhythm information of the target music can be obtained by the drone from the control device.
  • The target music can be the music currently playing on the terminal device, or music specified by the user through the application on the terminal device.
  • The rhythm information of the target music can also be obtained by the drone by analyzing the audio signal collected by its own microphone.
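Matching the breathing frequency of the ambient light to the music tempo, as described above, can be sketched as follows. The sinusoidal brightness curve and the beats-per-breath parameter are illustrative assumptions, not details from the disclosure.

```python
import math

def breathing_period(bpm, beats_per_breath=1):
    """Derive the ambient light's breathing period (seconds) from the
    music tempo so each brightness peak lands on a beat."""
    return 60.0 / bpm * beats_per_breath

def brightness(t, bpm, max_brightness=1.0):
    """Sinusoidal 'breathing' brightness at time t (seconds), peaking
    exactly on the beat (t = 0 is aligned with a rhythm point)."""
    period = breathing_period(bpm)
    return max_brightness * (math.cos(2 * math.pi * t / period) + 1) / 2
```

At 120 BPM the breathing period is 0.5 s: brightness peaks on each beat and dips to zero halfway between beats, giving the "visual and auditory unity" the text describes.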
  • In the solution provided by the embodiments of the present application, the user only needs to operate the physical button on the UAV; the UAV can then automatically recognize the face, automatically take off and fly, automatically shoot video of the subject corresponding to the face, and automatically edit the captured video, which greatly simplifies the process of shooting video with a drone.
  • The whole process is fast and simple, which can greatly increase users' enthusiasm for shooting videos with drones.
  • In addition, the drone can be equipped with ambient lights, so that the user can personalize the appearance of the drone. The ambient lights can also prompt the user during face recognition, palm take-off, and other stages, enriching the interaction between the user and the drone.
  • FIG. 6 is a schematic structural diagram of a drone control device provided by an embodiment of the present application.
  • the drone control device provided by the embodiment of the present application includes: a processor 610 and a memory 620 storing a computer program, and the processor implements the following steps when executing the computer program:
  • in a preset take-off mode, when the image captured by the photographing device includes a human face, controlling the UAV to start the power system so that the UAV takes off;
  • after the UAV is controlled to take off, controlling the UAV to fly along the target flight trajectory, and controlling the photographing device to shoot during the flight with the subject corresponding to the face as the main subject.
  • the target flight trajectory is associated with the preset takeoff mode.
  • the association relationship between the target flight trajectory and the preset take-off mode is preset by a user.
  • the processor is further configured to, after the shooting is completed, use the video obtained when the drone flies on the target flight trajectory to generate a movie, so as to realize automatic editing.
  • the processor is configured to edit the captured video according to a pre-stored template when editing the captured video.
  • the film-forming template includes one or more of the following: music material, video special effect, transition effect, text material, and picture material.
  • The processor is further configured to recognize a user gesture through the photographing device and determine the target flight trajectory of the drone according to the recognition result.
  • The processor is further configured to determine the movement speed of the user gesture according to the images containing the user gesture captured by the photographing device, and to determine, according to the movement speed of the user gesture, the target flight speed of the drone when flying along the target flight trajectory.
  • In one embodiment, the target flight trajectory includes a gradually receding flight trajectory, and the user gesture corresponding to the gradually receding flight trajectory includes a forward push gesture.
  • When determining, according to the movement speed of the user gesture, the target flight speed of the drone when flying along the target flight trajectory, the processor is configured to obtain the depth information of the user's palm, determine the forward speed of the forward push gesture according to the change in that depth information, and determine, according to the forward speed, the target flight speed of the UAV when flying along the gradually receding flight trajectory.
  • The target flight trajectory includes one or more of the following: a gradually receding flight trajectory, a sky-rocketing flight trajectory, a circling flight trajectory, and a spiral flight trajectory.
  • When flying along the gradually receding flight trajectory, the drone moves away from the photographed subject in a diagonally upward direction.
  • In one embodiment, the UAV includes a plurality of photographing devices arranged facing different directions, and the processor is further configured to, before controlling the UAV to fly along the target flight trajectory, obtain scene images through the plurality of photographing devices and determine the target flight trajectory according to the scene images.
  • the processor is configured to control the UAV to return to the target position when controlling the UAV to return.
  • the target position includes a specified height above the take-off position of the UAV.
  • The processor is further configured to, after the drone returns to the target position, perform palm detection on the area below the drone, and when the target palm is detected, control the drone to land on the target palm.
  • When controlling the drone to land on the target palm, the processor is configured to obtain the distance between the drone and the target palm, and to reduce the landing speed of the drone when the shortening speed of that distance is greater than the landing speed of the drone.
  • the drone further includes ambient lights.
  • the drone is connected to a control device, and the processor is further configured to, according to the color information and/or mode information sent by the control device, control the ambient light to display the color corresponding to the color information. emit light and/or emit light in a mode corresponding to the mode information.
  • The processor is further configured to, when performing face recognition on the image captured by the photographing device, control the ambient light to emit light in a first color or first mode before the recognition is completed, and in a second color or second mode after the recognition is completed.
  • The processor is further configured to, when controlling the drone to take off, control the ambient light to emit light in a third color or third mode before the rotational speed of the blades reaches the target rotational speed, and in a fourth color or fourth mode after the rotational speed of the blades reaches the target rotational speed.
  • the lighting mode of the ambient light includes a breathing mode.
  • the processor is further configured to determine the breathing frequency of the ambient light according to the remaining power of the drone.
  • the processor is further configured to acquire rhythm information of the target music; and adjust the breathing frequency of the ambient light according to the rhythm information.
  • the rhythm information of the target music is obtained from the control device of the drone.
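As a rough sketch of the breathing-mode behaviour described above, one might map the remaining battery power to a breathing frequency and let rhythm information from the target music override it. The frequency range and the one-breath-per-beat mapping are assumptions for illustration.

```python
from typing import Optional

def breathing_frequency_hz(battery_pct: float,
                           music_bpm: Optional[float] = None) -> float:
    """Pick an ambient-light breathing frequency.

    Assumed base rule: breathe faster as the battery drains, between
    0.25 Hz (full) and 1.0 Hz (empty). If rhythm information for the
    target music is available, sync one breathing cycle per beat instead.
    """
    if music_bpm is not None:
        return music_bpm / 60.0  # one breathing cycle per beat
    battery_pct = max(0.0, min(100.0, battery_pct))
    return 0.25 + (100.0 - battery_pct) / 100.0 * 0.75
```

A full battery then breathes slowly at 0.25 Hz, an empty one at 1 Hz, and 120 BPM music drives the light at 2 Hz regardless of battery level.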
  • after the user operates the physical button on the drone, the UAV can directly enter the preset take-off mode.
  • in this take-off mode, if the UAV captures a human face, it can automatically take off and fly along the target flight trajectory, and during the flight it can automatically shoot with the subject corresponding to the captured face as the main body.
  • the method provided by the embodiments of the present application greatly simplifies the operation of shooting video with a drone: the user does not need to connect the drone to a control device, but only needs to operate the physical button on the drone directly to trigger automatic flight and automatic shooting. The whole process is fast and simple, which can greatly increase users' enthusiasm for shooting videos with drones.
  • FIG. 7 is a schematic structural diagram of an unmanned aerial vehicle provided by an embodiment of the present application.
  • the unmanned aerial vehicle provided by the embodiment of the present application includes:
  • a body 710, which is provided with physical buttons;
  • a drive device 720 connected to the body, for providing power to the drone;
  • a pan/tilt 730 connected to the body;
  • in the take-off mode, when the image captured by the photographing device includes a human face, the UAV is controlled to start the power system so as to take off;
  • after the drone is controlled to take off, the drone is controlled to fly along the target flight trajectory, and the photographing device is controlled to shoot during the flight with the photographing object corresponding to the face as the main body.
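The one-button flow (physical button → preset take-off mode → face detected → power system started → flight along the target trajectory) could be sketched as a small state holder. All class, method, and attribute names here are hypothetical; the disclosure does not specify an API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class TakeoffModeController:
    """Illustrative sketch of the one-button takeoff flow."""
    in_takeoff_mode: bool = False
    airborne: bool = False
    log: List[str] = field(default_factory=list)

    def on_physical_button(self) -> None:
        # Pressing the button puts the drone directly into the preset
        # take-off mode; no control-device connection is required.
        self.in_takeoff_mode = True
        self.log.append("entered takeoff mode")

    def on_camera_frame(self, contains_face: bool) -> None:
        # In take-off mode, a detected face starts the power system and
        # triggers flight along the target trajectory, shooting the
        # subject that corresponds to the detected face.
        if self.in_takeoff_mode and contains_face and not self.airborne:
            self.airborne = True
            self.log.append("power system started")
            self.log.append("flying target trajectory, subject = detected face")
```

Frames containing a face are ignored until the button has been pressed, matching the order of operations described above.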
  • the target flight trajectory is associated with the preset takeoff mode.
  • the association relationship between the target flight trajectory and the preset take-off mode is preset by a user.
  • the processor is further configured to, after the shooting is completed, generate a movie by using the video obtained when the drone flies on the target flight trajectory, so as to realize automatic editing.
  • when editing the captured video, the processor is configured to edit it according to a pre-stored film-forming template.
  • the film-forming template includes one or more of the following: music material, video special effects, transition effects, text material, and picture material.
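A minimal sketch of template-based automatic editing, assuming a simple dictionary schema for the pre-stored film-forming template (the field names are invented for illustration):

```python
def apply_template(clips, template):
    """Assemble a film description from captured clips and a pre-stored
    film-forming template carrying music, effects, transitions, and text.
    The template schema here is an assumption for illustration."""
    timeline = []
    for i, clip in enumerate(clips):
        timeline.append({
            "clip": clip,
            "effect": template.get("video_effect"),
            # A transition follows every clip except the last one.
            "transition": template.get("transition") if i < len(clips) - 1 else None,
        })
    return {"music": template.get("music"),
            "title_text": template.get("text"),
            "timeline": timeline}
```

For example, applying a template with a music track and a fade transition to two clips yields a two-entry timeline with a transition after the first clip only.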
  • the processor is further configured to, before controlling the UAV to fly along the target flight trajectory, recognize the user's gesture through the photographing device, and determine the target flight trajectory of the UAV according to the recognition result.
  • the processor is further configured to determine the movement speed of the user gesture according to the image, captured by the photographing device, that includes the user gesture, and to determine, according to the movement speed of the user gesture, the target flight speed of the UAV when flying along the target flight trajectory.
  • the target flight trajectory includes a gradually-receding flight trajectory, and the user gesture corresponding to the gradually-receding flight trajectory includes a forward-push gesture;
  • when determining, according to the movement speed of the user gesture, the target flight speed of the drone when flying along the target flight trajectory, the processor is configured to obtain depth information of the user's palm; determine the forward speed of the forward-push gesture according to the change in the depth information of the user's palm; and determine, according to the forward speed, the target flight speed of the drone when flying along the gradually-receding flight trajectory.
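The palm-depth logic above can be sketched as follows: successive depth readings of the user's palm give the forward speed of the push gesture, which is then mapped to a target flight speed. The linear gain and the 4 m/s speed cap are assumptions, not values from this disclosure.

```python
def push_gesture_speed(depth_samples_m, dt_s, gain: float = 1.0,
                       max_speed_mps: float = 4.0) -> float:
    """Estimate the forward speed of a push gesture from successive palm
    depth readings and map it to a target flight speed for the
    gradually-receding trajectory."""
    if len(depth_samples_m) < 2:
        return 0.0
    # Depth increases as the palm is pushed forward, away from the camera.
    push_speed = ((depth_samples_m[-1] - depth_samples_m[0])
                  / (dt_s * (len(depth_samples_m) - 1)))
    # Clamp: never fly backward on a pull, never exceed the speed cap.
    return max(0.0, min(max_speed_mps, gain * push_speed))
```

A faster push therefore commands a faster recede, while a pull (decreasing depth) or a single sample yields zero speed.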
  • the target flight trajectory includes one or more of the following: a gradually-receding flight trajectory, a skyrocketing flight trajectory, a circling flight trajectory, and a spiral flight trajectory.
  • in the gradually-receding flight trajectory, the drone moves away from the photographed object in an obliquely upward direction.
  • the unmanned aerial vehicle includes a plurality of photographing devices arranged in different directions, and the processor is further configured to obtain information from the plurality of photographing devices before controlling the UAV to fly along the target flight trajectory.
  • when controlling the UAV to return, the processor is configured to control the UAV to return to the target position.
  • the target position includes a specified height above the take-off position of the UAV.
  • the processor is further configured to, after the drone returns to the target position, perform palm detection on the area below the drone, and, when the target palm is detected, control the drone to land on the target palm.
  • when controlling the drone to land on the target palm, the processor is configured to obtain the distance between the drone and the target palm, and, when the rate at which that distance shortens exceeds the landing speed of the drone, reduce the landing speed of the drone.
  • the drone further includes ambient lights.
  • the drone is connected to a control device, and the processor is further configured to, according to color information and/or mode information sent by the control device, control the ambient light to emit light in the color corresponding to the color information and/or in the mode corresponding to the mode information.
  • the processor is further configured to, when performing face recognition on the image captured by the photographing device, control the ambient light to emit light in a first color or a first mode before the recognition is completed, and in a second color or a second mode after the recognition is completed.
  • the processor is further configured to, when controlling the drone to take off, control the ambient light to emit light in a third color or a third mode before the rotational speed of the blades of the drone reaches the target rotational speed, and in a fourth color or a fourth mode after the rotational speed of the blades reaches the target rotational speed.
  • the lighting mode of the ambient light includes a breathing mode.
  • the processor is further configured to determine the breathing frequency of the ambient light according to the remaining power of the drone.
  • the processor is further configured to acquire rhythm information of the target music; and adjust the breathing frequency of the ambient light according to the rhythm information.
  • the rhythm information of the target music is obtained from the control device of the drone.
  • after the user operates the physical button on the drone, the drone can directly enter the preset take-off mode.
  • in this take-off mode, if the drone captures a human face, it can take off automatically and fly along the target flight trajectory, and during the flight it can automatically shoot with the subject corresponding to the captured face as the main body.
  • the method provided by the embodiments of the present application greatly simplifies the operation of shooting video with a drone: the user does not need to connect the drone to a control device, but only needs to operate the physical button on the drone directly to trigger automatic flight and automatic shooting. The whole process is fast and simple, which can greatly increase users' enthusiasm for shooting videos with drones.
  • Embodiments of the present application further provide a computer-readable storage medium in which a computer program is stored; when the computer program is executed by a processor, any of the UAV control methods provided by the embodiments of the present application is implemented.
  • Embodiments of the present application may take the form of a computer program product implemented on one or more storage media (including, but not limited to, disk storage, CD-ROM, optical storage, etc.) having program code embodied therein.
  • Computer-usable storage media include permanent and non-permanent, removable and non-removable media; information storage can be implemented by any method or technology.
  • Information may be computer readable instructions, data structures, modules of programs, or other data.
  • Examples of computer storage media include, but are not limited to, phase-change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile disc (DVD) or other optical storage, magnetic tape cassettes, magnetic tape or magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention relates to an unmanned aerial vehicle control method. A photographing device is mounted on an unmanned aerial vehicle. The method comprises: after the unmanned aerial vehicle is started, acquiring a user's operation on physical buttons of the unmanned aerial vehicle (202); according to the operation on the physical buttons, controlling the unmanned aerial vehicle to directly enter a preset take-off mode (204); in the take-off mode, when an image captured by the photographing device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to take off (206); and after the unmanned aerial vehicle is controlled to take off, controlling it to fly along a target flight trajectory and controlling the photographing device, during the flight, to shoot with the photographing object corresponding to the face as the subject (208). The method addresses the technical problems of cumbersome operations, such as having to connect a control device, when shooting video with an unmanned aerial vehicle.
PCT/CN2020/126757 2020-11-05 2020-11-05 Procédé et dispositif de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support de stockage lisible par ordinateur WO2022094860A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2020/126757 WO2022094860A1 (fr) 2020-11-05 2020-11-05 Procédé et dispositif de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support de stockage lisible par ordinateur
CN202080071047.0A CN114585985A (zh) 2020-11-05 2020-11-05 无人机控制方法、装置、无人机及计算机可读存储介质

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/126757 WO2022094860A1 (fr) 2020-11-05 2020-11-05 Procédé et dispositif de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support de stockage lisible par ordinateur

Publications (1)

Publication Number Publication Date
WO2022094860A1 true WO2022094860A1 (fr) 2022-05-12

Family

ID=81458416

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126757 WO2022094860A1 (fr) 2020-11-05 2020-11-05 Procédé et dispositif de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support de stockage lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN114585985A (fr)
WO (1) WO2022094860A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150266575A1 (en) * 2014-03-21 2015-09-24 Brandon Borko System for automatic takeoff and landing by interception of small uavs
CN108521812A (zh) * 2017-05-19 2018-09-11 深圳市大疆创新科技有限公司 无人机的控制方法、无人机以及机器可读存储介质
CN109074168A (zh) * 2018-01-23 2018-12-21 深圳市大疆创新科技有限公司 无人机的控制方法、设备和无人机
CN110300938A (zh) * 2016-12-21 2019-10-01 杭州零零科技有限公司 用于免控制器式用户无人机交互的系统和方法
US20200062388A1 (en) * 2018-08-22 2020-02-27 Ford Global Technologies, Llc Take off and landing system for drone for use with an autonomous vehicle


Also Published As

Publication number Publication date
CN114585985A (zh) 2022-06-03


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20960321

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20960321

Country of ref document: EP

Kind code of ref document: A1