CN114585985A - Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium - Google Patents


Info

Publication number
CN114585985A
CN114585985A
Authority
CN
China
Prior art keywords
unmanned aerial
aerial vehicle
drone
target
flight trajectory
Prior art date
Legal status
Pending
Application number
CN202080071047.0A
Other languages
Chinese (zh)
Inventor
李博文
唐梓清
Current Assignee
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN114585985A

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/104Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A method for controlling an unmanned aerial vehicle on which a shooting device is mounted, the method comprising the following steps: after the unmanned aerial vehicle is powered on, acquiring a user's operation (202) of a physical button on the unmanned aerial vehicle; controlling the unmanned aerial vehicle to directly enter a preset takeoff mode (204) according to the operation of the physical button; in the takeoff mode, when an image captured by the shooting device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to take off (206); and after the unmanned aerial vehicle takes off, controlling it to fly along a target flight trajectory while controlling the shooting device to shoot during flight with the photographic subject corresponding to the face as the main subject (208). The method solves the technical problem that shooting video with an unmanned aerial vehicle otherwise requires cumbersome steps such as connecting a control device.

Description

Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium
Technical Field
The present application relates to the technical field of unmanned aerial vehicles, and in particular to an unmanned aerial vehicle control method and device, an unmanned aerial vehicle, and a computer-readable storage medium.
Background
An unmanned aerial vehicle can carry a shooting device, through which video from the aerial vehicle's viewpoint can be captured. When shooting with an unmanned aerial vehicle, the user must first connect the unmanned aerial vehicle to a control device; only after the connection succeeds can the user control the flight through the control device, and the user must also control the shooting angle of the shooting device during flight. Shooting video with an unmanned aerial vehicle therefore requires various operations, such as connecting a control device, which is inconvenient for the user.
Disclosure of Invention
In view of this, embodiments of the present application provide an unmanned aerial vehicle control method and device, an unmanned aerial vehicle, and a computer-readable storage medium, one purpose of which is to solve the technical problem that shooting video with an unmanned aerial vehicle requires cumbersome steps, such as connecting a control device.
A first aspect of the embodiments of the present application provides an unmanned aerial vehicle control method, where a shooting device is mounted on the unmanned aerial vehicle, and the method includes:
after the unmanned aerial vehicle is powered on, acquiring a user's operation of a physical button on the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the physical button;
in the takeoff mode, when an image captured by the shooting device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off; and
after controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a target flight trajectory, and controlling the shooting device to shoot during flight with the photographic subject corresponding to the human face as the main subject.
A second aspect of the embodiments of the present application provides an unmanned aerial vehicle control device for an unmanned aerial vehicle equipped with a shooting device, the control device including a processor and a memory storing a computer program, where the processor, when executing the computer program, implements the following steps:
after the unmanned aerial vehicle is powered on, acquiring a user's operation of a physical button on the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the physical button;
in the takeoff mode, when an image captured by the shooting device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off; and
after controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a target flight trajectory, and controlling the shooting device to shoot during flight with the photographic subject corresponding to the human face as the main subject.
A third aspect of the embodiments of the present application provides an unmanned aerial vehicle, including:
a fuselage, on which a physical button is arranged;
a driving device connected to the fuselage and configured to provide power for the unmanned aerial vehicle;
a gimbal connected to the fuselage;
a shooting device mounted on the gimbal; and
a processor and a memory storing a computer program, where the processor, when executing the computer program, implements the following steps:
after the unmanned aerial vehicle is powered on, acquiring a user's operation of the physical button;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the physical button;
in the takeoff mode, when an image captured by the shooting device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off; and
after controlling the unmanned aerial vehicle to take off, controlling the unmanned aerial vehicle to fly along a target flight trajectory, and controlling the shooting device to shoot during flight with the photographic subject corresponding to the human face as the main subject.
A fourth aspect of the embodiments of the present application provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the unmanned aerial vehicle control method provided in the embodiments of the present application.
According to the unmanned aerial vehicle control method provided in the embodiments of the present application, after the user operates the physical button on the unmanned aerial vehicle, the unmanned aerial vehicle directly enters a preset takeoff mode. In this takeoff mode, once the unmanned aerial vehicle captures a human face, it takes off automatically and flies along a target flight trajectory, and during flight it automatically shoots with the photographic subject corresponding to the captured face as the main subject. The method therefore greatly simplifies the operations required of the user when shooting video with an unmanned aerial vehicle: the user does not need to connect the unmanned aerial vehicle to a control device, and only needs to operate the physical button on the unmanned aerial vehicle directly to trigger automatic flight and automatic shooting. The whole process is quick and simple, which can greatly increase the user's enthusiasm for shooting videos with an unmanned aerial vehicle.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings used in the description of the embodiments are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application, and those skilled in the art can obtain other drawings based on them without inventive effort.
Fig. 1 is a schematic view of a scene in which a user uses an unmanned aerial vehicle for self-shooting according to an embodiment of the present application.
Fig. 2 is a flowchart of an unmanned aerial vehicle control method provided in an embodiment of the present application.
Fig. 3 is a scene schematic diagram of face recognition using an unmanned aerial vehicle according to an embodiment of the present application.
Fig. 4 is a schematic view of a scene in which an unmanned aerial vehicle recognizes a user forward-pushing gesture provided in an embodiment of the present application.
Fig. 5 is a schematic view of a scene of handheld landing of an unmanned aerial vehicle provided in an embodiment of the present application.
Fig. 6 is a schematic structural diagram of an unmanned aerial vehicle control device according to an embodiment of the present application.
Fig. 7 is a schematic structural diagram of an unmanned aerial vehicle provided in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
As unmanned aerial vehicle technology gradually matures, unmanned aerial vehicles are entering people's daily lives. An unmanned aerial vehicle is generally equipped with a shooting device, which may also be called a camera. Because an unmanned aerial vehicle offers a unique shooting perspective and a wide field of view, many users like to take selfies with it, recording themselves and the scene they are in at the center of the video.
As shown in fig. 1, fig. 1 is a schematic view of a scene in which a user takes a selfie with an unmanned aerial vehicle according to an embodiment of the present application. When taking a selfie with an unmanned aerial vehicle, the user can control the flight through a control device, and during flight must keep the shooting device on the unmanned aerial vehicle pointed at himself or herself to ensure that he or she stays in the center of the captured picture. After the raw video is obtained, the user needs the unmanned aerial vehicle to transmit it to the control device so that the user can edit it there into a shareable film or short clip. Here, the control device may be a remote controller equipped with a display screen, a combination of a remote controller and a terminal device (for example a mobile phone, a tablet, or a computer), or a combination of a remote controller and flight glasses.
It can be seen that the user must do a great deal of work to shoot a single shareable clip: connect the control device to the unmanned aerial vehicle, control the flight through the control device, control the shooting angle of the shooting device, transmit the raw video, edit the raw video, and so on. This work consumes a great deal of the user's time and reduces shooting efficiency. Even if the user only wants a ten-second clip, all of the above work is still required, making shooting short clips with an unmanned aerial vehicle cumbersome and troublesome and greatly reducing the user's enthusiasm for it.
In order to solve the above problem, the embodiment of the application provides an unmanned aerial vehicle control method. Referring to fig. 2, fig. 2 is a flowchart of a method for controlling an unmanned aerial vehicle according to an embodiment of the present application. The method may comprise the steps of:
s202, after the unmanned aerial vehicle is started, obtaining the operation of a user on an entity button on the unmanned aerial vehicle.
And S204, controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the entity key.
S206, in the takeoff mode, when the image shot by the shooting device comprises a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off.
And S208, after the unmanned aerial vehicle is controlled to take off, the unmanned aerial vehicle is controlled to fly along a target flight track, and the shooting device is controlled to shoot by taking a shooting object corresponding to the human face as a main body in the flying process.
Specifically, while the unmanned aerial vehicle is controlled to fly along the target flight trajectory, the photographic subject is shot and a video or pictures are recorded.
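As an illustration only, and not as part of the claimed disclosure, the flow of steps S202 to S208 can be sketched as a small state machine. All names below (State, control_step, and so on) are hypothetical:

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()          # powered on, waiting for the physical button
    TAKEOFF_MODE = auto()  # preset takeoff mode, watching for a face
    FLYING = auto()        # airborne, flying the target trajectory

def control_step(state, button_pressed, face_in_frame):
    """One iteration of the simplified state machine."""
    if state == State.IDLE and button_pressed:
        # S204: the physical button puts the drone directly into the
        # preset takeoff mode -- no control device connection needed.
        return State.TAKEOFF_MODE
    if state == State.TAKEOFF_MODE and face_in_frame:
        # S206: a face in the captured image starts the power system
        # and triggers automatic takeoff.
        return State.FLYING
    return state

state = State.IDLE
state = control_step(state, button_pressed=True, face_in_frame=False)
state = control_step(state, button_pressed=False, face_in_frame=True)
# state is now State.FLYING; step S208 would fly the target trajectory
# while keeping the recognized subject framed as the main subject.
```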
A physical button may be arranged on the unmanned aerial vehicle, through which the user can make the unmanned aerial vehicle enter the preset takeoff mode. In one example, the preset takeoff mode may also be called a paper-airplane mode. Compared with triggering via a virtual key or software, a physical button is more convenient to operate and more reliable to trigger.
The user may operate the physical button in a variety of ways. In one embodiment, the physical button may be a push button, which the user presses to make the drone enter the preset takeoff mode. In another embodiment, it may be a toggle button, which the user flips. In yet another embodiment, it may be a touch-sensitive button, which the user touches.
After the user operates the physical button, the drone can directly enter the preset takeoff mode. "Directly" means that the drone does not need to establish a connection with a control device before entering the preset takeoff mode; establishing such a connection is optional rather than necessary. Therefore, in one example, the user can complete the whole shooting process without a mobile phone, flight glasses, or a remote controller, operating only the physical button on the drone.
After the drone enters the preset takeoff mode, its shooting device can begin capturing images. In one embodiment, the drone may include a gimbal on which the shooting device is mounted; after the drone enters the preset takeoff mode, the gimbal can be controlled to move to a target pose so that the shooting device it carries shoots at a target angle. Referring to fig. 3, fig. 3 is a schematic view of a scene of face recognition using an unmanned aerial vehicle according to an embodiment of the present application. As shown in fig. 3, the drone may be small enough for the user to hold in the palm. After the drone enters the preset mode, the shooting device, driven by the gimbal, can rotate to face straight ahead, and the user can raise the drone so that the shooting device is aimed at his or her face, whereupon face recognition can begin. In one example, after entering the preset takeoff mode, the drone may instead control the gimbal so that the shooting device automatically shoots upward; the user then does not need to raise the drone, since it can capture the user's face while shooting upward.
After face recognition is completed, the drone can take off automatically and fly along the target flight trajectory automatically. During flight, the drone's shooting device can keep its lens aimed at the photographic subject (the user) corresponding to the recognized face, so that the main subject of the captured video is the user, producing the drone selfie video the user expects.
The target flight trajectory may be associated with the preset takeoff mode. In one embodiment, the association between the target flight trajectory and the takeoff mode may be preset by the user. For example, the user may select one of multiple flight trajectories as the target flight trajectory in advance through an application installed on the control device, so that after entering the takeoff mode and completing face recognition, the drone automatically flies along the target flight trajectory.
In one embodiment, the target flight trajectory may also be determined from the user's gesture after the drone takes off. Correspondences between different gestures and flight trajectories can be configured in the drone in advance. After the drone has scanned the face, it can take off automatically and hover in front of the photographic subject. The user can then make the gesture corresponding to the desired flight trajectory; the drone's shooting device captures an image or image sequence containing the gesture, the gesture is recognized from the image, and the flight trajectory corresponding to the recognized gesture is determined as the target flight trajectory.
The target flight trajectory may be any of a variety of flight trajectories, which may include a gradually receding trajectory, a fly-up trajectory, a circling trajectory, a spiral trajectory, and the like. With the gradually receding trajectory, the drone gradually moves away from the photographic subject in an obliquely upward direction, and the field of view in the captured picture widens gradually. With the fly-up trajectory, the drone ascends vertically at a relatively high speed. With the circling trajectory, the drone flies around the photographic subject at a constant height. With the spiral trajectory, the drone gradually increases its height while flying around the photographic subject, producing a spiral-ascent effect.
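The spiral trajectory described above can, for illustration, be sketched as simple waypoint generation around the subject. The function name and parameters are assumptions, not part of the disclosure:

```python
import math

def spiral_waypoints(radius, start_height, climb_per_rev,
                     revolutions, points_per_rev=36):
    """Generate (x, y, z) waypoints for a spiral trajectory around a
    subject at the origin: constant radius, steadily increasing height,
    giving the spiral-ascent effect described in the text."""
    waypoints = []
    total = revolutions * points_per_rev
    for i in range(total + 1):
        angle = 2 * math.pi * i / points_per_rev
        z = start_height + climb_per_rev * i / points_per_rev
        waypoints.append((radius * math.cos(angle),
                          radius * math.sin(angle),
                          z))
    return waypoints

# Two revolutions at 3 m radius, climbing 1 m per revolution.
wps = spiral_waypoints(radius=3.0, start_height=1.5,
                       climb_per_rev=1.0, revolutions=2)
```

The first and last waypoints share the same horizontal position (a whole number of revolutions) but differ in height by 2 m, which is the spiral effect.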
As described above, the target flight trajectory may be determined according to the user's gesture, so different gestures may be assigned to different flight trajectories in advance. The gestures may be dynamic. For example, in one example, the gesture corresponding to the fly-up trajectory may be a raised flat palm, the gesture corresponding to the circling trajectory may be a finger drawing a circle, the gesture corresponding to the spiral trajectory may be a finger drawing an upward tornado, and the gesture corresponding to the gradually receding trajectory may be a palm pushing forward. Referring to fig. 4, the user in fig. 4 controls the drone to fly along the gradually receding trajectory by making a forward-push gesture.
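The preconfigured gesture-to-trajectory correspondence can be sketched as a simple lookup. The gesture labels below are assumed outputs of a separate gesture recognizer (not shown) and are illustrative only:

```python
# Illustrative mapping of recognized gesture labels to flight
# trajectories, mirroring the examples in the text.
GESTURE_TO_TRAJECTORY = {
    "palm_raised_flat": "fly_up",      # vertical ascent
    "finger_circle": "circling",       # orbit at constant height
    "finger_tornado_up": "spiral",     # spiral ascent
    "palm_push_forward": "receding",   # gradually fly away
}

def select_trajectory(recognized_gesture, default="receding"):
    """Return the trajectory for a recognized gesture, falling back
    to an assumed default when the gesture is unknown."""
    return GESTURE_TO_TRAJECTORY.get(recognized_gesture, default)
```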
Because gestures may be dynamic, the user's hand may move at different speeds while making a gesture. In one embodiment, the target flight speed of the drone along the target flight trajectory may further be determined according to the movement speed of the user's gesture. The movement speed can be calculated from an image sequence, captured by the shooting device, that contains the gesture. There are multiple specific ways to calculate it, such as from optical-flow information between frames, from the shutter speed of the shooting device together with motion blur in the image, or, for certain gestures, from changes in depth information.
In one example, if the user wants the drone to fly along the gradually receding trajectory, a forward-push gesture can be made in front of the lens after the drone has completed face scanning, taken off, and hovered. If the user wants the drone to fly faster, the palm can be pushed forward more quickly; if the user wants it to fly more slowly, the palm can be pushed more slowly. While the user makes the forward-push gesture, the drone can acquire depth information of the user's palm through a depth sensor, determine the forward speed of the push from the change in that depth information, and determine from that forward speed the target flight speed for flying along the gradually receding trajectory. The depth sensor may be implemented in various ways, for example as a binocular vision sensor or a TOF sensor.
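A minimal sketch of this depth-based speed estimate, under assumed names and an assumed linear speed mapping (the gain and clamping bounds are not from the disclosure):

```python
def push_speed_from_depth(depth_samples, dt):
    """Estimate the forward speed of a push gesture from successive
    palm depth readings (metres) taken dt seconds apart. A shrinking
    depth means the palm is moving toward the sensor."""
    if len(depth_samples) < 2:
        return 0.0
    deltas = [depth_samples[i] - depth_samples[i + 1]
              for i in range(len(depth_samples) - 1)]
    # Average closing speed; negative (palm moving away) clamps to 0.
    return max(0.0, sum(deltas) / (dt * len(deltas)))

def target_flight_speed(push_speed, gain=4.0, v_min=0.5, v_max=8.0):
    """Map gesture speed to drone flight speed, clamped to assumed
    safe bounds: faster push, faster receding flight."""
    return min(v_max, max(v_min, gain * push_speed))

# Palm approaching at 0.5 m/s, sampled every 0.1 s.
speed = push_speed_from_depth([1.00, 0.95, 0.90, 0.85], dt=0.1)
```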
Different flight trajectories place different requirements on the site. For example, the circling trajectory keeps the height unchanged during flight, so it demands little vertical clearance but a relatively large area; the fly-up trajectory is the opposite, demanding more vertical clearance but less area. Therefore, in one implementation, the drone may be provided with multiple shooting devices facing different directions, for example front, rear, left, right, upward, and downward cameras in any combination. Scene images in different directions can be obtained through the different shooting devices, the current scene type can be determined from those images, and a target flight trajectory suited to that scene type can then be determined.
For example, if the user shoots among a group of high-rise buildings, the target flight trajectory adapted to the current scene can be determined, from the scene images captured by the drone's multiple shooting devices, to be the fly-up trajectory; if the user shoots indoors in a space of a certain area, the target flight trajectory adapted to the current scene can likewise be determined to be the circling trajectory.
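As a toy illustration of this site-based selection, suppose the scene analysis of the multi-camera images has already been reduced to vertical and horizontal clearance estimates. The thresholds below are invented assumptions, not values from the disclosure:

```python
def pick_trajectory(ceiling_clearance_m, horizontal_clearance_m):
    """Pick a flight trajectory from assumed clearance estimates:
    the fly-up trajectory needs vertical room, the circling
    trajectory needs horizontal room, and the gradually receding
    trajectory serves as a conservative fallback."""
    if ceiling_clearance_m >= 20.0:       # assumed threshold
        return "fly_up"
    if horizontal_clearance_m >= 5.0:     # assumed threshold
        return "circling"
    return "receding"
```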
According to the unmanned aerial vehicle control method provided in the embodiments of the present application, after the user operates the physical button on the unmanned aerial vehicle, the unmanned aerial vehicle directly enters a preset takeoff mode. In this takeoff mode, once the unmanned aerial vehicle captures a human face, it takes off automatically and flies along a target flight trajectory, and during flight it automatically shoots with the photographic subject corresponding to the captured face as the main subject. The method therefore greatly simplifies the operations required of the user when shooting video with an unmanned aerial vehicle: the user does not need to connect the unmanned aerial vehicle to a control device, and only needs to operate the physical button on the unmanned aerial vehicle directly to trigger automatic flight and automatic shooting. The whole process is quick and simple, which can greatly increase the user's enthusiasm for shooting videos with an unmanned aerial vehicle.
In one embodiment, after completing the shooting while flying along the target flight trajectory, the drone may automatically generate a film from the captured video, achieving automatic editing. Specifically, film templates may be pre-stored in the drone and managed through an application on the control device. A film template may include content such as music material, video effects, transition effects, text material, and picture material. By combining the captured video with a film template, a polished film corresponding to the captured video can be generated quickly, which is convenient for the user to share.
In one implementation, the drone can be automatically controlled to return after shooting is completed. As before, the drone may be small enough to be held in the palm, in which case it can take off from the user's palm and also land on the user's palm to complete the return. Specifically, after shooting is completed, the drone can be controlled to return to a target position, which may be a specified height above the takeoff position, for example 0.5 m above it. Here, the takeoff position may be a position recorded by the drone at takeoff. After the drone returns to the target position, it can begin landing. Referring to fig. 5, fig. 5 is a schematic view of a scene of handheld landing of an unmanned aerial vehicle provided in an embodiment of the present application.
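The return-target computation described above (a specified height above the recorded takeoff position) reduces to a one-line offset; the function name is illustrative:

```python
def return_target(takeoff_position, hover_height=0.5):
    """Compute the return target: a point the specified height above
    the recorded takeoff position (0.5 m in the example in the text).
    Positions are (x, y, z) tuples in metres, z pointing up."""
    x, y, z = takeoff_position
    return (x, y, z + hover_height)

target = return_target((10.0, -4.0, 0.0))
# the drone hovers at this target before beginning the palm landing
```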
When the drone begins to land, palm detection can be performed on the region below it; when a target palm is detected, the drone can be controlled to land on the target palm. The palm detection can be realized through image recognition: specifically, a shooting device may be provided below the drone, images are captured through it, and palm recognition is performed on the captured images to detect the target palm.
In some cases, to shorten the landing time, the user may lift the palm while the drone is descending so as to catch it as early as possible. However, the drone's blades rotate at high speed, and when the distance between the drone and the user's palm shortens quickly, the risk of the user being cut by the drone increases. For safety, in one implementation, the distance between the drone and the target palm can be obtained through a distance sensor. When the rate at which this distance shortens is significantly greater than the drone's descent speed, that is, when the difference between the shortening rate and the descent speed exceeds a preset threshold, it can be determined that the user is lifting the palm. At that moment, the drone's descent speed can be appropriately reduced by slightly increasing the blade speed, which slows the closing of the distance; the audible increase in blade speed can also, to some extent, remind the user to slow down. Of course, if the distance between the drone and the target palm is zero or close to zero, the blade motors can be braked rapidly so that the drone stops spinning immediately.
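This landing safety logic can be sketched as follows. The function and parameter names, the rate margin, and the stop distance are illustrative assumptions:

```python
def landing_command(palm_distance, closure_rate, descent_speed,
                    rate_margin=0.3, stop_distance=0.05):
    """Decide the descent action while landing on a palm.
    palm_distance: drone-to-palm distance from the distance sensor (m).
    closure_rate: how fast that distance is shrinking (m/s).
    descent_speed: the currently commanded descent speed (m/s).
    Returns (action, speed)."""
    if palm_distance <= stop_distance:
        # At (near) contact: brake the blade motors immediately.
        return ("brake", 0.0)
    if closure_rate - descent_speed > rate_margin:
        # The palm is rising toward the drone: slow the descent
        # (assumed here as halving the commanded speed).
        return ("descend", descent_speed * 0.5)
    return ("descend", descent_speed)
```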
To enrich the user's personalized expression, in one embodiment an ambience light may be provided on the arm and/or fuselage of the drone. Ambience lights can have different light-emitting modes and colors. The modes may include breathing, steady-on, rainbow, throttle (light brightness linked to how far the drone's acceleration stick is pushed), fast flash, slow flash, flowing light (multiple ambience lights lighting in a certain sequence), and so on; the colors may include white, red, orange, yellow, green, rainbow (gradual change among multiple colors), and so on.
The user can set the color or mode of the ambience light on the drone through the control device. As mentioned above, the control device may establish communication with the drone and may include a terminal device or flight glasses. The user may set the light color and/or mode through an application installed on the terminal device or flight glasses, which then sends the color information corresponding to the selected color and/or the mode information corresponding to the selected mode to the drone; the drone controls the ambience light to switch to the corresponding color and/or mode according to that information.
As described above, the drone performs face recognition on the captured image after entering the preset takeoff mode. Face recognition takes a certain amount of time and may in some cases fail, for example if the user is shaking or the environment is too dark. Therefore, in one embodiment, the progress of face recognition may be indicated to the user through the ambience light: while the drone is recognizing a face, the ambience light can be controlled to emit light in a first color and/or a first mode, such as blinking orange; after recognition is completed, the light can be controlled to emit in a second color and/or a second mode, such as steady green.
When the drone takes off from the user's palm, the ambience light can also prompt the user. Because the blades must reach a certain rotating speed (which may be called the target rotating speed) before the drone can take off, the user needs to keep holding the drone up until the blade speed reaches the target speed. Therefore, in one embodiment, the state of the blade speed may be indicated through the ambience light: before the blade speed of the drone reaches the target speed, the light can be controlled to emit in a third color or a third mode, such as blinking red; after the blade speed reaches the target speed, the light can emit in a fourth color or a fourth mode, such as steady green. This change of the ambience light helps the user determine when the drone no longer needs to be held, preventing the drone from falling due to insufficient blade speed.
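The spin-up indication above amounts to a two-state mapping from blade speed to light state. A minimal sketch, with the concrete colors and rpm values assumed for illustration:

```python
def takeoff_lamp_state(blade_rpm, target_rpm):
    """Return the ambience-light state during a palm takeoff.

    Below the target rpm the user must keep holding the drone, so the
    light blinks red (third color/mode in the description); once the
    target rpm is reached it turns steady green (fourth color/mode).
    """
    if blade_rpm < target_rpm:
        return ("red", "blink")
    return ("green", "steady")
```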
When the drone takes off from the palm, the user should preferably keep the palm level for safety. In one embodiment, an attitude sensor may be configured in the drone to acquire its current attitude and determine whether the drone is level. When the user's palm is not level and tilts noticeably, that is, when the difference between the drone's current attitude and the horizontal attitude is greater than a preset threshold, the ambience light can be controlled to emit in a fifth color or a fifth mode; when the palm is level, the light can emit in a sixth color or a sixth mode. This change of the ambience light helps the user adjust the palm so that the drone can take off safely.
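The levelness check above can be sketched as a comparison of roll and pitch against a threshold. The threshold value and the return labels are assumptions for illustration; the patent leaves the fifth and sixth colors/modes unspecified:

```python
LEVEL_THRESHOLD_DEG = 10.0  # assumed preset threshold between current and horizontal attitude

def palm_level_lamp(roll_deg, pitch_deg):
    """Choose an ambience-light state from the attitude sensor's roll/pitch.

    A large tilt means the user's palm is not level, so the light warns
    (fifth color/mode); otherwise it signals the palm is level (sixth).
    """
    tilt = max(abs(roll_deg), abs(pitch_deg))
    if tilt > LEVEL_THRESHOLD_DEG:
        return "tilt_warning"   # fifth color or fifth mode
    return "level_ok"           # sixth color or sixth mode
```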
In one embodiment, when the ambience light emits in the breathing mode, its breathing frequency may be determined according to the drone's remaining battery power. In one example, the lower the remaining power, the higher the breathing frequency; the brightness of the light may be reduced at the same time, sending the user an unmistakable low-power signal.
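One possible mapping from remaining power to breathing parameters is a linear interpolation. The endpoint frequencies, the brightness floor, and the function name are all assumptions for the sketch; the patent only specifies the direction of the relationship (lower power, faster breathing, dimmer light):

```python
def breathing_params(battery_pct):
    """Map remaining battery (0..100 %) to (breaths per minute, brightness 0..1).

    Lower battery -> faster breathing and dimmer light, so the change
    is an obvious low-power signal to the user.
    """
    battery_pct = max(0.0, min(100.0, battery_pct))
    # Assumed endpoints: 12 bpm at full charge, up to 60 bpm when empty.
    frequency = 12.0 + 48.0 * (100.0 - battery_pct) / 100.0
    # Brightness falls with charge, with an assumed visible floor of 20 %.
    brightness = max(0.2, battery_pct / 100.0)
    return frequency, brightness
```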
With an ambience light offering multiple selectable lighting colors and modes, the drone gains a certain ornamental value. In one embodiment, the ambience light may operate in the breathing mode with its breathing frequency matched to the rhythm of music, giving the user a unified visual and auditory experience. Specifically, the drone can acquire rhythm information of target music and control the breathing frequency of the ambience light according to that information, so that the moments at which the light brightens coincide with the beat points of the target music.
The rhythm information of the target music may be obtained by the drone from the control device; for example, if the control device includes a terminal device, the target music may be the music currently played by the terminal device, or music specified by the user through an application on the terminal device. The rhythm information may also be obtained by analyzing an audio signal collected by the drone's microphone.
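Given a tempo from the rhythm information, the moments at which the light should peak can be scheduled on the beat grid. A minimal sketch under the assumption that the rhythm information reduces to a beats-per-minute value; the function name is illustrative:

```python
def beat_times(bpm, duration_s):
    """Times (s) at which the breathing light should reach peak brightness,
    so that each 'breath' peak lands on a beat of the target music."""
    period = 60.0 / bpm   # seconds per beat
    times = []
    t = 0.0
    while t < duration_s:
        times.append(round(t, 6))
        t += period
    return times
```

At 120 bpm over two seconds this yields peaks at 0.0, 0.5, 1.0 and 1.5 s, i.e. one light peak per beat.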
With the drone control method provided by the embodiments of the present application, the user only needs to operate the physical button on the drone: the drone can then automatically recognize a face, automatically take off and fly, automatically shoot video of the subject corresponding to the face, and automatically edit the captured video. This greatly simplifies the process of shooting a film with the drone; the whole process is quick and simple and can greatly increase the user's enthusiasm for filming with the drone. In addition, according to the method provided by the embodiments of the present application, the drone may be provided with an ambience light, so that the user can personalize the drone's appearance through the light and be prompted by it during stages such as face recognition and hand-held takeoff, enriching the interaction between the user and the drone.
Reference may be made to fig. 6, which is a schematic structural diagram of a drone control apparatus provided in an embodiment of the present application. The drone control apparatus provided by the embodiment of the present application includes a processor 610 and a memory 620 storing a computer program which, when executed by the processor, implements the following steps:
after the unmanned aerial vehicle is started, acquiring the operation of a user on an entity button on the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the entity key;
in the take-off mode, when an image shot by a shooting device carried by the unmanned aerial vehicle comprises a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off;
and after the unmanned aerial vehicle is controlled to take off, the unmanned aerial vehicle is controlled to fly along a target flight track, and the shooting device is controlled to shoot by taking a shooting object corresponding to the human face as a main body in the flying process.
Optionally, the target flight trajectory is associated with the preset takeoff mode.
Optionally, the association relationship between the target flight trajectory and the preset takeoff mode is preset by a user.
Optionally, the processor is further configured to, after the shooting is completed, generate a movie by using a video obtained by shooting when the unmanned aerial vehicle flies in the target flight trajectory, so as to implement automatic clipping.
Optionally, the processor is configured to clip the captured video according to a pre-stored filming template when clipping the captured video.
Optionally, the filming template comprises one or more of: music material, video special effects, transition effects, text material and picture material.
Optionally, the processor is further configured to,
before controlling the unmanned aerial vehicle to fly along the target flight trajectory, recognize a user gesture through the shooting device and determine the target flight trajectory of the unmanned aerial vehicle according to the recognition result.
Optionally, the processor is further configured to determine a movement speed of the user gesture according to the image including the user gesture captured by the capturing device; and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies with the target flight trajectory according to the movement speed of the user gesture.
Optionally, the target flight trajectory includes a gradually-distant flight trajectory, and the user gesture corresponding to the gradually-distant flight trajectory includes a forward-pushing gesture.
Optionally, the processor is configured to obtain depth information of the palm of the user when determining a target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies in the target flight trajectory according to the movement speed of the user gesture; determining a forward speed of the forward push gesture according to a change in depth information of the user's palm; and determining the target flight speed of the unmanned aerial vehicle when flying with the gradually-far flight trajectory according to the forward speed.
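The depth-based speed mapping described above can be sketched as follows. This is a minimal illustration under stated assumptions: the function name, sampling scheme, gain, and speed cap are inventions for the sketch, not the patent's implementation:

```python
def forward_speed(depth_samples, dt, gain=1.0, max_speed=5.0):
    """Estimate a push gesture's forward speed from successive palm depth
    readings (m) taken dt seconds apart, then scale it into the drone's
    target flight speed for the gradually-distant trajectory.
    """
    if len(depth_samples) < 2:
        return 0.0
    # Average rate at which the palm's depth decreases (palm moving toward
    # the camera means the depth value shrinks).
    rate = (depth_samples[0] - depth_samples[-1]) / (dt * (len(depth_samples) - 1))
    # Only a forward push drives the drone; clamp to a safe maximum.
    speed = max(0.0, rate) * gain
    return min(speed, max_speed)
```

For instance, a palm whose depth drops from 2.0 m to 1.0 m over two 0.25 s intervals pushes forward at 2.0 m/s, which (with unit gain) becomes the drone's target flight speed.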
Optionally, the target flight trajectory includes one or more of: a gradually-distant flight trajectory, a sky-rushing flight trajectory, a surrounding flight trajectory and a spiral flight trajectory.
Optionally, when flying along the gradually-distant flight trajectory, the unmanned aerial vehicle moves away from the shooting object in an obliquely upward direction.
Optionally, the unmanned aerial vehicle includes multiple cameras disposed at different orientations, and the processor is further configured to acquire scene images of multiple orientations through the multiple cameras before controlling the unmanned aerial vehicle to fly along a target flight trajectory; and determining a target flight track corresponding to the current scene according to the scene image.
Optionally, the processor is configured to control the unmanned aerial vehicle to return to a target position when the unmanned aerial vehicle returns.
Optionally, the target position includes a specified height above a takeoff position of the drone.
Optionally, the processor is further configured to, after the unmanned aerial vehicle navigates back to the target position, detect a palm in a region below the unmanned aerial vehicle, and when a target palm is detected, control the unmanned aerial vehicle to land on the target palm.
Optionally, the processor is configured to, when controlling the unmanned aerial vehicle to land on the target palm, acquire the distance between the unmanned aerial vehicle and the target palm, and reduce the landing speed of the unmanned aerial vehicle when the rate at which that distance shortens is greater than the landing speed of the unmanned aerial vehicle.
Optionally, the drone further comprises an atmosphere light.
Optionally, the unmanned aerial vehicle is connected to a control device, and the processor is further configured to control the atmosphere lamp to emit light in a color corresponding to the color information and/or in a mode corresponding to the mode information according to the color information and/or the mode information sent by the control device.
Optionally, the processor is further configured to, when performing face recognition on the image captured by the capturing device, control the ambience lamp to emit light in a first color or a first mode before the recognition is completed, and control the ambience lamp to emit light in a second color or a second mode after the recognition is completed.
Optionally, the processor is further configured to, when the unmanned aerial vehicle is controlled to take off, control the atmosphere lamp to emit light in a third color or a third mode before a paddle rotation speed of the unmanned aerial vehicle does not reach a target rotation speed, and control the atmosphere lamp to emit light in a fourth color or a fourth mode after the paddle rotation speed reaches the target rotation speed.
Optionally, the lighting mode of the atmosphere lamp comprises a breathing mode.
Optionally, the processor is further configured to determine a breathing frequency of the atmosphere lamp according to a remaining power of the unmanned aerial vehicle.
Optionally, the processor is further configured to obtain rhythm information of the target music; and adjusting the breathing frequency of the atmosphere lamp according to the rhythm information.
Optionally, the rhythm information of the target music is acquired from a control device of the drone.
For the above embodiments of the unmanned aerial vehicle control device, reference may be made to the related description in the foregoing for specific implementation, and details are not repeated here.
With the drone control apparatus provided by the present application, after the user operates the physical button on the drone, the drone directly enters a preset takeoff mode. In this takeoff mode, if the drone captures a human face, it automatically takes off, flies along a target flight trajectory, and during flight automatically shoots with the subject corresponding to the captured face as the main subject. The apparatus thus greatly simplifies shooting video with the drone: the user does not need to connect the drone to a control device and only needs to operate the physical button on the drone directly to trigger automatic flight and automatic shooting. The whole process is quick and simple and can greatly increase the user's enthusiasm for shooting video with the drone.
Reference may be made to fig. 7, which is a schematic structural diagram of the unmanned aerial vehicle provided in an embodiment of the present application. The unmanned aerial vehicle provided by the embodiment of the present application includes:
a body 710 on which a physical key is disposed;
a driving device 720 connected to the body and configured to provide power for the unmanned aerial vehicle;
a pan/tilt head 730 connected to the body;
a camera 740 mounted on the pan/tilt head;
a processor 750 and a memory 760 storing a computer program, the processor implementing the following steps when executing the computer program:
after the unmanned aerial vehicle is started, acquiring the operation of a user on the entity key;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the entity key;
in the takeoff mode, when the image shot by the shooting device comprises a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off;
and after the unmanned aerial vehicle is controlled to take off, the unmanned aerial vehicle is controlled to fly along a target flight track, and the shooting device is controlled to shoot by taking a shooting object corresponding to the human face as a main body in the flying process.
Optionally, the target flight trajectory is associated with the preset takeoff mode.
Optionally, the association relationship between the target flight trajectory and the preset takeoff mode is preset by a user.
Optionally, the processor is further configured to, after the shooting is completed, generate a movie by using a video obtained by shooting when the unmanned aerial vehicle flies in the target flight trajectory, so as to implement automatic clipping.
Optionally, the processor is configured to clip the captured video according to a pre-stored filming template when clipping the captured video.
Optionally, the filming template comprises one or more of: music material, video special effects, transition effects, text material and picture material.
Optionally, the processor is further configured to, before controlling the unmanned aerial vehicle to fly with the target flight trajectory, identify the user gesture through the shooting device, and determine the target flight trajectory of the unmanned aerial vehicle according to an identification result.
Optionally, the processor is further configured to determine a movement speed of the user gesture according to the image including the user gesture captured by the capturing device; and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies with the target flight trajectory according to the movement speed of the user gesture.
Optionally, the target flight trajectory includes a gradually-distant flight trajectory, and the user gesture corresponding to the gradually-distant flight trajectory includes a forward-pushing gesture.
Optionally, the processor is configured to obtain depth information of the palm of the user when determining a target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies in the target flight trajectory according to the movement speed of the user gesture; determining a forward speed of the forward push gesture according to a change in depth information of the user's palm; and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies in the gradually-distant flight trajectory according to the forward speed.
Optionally, the target flight trajectory includes one or more of: a gradually-distant flight trajectory, a sky-rushing flight trajectory, a surrounding flight trajectory and a spiral flight trajectory.
Optionally, when flying along the gradually-distant flight trajectory, the unmanned aerial vehicle moves away from the shooting object in an obliquely upward direction.
Optionally, the drone includes a plurality of cameras disposed at different orientations, and the processor is further configured to acquire scene images at a plurality of orientations through the plurality of cameras before controlling the drone to fly at the target flight trajectory; and determining a target flight track corresponding to the current scene according to the scene image.
Optionally, the processor is configured to control the unmanned aerial vehicle to return to a target position when the unmanned aerial vehicle returns.
Optionally, the target position includes a specified height above a takeoff position of the drone.
Optionally, the processor is further configured to, after the unmanned aerial vehicle navigates back to the target position, detect a palm in a region below the unmanned aerial vehicle, and when a target palm is detected, control the unmanned aerial vehicle to land on the target palm.
Optionally, the processor is configured to, when controlling the unmanned aerial vehicle to land on the target palm, acquire the distance between the unmanned aerial vehicle and the target palm, and reduce the landing speed of the unmanned aerial vehicle when the rate at which that distance shortens is greater than the landing speed of the unmanned aerial vehicle.
Optionally, the drone further comprises an atmosphere light.
Optionally, the unmanned aerial vehicle is connected to a control device, and the processor is further configured to control the atmosphere lamp to emit light in a color corresponding to the color information and/or in a mode corresponding to the mode information according to the color information and/or the mode information sent by the control device.
Optionally, the processor is further configured to, when performing face recognition on the image captured by the capturing device, control the ambience lamp to emit light in a first color or a first mode before the recognition is completed, and control the ambience lamp to emit light in a second color or a second mode after the recognition is completed.
Optionally, the processor is further configured to, when the unmanned aerial vehicle is controlled to take off, control the atmosphere lamp to emit light in a third color or a third mode before a paddle rotation speed of the unmanned aerial vehicle does not reach a target rotation speed, and control the atmosphere lamp to emit light in a fourth color or a fourth mode after the paddle rotation speed reaches the target rotation speed.
Optionally, the lighting mode of the atmosphere lamp comprises a breathing mode.
Optionally, the processor is further configured to determine a breathing frequency of the atmosphere lamp according to a remaining power of the unmanned aerial vehicle.
Optionally, the processor is further configured to obtain rhythm information of the target music; and adjusting the breathing frequency of the atmosphere lamp according to the rhythm information.
Optionally, the rhythm information of the target music is acquired from a control device of the drone.
For various embodiments of the above provided unmanned aerial vehicle, specific implementations thereof may refer to the relevant descriptions in the foregoing, and are not described herein again.
With the drone provided by the present application, after the user operates the physical button on the drone, the drone directly enters a preset takeoff mode. In this takeoff mode, if the drone captures a human face, it automatically takes off, flies along a target flight trajectory, and during flight automatically shoots with the subject corresponding to the captured face as the main subject. This greatly simplifies shooting video with the drone: the user does not need to connect the drone to a control device and only needs to operate the physical button on the drone directly to trigger automatic flight and automatic shooting. The whole process is quick and simple and can greatly increase the user's enthusiasm for shooting video with the drone.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program implements any one of the methods for controlling an unmanned aerial vehicle provided in the embodiment of the present application.
Above, various embodiments have been provided for each protected subject. Provided there is no conflict or contradiction, a person skilled in the art can freely combine the various embodiments according to the actual situation to form further technical solutions. For reasons of space, this disclosure does not expand on all such combinations, but it should be understood that the technical solutions not expanded upon also belong to the scope disclosed in the embodiments of the present disclosure.
Embodiments of the present application may take the form of a computer program product embodied on one or more storage media including, but not limited to, disk storage, CD-ROM, optical storage, and the like, in which program code is embodied. Computer-usable storage media include permanent and non-permanent, removable and non-removable media, and information storage may be implemented by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of the storage medium of the computer include, but are not limited to: phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technologies, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic tape storage or other magnetic storage devices, or any other non-transmission medium, may be used to store information that may be accessed by a computing device.
It is noted that, herein, relational terms such as first and second, and the like may be used solely to distinguish one entity or action from another entity or action without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element.
The method and apparatus provided by the embodiments of the present invention are described in detail above, and the principle and the embodiments of the present invention are explained in detail herein by using specific examples, and the description of the embodiments is only used to help understanding the method and the core idea of the present invention; meanwhile, for a person skilled in the art, according to the idea of the present invention, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present invention.

Claims (76)

1. A method for controlling an unmanned aerial vehicle, wherein a shooting device is mounted on the unmanned aerial vehicle, the method comprising:
after the unmanned aerial vehicle is started, acquiring the operation of a user on an entity key on the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the entity key;
in the takeoff mode, when the image shot by the shooting device comprises a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off;
and after the unmanned aerial vehicle is controlled to take off, the unmanned aerial vehicle is controlled to fly along a target flight track, and the shooting device is controlled to shoot by taking a shooting object corresponding to the human face as a main body in the flying process.
2. The method of claim 1, wherein the target flight trajectory is associated with the preset takeoff mode.
3. The method as claimed in claim 2, characterized in that the association of the target flight trajectory with the preset takeoff mode is preset by a user.
4. The method of claim 1, further comprising:
and after shooting is finished, generating a film from the video captured while the unmanned aerial vehicle flies along the target flight trajectory, so as to implement automatic clipping.
5. The method according to claim 1, wherein the generating a movie using the video obtained by shooting comprises:
and editing the shot video according to a pre-stored filming template to generate a film.
6. The method of claim 5, wherein the filming template comprises one or more of: music material, video special effects, transition effects, text material and picture material.
7. The method of claim 1, wherein prior to controlling the drone to fly at a target flight trajectory, the method further comprises:
and recognizing the user gesture through the shooting device, and determining the target flight track of the unmanned aerial vehicle according to the recognition result.
8. The method of claim 7, further comprising:
determining the movement speed of the user gesture according to the image which is shot by the shooting device and comprises the user gesture;
and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies with the target flight trajectory according to the movement speed of the user gesture.
9. The method of claim 8, wherein the target flight trajectory comprises a progressively farther flight trajectory, and wherein the user gesture corresponding to the progressively farther flight trajectory comprises a forward-push gesture.
10. The method of claim 9, wherein determining a target flight speed of the drone while flying at the target flight trajectory according to the speed of motion of the user gesture comprises:
acquiring depth information of a palm of a user;
determining a forward speed of the forward push gesture according to a change in depth information of the user's palm;
and determining the target flight speed of the unmanned aerial vehicle when flying with the gradually-far flight trajectory according to the forward speed.
11. The method of claim 1, wherein the target flight trajectory comprises one or more of: a gradually-distant flight trajectory, a sky-rushing flight trajectory, a surrounding flight trajectory and a spiral flight trajectory.
12. The method of claim 11, wherein the unmanned aerial vehicle moves away from the shooting object in an obliquely upward direction while flying along the gradually-distant flight trajectory.
13. The method of claim 1, wherein the drone includes a plurality of cameras disposed at different orientations, the method further comprising, prior to controlling the drone to fly at the target flight trajectory:
acquiring scene images of a plurality of orientations through the plurality of shooting devices;
and determining a target flight track corresponding to the current scene according to the scene image.
14. The method of claim 1, wherein the controlling the drone to return comprises:
and controlling the unmanned aerial vehicle to return to the target position.
15. The method of claim 14, wherein the target position comprises a specified height above a takeoff position of the drone.
16. The method of claim 15, further comprising:
after the unmanned aerial vehicle navigates back to the target position, the palm detection is carried out on the area below the unmanned aerial vehicle, and when a target palm is detected, the unmanned aerial vehicle is controlled to land on the target palm.
17. The method of claim 16, wherein said controlling said drone to land on said target palm comprises:
acquiring the distance between the unmanned aerial vehicle and the target palm;
and when the rate at which the distance between the unmanned aerial vehicle and the target palm shortens is greater than the landing speed of the unmanned aerial vehicle, reducing the landing speed of the unmanned aerial vehicle.
18. The method of claim 1, wherein the drone further comprises an atmosphere light.
19. The method of claim 18, wherein the drone is connected to a control device, the method further comprising:
and controlling the atmosphere lamp to emit light in the color corresponding to the color information and/or in the mode corresponding to the mode information according to the color information and/or the mode information sent by the control equipment.
20. The method of claim 18, further comprising:
when the face recognition is carried out on the image shot by the shooting device, the atmosphere lamp is controlled to emit light in a first color or a first mode before the recognition is finished, and the atmosphere lamp is controlled to emit light in a second color or a second mode after the recognition is finished.
21. The method of claim 18, further comprising:
when the unmanned aerial vehicle is controlled to take off, the atmosphere lamp is controlled to emit light in a third color or a third mode before the rotating speed of blades of the unmanned aerial vehicle does not reach a target rotating speed, and the atmosphere lamp is controlled to emit light in a fourth color or a fourth mode after the rotating speed of the blades reaches the target rotating speed.
22. The method of claim 18, wherein the lighting mode of the atmosphere lamp comprises a breathing mode.
23. The method of claim 22, further comprising:
determining a breathing frequency of the atmosphere lamp according to a remaining battery level of the unmanned aerial vehicle.
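One plausible mapping for claim 23 ties the breathing frequency linearly to the remaining battery; the frequency range and the low-battery-breathes-faster convention are assumptions, not stated in the claim:

```python
def breathing_frequency_hz(battery_pct, f_min=0.2, f_max=1.5):
    # Clamp the reading to 0-100 %.
    battery_pct = max(0.0, min(100.0, battery_pct))
    # Full battery breathes slowly (f_min); an empty battery breathes
    # fast (f_max) as a visual low-power warning.
    return f_max - (f_max - f_min) * battery_pct / 100.0
```
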
24. The method of claim 22, further comprising:
acquiring rhythm information of target music;
and adjusting the breathing frequency of the atmosphere lamp according to the rhythm information.
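For claim 24, the breathing frequency can be derived from the music's tempo (beats per minute); synchronizing one breath cycle to every four beats is an illustrative choice, not a claimed value:

```python
def breathing_from_tempo(bpm, beats_per_cycle=4):
    # One full brighten/dim cycle every `beats_per_cycle` beats of the music,
    # expressed as cycles per second.
    return (bpm / 60.0) / beats_per_cycle
```
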
25. The method of claim 24, wherein the tempo information of the target music is obtained from a control device of the drone.
26. An unmanned aerial vehicle control apparatus, comprising: a processor and a memory storing a computer program, wherein the processor implements the following steps when executing the computer program:
after the unmanned aerial vehicle is started, acquiring a user's operation of a physical button on the unmanned aerial vehicle;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the physical button;
in the takeoff mode, when an image shot by a shooting device carried by the unmanned aerial vehicle includes a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off;
after the unmanned aerial vehicle is controlled to take off, controlling the unmanned aerial vehicle to fly along a target flight trajectory, and controlling the shooting device to shoot with a subject corresponding to the human face as the main subject during the flight.
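The one-button takeoff flow recited in claims 26 and 51 (a button press enters a preset takeoff mode, and a detected face then triggers the power system) can be sketched as a small state machine. This is an illustration only; the class and method names are assumptions, not part of the claims:

```python
class DroneSim:
    """Minimal simulator of the one-button, face-gated takeoff flow."""

    def __init__(self):
        self.mode = None
        self.flying = False

    def press_takeoff_button(self):
        # Pressing the physical button puts the drone directly into
        # the preset takeoff mode, with no further menu navigation.
        self.mode = "preset_takeoff"

    def on_frame(self, face_detected: bool):
        # In takeoff mode, the first frame containing a face starts
        # the power system and triggers takeoff.
        if self.mode == "preset_takeoff" and face_detected and not self.flying:
            self.flying = True
            return "takeoff"
        return "waiting"
```
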
27. The apparatus of claim 26, wherein the target flight trajectory is associated with the preset takeoff mode.
28. The apparatus as claimed in claim 27, wherein the association of the target flight trajectory with the preset takeoff mode is preset by a user.
29. The apparatus of claim 26, wherein the processor is further configured to, after the shooting is completed, generate a film from video captured while the drone flies along the target flight trajectory, so as to implement automatic editing.
30. The apparatus of claim 29, wherein the processor, when editing the captured video, is configured to edit the captured video according to a pre-stored filming template.
31. The apparatus of claim 30, wherein the filming template comprises one or more of: music material, video special effects, transition effects, text material, and picture material.
32. The apparatus of claim 26, wherein the processor is further configured to,
before controlling the unmanned aerial vehicle to fly along the target flight trajectory, recognize a user gesture through the shooting device and determine the target flight trajectory of the unmanned aerial vehicle according to a recognition result.
33. The device of claim 32, wherein the processor is further configured to determine a movement speed of the user gesture from the image captured by the capturing device that includes the user gesture; and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies with the target flight trajectory according to the movement speed of the user gesture.
34. The apparatus of claim 33, wherein the target flight trajectory comprises a gradually-distant flight trajectory, and the gradually-distant flight trajectory corresponds to a user gesture comprising a forward-push gesture.
35. The apparatus of claim 34, wherein the processor, when determining the target flight speed of the drone in flight along the target flight trajectory according to the movement speed of the user gesture, is configured to acquire depth information of a user's palm; determine a forward speed of the forward-push gesture according to a change in the depth information of the user's palm; and determine the target flight speed of the drone when flying along the gradually-distant flight trajectory according to the forward speed.
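Claim 35's depth-based speed estimate might look like the following sketch; the gain and speed ceiling are assumed safety parameters, not claimed values:

```python
def push_speed_from_depth(depth_prev, depth_curr, dt):
    # Depth = palm-to-camera distance; a forward push makes it shrink,
    # so a shrinking depth yields a positive push speed (m/s).
    return max(0.0, (depth_prev - depth_curr) / dt)

def target_flight_speed(push_speed, gain=2.0, v_max=5.0):
    # A faster push maps to a faster retreat, clamped to a safety ceiling.
    return min(v_max, gain * push_speed)
```
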
36. The apparatus of claim 26, wherein the target flight trajectory comprises one or more of: a gradually-distant flight trajectory, a sky-rushing flight trajectory, a surrounding flight trajectory and a spiral flight trajectory.
37. The apparatus of claim 36, wherein the drone moves away from the photographic subject in an obliquely upward direction when flying along the gradually-distant flight trajectory.
38. The apparatus of claim 26, wherein the drone includes a plurality of cameras disposed at different orientations, the processor further configured to acquire scene images at a plurality of orientations via the plurality of cameras prior to controlling the drone to fly at the target flight trajectory; and determining a target flight track corresponding to the current scene according to the scene image.
39. The apparatus of claim 26, wherein the processor, when controlling the drone to return, is configured to control the drone to return to a target location.
40. The apparatus of claim 39, wherein the target position comprises a specified height above a takeoff position of the drone.
41. The apparatus of claim 40, wherein the processor is further configured to perform palm detection on an area below the drone after the drone has navigated back to the target location, and when a target palm is detected, control the drone to land on the target palm.
42. The apparatus of claim 41, wherein the processor, when controlling the drone to land on the target palm, is configured to acquire a distance between the drone and the target palm, and to reduce a landing speed of the drone when a rate at which the distance between the drone and the target palm shortens is greater than the landing speed of the drone.
43. The apparatus of claim 26, wherein the drone further comprises an atmosphere light.
44. The apparatus of claim 43, wherein the unmanned aerial vehicle is connected to a control device, and the processor is further configured to control the ambience lamp to emit light in a color corresponding to the color information and/or in a mode corresponding to the mode information according to the color information and/or the mode information sent by the control device.
45. The device of claim 43, wherein the processor is further configured to control the ambience lamp to emit light in a first color or a first mode before recognition is completed and to control the ambience lamp to emit light in a second color or a second mode after recognition is completed when the image captured by the capturing device is subjected to face recognition.
46. The apparatus of claim 43, wherein the processor is further configured to control the ambience lamp to emit light in a third color or a third mode before a blade speed of the drone reaches a target speed and to control the ambience lamp to emit light in a fourth color or a fourth mode after the blade speed reaches the target speed at takeoff of the drone.
47. The apparatus of claim 43, wherein the lighting mode of the atmosphere lamp comprises a breathing mode.
48. The apparatus of claim 47, wherein the processor is further configured to determine a breathing frequency of the ambience light based on a remaining power of the drone.
49. The apparatus of claim 47, wherein the processor is further configured to obtain tempo information of the target music; and adjusting the breathing frequency of the atmosphere lamp according to the rhythm information.
50. The apparatus of claim 49, wherein the tempo information of the target music is obtained from a control device of the drone.
51. An unmanned aerial vehicle, comprising:
a body, wherein a physical button is provided on the body;
a driving device connected to the body and configured to provide power for the unmanned aerial vehicle;
a gimbal connected to the body;
a shooting device mounted on the gimbal;
a processor and a memory storing a computer program, the processor implementing the following steps when executing the computer program:
after the unmanned aerial vehicle is started, acquiring a user's operation of the physical button;
controlling the unmanned aerial vehicle to directly enter a preset takeoff mode according to the operation of the physical button;
in the takeoff mode, when an image shot by the shooting device includes a human face, controlling the unmanned aerial vehicle to start a power system so as to control the unmanned aerial vehicle to take off;
after the unmanned aerial vehicle is controlled to take off, controlling the unmanned aerial vehicle to fly along a target flight trajectory, and controlling the shooting device to shoot with a subject corresponding to the human face as the main subject during the flight.
52. A drone according to claim 51, wherein the target flight trajectory is associated with the preset takeoff mode.
53. A drone according to claim 52, wherein the association of the target flight trajectory with the preset takeoff mode is preset by the user.
54. A drone according to claim 51, wherein the processor is further configured to, after the shooting is completed, generate a film from video captured while the drone flies along the target flight trajectory, so as to implement automatic editing.
55. A drone according to claim 54, wherein the processor, when editing the captured video, is configured to edit the captured video according to a pre-stored filming template.
56. The drone of claim 55, wherein the filming template comprises one or more of: music material, video special effects, transition effects, text material, and picture material.
57. A drone according to claim 51, wherein the processor is further configured to identify, by the camera, a user gesture before controlling the drone to fly with a target flight trajectory, and determine the target flight trajectory of the drone according to the identification result.
58. The drone of claim 57, wherein the processor is further configured to determine a speed of movement of the user gesture from the image captured by the camera that includes the user gesture; and determining the target flight speed of the unmanned aerial vehicle when the unmanned aerial vehicle flies with the target flight trajectory according to the movement speed of the user gesture.
59. A drone as claimed in claim 58, wherein the target flight trajectory comprises a gradually-distant flight trajectory, and the gradually-distant flight trajectory corresponds to a user gesture comprising a forward-push gesture.
60. A drone as claimed in claim 59, wherein the processor, when determining the target flight speed of the drone while flying along the target flight trajectory according to the movement speed of the user gesture, is configured to acquire depth information of a user's palm; determine a forward speed of the forward-push gesture according to a change in the depth information of the user's palm; and determine the target flight speed of the drone when flying along the gradually-distant flight trajectory according to the forward speed.
61. A drone as claimed in claim 51, wherein the target flight trajectory includes one or more of: a gradually-distant flight trajectory, a sky-rushing flight trajectory, a surrounding flight trajectory and a spiral flight trajectory.
62. A drone as claimed in claim 61, wherein the drone moves away from the photographic subject in an obliquely upward direction when flying along the gradually-distant flight trajectory.
63. A drone according to claim 51, comprising a plurality of cameras disposed at different orientations, the processor being further configured to acquire scene images at a plurality of orientations from the plurality of cameras prior to controlling the drone to fly at the target flight trajectory; and determining a target flight track corresponding to the current scene according to the scene image.
64. A drone as claimed in claim 51, wherein the processor, when controlling the drone to return, is configured to control the drone to return to a target location.
65. The drone of claim 64, wherein the target location comprises a specified height above a takeoff location of the drone.
66. The drone of claim 65, wherein the processor is further configured to perform palm detection on an area below the drone after the drone has navigated back to the target location, and to control the drone to land on the target palm when the target palm is detected.
67. A drone according to claim 66, wherein the processor, when controlling the drone to land on the target palm, is configured to acquire a distance between the drone and the target palm, and to reduce a landing speed of the drone when a rate at which the distance between the drone and the target palm shortens is greater than the landing speed of the drone.
68. A drone according to claim 51, further including an atmosphere light.
69. A drone according to claim 68, wherein the drone is connected to a control device, and the processor is further configured to control the atmosphere light to emit light in a color corresponding to color information and/or in a mode corresponding to mode information, according to the color information and/or mode information sent by the control device.
70. The drone of claim 68, wherein the processor is further configured to control the ambience lamp to emit light in a first color or a first mode before recognition is completed and to control the ambience lamp to emit light in a second color or a second mode after recognition is completed when the image captured by the camera is subjected to face recognition.
71. The drone of claim 68, wherein the processor is further configured to, when the drone takes off, control the ambience lamp to emit light in a third color or a third mode before a blade speed of the drone reaches a target speed, and to control the ambience lamp to emit light in a fourth color or a fourth mode after the blade speed reaches the target speed.
72. A drone as claimed in claim 68, wherein the lighting mode of the atmosphere light includes a breathing mode.
73. The drone of claim 72, wherein the processor is further configured to determine a breathing rate of the ambience light based on a remaining power of the drone.
74. The drone of claim 72, wherein the processor is further configured to obtain tempo information for the target music; and adjusting the breathing frequency of the atmosphere lamp according to the rhythm information.
75. The drone of claim 74, wherein the tempo information of the target music is obtained from a control device of the drone.
76. A computer-readable storage medium storing a computer program which, when executed by a processor, implements the unmanned aerial vehicle control method of any one of claims 1-25.
CN202080071047.0A 2020-11-05 2020-11-05 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium Pending CN114585985A (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/126757 WO2022094860A1 (en) 2020-11-05 2020-11-05 Unmanned aerial vehicle control method and device, unmanned aerial vehicle, and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN114585985A true CN114585985A (en) 2022-06-03

Family

ID=81458416

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202080071047.0A Pending CN114585985A (en) 2020-11-05 2020-11-05 Unmanned aerial vehicle control method and device, unmanned aerial vehicle and computer readable storage medium

Country Status (2)

Country Link
CN (1) CN114585985A (en)
WO (1) WO2022094860A1 (en)

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9505493B2 (en) * 2014-03-21 2016-11-29 Brandon Borko System for automatic takeoff and landing by interception of small UAVs
US10409276B2 (en) * 2016-12-21 2019-09-10 Hangzhou Zero Zero Technology Co., Ltd. System and method for controller-free user drone interaction
CN108521812A (en) * 2017-05-19 2018-09-11 深圳市大疆创新科技有限公司 Control method, unmanned plane and the machine readable storage medium of unmanned plane
CN114879715A (en) * 2018-01-23 2022-08-09 深圳市大疆创新科技有限公司 Unmanned aerial vehicle control method and device and unmanned aerial vehicle
US11148803B2 (en) * 2018-08-22 2021-10-19 Ford Global Technologies, Llc Take off and landing system for drone for use with an autonomous vehicle

Also Published As

Publication number Publication date
WO2022094860A1 (en) 2022-05-12


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination