WO2018209702A1 - Control method for a drone, drone, and machine-readable storage medium - Google Patents

Control method for a drone, drone, and machine-readable storage medium

Info

Publication number
WO2018209702A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
preset target
preset
return
controlling
Prior art date
Application number
PCT/CN2017/085138
Other languages
English (en)
French (fr)
Inventor
张立天
刘昂
胡骁
Original Assignee
SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Application filed by SZ DJI Technology Co., Ltd. (深圳市大疆创新科技有限公司)
Priority to PCT/CN2017/085138 (WO2018209702A1)
Priority to CN201780004588.XA (CN108521812A)
Publication of WO2018209702A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G05D1/12 Target-seeking control
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast

Definitions

  • Embodiments of the present invention relate to the field of unmanned aerial vehicles, and particularly to a control method for a drone, a drone, and a machine readable storage medium.
  • UAVs (unmanned aerial vehicles) are used in applications such as plant protection, aerial photography, forest fire monitoring, and so on.
  • the present application discloses a control method for a drone, a drone, and a machine readable storage medium.
  • According to one aspect, a control method for a drone is provided, comprising: controlling the drone to take off based on a user operation; controlling the drone to fly to a specific location; after the drone flies to the specific location, triggering a photographing operation of a photographing device on a preset target; and, after the photographing device completes the photographing operation, automatically controlling the drone to return to a return position.
  • According to another aspect, a drone is provided. The drone is mounted with a photographing device and includes a processor configured to: control the drone to take off based on a user operation; control the drone to fly to a specific location; after the drone flies to the specific location, trigger a photographing operation of the photographing device on a preset target; and, after the photographing device completes the photographing operation, automatically control the drone to return to a return position.
  • According to yet another aspect, a machine readable storage medium is provided, on which computer instructions are stored. When executed, the computer instructions perform the processing of: controlling the drone to take off based on a user operation; controlling the drone to fly to a specific location; after the drone flies to the specific location, triggering a photographing operation of the photographing device on the preset target; and, after the photographing device completes the photographing operation, automatically controlling the drone to return to the return position.
  • In the embodiments above, the drone can be automatically controlled to take off according to a preset condition, automatically controlled to fly to a specific location, and, after flying to the specific location, the photographing device is triggered to shoot the preset target; subsequently, the drone can also be automatically controlled to return to the return position. This makes the whole shooting process complete in one go, improves the user experience, and reduces the battery life consumed by manual operation.
  • Figure 1 is a schematic architectural diagram of an unmanned flight system
  • FIG. 2 is a schematic flow chart of a control method of a drone
  • FIG. 3 is another schematic flowchart of a control method of a drone
  • FIG. 4 is a block diagram of one embodiment of a drone.
  • Embodiments of the present invention provide a control method for a drone, a drone, and a machine readable storage medium. It will be apparent to those skilled in the art that embodiments of the present invention can be applied to various types of drones. For example, it can be a small drone.
  • The drone may be a rotorcraft, for example a multi-rotor aircraft propelled through the air by a plurality of propulsion devices. Embodiments of the invention are not limited thereto, and the drone may be another type of drone or movable device.
  • Figure 1 is a schematic architectural diagram of an unmanned flight system. This embodiment is described by taking a rotorcraft as an example.
  • The unmanned flight system 100 can include a UAV 110, a pan/tilt head 120, a display device 130, and a manipulation device 140.
  • the UAV 110 may include a power system 150, a flight control system 160, and a rack 170.
  • the UAV 110 can communicate wirelessly with the manipulation device 140 and the display device 130.
  • Rack 170 can include a fuselage and a stand (also known as a landing gear).
  • the fuselage may include a center frame and one or more arms coupled to the center frame, the one or more arms extending radially from the center frame.
  • the stand is connected to the fuselage for supporting the UAV 110 when it is landing.
  • The power system 150 may include an electronic speed controller (ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153. The motor 152 is coupled between the ESC 151 and the propeller 153, and the motor 152 and the propeller 153 are disposed on the corresponding arm. The ESC 151 is configured to receive a driving signal generated by the flight controller 161 and to provide a driving current to the motor 152 according to the driving signal, so as to control the rotational speed of the motor 152.
  • Motor 152 is used to drive propeller rotation to power the flight of UAV 110, which enables UAV 110 to achieve one or more degrees of freedom of motion.
  • the UAV 110 can be rotated about one or more axes of rotation.
  • The above-described rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor 152 can be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brushed motor.
  • Flight control system 160 may include flight controller 161 and sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the UAV, that is, the position information and state information of the UAV 110 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 may include, for example, at least one of a gyroscope, an electronic compass, an IMU (Inertial Measurement Unit), a vision sensor, a global navigation satellite system, and a barometer.
  • the global navigation satellite system may be a GPS (Global Positioning System).
  • The flight controller 161 is used to control the flight of the UAV 110; for example, the flight of the UAV 110 can be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the UAV 110 in accordance with pre-programmed program instructions, or may control the UAV 110 in response to one or more control commands from the manipulation device 140.
  • the pan/tilt 120 can include an ESC 121 and a motor 122.
  • the pan/tilt is used to carry the photographing device 123.
  • The flight controller 161 can control the motion of the pan/tilt 120 through the ESC 121 and the motor 122.
  • The pan/tilt 120 may further include its own controller for controlling the movement of the pan/tilt 120 by controlling the ESC 121 and the motor 122.
  • The pan/tilt 120 may be independent of the UAV 110, or may be a part of the UAV 110.
  • the motor 122 can be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brushed motor.
  • The pan/tilt 120 may be located at the top of the aircraft or at the bottom of the aircraft.
  • The photographing device 123 may be, for example, a device for capturing an image, such as a camera or a video camera. The photographing device 123 may communicate with the flight controller and perform photographing under the control of the flight controller.
  • The display device 130 is located at the ground end of the unmanned flight system 100, can communicate with the UAV 110 wirelessly, and can be used to display attitude information of the UAV 110. In addition, an image taken by the photographing device can also be displayed on the display device 130. It should be understood that the display device 130 may be a stand-alone device or may be integrated in the manipulation device 140.
  • The manipulation device 140 is located at the ground end of the unmanned flight system 100 and can communicate with the UAV 110 wirelessly for remote manipulation of the UAV 110.
  • the manipulation device may be, for example, a remote controller or a user terminal equipped with an APP (Application) that controls the UAV, for example, a smartphone, a tablet, or the like.
  • Receiving the user's input through the manipulation device may refer to manipulating the UAV 110 through an input device such as a dial wheel, a button, a key, a joystick, or a user interface (UI) on the user terminal.
  • FIG. 2 is a schematic flow chart of a control method of a drone.
  • The control method of FIG. 2 may be performed by a control apparatus, such as the flight controller 161 of FIG. 1, although the embodiment is not limited thereto; the method may also be performed by another controller carried on the drone.
  • the control method of FIG. 2 includes the following.
  • Step 210: controlling the drone to take off based on a user operation.
  • The drone can first be unlocked based on a user operation, i.e., the drone enters a state in which the propellers are allowed to spin up for takeoff.
  • The propeller unlocking may be triggered based on a first operation of the user, and the drone is controlled to take off, once the propellers are unlocked, based on a second operation of the user.
  • The user's first operation may be a long press, a double tap, or another prescribed operation of a button on the drone, wherein the button may be a physical button on the drone, such as a power button.
  • the second operation may be that the user faces the photographing device toward the face to perform face recognition.
  • Alternatively, the triggering and the completion of the propeller unlocking of the drone may both be accomplished based on the first operation or the second operation of the user, so that the drone enters a state in which the propellers can spin up for takeoff.
  • In an embodiment, the photographing device captures an image to obtain a target image; when the target image satisfies a preset condition, the propeller unlocking of the drone is completed.
  • For example, the user can hold the drone with one or both hands, straighten the arm, and level the drone, so that the shooting device mounted on the drone faces the user's face or the face of another person.
  • The shooting device is turned on after the drone is powered on, and the drone inspects the image captured by the shooting device. When the detected image satisfies the preset condition, the drone completes the propeller unlocking.
  • Alternatively, the propeller unlocking is completed when the drone detects that it is in a particular posture and that the captured image satisfies the preset condition. For example, when the drone detects that it is currently at a horizontal standstill and the detected image satisfies the preset condition, the propeller unlocking is completed. Alternatively, when the drone detects that it is at a horizontal standstill, the face recognition function is turned on, and when the detected image satisfies the preset condition, the propeller unlocking is completed.
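  • The unlock gating just described (a level attitude plus an image that satisfies the face condition) can be sketched as a simple check. This is a minimal illustration; the input names and threshold values are assumptions, not values from this application:

```python
def propellers_may_unlock(roll_deg, pitch_deg, face_similarity,
                          level_tol_deg=5.0, similarity_threshold=0.8):
    """Return True when the drone is level and the detected face matches.

    roll_deg, pitch_deg: current attitude from the onboard sensors (degrees).
    face_similarity: similarity of the detected face to the preset face, in [0, 1].
    """
    is_level = abs(roll_deg) <= level_tol_deg and abs(pitch_deg) <= level_tol_deg
    face_matches = face_similarity >= similarity_threshold
    return is_level and face_matches
```

For example, a drone held flat with a well-matched face unlocks, while a tilted drone or a poorly matched face does not.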
  • the preset conditions are different depending on the usage mode of the drone.
  • the preset condition may include a face in the target image;
  • the preset condition may include a face in the target image, and the face The similarity with the preset face image reaches a preset similarity threshold.
  • The drone can detect the angle of each face in the target image (for example, whether it is a frontal face or a profile), as well as the position and size of each face in the target image, and select one face from the plurality of faces according to the detection result; for example, it may select a frontal face positioned near the middle of the image, or the frontal face that occupies the largest area in the target image. Subsequently, the drone can perform further similarity detection on the selected face.
  • After determining that the drone is successfully unlocked, the drone can be automatically controlled to take off without further user operation. In an embodiment, after the drone is unlocked, the drone can again determine whether it is currently able to take off, and if so, automatically take off.
  • Alternatively, after determining that the drone is successfully unlocked, the drone can be controlled to take off based on a further operation of the user.
  • The drone can activate the power unit, control the power unit to rotate at an idle speed, and control the drone to take off after the power unit has idled for a preset time. By idling the drone's power unit for a certain period of time, false starts of the drone can be avoided.
  • the power unit may be controlled to rotate at an idle speed after the drone is horizontally placed for more than a preset period of time.
  • For example, the user places the drone horizontally (for example, flat in the palm of the hand); when the drone determines, from the attitude information detected by its sensors, that it is in a horizontal state (for example, the attitude angle is zero), the drone starts automatically and controls the power unit to rotate at an idle speed. Further, the drone can control itself to take off after the power unit has idled for a preset time.
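  • The idle-before-takeoff safeguard above amounts to a small timed state machine. A sketch, with an illustrative idle duration and an injectable clock (both assumptions, chosen for testability):

```python
import time

class TakeoffSequencer:
    """Require the motors to idle for a preset time before takeoff."""

    def __init__(self, idle_duration_s=3.0, clock=time.monotonic):
        self.idle_duration_s = idle_duration_s
        self.clock = clock
        self.idle_started_at = None

    def start_idle(self):
        # The power unit begins spinning at idle speed (no lift yet).
        self.idle_started_at = self.clock()

    def may_take_off(self):
        # Takeoff is permitted only after the full idle period elapses,
        # which guards against false starts.
        if self.idle_started_at is None:
            return False
        return self.clock() - self.idle_started_at >= self.idle_duration_s
```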
  • Alternatively, the power device may be controlled to rotate at an idle speed only when a signal permitting idle rotation is received. For safety reasons, for example, such a signal may be generated on the drone itself or received from an external device. This embodiment can combine these signals with the automatic start of the drone to improve the safety of the automatic start.
  • Controlling the drone to take off based on a user operation may refer to controlling the drone to take off based on the user operation after determining that the drone has completed the propeller unlocking.
  • The user operation may be a throwing action performed by the user on the drone; that is, after determining that the drone is successfully unlocked, if a throwing action on the drone is detected, the drone may be controlled to take off based on that action. Specifically, motion data of the drone is detected while the drone is being thrown, and when the motion data meets an automatic start condition, the power device of the drone is automatically activated.
  • the motion data may include a distance that the drone is thrown.
  • the motion data meets an automatic start condition including: the distance that the drone is thrown is greater than or equal to a preset distance threshold.
  • the preset distance threshold may be 0 or a safe distance that prevents the drone from injuring the user. Therefore, when the distance between the drone and the user is a safe distance, the drone is controlled to take off, and damage to the user can be avoided.
  • The motion data may include a vertical speed or a speed of the drone, in which case the motion data satisfying the automatic start condition includes: the vertical speed or speed of the drone being less than or equal to a preset speed threshold, where the preset speed threshold can be 0 or another value close to zero. Because the power device is started only after the vertical speed or speed has fallen to or below the preset speed threshold, the drone's flight is more stable at startup.
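  • The two automatic-start conditions above (thrown beyond a safe distance, and vertical speed decayed to near zero) can be combined into one check. The threshold values here are illustrative assumptions, not values from this application:

```python
def auto_start_ok(thrown_distance_m, vertical_speed_mps,
                  min_distance_m=1.5, max_speed_mps=0.1):
    """Permit automatic motor start only when the drone is a safe
    distance from the user AND its vertical speed is close to zero
    (e.g., near the apex of a vertical toss)."""
    far_enough = thrown_distance_m >= min_distance_m
    slow_enough = abs(vertical_speed_mps) <= max_speed_mps
    return far_enough and slow_enough
```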
  • The drone can also recognize the user's throwing action and select a suitable flight trajectory. Since the user can indicate the drone's flight path with a simple action, the user experience is improved and the battery life consumed by manual operation is further reduced.
  • the user's throwing action on the drone may be vertical throwing, oblique upward throwing, or flat throwing.
  • the drone may determine whether the preset automatic start condition is satisfied based on a user operation, and the drone automatically takes off when the preset automatic start condition is met.
  • The automatic take-off of the drone means that, when the preset automatic start condition is met, the drone can automatically switch on its start circuit and control its power device to start working, without a manual start via a button or key.
  • Since the drone can be started automatically according to the preset automatic start condition, the battery life consumed by manual operation is reduced, and the user experience is improved.
  • the drone can be automatically started as follows:
  • The user performs a throwing action on the drone; the drone detects its motion data while it is being thrown, and when the motion data meets the automatic start condition, the power device of the drone is automatically activated. For a detailed description of automatically starting the drone's power device according to the user's throwing action after the propellers are successfully unlocked, refer to the description above; it is not repeated here.
  • Step 220: controlling the drone to fly to a specific location.
  • The specific location may be set at the factory or pre-set by the user before the drone takes off, and the specific location may be represented as a position having a relative positional relationship with the preset target.
  • A user interface element such as a button, a text box, or a selection box for inputting a position parameter may be provided on the user interface, so that the user can select or input, through the user interface of an external device (e.g., a user terminal or a remote controller), the relative positional relationship between the drone's shooting position and the current position of the preset target; the flight controller can obtain the position parameter selected or input by the user through a communication interface with the external device. In this way, the user can accurately select or input the position parameter, the drone can fly precisely to the specific location corresponding to that parameter, and the photographing device can capture the image the user desires.
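  • One way to realize the relative position parameter above is to add a user-chosen offset to the preset target's current position. The local east-north-up coordinate convention is an assumption for illustration:

```python
def specific_location(target_pos, relative_offset):
    """Compute the shooting position as target position plus relative offset.

    target_pos: (east, north, up) of the preset target, in metres.
    relative_offset: user-selected offset (east, north, up), in metres.
    """
    return tuple(t + o for t, o in zip(target_pos, relative_offset))
```

For example, an offset of (0.0, -3.0, 2.0) places the drone 3 m south of and 2 m above the target.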
  • the specific location may be a location determined based on the acquired composition rule and the current location of the preset target.
  • The composition rule may include one or more of: the position of the preset target in the shooting picture, the face of the preset target in the shooting picture, and the completeness of the face of the preset target in the shooting picture. The composition rule may include one of the following: balanced composition, symmetric composition, diagonal composition, triangular composition, nine-square-grid (rule-of-thirds) composition, centripetal composition, bisecting composition, the face being frontal in the picture, or the face being in profile in the picture.
  • The acquired composition rule may be a preset composition rule received from an external device, or a composition rule input by the user and received from an external device.
  • For example, movable user interface elements for inputting a composition rule, such as selection boxes, border lines, and text boxes, may be provided on the user interface so that the user can customize the composition rule. In this way, the user can accurately input the composition rule, and the photographing device mounted on the drone can capture the image the user desires.
  • When the drone is controlled to take off based on the user's throwing action, not only can the composition rule be obtained, but also the flight distance after takeoff can be determined based on the composition rule, the flight direction can be determined based on the direction of the throw, and the specific location can be determined from the flight distance and the flight direction.
  • the specific location may be a specific location set based on the shooting information for the preset target, wherein the shooting information may be used to indicate a range of the preset target in the captured image.
  • the shooting information may be a ratio of a preset target in the shooting screen or a range in which the preset target occupies in the shooting screen.
  • the above shooting information may be a scene selected by the user.
  • the scenes are divided into three categories: large scenes, medium scenes and small scenes according to the proportion or range of the preset targets in the shooting screen.
  • Each scene can be further subdivided.
  • the larger the scene the smaller the proportion or range of the preset target in the shot, and vice versa.
  • For portrait photography, the scene can be determined according to the proportion or range of the preset target in the shooting picture, divided into a full-body image, a large half-length portrait, a half-length portrait, a bust, a head-and-shoulders portrait, and a large headshot.
  • Accordingly, the shooting information may include at least one of a large scene, a medium scene, and a small scene; or the shooting information may include at least one of a full-body image, a large half-length portrait, a half-length portrait, a bust, a head-and-shoulders portrait, and a large headshot.
  • The correspondence between the different scenes and the proportion or range that the preset target occupies in the shooting picture may be preset, so that the proportion or range of the preset target in the picture is determined according to the scene selected by the user. Alternatively, a box drawn by the user on a touch screen may indicate the range that the preset target should occupy in the captured picture.
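  • The scene-to-proportion correspondence described above can be kept as a simple lookup table. The ratio values below are illustrative assumptions, not values from this application:

```python
# Approximate fraction of the frame the preset target should occupy,
# per scene type (hypothetical values for illustration).
SCENE_TO_SUBJECT_RATIO = {
    "large_scene": 0.15,   # target small in the frame
    "medium_scene": 0.45,
    "small_scene": 0.85,   # target fills most of the frame
}

def subject_ratio_for(scene):
    """Look up the target proportion for a user-selected scene."""
    try:
        return SCENE_TO_SUBJECT_RATIO[scene]
    except KeyError:
        raise ValueError(f"unknown scene: {scene!r}")
```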
  • Step 230: after the drone flies to the specific location, triggering the shooting operation of the shooting device on the preset target.
  • The drone may first control the composition of the photographing device, and trigger the photographing device to shoot when the imaging of the preset target in the shooting picture of the photographing device satisfies a preset composition rule.
  • Controlling the composition of the photographing device so that the imaging of the preset target in the shooting picture satisfies the preset composition rule includes: controlling at least one of the flight attitude of the drone, the motion of the pan/tilt carrying the photographing device, and the focal length of the photographing device, so that the position of the preset target in the shooting picture satisfies the preset composition rule.
  • An image currently containing the preset target may be captured by the photographing device, and the position that the preset target occupies in the picture may be determined by image recognition, thereby determining whether the position of the preset target in the picture satisfies the composition rule. For example, if the user selects the nine-square-grid composition, the preset target can be imaged at one of the four intersections of the grid.
  • The nine-square-grid pattern can be further subdivided into four modes corresponding to the four intersection points, so that the user can further select at which intersection the preset target is imaged.
  • For example, it may be determined whether the center of the preset target is located at a certain intersection of the grid, or the distance and orientation of the center of the preset target from that intersection may be determined, and the composition adjusted until the center of the preset target coincides with the intersection.
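  • Checking the nine-square-grid rule can be reduced to measuring the offset of the target's center from the nearest of the four grid intersections; a controller then drives that offset toward zero by adjusting attitude, pan/tilt, or focal length. This sketch assumes normalized image coordinates in [0, 1]:

```python
def thirds_offset(center_x, center_y):
    """Return (dx, dy) from the target center to the nearest
    rule-of-thirds intersection, in normalized image coordinates.

    The four intersections lie at x, y in {1/3, 2/3}.
    """
    best = None
    for ix in (1 / 3, 2 / 3):
        for iy in (1 / 3, 2 / 3):
            dx, dy = ix - center_x, iy - center_y
            d2 = dx * dx + dy * dy
            if best is None or d2 < best[0]:
                best = (d2, dx, dy)
    return best[1], best[2]
```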
  • The drone controls the composition of the photographing device, and when the imaging of the preset target in the shooting picture satisfies the preset composition rule, the drone may issue indication information of completed composition to the user, for example by displaying composition-complete indication information on the APP interface, by controlling an indicator light to flash in a predetermined pattern, or by emitting a sound.
  • Alternatively, the drone controls the composition of the photographing device, and when the imaging of the preset target in the shooting picture satisfies the preset composition rule, or after the imaging satisfies the rule and the drone has sent the indication information to the user, the shooting device automatically triggers the shooting operation on the preset target.
  • Alternatively, the drone controls the composition of the photographing device, and when the imaging of the preset target in the shooting picture satisfies the preset composition rule, the drone triggers the shooting device to perform the shooting operation on the preset target based on the user's operation.
  • the user sends a shooting signal to the drone through an external device input such as a remote controller or a mobile terminal equipped with an APP that controls the drone.
  • the user sends a shooting signal to the drone through sounds, gestures, and the like. In this way, the drone can trigger the photographing operation of the photographing device on the preset target based on the photographing signal.
  • Alternatively, indication information that the drone has flown to the specific location may be returned to the external device (for example, via an LED light, a sound, or the like), and the user may then input a shooting signal through the external device according to the indication information.
  • the flight controller can acquire a photographing signal input by the user through a communication interface with the external device. In this way, the drone can trigger the photographing operation of the photographing device on the preset target based on the photographing signal.
  • The shooting operation of the shooting device on the preset target may be a still-photographing operation on the preset target.
  • The user may input in advance, through the external device, a first restriction condition of the photographing operation, which may include the number of shots; the flight controller may obtain this first restriction condition through the communication interface with the external device and, according to the first restriction condition, control the photographing operation of the photographing device on the preset target.
  • Alternatively, the shooting operation of the shooting device on the preset target may be a video recording operation on the preset target.
  • The user may input in advance, through the external device, a second restriction condition of the shooting operation, which may include a preset shooting duration and a specific trajectory to be flown while shooting; the flight controller may obtain this second restriction condition through the communication interface between the drone and the external device and, according to the second restriction condition, control the shooting operation of the shooting device on the preset target. Among them:
  • The drone can control the shooting device to shoot the preset target at the specific location, and when the actual shooting time reaches the preset shooting duration, the drone can control the shooting device to stop shooting.
  • Alternatively, upon receiving a stop-shooting signal, the drone can directly control the shooting device to stop shooting, where the stop-shooting signal can be input by the user through the external device, or sent by the user by performing a specified gesture indicating that shooting should stop.
  • Alternatively, the drone is controlled to fly along the specific trajectory, and when the drone flies to the end point of the specific trajectory, the shooting device is triggered to stop the recording of the preset target; the drone can also directly control the shooting device to stop shooting.
  • The specific trajectory may be determined according to an input received from the external device. For example, a user interface element such as a button, a text box, or a selection box for inputting a movement track may be provided on the user interface, so that the user can input or select the movement track, where the movement track is the track of the preset target within the camera picture of the photographing device; subsequently, the drone or the external device can determine the specific trajectory for the drone's flight based on that movement track.
  • the specific trajectory may be preset; for example, the specific trajectory may include at least one of an orbit, a zoom-in, a zoom-out, and an S shape.
  • when the specific trajectory is an orbit, the drone can fly around the preset target while shooting during the imaging process.
  • when the specific trajectory is a zoom-in, the drone can fly toward the preset target while shooting during the imaging process.
  • the triggering the photographing device to perform the photographing operation on the preset target comprises: controlling the photographing device to adjust the focal length of the photographing device according to the depth of field principle, and photographing the preset target by using the adjusted focal length.
  • the shooting device can be triggered to adjust the focal length according to the depth-of-field principle. The foreground depth ΔL1, the back depth of field ΔL2, and the total depth of field ΔL are given by:

  ΔL1 = F·δ·L² / (f² + F·δ·L)    (1)
  ΔL2 = F·δ·L² / (f² − F·δ·L)    (2)
  ΔL = ΔL1 + ΔL2 = 2·f²·F·δ·L² / (f⁴ − F²·δ²·L²)    (3)

  where:
  • δ is the permissible circle of confusion diameter
  • f is the lens focal length
  • F is the lens aperture value
  • L is the focus distance
  • ΔL1 is the foreground depth
  • ΔL2 is the back depth of field
  • ΔL is the depth of field.

  As shown by equations (1), (2), and (3), the foreground depth is shallower than the back depth of field. It is therefore advisable to focus at the first 1/3 of the subject depth, where 1/3 is an empirical value. For example, when taking a group photo of five people, you can focus on the person in the middle of the second row, making more effective use of the foreground depth and the back depth of field and obtaining a clear group photo.
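As a quick numerical check, equations (1) to (3) can be evaluated directly. The sketch below is illustrative only; the focal length, aperture, and focus distance are assumed example values, not parameters from the patent:

```python
# Depth-of-field helper following equations (1)-(3) above.
# All numeric parameters below are hypothetical example values.

def depth_of_field(f_mm, F, L_mm, delta_mm=0.03):
    """Return (foreground depth, back depth, total depth) in millimetres.

    f_mm     -- lens focal length f
    F        -- lens aperture value (f-number)
    L_mm     -- focus distance L
    delta_mm -- permissible circle-of-confusion diameter (delta)
    """
    near = (F * delta_mm * L_mm ** 2) / (f_mm ** 2 + F * delta_mm * L_mm)  # eq. (1)
    far = (F * delta_mm * L_mm ** 2) / (f_mm ** 2 - F * delta_mm * L_mm)   # eq. (2)
    return near, far, near + far                                           # eq. (3)

# The foreground depth comes out shallower than the back depth of field,
# which is the basis of the focus-at-the-first-third rule of thumb.
dl1, dl2, dl = depth_of_field(f_mm=35.0, F=2.8, L_mm=3000.0)
```

For the example values above, the foreground depth is indeed smaller than the back depth of field, consistent with focusing at the first third of the subject depth.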
  • the triggering the photographing device to perform the photographing operation on the preset target comprises: detecting the environmental condition information and/or the posture information of the preset target, and adjusting the photographing angle according to the environmental condition information and/or the posture information of the preset target.
  • the environmental condition information may be, for example, information indicating backlighting, weather conditions, light and darkness, and the like.
  • the posture information of the preset target may be, for example, information indicating a posture such as turning the head, standing, or sitting.
  • the specific shooting angles may include, for example, a frontal shot, a side shot, an overhead shot, and the like.
  • the shooting angle can be adjusted so that the front side of the preset target can be photographed.
  • the above functions may be set or selected by the user through a user interface of the external device (eg, a user interface on the user terminal) before the drone is launched.
  • the shooting angle can be adaptively adjusted according to the environmental condition information and/or the posture information of the preset target, making the shooting process more intelligent, reducing manual interference during shooting, and improving the user experience.
  • Step 240 After the photographing device completes the photographing operation, the drone is automatically controlled to return to the returning position.
  • the drone is automatically controlled to return from the starting position of the return flight to the return position, where the starting position of the return flight may be one of the following three situations:
  • if the drone stops shooting the preset target according to a stop-shooting signal sent by the user, then the starting position of the return flight is the position of the drone at the moment it stops shooting the preset target.
  • the returning position is a take-off position of the drone, and the take-off position may be recorded when the drone is taken off.
  • the take-off position and the returning position may be coordinates in the north-east coordinate system.
  • the take-off position of the drone is the position of the preset target when the drone takes off. In this case, the drone takes off from the hand of the preset target, and the difference between the positions of the drone and the preset target can be ignored.
  • the take-off position of the drone can be determined from the position of the preset target at take-off. For example, as described above, the drone can take off by face scanning; since the preset target straightens the arm, the take-off position of the drone can be determined from a typical arm length and the position of the preset target.
  • the distance between the returning position and the current position of the preset target does not exceed a preset distance threshold.
  • a distance threshold may be preset, and the drone may determine the current distance of the preset target relative to the drone in real time based on TOF (Time of Flight) technology or 3D TOF technology during the return flight. If the current distance is greater than the distance threshold, the drone can be controlled to further approach the preset target, and when the current distance is not greater than the distance threshold, it can be determined that the drone has returned to the return position.
  • the present application is not limited to determining the current distance of the preset target relative to the drone in real time based on the TOF technology or the 3D TOF technology; other stereo environment-sensing sensors, such as a binocular camera, may also be used to determine the current distance of the preset target relative to the drone.
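The threshold-based return logic described above can be sketched as a simple control loop. The callback names `measure_distance` and `fly_toward_target` are assumptions standing in for the drone's real TOF sensing and motion APIs, which the text does not specify:

```python
# Hypothetical sketch of the TOF-based return-to-home loop: keep approaching
# the preset target until the measured distance is not greater than the
# preset distance threshold.

def return_to_target(measure_distance, fly_toward_target,
                     distance_threshold_m=2.0, max_steps=1000):
    """Return True once within the threshold, False if max_steps is exhausted."""
    for _ in range(max_steps):
        if measure_distance() <= distance_threshold_m:  # "not greater than" -> arrived
            return True
        fly_toward_target()  # move one step closer, then re-measure
    return False

# Demo with a simulated sensor whose readings shrink as the drone approaches.
readings = iter([10.0, 6.0, 3.0, 1.5])
arrived = return_to_target(lambda: next(readings), lambda: None)
```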
  • the user's handheld or wearable external device transmits its current location to the drone, so that the drone uses that current location as the return position.
  • the external device can send a signal carrying the current location of the external device to the drone at a certain frequency, so that the drone can return according to the current location of the external device.
  • the external device may be a remote controller, a terminal device equipped with an APP for controlling the drone, a smart watch for controlling the drone, or another device capable of communicating with the drone.
  • the drone can be controlled to return to the return position based on the image captured by the camera.
  • controlling the drone to return to the return position based on the image captured by the photographing device may include: controlling the drone to return to the return position based on the current position and current size of the preset target in the captured image, and the preset position and preset size of the preset target in the image.
  • the photographing device can capture images of the preset target in real time during the return flight, and the drone determines the current position and current size of the preset target in the captured image. For example, if the current size is smaller than the preset size of the preset target in the image, the drone is controlled to approach the preset target; if the current size is larger than the preset size, the drone is controlled to move away from the preset target; and if the current position is biased toward one edge of the captured image relative to the preset position, the drone is controlled to move toward that edge.
  • the above adjustment process can use a fixed step size or a variable step size. When the current position of the preset target in the captured image coincides with its preset position, and the current size of the preset target in the captured image matches its preset size, it is determined that the drone has flown to the return position.
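A minimal sketch of the position-and-size feedback just described; the function name, pixel tolerances, and the use of unit steps are illustrative assumptions rather than the patent's implementation:

```python
# Compare the preset target's current position/size in the frame against the
# preset position/size and emit one fixed-step correction per axis.
# A return value of (0, 0, 0) means the drone is at the return position.

def image_feedback_command(cur_pos, cur_size, preset_pos, preset_size,
                           pos_tol=5, size_tol=0.05):
    """cur_pos/preset_pos: (x, y) pixels; cur_size/preset_size: frame fraction."""
    dx = dy = dz = 0
    if cur_pos[0] < preset_pos[0] - pos_tol:    # target drifted left
        dx = -1
    elif cur_pos[0] > preset_pos[0] + pos_tol:  # target drifted right
        dx = 1
    if cur_pos[1] < preset_pos[1] - pos_tol:
        dy = -1
    elif cur_pos[1] > preset_pos[1] + pos_tol:
        dy = 1
    if cur_size < preset_size - size_tol:       # target too small: approach it
        dz = -1
    elif cur_size > preset_size + size_tol:     # target too large: back away
        dz = 1
    return dx, dy, dz
```

Replacing the unit steps with values proportional to the error would give the variable-step variant mentioned above.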
  • controlling the drone to return to the return position based on the image captured by the photographing device may include: determining the coordinates of the preset target in a specific coordinate system based on the current position of the preset target in the captured image; and controlling the drone to return to the return position based on the difference between the coordinates of the preset target in the specific coordinate system and the current coordinates of the drone, and the preset coordinate difference between the preset target and the drone.
  • the specific coordinate system can be the North East coordinate system.
  • the photographing device can capture the image of the preset target in real time during the returning process, then the photographing device determines the current position of the preset target in the captured image, and can determine the relative positional relationship between the current position and the background in the image.
  • the coordinates of the preset target in the specific coordinate system can be derived with the aid of the position sensor in the drone. The difference between the coordinates of the preset target in the specific coordinate system and the current coordinates of the drone is then computed to obtain the coordinate difference, and the drone is controlled to return based on this coordinate difference and the preset coordinate difference. Finally, when the coordinate difference matches the preset coordinate difference, the drone has flown to the return position.
  • the flight process of the drone returning to the return position may include: controlling the drone to descend from the current position to be in the same horizontal plane as the return position, and controlling the drone to fly along the horizontal plane to the return position.
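The two-phase flight just described (descend to the plane of the return position, then fly horizontally) can be sketched as a tiny waypoint planner; the (x, y, z) coordinate convention with z as altitude is an assumption:

```python
# Plan the return flight as two legs: a vertical descent to the return
# position's horizontal plane, then a horizontal leg to the return position.

def plan_return_path(current, home):
    """current, home: (x, y, z) positions in a shared frame, z = altitude."""
    descend_point = (current[0], current[1], home[2])  # drop straight down
    return [descend_point, home]                       # then fly level to home
```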
  • during shooting, the positions and heights of obstacles can be obtained by a position sensor, for example GPS, or a visual sensor, and recorded. The return path can then be planned to bypass an obstacle; if an obstacle cannot be bypassed, the drone can try to gain altitude to fly over it.
  • the nose can always be oriented in the forward direction during flight to ensure flight safety.
  • when the drone flies to the return position and detects a palm below, the drone can be controlled to land on the palm; after landing on the palm, the drone can be controlled to stop its propellers, and further, to fold its propellers.
  • the drone can be automatically controlled to take off according to preset conditions and to fly to a specific position; after arriving at the specific position, the shooting device is triggered to shoot the preset target, and subsequently the drone can be automatically controlled to return to the return position. The whole shooting process is thus completed in one go, improving the user experience and reducing the drain on the drone's battery life caused by manual operation.
  • FIG. 3 is another schematic flow chart of a control method of a drone.
  • the control method of FIG. 3 is an example of the method of FIG. 2.
  • the control method of FIG. 3 includes the following contents:
  • Step 310 Control the drone to perform face recognition based on user operations.
  • face recognition can first be performed based on the user operation to complete propeller unlocking of the drone, that is, to bring the drone into a state in which the propellers can spin up for take-off.
  • the user can hold the drone with one hand or both hands, and straighten the arm, and level the drone so that the photographing device mounted on the drone faces the face.
  • the photographing device performs face recognition on the captured image, and when the detected face meets the preset condition, the drone completes propeller unlocking.
  • the face recognition function is activated when detecting that the user operates a button on the drone with a certain rule.
  • the button may be a physical button on the drone, such as a power button, or a virtual button on the drone.
  • the certain rule may be a long press, a double tap, or a long press and two short presses, and the like.
  • the face recognition function is activated when it is detected that the drone is in a specific posture. For example, after the drone detects that the user operates a button on the drone according to a certain rule, it detects whether it is currently horizontally stationary and, if so, activates the face recognition function.
  • the face recognition function is activated when detecting that the user operates a button on the drone with a certain rule and detects that the drone is in a specific posture.
  • the pan/tilt for mounting the photographing device is controlled to swing up and down, and/or the pan/tilt is controlled to swing left and right to collect a face image.
  • Step 320 Automatically control the drone to take off when the face recognition is successful.
  • when the drone performs face recognition in step 310 and detects that the target image acquired by the photographing device meets the preset condition, the drone automatically takes off. In an embodiment, when the target image acquired by the photographing device meets the preset condition during face recognition, the drone judges once more whether its current state satisfies the take-off conditions and, if so, automatically controls the drone to take off.
  • the preset conditions are different according to different usage modes of the drone.
  • the preset condition may include a human face in the target image;
  • the preset condition may include a human face in the target image, and the similarity between the face and the preset face image reaches a preset similarity threshold.
  • the drone can detect the angle of each face in the target image, such as whether it is a frontal face or a side face, as well as the position and size of each face in the target image, and select one of the faces according to the detection results, for example, a frontal face positioned near the center of the image, or a frontal face that is the largest in the target image. Subsequently, the drone can perform further similarity detection on the selected face.
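The selection rule above (prefer a frontal face, close to the image center, and large) can be sketched as a scoring function; the dictionary layout and the ordering of the criteria are assumptions for illustration:

```python
# Pick one face among several candidates: frontal faces first, then the one
# closest to the image center, then the largest.

def select_face(faces, image_center):
    """faces: dicts with 'frontal' (bool), 'center' (x, y), 'size' (pixels)."""

    def score(face):
        cx, cy = face["center"]
        dist = ((cx - image_center[0]) ** 2 + (cy - image_center[1]) ** 2) ** 0.5
        # Tuples compare left to right: frontal beats non-frontal, then
        # smaller center distance (negated), then larger size.
        return (1 if face["frontal"] else 0, -dist, face["size"])

    return max(faces, key=score) if faces else None
```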
  • Step 330 Automatically control the drone to fly to a specific location.
  • Step 340 Automatically control the composition of the photographing device.
  • at least one of the posture of the drone, the movement of the pan/tilt carrying the photographing device, and the focal length of the photographing device is used to control the composition of the photographing device, so that the position of the preset target in the shooting frame satisfies a preset composition rule.
  • the attitude of the drone can be adjusted by controlling the rotational speed of the propellers of the drone, so that the drone changes its roll, pan, and tilt postures. The movement of the pan/tilt can also be adjusted by controlling the rotation of its pan, translation, and tilt mechanisms.
  • the above adjustment and control will cause the photographing device to move with the drone or the pan/tilt relative to the preset target, thereby being able to adjust the composition of the preset target in the photographing screen.
  • the focal length of the shooting device can be adjusted during shooting to get a clear composition.
  • Step 350 When the imaging of the preset target in the shooting screen of the photographing device satisfies the preset composition rule, the photographing device is automatically controlled to perform a photographing operation on the preset target.
  • a shooting instruction is automatically output to the photographing apparatus, indicating that the preset target is photographed.
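As an illustration of testing a preset composition rule before the shooting instruction is issued, the sketch below uses the rule of thirds; the patent does not fix a particular rule, and the tolerance value is an assumption:

```python
# Check whether the target's center lies near one of the four rule-of-thirds
# intersection points; if so, the shooting instruction can be issued.

def satisfies_rule_of_thirds(target_center, frame_w, frame_h, tol=0.05):
    """tol is the allowed offset as a fraction of the frame dimensions."""
    xs = (frame_w / 3, 2 * frame_w / 3)
    ys = (frame_h / 3, 2 * frame_h / 3)
    tx, ty = target_center
    return any(abs(tx - x) <= tol * frame_w and abs(ty - y) <= tol * frame_h
               for x in xs for y in ys)
```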
  • Step 360 After the shooting device completes the shooting operation, the drone is automatically controlled to return to the returning position.
  • the drone can be automatically controlled to take off by face scanning and controlled to fly to a specific position. After the drone flies to the specific position, intelligent composition and shooting are performed according to the preset composition rule, which reduces manual interference with the drone during shooting, improves the user experience, and reduces the drain on the drone's battery life.
  • an embodiment of the present invention further provides a drone.
  • the drone 400 is mounted with a photographing device 410, and the drone 400 includes: a processor 420.
  • the processor 420 is configured to: control the drone to take off based on a user operation; control the drone to fly to a specific location; and after the drone flies to the specific location, trigger the photographing device to preset a target a photographing operation; after the photographing device completes the photographing operation, automatically controlling the drone to return to the returning position.
  • the return position is a takeoff position of the drone.
  • the distance between the returning position and the current position of the preset target does not exceed the preset distance threshold.
  • the processor 420 is configured to control the drone to return to a return position based on an image captured by the photographing device.
  • the processor 420 is configured to: control the drone to return to the return position based on the current position and current size of the preset target in the image captured by the photographing device, and the preset position and preset size of the preset target in the image.
  • the processor 420 is configured to: determine, according to a current position of the preset target in the captured image, coordinates of the preset target in a specific coordinate system; based on the preset target The coordinate difference between the coordinates in the specific coordinate system and the current coordinates of the drone, and the preset coordinate difference between the preset target and the drone control the drone to return to the return position.
  • the processor 420 is configured to: determine, according to a TOF technology, a current distance of the preset target relative to the drone; based on the current distance, and the preset target and the distance Threshold, controlling the drone to return to the return position.
  • the processor 420 is configured to: control the drone to descend from a current position to be in the same horizontal plane as the return position; and control the drone to fly along the horizontal plane to a return position.
  • the processor 420 is configured to: trigger the capturing device to capture an image based on a user operation to obtain a target image; and when the target image meets a preset condition, control the drone to take off.
  • the preset condition includes: the target image includes a human face; or the target image includes a human face, and the similarity between the human face and the preset facial image reaches a preset Similarity threshold.
  • the user operation is a throwing action of the user on the drone.
  • the processor 420 is configured to: perform propeller unlocking based on a user operation; and control the drone to take off when it is determined that propeller unlocking is successful.
  • the processor 420 is further configured to: acquire a position parameter input by a user, where the position parameter represents a relative positional relationship between a shooting position of the drone and the preset target; The location parameter and the current location of the preset target determine a particular location.
  • the processor 420 is further configured to: acquire a composition rule; determine a specific location based on the composition rule and a current location of the preset target.
  • the processor 420 is further configured to: acquire a composition rule; determine the flight distance of the drone after take-off based on the composition rule; determine the flight direction of the drone based on the direction in which the drone is thrown; and determine a specific position based on the flight distance and the flight direction.
  • the processor 420 is configured to: trigger a photographing operation of the photographing device on a preset target.
  • the processor 420 is configured to: trigger the image-capture operation of the shooting device on the preset target.
  • the processor 420 is further configured to: after triggering the imaging operation of the shooting device on the preset target, control the drone to fly along a specific trajectory; when the drone flies to the location When the end point of the specific track is described, the photographing device is triggered to stop the image capturing operation.
  • the processor 420 is further configured to: acquire a movement trajectory input by the user, where the movement trajectory is a movement trajectory of the preset target in a photographing screen of the photographing device; A specific trajectory of the drone is determined.
  • the processor 420 is configured to: control a composition of the photographing device; when the imaging of the preset target in the photographing screen of the photographing device meets a preset composition rule, Preset the target to shoot.
  • the processor 420 is configured to: acquire a shooting signal input by a user; and trigger a shooting operation of the shooting device on the preset target based on the shooting signal.
  • the processor 420 is further configured to: when detecting that the drone is hovering at the return position and that a palm is located under the drone, control the drone to land on the palm.
  • the embodiment of the present invention further provides a machine readable storage medium, where the machine readable storage medium can be located on a drone, and the machine readable storage medium stores a plurality of computer instructions.
  • the computer instructions are executed to: control the drone to take off based on user operations; control the drone to fly to a specific location; the drone flies to the specific location Afterwards, the photographing device is triggered to perform a photographing operation on the preset target; after the photographing device completes the photographing operation, the drone is automatically controlled to return to the returning position.
  • the return position is a takeoff position of the drone.
  • the distance between the returning position and the current position of the preset target does not exceed a preset distance threshold.
  • when the computer instructions are executed, the following processing is performed: controlling the drone to return to the return position based on the image captured by the photographing device.
  • when the computer instructions are executed, the following processing is performed: controlling the drone to return to the return position based on the current position and current size of the preset target in the image captured by the photographing device, and the preset position and preset size of the preset target in the image.
  • when the computer instructions are executed, the following processing is performed: determining the coordinates of the preset target in a specific coordinate system based on the current position of the preset target in the captured image; and controlling the drone to return to the return position based on the coordinate difference between the coordinates of the preset target in the specific coordinate system and the current coordinates of the drone, and the preset coordinate difference between the preset target and the drone.
  • when the computer instructions are executed, the following processing is performed: determining the current distance of the preset target relative to the drone based on the TOF technology; and controlling the drone to return to the return position based on the current distance and the distance threshold.
  • in the process of controlling the drone to return to the return position, when the computer instructions are executed, the following processing is performed: controlling the drone to descend from the current position to the same horizontal plane as the return position; and controlling the drone to fly along the horizontal plane to the return position.
  • when the computer instructions are executed, the following processing is performed: triggering the photographing device to capture an image based on a user operation to obtain a target image; and controlling the drone to take off when the target image meets the preset condition.
  • the preset condition includes: the target image includes a human face; or the target image includes a human face, and the similarity between the human face and the preset facial image reaches a preset Similarity threshold.
  • the user operation is a throwing action of the user on the drone.
  • in the process of controlling the drone to take off based on the user operation, when the computer instructions are executed, the following processing is performed: performing propeller unlocking based on the user operation; and controlling the drone to take off when it is determined that propeller unlocking is successful.
  • the following processing is further performed: acquiring a position parameter input by the user, the position parameter indicating a relative positional relationship between the shooting position of the drone and the preset target; and determining a specific position based on the position parameter and the current position of the preset target.
  • the computer instruction is further processed to: acquire a composition rule; and determine a specific location based on the composition rule and a current location of the preset target.
  • when the computer instructions are executed, the following processing is further performed: acquiring a composition rule; determining the flight distance of the drone after take-off based on the composition rule; determining the flight direction of the drone based on the direction in which the drone is thrown; and determining a specific position based on the flight distance and the flight direction.
  • the following processing is performed: triggering the photographing operation of the photographing device on the preset target.
  • the following processing is performed: triggering the image-capture operation of the shooting device on the preset target.
  • the following processing is further performed: after triggering the imaging operation of the shooting device on the preset target, controlling the drone to fly along a specific trajectory; when the drone When flying to the end point of the specific trajectory, the photographing device is triggered to stop the imaging operation.
  • the computer instruction is further processed to: acquire a movement trajectory input by a user, where the movement trajectory is a movement of the preset target in an imaging screen of the photographing device a trajectory; determining a specific trajectory of the drone based on the movement trajectory.
  • the following processing is performed: controlling the composition of the photographing device; and when the imaging of the preset target in the photographing frame of the photographing device satisfies a preset composition rule, performing a photographing operation on the preset target.
  • the following processing is performed: acquiring a photographing signal input by the user; and triggering the photographing operation of the photographing device on the preset target based on the photographing signal.
  • when the computer instructions are executed, the following processing is further performed: when it is detected that the drone is hovering at the return position and that a palm is located below the drone, controlling the drone to land on the palm.
  • since the device embodiments basically correspond to the method embodiments, reference may be made to the corresponding descriptions of the method embodiments.
  • the device embodiments described above are merely illustrative. Units described as separate components may or may not be physically separate, and components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. A person of ordinary skill in the art can understand and implement this without creative effort.


Abstract

A control method for a drone (400), a drone (400), and a machine-readable storage medium. A photographing device (410) is mounted on the drone (400). The method includes: controlling the drone (400) to take off based on a user operation; controlling the drone (400) to fly to a specific position; after the drone (400) flies to the specific position, triggering a photographing operation of the photographing device (410) on a preset target; and after the photographing device (410) completes the photographing operation, automatically controlling the drone (400) to return to a return position. With this method, the shooting process of the drone (400) can be completed in one go, improving the user experience and reducing the drain on the battery life of the drone (400) caused by manual operation.

Description

Control Method for a Drone, Drone, and Machine-Readable Storage Medium

Technical Field
Embodiments of the present invention relate to the technical field of drones, and in particular to a control method for a drone, a drone, and a machine-readable storage medium.
Background Art
With the development of flight technology, the UAV (Unmanned Aerial Vehicle), also called a drone, has expanded from military use to increasingly wide civilian use, for example, UAV plant protection, UAV aerial photography, UAV forest fire monitoring, and so on; civilian use is also the trend of future UAV development.
At present, when shooting with the photographing device carried on a drone, the user needs to operate a user terminal or remote controller to control the drone's take-off, to control the flight attitude, flight distance, and gimbal rotation after take-off so as to adjust and control the shooting, and to control the drone to return after the photographing device finishes shooting. The whole operation process is cumbersome and the operating experience is unfriendly; moreover, because the time spent on manual operation occupies a large portion of the battery life, the actual flight time is reduced.
Summary of the Invention
In view of this, the present application discloses a control method for a drone, a drone, and a machine-readable storage medium.
In a first aspect, a control method for a drone is provided, where a photographing device is mounted on the drone. The method includes: controlling the drone to take off based on a user operation; controlling the drone to fly to a specific position; after the drone flies to the specific position, triggering a photographing operation of the photographing device on a preset target; and after the photographing device completes the photographing operation, automatically controlling the drone to return to a return position.
In a second aspect, a drone is provided, where a photographing device is mounted on the drone and the drone includes a processor. The processor is configured to: control the drone to take off based on a user operation; control the drone to fly to a specific position; after the drone flies to the specific position, trigger a photographing operation of the photographing device on a preset target; and after the photographing device completes the photographing operation, automatically control the drone to return to a return position.
In a third aspect, a machine-readable storage medium is provided, storing a number of computer instructions. When the computer instructions are executed, the following processing is performed: controlling the drone to take off based on a user operation; controlling the drone to fly to a specific position; after the drone flies to the specific position, triggering a photographing operation of the photographing device on a preset target; and after the photographing device completes the photographing operation, automatically controlling the drone to return to a return position.
According to the embodiments of the present invention, the drone can be automatically controlled to take off according to preset conditions and automatically controlled to fly to a specific position; after arriving at the specific position, the photographing device is triggered to shoot the preset target; subsequently, the drone can also be automatically controlled to return to the return position, so that the whole shooting process is completed in one go, improving the user experience and reducing the drain on the drone's battery life caused by manual operation.
Brief Description of the Drawings
FIG. 1 is a schematic architecture diagram of an unmanned flight system;
FIG. 2 is a schematic flowchart of a control method for a drone;
FIG. 3 is another schematic flowchart of a control method for a drone;
FIG. 4 is a block diagram of an embodiment of a drone.
Detailed Description
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiments of the present invention provide a control method for a drone, a drone, and a machine-readable storage medium. It will be apparent to those skilled in the art that the embodiments of the present invention can be applied to various types of drones, for example, small drones. In some embodiments, the drone may be a rotorcraft, for example, a multi-rotor aircraft propelled through the air by multiple propulsion devices; the embodiments of the present invention are not limited thereto, and the drone may also be another type of drone or movable apparatus.
FIG. 1 is a schematic architecture diagram of an unmanned flight system. This embodiment is described using a rotorcraft as an example.
The unmanned flight system 100 may include a UAV 110, a gimbal 120, a display device 130, and an operating device 140. The UAV 110 may include a power system 150, a flight control system 160, and a frame 170. The UAV 110 can communicate wirelessly with the operating device 140 and the display device 130.
The frame 170 may include a fuselage and landing gear. The fuselage may include a center frame and one or more arms connected to the center frame, the one or more arms extending radially from the center frame. The landing gear is connected to the fuselage and supports the UAV 110 when it lands.
The power system 150 may include an electronic speed controller (ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where a motor 152 is connected between the ESC 151 and a propeller 153, and the motors 152 and propellers 153 are disposed on the corresponding arms. The ESC 151 receives a drive signal generated by the flight controller 160 and supplies a drive current to the motor 152 according to the drive signal to control the rotational speed of the motor 152. The motors 152 drive the propellers to rotate, providing power for the flight of the UAV 110 and enabling the UAV 110 to move with one or more degrees of freedom. In some embodiments, the UAV 110 can rotate about one or more rotation axes, for example, a roll axis, a yaw axis, and a pitch axis. It should be understood that the motor 152 may be a DC motor or an AC motor, and may be a brushless motor or a brushed motor.
The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 measures the attitude information of the UAV, that is, the position and state information of the UAV 110 in space, for example, its three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity. The sensing system 162 may include, for example, at least one of sensors such as a gyroscope, an electronic compass, an IMU (Inertial Measurement Unit), a visual sensor, a global navigation satellite system, and a barometer. For example, the global navigation satellite system may be GPS (Global Positioning System). The flight controller 161 controls the flight of the UAV 110, for example, according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the UAV 110 according to pre-programmed instructions, or in response to one or more control commands from the operating device 140.
The gimbal 120 may include an ESC 121 and a motor 122. The gimbal carries the photographing device 123. The flight controller 161 can control the movement of the gimbal 120 via the ESC 121 and the motor 122. Optionally, in another embodiment, the gimbal 120 may further include a controller for controlling the movement of the gimbal 120 via the ESC 121 and the motor 122. It should be understood that the gimbal 120 may be independent of the UAV 110 or a part of the UAV 110, that the motor 122 may be a DC or AC motor and may be brushless or brushed, and that the gimbal 120 may be located on the top or the bottom of the aircraft.
The photographing device 123 may be, for example, a camera or video camera used to capture images. The photographing device 123 can communicate with the flight controller and shoot under the control of the flight controller.
The display device 130 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and can display the attitude information of the UAV 110. In addition, images captured by the photographing device can be displayed on the display device 130. It should be understood that the display device 130 may be an independent device or may be provided in the operating device 140.
The operating device 140 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and is used to remotely operate the UAV 110. The operating device may be, for example, a remote controller or a user terminal installed with an APP (application) for controlling the UAV, for example, a smartphone or a tablet. In this embodiment, receiving user input via the operating device may mean operating the UAV 110 via input devices such as a dial wheel, buttons, keys, or joysticks on a remote controller, or via a user interface (UI) on a user terminal.
It should be understood that the above naming of the components of the unmanned flight system is for identification purposes only and should not be construed as limiting the embodiments of the present invention.
图2是无人机的控制方法的一示意性流程图。图2的控制方法可以由控制装置或控制设备,例如,图1的飞行控制器161来执行,本实施例并不限于此,例如,图2的控制方法也可以由无人机上携带的其它控制装置或控制设备来实现。图2的控制方法包括如下内容。
步骤210：基于用户操作控制无人机起飞。
在一些实施例中，首先可以基于用户操作对无人机进行起桨解锁，即使得无人机进入可以转桨起飞的状态。在一实施例中，可以基于用户的第一操作触发起桨解锁，并在基于用户的第二操作完成起桨解锁时控制无人机起飞。例如，用户的第一操作可以是长按或双击或者以其他规则操作无人机上的一个按键，其中，该按键可以是无人机上的一个物理按键，例如电源键，也可以是无人机上的一个虚拟按键。例如，第二操作可以是用户将拍摄设备朝向人脸，进行人脸识别。当人脸识别成功时，完成起桨解锁。在一实施例中，可以基于用户的第一操作或者第二操作同时完成对无人机的起桨解锁的触发和起桨解锁的完成，使得无人机进入可以转桨起飞的状态。
在一实施例中，人脸识别时，所述拍摄设备捕获图像，得到目标图像；当所述目标图像满足预设条件时，完成无人机的起桨解锁。具体的，在无人机的电源开启后，用户可以使用单手或双手手持无人机，并伸直手臂，端平无人机，使得无人机上挂载的拍摄设备朝向自身脸部或者其他人的脸部。拍摄设备在无人机电源开启后处于开启状态，无人机对该拍摄设备所捕获到的图像进行检测，当检测出图像满足预设条件时，无人机完成起桨解锁。在一实施例中，当无人机检测到处于特定姿态下，且检测出图像满足预设条件时，才完成起桨解锁。例如，当无人机检测到自身的当前姿态处于水平静止，且检测图像满足预设条件时，才完成起桨解锁。或者，当无人机检测到自身的当前姿态处于水平静止时，才开启人脸识别功能，当检测图像满足预设条件时，才完成起桨解锁。
其中,根据无人机的使用模式不同,该预设条件有所不同。当无人机处于通用使用模式时,该预设条件可以为目标图像中包括人脸;当无人机处于主人使用模式时,该预设条件可以为目标图像中包括人脸,且该人脸与预设人脸图像的相似度达到预设相似度阈值。
在一实施例中,目标图像中可能存在不止一张人脸,例如,手持无人机的用户身后方站有其他用户,那么,无人机可以检测目标图像中每张人脸的角度,例如正脸、侧脸,每张人脸在目标图像中的位置、大小等,根据检测结果,在多张人脸中选择其中一张人脸,例如,选择角度为正脸,且位置偏向于图像正中的人脸;又例如,选择角度为正脸,且在目标图像中最大的人脸。后续,无人机可以对所选择的人脸进行进一步的相似度检测。
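上述在多张人脸中选择一张并结合使用模式进行解锁判定的逻辑，可以用如下 Python 草图示意（仅为假设性实现，数据结构、函数名与相似度阈值均为本文示例假设，并非本发明的具体实施）：

```python
# 示意性草图：在多张人脸中选择一张用于解锁判定（假设性实现）
# 每张人脸以 dict 表示：angle（'front' 表示正脸、'side' 表示侧脸）、
# center_x（人脸中心在图像中的归一化横坐标）、area（人脸面积占画面的比例）

def select_face(faces, image_width=1.0):
    """优先选择正脸；在候选中选择位置最接近图像正中、其次面积最大的人脸。"""
    front_faces = [f for f in faces if f["angle"] == "front"]
    candidates = front_faces if front_faces else faces
    if not candidates:
        return None
    # 距图像中心越近越优先，面积越大越优先
    return min(candidates,
               key=lambda f: (abs(f["center_x"] - image_width / 2), -f["area"]))

def unlock_ok(face, similarity, owner_mode, sim_threshold=0.8):
    """通用使用模式：检测到人脸即可；主人使用模式：相似度还需达到阈值。"""
    if face is None:
        return False
    return similarity >= sim_threshold if owner_mode else True
```

例如，当画面中同时出现一张侧脸与两张正脸时，`select_face` 会在两张正脸中返回更接近画面中心的一张，再交由 `unlock_ok` 按当前使用模式判定是否完成起桨解锁。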
在一实施例中,在确定无人机起桨解锁成功后,可以无需用户进行操作,自动控制无人机起飞。在一实施例中,对无人机完成起桨解锁之后,无人机可以再次判断无人机当前是否能够起飞,若是,则自动控制无人机起飞。
在一实施例中,在确定无人机起桨解锁成功后,则可以基于用户的进一步操作控制无人机起飞。
在一实施例中,在确定无人机起桨解锁成功后,无人机可以启动动力装置并控制动力装置怠速转动,在动力装置怠速转动预设时间之后控制无人机起飞。通过控制无人机的动力装置进行一定时间的怠速转动,可以避免无人机的误启动。
在一实施例中,在确定无人机起桨解锁成功后,也可以在无人机水平放置超过预设时长之后控制动力装置怠速转动。例如,用户将无人机水平放置(例如,水平放置在手掌中),无人机根据传感器检测的无人机的姿态信息确定到无人机处于水平状态(例如,姿态角为零)超过预设时间之后,自动启动无人机,并控制动力装置怠速转动,进一步,无人机还可以在动力装置怠速转动预设时间之后控制无人机起飞。
在一实施例中,也可以在无人机起桨解锁成功后,进一步确认收到允许怠速转动的信号时控制动力装置怠速转动。例如,为了安全起见,可以生成一些允许怠速转动的信号或者接收外部设备发送的允许怠速转动的信号来控制飞行器怠速转动,本实施例可以利用这些信号与无人机的自动启动相结合,从而提高了无人机自动启动的安全性。
在一些实施例中,基于用户操作控制无人机起飞可以指的是在确定无人机完成起桨解锁后,基于用户操作控制无人机起飞。其中,该用户操作可以为用户对无人机的抛掷动作,即在确定无人机起桨解锁成功后,若检测到用户对无人机的抛掷动作,则可以基于该抛掷动作控制无人机起飞。具体的,在无人机被抛飞的情况下,检测无人机的运动数据,当该运动数据满足自动启动条件时,自动启动无人机的动力装置。
在一实施例中,该运动数据可以包括无人机被抛出的距离,在这种情况下,运动数据满足自动启动条件包括:无人机被抛出的距离大于或等于预设的距离阈值,该预设的距离阈值可以为0或者使得无人机不会对用户造成伤害的安全距离。从而,在无人机与用户之间的距离为安全距离时控制无人机起飞,可以避免对用户造成伤害。
可代替的，该运动数据可以包括无人机的垂直速度或速度，在这种情况下，运动数据满足自动启动条件包括：无人机的垂直速度或速度小于等于预设的速度阈值，该预设的速度阈值可以等于0或其它接近于0的值。由于将垂直速度或速度设置为小于等于预设的速度阈值再启动，可以使得无人机启动时飞行更加稳定。
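上述基于运动数据的自动启动条件可以用如下草图示意（假设性实现：此处将抛出距离与垂直速度两个条件合并判断，而如上文所述二者也可以单独使用，阈值数值均为示例假设）：

```python
# 示意性草图：判断抛飞后的运动数据是否满足自动启动条件（阈值为假设的示例值）

def should_auto_start(thrown_distance, vertical_speed,
                      dist_threshold=1.0, speed_threshold=0.1):
    """被抛出距离达到安全距离（单位：米），且垂直速度接近 0（单位：米/秒）时，
    才允许自动启动无人机的动力装置。"""
    return (thrown_distance >= dist_threshold
            and abs(vertical_speed) <= speed_threshold)
```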
在一实施例中,无人机可以识别用户抛掷时的动作,选择合适的飞行轨迹。由于可以通过简单的动作指示无人机用户想要的飞行轨迹,提高了用户体验,并且进一步减少了对无人机续航时间的占用。
在一实施例中,用户对无人机的抛掷动作可以为垂直上抛、斜向上侧抛,或者平抛等。
在一实施例中,无人机可以基于用户操作,确定是否满足预设的自动启动条件,在满足预设的自动启动条件时,无人机自动起飞。
无人机自动起飞意味着无人机在满足预设的自动启动条件时,可以自动接通无人机的启动回路,控制无人机的动力装置开始工作,而无需人工通过按钮或按键手动启动无人机以及控制无人机起飞。由于可以根据预设的自动启动条件自动启动飞行器,从而减少了手动操作对飞行器续航时间的占用,同时提升了用户体验。
在一实施例中,可以按照如下方式自动启动无人机:
用户对无人机执行抛飞动作,在无人机被抛飞的情况下,检测无人机的运动数据,当该运动数据满足自动启动条件时,自动启动无人机的动力装置,关于无人机被抛飞时的运动数据,以及该运动数据满足自动启动条件时,自动启动无人机的动力装置的详细说明可以参见上述所描述的,在起桨解锁成功后,根据用户的抛掷动作控制无人机起飞的相关描述,在此不再详述。
步骤220：控制无人机飞行至特定位置。
在一实施例中，上述特定位置可以为无人机的出厂设置中所设置好的，或者在无人机起飞之前用户预先设定好的，该特定位置可以表示为相对于预设目标具有一个固定的方向和距离的位置，其中，该距离可以为水平距离、垂直距离中的至少一个，或者为无人机与预设目标之间的直线距离。例如，可以在用户界面上设置用于输入位置参数的按钮、文本框、选择框等用户界面元素，使得用户可以通过外部设备(例如用户终端或遥控器)的用户界面选择或输入无人机的拍摄位置与预设目标的当前位置之间的相对位置关系，飞行控制器可以通过其与外部设备之间的通信接口获取用户选择或输入的位置参数。这样，用户可以精确地选择或输入位置参数，使得无人机能够准确飞抵与该位置参数对应的特定位置，拍摄出用户期望的图像。
在一实施例中，上述特定位置可以为基于获取的构图规则与预设目标的当前位置所确定的位置。其中，构图规则可以包括预设目标在拍摄画面中的位置、预设目标的面部在拍摄画面中的角度、预设目标的面部在拍摄画面中的完整度中的一种或多种，构图规则可以为如下规则之一：均衡式构图、对称式构图、对角线构图、三角形构图、九宫格构图、向心式构图、对分式构图、拍摄画面中的人脸为正脸、拍摄画面中的人脸为侧脸。
获取的构图规则可以是从外部设备接收的预设的构图规则,或者是从外部设备接收到用户输入的构图规则,例如,可以在用户界面上设置用于输入构图规则的可移动选框、可移动边界线、文本框、选择框等用户界面元素,使得用户可以自定义构图规则。这样,用户可以精确地输入构图规则,使得无人机上挂载的拍摄设备可以拍摄出用户期望的图像。
在一实施例中,基于用户对无人机的抛掷动作控制无人机起飞的情况下,不仅可以获取构图规则,基于构图规则确定无人机起飞后的飞行距离,还可以基于无人机被抛掷时的方向确定无人机的飞行方向,基于飞行距离与飞行方向确定特定位置。
在一实施例中,上述特定位置可以是基于针对预设目标的拍摄信息所设定的特定位置,其中,拍摄信息可以用于指示预设目标在拍摄画面中的范围。例如,拍摄信息可以是预设目标在拍摄画面中所占的比例或预设目标在拍摄画面中所占的范围大小。
例如,上述拍摄信息可以是用户选择的景别。景别按照预设目标在拍摄画面中所占的比例或范围不同,可以分成大景别、中等景别和小景别三种,每种景别又可以进一步细分。景别越大,意味着预设目标在拍摄画面中所占的比例或范围越小,反之亦然。具体的,人像摄影可以根据预设目标的面积在拍摄画面中所占的比例或范围来确定其景别,分成全身像、大半身像、半身像、胸像、带肩头像和大头像。例如,拍摄信息可以包括大景别、中等景别和小景别中的至少一个; 或者拍摄信息可以包括全身像、大半身像、半身像、胸像、带肩头像和大头像中的至少一个。
例如，可以预先设置不同的景别与预设目标在拍摄画面中所占的比例或范围的对应关系，当用户选择了某种景别，则根据用户选择的景别确定预设目标在拍摄画面中所占的比例或范围。当然，本发明的实施例并不限于上述确定拍摄信息的方式，也可以根据在用户界面上输入的预设目标在拍摄画面中所占的比例或范围来确定拍摄信息，例如，可以利用用户在触摸屏上画出的方框来指示预设目标在拍摄画面中的范围。
应理解的是,上述景别的分类仅仅是举例说明,也可以根据实际需要定义不同的景别分类。
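上述"景别与预设目标在画面中所占比例的对应关系"可以用一个简单的查表示意（草图，比例数值为假设的示例值，并非本发明所限定的对应关系）：

```python
# 示意性草图：景别与预设目标在画面中所占比例（以画面高度计）的对应关系
# 比例数值为假设的经验值，实际可根据需要预先设置

SHOT_SCALE_RATIO = {
    "全身像": 0.90,    # 景别越小，目标所占比例越大
    "大半身像": 0.75,
    "半身像": 0.60,
    "胸像": 0.45,
    "带肩头像": 0.30,
    "大头像": 0.20,
}

def target_ratio(scale):
    """根据用户选择的景别，返回预设目标应在拍摄画面中所占的比例。"""
    return SHOT_SCALE_RATIO[scale]
```

无人机据此得到目标应占的比例后，即可结合镜头参数换算出对应的拍摄距离，进而确定特定位置。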
步骤230:无人机飞行至特定位置后,触发拍摄设备对预设目标的拍摄操作。
在一实施例中,无人机飞行至特定位置后,无人机可以首先控制拍摄设备的构图,当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则时,触发拍摄设备对预设目标进行拍摄操作。具体地,上述控制拍摄设备的构图,使得预设目标在拍摄画面中的成像满足预设的构图规则包括:通过调整无人机的飞行姿态、拍摄设备的云台的运动和拍摄设备的焦距中的至少一个来控制拍摄设备的构图,使得预设目标在拍摄画面中的位置满足预设的构图规则。
例如，可以由拍摄设备捕获预设目标当前在拍摄画面中呈现的图像，并且通过图像识别确定预设目标在拍摄画面中所占的位置，从而确定预设目标在拍摄画面中的位置是否满足预设的构图规则。以构图为九宫格为例，例如，如果用户选择了九宫格构图，则可以将预设目标成像在九宫格的四个交叉点上。可选的，还可以进一步将九宫格构图细分为与四个交叉点对应的四种模式，以供用户进一步选择将预设目标成像在哪个交叉点上。可以根据上述图像识别确定预设目标的中心是否位于九宫格的某个交叉点上，或者确定预设目标的中心距离九宫格的某个交叉点的距离和方位，并据此调整构图，使得预设目标的中心最终与九宫格的某个交叉点重合。
在一实施例中，无人机控制拍摄设备的构图，当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则时，无人机可以向用户发出构图完毕的指示信息，例如，通过在APP界面上显示构图完毕的指示信息，或者通过控制指示灯按预定规则闪亮来提示构图完毕，或者通过声音来提示构图完毕。
在一实施例中,无人机控制拍摄设备的构图,当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则时,或者当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则且无人机向用户发出构图完毕的指示信息后,自动触发拍摄设备对预设目标进行拍摄操作。
在一实施例中,无人机控制拍摄设备的构图,当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则时,或者当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则且无人机向用户发出构图完毕的指示信息后,无人机基于用户的操作触发拍摄设备对预设目标进行拍摄操作。例如,用户通过外部设备输入(如遥控器或者装有控制无人机的APP的移动终端)向无人机发出拍摄信号。又例如,用户通过声音、手势等等向无人机发出拍摄信号。这样,无人机可以基于该拍摄信号触发拍摄设备对预设目标的拍摄操作。
在一实施例中，无人机飞行至特定位置后，可以向外部设备返回飞抵特定位置的指示信息，例如LED灯、声音等等，用户可以根据该指示信息，通过外部设备输入拍摄信号，飞行控制器可以通过其与外部设备之间的通信接口获取用户输入的拍摄信号。这样，无人机可以基于该拍摄信号触发拍摄设备对预设目标的拍摄操作。
在一实施例中,拍摄设备对预设目标的拍摄操作可以为对预设目标的拍照操作。例如,用户可以预先通过外部设备输入拍照操作的第一限制条件,该第一限制条件可以包括拍照次数,飞行控制器可以通过其与外部设备之间的通信接口预先获取用户输入的拍照操作的第一限制条件,根据该第一限制条件,控制拍摄设备对预设目标的拍照操作。
在一实施例中,拍摄设备对预设目标的拍摄操作可以为对预设目标的摄像操作。
在一实施例中，用户可以预先通过外部设备输入拍摄操作的第二限制条件，该第二限制条件可以包括预设的拍摄时长、拍摄时的特定轨迹，飞行控制器可以通过其与外部设备之间的通信接口预先获取用户输入的拍摄操作的第二限制条件，根据该第二限制条件，控制拍摄设备对预设目标的拍摄操作。其中：
1)在第二限制条件包括预设的拍摄时长的情况下,无人机可以控制拍摄设备在特定位置处对预设目标进行拍摄,当实际拍摄时长达到预设的拍摄时长时,无人机可以控制拍摄设备停止拍摄。
可代替的,若在实际拍摄时长未达到预设的拍摄时长之前,无人机接收到用户发送的停止拍摄信号时,也可以直接控制拍摄设备停止拍摄,其中,该停止拍摄信号可以为用户通过外部设备输入的,也可以为用户通过执行指定的表示停止拍摄的手势发送的。
2)在第二限制条件包括特定轨迹的情况下，在触发拍摄设备对预设目标的摄像操作后，控制无人机沿着特定轨迹飞行，当无人机飞行至特定轨迹的终点时，触发拍摄设备停止对预设目标的摄像操作。
可代替的,若无人机在沿着特定轨迹飞行的过程中,接收到用户发送的停止拍摄信号,无人机也可以直接控制拍摄设备停止拍摄。
在一实施例中,上述特定轨迹可以根据从外部设备接收到的输入确定,例如,可以在用户界面上设置用于输入移动轨迹的按钮、文本框、选择框等用户界面元素,以使用户可以输入或选择移动轨迹,其中,该移动轨迹为预设目标在拍摄设备的摄像画面中的移动轨迹,后续,无人机或外部设备可以基于该移动轨迹确定无人机飞行时的特定轨迹。
在一实施例中,上述特定轨迹可以是预设的,例如,特定轨迹可以包括环绕、拉远、拉近和S形中的至少一个。例如,假设特定轨迹为环绕,则无人机在摄像过程中可以环绕预设目标一边飞行一边拍摄,又例如,假设特定轨迹为拉近,则无人机在摄像过程中可以朝着预设目标的方向一边飞行一边拍摄。
在一实施例中,触发拍摄设备对预设目标进行拍摄操作包括:控制拍摄设备根据景深原理调整拍摄设备的焦距,并利用调整后的焦距对预设目标进行拍摄。
在拍摄位于不同距离上的景物时,例如,拍摄多排人物或者体积庞大的拍摄对象时,可以根据景深原理调整焦距,即设置合适的焦点,使得拍摄设备能够清晰地拍摄全部景物。
以多人合影为例,在无人机飞至特定位置后,可以触发拍摄设备根据景深原理调整焦距,如公式(1)、(2)和(3)所示,前景深比后景深更浅,故大概需要对焦在前1/3处,其中1/3是经验值,镜头可以聚焦在整个队列纵深的前1/3处。例如,如果给五排人拍摄合影,则可以将焦点对在第二排中间的人物上,这样可以更有效地利用前景深和后景深,拍出清晰的集体合影。
ΔL1 = FσL²/(f² + FσL)    (1)

ΔL2 = FσL²/(f² − FσL)    (2)

ΔL = ΔL1 + ΔL2 = 2f²FσL²/(f⁴ − F²σ²L²)    (3)
其中，σ为容许弥散圆直径，f为镜头焦距，F为镜头的拍摄光圈值，L为对焦距离，ΔL1为前景深，ΔL2为后景深，ΔL为景深。
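上述景深公式可以按式(1)、(2)、(3)直接计算，以下为示意性 Python 实现（仅为草图，函数名为本文示例假设，各参数单位需保持一致，例如均为毫米）：

```python
# 按公式(1)(2)(3)计算景深的示意性实现：
# sigma 为容许弥散圆直径，f 为镜头焦距，F 为拍摄光圈值，L 为对焦距离

def depth_of_field(sigma, f, F, L):
    front = F * sigma * L ** 2 / (f ** 2 + F * sigma * L)  # 前景深 ΔL1，式(1)
    back = F * sigma * L ** 2 / (f ** 2 - F * sigma * L)   # 后景深 ΔL2，式(2)
    return front, back, front + back                       # 景深 ΔL = ΔL1 + ΔL2，式(3)
```

例如取 σ=0.03mm、f=50mm、F=8、L=3000mm 可以验证前景深小于后景深，这正是正文中"对焦在队列纵深前1/3处"这一经验做法的依据。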
在一实施例中,触发拍摄设备对预设目标进行拍摄操作包括:检测环境状况信息和/或预设目标的姿势信息,并根据环境状况信息和/或预设目标的姿势信息调整拍摄角度。
环境状况信息例如可以为用于表示逆光、天气状况、光线明暗等信息。预设目标的姿势信息例如可以为用于表示头的转向、站立、坐下等姿势的信息。具体的拍摄角度可以包括俯拍、侧拍和仰拍等。
例如，当检测到以某个拍摄角度进行拍摄逆光时，可以避免以该拍摄角度进行拍摄。再如，当检测到预设目标处于侧身状态时，可以调整拍摄角度，使得能够拍摄到预设目标的正面照。应理解，上述功能可以在无人机起飞前由用户通过外部设备的用户界面(例如，用户终端上的用户界面)进行设置或选择。
由于可以根据环境状况信息和/或预设目标的姿势信息对拍摄角度进行自适应的调整,使得拍摄过程智能化,减少了拍摄过程中的人工干涉,提高了用户体验,而且减少了手动操作对无人机的续航时间的占用。
步骤240:在拍摄设备完成拍摄操作后,自动控制无人机返航至返航位置。
在一实施例中,在拍摄设备完成拍摄操作后,自动控制无人机从返航的起点位置返航至返航位置,其中,返航的起点位置可以为下述三种情况之一:
1)若无人机在上述步骤230中所描述的特定位置完成对预设目标的拍摄操作,那么,无人机返航时的起点位置则为该特定位置。
2)若无人机沿着上述步骤230中所描述的特定轨迹完成对预设目标的拍摄操作,那么,无人机返航时的起点位置则为该特定轨迹的终点位置。
3)若无人机根据用户发送的停止拍摄信号,停止对预设目标进行拍摄,那么,无人机返航时的起点位置则为无人机停止对预设目标进行拍摄时,无人机所在的位置。
在一实施例中,上述返航位置为无人机的起飞位置,该起飞位置可以为无人机在起飞时所记录的,这里所说的起飞位置与返航位置可以是在北东地坐标系中的坐标。
在一实施例中,无人机的起飞位置即为无人机在起飞时,预设目标所在的位置,在该种情况下,无人机从预设目标的手上起飞,此时可以忽略无人机与预设目标所在位置之间的差别。
在一实施例中,无人机的起飞位置可以参考无人机在起飞时预设目标所在的位置确定,例如,无人机通过扫脸起飞,通过上述描述可知,无人机通过扫脸起飞时,预设目标可以伸直手臂,那么,通过手臂的通常长度以及预设目标所在的位置,即可确定无人机的起飞位置。
在一实施例中,上述返航位置与预设目标的当前位置之间的距离不超过预设的距离阈值。具体地,可以预先设置一个距离阈值,无人机在返航过程中,可以基于TOF(Time of Flight,飞行时间)技术或3D TOF技术,实时确定预设目标相对于无人机的当前距离,若该当前距离大于距离阈值,则可以控制无人机进一步接近预设目标,直至当前距离不大于距离阈值时,可以确定无人机已返航至返航位置。
本领域技术人员可以理解的是,本申请中并不限制于基于TOF技术或3D TOF技术实时确定预设目标相对于无人机的当前距离,例如还可以采用双目等其他立体环境感知传感器确定预设目标相对于无人机的当前距离。
在一实施例中，用户手持或可穿戴的外部设备将自身的当前位置发送至无人机，以使得无人机将该当前位置作为返航位置进行返航。具体的，该外部设备可以以一定频率，向无人机发送携带该外部设备的当前位置的信号，从而，无人机可以根据该外部设备的当前位置进行返航。其中，该外部设备可以是遥控器，或者是装有用于控制无人机的APP的终端设备，或者是用于控制无人机的智能手表，或者是其他能够与无人机通信的设备。
在一实施例中,可以基于拍摄设备捕获的图像控制无人机返航至返航位置。
在一实施例中,基于拍摄设备捕获的图像控制无人机返航至返航位置可以包括:基于预设目标在捕获的图像中的当前位置和当前大小,以及预设目标在图像中的预设位置和预设大小,控制无人机返航至返航位置。例如,拍摄设备在返航过程中可以实时捕获预设目标的图像,那么,拍摄设备确定预设目标在捕获的图像中的当前位置和当前大小,例如,若当前大小小于预设目标在捕获的图像中的预设大小,则控制无人机靠近预设目标,若当前大小大于预设目标在捕获的图像中的预设大小,则控制无人机远离预设目标;又例如,若当前位置相对于预设位置而言,偏向于捕获的图像的一侧边缘,则控制无人机向该边缘侧移动。上述调整过程可以按固定的步长进行调整,也可以按照可变的步长进行调整。当确定预设目标在捕获的图像中的当前位置与预设目标在捕获的图像中的预设位置一致,且预设目标在捕获的图像中的当前大小与预设目标在捕获的图像中的预设大小一致时,确定无人机飞抵返航位置。
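上述基于预设目标在图像中的当前位置、当前大小与预设位置、预设大小的偏差进行返航调整的过程，可以用如下固定步长的草图示意（假设性实现：位置与大小均为归一化值，容差与指令名称为本文示例假设）：

```python
# 示意性草图：根据目标在画面中的当前位置/大小与预设位置/大小的偏差，
# 生成返航过程中的一步调整指令（固定步长，假设性实现）

def return_step(cur_pos, cur_size, preset_pos, preset_size,
                pos_tol=0.02, size_tol=0.02):
    """返回 (前后指令, 左右指令)；两项均为 None 时视为已抵达返航位置。"""
    move = None
    if cur_size < preset_size - size_tol:
        move = "approach"      # 成像偏小，控制无人机靠近预设目标
    elif cur_size > preset_size + size_tol:
        move = "retreat"       # 成像偏大，控制无人机远离预设目标
    shift = None
    if cur_pos < preset_pos - pos_tol:
        shift = "left"         # 目标偏向画面左侧边缘，向该侧移动
    elif cur_pos > preset_pos + pos_tol:
        shift = "right"
    return move, shift
```

在返航过程中对每帧捕获的图像重复调用该判断，直至两项指令均为 None，即当前位置与预设位置一致且当前大小与预设大小一致，确定无人机飞抵返航位置。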
在一实施例中，基于拍摄设备捕获的图像控制无人机返航至返航位置可以包括：基于预设目标在捕获的图像中的当前位置确定预设目标在特定坐标系中的坐标；基于预设目标在特定坐标系中的坐标和无人机的当前坐标的坐标差，以及预设目标和无人机的预设坐标差，控制无人机返航至返航位置。其中，特定坐标系可以为北东地坐标系。例如，拍摄设备在返航过程中可以实时捕获预设目标的图像，那么，拍摄设备确定预设目标在捕获的图像中的当前位置，可以基于该当前位置与图像中背景之间的相对位置关系确定预设目标在特定坐标系中的坐标，无人机当前在特定坐标系中的坐标可以由无人机中的位置传感器得出，之后，将预设目标在特定坐标系中的坐标与无人机当前的坐标作差，得出坐标差，基于该坐标差与预设坐标差控制无人机返航，最终，当该坐标差与预设坐标差一致时，确定无人机飞抵返航位置。
在一实施例中,无人机返航至返航位置的飞行过程可以包括:控制所述无人机从当前位置下降至与返航位置位于同一水平面,控制无人机沿着水平面飞行至返航位置。在该过程中,如果飞行方向上有障碍,则可以先通过位置传感器,例如,GPS或者视觉传感器,获知预设目标的位置和高度,并记录该位置和高度。在这种情况下,可以规划路径绕开障碍,如无法绕行则尝试抬升高度,以避开障碍。另外,飞行过程中机头可以始终朝向前进方向,以保证飞行安全。
在一实施例中,当无人机飞抵返航位置,检测到下方存在手掌时,可以控制无人机降落到手掌上,在无人机降落到手掌上之后,即可以控制无人机停止转桨,进一步,还可以控制无人机收桨。
根据本发明的实施例，由于可以根据预设的条件自动控制无人机起飞，并自动控制无人机飞抵特定位置，在飞抵特定位置后，触发拍摄设备对预设目标进行拍摄，后续，还可以自动控制无人机返航至返航位置，从而使得整个拍摄过程一气呵成，提高了用户体验，而且减少了手动操作对无人机的续航时间的占用。
图3是无人机的控制方法的另一示意性流程图。图3的控制方法是图2的方法的示例。图3的控制方法包括如下内容:
步骤310:基于用户操作控制无人机进行扫脸识别。
在一些实施例中,首先可以基于用户操作控制无人机进行扫脸识别,以对无人机起桨解锁,即使得无人机进入可以转桨起飞的状态。
具体的,在无人机开启电源后,用户可以单手或双手手持无人机,并伸直手臂,端平无人机,使得无人机上挂载的拍摄设备朝向自身脸部。无人机开启电源后,拍摄设备对捕获到的图像进行人脸识别,当检测出人脸满足预设条件时,无人机完成起桨解锁。
在一实施例中,无人机开启电源后,当检测用户以一定规则操作无人机上的一个按键时,才启动人脸识别功能。其中,该按键可以是无人机上的一个物理按键,例如电源键,也可以是无人机上的一个虚拟按键。其中该一定规则可以是长按、双击或者一次长按加两次短按等等。
在一实施例中,无人机开启电源后,当检测到无人机处于特定姿态下时,才启动人脸识别功能。例如,无人机检测到用户以一定规则操作无人机上的一个按键后,检测当前自身是否处于水平静止,若是,则启动人脸识别功能。
在一实施例中,无人机开启电源后,当检测用户以一定规则操作无人机上的一个按键,且检测到无人机处于特定姿态下时,才启动人脸识别功能。
在一实施例中,在无人机进行人脸识别时,控制用于挂载拍摄设备的云台上下摆动,和/或控制该云台左右摆动,以采集人脸图像。
步骤320:当扫脸识别成功时,自动控制无人机起飞。
在一实施例中,当无人机检测出步骤310中无人机进行人脸识别时拍摄设备获取到的目标图像满足预设条件时,自动控制无人机起飞。在一实施例中,当无人机进行人脸识别时拍摄设备获取到的目标图像满足预设条件时,无人机再一次判断当前状态是否满足可以起飞的条件,若满足,则自动控制无人机起飞。
在一实施例中,根据无人机的使用模式的不同,上述预设条件有所不同,具体的,当无人机处于通用使用模式时,该预设条件可以为目标图像中包括人脸;当无人机处于主人使用模式时,该预设条件可以为目标图像中包括人脸,且该人脸与预设人脸图像的相似度达到预设相似度阈值。
其中,目标图像中可能存在不止一张人脸,例如手持无人机的用户身后方站有其他用户,那么,无人机可以检测目标图像中每张人脸的角度,例如正脸、侧脸,每张人脸在目标图像中的位置、大小等,根据检测结果在多张人脸中选择其中一张人脸,例如选择角度为正脸,且位置偏向于图像正中的人脸;又例如,选择角度为正脸,且在目标图像中最大的人脸。后续,无人机可以对所选择的人脸进行进一步的相似度检测。
步骤330:自动控制无人机飞行至特定位置。
自动控制无人机飞行至特定位置的过程可以参见上述图2所示方法中的220,在此不再赘述。
步骤340:自动控制拍摄设备的构图。
在一实施例中，可以通过调整无人机的姿态、拍摄设备的云台的运动和拍摄设备的焦距中的至少一个来控制拍摄设备的构图，使得预设目标在拍摄画面中的位置满足预设的构图规则。
例如,可以通过控制无人机的螺旋桨的转速来调整无人机的姿态,使无人机产生横滚、平移和俯仰等姿态变化。还可以通过控制云台的横滚机构、平移机构和俯仰机构的旋转来调整云台的运动。上述调整和控制将使得拍摄设备随无人机或云台相对于预设目标产生运动,从而能够调整预设目标在拍摄画面中的构图。另外,在拍摄过程中还可以调整拍摄设备的焦距,以得到清晰的构图。
步骤350:当预设目标在拍摄设备的拍摄画面中的成像满足预设的构图规则时,自动控制拍摄设备对预设目标进行拍摄操作。
例如,当根据图像识别的结果确定预设目标的中心与九宫格的某个交叉点重合时,自动向拍摄设备输出拍摄指示,指示对预设目标进行拍摄。
步骤360:在拍摄设备完成拍摄操作后,自动控制无人机返航至返航位置。
控制无人机返航至返航位置的过程可以参见上述图2所示方法中的240,在此不再赘述。
根据本发明的实施例,可以通过扫脸自动控制无人机起飞,并控制无人机飞抵特定位置,无人机飞抵特定位置后,根据预设的构图规则进行智能化构图后进行拍摄,减少了拍摄过程中对无人机的人工干涉,提高了用户体验,并且减少对无人机的续航时间的占用。
基于与上述方法同样的发明构思,本发明实施例中还提供一种无人机,如图4所示,无人机400上挂载有拍摄设备410,该无人机400包括:处理器420;该处理器420用于:基于用户操作控制无人机起飞;控制所述无人机飞行至特定位置;所述无人机飞行至所述特定位置后,触发所述拍摄设备对预设目标的拍摄操作;在所述拍摄设备完成所述拍摄操作后,自动控制所述无人机返航至返航位置。
在一实施例中,所述返航位置为所述无人机的起飞位置。
在一实施例中，所述返航位置与所述预设目标的当前位置之间的距离不超过预设的距离阈值。
在一实施例中,所述处理器420用于:基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置。
在一实施例中,所述处理器420用于:基于所述预设目标在所述拍摄设备捕获的图像中的当前位置和当前大小,以及所述预设目标在图像中的预设位置和预设大小,控制所述无人机返航至返航位置。
在一实施例中,所述处理器420用于:基于所述预设目标在所述捕获的图像中的当前位置确定所述预设目标在特定坐标系中的坐标;基于所述预设目标在特定坐标系中的坐标和所述无人机的当前坐标的坐标差,以及所述预设目标和所述无人机的预设坐标差,控制所述无人机返航至返航位置。
在一实施例中，所述处理器420用于：基于TOF技术确定所述预设目标相对于所述无人机的当前距离；基于所述当前距离以及所述距离阈值，控制所述无人机返航至返航位置。
在一实施例中,所述处理器420用于:控制所述无人机从当前位置下降至与所述返航位置位于同一水平面;控制所述无人机沿着所述水平面飞行至返航位置。
在一实施例中,所述处理器420用于:基于用户操作触发所述拍摄设备捕获图像,得到目标图像;当所述目标图像满足预设条件时,控制无人机起飞。
在一实施例中,所述预设条件包括:所述目标图像中包括人脸;或,所述目标图像中包括人脸,且所述人脸与预设人脸图像的相似度达到预设相似度阈值。
在一实施例中,所述用户操作为所述用户对所述无人机的抛掷动作。
在一实施例中,所述处理器420用于:基于用户操作对无人机进行起桨解锁;当确定起桨解锁成功时,控制所述无人机起飞。
在一实施例中,所述处理器420还用于:获取用户输入的位置参数,所述位置参数表示所述无人机的拍摄位置与所述预设目标之间的相对位置关系;基于所述位置参数与所述预设目标的当前位置确定特定位置。
在一实施例中,所述处理器420还用于:获取构图规则;基于所述构图规则与所述预设目标的当前位置确定特定位置。
在一实施例中,所述处理器420还用于:获取构图规则;基于构图规则确定所述无人机起飞后的飞行距离;基于所述无人机被抛掷时的方向确定所述无人机的飞行方向;基于所述飞行距离与飞行方向确定特定位置。
在一实施例中,所述处理器420用于:触发所述拍摄设备对预设目标的拍照操作。
在一实施例中,所述处理器420用于:触发所述拍摄设备对预设目标的摄像操作。
在一实施例中,所述处理器420还用于:在触发所述拍摄设备对预设目标的摄像操作后,控制所述无人机沿特定轨迹飞行;当所述无人机飞行至所述特定轨迹的终点时,触发所述拍摄设备停止所述摄像操作。
在一实施例中,所述处理器420还用于:获取用户输入的移动轨迹,所述移动轨迹为所述预设目标在所述拍摄设备的拍摄画面中的移动轨迹;基于所述移动轨迹确定所述无人机的特定轨迹。
在一实施例中,所述处理器420用于:控制所述拍摄设备的构图;当所述预设目标在所述拍摄设备的拍摄画面中的成像满足预设的构图规则时,对所述预设目标进行拍摄操作。
在一实施例中,所述处理器420用于:获取用户输入的拍摄信号;基于所述拍摄信号触发所述拍摄设备对预设目标的拍摄操作。
在一实施例中,所述处理器420还用于:当检测到所述无人机悬停在所述返航位置上且检测到手掌位于所述无人机下方时,控制所述无人机降落到所述手掌上。
基于与上述方法同样的发明构思，本发明实施例中还提供一种机器可读存储介质，该机器可读存储介质可以位于无人机，所述机器可读存储介质上存储有若干计算机指令，所述计算机指令被执行时进行如下处理：基于用户操作控制无人机起飞；控制所述无人机飞行至特定位置；所述无人机飞行至所述特定位置后，触发所述拍摄设备对预设目标的拍摄操作；在所述拍摄设备完成所述拍摄操作后，自动控制所述无人机返航至返航位置。
在一实施例中,所述返航位置为所述无人机的起飞位置。
在一实施例中,所述返航位置与所述预设目标的当前位置之间的距离不超过预设的距离阈值。
在一实施例中,所述控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置。
在一实施例中,所述根据所述拍摄设备捕获的图像控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:基于所述预设目标在所述拍摄设备捕获的图像中的当前位置和当前大小,以及所述预设目标在图像中的预设位置和预设大小,控制所述无人机返航至返航位置。
在一实施例中，所述基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置的过程中，所述计算机指令被执行时进行如下处理：基于所述预设目标在所述捕获的图像中的当前位置确定所述预设目标在特定坐标系中的坐标；基于所述预设目标在特定坐标系中的坐标和所述无人机的当前坐标的坐标差，以及所述预设目标和所述无人机的预设坐标差，控制所述无人机返航至返航位置。
在一实施例中，所述控制所述无人机返航至返航位置的过程中，所述计算机指令被执行时进行如下处理：基于TOF技术确定所述预设目标相对于所述无人机的当前距离；基于所述当前距离以及所述距离阈值，控制所述无人机返航至返航位置。
在一实施例中,所述控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:控制所述无人机从当前位置下降至与所述返航位置位于同一水平面;控制所述无人机沿着所述水平面飞行至返航位置。
在一实施例中,所述基于用户操作控制无人机起飞的过程中,所述计算机指令被执行时进行如下处理:基于用户操作触发所述拍摄设备捕获图像,得到目标图像;当所述目标图像满足预设条件时,控制无人机起飞。
在一实施例中,所述预设条件包括:所述目标图像中包括人脸;或,所述目标图像中包括人脸,且所述人脸与预设人脸图像的相似度达到预设相似度阈值。
在一实施例中,所述用户操作为所述用户对所述无人机的抛掷动作。
在一实施例中,所述基于用户操作控制无人机起飞的过程中,所述计算机指令被执行时进行如下处理:基于用户操作对无人机进行起桨解锁;当确定起桨解锁成功时,控制所述无人机起飞。
在一实施例中,所述计算机指令被执行时还进行如下处理:获取用户输入的位置参数,所述位置参数表示所述无人机的拍摄位置与所述预设目标之间的相对位置关系;基于所述位置参数与所述预设目标的当前位置确定特定位置。
在一实施例中,所述计算机指令被执行时还进行如下处理:获取构图规则;基于所述构图规则与所述预设目标的当前位置确定特定位置。
在一实施例中,所述计算机指令被执行时还进行如下处理:获取构图规则;基于构图规则确定所述无人机起飞后的飞行距离;基于所述无人机被抛掷时的方向确定所述无人机的飞行方向;基于所述飞行距离与飞行方向确定特定位置。
在一实施例中,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:触发所述拍摄设备对预设目标的拍照操作。
在一实施例中,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:触发所述拍摄设备对预设目标的摄像操作。
在一实施例中,所述计算机指令被执行时还进行如下处理:在触发所述拍摄设备对预设目标的摄像操作后,控制所述无人机沿特定轨迹飞行;当所述无人机飞行至所述特定轨迹的终点时,触发所述拍摄设备停止所述摄像操作。
在一实施例中，所述计算机指令被执行时还进行如下处理：获取用户输入的移动轨迹，所述移动轨迹为所述预设目标在所述拍摄设备的摄像画面中的移动轨迹；基于所述移动轨迹确定所述无人机的特定轨迹。
在一实施例中,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:控制所述拍摄设备的构图;当所述预设目标在所述拍摄设备的拍摄画面中的成像满足预设的构图规则时,对所述预设目标进行拍摄操作。
在一实施例中,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:获取用户输入的拍摄信号;基于所述拍摄信号触发所述拍摄设备对预设目标的拍摄操作。
在一实施例中,所述计算机指令被执行时还进行如下处理:当检测到所述无人机悬停在所述返航位置上且检测到手掌位于所述无人机下方时,控制所述无人机降落到所述手掌上。
对于装置实施例而言,由于其基本对应于方法实施例,所以相关之处参见方法实施例的部分说明即可。以上所描述的装置实施例仅仅是示意性的,其中所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部模块来实现本实施例方案的目的。本领域普通技术人员在不付出创造性劳动的情况下,即可以理解并实施。
需要说明的是,在本文中,诸如第一和第二等之类的关系术语仅仅用来将一个实体或者操作与另一个实体或操作区分开来,而不一定要求或者暗示这些实体或操作之间存在任何这种实际的关系或者顺序。术语“包括”、“包含”或者其任何其他变体意在涵盖非排他性的包含,从而使得包括一系列要素的过程、方法、物品或者设备不仅包括那些要素,而且还包括没有明确列出的其他要素,或者是还包括为这种过程、方法、物品或者设备所固有的要素。在没有更多限制的情况下,由语句“包括一个……”限定的要素,并不排除在包括所述要素的过程、方法、物品或者设备中还存在另外的相同要素。
以上对本发明实施例所提供的方法和装置进行了详细介绍，本文中应用了具体个例对本发明的原理及实施方式进行了阐述，以上实施例的说明只是用于帮助理解本发明的方法及其核心思想；同时，对于本领域的一般技术人员，依据本发明的思想，在具体实施方式及应用范围上均会有改变之处，综上所述，本说明书内容不应理解为对本发明的限制。

Claims (66)

  1. 一种无人机的控制方法,所述无人机上挂载有拍摄设备,其特征在于,所述方法包括:
    基于用户操作控制无人机起飞;
    控制所述无人机飞行至特定位置;
    所述无人机飞行至所述特定位置后,触发所述拍摄设备对预设目标的拍摄操作;
    在所述拍摄设备完成所述拍摄操作后,自动控制所述无人机返航至返航位置。
  2. 根据权利要求1所述的方法,其特征在于,所述返航位置为所述无人机的起飞位置。
  3. 根据权利要求1所述的方法,其特征在于,所述返航位置与所述预设目标的当前位置之间的距离不超过预设的距离阈值。
  4. 根据权利要求3所述的方法,其特征在于,所述控制所述无人机返航至返航位置包括:
    基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置。
  5. 根据权利要求4所述的方法,其特征在于,所述根据所述拍摄设备捕获的图像控制所述无人机返航至返航位置包括:
    基于所述预设目标在所述拍摄设备捕获的图像中的当前位置和当前大小,以及所述预设目标在图像中的预设位置和预设大小,控制所述无人机返航至返航位置。
  6. 根据权利要求4所述的方法，其特征在于，所述基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置包括：
    基于所述预设目标在所述捕获的图像中的当前位置确定所述预设目标在特定坐标系中的坐标;
    基于所述预设目标在特定坐标系中的坐标和所述无人机的当前坐标的坐标差,以及所述预设目标和所述无人机的预设坐标差,控制所述无人机返航至返航位置。
  7. 根据权利要求3所述的方法,其特征在于,所述控制所述无人机返航至返航位置包括:
    基于TOF技术确定所述预设目标相对于所述无人机的当前距离;
    基于所述当前距离以及所述距离阈值，控制所述无人机返航至返航位置。
  8. 根据权利要求1所述的方法,其特征在于,所述控制所述无人机返航至返航位置,包括:
    控制所述无人机从当前位置下降至与所述返航位置位于同一水平面;
    控制所述无人机沿着所述水平面飞行至返航位置。
  9. 根据权利要求1所述的方法,其特征在于,所述基于用户操作控制无人机起飞包括:
    基于用户操作触发所述拍摄设备捕获图像,得到目标图像;
    当所述目标图像满足预设条件时,控制无人机起飞。
  10. 根据权利要求9所述的方法,其特征在于,所述预设条件包括:所述目标图像中包括人脸;或,
    所述目标图像中包括人脸,且所述人脸与预设人脸图像的相似度达到预设相似度阈值。
  11. 根据权利要求1所述的方法,其特征在于,所述用户操作为所述用户对所述无人机的抛掷动作。
  12. 根据权利要求1所述的方法,其特征在于,所述基于用户操作控制无人机起飞包括:
    基于用户操作对无人机进行起桨解锁;
    当确定起桨解锁成功时,控制所述无人机起飞。
  13. 根据权利要求1所述的方法,其特征在于,在所述控制所述无人机飞行至特定位置之前,所述方法还包括:
    获取用户输入的位置参数,所述位置参数表示所述无人机的拍摄位置与所述预设目标之间的相对位置关系;
    基于所述位置参数与所述预设目标的当前位置确定特定位置。
  14. 根据权利要求1所述的方法,其特征在于,在所述控制所述无人机飞行至特定位置之前,所述方法还包括:
    获取构图规则;
    基于所述构图规则与所述预设目标的当前位置确定特定位置。
  15. 根据权利要求11所述的方法,其特征在于,在所述控制所述无人机飞行至特定位置之前,所述方法还包括:
    获取构图规则;
    基于构图规则确定所述无人机起飞后的飞行距离;
    基于所述无人机被抛掷时的方向确定所述无人机的飞行方向;
    基于所述飞行距离与飞行方向确定特定位置。
  16. 根据权利要求1所述的方法,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作包括:
    触发所述拍摄设备对预设目标的拍照操作。
  17. 根据权利要求1所述的方法,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作包括:
    触发所述拍摄设备对预设目标的摄像操作。
  18. 根据权利要求17所述的方法,其特征在于,所述方法还包括:
    在触发所述拍摄设备对预设目标的摄像操作后,控制所述无人机沿特定轨迹飞行;
    当所述无人机飞行至所述特定轨迹的终点时,触发所述拍摄设备停止所述摄像操作。
  19. 根据权利要求18所述的方法,其特征在于,在所述控制所述无人机沿特定轨迹飞行之前,所述方法还包括:
    获取用户输入的移动轨迹,所述移动轨迹为所述预设目标在所述拍摄设备的摄像画面中的移动轨迹;
    基于所述移动轨迹确定所述无人机的特定轨迹。
  20. 根据权利要求1所述的方法,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作包括:
    控制所述拍摄设备的构图;
    当所述预设目标在所述拍摄设备的拍摄画面中的成像满足预设的构图规则时,对所述预设目标进行拍摄操作。
  21. 根据权利要求3所述的方法,其特征在于,所述控制所述无人机返航至返航位置包括:
    获取外部设备发送的位置信号，所述外部设备为与所述无人机进行通信的设备，所述位置信号用于指示所述外部设备的当前位置；
    控制所述无人机返航至所述位置信号所指示的位置。
  22. 根据权利要求1所述的方法,其特征在于,所述方法还包括:
    当检测到所述无人机悬停在所述返航位置上且检测到手掌位于所述无人机下方时,控制所述无人机降落到所述手掌上。
  23. 一种无人机,其特征在于,所述无人机上挂载有拍摄设备,所述无人机包括:
    处理器;所述处理器用于:
    基于用户操作控制无人机起飞;
    控制所述无人机飞行至特定位置;
    所述无人机飞行至所述特定位置后,触发所述拍摄设备对预设目标的拍摄操作;
    在所述拍摄设备完成所述拍摄操作后,自动控制所述无人机返航至返航位置。
  24. 根据权利要求23所述的无人机,其特征在于,所述返航位置为所述无人机的起飞位置。
  25. 根据权利要求23所述的无人机,其特征在于,所述返航位置与所述预设目标的当前位置之间的距离不超过预设的距离阈值。
  26. 根据权利要求25所述的无人机,其特征在于,所述处理器用于:
    基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置。
  27. 根据权利要求26所述的无人机,其特征在于,所述处理器用于:
    基于所述预设目标在所述拍摄设备捕获的图像中的当前位置和当前大小,以及所述预设目标在图像中的预设位置和预设大小,控制所述无人机返航至返航位置。
  28. 根据权利要求26所述的无人机,其特征在于,所述处理器用于:
    基于所述预设目标在所述捕获的图像中的当前位置确定所述预设目标在特定坐标系中的坐标;
    基于所述预设目标在特定坐标系中的坐标和所述无人机的当前坐标的坐标差,以及所述预设目标和所述无人机的预设坐标差,控制所述无人机返航至返航位置。
  29. 根据权利要求25所述的无人机,其特征在于,所述处理器用于:
    基于TOF技术确定所述预设目标相对于所述无人机的当前距离;
    基于所述当前距离以及所述距离阈值，控制所述无人机返航至返航位置。
  30. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    控制所述无人机从当前位置下降至与所述返航位置位于同一水平面;
    控制所述无人机沿着所述水平面飞行至返航位置。
  31. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    基于用户操作触发所述拍摄设备捕获图像,得到目标图像;
    当所述目标图像满足预设条件时,控制无人机起飞。
  32. 根据权利要求31所述的无人机,其特征在于,所述预设条件包括:所述目标图像中包括人脸;或,
    所述目标图像中包括人脸,且所述人脸与预设人脸图像的相似度达到预设相似度阈值。
  33. 根据权利要求23所述的无人机,其特征在于,所述用户操作为所述用户对所述无人机的抛掷动作。
  34. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    基于用户操作对无人机进行起桨解锁;
    当确定起桨解锁成功时,控制所述无人机起飞。
  35. 根据权利要求23所述的无人机,其特征在于,所述处理器还用于:
    获取用户输入的位置参数,所述位置参数表示所述无人机的拍摄位置与所述预设目标之间的相对位置关系;
    基于所述位置参数与所述预设目标的当前位置确定特定位置。
  36. 根据权利要求23所述的无人机,其特征在于,所述处理器还用于:
    获取构图规则;
    基于所述构图规则与所述预设目标的当前位置确定特定位置。
  37. 根据权利要求33所述的无人机,其特征在于,所述处理器还用于:
    获取构图规则;
    基于构图规则确定所述无人机起飞后的飞行距离;
    基于所述无人机被抛掷时的方向确定所述无人机的飞行方向;
    基于所述飞行距离与飞行方向确定特定位置。
  38. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    触发所述拍摄设备对预设目标的拍照操作。
  39. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    触发所述拍摄设备对预设目标的摄像操作。
  40. 根据权利要求39所述的无人机,其特征在于,所述处理器还用于:
    在触发所述拍摄设备对预设目标的摄像操作后,控制所述无人机沿特定轨迹飞行;
    当所述无人机飞行至所述特定轨迹的终点时,触发所述拍摄设备停止所述摄像操作。
  41. 根据权利要求40所述的无人机,其特征在于,所述处理器还用于:
    获取用户输入的移动轨迹,所述移动轨迹为所述预设目标在所述拍摄设备的拍摄画面中的移动轨迹;
    基于所述移动轨迹确定所述无人机的特定轨迹。
  42. 根据权利要求23所述的无人机,其特征在于,所述处理器用于:
    控制所述拍摄设备的构图;
    当所述预设目标在所述拍摄设备的拍摄画面中的成像满足预设的构图规则时,对所述预设目标进行拍摄操作。
  43. 根据权利要求25所述的无人机,其特征在于,所述处理器用于:
    获取外部设备发送的位置信号,所述外部设备为与所述无人机进行通信的设备,所述位置信号用于指示所述外部设备的当前位置;
    控制所述无人机返航至所述位置信号所指示的位置。
  44. 根据权利要求23所述的无人机,其特征在于,所述处理器还用于:
    当检测到所述无人机悬停在所述返航位置上且检测到手掌位于所述无人机下方时,控制所述无人机降落到所述手掌上。
  45. 一种机器可读存储介质,其特征在于,所述机器可读存储介质上存储有若干计算机指令,所述计算机指令被执行时进行如下处理:
    基于用户操作控制无人机起飞;
    控制所述无人机飞行至特定位置;
    所述无人机飞行至所述特定位置后，触发所述拍摄设备对预设目标的拍摄操作；
    在所述拍摄设备完成所述拍摄操作后,自动控制所述无人机返航至返航位置。
  46. 根据权利要求45所述的机器可读存储介质,其特征在于,所述返航位置为所述无人机的起飞位置。
  47. 根据权利要求45所述的机器可读存储介质,其特征在于,所述返航位置与所述预设目标的当前位置之间的距离不超过预设的距离阈值。
  48. 根据权利要求47所述的机器可读存储介质,其特征在于,所述控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:
    基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置。
  49. 根据权利要求48所述的机器可读存储介质,其特征在于,所述根据所述拍摄设备捕获的图像控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:
    基于所述预设目标在所述拍摄设备捕获的图像中的当前位置和当前大小,以及所述预设目标在图像中的预设位置和预设大小,控制所述无人机返航至返航位置。
  50. 根据权利要求48所述的机器可读存储介质，其特征在于，所述基于所述拍摄设备捕获的图像控制所述无人机返航至返航位置的过程中，所述计算机指令被执行时进行如下处理：
    基于所述预设目标在所述捕获的图像中的当前位置确定所述预设目标在特定坐标系中的坐标;
    基于所述预设目标在特定坐标系中的坐标和所述无人机的当前坐标的坐标差,以及所述预设目标和所述无人机的预设坐标差,控制所述无人机返航至返航位置。
  51. 根据权利要求47所述的机器可读存储介质,其特征在于,所述控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:
    基于TOF技术确定所述预设目标相对于所述无人机的当前距离;
    基于所述当前距离以及所述距离阈值，控制所述无人机返航至返航位置。
  52. 根据权利要求45所述的机器可读存储介质，其特征在于，所述控制所述无人机返航至返航位置的过程中，所述计算机指令被执行时进行如下处理：
    控制所述无人机从当前位置下降至与所述返航位置位于同一水平面;
    控制所述无人机沿着所述水平面飞行至返航位置。
  53. 根据权利要求45所述的机器可读存储介质,其特征在于,所述基于用户操作控制无人机起飞的过程中,所述计算机指令被执行时进行如下处理:
    基于用户操作触发所述拍摄设备捕获图像,得到目标图像;
    当所述目标图像满足预设条件时,控制无人机起飞。
  54. 根据权利要求53所述的机器可读存储介质,其特征在于,所述预设条件包括:所述目标图像中包括人脸;或,
    所述目标图像中包括人脸,且所述人脸与预设人脸图像的相似度达到预设相似度阈值。
  55. 根据权利要求45所述的机器可读存储介质,其特征在于,所述用户操作为所述用户对所述无人机的抛掷动作。
  56. 根据权利要求45所述的机器可读存储介质,其特征在于,所述基于用户操作控制无人机起飞的过程中,所述计算机指令被执行时进行如下处理:
    基于用户操作对无人机进行起桨解锁;
    当确定起桨解锁成功时,控制所述无人机起飞。
  57. 根据权利要求45所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    获取用户输入的位置参数,所述位置参数表示所述无人机的拍摄位置与所述预设目标之间的相对位置关系;
    基于所述位置参数与所述预设目标的当前位置确定特定位置。
  58. 根据权利要求45所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    获取构图规则;
    基于所述构图规则与所述预设目标的当前位置确定特定位置。
  59. 根据权利要求55所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    获取构图规则;
    基于构图规则确定所述无人机起飞后的飞行距离;
    基于所述无人机被抛掷时的方向确定所述无人机的飞行方向;
    基于所述飞行距离与飞行方向确定特定位置。
  60. 根据权利要求45所述的机器可读存储介质,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:
    触发所述拍摄设备对预设目标的拍照操作。
  61. 根据权利要求45所述的机器可读存储介质,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:
    触发所述拍摄设备对预设目标的摄像操作。
  62. 根据权利要求61所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    在触发所述拍摄设备对预设目标的摄像操作后,控制所述无人机沿特定轨迹飞行;
    当所述无人机飞行至所述特定轨迹的终点时,触发所述拍摄设备停止所述摄像操作。
  63. 根据权利要求62所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    获取用户输入的移动轨迹,所述移动轨迹为所述预设目标在所述拍摄设备的摄像画面中的移动轨迹;
    基于所述移动轨迹确定所述无人机的特定轨迹。
  64. 根据权利要求45所述的机器可读存储介质,其特征在于,所述触发所述拍摄设备对预设目标的拍摄操作的过程中,所述计算机指令被执行时进行如下处理:
    控制所述拍摄设备的构图;
    当所述预设目标在所述拍摄设备的拍摄画面中的成像满足预设的构图规则时,对所述预设目标进行拍摄操作。
  65. 根据权利要求48所述的机器可读存储介质,其特征在于,所述根据所述拍摄设备捕获的图像控制所述无人机返航至返航位置的过程中,所述计算机指令被执行时进行如下处理:
    获取外部设备发送的位置信号,所述外部设备为与所述无人机进行通信的设备,所述位置信号用于指示所述外部设备的当前位置;
    控制所述无人机返航至所述位置信号所指示的位置。
  66. 根据权利要求45所述的机器可读存储介质,其特征在于,所述计算机指令被执行时还进行如下处理:
    当检测到所述无人机悬停在所述返航位置上且检测到手掌位于所述无人机下方时,控制所述无人机降落到所述手掌上。
PCT/CN2017/085138 2017-05-19 2017-05-19 无人机的控制方法、无人机以及机器可读存储介质 WO2018209702A1 (zh)
