WO2018098678A1 - Aircraft control method, apparatus, and device, and aircraft


Info

Publication number: WO2018098678A1
Application number: PCT/CN2016/107997
Authority: WIPO (PCT)
Prior art keywords: aircraft, photographing, flight, shooting, subject
Other languages: English (en), French (fr)
Inventors: 周游, 封旭阳, 赵丛
Original assignee: 深圳市大疆创新科技有限公司
Application filed by 深圳市大疆创新科技有限公司

Priority and related applications:
- CN201910392486.0A (publication CN110119154A)
- CN201680002531.1A (publication CN107087427B)
- PCT/CN2016/107997 (publication WO2018098678A1)
- US16/426,182 (publication US11188101B2)
- US17/456,753 (publication US20220083078A1)

Classifications

    • G05D1/101: Simultaneous control of position or course in three dimensions, specially adapted for aircraft
    • G05D1/106: Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G05D1/0094: Control involving pointing a payload, e.g. a camera, towards a fixed or moving target
    • G05D1/0808: Control of attitude (roll, pitch, or yaw), specially adapted for aircraft
    • G05D1/102: Simultaneous control of position or course in three dimensions, specially adapted for vertical take-off of aircraft
    • G05D1/12: Target-seeking control
    • B64C39/024: Aircraft characterised by special use, of the remote-controlled vehicle (RPV) type
    • B64D47/08: Arrangements of cameras
    • G08G5/0013: Transmission of traffic-related information to or from an aircraft with a ground station
    • G08G5/0026: Arrangements for implementing traffic-related aircraft activities, located on the ground
    • G08G5/0034: Flight plan management; assembly of a flight plan
    • G08G5/0069: Navigation or guidance aids specially adapted for an unmanned aircraft
    • G08G5/0086: Surveillance aids for monitoring terrain
    • H04N23/61: Control of cameras or camera modules based on recognised objects
    • H04N23/611: Control based on recognised objects, where the recognised objects include parts of the human body
    • H04N23/64: Computer-aided capture of images, e.g. advice or proposal for image composition
    • H04N23/66: Remote control of cameras or camera parts
    • H04N23/67: Focus control based on electronic image sensor signals
    • H04N23/695: Control of camera direction for changing a field of view, e.g. pan, tilt or based on tracking of objects
    • H04N5/28: Mobile studios
    • B64U10/10: Rotorcrafts
    • B64U2101/30: UAVs specially adapted for imaging, photography or videography
    • B64U2201/10: UAV flight controls: autonomous, i.e. navigating independently from ground or air stations
    • B64U2201/20: UAV flight controls: remote controls

Definitions

  • Embodiments of the present invention relate to the field of control technologies, and in particular, to a method, an apparatus, and a device for controlling an aircraft, and an aircraft.
  • UAV (Unmanned Aerial Vehicle) technology has been applied in fields such as plant protection, aerial photography, and forest fire monitoring, and civilian use is also the future development trend of UAVs.
  • Embodiments of the present invention provide a method, an apparatus, and a device for controlling an aircraft, and an aircraft, which can improve the flexibility of capturing images with an aircraft and simplify the user's operation of the drone.
  • a method of controlling an aircraft includes: determining shooting information for a photographic subject, wherein the shooting information is used to indicate a range of the photographic subject in a to-be-captured picture; and controlling the aircraft to fly to a shooting position according to the shooting information.
  • a control device for an aircraft comprising: a determining module that determines shooting information for a photographic subject, wherein the shooting information is used to indicate a range of the photographic subject in a picture to be captured; and a control module that controls the aircraft to fly to a shooting position according to the shooting information.
  • a control device for an aircraft comprising: a processor and a memory, wherein the memory is for storing instructions to cause the processor to perform the method of the first aspect.
  • a fourth aspect provides an aircraft comprising: a sensing system for detecting motion parameters of the aircraft; the control device for the aircraft according to the third aspect; and one or more propulsion devices for providing flight power for the aircraft; wherein the control device is in communication with the one or more propulsion devices and with the sensing system, and is configured to control the operation of the one or more propulsion devices based on the motion parameters detected by the sensing system, thereby controlling the flight of the aircraft.
  • since the aircraft can be controlled to fly to a suitable shooting position according to the range the user desires the subject to occupy in the image to be captured, manual intervention during shooting is reduced, less flight time is occupied by manual operation, and the aircraft's endurance is in effect improved.
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system in accordance with an embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a method of controlling an aircraft according to an embodiment of the present invention.
  • FIG. 3 is a schematic flow chart of a method of controlling an aircraft according to another embodiment of the present invention.
  • FIG. 4 is a schematic flow chart of a method of controlling an aircraft according to still another embodiment of the present invention.
  • FIG. 5 is a schematic flow chart of a control method of an aircraft according to still another embodiment of the present invention.
  • Fig. 6 is a schematic structural view of a control device for an aircraft according to an embodiment of the present invention.
  • Fig. 7 is a schematic structural view of a control device for an aircraft according to another embodiment of the present invention.
  • Figure 8 is a schematic illustration of the structure of an aircraft in accordance with an embodiment of the present invention.
  • Embodiments of the present invention provide a method, apparatus, and apparatus for controlling an aircraft.
  • the following description of the invention uses a drone (UAV) as an example of an aircraft.
  • the UAV can be a small UAV.
  • the UAV may be a rotorcraft, such as a multi-rotor aircraft propelled through the air by a plurality of propulsion devices; embodiments of the invention are not limited thereto, and the UAV may also be another type of UAV or another mobile device.
  • FIG. 1 is a schematic architectural diagram of an unmanned flight system 100 in accordance with an embodiment of the present invention. This embodiment is described by taking a rotorcraft as an example.
  • the unmanned flight system 100 can include a UAV 110, a pan/tilt head 120, a display device 130, and a steering device 140.
  • the UAV 110 may include a power system 150, a flight control system 160, and a rack 170.
  • the UAV 110 can communicate wirelessly with the manipulation device 140 and the display device 130.
  • Rack 170 can include a fuselage and a stand (also known as a landing gear).
  • the fuselage may include a center frame and one or more arms coupled to the center frame, the one or more arms extending radially from the center frame.
  • the stand is attached to the fuselage for supporting the landing of the UAV 110.
  • the power system 150 may include an electronic governor (referred to as an ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, wherein each motor 152 is coupled to the electronic governor 151 and a propeller 153, and the motor 152 and the propeller 153 are disposed on the corresponding arm; the electronic governor 151 is configured to receive a driving signal generated by the flight controller 161 and provide a driving current to the motor 152 according to the driving signal, so as to control the rotational speed of the motor 152.
  • Motor 152 is used to drive propeller rotation to power the flight of UAV 110, which enables UAV 110 to achieve one or more degrees of freedom of motion.
  • the UAV 110 can be rotated about one or more axes of rotation.
  • the above-described rotation axes may include a roll axis, a yaw axis, and a pitch axis.
  • the motor 152 can be a DC motor or an AC motor.
  • the motor 152 may be a brushless motor or a brush motor.
  • Flight control system 160 may include flight controller 161 and sensing system 162.
  • the sensing system 162 is used to measure the attitude information of the UAV, that is, the position information and state information of the UAV 110 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional speed, three-dimensional acceleration, and three-dimensional angular velocity.
  • the sensing system 162 can include, for example, at least one of a gyroscope, an electronic compass, an inertial measurement unit (IMU), a vision sensor, a global navigation satellite system receiver, and a barometer.
  • the global navigation satellite system may be, for example, GPS (Global Positioning System).
  • the flight controller 161 is used to control the flight of the UAV 110, for example, the flight of the UAV 110 can be controlled based on the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the UAV 110 in accordance with pre-programmed program instructions, or may control the UAV 110 in response to one or more control commands from the steering device 140.
  • the pan/tilt 120 can include an ESC 121 and a motor 122.
  • the pan/tilt is used to carry the photographing device 123.
  • the flight controller 161 can control the motion of the platform 120 through the ESC 121 and the motor 122.
  • the platform 120 may further include a controller for controlling the movement of the platform 120 by controlling the ESC 121 and the motor 122.
  • the pan/tilt 120 may be independent of the UAV 110 or may be part of the UAV 110.
  • the motor 122 can be a DC motor or an AC motor.
  • the motor 122 may be a brushless motor or a brush motor.
  • the pan/tilt can be located at the top of the aircraft or at the bottom of the aircraft.
  • the photographing device 123 may be, for example, a device for capturing an image, such as a camera or a video camera; the photographing device 123 may communicate with the flight controller and perform photographing under the control of the flight controller.
  • Display device 130 is located at the ground end of unmanned flight system 100, can communicate with UAV 110 wirelessly, and can be used to display attitude information of UAV 110. In addition, an image taken by the photographing device can also be displayed on the display device 130. It should be understood that the display device 130 may be a stand-alone device or may be disposed in the manipulation device 140.
  • the handling device 140 is located at the ground end of the unmanned flight system 100 and can communicate with the UAV 110 wirelessly for remote manipulation of the UAV 110.
  • the manipulation device may be, for example, a remote controller or a user terminal equipped with an APP (application) that controls the UAV, for example, a smartphone or a tablet.
  • receiving the user's input through the manipulation device may refer to manipulating the UAV through an input device such as a thumb wheel, a button, a key, or a joystick on a remote controller, or through a user interface (UI) on the user terminal.
  • FIG. 2 is a schematic flow chart of a method of controlling an aircraft according to an embodiment of the present invention.
  • the control method of FIG. 2 can be performed, for example, by a control device or control apparatus, such as the flight controller of FIG. 1; embodiments of the present invention are not limited thereto, and the control method of FIG. 2 can also be implemented by other control devices or apparatus carried by the aircraft.
  • the control method of FIG. 2 includes the following content.
  • the photographing information may be the proportion, in the picture to be captured, of the image corresponding to the subject, or the range, in the picture to be captured, of the image corresponding to the subject.
  • the above shooting information may be, for example, a scene selected by the user.
  • the scene can be divided into three categories: large scene, medium scene and small scene according to the proportion or range of the subject in the shooting picture. Each scene can be further subdivided. The larger the scene, the smaller the proportion or range of the subject in the shot, and vice versa.
  • for portrait photography, the scene can be determined according to the proportion or range of the area of the subject in the picture, and is divided into a full-body image, a large half-length portrait, a half-length portrait, a bust, a head-and-shoulders portrait, and a large head close-up.
  • the photographing information may include at least one of a large scene, a medium scene, and a small scene; or the photographing information may include at least one of a full-body image, a large half-length portrait, a half-length portrait, a bust, a head-and-shoulders portrait, and a large head close-up.
  • the correspondence between different scenes and the proportion or range of the subject in the shooting picture may be preset, and the proportion or range of the subject in the picture to be captured is then determined according to the scene selected by the user.
  • alternatively, the proportion or range may be determined according to the proportion or range, in the image to be captured, of the photographic subject input on the user interface; for example, a box drawn by the user on the touch screen may indicate the range of the subject in the picture to be taken.
  • the subject, also called the shooting target or the photographed object, can be either the user who manipulates the aircraft or a person or thing other than the user.
  • there is a correspondence between the proportion or range of the subject in the image to be captured and the distance between the photographing device and the subject (hereinafter referred to as the shooting distance): the proportion or range is inversely related to the shooting distance, that is, the larger the proportion or range of the subject in the picture to be taken, the closer the photographing device needs to be to the subject, and vice versa.
  • the aircraft controller may estimate the shooting position according to the proportion or range of the subject in the picture to be taken and directly control the aircraft to fly to that position, or the aircraft controller may dynamically adjust the target position of the aircraft so that the proportion or range of the subject in the picture to be captured tends to coincide with the range indicated by the shooting information, thereby ultimately controlling the aircraft to fly to a suitable shooting position.
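The inverse relation between the subject's share of the frame and the shooting distance can be sketched with a simple pinhole-camera model. This is an illustration only, not an implementation from the patent: the function name `shooting_distance` and the default focal-length and sensor-height values are assumptions.

```python
def shooting_distance(subject_height_m, frame_fraction,
                      focal_length_mm=24.0, sensor_height_mm=8.8):
    """Estimate camera-to-subject distance for a desired frame fraction.

    Pinhole model: projected_height = focal * subject_height / distance,
    and frame_fraction = projected_height / sensor_height, so
    distance = focal * subject_height / (frame_fraction * sensor_height).
    """
    if not 0 < frame_fraction <= 1:
        raise ValueError("frame_fraction must be in (0, 1]")
    focal_m = focal_length_mm / 1000.0
    sensor_m = sensor_height_mm / 1000.0
    return focal_m * subject_height_m / (frame_fraction * sensor_m)
```

With these assumed optics, a 1.7 m person filling half the frame height puts the camera roughly 9 m away; halving the desired fraction doubles the distance, matching the inverse relation stated above.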
  • since the aircraft can be controlled to fly to a suitable shooting position according to the range the user desires the subject to occupy in the image to be captured, manual intervention during shooting is reduced, the user experience is improved, and less flight time is occupied by manual operation, which improves the aircraft's endurance.
  • the photographing information for the subject can be determined in the following manner:
  • the shooting information is determined based on input received from an external device.
  • a user interface element such as a button, a text box, or a selection box for inputting shooting information may be provided on the user interface, so that the user can select or input the shooting information through the user interface of an external device (for example, a user terminal or a remote controller).
  • the aircraft controller can receive the shooting information through its communication interface with the external device; in this way, the user can input the photographing information accurately, so that an image of the size desired by the user can be captured accurately.
  • determining the shooting information by detecting the speed and acceleration when the aircraft is thrown means that the shooting information can be determined according to how forcefully the user throws the aircraft. For example, the greater the force, the greater the speed and acceleration at launch, indicating that the user wants the aircraft to fly farther, that is, the larger the scene the user wants to shoot, or the smaller the proportion or range of the subject in the picture to be taken, and vice versa.
  • similarly, the greater the angle between the throwing trajectory and the horizontal plane, the farther the user wants the aircraft to fly, that is, the larger the scene the user wants to capture, and vice versa. Since the shooting information can be determined from the user's throwing motion, it does not need to be set manually through an external device, which improves the user experience, further reduces the occupation of flight time, and improves the endurance of the aircraft.
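The mapping from throw vigor to scene size described above could be sketched as a simple threshold rule. This is a hypothetical illustration: the function `scene_from_throw`, its score formula, and its thresholds are assumptions, not values from the patent; a real system would calibrate them from flight tests.

```python
def scene_from_throw(speed_mps, elevation_deg):
    """Map throw vigor to a scene size.

    A faster or steeper throw yields a larger scene, i.e. the subject
    occupies a smaller share of the frame. The score combines launch
    speed with a bonus for throw elevation (assumed weighting).
    """
    score = speed_mps * (1.0 + elevation_deg / 90.0)
    if score < 3.0:
        return "small"
    if score < 7.0:
        return "medium"
    return "large"
```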
  • the definition of the photographing information and the manner of determining the photographing information are not limited to the above description as long as different photographing information can be distinguished.
  • control method of FIG. 2 further includes: acquiring an image of the photographic subject, and determining the photographic subject according to the image.
  • the photographing device may be controlled to acquire a feature image of the photographic subject, for example, when the photographic subject is an animal or a person, the feature image is a facial feature image.
  • the above image can be used to search for, identify, and track the subject, for example, by comparing the image currently obtained by the photographing device with the image described above; if the two match, the subject has been found, identified, and tracked.
  • an image of a subject can be acquired by one of the following methods:
  • before the aircraft takes off, the photographing device is controlled to photograph the subject and acquire an image of it; for example, before the aircraft flies, the user can point the photographing device at the subject and take a photograph to obtain its feature image.
  • alternatively, after the aircraft takes off, the photographing device is controlled to photograph the subject and acquire an image of it; for example, after taking off, the aircraft can be controlled to turn its nose toward the subject and capture a feature image.
  • an image of the photographic subject may also be acquired by receiving it from an external device; for example, the user may transmit a feature image of the subject saved on the user terminal to the flight controller through a communication interface between the user terminal and the flight controller.
  • control method of FIG. 2 further includes: determining a flight trajectory of the aircraft when the photographic subject is photographed, and controlling the photographing device to photograph the photographic subject when the aircraft flies according to the flight trajectory.
  • the user can directly throw the aircraft, and the aircraft can recognize the throwing action and select a suitable flight path. Since the user can indicate the drone's flight path with a simple action, the user experience is improved, the occupation of flight time is further reduced, and the aircraft's endurance is improved.
  • the flight trajectory of the aircraft may be determined based on input received from an external device.
  • a user interface element such as a button, a text box, or a selection box for inputting flight trajectory information may be provided on the user interface, so that the user can input or select a flight trajectory.
  • the motion of the aircraft may be detected by the motion sensor of the aircraft, the first motion data output by the motion sensor is acquired, and the flight trajectory is determined according to the first motion data.
  • the first motion data may be, for example, one or more of a position, a speed, an angular acceleration, and an acceleration of the aircraft as a function of time.
  • the aircraft controller may acquire second motion data output by the motion sensor of the external device detecting the motion of the external device, and determine the flight trajectory according to the second motion data.
  • the second motion data may be, for example, one or more of a position, a speed, an angular acceleration, and an acceleration of the user terminal as a function of time.
  • the external device can be a user terminal; before the aircraft flies, the user can hold the user terminal and perform a specific action, the motion sensor carried on the user terminal can detect the motion of the user terminal and output motion data to the flight controller, and the flight controller determines the flight trajectory according to the motion data. For example, if the motion of the user terminal is a surround motion, the aircraft determines that the flight trajectory is a surround flight.
  • the above sensor may include at least one of a gyroscope, an electronic compass, an inertial measurement unit, an accelerometer, a global navigation satellite system, and a vision sensor.
  • the above motion includes at least one of the following motions: a surround motion, a pull-away (zoom-out) motion, a pull-in (zoom-in) motion, and an S-shaped motion.
  • the above motion may include one of motion in a vertical plane and motion in a horizontal plane.
  • the wraparound motion can be a motion in a vertical plane or a motion in a horizontal plane. It should be understood that the above motions are merely examples, and other forms of motion may be employed to represent the flight trajectory.
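One rough way to classify recorded motion data into a trajectory type is to look at the signed turning angle along the trace. This sketch is not from the patent: the function name `classify_motion`, the restriction to a 2-D position trace, and the angle thresholds are all assumptions for illustration.

```python
import math

def classify_motion(points):
    """Classify a 2-D motion trace as 'surround', 's_shape', or 'pull'.

    Computes the signed turn between successive segments; a large total
    turn means a loop (surround), alternating turn signs suggest an
    S-shape, and a nearly straight trace is treated as a pull motion.
    """
    turns = []
    for i in range(len(points) - 2):
        (x0, y0), (x1, y1), (x2, y2) = points[i], points[i + 1], points[i + 2]
        a1 = math.atan2(y1 - y0, x1 - x0)
        a2 = math.atan2(y2 - y1, x2 - x1)
        # wrap the heading change into (-pi, pi]
        turns.append((a2 - a1 + math.pi) % (2 * math.pi) - math.pi)
    if abs(sum(turns)) > 1.5 * math.pi:      # nearly a full loop
        return "surround"
    if any(t > 0.2 for t in turns) and any(t < -0.2 for t in turns):
        return "s_shape"
    return "pull"                            # roughly straight
```

For example, sixteen points sampled around a circle classify as a surround, a straight line of points as a pull, and a zigzag as an S-shape.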
  • the control method of FIG. 2 further includes: detecting rotation of the pitch axis of the aircraft's pan/tilt before the aircraft flies; and determining, according to the detected pitch-axis rotation and surround motion, that the flight trajectory is one of a spiral ascent and a spiral descent.
  • the control method of FIG. 2, before determining the flight trajectory of the aircraft, further includes: determining whether a signal to activate determination of the flight trajectory is received, the signal being used to activate the process of determining the flight trajectory of the aircraft.
  • if a flight trajectory is not input within a preset time, it is determined that the flight trajectory is a follow flight.
  • following flight means following a moving target flight.
  • the following may be GPS follow, i.e., using GPS positioning technology to achieve follow flight, or visual follow, i.e., using visual sensors and image recognition to achieve follow flight.
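A GPS follow could be sketched, under strong simplifying assumptions (flat 2-D local coordinates in metres, a purely proportional controller, and a hypothetical gain and speed limit not taken from the patent), as:

```python
def follow_velocity(drone_pos, target_pos, offset, gain=0.8, v_max=5.0):
    """Proportional follow controller.

    Steers the drone toward target_pos + offset (the desired standoff
    from the moving subject), scaling the position error by `gain` and
    clamping the commanded speed to `v_max`.
    """
    goal = (target_pos[0] + offset[0], target_pos[1] + offset[1])
    ex, ey = goal[0] - drone_pos[0], goal[1] - drone_pos[1]
    vx, vy = gain * ex, gain * ey
    speed = (vx * vx + vy * vy) ** 0.5
    if speed > v_max:
        vx, vy = vx * v_max / speed, vy * v_max / speed
    return vx, vy
```

Calling this once per control cycle with the latest GPS fixes keeps the drone converging on a fixed standoff behind the subject; the clamp prevents aggressive commands when the target jumps.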
  • the flight trajectory may include at least one of a surround, a pull-away (zoom-out), a pull-in (zoom-in), and an S shape.
  • control method of FIG. 2 further includes: after the aircraft flies to the shooting position, controlling the photographing device carried by the aircraft to photograph the photographing object.
  • control method of FIG. 2 further includes: receiving a preset composition rule from an external device; or determining a composition rule by identifying a preset action or gesture of the photographic subject.
  • the composition rule includes one or more of a position of the subject in the photographing screen, an angle of the subject's face in the photographing screen, and a completeness of the subject's face in the photograph.
  • the composition rule includes one of the following composition rules: balanced composition, symmetric composition, diagonal composition, triangular composition, nine-square-grid composition, centripetal composition, bipartite composition, shooting so that the face in the picture is a frontal face, and shooting so that the face in the picture is a profile.
  • controlling the photographing device carried by the aircraft to photograph the photographing object includes: controlling the composition of the photographing device so that the imaging of the photographing object in the photographing screen satisfies a preset composition rule; and when the imaging of the photographing object in the photographing screen satisfies the preset composition rule, photographing the photographing object.
  • controlling the composition of the photographing device so that the imaging of the photographing object in the photographing screen satisfies the preset composition rule includes: adjusting at least one of the flight attitude of the aircraft, the motion of the pan/tilt carrying the photographing device, and the focal length of the photographing device, so that the position of the photographing object in the photographing screen satisfies the preset composition rule.
  • the image currently presented in the photographing screen may be acquired from the photographing device, and the position occupied by the photographic subject in the photographing screen may be determined by image recognition, so as to determine whether that position satisfies the composition rule.
  • for example, if the user selects a nine-square (rule-of-thirds) grid, the subject can be imaged at one of the four intersections of the nine squares.
  • the nine-square lattice pattern can be further subdivided into four modes corresponding to the four intersection points, so that the user can further select at which intersection point the subject is imaged.
  • the composition is then adjusted accordingly, so that the center of the subject finally coincides with the selected intersection of the nine squares.
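As a minimal illustration of the nine-square adjustment described above, the following sketch computes the four grid intersections of a frame and the pixel offset needed to move the subject's center onto a chosen intersection. The function names, frame size, and subject position are hypothetical, not part of the patent:

```python
# Hypothetical sketch: four rule-of-thirds intersections and the offset
# from the subject's center to a chosen intersection (pixel coordinates).
def thirds_intersections(width, height):
    xs = (width / 3, 2 * width / 3)
    ys = (height / 3, 2 * height / 3)
    return [(x, y) for y in ys for x in xs]

def composition_offset(subject_center, target):
    # Positive dx means the subject should move right in the frame.
    return (target[0] - subject_center[0], target[1] - subject_center[1])

points = thirds_intersections(1920, 1080)        # four candidate intersections
dx, dy = composition_offset((960, 540), points[0])
```

The four "modes" mentioned above would correspond to choosing one of the four entries of `points`.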
  • controlling the photographing device carried by the aircraft to photograph the photographing object includes: controlling the photographing device to adjust its focal length according to the principle of depth of field, and photographing the photographing object using the adjusted focal length.
  • the focal length can be adjusted according to the principle of depth of field, as shown in equations (1), (2) and (3):

  ΔL1 = FδL² / (f² + FδL)    (1)
  ΔL2 = FδL² / (f² − FδL)    (2)
  ΔL = ΔL1 + ΔL2 = 2f²FδL² / (f⁴ − F²δ²L²)    (3)

  • δ is the allowable circle-of-confusion diameter
  • f is the lens focal length
  • F is the lens aperture value
  • L is the focus distance
  • ΔL1 is the foreground depth of field
  • ΔL2 is the back depth of field
  • ΔL is the total depth of field.
  • as equations (1) and (2) show, the foreground depth is shallower than the back depth of field, so the lens should be focused on the first 1/3 of the scene depth; one third is an empirical value. For example, when taking a group photo of five people in two rows, focusing on a person in the middle of the second row uses the foreground depth and back depth more effectively and produces a clear group photo.
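The depth-of-field relations above can be checked numerically. The sketch below implements the classical equations (1)-(3) with the symbols used here (`delta` for the circle-of-confusion diameter, `f`, `F`, `L`); the example lens values are illustrative assumptions, not values from the patent:

```python
# Sketch of equations (1)-(3): foreground depth, back depth, and total
# depth of field; all lengths in millimetres. delta is the allowable
# circle-of-confusion diameter, f the focal length, F the aperture
# value, L the focus distance.
def depth_of_field(f, F, delta, L):
    dl1 = F * delta * L * L / (f * f + F * delta * L)   # eq. (1), foreground
    dl2 = F * delta * L * L / (f * f - F * delta * L)   # eq. (2), back
    return dl1, dl2, dl1 + dl2                          # eq. (3), total

# Illustrative values: 50 mm lens at f/2.8, delta = 0.03 mm, focus at 3 m.
dl1, dl2, total = depth_of_field(50.0, 2.8, 0.03, 3000.0)
```

Since `dl1 < dl2` for any valid input, the foreground depth is indeed the shallower of the two, which is what motivates the "focus on the first 1/3" heuristic.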
  • control method of FIG. 2 further includes: detecting environmental condition information and/or posture information of the photographic subject, and adjusting the shooting angle according to the environmental condition information and/or the posture information of the photographic subject.
  • the environmental condition information may be, for example, information indicating backlighting, weather conditions, light and darkness, and the like.
  • the posture information of the human body may be, for example, information indicating a posture such as turning of the head, standing, or sitting.
  • the specific shooting angles may include, for example, a level shot, a side shot, an overhead shot, and the like.
  • the shooting angle can be adjusted so that the frontal photograph of the subject can be photographed.
  • the above functions may be set or selected by the user through a user interface of the external device (eg, a user interface on the user terminal) prior to flight of the aircraft.
  • the shooting angle can be adaptively adjusted according to the environmental condition information and/or the posture information of the photographic subject, making the shooting process intelligent, reducing manual interference during shooting, improving the user experience, reducing the occupation of endurance time by manual operation of the aircraft, and thereby improving the endurance of the aircraft.
  • control method of FIG. 2 further includes: automatically starting the aircraft when the aircraft meets a preset automatic start condition.
  • Automatically starting the aircraft means that, when the preset automatic starting conditions are met, the starting circuit of the aircraft is directly turned on and the power unit of the aircraft is controlled to start working, without manually starting the aircraft by buttons or switches. Since the aircraft can be automatically activated according to preset conditions, the start of the aircraft can be combined with the motion used to set the flight trajectory or shooting information before flight, so that the whole shooting process proceeds in one go, improving the user experience, reducing the occupation of battery life by manual operation, and improving the endurance of the aircraft.
  • the aircraft can be automatically activated as follows:
  • the third motion data may include a distance by which the aircraft is thrown; in this case, the third motion data satisfying the automatic start condition includes: the distance by which the aircraft is thrown being greater than or equal to a first predetermined threshold.
  • the first predetermined threshold may be zero or a safe distance that does not cause damage to the user. When the distance between the aircraft and the user is a safe distance, the aircraft can be started to avoid harm to the user.
  • the third motion data may include a vertical speed or a speed of the aircraft; in this case, the third motion data satisfying the automatic start condition includes: the vertical speed or the speed of the aircraft being less than or equal to a second preset threshold.
  • the second predetermined threshold may be equal to zero or another value close to zero. Since the aircraft is started only after the vertical speed or speed has fallen to the preset threshold or below, the flight at start-up is more stable.
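The two threshold checks just described can be sketched as follows; the concrete threshold values (a 1 m safe throw distance and a 0.1 m/s near-zero vertical speed) are hypothetical assumptions:

```python
# Hypothetical auto-start check: the aircraft must have been thrown a
# safe distance AND its vertical speed must have fallen to near zero.
def auto_start_ok(throw_distance_m, vertical_speed_mps,
                  first_threshold_m=1.0, second_threshold_mps=0.1):
    far_enough = throw_distance_m >= first_threshold_m
    slow_enough = abs(vertical_speed_mps) <= second_threshold_mps
    return far_enough and slow_enough

ok = auto_start_ok(1.5, 0.05)   # thrown 1.5 m, nearly at the apex of the toss
```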
  • before the aircraft is thrown, when the aircraft meets a preset idle condition, the power unit is activated and controlled to rotate at an idle speed.
  • for example, the aircraft can control the power unit to rotate at an idle speed after the aircraft is unlocked by face recognition.
  • by using face unlocking as the preset idle condition, a false start of the aircraft can be avoided.
  • automatic activation can be combined with face unlocking and confirmation of the subject, making the entire shooting process smoother and improving the user experience.
  • the power unit may be controlled to rotate at an idle speed after the aircraft has been placed horizontally for more than a preset length of time.
  • for example, the user may place the aircraft horizontally after setting the flight trajectory (for example, horizontally in the palm); the aircraft determines that it is in a horizontal state (for example, the attitude angle is zero) according to the attitude information detected by the sensor, and after the horizontal state has lasted longer than the preset length of time, the aircraft is automatically started and the power unit is controlled to rotate at an idle speed.
  • the aircraft may also control the flight of the aircraft to the shooting position according to the shooting information after the power unit is idling for a preset time. Since the automatic start-up and flight path confirmation process can be combined, the entire shooting process is smoother and the user experience is improved.
  • the power unit may be controlled to rotate idly upon confirmation of receipt of a signal permitting idle rotation.
  • the signal permitting idle rotation may be generated by the aircraft itself or transmitted by an external device to control the idle rotation of the aircraft; embodiments of the present invention may use such signals in conjunction with automatic activation of the aircraft to increase the safety of the aircraft's automatic start.
  • the fourth motion data indicates a duration in which the attitude angle of the aircraft is within a preset threshold range, and the fourth motion data satisfying the automatic start condition may include the duration exceeding a second preset threshold.
  • for example, the user may place the aircraft horizontally after setting the flight trajectory (for example, horizontally in the palm); the aircraft determines that it is in a horizontal state (for example, the attitude angle is zero) according to the attitude information detected by the sensor, and after the duration of that state exceeds the preset threshold, the aircraft is automatically started.
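The "held level long enough" condition can be sketched as follows; the sampling period, threshold band, and reset-on-tilt behavior are illustrative assumptions, not the patent's implementation:

```python
# Hypothetical sketch of the fourth-motion-data condition: accumulate the
# time the attitude angle stays inside a +/- band, resetting on any tilt,
# and start when the accumulated duration exceeds the preset threshold.
def level_duration_s(attitude_angles_deg, sample_period_s, band_deg=2.0):
    duration = 0.0
    for angle in attitude_angles_deg:
        if abs(angle) <= band_deg:
            duration += sample_period_s
        else:
            duration = 0.0          # aircraft tilted: restart the count
    return duration

samples = [0.5, 1.0, 0.8, 0.3, 0.2]          # five samples at 0.5 s each
ready = level_duration_s(samples, 0.5) >= 2.0
```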
  • controlling the aircraft to fly to the photographing position according to the photographing information includes: searching for and recognizing the determined photographing object by the photographing device of the aircraft; after the photographing object is found and recognized, detecting whether the range of the photographing object in the photographing screen is consistent with the range indicated by the photographing information; and when the range of the photographing object in the photographing screen coincides with the range indicated by the photographing information, determining the position where the aircraft is located as the photographing position.
  • if the proportion of the photographing object in the current photographing screen is larger than the ratio indicated by the photographing information, the aircraft is adjusted away from the photographing object; if the proportion is smaller than the ratio indicated by the photographing information, the aircraft is adjusted closer to the photographing object.
  • the above adjustments can be made in fixed steps or in variable steps. When it is determined that the proportion coincides with the ratio indicated in the photographing information, the current position can be taken as the final photographing position.
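The fixed-step or variable-step range adjustment can be sketched as a simple proportional rule; the gain, maximum step, and tolerance below are hypothetical values chosen for illustration:

```python
# Hypothetical variable-step adjustment: back away when the subject fills
# too much of the frame, approach when it fills too little; the step
# shrinks as the measured ratio nears the ratio in the shooting information.
def distance_step_m(measured_ratio, target_ratio,
                    max_step_m=1.0, gain=5.0, tolerance=0.02):
    error = measured_ratio - target_ratio
    if abs(error) <= tolerance:
        return 0.0                        # ranges coincide: position found
    step = min(max_step_m, abs(error) * gain)
    return step if error > 0 else -step   # positive = fly away from subject

step = distance_step_m(0.6, 0.3)   # subject too large in frame: back away
```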
  • the flight control method of FIG. 2 further includes: determining a flight direction after the aircraft takes off, and controlling the aircraft to fly to the shooting position according to the flight direction and the shooting information. For example, when the above-described adjustment aircraft is moved away from or close to the subject, the aircraft can be adjusted to move away from or close to the subject in the flight direction.
  • the aircraft may be controlled to fly to the shooting position according to the shooting information and the shooting parameters of the shooting device.
  • the shooting parameter may be at least one of a Field of View (FOV) parameter and a focus parameter.
  • the longer the focal length, the larger the step size used to adjust the aircraft away from or toward the subject, and vice versa.
  • searching for and recognizing the determined photographing object by the photographing device of the aircraft specifically includes: when it is determined that there is no obstacle in front of the aircraft, controlling the nose of the aircraft or the pan/tilt of the aircraft to turn so that the photographing device faces the take-off position, and using the photographing device to search for and recognize the determined photographing object.
  • for example, the aircraft is controlled to turn to the direction opposite to the initial flight direction, so that the lens of the photographing device faces the subject, and the subject is searched for and recognized.
  • the tracking algorithm is used to lock the face of the subject and confirm the subject, and the entire body of the subject is then searched for using a human body detector (Human Detector) to determine the whole body of the subject.
  • searching for and recognizing the determined photographing object by the photographing device of the aircraft specifically includes: when it is determined that there is an obstacle in front of the aircraft, controlling the aircraft to bypass the obstacle, controlling the nose of the aircraft or the pan/tilt of the aircraft to turn so that the photographing device faces the take-off position, and using the photographing device to search for and recognize the determined photographing object.
  • the position and height of the throw point can be obtained by a position sensor, such as a GPS or a visual sensor, and recorded. In this case, a path that bypasses the obstacle can be planned; if the obstacle cannot be bypassed, the aircraft can try to raise its height to avoid the obstacle.
  • the nose can always be oriented in the forward direction during flight to ensure flight safety.
  • controlling the flight of the aircraft to the photographing position according to the photographing information includes: determining a photographing position of the aircraft with respect to the photographing object according to the photographing information, and controlling the aircraft to fly to the photographing position.
  • control method of FIG. 2 further includes: determining a preset composition rule that the photographic subject needs to satisfy in the photographic image; wherein controlling the aircraft to fly to the shooting position according to the shooting information includes: controlling the aircraft to fly to the shooting position according to the preset composition rule and the shooting information.
  • the flight control method of FIG. 2 further includes: determining a flight direction after the aircraft takes off, wherein determining the shooting position of the aircraft relative to the photographic object according to the shooting information includes: determining the flight distance after the aircraft takes off according to the shooting information, and determining the shooting position according to the flight direction and the flight distance.
  • the flight distance may be a horizontal distance from the subject to the shooting position, and the flight direction together with the flight distance determines the height of the shooting position; therefore, the height of the shooting position can be determined according to the flight direction and the flight distance.
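One hedged way to read the relation above: with the flight direction expressed as an elevation angle, the height of the shooting position follows from the horizontal flight distance. The function name and example angles below are illustrative assumptions:

```python
# Hedged sketch: treat the flight direction as an elevation angle, so the
# shooting-position height follows from the horizontal flight distance.
import math

def shooting_height_m(takeoff_height_m, elevation_deg, horizontal_distance_m):
    return takeoff_height_m + horizontal_distance_m * math.tan(
        math.radians(elevation_deg))

h = shooting_height_m(1.5, 30.0, 5.0)   # climb toward the upper left at 30 deg
```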
  • the embodiment of the present invention does not limit the manner in which the flight direction is determined.
  • one of the following various manners may be used to determine the flight direction after the aircraft takes off:
  • the flight direction may be set, before the aircraft takes off, to fly to the upper left, to the upper right, or straight ahead.
  • the flight direction is determined according to the direction of the nose of the aircraft when the aircraft takes off. For example, if the aircraft is thrown to the upper left or upper right, the flight direction is determined to be upper left or upper right.
  • determining the direction of flight based on the position at which the aircraft takes off; for example, if the position of the aircraft at take-off is low, the flight direction points in a lower direction, and if the position of the aircraft at take-off is higher, the flight direction points in a higher direction.
  • the flight direction is determined according to the position of the photographic subject, for example, if the photographic subject is located on a moving object (for example, a motor vehicle), the flight direction is directed to the direction in which the subject moves.
  • the flight direction is determined according to the orientation of the photographic subject, for example, if the aircraft determines that the photographic subject is oriented to the upper left according to the detected posture of the photographic subject, it is determined that the flight direction is the upper left.
  • the flight direction is determined according to the selected shooting angle. For example, the user can determine the shooting angle before the aircraft takes off, and the flight direction can be determined according to the shooting angle.
  • the flight control method of FIG. 2 further includes: determining a shooting parameter of the photographing device for photographing the photographing object, wherein determining a photographing position of the aircraft relative to the photographing object according to the photographing information comprises: according to photographing The information determines the flight distance after the aircraft takes off; the shooting position is determined according to the flight distance and the shooting parameters of the photographing device.
  • the shooting parameter may be at least one of a Field of View (FOV) parameter and a focal length parameter; for the same scene, different FOV parameters or focal length parameters lead to different determined shooting positions.
  • the focal length parameter can be a focal length, and the field of view parameter can be a field of view angle. For the same scene demand, the longer the focal length, the larger the required shooting distance; the larger the angle of view, the smaller the required shooting distance.
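The focal-length and field-of-view trade-offs above follow from a pinhole-camera model, sketched below; the sensor width and scene values are illustrative assumptions, not values from the patent:

```python
# Pinhole-camera sketch of the stated trade-offs: a longer focal length
# requires a larger shooting distance for the same scene width, and a
# larger field of view requires a smaller one.
import math

def distance_for_scene_m(scene_width_m, focal_length_mm, sensor_width_mm=36.0):
    # Pinhole model: scene_width / distance == sensor_width / focal_length
    return scene_width_m * focal_length_mm / sensor_width_mm

def distance_from_fov_m(scene_width_m, fov_deg):
    return scene_width_m / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

near = distance_for_scene_m(4.0, 24.0)   # wide lens: shoot from closer
far = distance_for_scene_m(4.0, 50.0)    # longer lens: shoot from farther
```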
  • the shooting position may also be determined according to the flight distance, the shooting parameters of the photographing apparatus, and the flight direction.
  • determining the shooting position of the aircraft relative to the photographic subjects according to the shooting information includes determining the shooting position of the aircraft according to the number of photographic subjects and the shooting information. For example, the larger the number of photographed subjects, the farther the shooting position is from the subjects.
  • FIG. 3 is a schematic flow chart of a method of controlling an aircraft according to another embodiment of the present invention.
  • the control method of FIG. 3 is an example of the method of FIG. 2.
  • the control method of FIG. 3 includes the following.
  • composition rules entered by the user can be received before the aircraft takes off.
  • the composition rule may also be determined according to the user's gesture after the aircraft takes off.
  • the composition rule may include one or more of a position of the subject in the photographing screen, an angle of the subject's face in the photographing screen, and a completeness of the subject's face in the photograph.
  • when the composition rule includes the position of the photographic subject in the photographic picture, the composition rule may include, for example, one of the following: balanced composition, symmetric composition, diagonal composition, triangular composition, nine-square lattice composition, centripetal composition, and bipartite composition.
  • when the composition rule includes the angle of the face of the subject in the photographing screen, the composition rule may be, for example, that the face in the photographing screen is a front face or that the face in the photographing screen is a side face.
  • when the composition rule includes the completeness of the face of the subject in the picture, the composition rule may be, for example, a partial image of the face or a complete image of the face.
  • embodiments of the invention do not limit the order in which 310 and 315 are performed; both can be executed at the same time, or 315 can be executed before 310.
  • the aircraft controller can estimate the distance between the photographic subject and the shooting position (hereinafter referred to as the shooting distance) based on the relationship between the range the photographic subject occupies in the shooting screen and the shooting distance.
  • the shooting distance can be calculated using a preset algorithm according to the range of the subject desired by the user in the image to be captured.
  • the relationship between the range of the subject in the shooting screen and the shooting distance may be set in advance, and the shooting distance may be determined by table lookup according to the range of the subject desired by the user in the image to be captured.
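A table lookup of this kind might look like the following sketch; the range names and distances are invented for illustration and are not values from the patent:

```python
# Hypothetical pre-set table mapping the range indicated by the shooting
# information to a shooting distance (metres).
RANGE_TO_DISTANCE_M = {
    "large_avatar": 1.5,
    "bust": 3.0,
    "full_body": 6.0,
    "large_scene": 15.0,
}

def lookup_shooting_distance(range_name):
    if range_name not in RANGE_TO_DISTANCE_M:
        raise ValueError(f"no preset distance for range {range_name!r}")
    return RANGE_TO_DISTANCE_M[range_name]

d = lookup_shooting_distance("full_body")
```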
  • the shooting position may be positioned according to the shooting distance, and the flying to the shooting position or the obstacle avoidance flight to the shooting position may be performed by means of pointing flight.
  • the aircraft may automatically initiate flight in a variety of manners as described in the embodiment of FIG. 2.
  • an image transmitted by a photographing device or other visual sensor carried on the aircraft may be received in real time, and a feature image of the preset photographing object is searched for and recognized in the received image, and the photographed object is tracked.
  • the photographing device can be controlled to perform intelligent composition. If the photo is taken of a single person, the classic nine-square grid pattern can be used, and the position and orientation of the aircraft can be adjusted in real time according to the face recognition algorithm and the result of the tracking algorithm, so that the front face of the target is always captured.
  • the composition of the photographing apparatus may be controlled by adjusting at least one of the flight attitude of the aircraft, the motion of the pan/tilt carrying the photographing apparatus, and the focal length of the photographing apparatus, so that the position of the photographing object in the photographing screen satisfies the preset composition rule.
  • the flight attitude of the aircraft can be adjusted by controlling the rotational speed of the propellers, so that the aircraft produces attitude changes such as roll, pan, and tilt; the movement of the gimbal can be adjusted by controlling the rotation mechanism, the translation mechanism, and the tilt mechanism of the gimbal.
  • the above adjustment and control will cause the photographing device to move with respect to the subject with the aircraft or the pan/tilt, thereby enabling adjustment of the composition of the subject in the photographing screen.
  • the focal length of the shooting device can be adjusted during shooting to get a clear composition.
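One hedged way to split a composition error between the aircraft's yaw and the gimbal's pitch is to convert the subject's pixel offset into angles via the camera's field of view; the FOV values and function name below are assumptions for illustration:

```python
# Hedged sketch: convert the subject's pixel offset from its desired frame
# position into yaw (aircraft/pan) and pitch (gimbal) corrections using
# the camera's horizontal and vertical fields of view (assumed values).
def composition_correction_deg(pixel_dx, pixel_dy, frame_w, frame_h,
                               hfov_deg=80.0, vfov_deg=50.0):
    yaw_deg = pixel_dx / frame_w * hfov_deg     # pan/yaw toward the subject
    pitch_deg = pixel_dy / frame_h * vfov_deg   # gimbal pitch correction
    return yaw_deg, pitch_deg

yaw, pitch = composition_correction_deg(192, -108, 1920, 1080)
```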
  • an imaging instruction is output to the photographing apparatus, instructing the photographing apparatus to photograph the subject.
  • the shooting position of the aircraft relative to the photographic subject can be estimated according to the range of the photographic subject desired by the user in the image to be captured, the aircraft can be controlled to fly to the shooting position, and shooting can be performed after intelligent composition according to the preset composition rule, reducing human interference with the aircraft during the shooting process, improving the user experience, reducing the occupation of the aircraft's battery life, and effectively improving the aircraft's endurance.
  • FIG. 4 is a schematic flow chart of a method of controlling an aircraft according to still another embodiment of the present invention.
  • the control method of FIG. 4 is an example of the method of FIG. 2.
  • the control method of FIG. 4 includes the following.
  • the photographing device may be controlled to take a picture of the subject before flight to acquire a feature image (for example, a face feature image) of the subject.
  • an image feature of a subject may also be acquired from an external device (eg, a user terminal).
  • an image transmitted by a photographing device or other visual sensor carried on the aircraft may be received in real time after the aircraft takes off, and a feature image of the preset photographed object is searched for and recognized in the received image, and the photographed object is tracked. .
  • the aircraft may automatically initiate flight in a variety of manners as described in the embodiment of FIG. 2.
  • the proportion of the subject in the current photographing screen can be obtained by image recognition, and it is determined whether the ratio coincides with the ratio indicated in the photographing information.
  • if the proportion of the photographing object in the current photographing screen is larger than the ratio indicated by the photographing information, the aircraft is adjusted away from the photographing object; if the proportion is smaller than the ratio indicated by the photographing information, the aircraft is adjusted closer to the photographing object.
  • the above adjustments can be made in fixed steps or in variable steps; when it is determined that the proportion coincides with the ratio indicated in the photographing information, the current position can be taken as the final photographing position.
  • the composition of the photographing apparatus can also be controlled so that the imaging of the photographing object in the photographing screen satisfies the preset composition rule, and when the imaging of the photographing object in the photographing screen satisfies the preset composition rule, the photographing object is photographed.
  • Figure 5 is a schematic flow chart of a method of controlling an aircraft according to another embodiment of the present invention.
  • the control method of FIG. 5 is an example of the method of FIG. 2.
  • the control method of FIG. 5 includes the following.
  • the aircraft or the external device can be manipulated by the user, and the flight trajectory during the shooting process desired by the user is determined based on the motion data detected by the sensors on the aircraft or the external device.
  • the user can tell the aircraft the desired flight trajectory during the shooting process through some simple actions, thereby further reducing manual operation of external devices (for example, a user terminal or a remote controller) during the shooting process, improving the user experience, reducing the consumption of electric energy, and improving the endurance of the aircraft.
  • the method for specifically determining the flight trajectory refer to the embodiment of FIG. 2, and details are not described herein again.
  • a ranging sensor can be used to detect whether there is an obstacle within a certain range (e.g., 6-7 meters) in the flight direction. It should be understood that this range is an empirical value related to the shooting parameters of the photographing device and can be adjusted for different types of photographing devices.
  • the aircraft may automatically initiate flight in a variety of manners as described in the embodiment of FIG. 2.
  • for example, the aircraft is controlled to turn to the direction opposite to the initial flight direction, so that the lens of the photographing device faces the subject, and the subject is searched for and recognized using the facial features recorded in 515.
  • the tracking algorithm is used to lock the face of the subject and confirm the subject, and the entire body of the subject is then searched for using a human body detector (Human Detector) to determine the whole body of the subject.
  • an image transmitted by a photographing device or other visual sensor may be received in real time, and the feature image determined in 515 is searched for and identified in the received image.
  • How to search for feature images in images is a conventional technique and will not be described here.
  • the position of the subject in the image to be photographed can be adjusted according to the preset composition rule, and the distance between the aircraft and the subject can be adjusted, so that the proportion of the subject in the image to be photographed tends to coincide with the proportion indicated by the shooting information and conforms to the preset composition rule.
  • the distance between the aircraft and the subject can be adjusted while adjusting the composition; the order of adjusting the composition and adjusting the distance between the aircraft and the subject is not limited. For example, the aircraft may first be controlled to fly to a suitable shooting position and then the composition adjusted, or the composition may be adjusted first and the aircraft then controlled to fly to the suitable shooting position.
  • if the aircraft determines that the flight trajectory is a surround, the aircraft may shoot while flying around the subject during shooting; if the aircraft determines that the flight trajectory is a zoom-in, the aircraft may shoot while flying in the direction of the subject.
  • such aerial self-portraits are sometimes called dronies (drone selfies), and the smart photographing method according to an embodiment of the present invention can more easily capture such a picture.
  • the position and height of the throw point can be obtained by a position sensor, such as a GPS or a visual sensor, and recorded. In this case, a path that bypasses the obstacle can be planned; if the obstacle cannot be bypassed, the aircraft can try to raise its height to avoid the obstacle.
  • the nose can always be oriented in the forward direction during flight to ensure flight safety.
  • after flying to the shooting position, the aircraft is controlled to turn its nose around based on the current position information and the recorded position and height of the throw point; the facial features are recognized first, and the whole body is then detected using a human body detector.
  • the composition of the photographing device may also be controlled so that the imaging of the photographing object in the photographing screen satisfies the preset composition rule, and when the imaging of the photographing object in the photographing screen satisfies the preset composition rule, the photographing object is photographed.
  • the aircraft can automatically search for, identify, and track the photographic subject according to the feature image acquired by the photographing device, compose the image automatically, and, after being thrown, fly to the shooting position according to the preset composition rules and shooting information to complete a series of continuous shots; the operation is simple and intuitive and requires no remote controller or user terminal.
  • the flight trajectory can be automatically planned according to the user's intention, and the whole process is smoother and friendly.
  • control method according to the embodiment of the present invention has been described above, and the control device for the aircraft, the control device for the aircraft, and the aircraft according to the embodiment of the present invention will be described below with reference to FIGS. 6 to 8, respectively.
  • the embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores program instructions, and the program may include some or all of the steps of the control method of the aircraft in the corresponding embodiment in FIGS. 2-5.
  • FIG. 6 is a block diagram showing the structure of a control device 600 for an aircraft in accordance with one embodiment of the present invention.
  • the aircraft control device 600 can be, for example, the flight controller of FIG.
  • the aircraft control device 600 includes a determination module 610 and a control module 620.
  • the determination module 610 determines shooting information for the photographic subject, wherein the shooting information is used to indicate a range of the photographic subject in the picture to be captured.
  • the control module 620 controls the aircraft to fly to the shooting position according to the shooting information.
  • the aircraft can be controlled to fly to a suitable shooting position according to the range the user wants the photographic subject to occupy in the picture to be captured; manual intervention during shooting is reduced, the user experience is improved, and reducing the endurance time consumed by manual operation effectively improves the aircraft's endurance.
  • the photographing information includes at least one of a large scene, a medium scene, and a small scene; or the photographing information includes at least one of a full-body shot, a three-quarter shot, a half-body shot, a bust, a head-and-shoulders shot, and a close-up head shot.
  • control module 620 is configured to determine a shooting position of the aircraft relative to the shooting object according to the shooting information, and control the aircraft to fly to the shooting position.
  • the determining module 610 is further configured to acquire an image of the photographic subject, and determine a photographic subject according to the image.
  • the determining module 610 is configured to control the photographing device to photograph the subject before the aircraft takes off to acquire the image; or to control the photographing device to photograph the subject after the aircraft takes off to acquire the image; or to acquire the image of the subject by receiving it from an external device.
  • control module 620 is further configured to control the photographing device carried by the aircraft to photograph the photographic subject after the aircraft flies to the shooting position.
  • control module 620 is configured to control the photographing device to adjust the focal length of the photographing device according to the principle of depth of field, and photograph the photographed object by using the adjusted focal length.
  • the control module 620 is configured to control the composition of the photographing device so that the imaging of the photographic subject in the shooting frame satisfies a preset composition rule, and to photograph the subject when the imaging satisfies the preset composition rule.
  • the control module 620 is configured to control the composition of the photographing device by adjusting at least one of the flight attitude of the aircraft, the motion of the gimbal of the photographing device, and the focal length of the photographing device, so that the position of the photographic subject in the shooting frame satisfies the preset composition rule.
  • the determining module 610 is further configured to determine a flight trajectory of the aircraft when the photographic subject is photographed, wherein the control module 620 is configured to control the photographing device to photograph the photographic subject when the aircraft flies according to the flight trajectory.
  • the determination module 610 is configured to determine a flight trajectory of the aircraft based on input received from the external device.
  • the determining module 610 is configured to detect the motion of the aircraft by the motion sensor of the aircraft, acquire the first motion data output by the motion sensor, and determine the flight trajectory according to the first motion data.
  • the determining module 610 is configured to acquire second motion data output by the motion sensor of the external device to detect the motion of the external device, and determine a flight trajectory according to the second motion data.
  • the sensor comprises at least one of a gyroscope, an electronic compass, an inertial measurement unit, an accelerometer, a global navigation satellite system, and a vision sensor.
  • the motion according to an embodiment of the present invention includes at least one of the following motions: a surround motion, a pull-away (zoom-out) motion, a pull-close (zoom-in) motion, and an S-shaped motion.
  • when the motion is a surround motion, the device further includes: a second detecting module 650 configured to detect the rotation of the pitch axis of the aircraft's gimbal before the aircraft flies, and the determining module 610 is further configured to determine, from the detected pitch-axis rotation and the surround motion, that the flight trajectory is one of a spiral ascent and a spiral descent.
  • the movement comprises one of a motion in a vertical plane and a motion in a horizontal plane.
  • the determining module 610 further determines whether a signal for activating the determined flight trajectory is received before the flight trajectory of the aircraft is determined, and the signal is used to activate a process of determining a flight trajectory of the aircraft.
  • the determination module 610 is configured to determine that the flight trajectory is follow flight if no flight trajectory is input within a preset time.
  • the flight trajectory comprises at least one of a surround, a pull-away, a pull-close, and an S shape.
  • control module 620 is further configured to automatically start the aircraft when the aircraft meets the preset automatic start condition.
  • the control module 620 is configured to detect third motion data of the aircraft when the aircraft is thrown, and to automatically activate the power devices of the aircraft when the third motion data satisfies an automatic start condition.
  • the third motion data comprises the distance the aircraft is thrown, and the third motion data satisfying the automatic start condition comprises: the distance the aircraft is thrown is greater than or equal to a first preset threshold; or the third motion data comprises the vertical speed or speed of the aircraft, and the third motion data satisfying the automatic start condition comprises: the vertical speed or speed of the aircraft is less than or equal to a second preset threshold.
  • the first predetermined threshold is zero or a safe distance between the aircraft and the user.
  • the control module 620 is configured to, before the aircraft is thrown, activate the power devices and control them to rotate at idle speed when the aircraft satisfies a preset idle condition.
  • the control module 620 controls the power devices to rotate at idle speed after the aircraft is unlocked by face recognition; or after the aircraft has been placed horizontally for longer than a preset period of time; or when it is confirmed that a signal permitting idle rotation has been received.
  • the control module 620 detects fourth motion data of the aircraft before the aircraft takes off, and automatically activates the power devices of the aircraft when the fourth motion data satisfies the automatic start condition.
  • the fourth motion data indicates the duration for which the attitude angle of the aircraft is within a preset threshold range, and the fourth motion data satisfying the automatic start condition includes: the duration exceeds the second preset threshold.
  • the control module 620 is configured to search for and identify the determined photographic subject with the photographing device of the aircraft; after the subject is found and identified, to detect whether the range of the subject in the shooting frame matches the range indicated by the shooting information; and, when it does, to determine the position of the aircraft as the shooting position.
  • the control module 620 is specifically configured to, when it is determined that there is no obstacle in front of the aircraft, control the nose of the aircraft or the gimbal of the aircraft so that the photographing device faces the takeoff position, and to search for and identify the determined subject with the photographing device.
  • the control module 620 is specifically configured to, when it is determined that there is an obstacle in front of the aircraft, control the aircraft to go around the obstacle, control the aircraft to turn its nose or its gimbal so that the photographing device faces the takeoff position, and search for and identify the determined subject with the photographing device.
  • the determining module 610 is further configured to determine the flight direction after the aircraft takes off, where the determining module 610 is configured to determine the flight distance after takeoff according to the shooting information, and to determine the shooting position according to the flight direction and the flight distance.
  • the determining module 610 is configured to determine the flight direction according to a setting made before takeoff; or according to the direction of the nose of the aircraft at takeoff; or according to the position of the aircraft at takeoff; or according to the position of the subject; or according to the orientation of the subject; or according to the selected shooting angle.
  • the determining module 610 is further configured to determine the shooting parameters of the photographing device used to photograph the subject, where the determining module 610 determines the flight distance after takeoff according to the shooting information, and determines the shooting position according to the flight distance and the shooting parameters of the photographing device.
  • the shooting parameter is at least one of a field-of-view (FOV) parameter and a focal-length parameter.
  • the determining module 610 is further configured to determine the preset composition rule the subject needs to satisfy in the shooting frame, where the control module 620 is configured to control the aircraft to fly to the shooting position according to the preset composition rule and the shooting information.
  • the composition rule includes one or more of the position of the photographic subject in the shooting frame, the angle of the subject's face in the shooting frame, and the completeness of the subject's face in the frame.
  • the determining module 610 is further configured to receive a preset composition rule from an external device; or determine a composition rule by identifying a preset action or gesture of the photographic subject.
  • the composition rule includes one of the following: a balanced composition, a symmetric composition, a diagonal composition, a triangular composition, a nine-grid (rule-of-thirds) composition, a centripetal composition, a bisected composition, a frontal face in the shooting frame, or a profile face in the shooting frame.
  • the determination module 610 determines the shooting information based on the input of the received external device.
  • the determining module 610 is specifically configured to determine the shooting position of the aircraft according to the number of the plurality of subjects and the shooting information.
  • the first detecting module 630 is configured to detect at least one of the speed, the acceleration, and the throw trajectory of the aircraft when it is thrown, where the determining module 610 is configured to select the shooting information from a plurality of preset pieces of shooting information according to at least one of the speed, the acceleration, and the throw trajectory.
  • the control device 600 of the aircraft further includes: a third detecting module 660 configured to detect environmental condition information and/or posture information of the photographic subject, where the control module is further configured to adjust the shooting angle according to the environmental condition information and/or the posture information of the subject.
  • FIG. 7 is a schematic structural diagram of a control device 700 for an aircraft according to another embodiment of the present invention.
  • Control device 700 includes a processor 710 and a memory 720.
  • the memory 720 stores instructions that cause the processor 710 to perform the method of any of the embodiments of FIGS. 2-5.
  • FIG. 8 is a block diagram of an aircraft 800 in accordance with an embodiment of the present invention.
  • the aircraft 800 includes a control device 810, one or more propulsion devices 820, and a sensing system 830; the sensing system 830 detects motion parameters of the aircraft 800, and the one or more propulsion devices 820 provide flight power to the aircraft 800.
  • the control device 810 is communicatively connected to the one or more propulsion devices 820 and to the sensing system 830, and controls the one or more propulsion devices 820 to operate according to the motion parameters detected by the sensing system 830, so as to control the flight of the aircraft 800.
  • the flight control device 810 can be the control device 700 of FIG. 7.
  • the propulsion device 820 can be the power system of FIG. 1.
  • the disclosed systems, devices, and methods may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of units is only a logical function division; in actual implementation there may be other divisions, for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the functions, if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer-readable storage medium.
  • the technical solution of the present invention, in essence or the part contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product, which is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the methods described in the various embodiments of the present invention.
  • the foregoing storage medium includes: a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disk, or the like.

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Studio Devices (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

A control method, apparatus, and device for an aircraft, and an aircraft. The control method includes: determining shooting information for a photographic subject, where the shooting information indicates the range of the subject in the picture to be captured (210), and controlling the aircraft to fly to a shooting position according to the shooting information (220). The control method reduces manual intervention in the aircraft during shooting and improves the user experience.

Description

Control method, apparatus, and device for an aircraft, and aircraft
Copyright notice
The disclosure of this patent document contains material that is subject to copyright protection. The copyright is owned by the copyright owner. The copyright owner has no objection to the reproduction by anyone of the patent document or the patent disclosure as it appears in the official records and files of the Patent and Trademark Office.
Technical field
Embodiments of the present invention relate to the field of control technology, and in particular to a control method, apparatus, and device for an aircraft, and an aircraft.
背景技术
Background
With the development of flight technology, aircraft such as UAVs (Unmanned Aerial Vehicles), also called drones, have moved from military use to increasingly wide civil use, for example, UAV plant protection, UAV aerial photography, and UAV forest-fire monitoring; civil use is the trend of future UAV development.
At present, when shooting with a photographing device carried on a UAV, the user has to operate a user terminal or remote controller to control the flight attitude and flight distance of the aircraft and the rotation of the gimbal in order to adjust and control the shot. The operation process is cumbersome and the experience unfriendly. Moreover, the time taken by manual operation occupies a large part of the endurance time, reducing the actual flight time. The lack of an easy-to-use interaction and shooting control system may reduce the usefulness of UAV aerial photography in some applications.
发明内容
Embodiments of the present invention provide a control method, apparatus, and device for an aircraft, and an aircraft, which can improve the flexibility of shooting with an aircraft and simplify the user's operation of the drone.
In a first aspect, a control method for an aircraft is provided. The control method includes: determining shooting information for a photographic subject, where the shooting information indicates the range of the subject in the picture to be captured; and controlling the aircraft to fly to a shooting position according to the shooting information.
In a second aspect, a control apparatus for an aircraft is provided, including: a determination module that determines shooting information for a photographic subject, where the shooting information indicates the range of the subject in the picture to be captured; and a control module configured to control the aircraft to fly to a shooting position according to the shooting information.
In a third aspect, a control device for an aircraft is provided, including a processor and a memory, where the memory stores instructions that cause the processor to perform the method of the first aspect.
In a fourth aspect, an aircraft is provided, including: a sensing system for detecting motion parameters of the aircraft; the control device of the third aspect; and one or more propulsion devices for providing flight power to the aircraft; where the control device is communicatively connected to the one or more propulsion devices and to the sensing system, and controls the one or more propulsion devices to operate according to the motion parameters detected by the sensing system, so as to control the flight of the aircraft.
According to embodiments of the present invention, the aircraft can be controlled to fly to a suitable shooting position according to the range the user wants the subject to occupy in the picture to be captured. This reduces manual intervention during shooting, and by reducing the endurance time consumed by manual operation it effectively improves the endurance of the aircraft.
附图说明
Brief description of the drawings
To explain the technical solutions of the embodiments of the present invention more clearly, the drawings needed for the embodiments are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention; those of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic architecture diagram of an unmanned flight system according to an embodiment of the present invention.
FIG. 2 is a schematic flowchart of a control method for an aircraft according to an embodiment of the present invention.
FIG. 3 is a schematic flowchart of a control method for an aircraft according to another embodiment of the present invention.
FIG. 4 is a schematic flowchart of a control method for an aircraft according to yet another embodiment of the present invention.
FIG. 5 is a schematic flowchart of a control method for an aircraft according to still another embodiment of the present invention.
FIG. 6 is a schematic structural diagram of a control apparatus for an aircraft according to an embodiment of the present invention.
FIG. 7 is a schematic structural diagram of a control device for an aircraft according to another embodiment of the present invention.
FIG. 8 is a schematic structural diagram of an aircraft according to an embodiment of the present invention.
具体实施方式
The technical solutions in the embodiments of the present invention will be described clearly below with reference to the drawings. Obviously, the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present invention without creative effort fall within the protection scope of the present invention.
Embodiments of the present invention provide a control method, apparatus, and device for an aircraft. The following description uses a UAV as an example of the aircraft. It will be apparent to those skilled in the art that other types of aircraft may be used without restriction, and the embodiments of the present invention may be applied to various types of UAVs. For example, the UAV may be a small UAV. In some embodiments, the UAV may be a rotorcraft, for example, a multi-rotor aircraft propelled through the air by multiple propulsion devices, but the embodiments of the present invention are not limited thereto; the UAV may also be another type of UAV or movable apparatus.
FIG. 1 is a schematic architecture diagram of an unmanned flight system 100 according to an embodiment of the present invention. This embodiment is described using a rotorcraft as an example.
The unmanned flight system 100 may include a UAV 110, a gimbal 120, a display device 130, and a control device 140. The UAV 110 may include a power system 150, a flight control system 160, and a frame 170. The UAV 110 may communicate wirelessly with the control device 140 and the display device 130.
The frame 170 may include a fuselage and landing gear. The fuselage may include a central frame and one or more arms connected to the central frame, the arms extending radially from it. The landing gear is connected to the fuselage and supports the UAV 110 when it lands.
The power system 150 may include an electronic speed controller (ESC) 151, one or more propellers 153, and one or more motors 152 corresponding to the one or more propellers 153, where a motor 152 is connected between the ESC 151 and a propeller 153, and the motor 152 and propeller 153 are mounted on the corresponding arm. The ESC 151 receives a drive signal generated by the flight controller 160 and supplies a drive current to the motor 152 according to the drive signal to control the rotational speed of the motor 152. The motor 152 drives the propeller to rotate, thereby powering the flight of the UAV 110; this power enables the UAV 110 to move with one or more degrees of freedom. In some embodiments, the UAV 110 can rotate about one or more rotation axes, which may include, for example, a roll axis, a yaw axis, and a pitch axis. It should be understood that the motor 152 may be a DC motor or an AC motor, and may be brushless or brushed.
The flight control system 160 may include a flight controller 161 and a sensing system 162. The sensing system 162 measures the attitude information of the UAV, i.e., position and state information of the UAV 110 in space, for example, three-dimensional position, three-dimensional angle, three-dimensional velocity, three-dimensional acceleration, and three-dimensional angular velocity. The sensing system 162 may include, for example, at least one of sensors such as a gyroscope, an electronic compass, an IMU (Inertial Measurement Unit), a vision sensor, a global navigation satellite system, and a barometer. For example, the global navigation satellite system may be GPS (Global Positioning System). The flight controller 161 controls the flight of the UAV 110, for example, according to the attitude information measured by the sensing system 162. It should be understood that the flight controller 161 may control the UAV 110 according to pre-programmed instructions, or in response to one or more control commands from the control device 140.
The gimbal 120 may include an ESC 121 and motors 122, and carries the photographing device 123. The flight controller 161 can control the movement of the gimbal 120 through the ESC 121 and the motors 122. Optionally, as another embodiment, the gimbal 120 may further include a controller for controlling the movement of the gimbal 120 through the ESC 121 and the motors 122. It should be understood that the gimbal 120 may be independent of the UAV 110 or part of the UAV 110, that the motors 122 may be DC or AC motors and brushless or brushed, and that the gimbal may be located on the top or the bottom of the aircraft.
The photographing device 123 may be, for example, a camera or video camera for capturing images. The photographing device 123 can communicate with the flight controller and shoot under the control of the flight controller.
The display device 130 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and can be used to display attitude information of the UAV 110. In addition, images captured by the photographing device can be displayed on the display device 130. It should be understood that the display device 130 may be an independent device or may be provided in the control device 140.
The control device 140 is located at the ground end of the unmanned flight system 100, can communicate wirelessly with the UAV 110, and is used to remotely control the UAV 110. The control device may be, for example, a remote controller or a user terminal installed with an APP (application) for controlling the UAV, for example, a smartphone or a tablet. In the embodiments of the present invention, receiving user input through the control device may mean controlling the UAV through input devices such as dial wheels, buttons, keys, and joysticks on a remote controller, or through a user interface (UI) on a user terminal.
It should be understood that the above naming of the components of the unmanned flight system is only for identification purposes and should not be construed as limiting the embodiments of the present invention.
FIG. 2 is a schematic flowchart of a control method for an aircraft according to an embodiment of the present invention. The control method of FIG. 2 may be performed, for example, by a control apparatus or control device, such as the flight controller of FIG. 1, but the embodiments of the present invention are not limited thereto; for example, the control method of FIG. 2 may also be implemented by another control apparatus or control device carried on the aircraft. The control method of FIG. 2 includes the following.
210: Determine shooting information for the photographic subject, where the shooting information indicates the range of the subject in the picture to be captured. For example, the shooting information may be the proportion of the picture to be captured occupied by the subject's image, or the size of the range the subject's image occupies in the picture to be captured.
The shooting information may be, for example, a scene type (shot size) selected by the user. Scene types can be divided into large, medium, and small according to the proportion or range the subject occupies in the frame, and each type can be further subdivided. The larger the scene, the smaller the proportion or range the subject occupies in the frame, and vice versa. Specifically, portrait photography can be classified by the proportion or range of the subject's area in the frame into full-body, three-quarter, half-body, bust, head-and-shoulders, and close-up head shots. For example, the shooting information may include at least one of a large scene, a medium scene, and a small scene; or the shooting information may include at least one of a full-body shot, a three-quarter shot, a half-body shot, a bust, a head-and-shoulders shot, and a close-up head shot.
For example, correspondences between different scene types and the proportion or range the subject occupies in the frame can be preset; when the user selects a scene type, the proportion or range of the subject in the picture to be captured is determined from the selected type. Of course, the embodiments of the present invention are not limited to this way of determining the shooting information; the shooting information may also be determined from the proportion or range of the subject entered on a user interface, for example, from a box the user draws on a touchscreen to indicate the range of the subject in the picture to be captured.
The photographic subject, also called the shooting target or main subject, may be the user operating the aircraft or another person or object.
It should be understood that the above classification of scene types is merely an example; different classifications can be defined according to actual needs.
220: Control the aircraft to fly to a shooting position according to the shooting information.
Specifically, there is a correspondence between the proportion or range the subject occupies in the picture to be captured and the distance between the photographing device and the subject (hereinafter, the shooting distance); for example, they are inversely related: the larger the proportion or range the subject occupies, the closer the photographing device needs to be to the subject, and vice versa. For example, the flight controller can estimate the shooting position from the proportion or range of the subject in the picture to be captured and directly control the aircraft to fly to that position, or the flight controller can dynamically adjust the target position of the aircraft so that the proportion or range of the subject in the frame converges to the range indicated by the shooting information, thereby controlling the aircraft to fly to a suitable final shooting position.
According to embodiments of the present invention, the aircraft can be controlled to fly to a suitable shooting position according to the range the user wants the subject to occupy in the picture to be captured, which reduces manual intervention during shooting, improves the user experience, and by reducing the endurance time consumed by manual operation effectively improves the aircraft's endurance.
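The inverse relation between the subject's proportion in the frame and the shooting distance can be sketched with a pinhole-camera model. The focal length and sensor height below are illustrative assumptions, not values from the patent:

```python
def shooting_distance(subject_height_m, frame_fraction,
                      focal_mm=24.0, sensor_height_mm=8.8):
    """Estimate camera-to-subject distance (in metres) so the subject
    fills `frame_fraction` of the frame height, using the pinhole
    relation image_height / focal = subject_height / distance.
    The 24 mm focal length and 8.8 mm sensor height are assumptions."""
    image_height_mm = frame_fraction * sensor_height_mm
    return subject_height_m * focal_mm / image_height_mm

# A 1.7 m person filling half the frame height:
d_half = shooting_distance(1.7, 0.5)
# Halving the desired fraction doubles the required distance:
d_quarter = shooting_distance(1.7, 0.25)
```

Under this model the required distance scales inversely with the desired fraction, which matches the qualitative relation described above.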
According to embodiments of the present invention, the shooting information for the subject can be determined in the following ways:
1) Determine the shooting information from input received from an external device. For example, user interface elements such as buttons, text boxes, and selection boxes for entering shooting information can be provided on the user interface, so the user can select or enter, through the user interface of an external device (e.g., a user terminal or remote controller), the proportion or range the subject is to occupy in the picture to be captured. For example, the flight controller may receive the shooting information through its communication interface with the external device. In this way, the user can input the shooting information precisely, so an image of the size the user desires can be captured accurately.
2) Detect at least one of the speed, the acceleration, and the trajectory of the aircraft when it is thrown, and select the shooting information from a plurality of preset options according to at least one of the speed, the acceleration, and the throw trajectory. Determining the shooting information from the throw speed and acceleration means it can be determined from how hard the user throws the aircraft: the harder the throw, the greater the speed and acceleration, indicating the user wants the aircraft to fly farther, i.e., a larger scene, or a smaller proportion or range of the subject in the picture to be captured, and vice versa. Similarly, the larger the angle between the throw trajectory and the horizontal plane, the farther the user wants the aircraft to fly, i.e., the larger the desired scene, and vice versa. Since the shooting information can be determined from the throw, there is no need to set it manually through an external device, which improves the user experience, further reduces the endurance time consumed, and improves the aircraft's endurance.
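The throw-strength mapping described above can be sketched as a simple threshold lookup; the speed thresholds here are illustrative assumptions:

```python
def select_scene(throw_speed_mps,
                 thresholds=((2.0, "small"), (4.0, "medium"))):
    """Pick a preset scene type from the throw speed: a harder throw
    implies the user wants a larger scene (the subject occupies a
    smaller range in the frame). Thresholds (m/s) are illustrative."""
    for limit, scene in thresholds:
        if throw_speed_mps < limit:
            return scene
    return "large"
```

A gentle toss selects a small scene (subject fills the frame), a hard throw a large one; the same pattern could key on acceleration or throw angle instead.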
It should be understood that the definition of the shooting information and the way of determining it are not limited to the above description, as long as different shooting information can be distinguished.
Optionally, as another embodiment, the control method of FIG. 2 further includes: acquiring an image of the photographic subject and determining the subject from the image.
Specifically, the photographing device can be controlled to acquire a feature image of the subject; for example, when the subject is an animal or a person, the feature image is a facial feature image. The image can be used to search for, recognize, and track the subject, for example, by comparing the image currently obtained by the photographing device with the above image; if the two match, the subject is found, recognized, and tracked.
According to embodiments of the present invention, the image of the subject can be acquired in one of the following ways:
1) Before the aircraft takes off, control the photographing device to photograph the subject and acquire the image of the subject; for example, before takeoff the user can aim the photographing device at the subject to capture its feature image.
2) After the aircraft takes off, control the photographing device to photograph the subject and acquire the image of the subject; for example, the aircraft can be controlled to turn its nose toward the subject after takeoff to capture its feature image.
3) Acquire the image of the subject by receiving it from an external device; for example, the user can send a feature image of the subject stored on a user terminal to the flight controller through the communication interface between the user terminal and the flight controller.
Optionally, as another embodiment, the control method of FIG. 2 further includes: determining the flight trajectory of the aircraft when photographing the subject, and controlling the photographing device to photograph the subject while the aircraft flies along the flight trajectory.
According to embodiments of the present invention, the user can simply throw the aircraft, and the aircraft can recognize the throwing action and select a suitable flight trajectory. Being able to indicate the desired flight trajectory to the drone with a simple action improves the user experience, further reduces the endurance time consumed, and improves the aircraft's endurance.
Alternatively, as another embodiment, the flight trajectory of the aircraft may be determined from input received from an external device. For example, user interface elements such as buttons, text boxes, and selection boxes for entering flight trajectory information can be provided on the user interface, so the user can input or select the flight trajectory.
Alternatively, as another embodiment, the motion of the aircraft may be detected by a motion sensor of the aircraft, first motion data output by the motion sensor acquired, and the flight trajectory determined from the first motion data. The first motion data may be, for example, one or more of the position, velocity, angular acceleration, and acceleration of the aircraft over time.
Alternatively, as another embodiment, the flight controller may acquire second motion data output by a motion sensor of an external device detecting the motion of the external device, and determine the flight trajectory from the second motion data. The second motion data may be, for example, one or more of the position, velocity, angular acceleration, and acceleration of the user terminal over time.
For example, the external device may be a user terminal. Before the aircraft flies, the user can make a specific gesture while holding the user terminal; the motion sensor carried on the user terminal detects the motion of the terminal and outputs motion data to the flight controller, and the flight controller determines the flight trajectory from the motion data. For example, if the motion of the user terminal is a circling motion, the aircraft determines the flight trajectory to be an orbit (surround) flight.
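One crude way to classify such a handheld gesture, as a sketch under stated assumptions (the threshold, sample period, and two-class output are all illustrative, not the patent's method), is to integrate the gyroscope yaw rate over the gesture:

```python
def classify_motion(yaw_rates_dps, dt=0.02, turn_threshold_deg=180.0):
    """Classify a recorded gesture from gyro yaw-rate samples (deg/s):
    a large accumulated heading change suggests a surround (orbit)
    trajectory, otherwise a straight pull-away/pull-close motion.
    dt is the assumed sample period; the threshold is illustrative."""
    total_turn = abs(sum(rate * dt for rate in yaw_rates_dps))
    return "surround" if total_turn >= turn_threshold_deg else "straight"
```

A full implementation would also look at accelerometer data to separate pull-away from pull-close and S-shaped motions.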
The above sensors may include at least one of: a gyroscope, an electronic compass, an inertial measurement unit, an accelerometer, a global navigation satellite system, and a vision sensor.
The above motion includes at least one of the following motions: a surround motion, a pull-away motion, a pull-close motion, and an S-shaped motion. The motion may be one of a motion in a vertical plane and a motion in a horizontal plane; for example, a surround motion may be in either a vertical or a horizontal plane. It should be understood that these motions are only examples; other forms of motion may also be used to indicate the flight trajectory.
When the motion is a surround motion, the control method of FIG. 2 further includes: before the aircraft flies, detecting the rotation of the pitch axis of the aircraft's gimbal; and determining, from the detected pitch-axis rotation and the surround motion, that the flight trajectory is one of a spiral ascent and a spiral descent.
Optionally, as another embodiment, before determining the flight trajectory of the aircraft, the control method of FIG. 2 further includes: determining whether a signal activating the determination of the flight trajectory has been received, the signal being used to activate the process of determining the aircraft's flight trajectory.
According to embodiments of the present invention, if no flight trajectory is input within a preset time, the flight trajectory is determined to be follow flight.
Specifically, follow flight means flying while following a moving target, for example, controlling the aircraft to follow a moving subject. The following may be GPS following, i.e., implemented with GPS positioning technology, or visual following, i.e., implemented with a vision sensor and image recognition.
According to embodiments of the present invention, the flight trajectory may include at least one of a surround, a pull-away, a pull-close, and an S shape.
Optionally, as another embodiment, the control method of FIG. 2 further includes: after the aircraft flies to the shooting position, controlling the photographing device carried by the aircraft to photograph the subject.
Optionally, as another embodiment, the control method of FIG. 2 further includes: receiving a preset composition rule from an external device; or determining the composition rule by recognizing a preset action or gesture of the subject.
Specifically, the composition rule includes one or more of the position of the subject in the frame, the angle of the subject's face in the frame, and the completeness of the subject's face in the frame.
For example, the composition rule includes one of the following: a balanced composition, a symmetric composition, a diagonal composition, a triangular composition, a nine-grid (rule-of-thirds) composition, a centripetal composition, a bisected composition, a frontal face in the frame, or a profile face in the frame.
According to embodiments of the present invention, controlling the photographing device carried by the aircraft to photograph the subject includes: controlling the composition of the photographing device so that the imaging of the subject in the frame satisfies a preset composition rule; and photographing the subject when the imaging of the subject in the frame satisfies the preset composition rule.
Specifically, controlling the composition of the photographing device so that the imaging of the subject in the frame satisfies the preset composition rule includes: controlling the composition by adjusting at least one of the flight attitude of the aircraft, the motion of the photographing device's gimbal, and the focal length of the photographing device, so that the position of the subject in the frame satisfies the preset composition rule.
During shooting, the image of the subject currently presented in the frame can be obtained from the photographing device, and the position the subject occupies in the frame determined by image recognition, so as to determine whether the subject's position satisfies the preset composition rule. Taking the nine-grid composition as an example: if the user selects the nine-grid composition, the subject can be imaged at one of the four intersections of the grid. Optionally, the nine-grid composition can be further subdivided into four modes corresponding to the four intersections, so the user can further choose at which intersection the subject is imaged. Image recognition can determine whether the centre of the subject lies on one of the grid intersections, or the distance and direction from the centre of the subject to an intersection, and the composition adjusted accordingly until the centre of the subject coincides with the intersection.
Optionally, as another embodiment, controlling the photographing device carried by the aircraft to photograph the subject includes: controlling the photographing device to adjust its focal length according to the depth-of-field principle, and photographing the subject with the adjusted focal length.
When photographing scenes at different distances, for example, multiple rows of people or a very large subject, the focal length can be adjusted according to the depth-of-field principle, i.e., a suitable focus point set, so the photographing device can capture the whole scene sharply.
Taking a group photo as an example, the shooting distance or position can first be predicted from the number of subjects (which can, for example, be counted when the user confirms the subjects before takeoff): the more people, the farther the aircraft flies, and vice versa. After the aircraft reaches the shooting position, the focal length can be adjusted according to the depth-of-field principle. As shown in formulas (1), (2), and (3), the front depth of field is shallower than the rear depth of field, so the focus should be set roughly at the front 1/3 — an empirical value — of the depth of the group; the lens can focus at the front 1/3 of the whole formation's depth. For example, for a group photo of five rows of people, focusing on a person in the middle of the second row makes more effective use of the front and rear depth of field and yields a sharp group photo.
ΔL1 = F·σ·L² / (f² + F·σ·L)    (1)
ΔL2 = F·σ·L² / (f² − F·σ·L)    (2)
ΔL = ΔL1 + ΔL2 = 2·f²·F·σ·L² / (f⁴ − F²·σ²·L²)    (3)
where σ is the permissible circle-of-confusion diameter, f the lens focal length, F the aperture value of the lens, L the focus distance, ΔL1 the front depth of field, ΔL2 the rear depth of field, and ΔL the total depth of field.
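A minimal numeric check of formulas (1)-(3) follows; the focal length, aperture, focus distance, and circle-of-confusion value are illustrative assumptions:

```python
def depth_of_field(f_mm, F, L_mm, sigma_mm=0.03):
    """Front depth, rear depth, and total depth of field (mm) from
    formulas (1)-(3): sigma is the circle of confusion, f the focal
    length, F the aperture value, L the focus distance. Only valid
    while f**2 > F * sigma * L (inside the hyperfocal distance)."""
    front = F * sigma_mm * L_mm**2 / (f_mm**2 + F * sigma_mm * L_mm)
    rear = F * sigma_mm * L_mm**2 / (f_mm**2 - F * sigma_mm * L_mm)
    return front, rear, front + rear

# Illustrative values: a 50 mm lens at f/2.8 focused at 5 m.
front, rear, total = depth_of_field(50.0, 2.8, 5000.0)
# front < rear, which is why the text suggests focusing about 1/3
# into a deep group of subjects.
```

The front depth coming out shallower than the rear depth confirms the "focus at the front 1/3" rule of thumb quoted above.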
Optionally, as another embodiment, the control method of FIG. 2 further includes: detecting environmental condition information and/or posture information of the subject, and adjusting the shooting angle according to the environmental condition information and/or the posture information of the subject.
The environmental condition information may, for example, indicate backlighting, weather conditions, or light levels. The posture information of a person may, for example, indicate head orientation, standing, sitting, and other postures. Specific shooting angles may include overhead shots, side shots, and low-angle shots.
For example, when backlighting is detected at a certain shooting angle, shooting at that angle can be avoided. As another example, when the subject is detected to be turned sideways, the shooting angle can be adjusted so that a frontal shot of the subject can be taken. It should be understood that these functions can be set or selected by the user before takeoff through the user interface of an external device (e.g., the user interface on a user terminal).
Since the shooting angle can be adaptively adjusted according to the environmental condition information and/or the subject's posture information, the shooting process is made intelligent, manual intervention during shooting is reduced, the user experience is improved, and the endurance time consumed by manual operation is reduced, improving the aircraft's endurance.
Optionally, as another embodiment, the control method of FIG. 2 further includes: automatically starting the aircraft when the aircraft satisfies a preset automatic start condition.
Automatically starting the aircraft means that when the preset automatic start condition is satisfied, the start circuit of the aircraft is closed directly and the power devices of the aircraft begin to work, without the need to start the aircraft manually with a button or key. Since the aircraft can be started automatically according to preset conditions, the start can be combined with the pre-flight motions used to set the flight trajectory or shooting information, so the whole shooting process flows in one go, improving the user experience, reducing the endurance time consumed by manual operation, and improving the aircraft's endurance.
According to embodiments of the present invention, the aircraft can be started automatically as follows:
1) When the aircraft is thrown, detect third motion data of the aircraft; when the third motion data satisfies the automatic start condition, automatically start the power devices of the aircraft.
Specifically, the third motion data may include the distance the aircraft has been thrown; in this case, the third motion data satisfying the automatic start condition includes: the thrown distance is greater than or equal to a first preset threshold. The first preset threshold may be zero, or a safe distance at which the aircraft cannot injure the user. Starting the aircraft when the distance between the aircraft and the user is a safe distance avoids injuring the user.
Alternatively, as another embodiment, the third motion data may include the vertical speed or speed of the aircraft; in this case, the third motion data satisfying the automatic start condition includes: the vertical speed or speed of the aircraft is less than or equal to a second preset threshold, for example, zero or another value close to zero. Starting only when the vertical speed or speed has dropped to or below the preset threshold makes the aircraft's flight at start-up more stable.
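The two throw-start conditions above can be combined into a single check; the specific thresholds here are illustrative assumptions:

```python
def should_auto_start(throw_distance_m, vertical_speed_mps,
                      min_distance_m=1.5, max_speed_mps=0.3):
    """Auto-start check for a thrown aircraft: the aircraft must be
    beyond a safe distance from the user, and its vertical speed must
    have decayed to near zero (near the apex of the toss), so the
    motors start at a stable point. Thresholds are illustrative."""
    return (throw_distance_m >= min_distance_m
            and abs(vertical_speed_mps) <= max_speed_mps)
```

In practice the distance and speed would come from the aircraft's own sensing system (e.g., vision sensor or IMU integration).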
2) Before the aircraft is thrown, when the aircraft satisfies a preset idle condition, start the power devices and control them to rotate at idle speed.
Specifically, the aircraft may control the power devices to rotate at idle speed after the aircraft is unlocked by face recognition. Using face unlock as the preset idle condition avoids accidental starts of the aircraft. In addition, automatic start can be combined with face unlock and confirmation of the subject, making the whole shooting process smoother and improving the user experience.
Alternatively, as another embodiment, the power devices may be controlled to rotate at idle speed after the aircraft has been placed horizontally for longer than a preset duration. For example, after setting the flight trajectory the user can place the aircraft horizontally in one motion (e.g., flat on the palm); when the aircraft determines from the attitude information detected by its sensors that it has been in a horizontal state (e.g., attitude angle of zero) for longer than the preset time, it starts automatically and controls the power devices to idle. Further, after the power devices have idled for a preset time, the aircraft may be controlled to fly to the shooting position according to the shooting information. Combining automatic start with the trajectory confirmation process makes the whole shooting process smoother and improves the user experience.
Alternatively, as another embodiment, the power devices may be controlled to rotate at idle speed when it is confirmed that a signal permitting idle rotation has been received. For example, for safety, signals permitting idle rotation may be generated, or received from an external device, to control the aircraft to idle; embodiments of the present invention can combine these signals with the automatic start of the aircraft, improving the safety of automatic start.
3) Before the aircraft takes off, detect fourth motion data of the aircraft; when the fourth motion data satisfies the automatic start condition, automatically start the power devices of the aircraft.
According to embodiments of the present invention, the fourth motion data indicates the duration for which the attitude angle of the aircraft is within a preset threshold range, and the fourth motion data satisfying the automatic start condition may include: the duration exceeds a second preset threshold.
For example, after setting the flight trajectory the user can place the aircraft horizontally in one motion (e.g., flat on the palm); when the aircraft determines from the attitude information detected by its sensors that it has been in a horizontal state (e.g., attitude angle of zero) for longer than the preset time, it starts automatically.
It should be understood that the above automatic start conditions can also be combined; for example, the aircraft starts automatically only when face unlock and the fourth motion data condition are both satisfied.
According to embodiments of the present invention, controlling the aircraft to fly to the shooting position according to the shooting information includes: searching for and recognizing the determined subject with the aircraft's photographing device; after the subject is found and recognized, detecting whether the range of the subject in the frame matches the range indicated by the shooting information; and when the range of the subject in the frame matches the range indicated by the shooting information, determining the position of the aircraft as the shooting position.
For example, if the proportion the subject occupies in the current frame is larger than that indicated by the shooting information, the aircraft is moved away from the subject; if smaller, the aircraft is moved closer. The adjustment can use a fixed step or a variable step. When the proportion is determined to match that indicated in the shooting information, the current position can be taken as the final shooting position.
Optionally, as another embodiment, the flight control method of FIG. 2 further includes: determining the flight direction after the aircraft takes off, and controlling the aircraft to fly to the shooting position according to the flight direction and the shooting information. For example, when moving the aircraft away from or toward the subject as above, the aircraft can be moved along the flight direction.
Optionally, as another embodiment, the aircraft may also be controlled to fly to the shooting position according to the shooting information and the shooting parameters of the photographing device. For example, a shooting parameter may be at least one of a field-of-view (FOV) parameter and a focal-length parameter. For the same scene requirement, the longer the focal length, the larger the step that can be used when moving the aircraft away from or toward the subject, and vice versa; the larger the field-of-view angle, the smaller the step, and vice versa.
According to embodiments of the present invention, searching for and recognizing the determined subject with the photographing device specifically includes: when it is determined that there is no obstacle in front of the aircraft, controlling the nose of the aircraft or the aircraft's gimbal so that the photographing device faces the takeoff position, and searching for and recognizing the determined subject with the photographing device.
For example, if there is no obstacle in the flight direction, the aircraft is turned to face opposite the initial flight direction so that the lens of the photographing device faces the subject, and the subject is searched for and recognized. Once the subject is found, a tracking algorithm locks onto the subject's face, the subject is confirmed, and a human detector searches for the subject's full body to determine the main body of the subject.
According to embodiments of the present invention, searching for and recognizing the determined subject with the photographing device specifically includes: when it is determined that there is an obstacle in front of the aircraft, controlling the aircraft to go around the obstacle, and controlling the aircraft to turn its nose or its gimbal so that the photographing device faces the takeoff position, and searching for and recognizing the determined subject with the photographing device.
If there is an obstacle in the flight direction, the position and altitude of the throw point can first be obtained with a position sensor, for example, GPS or a vision sensor, and recorded. In this case a path can be planned around the obstacle; if it cannot be bypassed, the aircraft attempts to climb to avoid it. In addition, the nose can face the direction of travel throughout the flight to ensure flight safety.
According to embodiments of the present invention, controlling the aircraft to fly to the shooting position according to the shooting information includes: determining the shooting position of the aircraft relative to the subject from the shooting information, and controlling the aircraft to fly to the shooting position.
According to embodiments of the present invention, the control method of FIG. 2 further includes: determining the preset composition rule the subject needs to satisfy in the frame; where controlling the aircraft to fly to the shooting position according to the shooting information includes: controlling the aircraft to fly to the shooting position according to the preset composition rule and the shooting information.
Optionally, as another embodiment, the flight control method of FIG. 2 further includes: determining the flight direction after the aircraft takes off, where determining the shooting position of the aircraft relative to the subject from the shooting information includes: determining the flight distance after takeoff from the shooting information, and determining the shooting position from the flight direction and the flight distance. For example, the flight distance may be the horizontal distance from the subject to the shooting position, while the flight direction and flight distance determine the altitude of the shooting position; the altitude of the shooting position can therefore be determined from the flight direction and the flight distance.
The embodiments of the present invention do not limit how the flight direction is determined; for example, the flight direction after takeoff may be determined in one of the following ways:
1) From a setting made before takeoff; for example, the flight direction can be set before takeoff to up-left, up-right, forward-up, and so on.
2) From the direction of the aircraft's nose at takeoff; for example, if the aircraft is thrown up-left or up-right, the flight direction is determined to be up-left or up-right.
3) From the position of the aircraft at takeoff; for example, if the aircraft is low at takeoff the flight direction points lower, and if high, the flight direction points higher.
4) From the position of the subject; for example, if the subject is on a moving object (e.g., a motor vehicle), the flight direction points in the direction the subject is moving.
5) From the orientation of the subject; for example, if the aircraft determines from the detected posture of the subject that the subject faces up-left, the flight direction is determined to be up-left.
6) From the selected shooting angle; for example, the user can predetermine the shooting angle before takeoff, and the flight direction can be determined from the shooting angle.
Alternatively, as another embodiment, the flight control method of FIG. 2 further includes: determining the shooting parameters of the photographing device used to photograph the subject, where determining the shooting position of the aircraft relative to the subject from the shooting information includes: determining the flight distance after takeoff from the shooting information; and determining the shooting position from the flight distance and the shooting parameters of the photographing device. For example, a shooting parameter may be at least one of a field-of-view (FOV) parameter and a focal-length parameter. With different FOV or focal-length parameters, the shooting position determined for the same scene type differs. The focal-length parameter may be the focal length, and the FOV parameter may be the field-of-view angle. For the same scene requirement, the longer the focal length, the larger the required shooting distance; the larger the field-of-view angle, the smaller the required shooting distance.
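The FOV-to-distance dependence can be made concrete with basic trigonometry; this is a sketch, and the numeric inputs in the usage below are illustrative:

```python
import math

def distance_for_fov(subject_height_m, frame_fraction, fov_deg):
    """Required shooting distance for a given vertical field of view:
    the visible frame height at distance d is 2*d*tan(FOV/2), and the
    subject must occupy `frame_fraction` of it. A wider FOV therefore
    needs a shorter distance for the same framing."""
    frame_height = subject_height_m / frame_fraction
    return frame_height / (2.0 * math.tan(math.radians(fov_deg) / 2.0))
```

For a 1.7 m subject filling half the frame, a 90-degree FOV needs roughly 1.7 m of distance, while a 60-degree FOV needs about 2.9 m, matching the rule stated above.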
It should be understood that, according to embodiments of the present invention, the shooting position may also be determined from the flight distance, the shooting parameters of the photographing device, and the flight direction.
Alternatively, as another embodiment, when the subject includes multiple bodies, determining the shooting position of the aircraft relative to the subject from the shooting information includes determining the shooting position from the number of bodies and the shooting information; for example, the more bodies being photographed, the farther the shooting position is from the subject.
FIG. 3 is a schematic flowchart of a control method for an aircraft according to another embodiment of the present invention. The control method of FIG. 3 is an example of the method of FIG. 2 and includes the following.
310: Determine shooting information for the photographic subject, where the shooting information indicates the range of the subject in the picture to be captured. This is the same as 210 of FIG. 2 and is not repeated here.
315: Determine the composition rule for the subject in the picture to be captured.
For example, a composition rule entered by the user can be received before the aircraft takes off. Alternatively, the composition rule can be determined from the user's gesture after takeoff.
The composition rule may include one or more of the position of the subject in the frame, the angle of the subject's face in the frame, and the completeness of the subject's face in the frame.
Specifically, when the composition rule includes the position of the subject in the frame, the composition rule may, for example, include one of: a balanced composition, a symmetric composition, a diagonal composition, a triangular composition, a nine-grid composition, a centripetal composition, and a bisected composition. When the composition rule includes the angle of the subject's face in the frame, it may, for example, be that the face in the frame is frontal or that the face in the frame is in profile. When the composition rule includes the completeness of the subject's face in the frame, it may, for example, be capturing a partial image of the face or capturing a complete image of the face.
It should be understood that the embodiments of the present invention do not limit the order of performing 310 and 315; they may be performed simultaneously or in either order.
320: Determine the shooting position of the aircraft relative to the subject from the shooting information.
Specifically, the flight controller can estimate the shooting distance based on the relationship between the range the subject occupies in the frame and the distance between the subject and the shooting position (hereinafter, the shooting distance): the smaller the range the subject occupies in the frame, the larger the shooting distance, and vice versa. For example, the shooting distance can be computed with a preset algorithm from the range the user wants the subject to occupy in the picture to be captured. As another example, a lookup table relating the subject's range in the frame to the shooting distance can be preset, and the shooting distance determined by table lookup from the range the user wants the subject to occupy.
330: Control the aircraft to fly to the shooting position.
For example, the shooting position can be located from the shooting distance, and the aircraft can fly to it in a straight line in waypoint (tap-to-fly) fashion, or fly to it while avoiding obstacles.
It should be understood that the aircraft can start flight automatically in the various ways described in the embodiment of FIG. 2.
340: Search for, recognize, and track the subject.
For example, images transmitted by the photographing device or other vision sensors carried on the aircraft can be received in real time, the preset feature image of the subject searched for and recognized in the received images, and the subject tracked.
345: Control the composition of the photographing device so that the imaging of the subject in the frame satisfies the preset composition rule.
For example, after the subject is detected, the photographing device can be controlled to compose intelligently. For a single-person photo, the classic nine-grid composition can be used, and based on feedback from the face-recognition and tracking algorithms the position and orientation of the aircraft can be adjusted in real time so as to always capture the target's frontal face.
Specifically, the composition of the photographing device can be controlled by adjusting at least one of the flight attitude of the aircraft, the motion of the photographing device's gimbal, and the focal length of the photographing device, so that the position of the subject in the frame satisfies the preset composition rule.
For example, the flight attitude of the aircraft can be adjusted by controlling the rotational speed of the propellers, producing roll, translation, and pitch attitude changes. The motion of the gimbal can also be adjusted by controlling the rotation of its roll, yaw, and pitch mechanisms. These adjustments move the photographing device with the aircraft or gimbal relative to the subject, so the composition of the subject in the frame can be adjusted. In addition, the focal length of the photographing device can be adjusted during shooting to obtain a sharp composition.
350: Photograph the subject when the imaging of the subject in the frame satisfies the preset composition rule.
For example, when image recognition determines that the centre of the subject coincides with one of the nine-grid intersections, a shooting instruction is output to the photographing device, instructing it to photograph the subject.
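The nine-grid check above reduces to finding the offset from the subject's centre to the nearest of the four grid intersections; a minimal sketch (the pixel coordinates in the usage are illustrative):

```python
def thirds_offset(cx, cy, width, height):
    """Offset (dx, dy) in pixels from the subject centre (cx, cy) to
    the nearest of the four rule-of-thirds intersections of a
    width x height frame; the controller can nudge the aircraft or
    gimbal until the offset is approximately zero."""
    points = [(width * i / 3.0, height * j / 3.0)
              for i in (1, 2) for j in (1, 2)]
    tx, ty = min(points, key=lambda p: (p[0] - cx)**2 + (p[1] - cy)**2)
    return tx - cx, ty - cy

# Subject centre at (1000, 700) in a 1920x1080 frame:
dx, dy = thirds_offset(1000, 700, 1920, 1080)
```

If the user has selected a specific intersection mode, the `min` over all four points would be replaced by that fixed intersection.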
According to embodiments of the present invention, the shooting position of the aircraft relative to the subject can be estimated from the range the user wants the subject to occupy in the picture to be captured, the aircraft controlled to fly to that shooting position, and the shot taken after intelligent composition according to the preset composition rule. This reduces manual intervention during shooting, improves the user experience, reduces the endurance time consumed, and effectively improves the aircraft's endurance.
FIG. 4 is a schematic flowchart of a control method for an aircraft according to yet another embodiment of the present invention. The control method of FIG. 4 is an example of the method of FIG. 2 and includes the following.
410: Acquire a feature image of the photographic subject.
For example, the photographing device can be controlled before flight to photograph the subject to acquire its feature image (e.g., a facial feature image). As another example, the image features of the subject can be acquired from an external device (e.g., a user terminal).
420: Determine shooting information for the subject, where the shooting information indicates the range of the subject in the picture to be captured. This is the same as 210 of FIG. 2 and is not repeated here.
It should be understood that the embodiments of the present invention do not limit the order of performing 410 and 420; they may be performed simultaneously or in either order.
430: Search for, recognize, and track the subject.
For example, after the aircraft takes off, images transmitted by the photographing device or other vision sensors carried on the aircraft can be received in real time, the preset feature image of the subject searched for and recognized in the received images, and the subject tracked.
It should be understood that the aircraft can start flight automatically in the various ways described in the embodiment of FIG. 2.
440: After the subject is found and recognized, detect whether the range of the subject in the current frame matches the range indicated by the shooting information. If not, go to 450. If so, go to 460.
For example, after the aircraft is thrown, the proportion the subject occupies in the current frame can be obtained by image recognition, and whether that proportion matches the proportion indicated in the shooting information determined.
450: Adjust the distance between the aircraft and the subject according to the shooting information, then continue with 440.
For example, if the proportion the subject occupies in the current frame is larger than that indicated by the shooting information, move the aircraft away from the subject; if smaller, move the aircraft closer. The adjustment can use a fixed step or a variable step.
460: When the range of the subject in the frame matches the range indicated by the shooting information, determine the aircraft's current position as the shooting position.
For example, when the proportion is determined to match that indicated in the shooting information, the current position can be taken as the final shooting position.
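One step of the adjust-and-recheck loop in 440-460 can be sketched as follows; the gain, tolerance, and the pinhole assumption (frame fraction inversely proportional to distance) are illustrative, not taken from the patent:

```python
def adjust_distance(current_m, measured_fraction, target_fraction,
                    gain=0.8, tolerance=0.02):
    """One iteration: if the subject looks too large, back away; too
    small, move closer. Under the pinhole assumption the distance
    that would match the target is current * measured/target; a
    gain < 1 damps the step. Returns (new_distance, done)."""
    if abs(measured_fraction - target_fraction) <= tolerance:
        return current_m, True
    ideal = current_m * measured_fraction / target_fraction
    return current_m + gain * (ideal - current_m), False

# Subject fills 0.6 of the frame but 0.3 is wanted: fly farther.
d, done = adjust_distance(10.0, 0.6, 0.3)
```

Repeating the step until `done` is true implements the loop of 440, 450, and 460, with the damped step standing in for the fixed or variable step the text mentions.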
470: Control the photographing device to photograph the subject.
Similarly, in this embodiment the composition of the photographing device can also be controlled so that the imaging of the subject in the frame satisfies the preset composition rule, and the subject photographed when the imaging of the subject in the frame satisfies the preset composition rule.
FIG. 5 is a schematic flowchart of a control method for an aircraft according to another embodiment of the present invention. The control method of FIG. 5 is an example of the method of FIG. 2 and includes the following.
510: Determine the flight trajectory of the aircraft during shooting.
Specifically, before flight the user can manipulate the aircraft or an external device, and the flight trajectory the user wants during shooting is determined from the motion data detected by sensors on the aircraft or external device. In this way the user can tell the aircraft the shooting-time flight trajectory with a few simple actions, further reducing manual operation of the aircraft through an external device (e.g., a user terminal or remote controller) during shooting, improving the user experience, reducing power consumption, and improving the aircraft's endurance. For the specific method of determining the flight trajectory, see the embodiment of FIG. 2; it is not repeated here.
515: Acquire a feature image of the subject. Same as 410; not repeated here.
520: Determine shooting information for the subject, where the shooting information indicates the range of the subject in the picture to be captured. Same as 210; not repeated here.
525: Determine the composition rule for the subject in the picture to be captured. Same as 315; not repeated here.
530: After the aircraft is thrown, determine whether there is an obstacle in the flight direction. If there is no obstacle, go to 535; if there is, go to 550.
At the instant the aircraft is thrown, the sensing system on the aircraft first determines whether the flight direction is safe. For example, a ranging sensor can be used to detect whether there is an obstacle within a certain range (e.g., 6-7 metres) in the flight direction. It should be understood that this range is an empirical value, related to the shooting parameters of the photographing device, and can be adjusted for different models of photographing device.
It should be understood that the aircraft can start flight automatically in the various ways described in the embodiment of FIG. 2.
535: If there is no obstacle in the flight direction, control the aircraft to turn its nose around, and search for, recognize, and track the subject using the previously acquired feature image.
If there is no obstacle in the flight direction, the aircraft is turned to face opposite the initial flight direction so that the lens of the photographing device faces the subject, and the subject is searched for and recognized using the facial features recorded in 515. Once the subject is found, a tracking algorithm locks onto the subject's face, the subject is confirmed, and a human detector searches for the subject's full body to determine the main body of the subject.
For example, images transmitted by the photographing device or other vision sensors can be received in real time, and the feature image determined in 515 searched for and recognized in the received images. How to search for a feature image in an image is a conventional technique and is not described further here.
540,根据预设的构图规则和拍摄信息控制飞行器飞行至拍摄位置。
具体而言,在搜索并识别拍摄对象后,可以按照预设的构图规则,调整拍摄对象在待拍摄画面中的位置,并调整飞行器与拍摄对象之间的距离,从而使得拍摄对象在待拍摄画面中所占的比例趋于与拍摄信息所指示的拍摄对象在待拍摄画面中所占的比例一致,并且符合预设的构图规则。
应理解,可以在调整构图的同时调整飞行器与拍摄对象之间的距离,也可以不限定调整构图和调整距离的先后顺序。例如,可以先控制飞行器飞行至合适的拍摄位置,再调整构图,或者先调整构图,再控制飞行器飞行至合适的拍摄位置。
545,控制拍摄设备在飞行器按照飞行轨迹飞行时对拍摄对象进行拍摄。
例如,如果飞行器确定飞行轨迹为圆形,则飞行器在拍摄过程中可以环绕拍摄对象一边飞行一边拍摄,而如果飞行器确定飞行轨迹为拉近拍摄,则飞行器在拍摄过程中可以朝着拍摄对象的方向一边飞行一边拍摄。
例如,为了实现由近及远的运动长镜头,可以采用直线的飞行轨迹,在镜头的运动中实现空间的自然衔接转换,实现局部(拍摄对象)与整体(全景)的自然过渡。这种长镜头自拍方式被称为Dronies(Drone Selfies)。目前这类镜头一般通过手动控制实现,既要飞出较好的直线,又要控制镜头始终以拍摄对象为画面中心,对用户的飞行技术要求很高,一般人难以直接实现。根据本发明的实施例的智能拍摄方法能够更加容易地拍摄这类画面。
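这类由近及远的直线长镜头可以示意性地规划为一串航点,并在每个航点上计算使镜头保持对准拍摄对象的机头偏航角与云台俯仰角(以下坐标系约定、航点数量与爬升率均为假设):

```python
import math

def dronie_waypoints(subject_xyz, start_xyz, length_m=30.0, n=10):
    """沿"拍摄对象->起始点"方向的延长线生成由近及远的直线航点。

    返回 (x, y, z, yaw, pitch) 列表,yaw/pitch 为使镜头对准拍摄对象的角度(弧度)。
    """
    sx, sy, sz = subject_xyz
    x0, y0, z0 = start_xyz
    dx, dy = x0 - sx, y0 - sy
    d0 = math.hypot(dx, dy) or 1.0
    ux, uy = dx / d0, dy / d0            # 远离拍摄对象的水平单位方向
    waypoints = []
    for i in range(1, n + 1):
        t = length_m * i / n
        x, y, z = x0 + ux * t, y0 + uy * t, z0 + 0.3 * t   # 边后退边缓慢爬升(假设爬升率)
        yaw = math.atan2(sy - y, sx - x)                   # 机头始终指向拍摄对象
        pitch = math.atan2(sz - z, math.hypot(sx - x, sy - y))
        waypoints.append((x, y, z, yaw, pitch))
    return waypoints
```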
550,如果飞行方向上有障碍,则记录飞行器被抛出时的位置,并根据拍摄信息避开障碍飞行至拍摄位置。
如果飞行方向上有障碍,则可以先通过位置传感器,例如,GPS或者视觉传感器,获知抛出点的位置和高度,并记录该位置和高度。在这种情况下,可以规划路径绕开障碍,如无法绕行则尝试抬升高度,以避开障碍。另外,飞行过程中机头可以始终朝向前进方向,以保证飞行安全。
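"绕不开则抬升"的决策可以示意如下(obstacle_map.blocked为假设的占据查询接口,横向偏移与抬升量均为示意值):

```python
def plan_escape(obstacle_map, here, goal, max_climb=10.0):
    """极简的避障决策:先尝试直飞,再尝试左右平移绕行,绕不开则抬升高度。"""
    if not obstacle_map.blocked(here, goal):
        return [goal]                                     # 直飞
    for side in (2.0, -2.0, 4.0, -4.0):                   # 左右平移尝试绕行
        via = (here[0], here[1] + side, here[2])
        if not obstacle_map.blocked(here, via) and not obstacle_map.blocked(via, goal):
            return [via, goal]
    climb = (here[0], here[1], here[2] + max_climb)       # 无法绕行则尝试抬升
    return [climb, goal]
```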
555,控制飞行器掉转机头,并根据飞行器被抛出时的位置和之前获取的特征图像搜索并识别拍摄对象。
在飞行至拍摄位置后,控制飞行器掉转机头,并根据当前位置信息和记录的抛出点的位置和高度,先识别脸部特征,并使用人体检测器检测出全身。
可替代地,也可以在抛飞后实时地确定飞行方向上是否有障碍,在确认没有障碍时马上掉转机头,搜索并识别拍摄对象,即一边倒着飞行,一边搜索、识别并跟踪拍摄对象。
560,控制拍摄设备在飞行器按照飞行轨迹飞行时对拍摄对象进行拍摄。
应理解,560中,也可以控制拍摄设备的构图,使得拍摄对象在拍摄画面中的成像满足预设的构图规则,并在拍摄对象在拍摄画面中的成像满足预设的构图规则时,对拍摄对象进行拍摄。
应理解,在540和555中,也可以不掉转机头,而是保持机头朝向飞行方向不变,并将云台掉转方向来实现对拍摄对象的搜索、识别以及跟踪。
根据本发明的实施例,飞行器能够根据拍摄设备获取的特征图像,自动搜索、识别和跟踪拍摄对象,并自动构图,在飞行器被抛出后,根据预设的构图规则和拍摄信息控制飞行器飞行至拍摄位置完成一系列的连续拍摄,简单直观,无需使用遥控器或用户终端。另外,可以根据用户的意图自动规划飞行轨迹,整个过程更加平顺,交互友好。
以上描述了根据本发明实施例的控制方法,下面分别结合图6至图8描述根据本发明实施例的飞行器的控制装置、飞行器的控制设备和飞行器。
本发明实施例还提供了一种计算机存储介质,该计算机存储介质中存储有程序指令,所述程序指令被执行时可实现如图2至图5对应实施例中的飞行器的控制方法的部分或全部步骤。
图6是根据本发明的一个实施例的飞行器的控制装置600的结构示意图。飞行器的控制装置600例如可以为图1的飞行控制器。飞行器的控制装置600包括确定模块610和控制模块620。
确定模块610确定针对拍摄对象的拍摄信息,其中拍摄信息用于指示拍摄对象在待拍摄画面中的范围。控制模块620根据拍摄信息控制飞行器飞行至拍摄位置。
根据本发明的实施例,可以根据用户期望的拍摄对象在待拍摄画面中的范围,控制飞行器飞行至合适的拍摄位置,减少了拍摄过程中对飞行器的手动干涉,提高了用户体验,而且减少手动操作对续航时间的占用,实际上提高了飞行器的续航能力。
根据本发明的实施例,拍摄信息包括大景别、中等景别和小景别中的至少一个;或者拍摄信息包括全身像、大半身像、半身像、胸像、带肩头像和大头像中的至少一个。
根据本发明的实施例,控制模块620用于根据拍摄信息确定飞行器相对于拍摄对象的拍摄位置,并控制飞行器飞行至拍摄位置。
根据本发明的实施例,确定模块610还用于获取拍摄对象的图像,并根据图像确定拍摄对象。
根据本发明的实施例,确定模块610用于在飞行器起飞之前,控制拍摄设备拍摄拍摄对象,获取图像;或者在飞行器起飞之后,控制拍摄设备拍摄拍摄对象,获取图像;或者通过接收外部设备发送的拍摄对象的图像。
可选地,作为另一实施例,控制模块620还用于在飞行器飞行至拍摄位置后,控制飞行器携带的拍摄设备对拍摄对象进行拍摄。
根据本发明的实施例,控制模块620用于控制拍摄设备根据景深原理调整拍摄设备的焦距,并利用调整后的焦距对拍摄对象进行拍摄。
根据本发明的实施例,控制模块620用于控制拍摄设备的构图,使得拍摄对象在拍摄画面中的成像满足预设的构图规则,并且在拍摄对象在拍摄画面中的成像满足预设的构图规则时,对拍摄对象进行拍摄。
根据本发明的实施例,控制模块620用于通过调整飞行器的飞行姿态、拍摄设备的云台的运动和拍摄设备的焦距中的至少一个来控制拍摄设备的构图,使得拍摄对象在拍摄画面中的位置满足预设的构图规则。
可选地,作为另一实施例,确定模块610还用于确定拍摄拍摄对象时飞行器的飞行轨迹,其中控制模块620用于控制拍摄设备在飞行器按照飞行轨迹飞行时对拍摄对象进行拍摄。
根据本发明的实施例,确定模块610用于根据从外部设备接收到的输入确定飞行器的飞行轨迹。
根据本发明的实施例,确定模块610用于通过飞行器的运动传感器检测飞行器的运动,获取运动传感器输出的第一运动数据,根据第一运动数据确定飞行轨迹。
根据本发明的实施例,确定模块610用于获取外部设备的运动传感器检测外部设备运动而输出的第二运动数据,并根据第二运动数据确定飞行轨迹。
根据本发明的实施例,传感器包括:陀螺仪、电子罗盘、惯性测量单元、加速度计、全球导航卫星系统和视觉传感器中的至少一个。
根据本发明的实施例,运动包括如下运动中的至少一个:环绕运动、拉远运动、拉近运动和S形运动。
可选地,作为另一实施例,运动为环绕运动,飞行器的控制装置600还包括:第二检测模块650,用于在飞行器飞行之前,检测飞行器的云台的俯仰轴的转动,确定模块610还用于根据检测到的俯仰轴的转动和环绕运动确定飞行轨迹为螺旋上升和螺旋下降之一。
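环绕运动叠加俯仰轴转动的正负方向,即可得到螺旋上升或螺旋下降的轨迹,示意如下(每圈爬升量与"上抬为正"的符号约定均为假设):

```python
import math

def spiral_points(center, radius, pitch_axis_delta, turns=2, climb_per_turn=2.0, n=72):
    """生成围绕 center、半径 radius 的螺旋航点;
    pitch_axis_delta > 0(俯仰轴上抬)为螺旋上升,否则为螺旋下降。"""
    sign = 1.0 if pitch_axis_delta > 0 else -1.0
    cx, cy, cz = center
    points = []
    for i in range(n + 1):
        a = 2.0 * math.pi * turns * i / n
        z = cz + sign * climb_per_turn * a / (2.0 * math.pi)
        points.append((cx + radius * math.cos(a), cy + radius * math.sin(a), z))
    return points
```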
根据本发明的实施例,运动包括在垂直平面内的运动和在水平平面内的运动之一。
可选地,作为另一实施例,确定模块610还用于在确定飞行器的飞行轨迹前,确定是否接收到激活确定飞行轨迹的信号,该信号用于激活确定飞行器的飞行轨迹的过程。
根据本发明的实施例,确定模块610用于如果在预设的时间内没有输入飞行轨迹,则确定飞行轨迹为跟随飞行。
根据本发明的实施例,飞行轨迹包括环绕、拉远、拉近和S形中的至少一个。
可选地,作为另一实施例,控制模块620还用于在飞行器满足预设的自动启动条件时,自动启动飞行器。
根据本发明的实施例,控制模块620用于在飞行器被抛飞的情况下,检测飞行器的第三运动数据,并且在第三运动数据满足自动启动条件时,自动启动飞行器的动力装置。
根据本发明的实施例,第三运动数据包括飞行器被抛出的距离,第三运动数据满足自动启动条件包括:飞行器被抛出的距离大于或等于第一预设阈值;或者第三运动数据包括飞行器的垂直速度或速度,第三运动数据满足自动启动条件包括:飞行器的垂直速度或速度小于或等于第二预设阈值。
根据本发明的实施例,第一预设阈值为零或飞行器与用户之间的安全距离。
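上述两种自动启动判据可以合并示意为(阈值为假设的示例值,其中距离阈值对应上述安全距离):

```python
def should_start_motors(thrown_distance_m, vertical_speed_mps,
                        dist_threshold_m=1.5, speed_threshold_mps=0.0):
    """抛飞自动启动判据(任一满足即启动,阈值为示意):
    抛出距离达到第一预设阈值,或垂直速度降到第二预设阈值以下(如上抛至最高点附近)。"""
    return (thrown_distance_m >= dist_threshold_m
            or vertical_speed_mps <= speed_threshold_mps)
```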
根据本发明的实施例,控制模块620用于在飞行器被抛飞之前,在飞行器满足预设的怠速条件时,启动动力装置并控制动力装置怠速转动。
根据本发明的实施例,控制模块620在飞行器通过人脸解锁后控制动力装置怠速转动;或者在飞行器水平放置超过预设时长之后控制动力装置怠速转动;或者在确认收到允许怠速转动的信号时控制动力装置怠速转动。
根据本发明的实施例,控制模块620在飞行器起飞前,检测飞行器的第四运动数据,在第四运动数据满足自动启动条件时,自动启动飞行器的动力装置。
根据本发明的实施例,第四运动数据指示飞行器的姿态角位于预设的阈值范围的时长,第四运动数据满足自动启动条件,包括:时长超过第二预设阈值。
根据本发明的实施例,控制模块620用于通过飞行器的拍摄设备搜索和识别确定的拍摄对象;在搜索和识别到拍摄对象后,检测拍摄对象在拍摄画面中的范围是否与拍摄信息指示的范围一致;在拍摄对象在拍摄画面中的范围与拍摄信息指示的范围一致时,确定飞行器所在的位置为拍摄位置。
根据本发明的实施例,控制模块620具体用于当确定飞行器的前方不存在障碍时,控制飞行器的机头或飞行器的云台,使拍摄设备朝向起飞的位置,利用拍摄设备搜索和识别确定的拍摄对象。
根据本发明的实施例,控制模块620具体用于当确定飞行器前方存在障碍物时,控制飞行器绕开障碍物,并且控制飞行器调转机头或调转飞行器的云台,使拍摄设备朝向起飞的位置,利用拍摄设备搜索和识别确定的拍摄对象。
可选地,作为另一实施例,确定模块610还用于确定飞行器起飞后的飞行方向,其中确定模块610用于根据拍摄信息确定飞行器起飞后的飞行距离,并根据飞行方向和飞行距离确定拍摄位置。
根据本发明的实施例,确定模块610用于根据飞行器起飞前的设定确定飞行方向;或者根据飞行器起飞时飞行器的机头的方向确定飞行方向;或者根据飞行器起飞时所处的位置确定飞行方向;或者根据拍摄对象的位置确定飞行方向;或者根据拍摄对象的朝向确定飞行方向;或者根据被选择的拍摄角度确定飞行方向。
可选地,作为另一实施例,确定模块610还用于确定用于拍摄拍摄对象的拍摄设备的拍摄参数,其中确定模块610根据拍摄信息确定飞行器起飞后的飞行距离,并根据飞行距离和拍摄设备的拍摄参数确定拍摄位置。
根据本发明的实施例,拍摄参数为视场FOV参数和焦距参数中的至少一个。
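由视场角与期望占比估算飞行距离的几何关系,可按针孔模型示意如下(仅为几何近似,未考虑镜头畸变):

```python
import math

def shooting_distance(subject_height_m, target_fraction, vfov_deg):
    """估算拍摄距离:使高度为 subject_height_m 的拍摄对象
    在画面竖直方向上占 target_fraction(0~1)的比例,vfov_deg 为竖直视场角(度)。"""
    frame_height_m = subject_height_m / target_fraction   # 该距离处画面覆盖的实际高度
    return frame_height_m / (2.0 * math.tan(math.radians(vfov_deg) / 2.0))
```

例如,竖直视场角为90°时,让1.8米高的拍摄对象占画面高度的一半,约需在1.8米处拍摄;视场角越小(焦距越长),所需距离越远。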
可选地,作为另一实施例,确定模块610还用于确定拍摄对象在拍摄画面中需要满足的预设的构图规则,其中控制模块620用于根据预设的构图规则和拍摄信息控制飞行器飞行至拍摄位置。
根据本发明的实施例,构图规则包括拍摄对象在拍摄画面中的位置、拍摄对象的脸在拍摄画面中的角度、拍摄对象的脸在画面中的完整度中的一种或多种。
可选地,作为另一实施例,确定模块610还用于从外部设备接收预设的构图规则;或者通过识别拍摄对象的预设动作或姿势来确定构图规则。
根据本发明的实施例,构图规则包括如下构图规则之一:均衡式构图、对称式构图、对角线构图、三角形构图、九宫格构图、向心式构图、对分式构图、拍摄画面中的人脸为正脸、拍摄画面中的人脸为侧脸。
根据本发明的实施例,确定模块610根据接收到的外部设备的输入确定拍摄信息。
根据本发明的实施例,拍摄对象包括多个主体时,确定模块610具体用于根据多个主体的数目和拍摄信息确定飞行器的拍摄位置。
可选地,作为另一实施例,飞行器的控制装置600还包括:第一检测模块630,用于检测飞行器被抛飞时的速度、加速度和抛飞轨迹中的至少一个,其中确定模块610用于根据速度、加速度和抛飞轨迹中的至少一个从预先设置的多种拍摄信息中选择拍摄信息。
可选地,作为另一实施例,飞行器的控制装置600还包括:第三检测模块660,用于检测环境状况信息和/或拍摄对象的姿势信息,其中所述控制模块还用于根据环境状况信息和/或拍摄对象的姿势信息调整拍摄角度。
飞行器的控制装置600的各个模块的操作和功能可以参考上述图2至图5的方法,为了避免重复,在此不再赘述。
图7是根据本发明的另一实施例的一种飞行器的控制设备700的结构示意图。控制设备700包括:处理器710和存储器720。
存储器720用于存储指令以使得处理器710执行如图2至图5中的任一实施例的方法。
图8是根据本发明的实施例的一种飞行器800的结构示意图。飞行器800包括控制设备810、一个或多个推进装置820以及传感系统830。传感系统830用于检测飞行器800的运动参数,一个或多个推进装置820用于为飞行器800提供飞行动力。控制设备810与一个或多个推进装置820通信连接,并且与传感系统830通信连接,用于根据传感系统830检测到的运动参数控制一个或多个推进装置820工作,以控制飞行器800的飞行。控制设备810可以为图7的控制设备。推进装置820可以为图1的动力系统。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及算法步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本发明的范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。另一点,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本发明各个实施例中的各功能单元可以集成在一个处理单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本发明的技术方案本质上或者说对现有技术做出贡献的部分或者该技术方案的部分可以以软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,包括若干指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本发明各个实施例所述方法的全部或部分步骤。而前述的存储介质包括:U盘、移动硬盘、只读存储器(ROM,Read-Only Memory)、随机存取存储器(RAM,Random Access Memory)、磁碟或者光盘等各种可以存储程序代码的介质。
以上所述,仅为本发明的具体实施方式,但本发明的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本发明揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本发明的保护范围之内。因此,本发明的保护范围应以权利要求的保护范围为准。

Claims (88)

  1. 一种飞行器的控制方法,其特征在于,包括:
    确定针对拍摄对象的拍摄信息,其中所述拍摄信息用于指示所述拍摄对象在待拍摄画面中的范围;
    根据所述拍摄信息控制所述飞行器飞行至拍摄位置。
  2. 根据权利要求1所述的控制方法,其特征在于,所述拍摄信息包括大景别、中等景别和小景别中的至少一个;或者
    所述拍摄信息包括全身像、大半身像、半身像、胸像、带肩头像和大头像中的至少一个。
  3. 根据权利要求1或2所述的控制方法,其特征在于,还包括:
    获取所述拍摄对象的图像;
    根据所述图像确定所述拍摄对象。
  4. 根据权利要求3所述的控制方法,其特征在于,所述获取所述拍摄对象的图像,包括:
    在所述飞行器起飞之前,控制所述拍摄设备拍摄所述拍摄对象,获取所述图像;或者
    在所述飞行器起飞之后,控制所述拍摄设备拍摄所述拍摄对象,获取所述图像;或者
    通过接收外部设备发送的所述拍摄对象的图像。
  5. 根据权利要求1至4中的任一项所述的控制方法,其特征在于,还包括:
    在所述飞行器飞行至所述拍摄位置后,控制所述飞行器携带的拍摄设备对所述拍摄对象进行拍摄。
  6. 根据权利要求5所述的控制方法,其特征在于,所述控制所述飞行器携带的拍摄设备对所述拍摄对象进行拍摄,包括:
    控制所述拍摄设备根据景深原理调整所述拍摄设备的焦距,并利用调整后的焦距对所述拍摄对象进行拍摄。
  7. 根据权利要求5所述的控制方法,其特征在于,所述控制所述飞行器携带的拍摄设备对所述拍摄对象进行拍摄,包括:
    控制所述拍摄设备的构图,使得所述拍摄对象在所述拍摄画面中的成像满足预设的构图规则;
    在所述拍摄对象在所述拍摄画面中的成像满足预设的构图规则时,对所述拍摄对象进行拍摄。
  8. 根据权利要求7所述的控制方法,其特征在于,所述控制所述拍摄设备的构图,使得所述拍摄对象在所述拍摄画面中的成像满足预设的构图规则,包括:
    通过调整所述飞行器的飞行姿态、所述拍摄设备的云台的运动和所述拍摄设备的焦距中的至少一个来控制所述拍摄设备的构图,使得所述拍摄对象在所述拍摄画面中的位置满足所述预设的构图规则。
  9. 根据权利要求5至8中的任一项所述的控制方法,其特征在于,还包括:
    确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,
    其中所述控制所述飞行器携带的拍摄设备对所述拍摄对象进行拍摄,包括:
    控制所述拍摄设备在所述飞行器按照所述飞行轨迹飞行时对所述拍摄对象进行拍摄。
  10. 根据权利要求9所述的控制方法,其特征在于,所述确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,包括:
    根据从外部设备接收到的输入确定所述飞行器的飞行轨迹。
  11. 根据权利要求9所述的控制方法,其特征在于,所述确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,包括:
    通过所述飞行器的运动传感器检测所述飞行器的运动,获取所述运动传感器输出的第一运动数据,根据所述第一运动数据确定所述飞行轨迹。
  12. 根据权利要求9所述的控制方法,其特征在于,所述确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,包括:
    获取所述外部设备的运动传感器检测外部设备运动而输出的第二运动数据,根据所述第二运动数据确定所述飞行轨迹。
  13. 根据权利要求11或12所述的控制方法,其特征在于,所述传感器包括:陀螺仪、电子罗盘、惯性测量单元、加速度计、全球导航卫星系统和视觉传感器中的至少一个。
  14. 根据权利要求11至13中的任一项所述的控制方法,其特征在于,所述运动包括如下运动中的至少一个:环绕运动、拉远运动、拉近运动和S形运动。
  15. 根据权利要求14所述的控制方法,其特征在于,所述运动为环绕运动,所述方法还包括:
    在飞行器飞行之前,检测所述飞行器的云台的俯仰轴的转动;
    根据检测到的所述俯仰轴的转动和所述环绕运动确定所述飞行轨迹为螺旋上升和螺旋下降之一。
  16. 根据权利要求11至15中的任一项所述的控制方法,其特征在于,所述运动包括在垂直平面内的运动和在水平平面内的运动之一。
  17. 根据权利要求9至16中的任一项所述的控制方法,其特征在于,在确定所述飞行器的飞行轨迹前,还包括:
    确定是否接收到激活确定所述飞行轨迹的信号,所述信号用于激活所述确定所述飞行器的飞行轨迹的过程。
  18. 根据权利要求9至17中的任一项所述的控制方法,其特征在于,所述确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,包括:
    如果在预设的时间内没有输入飞行轨迹,则确定所述飞行轨迹为跟随飞行。
  19. 根据权利要求9至18中的任一项所述的控制方法,其特征在于,所述飞行轨迹包括环绕、拉远、拉近和S形中的至少一个。
  20. 根据权利要求1至19中的任一项所述的控制方法,其特征在于,还包括:
    在所述飞行器满足预设的自动启动条件时,自动启动所述飞行器。
  21. 根据权利要求20所述的控制方法,其特征在于,所述在所述飞行器满足预设的自动启动条件时,自动启动所述飞行器,包括:
    在所述飞行器被抛飞的情况下,检测所述飞行器的第三运动数据;
    在所述第三运动数据满足所述自动启动条件时,自动启动所述飞行器的动力装置。
  22. 根据权利要求21所述的控制方法,其特征在于,所述第三运动数据包括所述飞行器被抛出的距离,所述第三运动数据满足所述自动启动条件包括:所述飞行器被抛出的距离大于或等于第一预设阈值;或者
    所述第三运动数据包括所述飞行器的垂直速度或速度,所述第三运动数据满足所述自动启动条件包括:所述飞行器的垂直速度或速度小于或等于第二预设阈值。
  23. 根据权利要求22所述的控制方法,其特征在于,所述第一预设阈值为零或所述飞行器与所述用户之间的安全距离。
  24. 根据权利要求20所述的控制方法,其特征在于,所述在所述飞行器满足预设的自动启动条件时,自动启动所述飞行器,包括:
    在所述飞行器被抛飞之前,在所述飞行器满足预设的怠速条件时,启动所述动力装置并控制所述动力装置怠速转动。
  25. 根据权利要求24所述的控制方法,其特征在于,在所述飞行器满足预设的怠速条件时,启动所述动力装置并控制所述动力装置怠速转动,包括:
    在所述飞行器通过人脸解锁后控制所述动力装置怠速转动;或者
    在所述飞行器水平放置超过预设时长之后控制所述动力装置怠速转动;或者
    在确认收到允许怠速转动的信号时控制所述动力装置怠速转动。
  26. 根据权利要求20所述的控制方法,其特征在于,所述在所述飞行器满足预设的自动启动条件时,自动启动所述飞行器,包括:
    在飞行器起飞前,检测所述飞行器的第四运动数据;
    在所述第四运动数据满足所述自动启动条件时,自动启动所述飞行器的动力装置。
  27. 根据权利要求26所述的控制方法,其特征在于,所述第四运动数据指示所述飞行器的姿态角位于预设的阈值范围的时长,所述第四运动数据满足所述自动启动条件,包括:所述时长超过第二预设阈值。
  28. 根据权利要求1至27中的任一项所述的控制方法,其特征在于,所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    通过所述飞行器的所述拍摄设备搜索和识别所述确定的拍摄对象;
    在搜索和识别到所述拍摄对象后,检测所述拍摄对象在拍摄画面中的范围是否与所述拍摄信息指示的所述范围一致;
    在所述拍摄对象在拍摄画面中的范围与所述拍摄信息指示的所述范围一致时,确定所述飞行器所在的位置为所述拍摄位置。
  29. 根据权利要求28所述的控制方法,其特征在于,所述飞行器通过所述拍摄设备搜索和识别所述确定的拍摄对象具体包括:
    当确定所述飞行器的前方不存在障碍时,控制所述飞行器的机头或所述飞行器的云台,使所述拍摄设备朝向起飞的位置,利用所述拍摄设备搜索和识别所述确定的拍摄对象。
  30. 根据权利要求28所述的控制方法,其特征在于,所述飞行器通过所述拍摄设备搜索和识别所述确定的拍摄对象具体包括:
    当确定所述飞行器前方存在障碍物时,控制所述飞行器绕开所述障碍物;
    控制所述飞行器调转机头或调转所述飞行器的云台,使所述拍摄设备朝向起飞的位置,利用所述拍摄设备搜索和识别所述确定的拍摄对象。
  31. 根据权利要求1至27中的任一项所述的控制方法,其特征在于,所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    根据所述拍摄信息确定所述飞行器相对于所述拍摄对象的拍摄位置;
    控制所述飞行器飞行至所述拍摄位置。
  32. 根据权利要求1至31中的任一项所述的控制方法,其特征在于,所述拍摄对象包括多个主体时,所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    根据所述多个主体的数目和所述拍摄信息确定所述飞行器的拍摄位置。
  33. 根据权利要求1至31中的任一项所述的控制方法,其特征在于,还包括:
    确定所述飞行器起飞后的飞行方向,
    其中所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    根据所述拍摄信息确定所述飞行器起飞后的飞行距离;
    根据所述飞行方向和所述飞行距离确定所述拍摄位置。
  34. 根据权利要求33所述的控制方法,其特征在于,所述确定所述飞行器起飞后的飞行方向,包括:
    根据所述飞行器起飞前的设定确定所述飞行方向;或者
    根据所述飞行器起飞时所述飞行器的机头的方向确定所述飞行方向;或者
    根据所述飞行器起飞时所处的位置确定所述飞行方向;或者
    根据所述拍摄对象的位置确定所述飞行方向;或者
    根据所述拍摄对象的朝向确定所述飞行方向;或者
    根据被选择的拍摄角度确定所述飞行方向。
  35. 根据权利要求1至31中的任一项所述的控制方法,其特征在于,还包括:
    确定用于拍摄所述拍摄对象的拍摄设备的拍摄参数,
    其中所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    根据所述拍摄信息确定所述飞行器起飞后的飞行距离;
    根据所述飞行距离和所述拍摄设备的拍摄参数确定所述拍摄位置。
  36. 根据权利要求35所述的控制方法,其特征在于,所述拍摄参数为视场FOV参数和焦距参数中的至少一个。
  37. 根据权利要求1至31中的任一项所述的控制方法,其特征在于,还包括:
    确定所述拍摄对象在所述拍摄画面中需要满足的预设的构图规则;
    其中所述根据所述拍摄信息控制所述飞行器飞行至拍摄位置,包括:
    根据所述预设的构图规则和所述拍摄信息控制所述飞行器飞行至所述拍摄位置。
  38. 根据权利要求7、8、37中的任一项所述的控制方法,其特征在于,还包括:
    从外部设备接收预设的所述构图规则;或者
    通过识别拍摄对象的预设动作或姿势来确定所述构图规则。
  39. 根据权利要求7、8、37、38中的任一项所述的控制方法,其特征在于,所述构图规则包括所述拍摄对象在拍摄画面中的位置、所述拍摄对象的脸在所述拍摄画面中的角度、所述拍摄对象的脸在所述画面中的完整度中的一种或多种。
  40. 根据权利要求7、8、37至39中的任一项所述的控制方法,其特征在于,所述构图规则包括如下构图规则之一:均衡式构图、对称式构图、对角线构图、三角形构图、九宫格构图、向心式构图、对分式构图、拍摄画面中的人脸为正脸、拍摄画面中的人脸为侧脸。
  41. 根据权利要求1至40中的任一项所述的控制方法,其特征在于,所述确定针对拍摄对象的拍摄信息,包括:
    根据接收到的外部设备的输入确定所述拍摄信息。
  42. 根据权利要求1至40中的任一项所述的控制方法,其特征在于,还包括:
    检测所述飞行器被抛飞时的速度、加速度和抛飞轨迹中的至少一个;
    其中所述确定针对拍摄对象的拍摄信息,包括:
    根据所述速度、所述加速度和所述抛飞轨迹中的至少一个从预先设置的多种拍摄信息中选择所述拍摄信息。
  43. 根据权利要求1至42中的任一项所述的控制方法,其特征在于,还包括:
    检测环境状况信息和/或所述拍摄对象的姿势信息;
    根据所述环境状况信息和/或所述拍摄对象的姿势信息调整拍摄角度。
  44. 一种飞行器的控制装置,其特征在于,包括:
    确定模块,用于确定针对拍摄对象的拍摄信息,其中所述拍摄信息用于指示所述拍摄对象在待拍摄画面中的范围;
    控制模块,用于根据所述拍摄信息控制所述飞行器飞行至拍摄位置。
  45. 根据权利要求44所述的控制装置,其特征在于,所述拍摄信息包括大景别、中等景别和小景别中的至少一个;或者,所述拍摄信息包括全身像、大半身像、半身像、胸像、带肩头像和大头像中的至少一个。
  46. 根据权利要求44或45所述的控制装置,其特征在于,所述确定模块还用于获取所述拍摄对象的图像,并根据所述图像确定所述拍摄对象。
  47. 根据权利要求46所述的控制装置,其特征在于,所述确定模块用于在所述飞行器起飞之前,控制所述拍摄设备拍摄所述拍摄对象,获取所述图像;或者在所述飞行器起飞之后,控制所述拍摄设备拍摄所述拍摄对象,获取所述图像;或者通过接收外部设备发送的所述拍摄对象的图像。
  48. 根据权利要求44至47中的任一项所述的控制装置,其特征在于,所述控制模块还用于在所述飞行器飞行至所述拍摄位置后,控制所述飞行器携带的拍摄设备对所述拍摄对象进行拍摄。
  49. 根据权利要求48所述的控制装置,其特征在于,所述控制模块用于控制所述拍摄设备根据景深原理调整所述拍摄设备的焦距,并利用调整后的焦距对所述拍摄对象进行拍摄。
  50. 根据权利要求48所述的控制装置,其特征在于,所述控制模块用于控制所述拍摄设备的构图,使得所述拍摄对象在所述拍摄画面中的成像满足预设的构图规则,并且在所述拍摄对象在所述拍摄画面中的成像满足预设的构图规则时,对所述拍摄对象进行拍摄。
  51. 根据权利要求50所述的控制装置,其特征在于,所述控制模块用于通过调整所述飞行器的飞行姿态、所述拍摄设备的云台的运动和所述拍摄设备的焦距中的至少一个来控制所述拍摄设备的构图,使得所述拍摄对象在所述拍摄画面中的位置满足所述预设的构图规则。
  52. 根据权利要求48至51中的任一项所述的控制装置,其特征在于,所述确定模块还用于确定拍摄所述拍摄对象时所述飞行器的飞行轨迹,其中所述控制模块用于控制所述拍摄设备在所述飞行器按照所述飞行轨迹飞行时对所述拍摄对象进行拍摄。
  53. 根据权利要求52所述的控制装置,其特征在于,所述确定模块用于根据从外部设备接收到的输入确定所述飞行器的飞行轨迹。
  54. 根据权利要求52所述的控制装置,其特征在于,所述确定模块用于通过所述飞行器的运动传感器检测所述飞行器的运动,获取所述运动传感器输出的第一运动数据,根据所述第一运动数据确定所述飞行轨迹。
  55. 根据权利要求52所述的控制装置,其特征在于,所述确定模块用于获取所述外部设备的运动传感器检测外部设备运动而输出的第二运动数据,并根据所述第二运动数据确定所述飞行轨迹。
  56. 根据权利要求54或55所述的控制装置,其特征在于,所述传感器包括:陀螺仪、电子罗盘、惯性测量单元、加速度计、全球导航卫星系统和视觉传感器中的至少一个。
  57. 根据权利要求54至56中的任一项所述的控制装置,其特征在于,所述运动包括如下运动中的至少一个:环绕运动、拉远运动、拉近运动和S形运动。
  58. 根据权利要求57所述的控制装置,其特征在于,所述运动为环绕运动,所述控制装置还包括:
    第二检测模块,用于在飞行器飞行之前,检测所述飞行器的云台的俯仰轴的转动,所述确定模块还用于根据检测到的所述俯仰轴的转动和所述环绕运动确定所述飞行轨迹为螺旋上升和螺旋下降之一。
  59. 根据权利要求51至58中的任一项所述的控制装置,其特征在于,所述运动包括在垂直平面内的运动和在水平平面内的运动之一。
  60. 根据权利要求51至59中的任一项所述的控制装置,其特征在于,所述确定模块还用于在确定所述飞行器的飞行轨迹前,确定是否接收到激活确定所述飞行轨迹的信号,所述信号用于激活所述确定所述飞行器的飞行轨迹的过程。
  61. 根据权利要求51至60中的任一项所述的控制装置,其特征在于,所述确定模块用于如果在预设的时间内没有输入飞行轨迹,则确定所述飞行轨迹为跟随飞行。
  62. 根据权利要求51至61中的任一项所述的控制装置,其特征在于,所述飞行轨迹包括环绕、拉远、拉近和S形中的至少一个。
  63. 根据权利要求44至62中的任一项所述的控制装置,其特征在于,所述控制模块还用于在所述飞行器满足预设的自动启动条件时,自动启动所述飞行器。
  64. 根据权利要求63所述的控制装置,其特征在于,所述控制模块用于在所述飞行器被抛飞的情况下,检测所述飞行器的第三运动数据,并且在所述第三运动数据满足所述自动启动条件时,自动启动所述飞行器的动力装置。
  65. 根据权利要求64所述的控制装置,其特征在于,所述运动数据包括所述飞行器被抛出的距离,所述第三运动数据满足所述自动启动条件包括:所述飞行器被抛出的距离大于或等于第一预设阈值;或者
    所述第三运动数据包括所述飞行器的垂直速度或速度,所述第三运动数据满足所述自动启动条件包括:所述飞行器的垂直速度或速度小于或等于第二预设阈值。
  66. 根据权利要求65所述的控制装置,其特征在于,所述第一预设阈值为零或所述飞行器与所述用户之间的安全距离。
  67. 根据权利要求63所述的控制装置,其特征在于,所述控制模块用于在所述飞行器被抛飞之前,在所述飞行器满足预设的怠速条件时,启动所述动力装置并控制所述动力装置怠速转动。
  68. 根据所述权利要求67所述的控制装置,其特征在于,所述控制模块在所述飞行器通过人脸解锁后控制所述动力装置怠速转动;或者在所述飞行器水平放置超过预设时长之后控制所述动力装置怠速转动;或者在确认收到允许怠速转动的信号时控制所述动力装置怠速转动。
  69. 根据权利要求63所述的控制装置,其特征在于,所述控制模块在飞行器起飞前,检测所述飞行器的第四运动数据,在所述第四运动数据满足所述自动启动条件时,自动启动所述飞行器的动力装置。
  70. 根据权利要求69所述的控制装置,其特征在于,所述第四运动数据指示所述飞行器的姿态角位于预设的阈值范围的时长,所述第四运动数据满足所述自动启动条件,包括:所述时长超过第二预设阈值。
  71. 根据权利要求44至70中的任一项所述的控制装置,其特征在于,所述控制模块用于通过所述飞行器的所述拍摄设备搜索和识别所述确定的拍摄对象;在搜索和识别到所述拍摄对象后,检测所述拍摄对象在拍摄画面中的范围是否与所述拍摄信息指示的所述范围一致;在所述拍摄对象在拍摄画面中的范围与所述拍摄信息指示的所述范围一致时,确定所述飞行器所在的位置为所述拍摄位置。
  72. 根据权利要求71所述的控制装置,其特征在于,所述控制模块具体用于当确定所述飞行器的前方不存在障碍时,控制所述飞行器的机头或所述飞行器的云台,使所述拍摄设备朝向起飞的位置,利用所述拍摄设备搜索和识别所述确定的拍摄对象。
  73. 根据权利要求71所述的控制装置,其特征在于,所述控制模块具体用于当确定所述飞行器前方存在障碍物时,控制所述飞行器绕开所述障碍物,并且控制所述飞行器调转机头或调转所述飞行器的云台,使所述拍摄设备朝向起飞的位置,利用所述拍摄设备搜索和识别所述确定的拍摄对象。
  74. 根据权利要求44至70中的任一项所述的控制装置,其特征在于,所述控制模块用于根据所述拍摄信息确定所述飞行器相对于所述拍摄对象的拍摄位置,并控制所述飞行器飞行至所述拍摄位置。
  75. 根据权利要求44至74中的任一项所述的控制装置,其特征在于,所述拍摄对象包括多个主体时,所述确定模块具体用于根据所述多个主体的数目和所述拍摄信息确定所述飞行器的拍摄位置。
  76. 根据权利要求44至74中的任一项所述的控制装置,其特征在于,所述确定模块还用于确定所述飞行器起飞后的飞行方向,其中所述确定模块用于根据所述拍摄信息确定所述飞行器起飞后的飞行距离,并根据所述飞行方向和所述飞行距离确定所述拍摄位置。
  77. 根据权利要求76所述的控制装置,其特征在于,所述确定模块用于根据所述飞行器起飞前的设定确定所述飞行方向;或者根据所述飞行器起飞时所述飞行器的机头的方向确定所述飞行方向;或者根据所述飞行器起飞时所处的位置确定所述飞行方向;或者根据所述拍摄对象的位置确定所述飞行方向;或者根据所述拍摄对象的朝向确定所述飞行方向;或者根据被选择的拍摄角度确定所述飞行方向。
  78. 根据权利要求44至74中的任一项所述的控制装置,其特征在于,所述确定模块还用于确定用于拍摄所述拍摄对象的拍摄设备的拍摄参数,其中所述确定模块根据所述拍摄信息确定所述飞行器起飞后的飞行距离,并根据所述飞行距离和所述拍摄设备的拍摄参数确定所述拍摄位置。
  79. 根据权利要求78所述的控制装置,其特征在于,所述拍摄参数为视场FOV参数和焦距参数中的至少一个。
  80. 根据权利要求44至74中的任一项所述的控制装置,其特征在于,所述确定模块还用于确定所述拍摄对象在所述拍摄画面中需要满足的预设的构图规则,其中所述控制模块用于根据所述预设的构图规则和所述拍摄信息控制所述飞行器飞行至所述拍摄位置。
  81. 根据权利要求51、52和80中的任一项所述的控制装置,其特征在于,所述确定模块还用于从外部设备接收预设的所述构图规则;或者通过识别拍摄对象的预设动作或姿势来确定所述构图规则。
  82. 根据权利要求51、52、80和81中的任一项所述的控制装置,其特征在于,所述构图规则包括所述拍摄对象在拍摄画面中的位置、所述拍摄对象的脸在所述拍摄画面中的角度、所述拍摄对象的脸在所述画面中的完整度中的一种或多种。
  83. 根据权利要求51、52、80至82中的任一项所述的控制装置,其特征在于,所述构图规则包括如下构图规则之一:均衡式构图、对称式构图、对角线构图、三角形构图、九宫格构图、向心式构图、对分式构图、拍摄画面中的人脸为正脸、拍摄画面中的人脸为侧脸。
  84. 根据权利要求44至83中的任一项所述的控制装置,其特征在于,所述确定模块根据接收到的外部设备的输入确定所述拍摄信息。
  85. 根据权利要求44至83中的任一项所述的控制装置,其特征在于,还包括:
    第一检测模块,用于检测所述飞行器被抛飞时的速度、加速度和抛飞轨迹中的至少一个,其中所述确定模块用于根据所述速度、所述加速度和所述抛飞轨迹中的至少一个从预先设置的多种拍摄信息中选择所述拍摄信息。
  86. 根据权利要求44至85中的任一项所述的控制装置,其特征在于,还包括:
    第三检测模块,用于检测环境状况信息和/或所述拍摄对象的姿势信息,其中所述控制模块还用于根据所述环境状况信息和/或所述拍摄对象的姿势信息调整拍摄角度。
  87. 一种飞行器的控制设备,其特征在于,包括:处理器和存储器,
    其中所述存储器用于存储指令以使得所述处理器执行如权利要求1至43中的任一项所述的方法。
  88. 一种飞行器,包括:
    传感系统,用于检测所述飞行器的运动参数;
    如权利要求87所述的控制设备;以及
    一个或多个推进装置,用于为所述飞行器提供飞行动力;
    其中,所述控制设备与所述一个或多个推进装置通信连接,并且与所述传感系统通信连接,用于根据所述传感系统检测的运动参数控制所述一个或多个推进装置工作,以控制所述飞行器的飞行。
PCT/CN2016/107997 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器 WO2018098678A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
CN201910392486.0A CN110119154A (zh) 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器
CN201680002531.1A CN107087427B (zh) 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器
PCT/CN2016/107997 WO2018098678A1 (zh) 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器
US16/426,182 US11188101B2 (en) 2016-11-30 2019-05-30 Method for controlling aircraft, device, and aircraft
US17/456,753 US20220083078A1 (en) 2016-11-30 2021-11-29 Method for controlling aircraft, device, and aircraft

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/107997 WO2018098678A1 (zh) 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/426,182 Continuation US11188101B2 (en) 2016-11-30 2019-05-30 Method for controlling aircraft, device, and aircraft

Publications (1)

Publication Number Publication Date
WO2018098678A1

Family

ID=59614396

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/107997 WO2018098678A1 (zh) 2016-11-30 2016-11-30 飞行器的控制方法、装置和设备以及飞行器

Country Status (3)

Country Link
US (2) US11188101B2 (zh)
CN (2) CN110119154A (zh)
WO (1) WO2018098678A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220292990A1 (en) * 2021-03-12 2022-09-15 Thales Method and electronic device for generating at least one eosid trajectory for at least one runway, related computer program and electronic flight management system

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10375289B2 (en) * 2017-03-31 2019-08-06 Hangzhou Zero Zero Technology Co., Ltd. System and method for providing autonomous photography and videography
JP7057637B2 (ja) * 2017-08-23 2022-04-20 キヤノン株式会社 制御装置、制御システム、制御方法、プログラム、及び記憶媒体
CN107817820A (zh) * 2017-10-16 2018-03-20 复旦大学 一种基于深度学习的无人机自主飞行控制方法与系统
CN107968692B (zh) * 2017-11-17 2021-10-22 深圳市道通智能航空技术股份有限公司 一种无人机的通信方法、通信装置及无人机
CN109643131A (zh) * 2017-11-30 2019-04-16 深圳市大疆创新科技有限公司 无人机、其控制方法以及记录介质
CN109074095B (zh) * 2017-12-26 2022-04-01 深圳市大疆创新科技有限公司 一种飞行轨迹原路复演方法及飞行器
WO2019127111A1 (zh) * 2017-12-27 2019-07-04 深圳市大疆创新科技有限公司 拍摄系统、承载装置、拍摄装置以及它们的控制方法
CN108163203B (zh) * 2017-12-31 2020-10-13 深圳市道通智能航空技术有限公司 一种拍摄控制方法、装置及飞行器
CN109196438A (zh) * 2018-01-23 2019-01-11 深圳市大疆创新科技有限公司 一种飞行控制方法、设备、飞行器、系统及存储介质
US11511842B2 (en) * 2018-02-20 2022-11-29 Georgia Tech Research Corporation Miniature autonomous robotic blimp
CN110386087B (zh) * 2018-04-23 2022-04-12 上海擎感智能科技有限公司 基于车辆的拍摄方法、存储介质、电子设备、及车辆
WO2019227333A1 (zh) * 2018-05-30 2019-12-05 深圳市大疆创新科技有限公司 集体照拍摄方法和装置
EP3825763A1 (en) * 2018-07-18 2021-05-26 SZ DJI Technology Co., Ltd. Image photographing method and unmanned aerial vehicle
WO2020024104A1 (zh) * 2018-07-31 2020-02-06 深圳市大疆创新科技有限公司 返航控制方法、装置及设备
WO2020024134A1 (zh) * 2018-08-01 2020-02-06 深圳市大疆创新科技有限公司 轨迹切换的方法和装置
WO2020087346A1 (zh) * 2018-10-31 2020-05-07 深圳市大疆创新科技有限公司 拍摄控制方法、可移动平台、控制设备及存储介质
CN111586284B (zh) * 2019-02-19 2021-11-30 北京小米移动软件有限公司 景别提醒方法及装置
CN109960281A (zh) * 2019-04-17 2019-07-02 深圳市道通智能航空技术有限公司 环绕飞行的控制方法、装置、终端及存储介质
CN109976370B (zh) * 2019-04-19 2022-09-30 深圳市道通智能航空技术股份有限公司 立面环绕飞行的控制方法、装置、终端及存储介质
CN110375716B (zh) * 2019-07-23 2022-01-14 深圳市道通智能航空技术股份有限公司 无人飞行器寻找信息生成方法及无人飞行器
CN110519509A (zh) * 2019-08-01 2019-11-29 幻想动力(上海)文化传播有限公司 构图评价方法、摄影方法、装置、电子设备、存储介质
CN110312078B (zh) * 2019-08-02 2021-06-29 睿魔智能科技(深圳)有限公司 一种自动环绕目标拍摄方法及系统
CN110830719A (zh) * 2019-11-14 2020-02-21 苏州臻迪智能科技有限公司 取景范围确定方法及系统,拍摄控制方法及系统
CN110971819A (zh) * 2019-11-24 2020-04-07 西安呱牛信息技术有限公司 基于卫星导航用于拍摄的无人机路径控制方法及其系统
CN112243581A (zh) * 2019-11-26 2021-01-19 深圳市大疆创新科技有限公司 云台、云台的控制设备、云台的控制方法及存储介质
CN110989676B (zh) * 2019-12-20 2023-08-15 北京空天技术研究所 一种飞行器机动轨迹的过渡段轨迹生成方法及装置
JP7247904B2 (ja) * 2020-01-15 2023-03-29 トヨタ自動車株式会社 ドローンシステム及びドローンによる車両撮影方法
WO2021212445A1 (zh) * 2020-04-24 2021-10-28 深圳市大疆创新科技有限公司 拍摄方法、可移动平台、控制设备和存储介质
CN112639652A (zh) * 2020-05-07 2021-04-09 深圳市大疆创新科技有限公司 目标跟踪方法和装置、可移动平台以及成像平台
WO2022094808A1 (zh) * 2020-11-04 2022-05-12 深圳市大疆创新科技有限公司 拍摄控制方法、装置、无人机、设备及可读存储介质
WO2022141311A1 (zh) * 2020-12-30 2022-07-07 深圳市大疆创新科技有限公司 无人机控制方法、装置、无人机、终端、系统及存储介质
CN113784051A (zh) * 2021-09-23 2021-12-10 深圳市道通智能航空技术股份有限公司 控制飞行器基于人像模式拍摄的方法、装置、设备及介质
WO2023211695A1 (en) * 2022-04-27 2023-11-02 Snap Inc. Unlocking an autonomous drone for takeoff

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080204402A1 (en) * 2007-02-22 2008-08-28 Yoichi Hirata User interface device
CN104020777A (zh) * 2014-06-17 2014-09-03 成都华诚智印科技有限公司 一种体感跟随式飞行控制系统及其控制方法
CN104808675A (zh) * 2015-03-03 2015-07-29 广州亿航智能技术有限公司 基于智能终端的体感飞行操控系统及终端设备
CN104828256A (zh) * 2015-04-21 2015-08-12 杨珊珊 一种智能多模式飞行拍摄设备及其飞行控制方法
CN105512643A (zh) * 2016-01-06 2016-04-20 北京二郎神科技有限公司 一种图像采集方法和装置
CN105554480A (zh) * 2016-03-01 2016-05-04 深圳市大疆创新科技有限公司 无人机拍摄图像的控制方法、装置、用户设备及无人机
CN105843241A (zh) * 2016-04-11 2016-08-10 零度智控(北京)智能科技有限公司 无人机、无人机起飞控制方法及装置
CN105867362A (zh) * 2016-04-20 2016-08-17 北京博瑞爱飞科技发展有限公司 终端设备和无人驾驶飞行器的控制系统
CN105979147A (zh) * 2016-06-22 2016-09-28 上海顺砾智能科技有限公司 一种无人机智能拍摄方法

Family Cites Families (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090225001A1 (en) * 2007-11-06 2009-09-10 University Of Central Florida Research Foundation, Inc. Hybrid Display Systems and Methods
US8477190B2 (en) * 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
CN107577247B (zh) * 2014-07-30 2021-06-25 深圳市大疆创新科技有限公司 目标追踪系统及方法
CN105438488B (zh) * 2014-09-30 2018-07-17 深圳市大疆创新科技有限公司 飞行器及其控制方法以及飞行器系统
CN108351574B (zh) * 2015-10-20 2020-12-22 深圳市大疆创新科技有限公司 用于设置相机参数的系统、方法和装置
CN105391939B (zh) * 2015-11-04 2017-09-29 腾讯科技(深圳)有限公司 无人机拍摄控制方法和装置、无人机拍摄方法和无人机
CN105430261A (zh) * 2015-11-16 2016-03-23 杨珊珊 无人飞行器拍摄方法及无人飞行器拍摄装置
CN115220475A (zh) * 2015-12-09 2022-10-21 深圳市大疆创新科技有限公司 用于uav飞行控制的系统和方法
CN113238581A (zh) * 2016-02-29 2021-08-10 星克跃尔株式会社 无人飞行器的飞行控制的方法和系统
US20170345317A1 (en) * 2016-05-24 2017-11-30 Sharper Shape Oy Dynamic routing based on captured data quality
US10032267B2 (en) * 2016-06-09 2018-07-24 Lockheed Martin Corporation Automating the assessment of damage to infrastructure assets
US10713961B2 (en) * 2016-06-10 2020-07-14 ETAK Systems, LLC Managing dynamic obstructions in air traffic control systems for unmanned aerial vehicles
US10074284B1 (en) * 2016-06-10 2018-09-11 ETAK Systems, LLC Emergency shutdown and landing for unmanned aerial vehicles with air traffic control systems
US10189566B2 (en) * 2016-06-10 2019-01-29 ETAK Systems, LLC 3D coverage mapping of wireless networks with unmanned aerial vehicles
WO2018095278A1 (zh) * 2016-11-24 2018-05-31 腾讯科技(深圳)有限公司 飞行器的信息获取方法、装置及设备


Also Published As

Publication number Publication date
CN107087427A (zh) 2017-08-22
CN107087427B (zh) 2019-06-07
US20220083078A1 (en) 2022-03-17
US11188101B2 (en) 2021-11-30
CN110119154A (zh) 2019-08-13
US20200027357A1 (en) 2020-01-23

Similar Documents

Publication Publication Date Title
WO2018098678A1 (zh) 飞行器的控制方法、装置和设备以及飞行器
WO2018209702A1 (zh) 无人机的控制方法、无人机以及机器可读存储介质
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US11649052B2 (en) System and method for providing autonomous photography and videography
US20200346753A1 (en) Uav control method, device and uav
CN113038016B (zh) 无人机图像采集方法及无人机
CN108351650B (zh) 一种对飞行器的飞行控制方法、装置及飞行器
WO2018098704A1 (zh) 控制方法、设备、系统、无人机和可移动平台
US20160124435A1 (en) 3d scanning and imaging method utilizing a self-actuating compact unmanned aerial device
WO2018214071A1 (zh) 用于控制无人机的方法和装置及无人机系统
WO2019128275A1 (zh) 一种拍摄控制方法、装置及飞行器
US20200304719A1 (en) Control device, system, control method, and program
WO2019227333A1 (zh) 集体照拍摄方法和装置
CN110809746A (zh) 控制装置、摄像装置、移动体、控制方法以及程序
US10308359B2 (en) Moving device, method of controlling moving device and storage medium
JP6910785B2 (ja) 移動撮像装置およびその制御方法、ならびに撮像装置およびその制御方法、無人機、プログラム、記憶媒体
WO2022188151A1 (zh) 影像拍摄方法、控制装置、可移动平台和计算机存储介质
CN107547793B (zh) 飞行装置、方法以及存储程序的存储介质
CN111226170A (zh) 控制装置、移动体、控制方法以及程序

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16922982

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16922982

Country of ref document: EP

Kind code of ref document: A1