WO2018227345A1 - Control method and unmanned aerial vehicle - Google Patents


Info

Publication number
WO2018227345A1
Authority
WO
WIPO (PCT)
Prior art keywords
model
flight
drone
motion information
pan
Prior art date
Application number
PCT/CN2017/087955
Other languages
English (en)
Chinese (zh)
Inventor
陈竞
赵丛
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2017/087955 (WO2018227345A1)
Priority to CN201780004914.7A (CN108700883A)
Publication of WO2018227345A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones

Definitions

  • The invention relates to the technical field of drones, and in particular to a control method and a drone.
  • Drones are increasingly popular, but many of their users are not professional pilots.
  • Such users' control skills are limited; they cannot fly a drone along the more professional flight trajectories that yield a better shooting experience.
  • Automatic flight of a drone currently relies mainly on selecting waypoints and then flying automatically along them.
  • This scheme generates flight trajectories from the waypoints alone, pays no attention to the shooting effect during flight, and cannot produce high-quality footage.
  • The resulting footage has little value, so users lack motivation to share it.
  • Embodiments of the present invention provide a control method and a drone so that an ordinary user can also capture high-level footage.
  • the control method of the embodiment of the present invention is applied to a drone.
  • the control method includes: collecting motion information during flight of the drone, determining model parameters of the flight model according to the motion information, and transmitting the model parameters to an external device.
  • the control method of the embodiment of the present invention is applied to a drone.
  • the drone communicates with an external device.
  • the control method includes: acquiring model parameters from the external device, determining a flight model according to the model parameters, and controlling the drone to fly according to the flight model.
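As a rough sketch of this second aspect (acquire model parameters, determine the flight model, fly accordingly), assume the shared parameters are per-axis polynomial coefficients over time. The function names and parameter format are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def build_flight_model(params):
    """Rebuild a flight trajectory model from shared model parameters:
    one set of polynomial coefficients per axis, mapping time t (s)
    to an (x, y, z) position the flight controller should track."""
    coeffs = {axis: np.asarray(c, dtype=float) for axis, c in params.items()}
    return lambda t: tuple(float(np.polyval(coeffs[a], t)) for a in ("x", "y", "z"))

# Model parameters acquired from the external device (illustrative values):
params = {"x": [1.0, 0.0], "y": [0.0, 2.0], "z": [0.0, 5.0]}
model = build_flight_model(params)
waypoint = model(3.0)  # -> (3.0, 2.0, 5.0): position to command at t = 3 s
```

The receiving drone would then feed each `model(t)` waypoint to its flight controller, which is the step the patent describes as "controlling the drone to fly according to the flight model".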
  • the drone of the embodiment of the present invention includes a sensor, a processor, and a communication interface.
  • The sensor is configured to collect motion information during flight of the drone;
  • the processor is configured to determine model parameters of a flight model according to the motion information;
  • the communication interface is configured to send the model parameters to an external device.
  • The drone of the embodiment of the present invention communicates with an external device.
  • The drone includes a communication interface and a processor; the communication interface is configured to acquire model parameters from the external device, and the processor is configured to determine a flight model according to the model parameters and to control the drone to fly according to the flight model.
  • A drone includes a pan/tilt head, a camera, one or more processors, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs including instructions for executing the control method described above.
  • In this way, users who are not professional pilots can directly use a recorded flight model to control the drone to fly and shoot aerial footage, obtaining higher-level work and improving the user experience.
  • FIG. 1 is a flow chart of a control method of some embodiments of the present invention.
  • FIG. 2 is a block diagram of a drone of some embodiments of the present invention.
  • FIG. 3 is a schematic diagram of the state of a control method of some embodiments of the present invention.
  • FIGS. 4 to 24 are schematic flow charts of a control method according to some embodiments of the present invention.
  • FIG. 25 is a block diagram of a drone of some embodiments of the present invention.
  • FIGS. 26 to 28 are schematic diagrams of application scenarios of a control method according to some embodiments of the present invention.
  • The control method includes:
  • S12 collecting motion information during flight of the drone 100;
  • S14 determining model parameters of the flight model according to the motion information;
  • S16 sending the model parameters to an external device.
  • the control method of the embodiment of the present invention can be implemented by the drone 100 of the embodiment of the present invention.
  • The drone 100 of the embodiment of the present invention includes a sensor 10, a processor 20, and a communication interface 30.
  • The drone 100 also carries a pan/tilt head (gimbal) 40.
  • the step S12 can be implemented by the sensor 10, the step S14 can be implemented by the processor 20, and the step S16 can be implemented by the communication interface 30.
  • The sensor 10 is used to collect motion information during flight of the drone 100; the processor 20 is configured to determine model parameters of the flight model based on the motion information; and the communication interface 30 is configured to send the model parameters to the external device.
  • the processor 20 may be a flight controller of the drone, or may be other dedicated or general-purpose processors, and is not specifically limited herein.
  • The motion information includes one or more of: position information, speed information, acceleration information, attitude information, and angular velocity information of the drone 100; gimbal attitude information of the drone 100; and angular velocity information and angular acceleration information of the motion of the pan/tilt head 40.
  • The location information may be absolute location information (longitude, latitude) or relative location information, i.e. location information relative to a reference point.
  • The sensor 10 used to collect motion information during flight of the drone includes at least one of a compass, an electronic compass, an Inertial Measurement Unit (IMU), an acceleration sensor, a vision sensor (binocular, monocular, or a visual odometer), a GNSS receiver, a barometer, an airspeed meter, an ultrasonic sensor, or the like.
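As a trivial illustration of relative location information, a relative fix is simply the absolute fix minus the reference point. The helper below is a hypothetical sketch, not part of the patent:

```python
def relative_position(point, reference):
    """Express an absolute (x, y, z) fix relative to a reference point,
    giving the relative location information described above."""
    return tuple(p - r for p, r in zip(point, reference))

# A reference point at the take-off location (illustrative values):
takeoff = (100.0, 200.0, 0.0)
fix = (103.0, 204.0, 5.0)
rel = relative_position(fix, takeoff)  # -> (3.0, 4.0, 5.0)
```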
  • The position information of the drone 100 can be collected by a GNSS receiver, an inertial measurement unit, a vision sensor, or the like configured on the drone; the speed information can be collected by a GNSS receiver, an inertial measurement unit, a vision sensor, an ultrasonic sensor, or an airspeed meter; the acceleration information can be collected by an acceleration sensor; the attitude information and the angular velocity information of the drone 100 can be collected by an inertial measurement unit; and the attitude information and the angular velocity information of the pan/tilt head 40 of the drone 100 can likewise be collected by an inertial measurement unit.
  • the external device includes one or more of the drone 200, the server 300, and the control terminal 400.
  • The control terminal 400 may be one or more of a remote controller, a smart phone, a tablet computer, a drone ground control station, a watch, a wristband, video glasses, and the like.
  • The control terminal 400 receives input from a user. The user can provide input by operating a scroll wheel, buttons, or joysticks on the remote controller, or through a user interface (UI) on the control terminal 400.
  • The control method of the embodiment of the present invention obtains the model parameters of the flight model by processing the motion information collected during flight of the drone 100. The model parameters may then be sent to an external device such as the drone 200, which can fly directly from those parameters. In this way, the model parameters of the flight model can be shared.
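One plausible way to share model parameters with an external device is to serialize them, e.g. as JSON. This wire format and the field names are purely illustrative assumptions; the patent does not specify one:

```python
import json

# Recorded model parameters of a flight trajectory model (illustrative):
model_parameters = {
    "model": "trajectory",
    "coefficients": {"x": [0.2, 1.5, 0.0], "y": [0.0, 2.0, 1.0]},
    "duration_s": 42.0,
}

payload = json.dumps(model_parameters)  # sent to drone 200, server 300, etc.
restored = json.loads(payload)          # what the receiving device reconstructs
```

The round trip is lossless for this kind of numeric data, so the receiving drone can rebuild exactly the flight model that was recorded.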
  • A non-professional pilot may not operate the drone 200 proficiently enough to capture high-level aerial footage.
  • A professional pilot can upload the recorded model parameters to the server 300 or send them directly to the drone 200; the non-professional pilot can then have the drone 200 fly directly according to the flight model obtained from the parameters recorded by the professional pilot, thereby obtaining high-quality aerial works. In this way, the user experience can be improved.
  • Step S12, collecting motion information during flight of the drone 100, includes:
  • S121 collecting first motion information during flight of the drone 100;
  • Step S14, determining model parameters of the flight model according to the motion information, includes:
  • S141 determining model parameters of the flight trajectory model according to the first motion information;
  • Step S16, sending the model parameters to the external device, includes:
  • S161 Send the model parameters of the flight path model to an external device.
  • step S121 can be implemented by sensor 10
  • step S141 can be implemented by processor 20
  • step S161 can be implemented by communication interface 30.
  • The sensor 10 can be used to collect first motion information during flight of the drone 100; the processor 20 can be used to determine model parameters of the flight trajectory model based on the first motion information; the communication interface 30 can be used to send the model parameters of the flight trajectory model to the external device.
  • the first motion information may include at least one of position information, speed information, and acceleration information of the drone 100.
  • the sensor 10 collects only first motion information related to the flight trajectory model during flight of the drone 100 from position A to position B.
  • the processor 20 obtains model parameters of the flight trajectory model by processing one or more of position information, speed information, and acceleration information.
  • the processor 20 transmits the model parameters of the flight trajectory model to an external device, such as the drone 200, for sharing.
  • the drone 200 can determine the flight trajectory model based on the model parameters described above and fly according to the flight trajectory indicated by the flight trajectory model.
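A minimal sketch of how model parameters of a flight trajectory model might be determined from recorded position samples, assuming a least-squares polynomial fit per axis. The fitting method is my illustration; the patent does not state one:

```python
import numpy as np

def trajectory_model_parameters(times, positions, degree=2):
    """Fit one polynomial per axis to recorded (time, position) samples;
    the fitted coefficients are the model parameters to be shared."""
    positions = np.asarray(positions, dtype=float)
    return {axis: np.polyfit(times, positions[:, i], degree).tolist()
            for i, axis in enumerate(("x", "y", "z"))}

# First motion information recorded on a flight from A to B (illustrative):
t = [0.0, 1.0, 2.0, 3.0, 4.0]
pos = [[0, 0, 10], [1, 2, 10], [4, 4, 10], [9, 6, 10], [16, 8, 10]]
params = trajectory_model_parameters(t, pos)  # x(t) ~ t**2, y(t) ~ 2t, z = 10
```

The resulting coefficient dictionary is compact enough to send over the communication interface, and the receiving drone can evaluate it at any time t to recover the recorded trajectory.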
  • step S12 collects motion information during flight of the drone 100, including:
  • S122 collecting first motion information and second motion information during flight of the drone 100;
  • Step S14, determining model parameters of the flight model according to the motion information, includes:
  • S142 determining model parameters of the flight trajectory model according to the first motion information, and determining model parameters of the flight speed model according to the second motion information;
  • Step S16, sending the model parameters to the external device, includes:
  • S162 Send the model parameters of the flight path model and the model parameters of the flight speed model to an external device.
  • step S122 can be implemented by sensor 10
  • step S142 can be implemented by processor 20
  • step S162 can be implemented by communication interface 30.
  • The sensor 10 can be used to collect the first motion information and the second motion information during flight of the drone 100; the processor 20 can be configured to determine the model parameters of the flight trajectory model according to the first motion information, and the model parameters of the flight speed model according to the second motion information; the communication interface 30 can be used to send the model parameters of the flight trajectory model and the model parameters of the flight speed model to an external device.
  • the first motion information may include at least one of position information, speed information, and acceleration information of the drone 100
  • the second motion information may include one of speed information and acceleration information of the drone 100
  • the sensor 10 simultaneously acquires first motion information associated with the flight trajectory model and second motion information associated with the flight speed model during one flight of the drone 100 from position A to position B.
  • The processor 20 can obtain the model parameters of the flight trajectory model by processing one or more of the position information, the speed information, and the acceleration information, and the model parameters of the flight speed model by processing one or more of the speed information and the acceleration information. Subsequently, the processor 20 sends the model parameters of the flight trajectory model and of the flight speed model to an external device, such as the drone 200, for sharing.
  • The drone 200 can not only fly along the flight trajectory indicated by the flight trajectory model determined from its model parameters, but can also control its speed according to the flight speed model determined from the flight speed model's parameters, thereby more accurately reproducing the flight trajectory and flight speed of the drone 100 during recording and obtaining better flight and aerial photography results.
  • Step S12, collecting motion information during flight of the drone 100, includes:
  • S123 collecting first motion information and third motion information during flight of the drone 100;
  • Step S14, determining model parameters of the flight model according to the motion information, includes:
  • S143 Determine a model parameter of the flight trajectory model according to the first motion information, and determine a model parameter of the PTZ attitude model according to the third motion information;
  • Sending the model parameters to the external device in step S16 includes:
  • S163 Send the model parameters of the flight path model and the model parameters of the pan/tilt attitude model to an external device.
  • step S123 can be implemented by sensor 10
  • step S143 can be implemented by processor 20
  • step S163 can be implemented by communication interface 30.
  • The sensor 10 can be used to collect the first motion information and the third motion information during flight of the drone 100; the processor 20 can be configured to determine the model parameters of the flight trajectory model according to the first motion information, and the model parameters of the pan/tilt attitude model according to the third motion information; the communication interface 30 can be used to send the model parameters of the flight trajectory model and the model parameters of the pan/tilt attitude model to an external device.
  • the first motion information may include at least one of position information, speed information, and acceleration information of the drone 100.
  • The third motion information may include at least one of the pan/tilt attitude information of the drone 100, the angular velocity information of the pan/tilt motion, and the angular acceleration information of the pan/tilt motion, that is, the pitch angle information of the pan/tilt head 40 of the drone 100 on the Pitch axis, the yaw angle information on the Yaw axis, and the roll angle information on the Roll axis.
  • the sensor 10 simultaneously acquires first motion information related to the flight trajectory model and third motion information related to the pan/tilt attitude model during one flight of the drone 100 flying from the position A to the position B.
  • The processor 20 can obtain the model parameters of the flight trajectory model by processing one or more of the position information, the speed information, and the acceleration information, and the model parameters of the gimbal attitude model by processing one or more of the pan/tilt attitude information, the angular velocity information of the pan/tilt motion, and the angular acceleration information.
  • the processor 20 transmits the flight trajectory model and the model parameters of the pan/tilt attitude model to an external device, such as the drone 200, for sharing.
  • The drone 200 can not only fly along the flight trajectory indicated by the flight trajectory model determined from its model parameters, but can also adjust the angle of its pan/tilt head 80 according to the pan/tilt attitude model determined from that model's parameters, thereby more accurately reproducing the flight trajectory and gimbal attitude of the drone 100 during recording and obtaining better flight and aerial photography results.
  • step S12 collects motion information during flight of the drone 100, including:
  • S124 collecting first motion information, second motion information, and third motion information during flight of the drone 100;
  • Step S14 determines model parameters of the flight model according to the motion information, including:
  • S144 Determine a model parameter of the flight trajectory model according to the first motion information, determine a model parameter of the flight speed model according to the second motion information, and determine a model parameter of the PTZ attitude model according to the third motion information;
  • Sending the model parameters to the external device in step S16 includes:
  • S164 sending model parameters of the flight path model, model parameters of the flight speed model, and model parameters of the pan/tilt attitude model to an external device.
  • step S124 can be implemented by sensor 10
  • step S144 can be implemented by processor 20
  • step S164 can be implemented by communication interface 30.
  • The sensor 10 can be used to collect the first motion information, the second motion information, and the third motion information during flight of the drone 100; the processor 20 can be configured to determine the model parameters of the flight trajectory model according to the first motion information, the model parameters of the flight speed model according to the second motion information, and the model parameters of the pan/tilt attitude model according to the third motion information; the communication interface 30 may be configured to send the model parameters of the flight trajectory model, the flight speed model, and the pan/tilt attitude model to an external device.
  • the first motion information may include at least one of position information, speed information, and acceleration information of the drone 100.
  • the second motion information may include at least one of speed information and acceleration information of the drone 100.
  • The third motion information may include at least one of the pan/tilt attitude information of the drone 100, the angular velocity information of the pan/tilt motion, and the angular acceleration information, that is, the pitch, yaw, and roll angle information of the pan/tilt head 40 of the drone 100 on the Pitch, Yaw, and Roll axes.
  • The sensor 10 simultaneously collects, during one flight of the drone 100 from position A to position B, first motion information related to the flight trajectory model, second motion information related to the flight speed model, and third motion information related to the gimbal attitude model.
  • The processor 20 can obtain the model parameters of the flight trajectory model by processing one or more of the position information, the speed information, and the acceleration information; the model parameters of the flight speed model by processing one or more of the speed information and the acceleration information; and the model parameters of the gimbal attitude model by processing one or more of the attitude information of the gimbal, the angular velocity information of the gimbal motion, and the angular acceleration information.
  • the processor 20 transmits the flight trajectory model, the flight speed model, and the model parameters of the pan/tilt attitude model to an external device, such as the drone 200, for sharing.
  • The drone 200 can not only fly along the flight trajectory indicated by the flight trajectory model, but can also control its speed according to the flight speed model determined from the flight speed model's parameters and adjust the angle of its pan/tilt head 80 according to the pan/tilt attitude model determined from that model's parameters, thereby more accurately reproducing the flight trajectory, flight speed, and gimbal attitude of the drone 100 during recording, for better flight performance and aerial photography.
  • control method of the embodiment of the present invention further includes:
  • S11 controlling the drone 100 to fly on the flight trajectory indicated by the flight path model according to the model parameters of the flight trajectory model;
  • S125 collecting second motion information of the drone 100 during flight along the indicated flight trajectory;
  • S145 determining model parameters of the flight speed model according to the second motion information;
  • S165 sending the model parameters of the flight speed model to an external device.
  • step S11 can be implemented by processor 20, step S125 can be implemented by sensor 10, step S145 can be implemented by processor 20, and step S165 can be implemented by communication interface 30.
  • The processor 20 can be used to control the drone 100 to fly along the flight trajectory indicated by the flight trajectory model according to its model parameters; the sensor 10 can be used to collect the second motion information while the drone 100 flies along the indicated flight trajectory; the processor 20 is further configured to determine the model parameters of the flight speed model according to the second motion information; and the communication interface 30 is configured to send the model parameters of the flight speed model to the external device.
  • the drone 100 first acquires first motion information during the first flight from position A to position B to determine model parameters of the flight trajectory model. Subsequently, the drone 100 performs the second flight based on the flight trajectory indicated by the flight trajectory model, starting from the position A and ending at the position B.
  • During the second flight, the drone 100 collects only the second motion information. The drone 100 then processes the second motion information to determine the model parameters of the flight speed model and sends them to the external device. In this way, the drone 100 records the flight trajectory model and the flight speed model separately, over multiple flight recordings. The user's focus differs in each recording: the first flight focuses on the flight trajectory, and the second flight on the flight speed. This gives the user finer control of the drone 100 and a higher-quality recorded flight model.
  • control method of the embodiment of the present invention further includes:
  • S11 controlling the drone 100 to fly on the flight trajectory indicated by the flight path model according to the model parameters of the flight trajectory model;
  • S126 collecting third motion information of the drone 100 during flight along the indicated flight trajectory;
  • S146 determining model parameters of the pan/tilt attitude model according to the third motion information;
  • S166 sending the model parameters of the pan/tilt attitude model to an external device.
  • step S11 can be implemented by processor 20
  • step S126 can be implemented by sensor 10
  • step S146 can be implemented by processor 20
  • step S166 can be implemented by communication interface 30.
  • The processor 20 can be used to control the drone 100 to fly along the flight path indicated by the flight trajectory model according to its model parameters; the sensor 10 can be used to collect the third motion information while the drone 100 flies along the indicated flight path; the processor 20 is further configured to determine the model parameters of the pan/tilt attitude model according to the third motion information; and the communication interface 30 is configured to send the model parameters of the pan/tilt attitude model to the external device.
  • the drone 100 first acquires first motion information during the first flight from position A to position B to determine model parameters of the flight trajectory model. Subsequently, the drone 100 performs the second flight based on the flight trajectory indicated by the flight trajectory model, starting from the position A and ending at the position B. During the second flight, the drone 100 only collects third motion information. Subsequently, the drone 100 processes the third motion information to determine model parameters of the pan/tilt attitude model, and transmits the model parameters of the pan/tilt attitude model to the external device. In this way, the drone 100 separately records the flight path model and the pan/tilt attitude model in the form of multiple flight recordings. In each recording, the user's focus is different. For example, the first flight focuses on the flight trajectory, and the second flight focuses on the attitude of the gimbal. In this way, the user's control of the drone 100 is more perfect, and the quality of the recorded flight model is better.
  • control method of the embodiment of the present invention further includes:
  • S11 controlling the drone 100 to fly on the flight trajectory indicated by the flight path model according to the model parameters of the flight trajectory model;
  • S127 collecting second motion information and third motion information of the drone 100 during flight along the indicated flight trajectory;
  • S147 determining model parameters of the flight speed model according to the second motion information, and model parameters of the pan/tilt attitude model according to the third motion information;
  • S167 sending the model parameters of the flight speed model and the model parameters of the pan/tilt attitude model to an external device.
  • step S11 can be implemented by processor 20
  • step S127 can be implemented by sensor 10
  • step S147 can be implemented by processor 20
  • step S167 can be implemented by communication interface 30.
  • The processor 20 can also be used to control the drone 100 to fly along the flight trajectory indicated by the flight trajectory model according to its model parameters; the sensor 10 can also be used to collect the second motion information and the third motion information while the drone 100 flies along the indicated flight trajectory; the processor 20 is further configured to determine the model parameters of the flight speed model according to the second motion information and the model parameters of the pan/tilt attitude model according to the third motion information; and the communication interface 30 can also be used to send the model parameters of the flight speed model and the model parameters of the pan/tilt attitude model to an external device.
  • The drone 100 first collects first motion information during the first flight from position A to position B to determine the model parameters of the flight trajectory model. The drone 100 then performs a second flight along the flight trajectory indicated by the flight trajectory model, starting from position A and ending at position B, during which it simultaneously collects the second motion information and the third motion information. The drone 100 then processes the second motion information to determine the model parameters of the flight speed model, processes the third motion information to determine the model parameters of the pan/tilt attitude model, and sends both sets of model parameters to an external device.
  • In this way, the drone 100 records the flight trajectory model, the flight speed model, and the gimbal attitude model separately, over multiple flight recordings.
  • The user's focus differs in each recording: the first flight focuses on the flight trajectory, and the second flight on the flight speed and the attitude of the gimbal.
  • This gives the user finer control of the drone 100 and a higher-quality recorded flight model.
  • The drone 100 first collects first motion information during the first flight from position A to position B to determine the model parameters of the flight trajectory model. The drone 100 then performs a second flight along the flight trajectory indicated by the flight trajectory model, starting from position A and ending at position B, and collects second motion information during it. The drone 100 processes the second motion information to determine the model parameters of the flight speed model and sends them to the external device. The drone 100 then performs a third flight along the same indicated trajectory, from position A to position B, at the speed given by the flight speed model, and collects third motion information during it.
  • the UAV 100 processes the third motion information to determine the model parameters of the PTZ attitude model, and transmits the model parameters of the PTZ attitude model to the external device.
  • the drone 100 separately records the flight path model, the flight speed model, and the gimbal attitude model in the form of multiple flight recordings.
  • the user's attention points are different. For example, the first flight focuses on the flight trajectory, the second flight focuses on the flight speed, and the third flight focuses on the attitude of the gimbal. In this way, the user's control of the drone 100 is more perfect, and the quality of the recorded flight model is better.
  • the flight model includes one or more flight model segments that are segmented at time intervals.
  • Step S14 determines model parameters of the flight model according to the motion information, including:
  • S1481 Divide the flight process of the drone 100 into a plurality of segments according to time intervals;
  • S1482 Determine the model parameters corresponding to each flight model segment according to the motion information of each of the plurality of segments.
  • step S1481 and step S1482 can be implemented by processor 20.
  • the processor 20 can also be used to divide the flight process of the drone 100 into a plurality of segments according to time intervals, and to determine the model parameters corresponding to each flight model segment according to the motion information of each of the plurality of segments.
  • the flight model includes a plurality of flight model segments divided at time intervals, and each flight model segment corresponds to a functional formula. The flight model is therefore a piecewise function segmented at time intervals.
  • the entire flight process of the drone 100 is divided into segments at time intervals; each segment corresponds to its own motion information and functional formula, from which the model parameters corresponding to that segment can be determined.
  • segmenting the flight model at time intervals allows the corresponding function of the flight model to better fit the flight path, flight speed, or pan/tilt attitude of the drone 100 throughout the flight. As such, the flight model is more accurate.
  • step S14 determines model parameters of the flight model based on the motion information, including:
  • S1491 Determine expected motion information corresponding to the motion information according to the time at which the motion information was collected and a preset flight model;
  • S1492 Determine an error between the collected motion information and the expected motion information;
  • S1493 Determine the model parameters of the flight model according to the error.
  • step S1491, step S1492, and step S1493 can be implemented by processor 20.
  • the processor 20 is configured to: determine expected motion information corresponding to the motion information according to the time at which the motion information was collected and the preset flight model, determine the error between the collected motion information and the expected motion information, and determine the model parameters of the flight model according to the error.
  • the preset flight model includes a preset flight trajectory model indicating the flight trajectory of the drone 100, a preset flight speed model indicating the flight speed of the drone 100, and a preset pan/tilt attitude model indicating the attitude of the pan/tilt head 40 of the drone 100.
  • the preset flight model has the same piecewise function expression as the flight model; the difference is that the model parameter p in the preset flight model is unknown and must be obtained through the error equation.
  • the flight trajectory model is taken as an example for description.
  • the flight path model is a piecewise function f(p,t) about the model parameter p and time t, and the function is as follows:
  • [t 0 , t 1 ), [t 1 , t 2 ), ..., [t m-1 , t m ) denote the flight model segments divided at time intervals
  • m denotes the sequence number of the segments: for example, [t 0 , t 1 ) is the first flight model segment and [t 1 , t 2 ) is the second flight model segment
  • n denotes the number of model parameters p in each flight model segment, i.e. the number of coefficients of the function corresponding to each flight model segment.
  • the model parameter p in the preset flight trajectory model is an unknown number and needs to be determined by the acquired first motion information.
  • the first motion information includes position information x, speed information v, and acceleration information a of the drone 100.
  • the model parameter p can be solved by an error equation.
  • the expression of the error equation is as follows:
  • the subscript j distinguishes the position information x j , the speed information v j and the acceleration information a j collected at different time points.
  • the position information x j , the speed information v j and the acceleration information a j are the position information, the speed information and the acceleration information collected by the sensor 10 at the corresponding time point, that is, the motion information collected by the sensor 10 .
  • f(p, t j ) and its first and second time derivatives are the expected motion information corresponding to the position information x j , the speed information v j , and the acceleration information a j , respectively.
  • w x , w v , and w a are weights corresponding to position information, speed information, and acceleration information, respectively.
  • the error equation compares the position, velocity and acceleration calculated from the preset flight trajectory model at each time point with the collected position, velocity and acceleration, and weights the resulting errors.
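The published text omits the actual expressions for f(p, t) and error(p); a concrete form consistent with the surrounding description (m time-interval segments, n coefficients per segment, weighted position/velocity/acceleration errors) might look as follows. The polynomial basis is an assumption for illustration, not the patent's stated formula:

```latex
% Assumed piecewise-polynomial trajectory model: n coefficients p_{k,i} per segment k
f(p, t) = \sum_{i=0}^{n-1} p_{k,i}\, t^{i}, \qquad t \in [t_{k-1}, t_k), \quad k = 1, \dots, m

% Weighted error over the motion information collected at times t_j
\operatorname{error}(p) = \sum_j \Big[\, w_x \lVert f(p, t_j) - x_j \rVert^2
  + w_v \lVert f'(p, t_j) - v_j \rVert^2
  + w_a \lVert f''(p, t_j) - a_j \rVert^2 \,\Big]
```

Here f′ and f″ are the time derivatives of the model, giving the expected velocity and acceleration at each sample time.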
  • a p value needs to be selected such that the value of the error equation, ie, the value of error(p), is minimized.
  • ideally, the value of error(p) would be zero; in practice, it generally cannot reach zero.
  • the method for obtaining the p value is as follows: first, a p value is selected and substituted into the error equation to obtain the value of error(p), which measures the accuracy of the current flight trajectory model. If the value of error(p) is large, a new p value is selected and substituted into the error equation to obtain a new value of error(p), which in turn measures the accuracy of the new flight trajectory model. This cycle is repeated until a p value is found that makes the value of error(p) as small as possible; that p value is taken as the model parameter of the final flight trajectory model.
  • model parameters of the flight speed model and the model parameters of the pan/tilt attitude model can also be obtained by referring to the above calculation method, and will not be described herein.
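As an illustrative sketch of the fitting described above (not the patent's implementation), the function below fits one flight model segment. The polynomial segment form, the weights, and the sample data are assumptions for illustration; for such a model the minimizing p can be solved directly via the normal equations rather than by trial substitution:

```python
def fit_segment(ts, xs, vs, accs, n=3, w=(1.0, 1.0, 1.0)):
    """Fit the n coefficients of one flight model segment, modelled here as a
    polynomial f(p, t) = p[0] + p[1]*t + ... + p[n-1]*t**(n-1), by minimizing
    error(p) = sum_j wx*(f(p,tj)-xj)**2 + wv*(f'(p,tj)-vj)**2 + wa*(f''(p,tj)-aj)**2.
    """
    wx, wv, wa = (wi ** 0.5 for wi in w)   # square roots so rows carry the weights
    rows, rhs = [], []
    for t, x in zip(ts, xs):               # position rows: f(p, t_j) ≈ x_j
        rows.append([wx * t ** i for i in range(n)])
        rhs.append(wx * x)
    for t, v in zip(ts, vs):               # velocity rows: f'(p, t_j) ≈ v_j
        rows.append([wv * i * t ** (i - 1) if i >= 1 else 0.0 for i in range(n)])
        rhs.append(wv * v)
    for t, a in zip(ts, accs):             # acceleration rows: f''(p, t_j) ≈ a_j
        rows.append([wa * i * (i - 1) * t ** (i - 2) if i >= 2 else 0.0
                     for i in range(n)])
        rhs.append(wa * a)
    # Solve the normal equations (M^T M) p = M^T b by Gaussian elimination.
    A = [[sum(r[i] * r[j] for r in rows) for j in range(n)] for i in range(n)]
    b = [sum(r[i] * y for r, y in zip(rows, rhs)) for i in range(n)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))  # partial pivoting
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    p = [0.0] * n
    for r in range(n - 1, -1, -1):         # back substitution
        p[r] = (b[r] - sum(A[r][c] * p[c] for c in range(r + 1, n))) / A[r][r]
    return p
```

With consistent samples from a quadratic trajectory, the fit recovers the generating coefficients exactly (up to floating-point error), which is a quick way to sanity-check the weighting scheme.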
  • control methods include:
  • S26 Control the drone 200 to fly according to the flight model.
  • the control method of the embodiment of the present invention can be implemented by the drone 200 of the embodiment of the present invention.
  • the drone 200 of an embodiment of the present invention includes a processor 60 and a communication interface 70.
  • Step 22 can be implemented by communication interface 70.
  • Step S24 and step S26 can be implemented by the processor 60.
  • the communication interface 70 is used to acquire model parameters from an external device; the processor 60 is configured to: determine a flight model based on the model parameters, and control the drone 200 to fly according to the flight model.
  • the drone 200 according to the embodiment of the present invention also carries a pan/tilt head 80 on which the camera 90 is mounted.
  • the external device includes one or more of the drone 100, the server 300, and the control terminal 400.
  • the control terminal 400 may be one or more of a remote controller, a smart phone, a tablet computer, a drone ground control station, a watch, a wristband, video glasses, and the like.
  • the control terminal 400 receives input from the user. The user can provide input by operating a dial wheel, buttons, switches, a joystick, or the like on the remote controller, or through the user interface (UI) of the control terminal 400.
  • the control method of the embodiment of the present invention first processes the model parameter p acquired from the external device to obtain the flight model f(p, t). Specifically, the drone 200 acquires the model parameter p from the external device and substitutes the model parameter p into the functional formula f(p, t) of the flight model.
  • the flight model f(p,t) is a piecewise function that is segmented at time intervals. Wherein, the model parameter p and the time t are both known quantities, and thus, the value of f(p, t) can be determined.
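A minimal sketch of evaluating such a piecewise model at flight time (assuming, for illustration, polynomial segments with coefficients ordered from the constant term upward; the segment boundaries and data are hypothetical):

```python
def eval_flight_model(segments, boundaries, t):
    """Evaluate a piecewise flight model f(p, t).

    segments   -- list of coefficient lists, one per flight model segment
    boundaries -- [t0, t1, ..., tm]; segment k covers [boundaries[k], boundaries[k+1])
                  (the last segment is treated as closed at t_m)
    """
    if not boundaries[0] <= t <= boundaries[-1]:
        raise ValueError("t lies outside the recorded flight interval")
    k = len(segments) - 1
    for i in range(len(segments)):          # locate the segment containing t
        if t < boundaries[i + 1]:
            k = i
            break
    p = segments[k]
    return sum(c * t ** i for i, c in enumerate(p))  # f(p, t) = sum_i p_i t**i
```

Since p and t are both known at this point, each call yields one value of f(p, t), matching the statement above.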
  • the drone 200 is then controlled to perform autonomous flight in accordance with the flight model described above.
  • on the one hand, a flight model recorded by a professional pilot can be used to control the flight of the drone 200; on the other hand, safety during the flight of the drone 200 can be ensured, helping non-professional pilots obtain higher-quality aerial photography works and enhancing the user experience.
  • the flight model includes at least a flight trajectory model
  • step S26 controls the drone 200 to fly according to the flight model, including:
  • S261 Control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model.
  • step S261 can be implemented by processor 60. That is, the processor 60 is further configured to control the drone 200 to fly in accordance with the flight trajectory indicated by the flight trajectory model.
  • the drone 200 flies from position A to position B by learning the flight path model in accordance with the flight trajectory indicated by the flight trajectory model, or from position C to position D according to the flight trajectory indicated by the flight trajectory model.
  • position A and position C refer to two different starting positions
  • position B and position D refer to two different ending positions.
  • the location information collected by the drone 100 may be absolute location information or relative location information.
  • for example, when the position information collected by the drone 100 from position A to position B is absolute position information, the drone 200, by learning the flight trajectory model, flies from position A to position B according to the flight trajectory indicated by the flight trajectory model, exactly repeating the recorded flight path; alternatively, it flies along the same trajectory from position C to position D. In the latter case, the drone 200 determines position C and position D according to a reference point, where the reference point may be the position of the drone 200 at the moment it receives a replay instruction sent by the control terminal 400.
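One plausible way to replay a recorded trajectory from a new starting point by means of such a reference point (a sketch under assumptions; the patent does not specify the computation) is to shift every recorded position by the offset between the reference point and the recorded start:

```python
def replay_from_reference(recorded_path, reference_point):
    """Shift a recorded absolute trajectory so that it starts at reference_point.

    recorded_path   -- list of (x, y, z) positions from position A to position B
    reference_point -- position of the drone when the replay instruction arrives,
                       used as the new starting point (position C)
    Returns the shifted path, ending at the corresponding position D.
    """
    ax, ay, az = recorded_path[0]           # recorded start (position A)
    cx, cy, cz = reference_point            # new start (position C)
    dx, dy, dz = cx - ax, cy - ay, cz - az  # constant offset applied to every point
    return [(x + dx, y + dy, z + dz) for (x, y, z) in recorded_path]
```

The shape of the trajectory is preserved exactly; only its anchor in space changes, which is what allows the same flight model to be reused at a different flight location.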
  • the flight environment in which the drone 200 learns the flight model does not need to be the same as the flight environment when the drone 100 records the flight model.
  • the drone 200 can fly autonomously according to the indicated flight path of the flight path model, and the user does not need to manually control to change the flight position of the drone 200, and the intelligence and convenience of the drone 200 are greatly improved.
  • the flight environment when the drone 200 learns the flight model need not be the same as the flight environment when the drone 100 records the flight model, and the flight model has high portability.
  • the flight model further includes a flight speed model, and step S26 controls the drone 200 to fly according to the flight model, including:
  • S262 Control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model.
  • step S262 can be implemented by processor 60. That is, the processor 60 is further configured to control the drone 200 to fly in accordance with the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model.
  • the drone 200 learns the flight trajectory model and the flight speed model, and flies from position A to position B according to the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model, or flies from position C to position D according to the same trajectory and speed. In this way, the drone 200 can more accurately reproduce the flight process recorded by the drone 100, improving the flight effect of the drone 200.
  • meanwhile, the posture of the pan/tilt head 80 of the drone 200 can still be controlled by the user, which improves the flexibility of the shooting angle of the camera 90 during aerial photography.
  • the flight model further includes a pan/tilt attitude model, and step S26 controls the drone 200 to fly according to the flight model, including:
  • S263 Control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model, and simultaneously adjust the attitude of the pan/tilt head 80 according to the pan/tilt attitude model.
  • step S263 can be implemented by processor 60. That is to say, the processor 60 is further configured to control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model, and simultaneously adjust the attitude of the pan-tilt 80 according to the pan-tilt attitude model.
  • the drone 200 learns the flight trajectory model, the flight speed model, and the pan/tilt attitude model, and flies from position A to position B according to the flight trajectory indicated by the flight trajectory model and the flight speed indicated by the flight speed model, while adjusting the attitude of the pan/tilt head 80 according to the pan/tilt attitude model; or it flies from position C to position D according to the same trajectory and speed, likewise adjusting the attitude of the pan/tilt head 80 according to the pan/tilt attitude model.
  • the drone 200 can more accurately restore the flight trajectory, the flight speed, and the attitude of the gimbal 80 when the drone 100 records the flight model, and the flight effect and aerial photography effect of the drone 200 are better.
  • the flight model further includes a pan/tilt attitude model, and step S26 of controlling the drone 200 to fly according to the flight model includes:
  • S264 Control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model, and simultaneously adjust the attitude of the pan/tilt head 80 according to the pan/tilt attitude model.
  • step S264 can be implemented by processor 60. That is to say, the processor 60 is further configured to control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model, and simultaneously adjust the attitude of the pan/tilt head 80 according to the pan/tilt attitude model.
  • the drone 200 learns the flight trajectory model and the pan/tilt attitude model, and flies from position A to position B according to the flight trajectory indicated by the flight trajectory model, while adjusting the attitude of the pan/tilt head 80 according to the pan/tilt attitude model; or it flies from position C to position D according to the indicated flight trajectory, during which the attitude of the pan/tilt head 80 is likewise adjusted according to the pan/tilt attitude model.
  • the drone 200 can restore the flight trajectory and the attitude of the head of the drone 100 when recording the flight model, and improve the flight effect and aerial photography effect of the drone 200.
  • the speed of the drone 200 can be controlled by the user, which increases the degree of freedom of control of the drone 200.
  • the drone 200 is in communication with the control terminal 400.
  • the control method of the embodiment of the present invention further includes:
  • S251 Receive a speed control instruction sent by the control terminal 400.
  • Step S26 controls the drone 200 to fly according to the flight model including:
  • S265 controlling the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model
  • S266 The drone 200 is controlled to adjust the flight speed of the drone 200 in accordance with the speed control command.
  • step S251 can be implemented by communication interface 70, and step S265 and step S266 can be implemented by processor 60.
  • the communication interface 70 is for receiving the speed control command sent by the control terminal 400; the processor 60 is for controlling the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model, and controlling the drone 200 to follow the speed control command. The flight speed of the drone 200 is adjusted.
  • the user can input a speed control command for adjusting the flight speed of the drone 200 at the control terminal 400, and the control terminal 400 transmits the speed control command to the drone 200. Thereafter, the drone 200 performs speed adjustment based on the speed control command.
  • the drone 200 has a high degree of freedom of operation, and the drone 200 can optimize the flight effect and the aerial shooting effect by changing the flight speed during flight along the flight path.
  • the drone 200 is in communication with the control terminal 400.
  • the control method of the embodiment of the present invention further includes:
  • S252 Receive a PTZ control command sent by the control terminal 400.
  • Step S26 controls the drone 200 to fly according to the flight model including:
  • S265 controlling the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model
  • S267 The drone 200 is controlled to adjust the posture of the pan/tilt head 80 carried by the drone 200 according to the pan/tilt control command.
  • step S252 can be implemented by communication interface 70, and step S265 and step S267 can be implemented by processor 60.
  • the communication interface 70 is configured to receive the pan/tilt control command sent by the control terminal 400; the processor 60 is configured to control the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model, and to control the drone 200 to adjust the attitude of the pan/tilt head 80 carried by the drone 200 according to the pan/tilt control command.
  • the user can input a pan/tilt control command for adjusting the posture of the pan/tilt head 80 at the control terminal 400, and the control terminal 400 transmits the pan/tilt control command to the drone 200. The drone 200 then adjusts the attitude of the pan/tilt head 80 according to the pan/tilt control command.
  • when the drone 200 uses the flight model for autonomous flight, its flight location may be the same as or different from the flight location at which the drone 100 recorded the flight model.
  • the user can change the shooting angle of the camera 90 mounted on the pan/tilt head 80 by adjusting its posture, so as to bring the subject of interest into the shooting picture of the camera 90, thereby obtaining higher-quality aerial photography works.
  • the drone 200 is in communication with the control terminal 400.
  • the control method of the embodiment of the present invention further includes:
  • S253 Receive a starting position point indication instruction sent by the control terminal 400;
  • S254 Control the drone 200 to fly to the starting position point.
  • Step S26 of controlling the drone 200 to fly according to the flight model includes: S268 After the drone 200 arrives at the starting position point, controlling the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model.
  • step S253 can be implemented by communication interface 70, and step S254 and step S268 can be implemented by processor 60.
  • the communication interface 70 is configured to receive a starting position point indication command sent by the control terminal 400; the processor 60 is configured to control the drone 200 to fly to the starting position point and, after the drone 200 arrives at the starting position point, to control the drone 200 to fly in accordance with the flight trajectory indicated by the flight trajectory model.
  • the user can input the starting position point through the UI interface of the control terminal 400.
  • after the drone 200 flies to the starting position point, it performs the flight according to the flight trajectory indicated by the flight model. In this way, the interaction between the drone 200 and the user is enhanced to meet the user's usage requirements.
  • the control method of the embodiment of the present invention further includes: S27 Controlling the attitude of the gimbal 80 while the drone 200 flies according to the flight trajectory indicated by the flight trajectory model, so that the photographic subject is always located in the shooting picture of the camera 90.
  • step S27 can be implemented by processor 60. That is to say, the processor 60 is further configured to control the attitude of the gimbal 80 while the drone 200 flies according to the flight trajectory indicated by the flight trajectory model, so that the photographic subject is always located in the shooting picture of the camera 90.
  • the drone 200 controls the attitude of the pan/tilt head 80 while learning the flight model so that the photographic subject is always located in the shooting picture of the camera 90, achieving a surround selfie effect, which further improves the intelligence of the drone 200 and the user experience.
  • step S27 of controlling the attitude of the gimbal 80, while the drone 200 flies according to the flight trajectory indicated by the flight trajectory model, so that the photographic subject is always located in the shooting picture of the camera 90, includes:
  • S271 Receive a photographic subject determination instruction sent by the control terminal 400;
  • S272 Determine the photographic subject according to the determination instruction.
  • step S271 can be implemented by the communication interface 70, and step S272 can be implemented by the processor 60. That is to say, the communication interface 70 is further configured to receive the photographic subject determination instruction sent by the control terminal 400; the processor 60 is further configured to determine the photographic subject according to the determination instruction.
  • the drone 200 can transmit image transmission data to the UI interface of the control terminal 400. The user can select the photographic subject to be tracked in the UI interface, and the control terminal 400 then sends the photographic subject determination instruction to the drone 200.
  • the processor 60 adjusts the attitude of the pan/tilt head 80 using a real-time tracking and detection algorithm so that the subject is always within the shooting picture of the camera 90.
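A minimal geometric sketch of keeping the subject in frame (an illustration, not the patent's tracking algorithm): compute the gimbal yaw and pitch that point the camera's optical axis at the subject's position, in a simple local level frame with hypothetical coordinates:

```python
import math

def gimbal_angles(drone_pos, subject_pos):
    """Return (yaw, pitch) in degrees that aim the camera at the subject.

    Positions are (x, y, z) in a local level frame; pitch is negative
    when the subject is below the drone, as is typical in aerial shots.
    """
    dx = subject_pos[0] - drone_pos[0]
    dy = subject_pos[1] - drone_pos[1]
    dz = subject_pos[2] - drone_pos[2]
    yaw = math.degrees(math.atan2(dy, dx))                    # heading to subject
    pitch = math.degrees(math.atan2(dz, math.hypot(dx, dy)))  # elevation to subject
    return yaw, pitch
```

Recomputing these angles as the drone moves along the trajectory keeps the subject centered, and therefore inside the shooting picture, regardless of the flight path.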
  • the drone 200 includes a sensor 50, and the control method of the embodiment of the present invention, after step S261 of controlling the drone 200 to fly according to the flight trajectory indicated by the flight trajectory model, further includes: S28 Controlling the drone 200 to avoid obstacles detected by the sensor 50.
  • step S28 can be implemented by processor 60. That is to say, the processor 60 can also be used to control the drone 200 to avoid obstacles detected by the sensor 50 during the process of controlling the drone 200 to fly in accordance with the flight trajectory indicated by the flight trajectory model.
  • the sensor 50 includes an ultrasonic sensor, a TOF (time of flight) ranging sensor, a vision-based obstacle avoidance sensor, and the like.
  • the flight location of the drone 200 may be different from the flight location when the drone 100 records the flight trajectory model.
  • the starting position and the ending position are position A and position B, respectively.
  • the starting position and the ending position are position C and position D, respectively.
  • the drone 200 may be at risk of hitting an obstacle while flying from position C to position D in accordance with the flight trajectory indicated by the flight trajectory model. Therefore, when the sensor 50 of the drone 200 detects an obstacle, the drone 200 needs to avoid it autonomously to ensure flight safety.
  • the drone 100 and the drone 200 of the embodiment of the present invention may be the same type of drone. That is to say, the drone 100 can not only have a function of recording a flight model but also a function of flying in accordance with a flight model acquired from an external device.
  • the drone 200 not only has a function of flying in accordance with a flight model acquired from an external device, but also has a function of recording a flight model.
  • a drone 1000 of an embodiment of the present invention includes a pan/tilt head 500, a camera 600, one or more processors 700, a memory 800, and one or more programs 810.
  • One or more of the programs 810 are stored in the memory 800 and are configured to be executed by one or more processors 700.
  • the program 810 includes instructions for performing the control method of any of the above embodiments.
  • the program 810 can be used to execute instructions of the control method described in the following steps:
  • the program 810 can also be used to execute instructions of the control method described in the following steps:
  • S26 Control the drone 200 to fly according to the flight model.
  • a computer readable storage medium in accordance with an embodiment of the present invention includes a computer program for use in conjunction with a flight-capable electronic device; the computer program can be executed by the processor 700 to perform the control method described in any of the above embodiments.
  • the computer program can be executed by the processor 700 to perform the control method described in the following steps:
  • the computer program can also be executed by the processor 700 to perform the control method described in the following steps:
  • S26 Control the drone 200 to fly according to the flight model.
  • a "computer-readable medium” can be any apparatus that can contain, store, communicate, propagate, or transport a program for use in an instruction execution system, apparatus, or device, or in conjunction with the instruction execution system, apparatus, or device.
  • computer readable media include the following: an electrical connection (electronic device) having one or more wires, a portable computer disk cartridge (magnetic device), a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), a fiber optic device, and a portable compact disc read-only memory (CD-ROM).
  • the computer readable medium may even be paper or another suitable medium on which the program can be printed, since the program can be obtained electronically, for example by optically scanning the paper or other medium and then editing, interpreting, or otherwise processing it in a suitable manner if necessary, before storing it in a computer memory.
  • portions of the invention may be implemented in hardware, software, firmware or a combination thereof.
  • multiple steps or methods may be implemented by software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they can be implemented by any one or a combination of the following techniques well known in the art: a discrete logic circuit having logic gates for implementing logic functions on data signals, an application-specific integrated circuit having suitable combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
  • each functional unit in each embodiment of the present invention may be integrated into one processing module, or each unit may exist physically separately, or two or more units may be integrated into one module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules.
  • the integrated modules, if implemented in the form of software functional modules and sold or used as independent products, may also be stored in a computer readable storage medium.
  • the above mentioned storage medium may be a read only memory, a magnetic disk or an optical disk or the like.

Abstract

The present invention relates to a control method for use with an unmanned aerial vehicle (100), and to an unmanned aerial vehicle (100). The control method comprises: (S12) collecting motion information of an unmanned aerial vehicle (100) during flight; (S14) determining model parameters of a flight model according to the motion information; and (S16) sending the model parameters to an external device.
PCT/CN2017/087955 2017-06-12 2017-06-12 Procédé de commande et véhicule aérien sans pilote WO2018227345A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/087955 WO2018227345A1 (fr) 2017-06-12 2017-06-12 Procédé de commande et véhicule aérien sans pilote
CN201780004914.7A CN108700883A (zh) 2017-06-12 2017-06-12 控制方法和无人机

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/087955 WO2018227345A1 (fr) 2017-06-12 2017-06-12 Procédé de commande et véhicule aérien sans pilote

Publications (1)

Publication Number Publication Date
WO2018227345A1 true WO2018227345A1 (fr) 2018-12-20

Family

ID=63844047

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/087955 WO2018227345A1 (fr) 2017-06-12 2017-06-12 Procédé de commande et véhicule aérien sans pilote

Country Status (2)

Country Link
CN (1) CN108700883A (fr)
WO (1) WO2018227345A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109472A (zh) * 2019-04-25 2019-08-09 广州笨笨网络科技有限公司 一种无人机控制方法、系统、终端和无人机
CN110414359B (zh) * 2019-07-01 2022-07-26 中国石化销售有限公司华南分公司 长输管道无人机巡检数据分析与管理方法及系统
CN111435255B (zh) * 2019-10-23 2023-08-18 珠海全志科技股份有限公司 一种无人机制动控制方法、装置以及无人机

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294064A (zh) * 2013-06-07 2013-09-11 天津全华时代航天科技发展有限公司 一种自动驾驶飞行控制系统
EP2818957A1 (fr) * 2013-06-24 2014-12-31 Honeywell International Inc. Système et procédé d'atterrissage pour véhicule aérien sans pilote
CN104597912A (zh) * 2014-12-12 2015-05-06 南京航空航天大学 一种六旋翼无人直升机跟踪飞行控制系统及方法
CN104656660A (zh) * 2015-01-22 2015-05-27 南京航空航天大学 微小型无人直升机多模态自主飞行的控制系统及其方法
CN105652891A (zh) * 2016-03-02 2016-06-08 中山大学 一种旋翼无人机移动目标自主跟踪装置及其控制方法

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112614246A (zh) * 2014-09-30 2021-04-06 深圳市大疆创新科技有限公司 用于数据记录与分析的系统和方法
CN105373629A (zh) * 2015-12-17 2016-03-02 谭圆圆 基于无人飞行器的飞行状态数据处理装置及其方法
CN105678289A (zh) * 2016-03-07 2016-06-15 谭圆圆 一种无人飞行器的控制方法及装置
CN205721829U (zh) * 2016-03-07 2016-11-23 谭圆圆 一种无人飞行器
CN105823478A (zh) * 2016-03-14 2016-08-03 武汉卓拔科技有限公司 一种自主避障导航信息共享和使用方法
CN105892484A (zh) * 2016-04-12 2016-08-24 谭圆圆 一种无人飞行器的控制方法、装置及系统
CN205608526U (zh) * 2016-04-12 2016-09-28 谭圆圆 一种无人飞行器及无人飞行器控制装置
CN106197423A (zh) * 2016-06-28 2016-12-07 株洲斯凯航空科技有限公司 一种无人机自动施药航迹记录装置
CN106125758B (zh) * 2016-07-07 2019-03-15 衢州光明电力投资集团有限公司赋腾科技分公司 一种无人机编队控制系统及方法
CN106483980B (zh) * 2016-11-24 2019-05-31 腾讯科技(深圳)有限公司 一种无人机跟随飞行的控制方法、装置及系统
CN106774421B (zh) * 2017-02-10 2020-03-10 郑州云海信息技术有限公司 一种无人机轨迹规划系统

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103294064A (zh) * 2013-06-07 2013-09-11 天津全华时代航天科技发展有限公司 一种自动驾驶飞行控制系统
EP2818957A1 (fr) * 2013-06-24 2014-12-31 Honeywell International Inc. Système et procédé d'atterrissage pour véhicule aérien sans pilote
CN104597912A (zh) * 2014-12-12 2015-05-06 南京航空航天大学 一种六旋翼无人直升机跟踪飞行控制系统及方法
CN104656660A (zh) * 2015-01-22 2015-05-27 南京航空航天大学 微小型无人直升机多模态自主飞行的控制系统及其方法
CN105652891A (zh) * 2016-03-02 2016-06-08 中山大学 一种旋翼无人机移动目标自主跟踪装置及其控制方法

Also Published As

Publication number Publication date
CN108700883A (zh) 2018-10-23

Similar Documents

Publication Publication Date Title
US11649052B2 (en) System and method for providing autonomous photography and videography
US11347217B2 (en) User interaction paradigms for a flying digital assistant
US11656635B2 (en) Heading generation method and system of unmanned aerial vehicle
US20210116943A1 (en) Systems and methods for uav interactive instructions and control
WO2018098704A1 Control method, apparatus and system, unmanned aerial vehicle, and mobile platform
WO2019113966A1 Obstacle avoidance method and device, and autonomous aerial vehicle
WO2017201698A1 Target tracking method and apparatus
WO2018227345A1 Control method and unmanned aerial vehicle
US20210325886A1 (en) Photographing method and device
WO2020233682A1 Autonomous circling photographing method and apparatus, and unmanned aerial vehicle
WO2023036260A1 Image acquisition method and apparatus, aerial vehicle, and storage medium
WO2020237478A1 Flight planning method and related device
US11107506B2 (en) Method and system for combining and editing UAV operation data and video data
CA3069813A1 Capturing, connecting and using building interior data from mobile devices
US20230359204A1 (en) Flight control method, video editing method, device, uav and storage medium
US20200217665A1 (en) Mobile platform, image capture path generation method, program, and recording medium
KR101876829B1 (ko) Guidance control system for indoor flight control of a small drone
JP6515423B2 (ja) Control device, movable body, control method, and program
WO2022000211A1 Control method for a photographing system, device, movable platform, and storage medium
JP7468523B2 (ja) Movable body, position estimation method, and program
CN111226093A (zh) Information processing device, flight path generation method, program, and recording medium
WO2022205294A1 Unmanned aerial vehicle control method and apparatus, unmanned aerial vehicle, and storage medium
WO2020088397A1 Position estimation apparatus, position estimation method, program, and recording medium
JP2023170180A (ja) Site monitoring system
JP2020052255A (ja) Movable body, control method, and program

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 17913323; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: PCT application non-entry in European phase (Ref document number: 17913323; Country of ref document: EP; Kind code of ref document: A1)