WO2017096762A1 - Aircraft control method, mobile terminal and storage medium - Google Patents


Info

Publication number
WO2017096762A1
Authority
WO
WIPO (PCT)
Prior art keywords
aircraft
sensor data
touch
touch operation
sensor
Prior art date
Application number
PCT/CN2016/083287
Other languages
French (fr)
Chinese (zh)
Inventor
黎凯锋
李家伦
宁京
Original Assignee
腾讯科技(深圳)有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority to CN201510919245.9 priority Critical
Priority to CN201510919245.9A priority patent/CN105549604B/en
Application filed by 腾讯科技(深圳)有限公司 filed Critical 腾讯科技(深圳)有限公司
Publication of WO2017096762A1 publication Critical patent/WO2017096762A1/en
Priority claimed from US15/957,749 external-priority patent/US10587790B2/en

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G06 COMPUTING; CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on GUIs using specific features provided by the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04883 Interaction techniques based on GUIs using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B64 AIRCRAFT; AVIATION; COSMONAUTICS
    • B64C AEROPLANES; HELICOPTERS
    • B64C2201/00 Unmanned aerial vehicles; equipment therefor
    • B64C2201/14 Unmanned aerial vehicles characterised by flight control
    • B64C2201/146 Remote controls

Abstract

An aircraft control method, comprising: displaying an aircraft control interface; detecting a touch operation that acts on the aircraft control interface; if the touch operation is detected, obtaining sensor data and obtaining an aircraft control instruction at least according to the sensor data; and sending the aircraft control instruction to an aircraft.

Description

Aircraft control method, mobile terminal and storage medium

The present application claims priority to Chinese Patent Application No. 201510919245.9, filed on December 10, 2015, the entire disclosure of which is incorporated herein by reference.

Technical field

The present invention relates to the field of aircraft technology, and in particular, to an aircraft control method, a mobile terminal, and a storage medium.

Background

An unmanned aerial vehicle (UAV), commonly called a drone, is a remotely piloted aircraft with no crew on board. UAVs include unmanned helicopters, unmanned fixed-wing aircraft, unmanned multi-rotor aircraft, unmanned airships, and unmanned paragliders. Drones were originally used in the military field, mainly as reconnaissance aircraft and target drones. As the cost of drones has fallen, they have gradually entered the civilian sector.

At present, a drone is mainly operated with a joystick controller, or with a mobile terminal that simulates one. However, both approaches require the user to have certain basic skills, are difficult for new users who have never handled a joystick controller, and leave the user no alternative control method. The current drone control approach therefore offers only a single control mode and needs to be improved.

Summary of the invention

According to various embodiments of the present application, an aircraft control method, a mobile terminal, and a storage medium are provided.

An aircraft control method includes:

displaying an aircraft control interface;

detecting a touch operation acting on the aircraft control interface;

if the touch operation is detected, acquiring sensor data, and obtaining an aircraft control command based on at least the sensor data; and

sending the aircraft control command to the aircraft.

A mobile terminal comprises a memory and a processor, the memory storing computer readable instructions that, when executed by the processor, cause the processor to perform the following steps:

displaying an aircraft control interface;

detecting a touch operation acting on the aircraft control interface;

if the touch operation is detected, acquiring sensor data, and obtaining an aircraft control command based on at least the sensor data; and

sending the aircraft control command to the aircraft.

One or more non-volatile computer readable storage media store computer readable instructions that, when executed by one or more processors, cause the one or more processors to perform the following steps:

displaying an aircraft control interface;

detecting a touch operation acting on the aircraft control interface;

if the touch operation is detected, acquiring sensor data, and obtaining an aircraft control command based on at least the sensor data; and

sending the aircraft control command to the aircraft.

Details of one or more embodiments of the invention are set forth in the accompanying drawings and description below. Other features, objects, and advantages of the invention will be apparent from the description and appended claims.

Brief description of the drawings

To illustrate the embodiments of the present invention or the technical solutions in the prior art more clearly, the drawings used in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present invention, and those skilled in the art can obtain other drawings from them without creative work.

FIG. 1 is an application environment diagram of an aircraft control system in an embodiment;

FIG. 2 is a schematic structural diagram of a mobile terminal applying an aircraft control method in an embodiment;

FIG. 3 is a schematic structural diagram of an aircraft in an embodiment;

FIG. 4 is a schematic flowchart of an aircraft control method in an embodiment;

FIG. 5 is a schematic diagram of a display page displayed by a mobile terminal in an embodiment;

FIG. 6 is a schematic diagram of an aircraft control interface displayed by a mobile terminal in an embodiment;

FIG. 7 is a schematic flowchart of the steps of controlling an aircraft according to a touch operation on a second touch area in an embodiment;

FIG. 8 is a schematic diagram of an aircraft control interface in an embodiment;

FIG. 9 is a flowchart of the steps of obtaining control commands from sensor data in an embodiment;

FIG. 10 is a schematic diagram of gestures in which the user, while pressing the first touch area, swings the mobile terminal in four main directions to control the aircraft to perform four corresponding actions;

FIG. 11 is a schematic diagram of an example action performed when the user holds the first touch area and swings the mobile terminal in a first direction in an embodiment;

FIG. 12 is a schematic diagram of an example action performed when the user holds the first touch area and swings the mobile terminal in a second direction in an embodiment;

FIG. 13 is a schematic diagram of an example action performed when the user holds the first touch area and swings the mobile terminal in a third direction in an embodiment;

FIG. 14 is a schematic diagram of an example action performed when the user holds the first touch area and swings the mobile terminal in a fourth direction in an embodiment;

FIG. 15 is a flowchart of the steps of selecting a preset automatic control mode to control an aircraft in an embodiment;

FIG. 16 is a structural block diagram of a mobile terminal in an embodiment;

FIG. 17 is a structural block diagram of a mobile terminal in another embodiment;

FIG. 18 is a structural block diagram of a sensor data processing module in an embodiment;

FIG. 19 is a structural block diagram of a mobile terminal in still another embodiment.

Detailed description

The present invention will be further described in detail below with reference to the accompanying drawings and embodiments. It is understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.

As shown in FIG. 1, in one embodiment, an aircraft control system 100 is provided that includes a mobile terminal 102 and an aircraft 104. A wireless connection is established between the mobile terminal 102 and the aircraft 104, through which data is transferred between them. The aircraft 104 is a flight device that can be remotely controlled. It may be a drone, specifically any one of a fixed-wing drone, a rotor drone, a para-wing drone, a flapping-wing drone, and an unmanned airship. The aircraft 104 can also be a powered aviation model.

As shown in FIG. 2, in one embodiment, a mobile terminal 102 is provided that includes a processor, a non-volatile storage medium, an internal memory, a communication device, a display screen, and an input device connected via a system bus. The processor has computing functionality and the functionality to control the operation of the mobile terminal 102, and is configured to perform an aircraft control method. The non-volatile storage medium includes at least one of a magnetic storage medium, an optical storage medium, and a flash storage medium, and stores an operating system. The non-volatile storage medium or the internal memory can store computer readable instructions that, when executed by the processor, cause the processor to perform the aircraft control method. The internal memory provides a cache for the operating system and the computer readable instructions. The communication device is used for wireless communication with the aircraft 104. The display screen includes at least one of a liquid crystal display, a flexible display, and an electronic ink display. The input device includes at least one of a physical button, a trackball, a touchpad, and a touch layer overlapping the display screen; the touch layer and the display screen combine to form a touch screen. The mobile terminal 102 may be at least one of a mobile phone, a tablet, a PDA (Personal Digital Assistant), and a touch remote control.

As shown in FIG. 3, in one embodiment, an aircraft 104 is provided that includes a processor, a non-volatile storage medium, an internal memory, a communication device, a flight drive, a camera, and a positioning device connected via a system bus. The processor has computing functionality and the functionality to control the operation of the aircraft 104, and is configured to execute a control command or a combined control command from the mobile terminal. The non-volatile storage medium includes at least one of a magnetic storage medium, an optical storage medium, and a flash storage medium, and stores an operating system and a control instruction execution device for executing a control command or a combined control command from the mobile terminal. The internal memory provides a cache for the operating system and the control instruction execution device. The communication device is used for wireless communication with the mobile terminal 102. The flight drive controls the flight behavior of the aircraft 104, primarily its flight speed and flight direction. For a rotorcraft, the flight drive mainly includes the rotors and the rotor control device. The camera is used to capture images, including photos and videos. The positioning device, which may be a GPS (Global Positioning System) device, locates the position of the aircraft 104.

As shown in FIG. 4, in one embodiment, an aircraft control method is provided. This embodiment is described by applying the method to the mobile terminal 102 of FIG. 1 and FIG. 2. The method comprises the following steps:

Step 402: Display an aircraft control interface.

Specifically, the mobile terminal runs an aircraft control application. The application has the function of controlling the aircraft, and may also have functions for processing photos or videos taken by the aircraft; this processing mainly includes classifying them, showing them, sharing them with social friends, and generating travel routes. The mobile terminal may sort the photos or videos by shooting time, or sort the geographic location information recorded when each photo or video was taken according to the corresponding shooting time, to generate a travel route. The travel route reflects the travel route of the aircraft, and can further reflect the travel route of the user.
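The travel-route generation described above amounts to ordering the recorded capture locations by shooting time. A minimal sketch in Python, assuming each photo carries a timestamp and GPS coordinates (the `Photo` fields are hypothetical, not taken from the patent):

```python
from dataclasses import dataclass

@dataclass
class Photo:
    shot_at: float  # capture time (e.g. a Unix timestamp)
    lat: float      # latitude recorded at capture
    lng: float      # longitude recorded at capture

def travel_route(photos):
    """Order the recorded locations by shooting time to form a route."""
    return [(p.lat, p.lng) for p in sorted(photos, key=lambda p: p.shot_at)]

photos = [Photo(300, 22.54, 114.06), Photo(100, 22.53, 114.05), Photo(200, 22.535, 114.055)]
print(travel_route(photos))  # earliest capture first
```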

Through the aircraft control application, the mobile terminal provides an aircraft control interface for triggering aircraft control commands; specifically, the aircraft control interface is reached from a display page used for displaying photos or videos taken by the aircraft. For example, the mobile terminal runs the aircraft control application and first enters the display page shown in FIG. 5, where the user can classify, view, and share with social friends the photos or videos taken by the aircraft, and can also view the travel route generated from those photos or videos. Upon detecting an operation on the aircraft control icon 502, the mobile terminal enters the aircraft control interface shown in FIG. 6.

Step 404, detecting a touch operation acting on the aircraft control interface.

Specifically, the first touch area is a specific area in the aircraft control interface for receiving touch operations. The first touch area may be a button that is in a first state by default and changes to a second state when a touch operation is detected, where the state includes at least one of a shape, a color, and a pattern; for example, the button is in a raised state by default and changes to a depressed state after a touch operation is detected. The first touch area may also be an area identified by a preset mark, such as an area circled by a virtual frame or marked with a special color. The first touch area may also be unmarked, and instead be indicated by a guide icon when the aircraft control interface is entered for the first time. The touch operation may specifically be a touch click operation, a touch double click operation, a touch long press operation, a sliding operation, or a multi-touch operation. A multi-touch operation is based on multiple touch points, such as triggering multiple touch points and then gathering them together, or triggering multiple touch points and then spreading them apart. A touch operation applied to the first touch area is one whose touch point is in the first touch area. The mobile terminal can detect touch operations acting on the aircraft control interface in real time or periodically.

For example, referring to FIG. 6, the first touch area may be an area 602 in the aircraft control interface. When the user touches the first touch area 602 with a touch body, such as a stylus or a finger, and keeps the touch point from disappearing, the mobile terminal detects the touch operation acting on the first touch area 602.

Step 406: If the touch operation is detected, acquire sensor data and obtain an aircraft control command based on at least the sensor data.

The mobile terminal can read the sensor data from the corresponding sensor through an interface for reading sensor data; the sensor data may come from multiple sensors. In one embodiment, the sensor data comes from at least one of a direction sensor, a gravity sensor, an acceleration sensor, a light sensor, an electronic compass, a distance sensor, a three-axis gyroscope, a temperature sensor, and a pressure sensor.

The mobile terminal can obtain an aircraft control command from the acquired sensor data according to a mapping relationship between sensor data and aircraft control commands. This mapping relationship can be represented by a function whose argument is the sensor data and whose value is the identifier of the mapped aircraft control command.
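Such a mapping function might be sketched as follows, using tilt angles as the sensor data; the thresholds and command identifiers are illustrative assumptions, not taken from the patent:

```python
def command_from_sensors(sensors):
    """Map sensor readings to the identifier of an aircraft control command.

    `sensors` is a dict of readings; here pitch/roll are tilt angles in
    degrees, and the command names are purely illustrative.
    """
    pitch, roll = sensors["pitch"], sensors["roll"]
    if pitch > 15:
        return "FLY_FORWARD"
    if pitch < -15:
        return "FLY_BACKWARD"
    if roll > 15:
        return "FLY_RIGHT"
    if roll < -15:
        return "FLY_LEFT"
    return "HOVER"  # terminal held roughly level: no horizontal movement
```

The function itself plays the role of the mapping relationship: its argument is the sensor data, and its return value is the command identifier.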

The aircraft control command may be a command for controlling the flight state or the attitude of the aircraft, a command for controlling the aircraft to take a photo or video, or another instruction for controlling the aircraft to perform an action. The flight state includes at least one of flight direction, flight speed, flight altitude, hovering, and flight destination; the attitude of the aircraft includes, for example, banking or rotating.

For example, if the sensor data is a pressure value from a pressure sensor, a temperature value from a temperature sensor, or a brightness value from a light sensor, an aircraft control command for controlling the flight speed of the aircraft can be obtained from the sensor data; for instance, the higher the pressure value, the faster the aircraft flies, or the higher the temperature, the faster the aircraft flies. If the sensor data is a distance value from a distance sensor, an aircraft control command can be obtained that decelerates the aircraft when the distance value is less than a first preset value and stops it when the distance value is less than a second preset value, where the first preset value is greater than the second preset value.
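The two examples above can be sketched as simple mappings; the preset values, scales, and command names below are illustrative assumptions:

```python
def speed_from_pressure(pressure, max_speed=10.0, full_scale=100.0):
    """Higher pressure value -> faster flight (linear mapping, clamped)."""
    return max_speed * min(pressure, full_scale) / full_scale

def command_from_distance(distance, first_preset=5.0, second_preset=1.0):
    """Two-threshold distance rule; first_preset must exceed second_preset."""
    if distance < second_preset:
        return "STOP"        # closer than the second preset value: stop
    if distance < first_preset:
        return "DECELERATE"  # closer than the first preset value: slow down
    return "MAINTAIN_SPEED"
```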

Step 408: Send the aircraft control command to the aircraft.

Specifically, the mobile terminal sends the aircraft control command obtained from the sensor data to the aircraft over the wireless connection, so that the aircraft performs the action specified by the command upon receiving it. If the aircraft receives multiple aircraft control commands, it may execute the actions specified by the commands sequentially, in the order of reception.
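Executing multiple received commands in the order of reception is a first-in-first-out queue on the aircraft side; a sketch (class and method names are illustrative, not from the patent):

```python
from collections import deque

class CommandExecutor:
    """Execute received aircraft control commands in arrival order (FIFO)."""

    def __init__(self):
        self._pending = deque()
        self.executed = []  # record of actions performed, for illustration

    def receive(self, command):
        self._pending.append(command)

    def run_pending(self):
        # Perform the action specified by each command, in order of reception.
        while self._pending:
            self.executed.append(self._pending.popleft())
```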

The aircraft control method displays an aircraft control interface that has a first touch area. When a touch operation on the first touch area is detected, sensor data is used to generate an aircraft control command that is sent to the aircraft. In this way, while the touch operation is applied to the first touch area, the user can control the aircraft by changing the sensor data detected by the sensors. This provides a simple, novel control mode, giving the user more choices when operating the aircraft and making the aircraft more convenient to control.

In one embodiment, step 406 includes: if a first touch operation for turning on the sensor control mode is detected in the first touch area of the aircraft control interface, acquiring sensor data and obtaining an aircraft control command based on at least the sensor data, until a second touch operation acting on the first touch area is detected, at which point the acquisition stops.

Specifically, the touch operations detected by the mobile terminal include a first touch operation for turning on the sensor control mode and a second touch operation for turning it off. The sensor control mode is a mode in which the aircraft is controlled through sensor data. After turning on the sensor control mode, the mobile terminal acquires sensor data, obtains an aircraft control command based on at least the sensor data, and sends the command to the aircraft. After the mobile terminal turns off the sensor control mode, it no longer acquires sensor data, no longer obtains aircraft control commands from sensor data, or no longer sends aircraft control commands to the aircraft; the aircraft can still be controlled by other means, such as a simulated joystick.

In this embodiment, the timing of entering the sensor control mode can be flexibly controlled through the touch operations that turn the sensor control mode on and off, so that changes in the sensor data detected by the sensors can be used to control the aircraft in the sensor control mode, making the aircraft more convenient to operate.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning off the sensor control mode may be the same. At this time, the touch operation can be selected from a touch click operation, a touch double click operation, a slide operation, and a multi-touch operation.

For example, if a touch click operation on the first touch area is detected, the sensor control mode is turned on: sensor data is acquired, an aircraft control command is obtained based on at least the sensor data, and the command is sent to the aircraft. If a touch click operation on the first touch area is detected again, the sensor control mode is turned off.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning off the sensor control mode may be different. At this time, the two touch operations can be selected from a touch click operation, a touch double click operation, a slide operation, and a multi-touch operation, respectively.

For example, if a touch click operation on the first touch area is detected, the sensor control mode is turned on: sensor data is acquired, an aircraft control command is obtained based on at least the sensor data, and the command is sent to the aircraft. If a touch double click operation on the first touch area is detected, the sensor control mode is turned off.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning it off may be included in one combined touch operation. For example, a touch long press operation combines a touch operation that starts the long press with a touch operation that releases it.

In one embodiment, step 406 includes: starting timing when a third touch operation is detected, and, if the timing reaches a preset duration while the third touch operation remains applied to the first touch area in the aircraft control interface, acquiring sensor data and obtaining an aircraft control command based on at least the sensor data, until the third touch operation stops.

Specifically, the third touch operation detected in this embodiment is a continuous touch operation. After detecting that the third touch operation is applied to the first touch area, the mobile terminal acquires sensor data, obtains an aircraft control command based on at least the sensor data, and sends the command to the aircraft, until the effect of the touch point on the first touch area disappears. The action time of a touch operation is the period from when the touch operation is detected to when it disappears.

The preset duration distinguishes the continuous touch operation from a tap. If the touch point disappears within the preset duration after it is detected, the operation is recognized as a tap; if the touch point has not disappeared when the preset duration is reached, it is recognized as the continuous touch operation to be detected, and the sensor control mode is entered. In this case, the first touch operation for turning on the sensor control mode is the touch point of the third touch operation being held until the preset duration is reached, and the second touch operation for turning it off is the disappearance of the touch point's effect on the first touch area.

In this embodiment, timing starts when the touch point is detected, and the sensor control mode is entered only if the touch operation is still applied to the first touch area when the preset duration is reached. This prevents the aircraft from going out of control because the user accidentally touched the first touch area.
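The tap-versus-sustained-press logic above can be sketched as a small state machine; the preset duration and class/method names are illustrative assumptions:

```python
PRESET_DURATION = 0.8  # seconds; an assumed threshold, not from the patent

class FirstTouchArea:
    """Distinguish an accidental tap from a sustained press that should
    enter the sensor control mode. Timestamps are passed in explicitly
    so the logic is easy to test."""

    def __init__(self):
        self._pressed_at = None
        self.sensor_mode = False

    def touch_down(self, now):
        self._pressed_at = now  # start timing when the touch point appears

    def touch_up(self, now):
        # Releasing the touch point leaves (or never enters) sensor mode.
        self._pressed_at = None
        self.sensor_mode = False

    def tick(self, now):
        # Called periodically: enter the sensor control mode only once the
        # touch has been held for the preset duration.
        if self._pressed_at is not None and now - self._pressed_at >= PRESET_DURATION:
            self.sensor_mode = True
        return self.sensor_mode
```

A release before the preset duration elapses never sets `sensor_mode`, which is exactly the accidental-touch protection described above.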

In other embodiments, the mobile terminal may also acquire sensor data immediately after detecting the touch operation, obtain an aircraft control command based on at least the sensor data, and send the command to the aircraft, until the effect of the touch point on the first touch area disappears.

As shown in FIG. 7, in an embodiment, the method further includes the step of controlling the aircraft according to a touch operation on a second touch area, which specifically includes the following steps:

Step 702: Detect a touch operation applied to a second touch area in the aircraft control interface; the second touch area is used to simulate joystick operation.

Specifically, the second touch area is a specific area in the aircraft control interface for receiving touch operations that simulate joystick operation. Touch operations include touch click operations, touch double click operations, touch long press operations, swipe operations, and multi-touch operations. Touch operations applied to the second touch area and touch operations applied to the first touch area implement different control modes.

In one embodiment, the second touch area can surround the first touch area without overlapping it, and the types of touch operation detected in the two areas may be the same. In other embodiments, the second touch area can be separate from the first touch area.

Step 704: Acquire the analog joystick control command triggered by the touch operation.

Step 706: Send the analog joystick control command to the aircraft.

Specifically, touch operations acting on different regions of the second touch area can trigger different analog joystick control commands. The second touch area can define four main directions, such as up, down, left, and right, so that the mobile terminal can trigger the corresponding analog joystick control command according to the position of the touch operation relative to these four main directions. Analog joystick control commands do not conflict with aircraft control commands: preferably, analog joystick control commands control the vertical movement and attitude changes of the aircraft, while aircraft control commands control the movement of the aircraft in various directions along the horizontal plane.

Referring to FIG. 8, if the touch point of the touch operation acts on a main direction of the second touch area 801, as shown by the four gestures 802, 803, 804, and 805 in FIG. 8, the mobile terminal triggers the analog joystick control command corresponding to that main direction and sends it to the aircraft; the aircraft then performs the ascending, descending, left-rotating, or right-rotating action specified by the received command.

If the touch operation is applied to a position in the second touch area other than the main directions, the mobile terminal may trigger the corresponding combined analog joystick manipulation command according to the components of the touch point mapped onto the main directions, and send the combined analog joystick manipulation command to the aircraft; the aircraft can then perform a left-turning ascent, right-turning ascent, left-turning descent, or right-turning descent according to the received combined analog joystick manipulation command.
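The mapping from a touch point in the second touch area to a main-direction or combined analog joystick manipulation command can be sketched as follows. This is an illustrative reading of the description, not the patent's implementation; the function names, command names, and dead-zone threshold are all assumptions.

```python
# Hypothetical sketch: map a touch point in the second (joystick) touch
# area to analog joystick manipulation commands. A touch point exactly on
# one main direction yields a single command; a touch point between two
# main directions yields a combined command from its two components.

def joystick_command(dx, dy, dead_zone=0.1):
    """dx, dy: touch-point offset from the joystick centre, normalised to [-1, 1].
    Returns a list of (command, magnitude) pairs triggered by the touch point."""
    commands = []
    if abs(dx) >= dead_zone:  # horizontal component -> left/right turn
        commands.append(("turn_right", dx) if dx > 0 else ("turn_left", -dx))
    if abs(dy) >= dead_zone:  # vertical component -> ascend/descend
        commands.append(("ascend", dy) if dy > 0 else ("descend", -dy))
    return commands
```

For example, a touch point straight up from the centre triggers only an ascend command, while a touch point in the upper-right region triggers the combined right-turning ascent described above.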

In this embodiment, the mobile terminal detects a touch operation applied to the first touch area of the aircraft control interface and a touch operation applied to the second touch area of the aircraft control interface, thereby implementing different control modes of the aircraft according to combinations of different detection results. This makes the control modes of the aircraft more diverse, and the handling of the aircraft more flexible and convenient.

In one embodiment, step 406 includes: if it is detected that the touch point of a fourth touch operation acts on the first touch area, acquiring sensor data and obtaining an aircraft manipulation command based on at least the sensor data, stopping when the touch point disappears. The method further includes: when the touch point moves to the second touch area in the aircraft control interface, triggering an analog joystick manipulation command according to the position of the touch point in the second touch area and sending it to the aircraft; the second touch area is used to simulate joystick operation.

Specifically, the fourth touch operation detected in this embodiment is a continuous touch operation. After detecting the fourth touch operation, the mobile terminal acquires sensor data during the action time of the corresponding touch point, obtains an aircraft manipulation command based on at least the sensor data, and sends the aircraft manipulation command to the aircraft until the touch point disappears. Detecting that the touch point of the fourth touch operation acts on the first touch area serves as the first touch operation for turning on the sensor control mode, and the disappearance of the touch point serves as the second touch operation for turning off the sensor control mode.

The mobile terminal can also start timing when the touch point of the fourth touch operation is detected. If the timing reaches a preset duration and the corresponding touch point remains in the first touch area, the sensor data is acquired, and an aircraft manipulation command is obtained based on at least the sensor data until the touch point disappears. In this case, the moment at which the timing started by detecting the touch point of the fourth touch operation reaches the preset duration serves as the first touch operation for turning on the sensor control mode, and the disappearance of the touch point serves as the second touch operation for turning off the sensor control mode.

Further, when the touch point moves to the second touch area in the aircraft control interface, steps 702 to 706 can be triggered, with the action of the touch point on the second touch area serving as the touch operation applied to the second touch area.

In this embodiment, the sensor control mode and analog joystick control of the aircraft can be triggered in succession by a single continuous fourth touch operation, enabling one-handed operation of the aircraft and making the operation of the aircraft more convenient and faster.

In one embodiment, step 406 includes: detecting a press-and-hold touch operation on a touch button in the second touch area of the aircraft control interface, and acquiring sensor data upon detection; detecting movement of the touch button following the touch operation within the second touch area or the aircraft control interface, and acquiring an analog joystick manipulation command according to the movement; and obtaining the aircraft manipulation command according to the sensor data and the analog joystick manipulation command.

The analog joystick manipulation command is a control command simulating a joystick remote control. Specifically, four main directions, such as up, down, left, and right, are defined in the second touch area, so that the corresponding analog joystick manipulation command can be triggered according to the position at which the press-and-hold touch operation acts in the second touch area relative to the four main directions. The analog joystick manipulation command does not conflict with the aircraft manipulation command. Preferably, the analog joystick manipulation command is used to manipulate the vertical movement of the aircraft and the change in attitude of the aircraft.

In this embodiment, by pressing and holding the touch button of the second touch area, the sensor control mode and analog joystick control of the aircraft can be triggered in succession, enabling one-handed operation of the aircraft and making the operation of the aircraft more convenient and faster.

As shown in FIG. 9, in one embodiment, the step of obtaining an aircraft manipulation command based on at least the sensor data specifically includes the following steps:

Step 902: Determine an initial state of the mobile terminal where the sensor is located according to the obtained initial sensor data.

Specifically, the sensor data includes data reflecting at least one of the posture and the motion of the mobile terminal. The initial sensor data is the sensor data first received by the mobile terminal after entering the sensor control mode, and is used to determine the current state of the mobile terminal, which is defined as the initial state. The initial state includes a posture state and a motion state of the mobile terminal, where the posture state includes, for example, the tilted portion of the mobile terminal, the tilt direction, and the tilt angle, and the motion state includes, for example, the motion speed, the motion acceleration, and the motion direction.

The mobile terminal can determine its initial state in three-dimensional space according to a fixed three-dimensional reference coordinate system of the mobile terminal. The fixed reference coordinate system is a three-dimensional reference coordinate system including three mutually perpendicular axes, where two of the axes can be parallel to the display of the mobile terminal while the remaining axis is perpendicular to the display. The motion parameters include at least one of a moving direction, a moving range, and a moving speed. The initial state determined by the mobile terminal using the fixed reference coordinate system can accurately reflect the initial state of the mobile terminal in the three-dimensional space represented by the fixed reference coordinate system.

Step 904: Determine a subsequent state of the mobile terminal according to sensor data acquired after the initial sensor data.

Specifically, after determining the initial state, the mobile terminal continues to acquire subsequent sensor data, thereby determining a subsequent state of the mobile terminal according to the subsequent sensor data. The subsequent state includes a posture state and a motion state of the mobile terminal, where the posture state includes, for example, the tilted portion of the mobile terminal, the tilt direction, and the tilt angle, and the motion state includes, for example, the motion speed, the motion acceleration, and the motion direction.

Step 906, generating an aircraft manipulation command according to a change of the subsequent state with respect to the initial state.

Specifically, the mobile terminal compares the subsequent state with the initial state, thereby generating an aircraft manipulation command according to the amount by which the subsequent state changes with respect to the initial state. For example, if the initial state is that the lower left corner of the mobile terminal is tilted by 15°, and the subsequent state is that the mobile terminal has returned from the 15° tilt to horizontal, then the lower left corner of the mobile terminal has moved 15° in the opposite direction, and the mobile terminal generates an aircraft manipulation command based on this amount of change.
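Steps 902 to 906 can be sketched as computing a per-axis delta between the two states and mapping that delta onto a command. The state representation, axis names, sign convention, and gain below are assumptions made for illustration only.

```python
# Illustrative sketch of steps 902-906: generate an aircraft manipulation
# command from the change of the subsequent attitude state relative to the
# initial state. States are dicts of tilt angles in degrees (names assumed).

def attitude_delta(initial, subsequent):
    """Per-axis change of the subsequent state with respect to the initial one."""
    return {axis: subsequent[axis] - initial[axis] for axis in initial}

def command_from_delta(delta, gain=1.0):
    # Assumed convention for this sketch: roll change drives left/right
    # translation, pitch change drives forward/backward translation.
    cmd = {}
    if delta["roll"]:
        cmd["lateral"] = gain * delta["roll"]
    if delta["pitch"]:
        cmd["longitudinal"] = gain * delta["pitch"]
    return cmd

# Example from the text: the lower-left corner starts tilted 15 degrees,
# then the terminal is brought back to horizontal; the 15 degree change
# (not the absolute attitude) is what drives the command.
initial = {"roll": -15.0, "pitch": 0.0}
subsequent = {"roll": 0.0, "pitch": 0.0}
delta = attitude_delta(initial, subsequent)
```

Because only the change relative to the initial state matters, the terminal does not need to start out horizontal, which matches the embodiment's point about controlling the aircraft from an arbitrary starting posture.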

For example, referring to FIG. 10 and FIG. 11, when the user presses and holds the first touch area, if the mobile terminal determines from the initial sensor data that its initial state is horizontal and the mobile terminal is then swung to the left, the mobile terminal can determine its subsequent state according to the subsequent sensor data and detect, from the change of the subsequent state with respect to the initial state, that it has moved in the direction of the left side of the screen. After the aircraft manipulation command generated by the mobile terminal is transmitted to the aircraft, the aircraft performs a corresponding action, such as a leftward shifting action, so that the aircraft moves in linkage with the motion of the mobile terminal.

Further, if the user continues to hold the first touch area and swings the mobile terminal to the right, upward, or downward, the mobile terminal can detect that it has moved in the direction of the right side, the upper side, or the lower side of the screen, and after the corresponding aircraft manipulation command is sent to the aircraft, the aircraft performs a rightward shifting, advancing, or retreating action, respectively.

In this embodiment, the user can turn on the sensor control mode through a touch operation on the first touch area with the mobile terminal in an arbitrary state. The mobile terminal then determines an initial state according to the initial sensor data, determines subsequent states according to the subsequent sensor data, and generates aircraft manipulation commands according to the change of the subsequent state with respect to the initial state. In this way, the user does not have to hold the mobile terminal horizontally to control the aircraft, and control is more convenient and precise.

In an embodiment, when determining the motion state of the mobile terminal, the mobile terminal determines whether the motion amplitude exceeds a preset threshold; if so, step 906 is performed, and if not, step 906 is skipped. In this embodiment, the user can swing the mobile terminal quickly in one direction and then slowly return it, so that the mobile terminal repeatedly generates motion parameters in the same direction, and the aircraft can thus be made to perform the same action continuously.
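The amplitude gate described above can be expressed as a single comparison. The threshold value and units below are assumptions for illustration; the point is only that a fast swing passes the gate while the slow return does not, so repeated swing-and-return motions keep issuing the same command.

```python
# Sketch of the motion-amplitude gate: only motions whose amplitude exceeds
# the preset threshold generate a command (step 906); the slow return
# motion stays below the threshold and is ignored.

MOTION_THRESHOLD = 2.0  # illustrative units, e.g. peak acceleration in m/s^2

def should_generate_command(motion_amplitude):
    return motion_amplitude > MOTION_THRESHOLD
```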

As shown in FIG. 15 , in an embodiment, the aircraft control method further includes the step of selecting a preset automatic control mode to operate the aircraft, and specifically includes the following steps:

Step 1502: Detect a selection instruction for a preset automatic control mode icon in the aircraft control interface.

Specifically, the mobile terminal can display a plurality of preset automatic control mode icons in the aircraft control interface, such as the icons 606, 608, and 610 in FIG. 6. When the user clicks on an icon, the corresponding selection instruction is triggered. A preset automatic control mode is an automatic control mode that uses predefined parameters to control the aircraft to perform predefined actions.

Step 1504: Determine a corresponding preset automatic control mode according to the selection instruction.

Specifically, the mobile terminal takes the preset automatic control mode corresponding to the icon targeted by the selection instruction as the preset automatic control mode determined according to the selection instruction.

Step 1506, reading the aircraft combination manipulation command associated with the determined preset automatic control mode.

Step 1508, transmitting an aircraft combination manipulation command to the aircraft, so that the aircraft sequentially performs a corresponding series of actions according to the aircraft combination manipulation command.

Specifically, each preset automatic control mode stored on the mobile terminal is pre-associated with a corresponding aircraft combination manipulation command. After the mobile terminal reads the aircraft combination manipulation command and sends it to the aircraft, the aircraft sequentially executes a series of actions according to the aircraft combination manipulation command, so that the aircraft automatically changes from its current state to the target state specified by the preset automatic control mode.
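The pre-association of each mode with an ordered series of actions, as used in steps 1506 to 1508, can be sketched as a lookup table. The mode keys and action names below are illustrative paraphrases of the four modes described next, not identifiers from the patent.

```python
# Hypothetical association of each preset automatic control mode with its
# aircraft combination manipulation command (an ordered series of actions).

PRESET_MODES = {
    "in_situ_landing": ["stop_horizontal", "descend_gradually", "stop_rotor"],
    "return_and_land": ["goto_preset_coords", "stop_horizontal",
                        "descend_gradually", "stop_rotor"],
    "emergency_hover": ["stop_horizontal", "hold_altitude"],
    "follow_target":   ["acquire_target", "fly_to_preset_distance",
                        "maintain_distance"],
}

def combined_command(mode):
    """Step 1506: read the combination command associated with the selected mode."""
    return PRESET_MODES[mode]
```

Step 1508 would then send the returned list to the aircraft, which executes the actions in order.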

In one embodiment, the preset automatic control mode includes at least one of an in-situ landing mode, a return-to-preset-location landing mode, an in-flight emergency hover mode, and a follow-locked-target flight mode.

If the determined preset automatic control mode is the in-situ landing mode, the aircraft can complete the in-situ landing flight task after automatically performing a series of actions: stopping horizontal movement, gradually reducing flight altitude, and stopping the rotor after reaching the ground.

If the determined preset automatic control mode is the return-to-preset-location landing mode, the aircraft can complete the flight task of returning to the preset location and landing after automatically performing a series of actions: acquiring the preset location coordinates, flying to the preset location coordinates, stopping horizontal movement, gradually reducing flight altitude, and stopping the rotor after reaching the ground.

If the determined preset automatic control mode is the in-flight emergency hover mode, the aircraft can complete the emergency hover flight task after automatically performing a series of actions: stopping horizontal movement and maintaining flight altitude.

If the determined preset automatic control mode is the follow-locked-target flight mode, the aircraft can complete the follow-locked-target flight task after automatically performing a series of actions: acquiring the locked target, flying to within a preset distance of the locked target, and maintaining that distance.

In this embodiment, the user can quickly control the aircraft through the preset automatic control modes, so that the aircraft automatically completes the corresponding flight task, thereby improving the convenience of operation. The in-situ landing mode, the return-to-preset-location landing mode, and the in-flight emergency hover mode can be used for emergency risk avoidance or aircraft recovery. The follow-locked-target flight mode allows the aircraft to fly automatically after the target is locked, so that one user can manipulate multiple aircraft simultaneously.

As shown in FIG. 16, in one embodiment, a mobile terminal 1600 is provided. The internal structure of the mobile terminal 1600 may correspond to the mobile terminal structure shown in FIG. 2. Each of the following modules may be implemented in whole or in part by software, hardware, or a combination thereof.

The mobile terminal 1600 includes an interface display module 1601, a touch operation detection module 1602, a sensor data processing module 1603, and a manipulation command transmission module 1604.

The interface display module 1601 is configured to display an aircraft manipulation interface.

Specifically, the mobile terminal runs an aircraft control application, which has the function of manipulating the aircraft and may also have the function of processing photos or videos taken by the aircraft. The processing of photos or videos taken by the aircraft here mainly includes classification, display, sharing with social friends, and generating a travel route. The mobile terminal may specifically sort the photos or videos by shooting time, and may also sort the geographic location information recorded when the photos or videos were taken according to the corresponding shooting times to generate the travel route. The travel route here can reflect the travel route of the aircraft, and can further reflect the travel route of the user.

The interface display module 1601 provides, through the aircraft control application, an aircraft control interface for triggering aircraft manipulation commands, and specifically provides an entry to the aircraft control interface on a display page for displaying photos or videos taken by the aircraft. For example, the mobile terminal runs the aircraft control application and first enters the display page shown in FIG. 5, in which the user can classify and view the photos or videos taken by the aircraft, share them with social friends, and display the travel route generated from them. Upon detecting an operation on the aircraft control icon 502, the mobile terminal enters the aircraft control interface shown in FIG. 6.

The touch operation detecting module 1602 is configured to detect a touch operation applied to the aircraft control interface.

Specifically, the first touch area is a specific area in the aircraft control interface for receiving the touch operation. The first touch area may be a button that defaults to a first state and changes to a second state when the touch operation detecting module 1602 detects a touch operation, where a state includes at least one of a shape, a color, and a pattern. For example, the button is in a raised state by default and changes to a sunken state after the touch operation detecting module 1602 detects a touch operation. The first touch area may also be an area identified by a preset mark, such as an area circled by a virtual frame or identified by a special color. The first touch area may also be unmarked, and instead indicated by a guide icon when the aircraft control interface is entered for the first time. The touch operation may specifically be a touch click operation, a touch double-click operation, a touch long-press operation, a slide operation, or a multi-touch operation. A multi-touch operation is an operation based on multiple touch points, such as triggering multiple touch points and then pinching them together, or triggering multiple touch points and then spreading them apart. A touch operation applied to the first touch area is one whose touch point is in the first touch area. The touch operation detecting module 1602 can detect the touch operation acting on the aircraft control interface in real time or periodically.

For example, referring to FIG. 6, the first touch area may be the area 602 located in the aircraft control interface. The user touches the first touch area 602 through a touch body and keeps the touch point from disappearing, and the mobile terminal detects the touch operation on the first touch area 602. The touch body may be, for example, a stylus or the user's finger.

The sensor data processing module 1603 is configured to acquire sensor data if a touch operation is detected, and obtain an aircraft manipulation command based on at least the sensor data.

The sensor data processing module 1603 can specifically read the sensor data from the corresponding sensor through an interface for reading sensor data, where the sensor data can be sensor data of a plurality of sensors. In one embodiment, the sensor data is from at least one of a direction sensor, a gravity sensor, an acceleration sensor, a light sensor, an electronic compass, a distance sensor, a three-axis gyro sensor, a temperature sensor, and a pressure sensor.

The sensor data processing module 1603 can obtain an aircraft manipulation command according to the mapping relationship between sensor data and aircraft manipulation commands, together with the acquired sensor data. The mapping relationship between the sensor data and the aircraft manipulation command can be represented by a function, whose argument can be the sensor data and whose dependent variable can be the identifier of the mapped aircraft manipulation command.

The aircraft manipulation command may be a control command for controlling the flight state or attitude of the aircraft, a control command for controlling the aircraft to take photos or videos, or another instruction for controlling the aircraft to perform an action. The flight state includes, for example, at least one of a flight direction, a flight speed, a flight altitude, hovering, and a flight destination; the attitude of the aircraft includes, for example, banking or rotation.

For example, if the sensor data is a pressure value from a pressure sensor, a temperature value from a temperature sensor, or a brightness value from a light sensor, an aircraft manipulation command for controlling the flight speed of the aircraft can be obtained based on the sensor data, such as the higher the pressure value, the faster the aircraft flies, or the higher the temperature, the faster the aircraft flies. If the sensor data is a distance value from a distance sensor, an aircraft manipulation command can be obtained based on the sensor data to decelerate when the distance value is less than a first preset value and to stop when the distance value is less than a second preset value, where the first preset value is greater than the second preset value.
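The two examples above can be sketched as concrete mapping functions from sensor data to a command. The linear speed mapping, the gain, and the two preset distance values are illustrative assumptions; only the shape of the mapping (monotonic speed, two-tier distance thresholds with the first preset greater than the second) comes from the text.

```python
# Sketch of the sensor-data-to-command mapping relationship described above.

def speed_from_pressure(pressure, k=0.1):
    # Monotonic mapping: the higher the pressure value, the faster the
    # aircraft flies (linear form and gain k are assumptions).
    return k * pressure

def command_from_distance(distance, first_preset=10.0, second_preset=2.0):
    # first_preset must be greater than second_preset, as in the text.
    if distance < second_preset:
        return "stop"
    if distance < first_preset:
        return "decelerate"
    return "maintain_speed"
```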

The manipulation command transmitting module 1604 is configured to send the aircraft manipulation command to the aircraft.

Specifically, the manipulation command transmitting module 1604 transmits the obtained aircraft manipulation command to the aircraft through a wireless connection with the aircraft, so that the aircraft performs the action specified by the aircraft manipulation command after receiving the aircraft manipulation command. If the aircraft receives a plurality of aircraft maneuver commands, the actions specified by the respective aircraft maneuver commands may be sequentially executed in the order of reception.

The mobile terminal 1600 displays an aircraft control interface having a first touch area. When a touch operation is applied to the first touch area, sensor data is used to generate an aircraft manipulation command, which is sent to the aircraft. In this way, while acting on the first touch area through the touch operation, the user can control the aircraft by changing the sensor data detected by the sensors. This provides a simple and novel control mode, giving the user more choices when manipulating the aircraft and making control of the aircraft more convenient.

In one embodiment, the sensor data processing module 1603 is specifically configured to acquire sensor data when a first touch operation for turning on the sensor control mode is detected in the first touch area of the aircraft control interface, and to obtain an aircraft manipulation command based on at least the sensor data, until a second touch operation for turning off the sensor control mode acting on the first touch area is detected.

Specifically, the touch operations detected by the sensor data processing module 1603 include a first touch operation for turning on the sensor control mode and a second touch operation for turning off the sensor control mode. The sensor control mode refers to a mode in which the aircraft is manipulated through sensor data. After the sensor control mode is turned on, the sensor data processing module 1603 acquires sensor data, obtains an aircraft manipulation command based on at least the sensor data, and sends the aircraft manipulation command to the aircraft. After the sensor control mode is turned off, the sensor data processing module 1603 no longer acquires sensor data, no longer obtains aircraft manipulation commands from sensor data, or no longer sends aircraft manipulation commands to the aircraft, but the aircraft can still be manipulated by other means, for example through the analog joystick.

In this embodiment, the timing of entering the sensor control mode can be flexibly controlled through the touch operations for turning the sensor control mode on and off, respectively, so that in the sensor control mode the aircraft is controlled by changes in the sensor data detected by the sensors, making the handling of the aircraft more convenient.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning off the sensor control mode may be the same. In this case, the touch operation can be selected from a touch click operation, a touch double-click operation, a slide operation, and a multi-touch operation.

For example, if a touch click operation on the first touch area is detected, the sensor control mode is turned on, and the sensor data is acquired, and at least the aircraft control command is obtained according to the sensor data, and the aircraft control command is sent to the aircraft. If a touch click operation on the first touch area is detected again, the sensor control mode is turned off.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning off the sensor control mode may be different. At this time, the two touch operations can be selected from a touch click operation, a touch double click operation, a slide operation, and a multi-touch operation, respectively.

For example, if a touch click operation on the first touch area is detected, the sensor control mode is turned on, and sensor data is acquired, and at least the aircraft control command is obtained according to the sensor data, and the aircraft control command is sent to the aircraft. If a touch double click operation on the first touch area is detected, the sensor control mode is turned off.

In one embodiment, the first touch operation for turning on the sensor control mode and the second touch operation for turning off the sensor control mode may be included in one combined touch operation. The combined touch operation, such as a touch long press operation, includes a touch operation that triggers a long press operation and a touch operation that releases a long press operation.
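The three embodiments above (same operation toggles the mode, different operations open and close it, or one combined long press opens on press and closes on release) can be summarised with a minimal state machine. The class and event names are assumptions for illustration only.

```python
# Minimal sketch of the sensor control mode's on/off logic across the
# embodiments above. Event names are assumed, not from the patent.

class SensorControlMode:
    def __init__(self):
        self.active = False

    def on_tap(self):
        # Same first/second touch operation: each tap toggles the mode.
        self.active = not self.active

    def on_long_press_start(self):
        # Combined touch operation: pressing opens the mode...
        self.active = True

    def on_long_press_release(self):
        # ...and releasing the long press closes it.
        self.active = False
```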

In one embodiment, the sensor data processing module 1603 is specifically configured to start timing when a third touch operation is detected; if the timing reaches a preset duration and the third touch operation remains acting on the first touch area in the aircraft control interface, the sensor data is acquired, and an aircraft manipulation command is obtained based on at least the sensor data until the third touch operation stops.

Specifically, the touch operation detected in this embodiment is a continuous touch operation. After detecting the touch operation, the sensor data processing module 1603 acquires sensor data during the action time of the touch operation on the first touch area, obtains an aircraft manipulation command based on at least the sensor data, and sends the aircraft manipulation command to the aircraft until the touch point stops acting on the first touch area. The action time of the touch operation refers to the time period from when the touch operation is detected to when the touch operation disappears.

The preset duration is used to distinguish a momentary touch operation from the continuous touch operation to be detected: if the touch point disappears before the preset duration elapses after it is detected, it is recognized as a momentary touch operation; if the touch point has not disappeared when the preset duration is reached, it is recognized as the continuous touch operation to be detected, and the sensor control mode is entered. The moment at which the timing started by detecting the touch point of the third touch operation reaches the preset duration serves as the first touch operation for turning on the sensor control mode, and the moment at which the touch point stops acting on the first touch area serves as the second touch operation for turning off the sensor control mode.
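The preset-duration check can be sketched as a single classification function. The duration value and function name are assumptions for illustration.

```python
# Sketch of distinguishing a momentary tap from the continuous touch that
# enters the sensor control mode, using a preset duration (value assumed).

PRESET_DURATION = 0.5  # seconds, illustrative

def classify_touch(press_time, release_time):
    """release_time is None while the touch point has not yet disappeared."""
    if release_time is not None and release_time - press_time < PRESET_DURATION:
        return "tap"          # touch point gone before the timer fired
    return "continuous"       # still held when the preset duration elapsed
```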

In this embodiment, timing is started when the touch point is detected, and the sensor control mode is entered only if the touch point is still acting on the first touch area when the preset duration is reached. This prevents the aircraft from going out of control due to the user accidentally touching the first touch area.

In other embodiments, the sensor data processing module 1603 may also acquire sensor data immediately after detecting the touch operation, and obtain an aircraft manipulation command based on at least the sensor data until the touch point stops acting on the first touch area.

In one embodiment, the sensor data processing module 1603 is specifically configured to detect a press-and-hold touch operation on a touch button in the second touch area of the aircraft control interface and acquire sensor data upon detection; to detect movement of the touch button following the touch operation within the second touch area or the aircraft control interface and acquire an analog joystick manipulation command according to the movement; and to obtain the aircraft manipulation command according to the sensor data and the analog joystick manipulation command.

As shown in FIG. 17, in one embodiment, the mobile terminal 1600 further includes a touch operation detection module 1605, an analog joystick manipulation command acquisition module 1606, and an analog joystick manipulation command transmission module 1607.

The touch operation detecting module 1605 is configured to detect a touch operation applied to the second touch area in the aircraft control interface. The second touch area is used to simulate the joystick operation.

Specifically, the second touch area is a specific area in the aircraft manipulation interface for receiving a touch operation that simulates a joystick operation. The touch operation may be, for example, a touch click operation, a touch double-click operation, a touch long-press operation, a slide operation, or a multi-touch operation. The touch operation applied to the second touch area and the touch operation applied to the first touch area implement different control modes respectively.

In one embodiment, the second touch area can surround the first touch area, and the second touch area does not overlap with the first touch area. The type of this touch operation may be the same as that of the touch operation detected in the first touch area. In other embodiments, the second touch area can be separated from the first touch area.

The analog joystick manipulation command acquisition module 1606 is configured to acquire an analog joystick manipulation command triggered by the touch operation.

The analog joystick manipulation command sending module 1607 is configured to send an analog joystick manipulation command to the aircraft.

Specifically, when the touch operation acts on different regions of the second touch area, different analog joystick manipulation commands can be triggered respectively. Four main directions, such as up, down, left, and right, may be defined in the second touch area, so that the analog joystick manipulation command acquisition module 1606 can trigger the corresponding analog joystick manipulation command according to the position of the touch point of the touch operation relative to the four main directions. The analog joystick manipulation command does not conflict with the aircraft manipulation command. Preferably, the analog joystick manipulation command is used to manipulate the vertical movement of the aircraft and the change in attitude of the aircraft, while the aircraft manipulation command is used to manipulate the movement of the aircraft in various directions along the horizontal plane.

Referring to FIG. 8, if the touch point of the touch operation acts on a main direction of the second touch area 801, as shown by the four gestures 802, 803, 804, and 805 in FIG. 8, the mobile terminal may trigger the analog joystick manipulation command corresponding to that main direction and send it to the aircraft, and the aircraft can then perform an ascending, descending, left-turning, or right-turning motion according to the received analog joystick manipulation command.

If the touch operation falls on a position other than a main direction in the second touch area, the analog joystick manipulation command acquisition module 1606 can trigger the corresponding combined analog joystick manipulation command according to the components of the touch operation mapped onto the main directions. The analog joystick manipulation command sending module 1607 then sends the combined analog joystick manipulation command to the aircraft, and the aircraft can perform a climbing left turn, climbing right turn, descending left turn, or descending right turn according to the received combined command.
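As an illustrative sketch only (function and parameter names are assumptions, not from the patent), the decomposition of a touch point in the second touch area into main-direction components might look like the following, where the vertical component drives ascent/descent and the horizontal component drives left/right rotation:

```python
import math

def joystick_command(touch_x, touch_y, center_x, center_y, dead_zone=10.0):
    """Map a touch point to (ascend, rotate) components in [-1, 1].

    A point directly above the center yields a pure ascend command;
    off-axis points yield combined commands such as a climbing left turn.
    Screen y grows downward, so dy is inverted.
    """
    dx = touch_x - center_x
    dy = center_y - touch_y
    dist = math.hypot(dx, dy)
    if dist < dead_zone:
        # Touches too close to the center trigger no command.
        return {"ascend": 0.0, "rotate": 0.0}
    # Normalize into the two main-direction components.
    return {"ascend": dy / dist, "rotate": dx / dist}
```

A touch exactly on the "up" main direction produces only an ascend component; a diagonal touch produces equal ascend and rotate components, i.e. a combined command.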

In this embodiment, both touch operations applied to the first touch area and touch operations applied to the second touch area of the aircraft control interface are detected, so that different control modes of the aircraft are implemented according to combinations of the detection results. This makes the aircraft's control modes more diverse, and handling of the aircraft more flexible and convenient.

In one embodiment, the sensor data processing module 1603 is specifically configured to: if the touch point of a fourth touch operation is detected acting on the first touch area, acquire sensor data and obtain an aircraft manipulation command based on at least the sensor data, stopping when the touch point disappears. The mobile terminal 1600 also includes a joystick manipulation simulation module (not shown), configured to, when the touch point moves to the second touch area in the aircraft control interface, trigger an analog joystick manipulation command according to the position of the touch point in the second touch area and send it to the aircraft; the second touch area is used to simulate joystick operation. The joystick manipulation simulation module may include the above-described touch operation detection module 1605, analog joystick manipulation command acquisition module 1606, and analog joystick manipulation command sending module 1607.
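The control flow of this embodiment can be sketched as a small state machine; the class, geometry, and method names below are illustrative assumptions, not part of the patent. A touch held in the first touch area keeps sensor control on, sliding into the surrounding second touch area triggers joystick commands, and lifting the finger stops sensor control:

```python
class AircraftControlInterface:
    """Toy model: inner disc = first touch area, surrounding ring = second."""

    def __init__(self):
        self.sensor_mode_on = False
        self.sent_commands = []

    def in_first_area(self, point):
        x, y = point
        return x * x + y * y <= 50 ** 2          # assumed inner-disc radius

    def in_second_area(self, point):
        x, y = point
        return 50 ** 2 < x * x + y * y <= 120 ** 2  # assumed ring radius

    def on_touch_move(self, point):
        if self.in_first_area(point):
            self.sensor_mode_on = True           # sensor data drives the aircraft
        elif self.in_second_area(point):
            # Position in the ring triggers an analog joystick command.
            self.sent_commands.append(("joystick", point))
        return self.sensor_mode_on

    def on_touch_up(self):
        self.sensor_mode_on = False              # touch point disappeared
```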

As shown in FIG. 18, in one embodiment, the sensor data processing module 1603 includes an initial state determination module 1603a, a subsequent state determination module 1603b, and an aircraft manipulation command generation module 1603c.

The initial state determining module 1603a is configured to determine an initial state of the mobile terminal where the sensor is located according to the obtained initial sensor data.

Specifically, the sensor data includes data reflecting at least one of the posture and the motion of the mobile terminal. The initial sensor data is the sensor data first received by the mobile terminal after entering the sensor control mode, and is used to determine the current state of the mobile terminal, which is defined as the initial state. The initial state includes the posture state and the motion state of the mobile terminal, where the posture state includes the tilted part, tilt direction, and tilt angle of the mobile terminal, and the motion state includes the motion speed, motion acceleration, and motion direction.

The initial state determination module 1603a may determine the initial state of the mobile terminal in three-dimensional space according to a fixed three-dimensional reference coordinate system of the mobile terminal. The fixed reference coordinate system is a three-dimensional reference coordinate system including three mutually perpendicular axes: two axes may be parallel to the display screen of the mobile terminal, and the remaining axis perpendicular to the display screen. The motion parameters include at least one of a movement direction, a movement range, and a movement speed. The initial state determined using the fixed reference coordinate system can accurately reflect the initial state of the mobile terminal in the three-dimensional space represented by that coordinate system.
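Under the axis convention just described (x and y parallel to the display, z perpendicular to it), a minimal sketch of estimating the terminal's tilt from a gravity reading might look like this; the function name and sign conventions are assumptions for illustration:

```python
import math

def attitude_from_gravity(gx, gy, gz):
    """Estimate (pitch, roll) in degrees from a gravity vector in m/s^2.

    Assumes x/y are parallel to the display and z is perpendicular to it;
    a device lying flat reads roughly (0, 0, 9.81) and gives zero tilt.
    """
    pitch = math.degrees(math.atan2(-gx, math.hypot(gy, gz)))
    roll = math.degrees(math.atan2(gy, gz))
    return pitch, roll
```

The tuple returned for the first sensor sample after entering sensor control mode would serve as the posture part of the initial state.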

The subsequent state determining module 1603b is configured to determine a subsequent state of the mobile terminal according to the acquired sensor data subsequent to the sensor data.

Specifically, after determining the initial state, the subsequent state determination module 1603b continues to acquire subsequent sensor data and determines the subsequent state of the mobile terminal from it. The subsequent state includes the posture state and the motion state of the mobile terminal, where the posture state includes the tilted part, tilt direction, and tilt angle of the mobile terminal, and the motion state includes the motion speed, motion acceleration, and motion direction.

The aircraft maneuver instruction generation module 1603c is configured to generate an aircraft maneuver instruction according to a change of the subsequent state with respect to the initial state.

Specifically, the aircraft manipulation command generation module 1603c compares the subsequent state against the initial state and generates an aircraft manipulation command according to the amount of change of the subsequent state relative to the initial state. For example, if the initial state of the mobile terminal is that its lower-left corner is tilted downward by 15°, and the subsequent state is that the lower-left corner has returned from the 15° tilt to level, then the lower-left corner has moved 15° in the opposite direction, and the mobile terminal generates an aircraft manipulation command based on this amount of change.
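The key point is that the command derives from the change relative to the initial state rather than from the absolute posture. A hedged sketch, with an assumed threshold and an assumed command vocabulary:

```python
def command_from_change(initial_roll, subsequent_roll, threshold=5.0):
    """Map the roll-angle change (degrees) since the initial state to a command.

    threshold and the command names are illustrative assumptions.
    """
    delta = subsequent_roll - initial_roll
    if abs(delta) < threshold:
        return "hold"                      # change too small to act on
    return "roll_right" if delta > 0 else "roll_left"

# The example from the text: the lower-left corner starts tilted 15 degrees
# and returns to level, i.e. a 15-degree change in the opposite direction,
# which produces a command even though the terminal now sits level.
```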

In this embodiment, the user can turn on the sensor control mode with a touch operation on the first touch area while the mobile terminal is in an arbitrary posture. The mobile terminal determines the initial state from the initial sensor data, then determines subsequent states from subsequent sensor data, and generates aircraft manipulation commands based on the change of each subsequent state relative to the initial state. In this way, the user does not have to hold the mobile terminal horizontally to control the aircraft, making control more convenient and precise.

In one embodiment, the sensor data is from at least one of a direction sensor, a gravity sensor, an acceleration sensor, a light sensor, an electronic compass, a distance sensor, a three-axis gyro sensor, a temperature sensor, and a pressure sensor.

As shown in FIG. 19, in one embodiment, the mobile terminal 1600 further includes: a preset automatic control mode determination module 1608, an aircraft combination manipulation command reading module 1609, and an aircraft combination manipulation command sending module 1610.

The preset automatic control mode determining module 1608 is configured to detect a selection instruction for the preset automatic control mode icon in the aircraft control interface; and determine a corresponding preset automatic control mode according to the selection instruction.

Specifically, the preset automatic control mode determination module 1608 can display a plurality of preset automatic control mode icons in the aircraft control interface, such as icons 606, 608, and 610 in FIG. When the user clicks an icon, the corresponding selection instruction is triggered. A preset automatic control mode is an automatic control method that uses predefined parameters to control the aircraft to perform predefined actions. The preset automatic control mode corresponding to the icon selected by the selection instruction is taken as the preset automatic control mode determined according to the selection instruction.

The aircraft combination manipulation command reading module 1609 is configured to read the aircraft combination manipulation command associated with the determined preset automatic control mode.

The aircraft combination manipulation command sending module 1610 is configured to send the aircraft combination manipulation command to the aircraft, so that the aircraft sequentially performs a corresponding series of actions according to the aircraft combination manipulation command.

Specifically, each preset automatic control mode stored on the mobile terminal is pre-associated with a corresponding aircraft combination manipulation command. The aircraft combination manipulation command reading module 1609 reads the aircraft combination manipulation command, and after the aircraft combination manipulation command sending module 1610 sends it to the aircraft, the aircraft performs a series of actions according to the command, so that the aircraft automatically changes from its current state to the target state specified by the preset automatic control mode.

In one embodiment, the preset automatic control mode includes at least one of an in-situ landing mode, a return-to-preset-location landing mode, an in-flight emergency hover mode, and a follow-locked-target flight mode.

If the determined preset automatic control mode is the in-situ landing mode, the aircraft can automatically perform a series of actions of stopping horizontal movement, gradually reducing the flight altitude, and stopping the rotors after reaching the ground, thereby automatically completing the in-situ landing flight task.

If the determined preset automatic control mode is the return-to-preset-location landing mode, the aircraft can automatically perform a series of actions of acquiring the preset location coordinates, flying to the preset location coordinates, stopping horizontal movement, gradually reducing the flight altitude, and stopping the rotors after reaching the ground, thereby automatically completing the flight task of returning to the preset location and landing.

If the determined preset automatic control mode is the in-flight emergency hover mode, the aircraft can automatically perform a series of actions of stopping horizontal movement and maintaining the flight altitude, thereby completing the emergency in-flight hover task.

If the determined preset automatic control mode is the follow-locked-target flight mode, the aircraft can automatically perform a series of actions of acquiring the locked target, flying to within a preset distance of the locked target, and maintaining that distance, thereby automatically completing the follow-locked-target flight task.
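The four modes above each associate a mode with an ordered series of actions. A minimal sketch of that association (mode keys and action names are illustrative assumptions, not identifiers from the patent):

```python
# Each preset automatic control mode is pre-associated with the ordered
# action sequence its combination manipulation command encodes.
PRESET_MODES = {
    "in_situ_landing": [
        "stop_horizontal_movement", "reduce_altitude_gradually", "stop_rotors",
    ],
    "return_preset_landing": [
        "acquire_preset_coordinates", "fly_to_preset_coordinates",
        "stop_horizontal_movement", "reduce_altitude_gradually", "stop_rotors",
    ],
    "emergency_hover": [
        "stop_horizontal_movement", "maintain_altitude",
    ],
    "follow_locked_target": [
        "acquire_locked_target", "fly_to_preset_distance", "maintain_distance",
    ],
}

def combined_command(mode):
    """Read the action sequence associated with the determined preset mode."""
    return list(PRESET_MODES[mode])
```

On receiving the combined command, the aircraft would execute the returned actions in order, moving from its current state to the mode's target state.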

In this embodiment, the user can quickly control the aircraft by using a preset automatic control mode, so that the aircraft automatically completes the corresponding flight task, which improves the convenience of operation. The in-situ landing mode, the return-to-preset-location landing mode, and the in-flight emergency hover mode enable emergency risk avoidance or aircraft recovery. With the follow-locked-target flight mode, the aircraft flies automatically after the target is locked, so one user can manipulate multiple aircraft simultaneously.

One of ordinary skill in the art can understand that all or part of the processes of the foregoing embodiments can be completed by a computer program instructing related hardware. The program can be stored in a computer readable storage medium and, when executed, may include the flows of the embodiments of the methods described above. The storage medium may be a non-volatile storage medium such as a magnetic disk, an optical disc, a read-only memory (ROM), or a random access memory (RAM).

The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as a combination of these technical features contains no contradiction, it should be considered to be within the scope of this specification.

The above-described embodiments merely illustrate several implementations of the present invention, and their description is relatively specific and detailed, but should not be construed as limiting the scope of the invention. It should be noted that a number of variations and modifications may be made by those skilled in the art without departing from the spirit and scope of the invention. Therefore, the protection scope of the invention should be determined by the appended claims.

Claims (20)

  1. An aircraft handling method includes:
    Display the aircraft control interface;
    Detecting a touch operation acting on the aircraft control interface;
    If the touch operation is detected, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data; and
    The aircraft maneuver command is sent to the aircraft.
  2. The method according to claim 1, wherein if the touch operation is detected, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    If a first touch operation for turning on the sensor control mode is detected in a first touch area of the aircraft control interface, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data, until a second touch operation applied to the first touch area is detected.
  3. The method according to claim 1, wherein if the touch operation is detected, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    Starting from detection of a third touch operation, if a timer reaches a preset duration and the third touch operation remains applied to a first touch area in the aircraft control interface, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data, until the third touch operation stops.
  4. The method according to claim 1, wherein if the touch operation is detected, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    Detecting a pressing touch operation acting on a touch button in a second touch area of the aircraft control interface, and acquiring sensor data upon detection;
    Detecting a movement of the touch button following the pressing touch operation in the second touch area or the aircraft control interface, and obtaining an analog joystick manipulation command according to the movement; and
    Obtaining the aircraft manipulation command based on the sensor data and the analog joystick manipulation command.
  5. The method according to claim 1, wherein if the touch operation is detected, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    If it is detected that the touch point of a fourth touch operation acts on a first touch area in the aircraft control interface, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data, until the touch point disappears;
    The method further includes:
    When the touch point moves to a second touch area in the aircraft control interface, triggering an analog joystick manipulation command according to the position of the touch point in the second touch area and sending it to the aircraft; the second touch area is used to simulate a joystick operation.
  6. The method of claim 1 wherein said obtaining aircraft control commands based on at least said sensor data comprises:
    Determining an initial state of the mobile terminal where the sensor is located according to the obtained initial sensor data;
    Determining a subsequent state of the mobile terminal according to subsequent sensor data acquired after the initial sensor data; and
    An aircraft maneuver command is generated based on the change in the subsequent state relative to the initial state.
  7. The method according to claim 1, wherein the sensor data is from at least one of a direction sensor, a gravity sensor, an acceleration sensor, a light sensor, an electronic compass, a distance sensor, a three-axis gyro sensor, a temperature sensor, and a pressure sensor.
  8. The method of claim 1 further comprising:
    Detecting a selection instruction for a preset automatic control mode icon in the aircraft control interface;
    Determining a corresponding preset automatic control mode according to the selection instruction;
    Reading the aircraft combination manipulation command associated with the determined preset automatic control mode; and
    The aircraft combination manipulation command is sent to the aircraft to cause the aircraft to sequentially perform a corresponding series of actions in accordance with the aircraft combination manipulation command.
  9. The method according to claim 8, wherein the preset automatic control mode comprises at least one of an in-situ landing mode, a return-to-preset-location landing mode, an in-flight emergency hover mode, and a follow-locked-target flight mode.
  10. A mobile terminal comprising a memory and a processor, wherein the memory stores computer readable instructions, wherein when the computer readable instructions are executed by the processor, the processor performs the following steps:
    Display the aircraft control interface;
    Detecting a touch operation acting on the aircraft control interface;
    If the touch operation is detected, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data; and
    The aircraft maneuver command is sent to the aircraft.
  11. The mobile terminal according to claim 10, wherein if the touch operation is detected, acquiring sensor data and obtaining an aircraft manipulation command based on at least the sensor data comprises:
    If a first touch operation for turning on the sensor control mode is detected in a first touch area of the aircraft control interface, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data, until a second touch operation applied to the first touch area is detected.
  12. The mobile terminal according to claim 10, wherein if the touch operation is detected, acquiring sensor data and obtaining an aircraft manipulation command based on at least the sensor data comprises:
    Starting from detection of a third touch operation, if a timer reaches a preset duration and the third touch operation remains applied to a first touch area in the aircraft control interface, acquiring sensor data, and obtaining the aircraft manipulation command based on at least the sensor data, until the third touch operation stops.
  13. The mobile terminal according to claim 10, wherein if the touch operation is detected, acquiring sensor data and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    Detecting a pressing touch operation of a touch button acting on the second touch area of the aircraft control interface, and acquiring sensor data when detecting;
    Detecting a movement of the touch button following the pressing touch operation in the second touch area or the aircraft control interface, and obtaining an analog joystick manipulation command according to the movement; and
    An aircraft maneuver command is obtained based on the sensor data and an analog joystick manipulation command.
  14. The mobile terminal according to claim 10, wherein if the touch operation is detected, acquiring sensor data and obtaining an aircraft manipulation command based on at least the sensor data comprises:
    If it is detected that the touch point of a fourth touch operation acts on a first touch area in the aircraft control interface, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data, until the touch point disappears;
    The computer readable instructions, when executed by the processor, further cause the processor to perform the following steps:
    When the touch point moves to a second touch area in the aircraft control interface, triggering an analog joystick manipulation command according to the position of the touch point in the second touch area and sending it to the aircraft; the second touch area is used to simulate a joystick operation.
  15. The mobile terminal according to claim 10, wherein the obtaining the aircraft manipulation command based on at least the sensor data comprises:
    Determining an initial state of the mobile terminal where the sensor is located according to the obtained initial sensor data;
    Determining a subsequent state of the mobile terminal according to subsequent sensor data acquired after the initial sensor data; and
    An aircraft maneuver command is generated based on the change in the subsequent state relative to the initial state.
  16. The mobile terminal according to claim 10, wherein the sensor data is from at least one of a direction sensor, a gravity sensor, an acceleration sensor, a light sensor, an electronic compass, a distance sensor, a three-axis gyro sensor, a temperature sensor, and a pressure sensor.
  17. The mobile terminal according to claim 10, wherein the computer readable instructions, when executed by the processor, further cause the processor to perform the following steps:
    Detecting a selection instruction for a preset automatic control mode icon in the aircraft control interface;
    Determining a corresponding preset automatic control mode according to the selection instruction;
    Reading the aircraft combination manipulation command associated with the determined preset automatic control mode; and
    The aircraft combination manipulation command is sent to the aircraft to cause the aircraft to sequentially perform a corresponding series of actions in accordance with the aircraft combination manipulation command.
  18. The mobile terminal according to claim 17, wherein the preset automatic control mode comprises at least one of an in-situ landing mode, a return-to-preset-location landing mode, an in-flight emergency hover mode, and a follow-locked-target flight mode.
  19. One or more computer readable non-volatile storage media storing computer readable instructions, when executed by one or more processors, cause the one or more processors to perform the steps of:
    Display the aircraft control interface;
    Detecting a touch operation acting on the aircraft control interface;
    If the touch operation is detected, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data; and
    The aircraft maneuver command is sent to the aircraft.
  20. The computer readable non-volatile storage medium according to claim 19, wherein if the touch operation is detected, acquiring sensor data and obtaining the aircraft manipulation command based on at least the sensor data comprises:
    If a first touch operation for turning on the sensor control mode is detected in a first touch area of the aircraft control interface, acquiring sensor data, and obtaining an aircraft manipulation command based on at least the sensor data, until a second touch operation applied to the first touch area is detected.
PCT/CN2016/083287 2015-12-10 2016-05-25 Aircraft control method, mobile terminal and storage medium WO2017096762A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201510919245.9 2015-12-10
CN201510919245.9A CN105549604B (en) 2015-12-10 2015-12-10 aircraft control method and device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US15/957,749 US10587790B2 (en) 2015-11-04 2018-04-19 Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
US15/959,032 US10623621B2 (en) 2015-11-04 2018-04-20 Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
US15/959,007 US10674062B2 (en) 2015-11-04 2018-04-20 Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
US15/959,014 US20180288304A1 (en) 2015-11-04 2018-04-20 Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle

Related Parent Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/082925 Continuation-In-Part WO2017075973A1 (en) 2015-11-04 2016-05-20 Method for providing interactive drone control interface, portable electronic apparatus and storage medium
PCT/CN2016/084973 Continuation-In-Part WO2017107396A1 (en) 2015-12-23 2016-06-06 Multimedia synchronization method and system, aerial vehicle, and storage medium

Related Child Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2016/082925 Continuation-In-Part WO2017075973A1 (en) 2015-11-04 2016-05-20 Method for providing interactive drone control interface, portable electronic apparatus and storage medium
PCT/CN2016/084973 Continuation-In-Part WO2017107396A1 (en) 2015-12-23 2016-06-06 Multimedia synchronization method and system, aerial vehicle, and storage medium

Publications (1)

Publication Number Publication Date
WO2017096762A1 true WO2017096762A1 (en) 2017-06-15

Family

ID=55828842

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/083287 WO2017096762A1 (en) 2015-12-10 2016-05-25 Aircraft control method, mobile terminal and storage medium

Country Status (2)

Country Link
CN (1) CN105549604B (en)
WO (1) WO2017096762A1 (en)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10587790B2 (en) 2015-11-04 2020-03-10 Tencent Technology (Shenzhen) Company Limited Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
CN105549604B (en) * 2015-12-10 2018-01-23 腾讯科技(深圳)有限公司 aircraft control method and device
CN105867420B (en) * 2016-05-16 2020-06-02 深圳市智璟科技有限公司 Rapid mode switching system and method applied to unmanned aerial vehicle
CN105867363A (en) * 2016-05-26 2016-08-17 北京飞旋天行科技有限公司 UAV (unmanned aerial vehicle) control method, equipment and system
CN105974929A (en) * 2016-06-11 2016-09-28 深圳市哈博森科技有限公司 Unmanned plane control method based on operation and control of intelligent device
WO2018018378A1 (en) * 2016-07-25 2018-02-01 深圳市大疆创新科技有限公司 Method, device and system for controlling movement of moving object
CN106020214A (en) * 2016-08-11 2016-10-12 上海与德通讯技术有限公司 Unmanned aerial vehicle interaction control device and system
CN106155068A (en) * 2016-08-11 2016-11-23 上海与德通讯技术有限公司 Unmanned plane interaction control device and system
CN106444830A (en) * 2016-09-23 2017-02-22 重庆零度智控智能科技有限公司 Braking method and device for flying device
WO2018053845A1 (en) * 2016-09-26 2018-03-29 深圳市大疆创新科技有限公司 Method and system for controlling unmanned aerial vehicle, and user terminal
WO2018058309A1 (en) * 2016-09-27 2018-04-05 深圳市大疆创新科技有限公司 Control method, control device, electronic device, and aerial vehicle control system
CN106603142A (en) * 2016-10-19 2017-04-26 广东容祺智能科技有限公司 Communication flight security protection system of unmanned aerial vehicle and operation method thereof
CN106527458B (en) * 2016-11-24 2019-04-02 腾讯科技(深圳)有限公司 A kind of the salto implementation method and device of aircraft
CN107000839B (en) * 2016-12-01 2019-05-03 深圳市大疆创新科技有限公司 The control method of unmanned plane, device, equipment and unmanned plane control system
WO2018119981A1 (en) * 2016-12-30 2018-07-05 深圳市大疆创新科技有限公司 Control method, control device and control system for movable device
US10168704B2 (en) 2017-06-05 2019-01-01 Hanzhou Zero Zero Technology Co., Ltd. System and method for providing easy-to-use release and auto-positioning for drone applications
CN107577245A (en) * 2017-09-18 2018-01-12 深圳市道通科技股份有限公司 A kind of aircraft parameters establishing method and device and computer-readable recording medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110246015A1 (en) * 2010-03-31 2011-10-06 Massachusetts Institute Of Technology System and Method for Providing Perceived First-Order Control of an Unmanned Vehicle
CN102266672A (en) * 2010-03-11 2011-12-07 鹦鹉股份有限公司 Method and device for remote control of a drone, in particular a rotary-wing drone
CN103426282A (en) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 Remote control method and terminal
US20140371954A1 (en) * 2011-12-21 2014-12-18 Kt Corporation Method and system for remote control, and remote-controlled user interface
CN105549604A (en) * 2015-12-10 2016-05-04 腾讯科技(深圳)有限公司 Aircraft control method and apparatus

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2938774A1 (en) * 2008-11-27 2010-05-28 Parrot Device for controlling a drone
FR2967321B1 (en) * 2010-11-05 2013-06-14 Parrot Method of transmitting controls and a video stream between a drone and a remote control through a wireless network type link.
FR2985329B1 (en) * 2012-01-04 2015-01-30 Parrot Method for intuitive control of a drone using a remote control apparatus

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102266672A (en) * 2010-03-11 2011-12-07 鹦鹉股份有限公司 Method and device for remote control of a drone, in particular a rotary-wing drone
US20110246015A1 (en) * 2010-03-31 2011-10-06 Massachusetts Institute Of Technology System and Method for Providing Perceived First-Order Control of an Unmanned Vehicle
US20140371954A1 (en) * 2011-12-21 2014-12-18 Kt Corporation Method and system for remote control, and remote-controlled user interface
CN103426282A (en) * 2013-07-31 2013-12-04 深圳市大疆创新科技有限公司 Remote control method and terminal
CN105549604A (en) * 2015-12-10 2016-05-04 腾讯科技(深圳)有限公司 Aircraft control method and apparatus

Also Published As

Publication number Publication date
CN105549604A (en) 2016-05-04
CN105549604B (en) 2018-01-23

Similar Documents

Publication Publication Date Title
US9891621B2 (en) Control of an unmanned aerial vehicle through multi-touch interactive visualization
US10185318B2 (en) Return path configuration for remote controlled aerial vehicle
US9789612B2 (en) Remotely operating a mobile robot
US20190088157A1 (en) Systems and methods for flight simulation
US10156854B2 (en) UAV and UAV landing control device and method
US10466695B2 (en) User interaction paradigms for a flying digital assistant
US9586682B2 (en) Unmanned aerial vehicle control apparatus and method
Lugo et al. Framework for autonomous on-board navigation with the AR. Drone
US10469757B2 (en) Flying camera and a system
US20190094851A1 (en) Systems and methods for controlling an unmanned aerial vehicle
US20190042003A1 (en) Controller with Situational Awareness Display
CN106444795B (en) Method and system for assisting take-off of movable object
US9846429B2 (en) Systems and methods for target tracking
US9493232B2 (en) Remote control method and terminal
US20180158197A1 (en) Object tracking by an unmanned aerial vehicle using visual sensors
EP3098693A1 (en) Eyerwear-type terminal and method for controlling the same
EP2959352B1 (en) Remote control method and terminal
US9616993B1 (en) Simplified auto-flight system coupled with a touchscreen flight control panel
US20180109767A1 (en) Unmanned aerial vehicle sensor activation and correlation system
US20140324253A1 (en) Autonomous control of unmanned aerial vehicles
US10031518B1 (en) Feedback to facilitate control of unmanned aerial vehicles (UAVs)
US8918230B2 (en) Teleoperation of unmanned ground vehicle
US10168700B2 (en) Control of an aerial drone using recognized gestures
EP2672226A2 (en) Route display and review
WO2017075964A1 (en) Unmanned aerial vehicle photographing control method, unmanned aerial vehicle photographing method, mobile terminal and unmanned aerial vehicle

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16871967

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 31/10/2018)

122 Ep: pct application non-entry in european phase

Ref document number: 16871967

Country of ref document: EP

Kind code of ref document: A1