WO2017113648A1 - Somatosensory remote controller, somatosensory remote control flight system and method, and headless control method - Google Patents

Somatosensory remote controller, somatosensory remote control flight system and method, and headless control method

Info

Publication number
WO2017113648A1
WO2017113648A1 (PCT/CN2016/086473)
Authority
WO
WIPO (PCT)
Prior art keywords
remote controller
drone
control
flight
somatosensory
Prior art date
Application number
PCT/CN2016/086473
Other languages
English (en)
French (fr)
Inventor
郑卫锋
Original Assignee
北京臻迪机器人有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CN201511032422.8A (CN105469579B)
Priority claimed from CN201620268295.5U (CN205608991U)
Priority claimed from CN201610299333.8A (CN107346141A)
Priority claimed from CN201610297215.3A (CN107346140B)
Application filed by 北京臻迪机器人有限公司
Priority to EP16880456.5A (EP3399380B1)
Priority to US16/067,557 (US11327477B2)
Publication of WO2017113648A1

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S19/00Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/01Satellite radio beacon positioning systems transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/13Receivers
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0022Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the communication link
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0033Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by having the operator tracking the vehicle either by direct line of sight or via one or more cameras located remotely from the vehicle
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/005Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with signals other than visual, e.g. acoustic, haptic
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10Simultaneous control of position or course in three dimensions
    • G05D1/101Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • GPHYSICS
    • G08SIGNALLING
    • G08CTRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U20/00Constructional aspects of UAVs
    • B64U20/80Arrangement of on-board electronics, e.g. avionics systems or wiring
    • B64U20/87Mounting of imaging devices, e.g. mounting of gimbals
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • B64U2101/30UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/10UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64U2201/104UAVs characterised by their flight controls autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls

Definitions

  • the present invention relates to the field of electronic device technologies, and in particular, to a somatosensory remote controller, a somatosensory remote control flight system and method, and a headless control method.
  • a drone, or "unmanned aerial vehicle", is an aircraft with no pilot on board, operated by radio remote control equipment and its own program control devices. There is no cockpit, but equipment such as an autopilot and program control devices is installed. Personnel on the ground, on a ship, or at the remote control station of a parent aircraft can track, locate, remotely control, telemeter, and digitally transmit to the drone through radar and other equipment. A drone can take off like a conventional airplane under radio remote control, be launched with a booster rocket, or be carried into the air by a parent aircraft. For recovery, it can land automatically in the same way as a conventional aircraft, or be recovered by remote control using a parachute or arresting gear, and it can be used multiple times. Drones are widely used in aerial reconnaissance, surveillance, communication, anti-submarine warfare, electronic jamming, and so on. To make drones easier to operate, the somatosensory remote controller was designed.
  • the somatosensory remote controller in the prior art includes a remote controller body, and a sensor and a controller disposed on the remote controller body; the sensor is electrically connected to the controller, acquires the moving direction of the remote controller body, and transmits it to the controller, and the controller controls the flight of the drone according to that direction of movement.
  • however, the prior-art somatosensory remote controller has its center position at a fixed location, and the operator must first place the remote controller body at that center position before the drone can be controlled by the somatosensory remote controller.
  • the prior-art somatosensory remote controller therefore requires a high level of skill from the operator and is not convenient to control.
  • in addition, drones generally all have a nose. When the drone is relatively close, the user can see the nose of the drone with the naked eye. However, once the drone flies far away, beyond the user's line of sight, the user can no longer distinguish the direction of the drone's nose, the flight of the drone cannot be well controlled, and the drone may fly erratically, creating serious safety hazards.
  • the object of the present invention is to provide a somatosensory remote controller, a somatosensory remote control flight system and method, and a headless control method to solve the technical problem in the prior art that is inconvenient for the operator to control.
  • a somatosensory remote controller includes: a posture sensor, a controller, a first transmission module, and a remote controller body, the first transmission module supporting one or more of Bluetooth, wireless fidelity (WiFi), infrared, mobile network, and wired transmission; the posture sensor, the first transmission module, and the controller are all disposed on the remote controller body, and the posture sensor and the first transmission module are electrically connected to the controller; the posture sensor is used to acquire initial state information of the initial position of the remote controller body and movement information of the movement of the remote controller body, and to transmit them to the controller; the controller is configured to obtain a flight instruction according to the initial state information and the movement information, and to issue the flight instruction through the first transmission module.
  • the embodiment of the present invention further provides a somatosensory remote control flight system, including an onboard flight control system and a somatosensory remote controller as described above; the onboard flight control system is provided with a second transmission module, which is wirelessly connected to the first transmission module; the onboard flight control system is used to control the flight of the drone according to the flight instructions.
  • the embodiment of the invention further provides a somatosensory remote control flight method, which specifically comprises the following steps:
  • the posture sensor acquires movement information of the remote controller body and transmits the movement information to the controller;
  • the controller obtains a flight instruction according to the initial state information and the movement information;
  • the controller transmits the flight instruction to the onboard flight control system, and the onboard flight control system controls the flight of the drone.
  • the somatosensory remote controller uses the posture sensor to acquire initial state information of the initial position of the remote controller body and movement information of the remote controller body moving to a preset position, and the controller obtains a flight instruction based on the initial state information and the movement information.
  • the posture sensor acquires initial state information of the current position of the remote controller, and that position serves as the center position.
  • the flight instruction is then obtained based on the initial state information combined with the movement information.
  • with the somatosensory remote controller of the present invention, any position can be used as the center position, so the operator does not need to find a fixed center position; this reduces the skill required of the operator and facilitates control.
  • the embodiment of the present invention further provides a method for headless control, which can perform headless control on the unmanned aerial vehicle by using the somatosensory remote controller provided by the embodiment of the present invention.
  • the headless control method includes:
  • the somatosensory remote controller receives a headless control command;
  • the somatosensory remote controller calculates, in real time, the heading angle from the remote controller to the drone;
  • the somatosensory remote controller transmits the headless attitude control amount C to the drone, so that the drone adjusts its flight attitude according to the headless attitude control amount C.
  • after the user triggers the headless control key, when the user controls the flight of the drone with the somatosensory remote controller, the drone flies according to the user's orientation regardless of the direction of its nose; and since the system calculates in real time the heading angle from the somatosensory remote controller to the drone, when the user uses the somatosensory remote controller to command the drone to fly left or right, the drone will fly in a circle around the user, with the user as the center and the distance from the drone to the user as the radius.
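  • the headless geometry described above can be sketched in a short example: the heading angle from the controller to the drone is estimated from the two GPS fixes, and the user's stick input is rotated into that frame so that "forward" always means "away from the user". This is only an illustrative reconstruction; the function names, the flat-earth approximation, and the stick-input convention are assumptions, not taken from the patent.

```python
import math

def bearing_deg(ctrl_lat, ctrl_lon, drone_lat, drone_lon):
    """Approximate heading angle (degrees clockwise from north) from the
    controller to the drone, using a local flat-earth approximation that
    is reasonable over the short ranges involved."""
    d_north = math.radians(drone_lat - ctrl_lat)
    d_east = math.radians(drone_lon - ctrl_lon) * math.cos(math.radians(ctrl_lat))
    return math.degrees(math.atan2(d_east, d_north)) % 360.0

def headless_command(forward, right, heading_deg):
    """Rotate the user's stick input (forward/right in the user's frame)
    into north/east components along the controller-to-drone heading, so
    that 'forward' always moves the drone away from the user and
    'left'/'right' produce the circling motion described in the text."""
    h = math.radians(heading_deg)
    north = forward * math.cos(h) - right * math.sin(h)
    east = forward * math.sin(h) + right * math.cos(h)
    return north, east
```

With the drone due east of the user (heading 90°), a pure "forward" input maps to a pure eastward component, i.e. away from the user; a pure "right" input is tangential to the user-to-drone line, which yields the circular flight around the user.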
  • FIG. 1 is a schematic structural diagram of a somatosensory remote controller according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of a somatosensory remote controller according to another embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of a somatosensory remote control flight system according to an embodiment of the present invention.
  • FIG. 4 is a flowchart of a somatosensory remote control flight method according to an embodiment of the present invention.
  • FIG. 5 is a flowchart of a somatosensory remote control flight method according to another embodiment of the present invention.
  • FIG. 6 is a flowchart of a somatosensory remote control flight method according to another embodiment of the present invention.
  • FIG. 7 is a flowchart of a somatosensory remote control flight method according to still another embodiment of the present invention.
  • FIG. 8 is a general flowchart of a somatosensory remote control flight method according to an embodiment of the present invention.
  • Figure 9 is a flow chart showing a method of headless control of one embodiment of the present invention.
  • Figure 10 is a flow chart showing the development of step Sa of one embodiment of the present invention.
  • 25 - positioning module; 26 - barometer module; 27 - drone.
  • FIG. 1 is a schematic structural diagram of a somatosensory remote controller according to an embodiment of the present invention.
  • a somatosensory remote controller provided by this embodiment includes: a posture sensor 4, a controller 1, a first transmission module 2, and a remote controller body 3; the posture sensor 4, the first transmission module 2, and the controller 1 are all disposed on the remote controller body 3, and the posture sensor 4 and the first transmission module 2 are electrically connected to the controller 1;
  • the posture sensor 4 is used to acquire initial state information of the initial position of the remote controller body 3 and movement information of the movement of the remote controller body 3, and to transmit them to the controller 1;
  • the controller 1 is configured to obtain a flight instruction according to the initial state information and the movement information, and issue the flight instruction through the first transmission module 2.
  • the first transmission module 2 supports one or more of Bluetooth, wireless fidelity (WiFi), infrared, mobile network, and wired transmission.
  • the initial state information includes an angular velocity and an acceleration of the initial position; and the movement information includes an angular velocity and an acceleration of the remote controller body 3 moving to the preset position.
  • the type of the posture sensor 4 may be various.
  • the posture sensor 4 includes a gyroscope and an accelerometer, both disposed on the remote controller body 3 and electrically connected to the controller 1; the gyroscope and the accelerometer respectively obtain the angular velocity and acceleration of the remote controller body 3 at the initial position and the angular velocity and acceleration of the remote controller body 3 as it moves to the preset position; from these, the controller 1 obtains the flight instruction and transmits it to the airborne flight control system, which controls the flight of the drone 27.
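  • the patent does not state how the gyroscope and accelerometer readings are combined into an attitude estimate; one common approach on sensors of this class is a complementary filter, sketched below purely as an illustration (the function name, axis conventions, and `alpha` value are assumptions).

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into a pitch estimate.

    pitch_prev: previous pitch estimate, rad
    gyro_rate:  pitch angular velocity from the gyroscope, rad/s
    accel_x, accel_z: accelerometer axes used as a gravity reference, m/s^2
    """
    # Integrate the gyro for smooth short-term tracking ...
    pitch_gyro = pitch_prev + gyro_rate * dt
    # ... and use the gravity direction as a drift-free long-term reference.
    pitch_accel = math.atan2(accel_x, accel_z)
    # Blend the two: mostly gyro, slowly corrected toward the accelerometer.
    return alpha * pitch_gyro + (1.0 - alpha) * pitch_accel
```

Running this at each sensor sample keeps the gyro's responsiveness while the accelerometer term cancels the slow drift of pure integration.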
  • the posture sensor 4 is an MPU-6050.
  • the type of the first transmission module 2 may be various, for example Bluetooth, WiFi, or infrared.
  • optionally, the wireless transmission module is a 915 MHz wireless transmission module or a 5.8 GHz wireless transmission module.
  • the somatosensory remote controller acquires initial state information of the initial position of the remote controller body 3 and movement information of the remote controller body 3 moving to the preset position by using the posture sensor 4, and the controller 1 obtains the flight instruction according to the initial state information and the movement information.
  • the posture sensor 4 acquires initial state information of the current position of the remote controller, which is the center position.
  • the flight instruction is obtained based on the initial state information combined with the movement information. For example, starting from the initial position: when the somatosensory remote controller is tilted forward, the drone 27 can be controlled to move forward or pitch its nose down; when the somatosensory remote controller is tilted backward, the drone 27 can be controlled to move backward or pitch its nose up; when the somatosensory remote controller is tilted to the left, the drone 27 can be controlled to move left or roll left; when the somatosensory remote controller is tilted to the right, the drone 27 can be controlled to move right or roll right; and so on.
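  • the tilt-to-command mapping above can be sketched as a small dispatch function; the deadband threshold and the command names are illustrative assumptions, not specified in the patent.

```python
def tilt_to_command(d_pitch, d_roll, deadband=5.0):
    """Map the remote controller's tilt relative to its recorded center
    position (Euler-angle deltas in degrees) to drone flight commands.
    A small deadband around the center position avoids jitter from
    sensor noise when the controller is held still."""
    cmd = []
    if d_pitch > deadband:
        cmd.append("forward")
    elif d_pitch < -deadband:
        cmd.append("backward")
    if d_roll > deadband:
        cmd.append("right")
    elif d_roll < -deadband:
        cmd.append("left")
    return cmd or ["hover"]
```

Because the deltas are measured against whatever position was recorded as the center, the mapping works identically no matter where the operator activated the controller.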
  • in this way, any position can be used as the center position of the somatosensory remote controller, so the operator does not need to find a fixed center position; this reduces the skill required of the operator and facilitates control.
  • FIG. 2 is a schematic structural diagram of a somatosensory remote controller according to another embodiment of the present invention.
  • the remote controller body 3 is further provided with a somatosensory activation button 6, which is electrically connected to the controller 1; after the somatosensory activation button is pressed, the current position of the remote controller body is taken as the initial position.
  • when the operator begins operating with the somatosensory remote controller, pressing the somatosensory activation button 6 activates the controller 1, and the posture sensor 4 records the initial state information of the position of the remote controller body 3 at that moment, using that position as the center position.
  • the operator can manipulate the drone 27 by moving the remote controller body 3 only after pressing the somatosensory activation button 6, which reduces the possibility of erroneous operation.
  • the remote controller body 3 is further provided with a GPS locator 5 and a return button 19; the GPS locator 5 is electrically connected to the controller and is used to locate the position of the remote controller body; the return button 19 is electrically connected to the controller and is used to control the drone 27 to return according to the position given by the GPS locator.
  • the remote controller body 3 is further provided with a take-off/landing button 20, which is electrically connected to the controller 1 and is used to control the drone 27 to take off and land.
  • the remote controller body 3 is further provided with two gimbal tilt buttons, a first gimbal button 9 and a second gimbal button 10, which are respectively used to control the upward and downward tilt of the drone's gimbal.
  • the remote control body 3 is also provided with a custom button 7, and the user can customize the function of the button.
  • a plurality of optional functions can be provided for the user to select.
  • for example, the button can be configured as a follow-mode button for controlling the drone to move with the somatosensory remote controller, that is, after the follow mode is activated, when the somatosensory remote controller moves, the drone moves with it; or the button can be configured as a hover button for controlling the drone to hover.
  • the remote controller body 3 is also provided with a photo/video button 8, which is electrically connected to the controller 1 and is used to control photographing and video recording.
  • for example, pressing the button once can enter the photo function, and pressing it again can switch to the video function.
  • the function-key design of the somatosensory remote controller for the drone in this embodiment enables the operator to implement different functions with simple key presses, with each function realized by a single button; operation is simple and fast.
  • a power switch 14, forward and backward buttons, a throttle button, and the like can also be disposed on the remote controller body 3.
  • a battery 16 is further disposed in the remote controller body 3; the remote controller body 3 is further provided with a power switch 14, through which the battery 16 is connected to the controller 1.
  • a first three-color LED lamp 18 is connected to the battery 16.
  • the first three-color LED lamp 18 is used to display the charging state of the battery 16; the remote controller body 3 is further provided with a USB interface 15, which is electrically connected to the controller 1 and is used for firmware upgrades or for charging the battery 16.
  • the remote controller body 3 is further provided with a vibration motor 13 and an active buzzer 12, both of which are electrically connected to the controller 1.
  • the vibration motor 13 and the active buzzer 12 can be used in conjunction with the GPS locator 5 to alert the operator when the drone 27 deviates from the preset flight trajectory, or arrives at a preset destination or the like.
  • the vibration motor 13 vibrates every time the controller performs an action, or the active buzzer 12 emits a sound.
  • a second three-color LED lamp 11 is further disposed on the remote controller body 3 and is electrically connected to the controller 1 to display the flight state of the drone 27.
  • FIG. 3 is a schematic structural diagram of a somatosensory remote control flight system according to an embodiment of the present invention.
  • an embodiment of the present invention further provides a somatosensory remote control flight system, including an onboard flight control system 21 and the above-described somatosensory remote controller; the onboard flight control system 21 is provided with a second transmission module 22, The second transmission module 22 is wirelessly connected to the first transmission module 2; the onboard flight control system 21 is configured to control the drone 27 to fly according to flight instructions.
  • the airborne flight control system 21 further includes a positioning module 25, an attitude reference system 24, a barometer module 26, and a microprocessor 23; the microprocessor 23 is configured to acquire flight information of the drone 27 through the positioning module 25, the attitude reference system 24, and the barometer module 26, and to transmit the flight information to the somatosensory remote controller through the second transmission module 22.
  • the onboard flight control system 21 transmits the flight information to the somatosensory remote controller, and the somatosensory remote controller can adjust the flight attitude of the drone 27 based on the flight information, so that the drone 27 can fly beyond the line of sight.
  • FIG. 4 is a flowchart of a somatosensory remote control flight method according to an embodiment of the present invention. As shown in FIG. 4, an embodiment of the present invention further provides a somatosensory remote control flight method, which specifically includes the following steps:
  • Step 100: when it is detected that the somatosensory remote control flight mode is activated, the current position of the remote controller body 3 is taken as the initial position, and the initial state information of the remote controller body 3 is acquired by the posture sensor 4 and transmitted to the controller 1;
  • Step 200: when the remote controller body 3 moves, the posture sensor 4 acquires the movement information of the remote controller body 3 and transmits it to the controller 1;
  • Step 300: the controller 1 obtains a flight instruction according to the initial state information and the movement information;
  • Step 400: the controller 1 transmits the flight instruction to the onboard flight control system 21, and the onboard flight control system 21 controls the drone 27 to fly.
  • the somatosensory remote control flight method of this embodiment uses the posture sensor 4 to acquire initial state information of the current position of the remote controller body 3 and movement information of the remote controller body 3 moving to the preset position, and the controller 1 obtains the flight instruction according to the initial state information and the movement information.
  • the posture sensor 4 acquires initial state information of the current position of the remote controller, which is the center position.
  • the flight instruction is obtained based on the initial state information combined with the movement information. In this way, the remote controller body 3 can take any position as the center position of the somatosensory remote controller; the operator does not need to find a fixed center position, which reduces the skill required of the operator and facilitates control.
  • FIG. 5 is a flowchart of a somatosensory remote flight method according to another embodiment of the present invention. As shown in FIG. 5, based on the foregoing embodiment, further, in step 300, the controller 1 obtains the flight instruction according to the initial state information and the movement information, and specifically includes the following steps:
  • Step 310: calculating the spatial coordinates of the initial position according to the initial state information, and recording them;
  • Step 320: calculating the spatial coordinates of the preset position according to the movement information, and recording them;
  • Step 330: obtaining the flight instruction according to the spatial coordinates of the initial position and the spatial coordinates of the preset position.
  • the controller 1 calculates the spatial coordinates of the somatosensory remote controller at the initial position, that is, the center position, according to the acquired initial state information, and then calculates the spatial coordinates of the preset position according to the movement information; by tracking the movement of the remote controller body 3 in spatial coordinates, the movement track of the remote controller body 3 in the spatial coordinate system is accurately acquired, thereby realizing precise control of the drone 27.
  • FIG. 6 is a flowchart of a somatosensory remote flight method according to still another embodiment of the present invention. As shown in FIG. 6, on the basis of the foregoing embodiment, further, in step 330, according to the spatial coordinates of the initial position and the spatial coordinates of the preset position, the flight instruction is obtained, which specifically includes the following steps:
  • Step 331: record the quaternion corresponding to the spatial coordinates of the initial position, and calculate the corresponding initial direction cosine matrix DCM_init;
  • Step 332: record the quaternion corresponding to the spatial coordinates of the preset position, and calculate the corresponding current direction cosine matrix DCM_cur;
  • Step 333: multiply the transpose of the initial direction cosine matrix by the current direction cosine matrix to obtain the moving direction cosine matrix DCM_relative of the current direction relative to the initial direction;
  • Step 334: use the moving direction cosine matrix to calculate the Euler angles of the current remote controller body 3 relative to the initial position;
  • Step 335: use the Euler angles to obtain the flight instruction.
  • The quaternion is a mathematical concept discovered by the Irish mathematician William Rowan Hamilton in 1843. Quaternion multiplication does not obey the commutative law; explicitly, the quaternions are a non-commutative extension of the complex numbers. If the set of quaternions is considered as a multidimensional real space, the quaternions represent a four-dimensional space, whereas the complex numbers represent a two-dimensional space.
  • In this embodiment, quaternions can be used to quickly calculate the change of the remote controller body 3 in the spatial coordinate system relative to its initial position, that is, the Euler angles through which the remote controller body 3 has rotated, thereby reducing the amount of calculation, improving the working efficiency of the controller 1, and further improving the control precision.
  • Optionally, step 332 specifically includes: recording a current quaternion q_cur every predetermined time interval, and then converting q_cur into the current direction cosine matrix DCM_cur. Here q_cur = [w, x, y, z], and the conversion between q_cur and DCM_cur is the standard quaternion-to-direction-cosine-matrix relation given in the original figures.
  • To ensure that the current quaternion q_cur is acquired in real time, the predetermined time is set to 10 ms to 20 ms.
  • The moving direction cosine matrix DCM_relative is: DCM_relative = DCM_init^T * DCM_cur.
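  • The quaternion-to-DCM conversion of step 332 and the relative-DCM product of step 333 can be sketched as below. The patent's explicit matrix appears only in its figures, so one common aerospace convention for converting q = [w, x, y, z] is assumed here; the function names are illustrative, not from the patent.

```python
import numpy as np

def quat_to_dcm(q):
    """Convert a unit quaternion q = [w, x, y, z] to a direction cosine
    matrix. One common convention is assumed; the patent's exact matrix
    is given only in its figures."""
    w, x, y, z = q
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y + w*z),     2*(x*z - w*y)],
        [2*(x*y - w*z),     1 - 2*(x*x + z*z), 2*(y*z + w*x)],
        [2*(x*z + w*y),     2*(y*z - w*x),     1 - 2*(x*x + y*y)],
    ])

def relative_dcm(q_init, q_cur):
    """Step 333: DCM_relative = DCM_init^T * DCM_cur, the attitude of the
    current position relative to the initial position."""
    return quat_to_dcm(q_init).T @ quat_to_dcm(q_cur)
```

Because a unit quaternion yields an orthonormal DCM, the relative DCM of a position with respect to itself is the identity matrix, which is a quick sanity check for the convention chosen.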
  • FIG. 7 is a flowchart of a somatosensory remote control flight method according to still another embodiment of the present invention.
  • FIG. 8 is a general flowchart of a somatosensory remote control flight method according to an embodiment of the present invention.
  • As shown in FIG. 7 and FIG. 8, on the basis of the foregoing embodiment, step 335, in which the flight instruction is obtained using the Euler angles, further includes the following steps:
  • Step 3351: rotate the moving direction cosine matrix by 90° about the y-axis of the body coordinate system corresponding to the initial direction cosine matrix to obtain the final direction cosine matrix DCM_final;
  • Step 3352: use the final direction cosine matrix DCM_final to obtain control quantities capable of controlling the attitude of the drone.
  • Specifically, the final direction cosine matrix DCM_final can be converted into Euler angles, from which the control quantities for the drone attitude are derived. Because the Euler angles have a singularity when the pitch angle is near 90° or -90°, to avoid the singularity and make the control more sensitive, the vicinity of the center position of the somatosensory remote controller needs to be turned into the Euler-angle singularity location. Therefore, the pitch angle corresponding to the moving direction cosine matrix DCM_relative is rotated by 90° to obtain the final matrix DCM_final.
  • The rotation is performed according to the relation given in the original figures.
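  • A minimal sketch of step 3351 follows. The patent's rotation formula appears only in its figures, so it is assumed here that the 90° rotation about the body y-axis is applied as a right-multiplication by the standard y-axis rotation matrix; the function name is illustrative.

```python
import numpy as np

def rotate_about_y(dcm_relative, angle_deg=90.0):
    """Step 3351 sketch: rotate the moving DCM about the body y-axis to
    move the operating region relative to the Euler-angle singularity.
    Right-multiplication by R_y is an assumption; the patent's figures
    fix the actual convention."""
    a = np.radians(angle_deg)
    r_y = np.array([
        [np.cos(a),  0.0, np.sin(a)],
        [0.0,        1.0, 0.0],
        [-np.sin(a), 0.0, np.cos(a)],
    ])
    return dcm_relative @ r_y
```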
  • Preferably, a three-dimensional coordinate system is established with the center of the attitude sensor on the somatosensory remote controller as the coordinate origin O, and the Euler angles include: roll, the rotation of the somatosensory remote controller about the X axis; pitch, the rotation about the Y axis; and yaw, the rotation about the Z axis.
  • The rotation value pitch of the somatosensory remote controller about the Y axis is: pitch = arcsin(-DCM_final[2][0]).
  • The values of roll and yaw are determined according to the magnitude of pitch: the rotation value roll about the X axis is first obtained from DCM_final, and the quantity given in the original formula is compared with 0.001; if it is smaller, the rotation value roll about the X axis is set to 0, and the rotation value yaw about the Z axis is computed from DCM_final by the corresponding formula.
  • Through the above technical solution, the final matrix DCM_final can be converted into the three-dimensional control quantities pitch, roll, and yaw of the UAV flight attitude, and their specific values calculated. These values can then be converted into control quantities sent to the drone, thereby controlling the flight attitude of the drone and completing the action the user wants the drone to perform.
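  • The Euler-angle extraction of steps 334-335 can be sketched as below. Only pitch = arcsin(-DCM_final[2][0]) is stated in the text; the roll/yaw expressions and the exact quantity compared against 0.001 are given only in the patent's figures, so the usual aerospace (z-y-x) formulas are assumed.

```python
import numpy as np

def dcm_to_euler(dcm, eps=0.001):
    """Extract (roll, pitch, yaw) from a DCM. pitch matches the text;
    the remaining expressions assume the standard z-y-x convention."""
    pitch = np.arcsin(-dcm[2][0])
    # Near the singularity (|cos(pitch)| small) roll and yaw are coupled;
    # per the text, roll is set to 0 there and the rotation folds into yaw.
    if np.sqrt(dcm[2][1] ** 2 + dcm[2][2] ** 2) < eps:
        roll = 0.0
        yaw = np.arctan2(-dcm[0][1], dcm[1][1])
    else:
        roll = np.arctan2(dcm[2][1], dcm[2][2])
        yaw = np.arctan2(dcm[1][0], dcm[0][0])
    return roll, pitch, yaw
```

Away from the singularity, composing rotations from known angles and running them back through this function recovers the same angles, which is the property the controller relies on when converting attitude into control quantities.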
  • In a specific implementation, the somatosensory remote controller provided by the embodiments of the present application can also be used for headed control of the drone. In headed control mode, however, when the drone flies far away or beyond the user's line of sight, the user cannot visually identify the direction of the drone's nose and therefore cannot control the flight well. Based on this, the present invention provides a technical solution for headless control of the drone: when the user controls the drone with the somatosensory remote controller, the drone flies relative to the user's orientation no matter which direction its nose is pointing.
  • As shown in FIG. 9, a method for headless control includes:
  • S1, the somatosensory remote controller receives a headless control command;
  • S2, the somatosensory remote controller calculates, in real time, the heading angle theta from the remote controller toward the drone;
  • S3, the rotation matrix DCM is derived from the heading angle theta;
  • S4, the initial control amount V input by the user and received by the somatosensory remote controller is combined with the rotation matrix DCM to obtain the headless attitude control amount C of the drone;
  • S5, the somatosensory remote controller sends the headless attitude control amount C to the drone, so that the drone adjusts its flight attitude according to C.
  • In the above technical solution, a headless control button is provided on the somatosensory remote controller of the drone. When the user presses it, the system calculates the heading angle theta from the somatosensory remote controller toward the drone, thereby determining the direction of the drone relative to the remote controller (i.e., relative to the person). The heading angle theta is then used to determine the rotation matrix DCM, so that when the user sends an initial control amount V to the drone with the somatosensory remote controller, the rotation matrix DCM can be applied to V to obtain the headless attitude control amount C, and the drone flies according to C. Thus, no matter which direction the nose of the drone points, the drone will fly according to the orientation of the somatosensory remote controller (i.e., the person).
  • Since the system calculates the heading angle theta in real time, when the user uses the somatosensory remote controller to make the drone fly to the left or right, the drone circles with the user as the center and the drone-to-user distance as the radius.
  • For example, while controlling the aircraft with the somatosensory remote controller, the user presses the headless control button and then commands the aircraft to fly forward; no matter where the nose points, the aircraft flies away from the user, toward the user's front. Commanding it to fly backward brings it toward the user, and commanding it to fly left or right makes it circle clockwise or counterclockwise around the user.
  • Through the above technical solution, when the user triggers the headless control button and controls the drone's flight with the somatosensory remote controller, the drone flies relative to the user's orientation regardless of the direction of its nose. Moreover, since the system calculates the heading angle from the somatosensory remote controller to the drone in real time, when the user commands the drone to fly left or right, the drone circles with the user as the center and the drone-to-user distance as the radius.
  • Between steps S1 and S2, the method further includes:
  • Sa, determining the flight state of the drone according to whether the real-time distance S between the somatosensory remote controller and the drone is within the dangerous distance range.
  • Preferably, as shown in FIG. 10, step Sa specifically includes:
  • Sa1, calculating in real time the real-time distance S between the somatosensory remote controller and the drone;
  • Sa2, when the real-time distance S is greater than the dangerous distance, the drone flies normally; when the real-time distance S is less than or equal to the dangerous distance, the drone hovers.
  • Step Sa1 specifically includes: acquiring in real time the latitude lat1 and longitude lon1 of the somatosensory remote controller and the latitude lat2 and longitude lon2 of the drone; calculating the latitude difference θ = lat1 - lat2 and the longitude difference b = lon1 - lon2; and calculating the real-time distance S between the somatosensory remote controller and the drone from lat1, lon1, lat2, lon2, θ, b, and earth_radius, where earth_radius is the radius of the earth.
  • The real-time distance S is given by the expression in the original figures.
  • In the above technical solution, when the user triggers the headless control button, the latitude and longitude lon1 and lat1 of the remote controller and lon2 and lat2 of the drone are acquired by GPS; the latitude difference θ = lat1 - lat2 and the longitude difference b = lon1 - lon2 are then calculated, so that the real-time distance S can be computed.
  • The heading angle theta is: theta = arctan2f(sin(b)*cos(lat2), cos(lat1)*cos(lat2)*sin(lat2) - sin(lat1)*cos(lat2)*cos(b)).
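  • The heading-angle formula above can be transcribed directly; `arctan2f` in the original corresponds to a two-argument arctangent, `math.atan2` here. Inputs are taken in radians, which is an assumption since the text does not state the units.

```python
import math

def heading_angle(lat1, lon1, lat2, lon2):
    """Step S2 sketch: heading angle theta from the remote controller
    (lat1, lon1) toward the drone (lat2, lon2), transcribed from the
    formula in the text with b = lon1 - lon2."""
    b = lon1 - lon2
    return math.atan2(
        math.sin(b) * math.cos(lat2),
        math.cos(lat1) * math.cos(lat2) * math.sin(lat2)
        - math.sin(lat1) * math.cos(lat2) * math.cos(b),
    )
```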
  • The rotation matrix DCM is computed from pitch, roll, and yaw, with pitch = 0, roll = 0, and yaw = theta assigned into the matrix. The nose direction of the drone can then be regarded as having changed to the person's forward direction, which makes it convenient for the user to directly control the drone's flight with the somatosensory remote controller.
  • Step S4 specifically includes: the somatosensory remote controller receives the initial control amount V input by the user, and the headless attitude control amount C of the drone is calculated as C = DCM * V.
  • The initial control amount V is the vector [V_pitch, V_roll, V_thrust], where V_pitch is the control amount received by the somatosensory remote controller for the forward/backward movement of the drone, V_roll is the control amount for its lateral movement, and V_thrust is the control amount for its vertical movement.
  • The headless attitude control amount C of the drone is the vector [C_pitch, C_roll, C_thrust], where C_pitch is the attitude control amount for the forward/backward movement of the UAV under headless control, C_roll is the attitude control amount for its lateral movement, and C_thrust is the attitude control amount for its vertical movement.
  • In the above technical solution, after the control signal sent by the user with the somatosensory remote controller is received, it is converted into the corresponding initial control amount V; the rotation matrix DCM is then multiplied by V to obtain the headless attitude control amount C that controls the drone's flight. The drone adjusts its flight attitude according to C and flies relative to the user's orientation.
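  • Step S4 can be sketched as below. The patent gives C = DCM * V with pitch = 0, roll = 0, yaw = theta, but the explicit matrix layout is only in its figures; a standard z-axis (yaw) rotation acting on the (V_pitch, V_roll) components and leaving V_thrust unchanged is assumed here.

```python
import math
import numpy as np

def headless_control(v, theta):
    """Step S4 sketch: rotate V = [V_pitch, V_roll, V_thrust] by the
    yaw-only rotation matrix DCM (pitch = 0, roll = 0, yaw = theta) to
    obtain C = DCM * V. The z-axis rotation form is an assumption."""
    dcm = np.array([
        [math.cos(theta), -math.sin(theta), 0.0],
        [math.sin(theta),  math.cos(theta), 0.0],
        [0.0,              0.0,             1.0],
    ])
    return dcm @ np.asarray(v, dtype=float)
```

With theta = 0 the control passes through unchanged; as theta varies, a pure "forward" stick input is redirected along the remote-controller-to-drone heading, which is exactly the headless behaviour the text describes.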
  • Through the technical solution of the present invention, when the user triggers the headless control button and controls the drone's flight with the somatosensory remote controller, the drone flies relative to the user's orientation regardless of the direction of its nose; and since the system calculates in real time the heading angle from the somatosensory remote controller to the drone, when the user commands the drone to fly left or right, it circles with the user as the center and the drone-to-user distance as the radius.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Human Computer Interaction (AREA)
  • Mathematical Physics (AREA)
  • Computing Systems (AREA)
  • Acoustics & Sound (AREA)
  • General Engineering & Computer Science (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A somatosensory remote controller, a somatosensory remote control flight system and method, and a headless control method. The somatosensory remote controller comprises: an attitude sensor (4), a controller (1), a first transmission module (2), and a remote controller body (3). The attitude sensor (4), the first transmission module (2), and the controller (1) are all arranged on the remote controller body (3), and the attitude sensor (4) and the first transmission module (2) are both electrically connected to the controller (1). The attitude sensor (4) acquires initial state information of the current position of the remote controller body (3) and movement information of the remote controller body (3), and transmits them to the controller (1). The controller (1) obtains a flight instruction according to the initial state information and the movement information, and sends it out through the first transmission module (2). Since any position can serve as the center position of the somatosensory remote controller, the skill level required of the operator is reduced and operation is facilitated.

Description

体感遥控器、体感遥控飞行系统和方法、无头控制方法
本申请要求在2015年12月31日提交中国专利局、申请号为201511032422.8、申请名称为“体感遥控器、体感遥控飞行系统和方法”的中国专利申请的优先权,以及,在2016年4月1日提交中国专利局、申请号为201620268295.5、申请名称为“用于无人机的体感遥控器”的中国专利申请的优先权,以及,在2016年5月6日提交中国专利局、申请号为201610299333.8、申请名称为“一种体感控制方法”的中国专利申请的优先权,以及,在2016年5月6日提交中国专利局、申请号为201610297215.3、申请名称为“一种无头控制的方法”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本发明涉及电子设备技术领域,尤其是涉及一种体感遥控器、体感遥控飞行系统和方法、无头控制方法。
背景技术
无人驾驶飞机简称“无人机”,是利用无线电遥控设备和自备的程序控制装置操纵的不载人飞机。机上无驾驶舱,但安装有自动驾驶仪、程序控制装置等设备。地面、舰艇上或母机遥控站人员通过雷达等设备,对其进行跟踪、定位、遥控、遥测和数字传输。可在无线电遥控下像普通飞机一样起飞或用助推火箭发射升空,也可由母机带到空中投放飞行。回收时,可用与普通飞机着陆过程一样的方式自动着陆,也可通过遥控用降落伞或拦网回收。可反复使用多次。广泛用于空中侦察、监视、通信、反潜、电子干扰等。而为了更加便于操作者操作无人机,人们设计了体感遥控器。
现有技术中的体感遥控器,包括遥控器本体以及设置在遥控器本体的传感器和控制器,传感器与控制器电连接,利用传感器获取遥控器本体的移动方向,并传递给控制器,控制器根据移动方向来控制无人机的飞行。
但是,现有技术中的体感遥控器在使用时,需要将遥控器本体置于中心位上,这个中心位位于一个固定位置上,操作者必须使体感遥控器的位置位于中心位上,才能够利用体感遥控器操控无人机,所以,现有技术中的体感 遥控器对于操控者的技术水平要求很高,不便于操控者操控。
另外,一般的无人机都有机头,当无人机飞行的距离比较近时,用户可以用眼睛看见无人机的机头,然而一旦无人机飞行的距离比较远或无人机的飞行距离超出人的视线范围时,用户无法用眼睛辨别无人机机头的方向,就不能很好的控制无人机的飞行,无人机就会乱飞,这样会存在很大的安全隐患。
发明内容
本发明的目的在于提供一种体感遥控器、体感遥控飞行系统和方法、无头控制方法,以解决现有技术中不便于操控者操控的技术问题。
本发明实施例提供的一种体感遥控器,包括:姿势传感器、控制器和第一传输模块,以及遥控器本体,第一传输模块支持蓝牙、无线保真(WiFi)、红外、移动网络、有线传输中的一种或多种传输方式;姿势传感器、第一传输模块和控制器均设置在遥控器本体上,姿势传感器和第一传输模块均与控制器电连接;姿势传感器用于获取遥控器本体的初始位置的初始状态信息,以及遥控器本体移动的移动信息,并传递给控制器;控制器用于根据初始状态信息和移动信息,得到飞行指令,并通过第一传输模块将飞行指令发出。
本发明实施例还提供了一种体感遥控飞行系统,包括机载飞控系统以及如上任一所述的体感遥控器;机载飞控系统上设置有第二传输模块,第二传输模块与第一传输模块无线通讯连接;机载飞控系统用于根据飞行指令控制无人机飞行。
本发明实施例还提供了一种体感遥控飞行方法,具体包括如下步骤:
当检测到体感遥控飞行模式被激活时,定位遥控器本体的当前位置为初始位置,并利用姿势传感器获取遥控器本体的初始状态信息,并传递给控制器;
当遥控器本体运动时,姿势传感器获取遥控器本体的移动信息,并传递给控制器;
控制器根据初始状态信息和移动信息,得到飞行指令;
控制器将飞行指令传递给机载飞控系统,机载飞控系统控制无人机飞行。
本发明实施例提供的体感遥控器,其利用姿势传感器获取遥控器本体初始位置的初始状态信息,以及遥控器本体移动至预设位置的移动信息,控制 器根据初始状态信息和移动信息得到飞行指令。当操作者在使用体感遥控器操控无人机时,姿势传感器会获取遥控器的当前位置的初始状态信息,该当前位置即为中心位。当操作者在移动遥控器本体时,均为以初始状态信息为基准,并结合移动信息得到飞行指令。由于本发明中的体感遥控器,任意一个位置都能作为体感遥控器的中心位,故而,操作者不需要再找体感遥控器的中心位了,降低对操控者的技术水平的要求,便于操控者操控。
本发明实施例还提出了一种无头控制的方法,可以利用本发明实施例提供的体感遥控器对无人机进行无头控制,所述无头控制的方法包括:
S1,体感遥控器接收到无头控制命令;
S2,体感遥控器实时计算遥控器指向无人机的航向角theta;
S3,根据航向角theta得出旋转矩阵DCM;
S4,将体感遥控器接收到的用户输入的初始控制量V与所述旋转矩阵DCM结合得出无人机的无头姿态控制量C;
S5,体感遥控器将无头姿态控制量C发送至无人机,以供无人机根据所述无头姿态控制量C调整飞行姿态。
通过上述技术方案,当用户触发无头控制键时,在用户利用体感遥控器控制无人机飞行时,无论无人机的机头处于哪个方向,无人机都会以用户的方位进行飞行,并且由于系统在实时计算体感遥控器指向无人机的航向角用户利用体感遥控器控制无人机向左或右飞行时,无人机就会以用户为中心以无人机到用户的距离为半径盘旋飞行。
附图说明
为了更清楚地说明本发明具体实施方式或现有技术中的技术方案,下面将对具体实施方式或现有技术描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图是本发明的一些实施方式,对于本领域普通技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1为本发明一实施例提供的体感遥控器的结构示意图;
图2为本发明另一实施例提供的体感遥控器的结构示意图;
图3为本发明实施例提供的体感遥控飞行系统的结构示意图;
图4为本发明一实施例提供的体感遥控飞行方法的流程图;
图5为本发明另一实施例提供的体感遥控飞行方法的流程图;
图6为本发明又一实施例提供的体感遥控飞行方法的流程图;
图7为本发明再一实施例提供的体感遥控飞行方法的流程图;
图8为本发明实施例提供的体感遥控飞行方法的总流程图;
图9示出了本发明的一个实施例的无头控制的方法的流程图;
图10示出了本发明的一个实施例的步骤Sa展开的流程图。
附图标记:
1-控制器;            2-第一传输模块;      3-遥控器本体;
4-姿势传感器;        5-GPS定位器;         6-体感激活按钮;
7-自定义按键          8-拍照/摄像按键;     9-第一云台俯仰按键
10-第二云台俯仰按键   11-第二三色LED灯;    12-有源蜂鸣器;
13-振动马达;         14-电源开关;         15-USB接口
16-电池;             17-电源充电管理器;   18-第一三色LED灯;
19-返航按键;         20-起降按键;         21-机载飞控系统;
22-第二传输模块;     23-微处理器;         24-航姿参考系统;
25-定位模块;         26-气压计模块;       27-无人机。
具体实施方式
下面将结合附图对本发明的技术方案进行清楚、完整地描述,显然,所描述的实施例是本发明一部分实施例,而不是全部的实施例。基于本发明中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本发明保护的范围。
图1为本发明一实施例提供的体感遥控器的结构示意图。如图1所示,本实施例提供的一种体感遥控器,包括:姿势传感器4、控制器1和第一传输模块2,以及遥控器本体3;姿势传感器4、第一传输模块2和控制器1均设置在遥控器本体3上,姿势传感器4和第一传输模块2均与控制器1电连接;姿势传感器4用于获取遥控器本体3的初始位置的初始状态信息,以及遥控器本体3移动的移动信息,并传递给控制器1;控制器1用于根据初始状态信息和移动信息,得到飞行指令,并通过第一传输模块2将飞行指令发出。第 一传输模块2支持蓝牙、无线保真WiFi、红外、移动网络、有线传输中的一种或多种传输方式。
其中,初始状态信息包括初始位置的角速度和加速度;移动信息包括遥控器本体3移动至预设位置的角速度和加速度。
姿势传感器4的种类可以为多种,较佳地,姿势传感器4包括陀螺仪和加速度计;陀螺仪和加速度计设置在遥控器本体3上,且与控制器1电连接;陀螺仪和加速度计分别用于获取遥控器本体3的初始位置的角速度和加速度,以及遥控器本体3移动至预设位置的角速度和加速度;控制器1根据初始位置的角速度和加速度,以及遥控器本体3移动至预设位置的角速度和加速度,得到飞行指令,并将飞行指令传递给机载飞控系统,机载飞控系统控制无人机27的飞行。更加优选地,姿势传感器4为MPU6050。
第一传输模块2的种类可以为多种,例如:蓝牙、WiFi或者红外线等等,较佳地,无线传输模块为915MHz无线传输模块和5.8GHz无线传输模块中的一种。
本实施例提供的体感遥控器,其利用姿势传感器4获取遥控器本体3初始位置的初始状态信息,以及遥控器本体3移动至预设位置的移动信息,控制器1根据初始状态信息和移动信息得到飞行指令。当操作者在使用体感遥控器操控无人机27时,姿势传感器4会获取遥控器的当前位置的初始状态信息,该当前位置即为中心位。
当操作者在移动遥控器本体3时,均为以初始状态信息为基准,并结合移动信息得到飞行指令。例如:在初始位置的基础上,当体感遥控器往前倾斜,即可控制无人机27向前或者头部向下俯;当体感遥控器往后倾斜,即可控制无人机27向后或者尾部向前仰;当体感遥控器往左倾斜,即可控制无人机27向左或者向左横滚;当体感遥控器往右倾斜,即可控制无人机27向右或者向右横滚等等。
由于本实施例中的体感遥控器,任意一个位置都能作为体感遥控器的中心位,故而,操作者不需要再找体感遥控器的中心位了,降低对操控者的技术水平的要求,便于操控者操控。
图2为本发明另一实施例提供的体感遥控器的结构示意图。如图1和图2所示,在上述实施例的基础上,进一步地,遥控器本体3上还设置有体感激活按钮6;体感激活按钮6与控制器1电连接,在所述体感激活按钮被按下后, 所述遥控器本体的当前位置为初始位置。
当操作者在使用体感遥控器操控,按动体感激活按钮6,激活控制器1,姿势传感器4记录此时遥控器本体3所在位置的初始状态信息,并将该位置作为中心位。这样,操作者在按动体感激活按钮6后,再移动体感遥控器本体3,才能够操控无人机27,从而降低误操作的可能性。另外,也便于操作者找到适合自己的操作方位之后,再对无人机27进行操控。
如图2所示,在上述实施例的基础上,进一步地,遥控器本体3上还设置有GPS定位器5和返航按键19;GPS定位器5,与所述控制器电连接,用于定位所述遥控器本体的位置;返航按键19,与所述控制器电连接,用于根据所述GPS定位器定位的位置,操控无人机27返航。
遥控器本体3上还设置有起降按键20,起降按键20与控制器1电连接,用于操控无人机27起降。
遥控器本体3上还设置有2个云台俯仰按键,第一云台俯仰按键9和第二云台俯仰按键10,分别用于控制无人机云台附、仰操作。
遥控器本体3上还设置有自定义按键7,用户可以自定义该按键的功能,具体地,可以提供多个可选功能供用户选择,比如,可以选择该按键为跟随模式按键,用于控制无人机跟随体感遥控器移动,也即在启动跟随模式后,体感遥控器移动,则无人机也可以随着移动;再比如,可以选择该按键为悬停功能按键,用于控制无人机悬停。
遥控器本体3上还设置有拍照/摄像按键8,拍照/摄像按键8与控制器1电连接,用于控制拍照及摄像,可以长按该按键进入摄像功能,再长按退出摄像功能,进入拍照功能。
本实施例中的用于无人机的体感遥控器的功能按键的设计,能够让操作者在简单的按键步骤下实现不同的功能,同样的功能只需要一个按键就能实现,操作起来简单又快捷。
当然,在本实施例中,还可以在遥控器本体3上设置电源开关14、前进按键和后退按键,以及加速按键等等。
如图2所示,在上述实施例的基础上,进一步地,遥控器本体3内还设置有电池16;遥控器本体3上还设置有电源开关14,电池16通过电源开关14与控制器1电连接;遥控器本体3上还设置有第一三色LED灯18和电源充电管理器17,第一三色LED灯18通过电源充电管理器17与电池16电连 接,第一三色LED灯18用于显示电池16的充电状态;遥控器本体3上还设置有USB接口15,USB接口15与控制器1电连接,用于固件升级或者给电池16充电。
如图2所示,在上述实施例的基础上,进一步地,遥控器本体3上还设置有振动马达13和有源蜂鸣器12,振动马达13和有源蜂鸣器12均与控制器1电连接。振动马达13和有源蜂鸣器12,可以与GPS定位器5结合使用,当无人机27偏离预设飞行轨迹时,或者到达预设目的地等等可以提醒操作者。当然,也可以配合操作者的动作,以增加操控者的操控快感,例如:操控者每做一个动作时,振动马达13振动,或者有源蜂鸣器12会发出声音。
如图2所示,在上述实施例的基础上,进一步地,遥控器本体3上还设置有第二三色LED灯11,第二三色LED灯11与控制器1电连接,以显示无人机27的飞行状态。
图3为本发明实施例提供的体感遥控飞行系统的结构示意图。如图3所示,本发明一实施例还提供了一种体感遥控飞行系统,包括机载飞控系统21以及上述的体感遥控器;机载飞控系统21上设置有第二传输模块22,第二传输模块22与第一传输模块2无线通讯连接;机载飞控系统21用于根据飞行指令控制无人机27飞行。
其中,机载飞控系统21还包括定位模块25、航姿参考系统24和气压计模块26以及微处理器23;微处理器23用于通过定位模块25、航姿参考系统24和气压计模块26获取无人机27的飞行信息,并通过第二传输模块22将飞行信息传递给体感遥控器。
在本实施例中,机载飞控系统21将飞行信息传递给体感遥控器,体感遥控器可根据飞行信息来调整无人机27的飞行姿势,以使无人机27能够超视距飞行。
图4为本发明一实施例提供的体感遥控飞行方法的流程图。如图4所示,本发明实施例还提供了一种体感遥控飞行方法,具体包括如下步骤:
步骤100,当检测到体感遥控飞行模式被激活时,定位遥控器本体3的当前位置为初始位置,并利用姿势传感器4获取遥控器本体3的初始状态信息,并传递给控制器1;
步骤200,当遥控器本体3运动时,姿势传感器4获取遥控器本体3的移动信息,并传递给控制器1;
步骤300,控制器1根据初始状态信息和移动信息,得到飞行指令;
步骤400,控制器1将飞行指令传递给机载飞控系统21,机载飞控系统21控制无人机27飞行。
本实施例的体感遥控飞行方法,其利用姿势传感器4获取遥控器本体3当前位置的初始状态信息,以及遥控器本体3移动至预设位置的移动信息,控制器1根据初始状态信息和移动信息得到飞行指令。当操作者在使用体感遥控器操控无人机27时,姿势传感器4会获取遥控器的当前位置的初始状态信息,该当前位置即为中心位。当操作者在移动遥控器本体3时,均为以初始状态信息为基准,并结合移动信息得到飞行指令。这样,遥控器本体3能够以任意一个位置作为体感遥控器的中心位,故而,操作者无需再找体感遥控器的中心位了,降低对操控者的技术水平的要求,便于操控者操控。
图5为本发明另一实施例提供的体感遥控飞行方法的流程图。如图5所示,在上述实施例的基础上,进一步地,步骤300,控制器1根据初始状态信息和移动信息,得到飞行指令中,具体包括如下步骤:
步骤310,根据初始状态信息计算出初始位置的空间坐标,并记录所述初始位置的空间坐标;
步骤320,根据移动信息计算出预设位置的空间坐标,并记录预设位置的空间坐标;
步骤330,根据初始位置的空间坐标和预设位置的空间坐标得到飞行指令。
在本实施例中,控制器1根据获取的初始状态信息计算出体感遥控器在初始位置,即中心位的空间坐标,然后,再根据移动信息计算出预设位置的空间坐标,通过遥控器本体3在空间坐标上的移动,从而精准获取遥控器本体3在空间坐标系上的移动轨迹,从而实现对无人机27的精准控制。
图6为本发明又一实施例提供的体感遥控飞行方法的流程图。如图6所示,在上述实施例的基础上,进一步地,步骤330,根据初始位置的空间坐标和预设位置的空间坐标,得到飞行指令中,具体包括如下步骤:
步骤331,记录下初始位置的空间坐标对应的四元数,并算出该四元数对应的初始方向余弦矩阵DCMinit
步骤332,记录下预设位置的空间坐标对应的四元数,并算出该四元数对应的当前方向余弦矩阵DCMcur
步骤333,将初始方向余弦矩阵的转置矩阵与当前方向余弦矩阵相乘,得 到当前方向余弦矩阵相对于初始方向余弦矩阵的移动方向余弦矩阵DCMrelative
步骤334,利用移动方向余弦矩阵算出当前遥控器本体3相对于初始位置的欧拉角;
步骤335,利用欧拉角得到飞行指令。
其中,四元数是由爱尔兰数学家威廉·卢云·哈密顿在1843年发现的数学概念。四元数的乘法不符合交换律。明确地说,四元数是复数的不可交换延伸。如把四元数的集合考虑成多维实数空间的话,四元数就代表着一个四维空间,相对于复数为二维空间。
在本实施例中,利用四元数能够快速的计算出遥控器本体3以初始位置为基准、在空间坐标系中的变化量,即遥控器本体3转动的欧拉角,减少运算量,提高控制器1的工作效率,同时还能够进一步地提高操控精准度。
在上述技术方案中,首先,体感遥控器在接收到体感激活命令之后,就会获取遥控器此时的初始四元数qinit,其中,初始四元数为qinit=[w,x,y,z],当中的w,x,y,z是四元数qinit的4个参数,体感遥控器是用陀螺仪和加速度计来算出四元数的,qinit中的4个参数分别为:
w=cos(theta/2);
x=ax*sin(theta/2);
y=ay*sin(theta/2);
z=az*sin(theta/2);
(ax,ay,az)表示轴的矢量,theta表示绕此轴的旋转角度。
通过上述技术方案,能够快速准确地计算出体感遥控器初始状态时的四元数qinit,并且将qinit转换为初始方向余弦矩阵DCMinit,为:
Figure PCTCN2016086473-appb-000001
优选地,所述步骤332具体包括:
每间隔预定时间就记录一个当前四元数qcur,并将当前四元数qcur转换成当前方向余弦矩阵DCMcur
在上述技术方案中,每间隔预定时间就会记录一次当前四元数qcur,然后将qcur转换成当前方向余弦矩阵DCMcur,当前四元数qcur和当前方向余弦矩阵DCMcur的转换关系如下:
qcur=[w,x,y,z]
Figure PCTCN2016086473-appb-000002
优选地,所述预定时间为10ms至20ms。
在上述技术方案中,为了保证获取当前四元数qcur的实时性,将预定时间设定为10ms至20ms。
优选地,所述移动方向余弦矩阵DCMrelative为:
DCMrelative=DCMinit T*DCMcur
图7为本发明再一实施例提供的体感遥控飞行方法的流程图;图8为本发明实施例提供的体感遥控飞行方法的总流程图。如图7和图8所示,在上述实施例的基础上,进一步地,步骤335,利用所述欧拉角得到飞行指令中,具体还包括如下步骤:
步骤3351,将移动方向余弦矩阵绕初始方向余弦矩阵对应的机体坐标系的y轴旋转90°,得到最终方向余弦矩阵DCMfinal
步骤3352,利用最终方向余弦矩阵DCMfinal得到能够控制无人机姿态的控制量。
具体地,可以将所述最终方向余弦矩阵DCMfinal转换成欧拉角,并根据所述欧拉角换算得到能够控制无人机姿态的控制量。因为俯仰角范围在90°和-90°左右时,欧拉角存在奇点,为了躲避欧拉角奇点位置,使得操控更加灵敏,需要将体感遥控器的中心位的附近变成欧拉角的奇点位置。故而,将移动方向余弦矩阵DCMrelative对应的俯仰角旋转90°得到最终矩阵DCMfinal
旋转方式如下:
Figure PCTCN2016086473-appb-000003
优选地,以体感遥控器上的姿势传感器中心为坐标原点O建立三维坐标系,所述欧拉角包括:
体感遥控器围绕X轴旋转值roll,体感遥控器围绕Y轴旋转值pitch,体感遥控器围绕Z轴旋转值yaw。
在上述技术方案中,为了让无人机能够更好的根据用户的手势动作完成相应的动作,需要建立以姿势传感器中心为坐标原点O的坐标系,并分别用 roll、pitch和yaw,代表围绕X轴旋转值、围绕Y轴旋转值和围绕Z轴旋转值,这样就可以直接将用户的手势动作转换成相应的欧拉角,进而再将相应的欧拉角转换成能够控制无人机的控制量,并将相应的控制量转换为控制电信号发送至无人机,控制无人机的飞行。
优选地,所述体感遥控器围绕Y轴旋转值pitch为:
pitch=arcsin(-DCMfinal[2][0])。
优选地,根据pitch的大小确定roll和yaw的值:
Figure PCTCN2016086473-appb-000004
时,则roll=0,
Figure PCTCN2016086473-appb-000005
Figure PCTCN2016086473-appb-000006
时,则
Figure PCTCN2016086473-appb-000007
Figure PCTCN2016086473-appb-000008
在上述技术方案中,首先根据DCMfinal得出无人机围绕X轴旋转值roll,并且判断
Figure PCTCN2016086473-appb-000009
是否小于0.001,如果小于则围绕X轴旋转值roll为0,围绕Z轴旋转值
Figure PCTCN2016086473-appb-000010
通过上述技术方案,能够将最终矩阵DCMfinal转换为无人机飞行姿态的三维控制量pitch、roll和yaw,并计算出pitch、roll和yaw的具体数值,这样就可以将pitch、roll和yaw的具体数值转换为能够发送给无人机的控制量,进而控制无人机的飞行姿态,完成用户想要无人机完成的动作。
在具体实施中,可以利用本申请实施例提供的体感遥控器对无人机进行有头控制。但在有头控制方式下,当无人机飞行的距离比较远或无人机的飞行距离超出人的视线范围时,用户无法用眼睛辨别无人机机头的方向,就不能很好的控制无人机的飞行,基于此,本发明提供了一种对无人机进行无头控制的技术方案,能够在用户利用体感遥控器控制无人机飞行时,无论无人机的机头处于哪个方向,无人机都会以用户的方位进行飞行。
下面结合附图对本发明的具体实施方式作进一步详细的描述。
如图9所示,一种无头控制的方法,所述无头控制的方法包括:
S1,体感遥控器接收到无头控制命令;
S2,体感遥控器实时计算遥控器指向无人机的航向角theta;
S3,根据航向角theta得出旋转矩阵DCM;
S4,将体感遥控器接收到的用户输入的初始控制量V与所述旋转矩阵DCM结合得出无人机的无头姿态控制量C;
S5,体感遥控器将无头姿态控制量C发送至无人机,以供无人机根据所述无头姿态控制量C调整飞行姿态。
在上述技术方案中,在无人机的体感遥控器上设有无头控制键,当用户按下无头控制键时,系统就会计算该体感遥控器指向无人机的航向角theta,这样就知道了无人机相对于体感遥控器(即无人机相对于人)的方向,然后再利用该航向角theta确定出旋转矩阵DCM,这样,当用户利用体感遥控器向无人机发送初始控制量V时,就可以将旋转矩阵DCM赋予到初始控制量V上,就可以得出控制无人机的无头姿态控制量C,无人机就可以根据该无头姿态控制量C进行飞行,这样无论无人机的机头处于哪个方向,经过上述方案后无人机都会按照体感遥控器(即人)的方位进行飞行。
其中由于系统在实时计算体感遥控器指向无人机的航向角theta,所以用户利用体感遥控器控制无人机向左或右飞行时,无人机就会以用户为中心以无人机到用户的距离为半径盘旋飞行。
例如,当用户在利用体感遥控器控制飞机飞行时,用户按下体感遥控器上的无头控制键,然后用户利用体感遥控器控制飞机向前飞行,这样无论飞机的机头在哪,都会向人的前方飞行,利用体感遥控器控制飞机向后飞行,飞机就会向人靠近,当用户利用体感遥控器控制飞机向左或右飞行,这样飞机就会以用户为中心顺时针或逆时针盘旋飞行。
通过上述技术方案,当用户触发无头控制键时,在用户利用体感遥控器控制无人机飞行时,无论无人机的机头处于哪个方向,无人机都会以用户的方位进行飞行,并且由于系统在实时计算体感遥控器指向无人机的航向角用户利用体感遥控器控制无人机向左或右飞行时,无人机就会以用户为中心以无人机到用户的距离为半径盘旋飞行。
在所述步骤S1与S2之间还包括:
Sa,根据体感遥控器与无人机的实时距离S是否在危险距离范围内,来确定无人机的飞行状态。
优选地,如图10所示,所述步骤Sa具体包括:
Sa1,实时计算体感遥控器与无人机的实时距离S;
Sa2,当所述实时距离S大于危险距离时,无人机正常飞行,当所述实时距离S小于或等于危险距离时,无人机悬停。
在上述技术方案中,为了保证用户的安全,防止无人机出现故障掉落时伤害到用户,需要为无人机的飞行设置危险距离,这样当用户触发无头控制键后,系统就会实时计算遥控器与无人机的实施距离S,当计算得出的S实时距离S小于或等于危险距离(即,在危险距离范围内),就会控制无人机静止悬停在空中,这样无人机不会飞到用户的头顶排出安全隐患,同时,确保在进行无头模式飞行时,确保无人机前后运动时都沿着指向用户位置的半径方向运动。
通过上述技术方案,保证了用户的人身安全,提高了无人机的安全系数。
优选地,所述步骤Sa1具体包括:
实时获取体感遥控器的纬度lat1和经度lon1,以及无人机的纬度lat2和经度lon2;
计算体感遥控器和无人机的纬度差θ=lat1-lat2,和经度差b=lon1-lon2;
根据lat1、lon1、lat2、lon2、θ、b和earth_radius计算体感遥控器与无人机的实时距离S,其中,earth_radius是地球半径。
优选地,所述实时距离S为:
Figure PCTCN2016086473-appb-000011
在上述技术方案中,当用户触发无头控制键时,利用GPS获取遥控器的经纬度lon1和lat1,和无人机的经纬度lon2和lat2,然后计算体感遥控器和无人机的纬度差θ=lat1-lat2和经度差b=lon1-lon2,这样就可以计算实时距离S了。
优选地,所述航向角theta为:
theta=arctan2f(sin(b)*cos(lat2),cos(lat1)*cos(lat2)*sin(lat2)-sin(lat1)*cos(lat2)*cos(b))。
优选地,所述旋转矩阵DCM为:
Figure PCTCN2016086473-appb-000012
其中,pitch=0,roll=0,yaw=theta。
在上述技术方案中,根据pitch、roll和yaw计算并得出旋转矩阵DCM, 并且将pitch=0,roll=0,yaw=theta赋予到旋转矩阵DCM中,这时就可以认为无人机的机头方向已经改变了,机头变成人的前方方向,这样有利于用户利用体感遥控器直接控制无人机的飞行。
优选地,所述步骤S4具体包括:
体感遥控器接收用户输入的初始控制量V;
计算无人机的无头姿态控制量C为:
优选地,所述初始控制量V为:
Figure PCTCN2016086473-appb-000013
其中,Vpitch是体感遥控器接收的无人机前后移动的控制量,Vroll是遥控器接收的无人机横向移动的控制量,Vthrust是体感遥控器接收的无人机上下移动的控制量。
优选地,所述无人机的无头姿态控制量C为:
Figure PCTCN2016086473-appb-000014
其中,Cpitch是无头控制下的无人机前后移动的姿态控制量,Croll是无头控制下的无人机横向移动的姿态控制量,Cthrust是无头控制下的无人机上下移动的姿态控制量。
在上述技术方案中,当接收到用户利用体感遥控器发出的控制信号后,将控制信号转换成相应的初始控制量V,然后将旋转矩阵DCM与该初始控制量V做乘运算后得出能够控制无人机飞行的无头姿态控制量C,无人机就可以按照该无头姿态控制量C来调整飞行姿态,就可以以用户的方位进行飞行了。
通过本发明的技术方案,当用户触发无头控制键时,在用户利用体感遥控器控制无人机飞行时,无论无人机的机头处于哪个方向,无人机都会以用户的方位进行飞行,并且由于系统在实时计算体感遥控器指向无人机的航向角用户利用体感遥控器控制无人机向左或右飞行时,无人机就会以用户为中心以无人机到用户的距离为半径盘旋飞行。
最后应说明的是:以上各实施例仅用以说明本发明的技术方案,而非对其限制;尽管参照前述各实施例对本发明进行了详细的说明,本领域的普通 技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分或者全部技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本发明各实施例技术方案的范围。

Claims (25)

  1. 一种体感遥控器,其特征在于,包括:姿势传感器、控制器和第一传输模块,以及遥控器本体;
    所述姿势传感器、所述第一传输模块和所述控制器均设置在所述遥控器本体上,所述姿势传感器和所述第一传输模块均与所述控制器电连接;
    所述姿势传感器用于获取所述遥控器本体的初始位置的初始状态信息,以及所述遥控器本体移动的移动信息,并传递给所述控制器;
    所述控制器用于根据所述初始状态信息和移动信息,得到飞行指令,并通过所述第一传输模块将飞行指令发出;
    所述第一传输模块支持蓝牙、无线保真WiFi、红外、移动网络、有线传输中的一种或多种传输方式。
  2. 根据权利要求1所述的体感遥控器,其特征在于,所述体感遥控器本体上还设置有体感激活按钮;所述体感激活按钮与所述控制器电连接,在所述体感激活按钮被按下后,所述遥控器本体的当前位置为初始位置。
  3. 根据权利要求1所述的体感遥控器,其特征在于,所述姿势传感器包括陀螺仪和加速度计;
    所述陀螺仪和加速度计设置在所述遥控器本体上,且与所述控制器电连接;所述陀螺仪和所述加速度计分别用于获取所述遥控器本体的初始位置的角速度和加速度,以及遥控器本体移动至预设位置的角速度和加速度;
    所述控制器根据所述初始位置的角速度和加速度,以及遥控器本体移动至预设位置的角速度和加速度,得到飞行指令,并将所述飞行指令传递给机载飞控系统,机载飞控系统控制无人机的飞行方向。
  4. 根据权利要求1-3任一项所述的体感遥控器,其特征在于,所述遥控器本体上还设置有以下按键中的一种或多种:
    GPS定位器,与所述控制器电连接,用于定位所述遥控器本体的位置;
    返航按键,与所述控制器电连接,用于根据所述GPS定位器定位的位置,操控无人机返航;
    起降按键,与所述控制器电连接,用于操控无人机起降;
    云台俯仰按键,用于控制无人机云台附仰操作;
    自定义按键,用于供用户自定义按键的功能;
    拍照/摄像按键,与所述控制器电连接,用于控制拍照及摄像。
  5. 根据权利要求1-4任一项所述的体感遥控器,其特征在于,所述遥控器本体内还设置有电池;所述遥控器本体上还设置有电源开关,所述电池通过所述电源开关与所述控制器电连接;
    所述遥控器本体上还设置有第一三色LED灯和电源充电管理器,所述第一三色LED灯通过所述电源充电管理器与所述电池电连接,所述第一三色LED灯用于显示所述电池的充电状态;
    所述遥控器本体上还设置有USB接口,所述USB接口与所述控制器电连接,用于固件升级或者给所述电池充电;
    和/或,所述遥控器本体上还设置有振动马达和有源蜂鸣器,所述振动马达和所述有源蜂鸣器均与所述控制器电连接;
    和/或,所述遥控器本体上还设置有第二三色LED灯,所述第二三色LED灯与所述控制器电连接,以显示无人机的飞行状态。
  6. 一种体感遥控飞行系统,其特征在于,包括机载飞控系统以及如权利要求1-7任一项所述的体感遥控器;
    所述机载飞控系统上设置有第二传输模块,所述第二传输模块与所述第一传输模块无线通讯连接;所述机载飞控系统用于根据飞行指令控制所述无人机飞行。
  7. 根据权利要求6所述的体感遥控飞行系统,其特征在于,所述机载飞控系统还包括定位模块、航姿参考系统和气压计模块以及微处理器;
    所述微处理器用于通过所述定位模块、所述航姿参考系统和所述气压计模块获取无人机的飞行信息,并通过所述第二传输模块将所述飞行信息传递给所述体感遥控器。
  8. 一种体感遥控飞行方法,其特征在于,具体包括如下步骤:
    当检测到体感遥控飞行模式被激活时,定位遥控器本体的当前位置为初始位置,并利用姿势传感器获取遥控器本体的初始状态信息,并传递给控制器;
    当遥控器本体运动时,姿势传感器获取遥控器本体的移动信息,并传递给控制器;
    控制器根据初始状态信息和移动信息,得到飞行指令;
    所述控制器将飞行指令传递给机载飞控系统,机载飞控系统控制无人机 飞行。
  9. 根据权利要求8所述的体感遥控飞行方法,其特征在于,所述控制器根据初始状态信息和移动信息,得到飞行指令的步骤中,具体包括如下步骤:
    根据所述初始状态信息计算出初始位置的空间坐标,并记录所述初始位置的空间坐标;
    根据移动信息计算出预设位置的空间坐标,并记录预设位置的空间坐标;
    根据初始位置的空间坐标和预设位置的空间坐标得到飞行指令。
  10. 根据权利要求9所述的体感遥控飞行方法,其特征在于,根据初始位置的空间坐标和预设位置的空间坐标,得到飞行指令的步骤中,具体包括如下步骤:
    记录下初始位置的空间坐标对应的四元数,并算出该四元数对应的初始方向余弦矩阵;
    记录下预设位置的空间坐标对应的四元数,并算出该四元数对应的当前方向余弦矩阵;
    将初始方向余弦矩阵的转置矩阵与当前方向余弦矩阵相乘,得到当前方向余弦矩阵相对于初始方向余弦矩阵的移动方向余弦矩阵;
    利用移动方向余弦矩阵算出当前遥控器本体相对于初始位置的欧拉角;
    利用欧拉角得到飞行指令。
  11. 根据权利要求10所述的体感遥控飞行方法,其特征在于,所述利用欧拉角得到飞行指令的步骤中,具体还包括如下步骤:
    将移动方向余弦矩阵绕初始方向余弦矩阵对应的机体坐标系的y轴旋转90°,得到最终方向余弦矩阵;
    利用最终方向余弦矩阵得到能够控制无人机姿态的控制量。
  12. 如权利要求11所述的体感遥控飞行方法,其特征在于,利用最终方向余弦矩阵得到能够控制无人机姿态的控制量,包括:
    将所述最终方向余弦矩阵转换成欧拉角,并根据所述欧拉角换算得到能够控制无人机姿态的控制量。
  13. 根据权利要求12所述的体感遥控飞行方法,其特征在于,遥控器上设有姿势传感器,以姿势传感器中心为坐标原点O建立三维坐标系,所述欧拉角包括:
    遥控器围绕X轴旋转值roll,遥控器围绕Y轴旋转值pitch,遥控器围 绕Z轴旋转值yaw。
  14. 根据权利要求13所述的体感遥控飞行方法,其特征在于,所述遥控器围绕Y轴旋转值pitch为:
    pitch=arcsin(-DCMfinal[2][0])。
  15. 根据权利要求14所述的体感遥控飞行方法,其特征在于,根据pitch的大小确定roll和yaw的值:
    Figure PCTCN2016086473-appb-100001
    时,则roll=0,
    Figure PCTCN2016086473-appb-100002
    Figure PCTCN2016086473-appb-100003
    时,则
    Figure PCTCN2016086473-appb-100004
    Figure PCTCN2016086473-appb-100005
  16. 一种无头控制的方法,其特征在于,包括以下步骤:
    S1,遥控器接收到无头控制命令;
    S2,遥控器实时计算遥控器指向无人机的航向角theta;
    S3,根据航向角theta得出旋转矩阵DCM;
    S4,将遥控器接收到的用户输入的初始控制量V与所述旋转矩阵DCM结合得出无人机的无头姿态控制量C;
    S5,遥控器将无头姿态控制量C发送至无人机,以供无人机根据所述无头姿态控制量C调整飞行姿态。
  17. 根据权利要求16所述的无头控制的方法,其特征在于,在所述步骤S1与S2之间还包括:
    Sa,根据遥控器与无人机的实时距离S是否在危险距离范围内,来确定无人机的飞行状态。
  18. 根据权利要求17所述的无头控制的方法,其特征在于,所述步骤Sa具体包括:
    Sa1,实时计算遥控器与无人机的实时距离S;
    Sa2,当所述实时距离S大于危险距离时,无人机正常飞行,当所述实时距离S小于或等于危险距离时,无人机悬停。
  19. 根据权利要求18所述的无头控制的方法,其特征在于,所述步骤Sa1具体包括:
    实时获取遥控器的纬度lat1和经度lon1,以及无人机的纬度lat2和经度lon2;
    计算遥控器和无人机的纬度差θ=lat1-lat2,和经度差b=lon1-lon2;
    根据lat1、lon1、lat2、lon2、θ、b和earth_radius计算遥控器与无人机的实时距离S,其中,earth_radius是地球半径。
  20. 根据权利要求19所述的无头控制的方法,其特征在于,所述实时距离S为:
    Figure PCTCN2016086473-appb-100006
  21. 根据权利要求19所述的无头控制的方法,其特征在于,所述航向角theta为:
    theta=arctan2f(sin(b)*cos(lat2),cos(lat1)*cos(lat2)*sin(lat2)-sin(lat1)*cos(lat2)*cos(b))。
  22. 根据权利要求21所述的无头控制的方法,其特征在于,所述旋转矩阵DCM为:
    Figure PCTCN2016086473-appb-100007
    其中,pitch=0,roll=0,yaw=theta。
  23. 根据权利要求16-22任一项所述的无头控制的方法,其特征在于,所述步骤S4具体包括:
    遥控器接收用户输入的初始控制量V;
    计算无人机的无头姿态控制量C为:
    C=DCM*V。
  24. 根据权利要求23所述的无头控制的方法,其特征在于,所述初始控制量V为:
    Figure PCTCN2016086473-appb-100008
    其中,Vpitch是遥控器接收的无人机前后移动的控制量,Vroll是遥控器接收的无人机横向移动的控制量,Vthrust是遥控器接收的无人机上下移动的控制量。
  25. 根据权利要求23所述的无头控制的方法,其特征在于,所述无人机的无头姿态控制量C为:
    Figure PCTCN2016086473-appb-100009
    其中,Cpitch是无头控制下的无人机前后移动的姿态控制量,Croll是无头控制下的无人机横向移动的姿态控制量,Cthrust是无头控制下的无人机上下移动的姿态控制量。
PCT/CN2016/086473 2015-12-31 2016-06-20 体感遥控器、体感遥控飞行系统和方法、无头控制方法 WO2017113648A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP16880456.5A EP3399380B1 (en) 2015-12-31 2016-06-20 Headless control method
US16/067,557 US11327477B2 (en) 2015-12-31 2016-06-20 Somatosensory remote controller, somatosensory remote control flight system and method, and head-less control method

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN201511032422.8 2015-12-31
CN201511032422.8A CN105469579B (zh) 2015-12-31 2015-12-31 体感遥控器、体感遥控飞行系统和方法
CN201620268295.5U CN205608991U (zh) 2016-04-01 2016-04-01 用于无人机的体感遥控器
CN201620268295.5 2016-04-01
CN201610299333.8A CN107346141A (zh) 2016-05-06 2016-05-06 一种体感控制方法
CN201610297215.3A CN107346140B (zh) 2016-05-06 2016-05-06 一种无头控制的方法
CN201610299333.8 2016-05-06
CN201610297215.3 2016-05-06

Publications (1)

Publication Number Publication Date
WO2017113648A1 true WO2017113648A1 (zh) 2017-07-06

Family

ID=59224333

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/086473 WO2017113648A1 (zh) 2015-12-31 2016-06-20 体感遥控器、体感遥控飞行系统和方法、无头控制方法

Country Status (3)

Country Link
US (1) US11327477B2 (zh)
EP (1) EP3399380B1 (zh)
WO (1) WO2017113648A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020502714A (ja) * 2016-12-15 2020-01-23 パワーヴィジョン・ロボット・インコーポレイテッド 遠隔コントローラを用いるドローン用の制御システムおよび方法
CN108885493A (zh) * 2017-12-22 2018-11-23 深圳市大疆创新科技有限公司 体感控制器控制云台的方法、云台、体感控制器和系统
US20190324447A1 (en) * 2018-04-24 2019-10-24 Kevin Michael Ryan Intuitive Controller Device for UAV
CN112136164B (zh) * 2019-09-30 2022-07-29 深圳市大疆创新科技有限公司 支架结构及可移动平台
US20230033760A1 (en) * 2019-12-23 2023-02-02 AirSelfie, Inc. Aerial Camera Device, Systems, and Methods
CN114153227B (zh) * 2021-11-30 2024-02-20 重庆大学 一种基于gps信号的无人机机群密钥提取与安全认证方法

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102005006993A1 (de) * 2005-02-15 2006-09-14 Stefan Reich System zur Steuerung unbemannter Luftfahrzeuge
CN101916109A (zh) * 2010-07-16 2010-12-15 王靖微 体感式电视瞄准仪
US20120022719A1 (en) * 2006-09-06 2012-01-26 Matos Jeffrey A Systems and methods for detecting and managing the unauthorized use of an unmanned aircraft
CN104808675A (zh) * 2015-03-03 2015-07-29 广州亿航智能技术有限公司 基于智能终端的体感飞行操控系统及终端设备
CN105469579A (zh) * 2015-12-31 2016-04-06 北京臻迪机器人有限公司 体感遥控器、体感遥控飞行系统和方法

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TW200929014A (en) * 2007-12-17 2009-07-01 Omni Motion Technology Corp Method that controls a controlled device by detecting movement of a hand-held control device, and the hand-held control device
US8200375B2 (en) 2008-02-12 2012-06-12 Stuckman Katherine C Radio controlled aircraft, remote controller and methods for use therewith
US20100228406A1 (en) * 2009-03-03 2010-09-09 Honeywell International Inc. UAV Flight Control Method And System
CN101598556B (zh) 2009-07-15 2011-05-04 北京航空航天大学 Visual/inertial integrated navigation method for unmanned aerial vehicles in unknown environments
EP2486728A4 (en) 2009-10-05 2015-10-21 Lumexis Corp IN-CAR RADIO COMMUNICATION SYSTEM
US8577535B2 (en) * 2010-03-31 2013-11-05 Massachusetts Institute Of Technology System and method for providing perceived first-order control of an unmanned vehicle
US8456329B1 (en) 2010-06-03 2013-06-04 The United States Of America As Represented By The Secretary Of The Navy Wand controller for aircraft marshaling
US8774982B2 (en) * 2010-08-26 2014-07-08 Leptron Industrial Robotic Helicopters, Inc. Helicopter with multi-rotors and wireless capability
CN103218059B (zh) 2012-01-19 2016-12-14 上海仪电数字技术有限公司 Three-dimensional remote control device and positioning method thereof
US20140008496A1 (en) * 2012-07-05 2014-01-09 Zhou Ye Using handheld device to control flying object
CN203253153U (zh) 2013-04-10 2013-10-30 吴俣余 Somatosensory controller
WO2014187027A1 (zh) 2013-05-22 2014-11-27 上海九鹰电子科技有限公司 Apparatus and method for transmitting remote-control signals, and apparatus and method for receiving the same
CN203315750U (zh) 2013-06-09 2013-12-04 北京虎渡能源科技有限公司 Control platform for flight-type amusement attractions
CN103712598B (zh) 2013-12-31 2014-12-17 渤海大学 Attitude determination method for small unmanned aerial vehicles
CN103940442B (zh) 2014-04-03 2018-02-27 深圳市宇恒互动科技开发有限公司 Positioning method and device using an accelerated convergence algorithm
JP6293304B2 (ja) * 2014-05-21 2018-03-14 SZ DJI Technology Co., Ltd. Remote control device, control system, and control method
CN105517666B (zh) * 2014-09-05 2019-08-27 深圳市大疆创新科技有限公司 Context-based flight mode selection
CN104298248B (zh) 2014-10-08 2018-02-13 南京航空航天大学 Precise visual positioning and orientation method for rotary-wing UAVs
CN104536453B (zh) 2014-11-28 2017-08-04 深圳一电航空技术有限公司 Aircraft control method and device
CN104714556B (zh) 2015-03-26 2017-08-11 清华大学 Intelligent heading control method for unmanned aerial vehicles
CN104898699B (zh) * 2015-05-28 2020-03-17 小米科技有限责任公司 Flight control method and device, and electronic device
CN105223959B (zh) 2015-09-28 2018-07-13 佛山市南海区广工大数控装备协同创新研究院 UAV glove control system and control method
CN105278544B (zh) 2015-10-30 2018-05-08 小米科技有限责任公司 Control method and device for unmanned aerial vehicles
US10197998B2 (en) * 2015-12-27 2019-02-05 Spin Master Ltd. Remotely controlled motile device system
CN105549608A (zh) 2016-02-29 2016-05-04 深圳飞豹航天航空科技有限公司 UAV orientation adjustment method and system

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3399380A4 *

Also Published As

Publication number Publication date
US11327477B2 (en) 2022-05-10
EP3399380A1 (en) 2018-11-07
US20190004509A1 (en) 2019-01-03
EP3399380B1 (en) 2021-12-29
EP3399380A4 (en) 2019-12-18

Similar Documents

Publication Publication Date Title
CN105469579B (zh) Somatosensory remote controller, somatosensory remote-control flight system and method
WO2017113648A1 (zh) Somatosensory remote controller, somatosensory remote-control flight system and method, and headless control method
US11042074B2 (en) Flying camera with string assembly for localization and interaction
US11204611B2 (en) Assisted takeoff
JP6671375B2 (ja) Flight assistance method for unmanned aircraft
JP6816156B2 (ja) Systems and methods for adjusting UAV trajectory
CN104808675B (zh) Smart-terminal-based somatosensory flight control system and terminal device
CN109219785B (zh) Multi-sensor calibration method and system
JP6212788B2 (ja) Method for operating an unmanned aerial vehicle, and unmanned aerial vehicle
WO2016192249A1 (zh) Aircraft control method and device
WO2017109758A1 (en) Systems and methods for controlling an unmanned aerial vehicle
JP4012749B2 (ja) Remote piloting system
JP6767802B2 (ja) Unmanned aerial vehicle and flight control method therefor
JP6829513B1 (ja) Position calculation method and information processing system
US20210181769A1 (en) Movable platform control method, movable platform, terminal device, and system
KR20180025416A (ko) Drone flight control system and method using motion recognition and virtual reality
CN205563980U (zh) Somatosensory remote controller
JP7289152B2 (ja) Flight control system
JP2021036452A (ja) Systems and methods for adjusting UAV trajectory

Legal Events

Date Code Title Description
121 Ep: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 16880456; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
WWE WIPO information: entry into national phase (Ref document number: 2016880456; Country of ref document: EP)
ENP Entry into the national phase (Ref document number: 2016880456; Country of ref document: EP; Effective date: 20180731)