WO2018184232A1 - Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle - Google Patents

Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle

Info

Publication number
WO2018184232A1
Authority
WO
WIPO (PCT)
Prior art keywords
remote control
somatosensory
information
command
remote
Prior art date
Application number
PCT/CN2017/079790
Other languages
English (en)
Chinese (zh)
Inventor
缪宝杰
刘怀宇
吴一凡
Original Assignee
深圳市大疆创新科技有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司
Priority to PCT/CN2017/079790 priority Critical patent/WO2018184232A1/fr
Priority to CN201780013879.5A priority patent/CN108700893A/zh
Publication of WO2018184232A1 publication Critical patent/WO2018184232A1/fr


Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C17/00 Arrangements for transmitting signals characterised by the use of a wireless electrical link
    • G08C17/02 Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
    • G PHYSICS
    • G08 SIGNALLING
    • G08C TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
    • G08C19/00 Electric signal transmission systems

Definitions

  • The invention belongs to the technical field of remote control, in particular the control of unmanned aerial vehicles, and relates to a somatosensory remote control method and a corresponding somatosensory control device, as well as to a corresponding gimbal and unmanned aerial vehicle.
  • the invention can also be applied to other devices that require remote control, such as various unmanned vehicles.
  • Unmanned aerial vehicles have the advantages of good maneuverability, low cost and convenient use, and have been applied in many industries, such as aerial photography, agricultural plant protection, surveying and so on.
  • A virtual reality head-mounted display device, i.e. a VR (Virtual Reality) head display, includes VR glasses, VR eye masks, VR helmets, and the like.
  • The UAV can be connected with the virtual reality head-mounted display device to realize a first-person perspective: the image captured by the camera device on the UAV can be transmitted back to the virtual reality head-mounted display device in real time through a mobile terminal.
  • the controller can directly control the throttle, attitude angle and flight speed of the aircraft to achieve very precise control of the aircraft.
  • Currently, in the field of VR glasses there is no specific VR glasses product for maneuvering unmanned aerial vehicles; most products can only display the picture and cannot control the attitude of the drone.
  • An aspect of the present invention provides a somatosensory remote control method for remotely controlling an execution device by a somatosensory method, comprising: acquiring somatosensory information; generating a remote control command for controlling the execution device according to the somatosensory information; and having the execution device perform an action according to the remote control command.
  • a somatosensory control apparatus comprising: an acquisition module for acquiring somatosensory information; and an instruction generation module for generating a remote control instruction for controlling the execution device based on the somatosensory information.
  • A gimbal comprising a receiving module configured to receive a remote control command, the remote control command being generated from somatosensory information; and an execution module configured to perform an action according to the remote control command.
  • an unmanned aerial vehicle includes a receiving module for receiving a remote control command, the remote control command being generated by the somatosensory information, and an execution module for performing an action according to the remote control command.
  • Another aspect of the present invention provides a VR body sensing device including the somatosensory control device.
  • Figure 1 shows a block diagram of one embodiment of a somatosensory control device of the present invention.
  • Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention.
  • Fig. 3 is a diagram showing an embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • Fig. 4 is a block diagram showing the module of the somatosensory control device and the executing device in the embodiment of Fig. 3.
  • Figs. 5 to 8 are schematic diagrams showing the control areas of the somatosensory control device and the corresponding postures of the VR somatosensory glasses.
  • Fig. 9 is a diagram showing another embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • Figs. 10 and 11 are block diagrams showing the configuration of the somatosensory control device, the executing device, and the remote control device of the executing device in the embodiment of Fig. 9.
  • the present invention provides a somatosensory remote control method in which the device is remotely controlled by a somatosensory method.
  • The term "somatosensation" as used herein refers to body perception, including the perception of various body state information of a person.
  • The body state information in turn includes body motion state information, such as motion time, motion speed, motion acceleration, motion angle, motion angular velocity, motion posture, and the like.
  • An execution device refers to any device or apparatus that can perform an action under command, such as a drone, or a gimbal on a drone.
  • A drone is usually remotely controlled by a remote control device with a control handle, and the operator cannot control the drone without the remote control handle. If the operator's own somatosensory information could be well utilized, the operator's operating load could be greatly reduced.
  • the starting point of the present invention is to utilize the operator's somatosensory information, that is, first, the somatosensory information needs to be acquired.
  • Any manner of acquiring somatosensory information can be utilized by the present invention; that is, the present invention is not limited as to how the somatosensory information is acquired or what type of somatosensory information is used.
  • the present invention entails generating a remote command for controlling the execution device based on the somatosensory information; the executing device can thereby perform an action in accordance with the remote command.
  • a drone can perform an action based on a remote command of a remote controller, which is well known.
  • the somatosensory information cannot be directly used as a remote control command. Therefore, the present invention proposes to convert the somatosensory information into a predetermined command format that can be read and recognized by the drone. Thereby, the executing device can perform the corresponding action in accordance with the remote control command in the usual manner.
  • the present invention is not limited to a specific device or element for acquiring somatosensory information, in view of convenience of handling and sense of presence, the present invention preferably uses a wearable device to collect somatosensory information.
  • the wearable device is, for example, a smart bracelet, a smart watch, a virtual helmet, smart glasses, smart sports shoes, etc.
  • the wearable device can collect motion information or posture information of at least one part of the user as the body feeling information. That is, the present invention can embed a somatosensory control device in a wearable device.
  • The present invention more preferably employs wearable devices with virtual reality (VR) functions, such as VR helmets, VR eye masks, VR glasses, etc.
  • A VR (virtual reality) device presents the shooting picture so that the operator experiences a first-person immersive feeling.
  • However, existing VR devices do not have the function of obtaining the operator's somatosensory information.
  • The present invention therefore proposes to give the VR device the function of obtaining somatosensory information, thereby converting the movement of a person, or of a part of a person, into manipulation information for the unmanned vehicle, so that the operator obtains an immersive manipulation feeling.
  • In the following, VR glasses are described as the carrier of the somatosensory control device by way of example.
  • FIG. 1 shows a block diagram of one embodiment of a somatosensory control device of the present invention.
  • The somatosensory control device 10 can be included in VR somatosensory glasses 1 and includes an acquisition module 11 and an instruction generation module 12.
  • the acquisition module 11 is configured to acquire the somatosensory information
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device (not shown) according to the somatosensory information.
  • The somatosensory information acquired by the acquisition module 11, or the remote control command generated by the instruction generation module 12, can be directly transferred to the VR somatosensory glasses 1, and the VR somatosensory glasses 1 complete the transmission of the somatosensory information or the remote control command and the interaction with the execution device.
  • Fig. 2 is a block diagram showing another embodiment of the somatosensory control device of the present invention.
  • In this embodiment, the somatosensory control device 10 further includes a transmitting module 13, which is used to send the somatosensory information acquired by the acquisition module 11, or the remote control command generated by the instruction generation module 12, to the execution device or to the remote control device of the execution device. That is, the somatosensory control device 10 of this embodiment itself performs the transmission of the somatosensory information or the remote control command and the interaction with the execution device.
  • Fig. 3 is a diagram showing an embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • The somatosensory control device 10 is included in the VR somatosensory glasses 1, which generate a remote control command based on the somatosensory information acquired by the somatosensory control device 10; the remote control command is transmitted from the VR somatosensory glasses 1 to the drone 2.
  • The flight control module or the gimbal control module of the drone 2 serves as the execution device (not shown).
  • Fig. 4 is a block diagram showing the module of the somatosensory control device and the executing device in the embodiment of Fig. 3.
  • The somatosensory control device 10 includes an acquisition module 11, an instruction generation module 12, and a transmitting module 13.
  • the acquisition module 11 is configured to acquire the somatosensory information
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device 20 according to the somatosensory information.
  • the sending module 13 is configured to send the somatosensory information acquired by the acquiring module 11 or the remote control command generated by the command generating module 12 to the executing device 20.
  • the execution device 20 includes an execution module 21, a main control module 22, and a receiving module 23.
  • The execution module 21 is configured to perform a corresponding action in accordance with an instruction of the control module 22.
  • the execution module may be a component that controls the flight state of the drone, such as an engine, a rotor, a rudder, and the like.
  • the execution module may be an actuating mechanism that controls the attitude of the gimbal.
  • the invention is not limited to the specific configuration of the specific execution device and execution module.
  • the receiving module 23 of the executing device is configured to receive the somatosensory information or the remote control command from the transmitting module 13 of the somatosensory control device 10 by wire or wirelessly, and forward it to the control module 22.
  • The control module 22 generates the control instruction for the execution module 21 according to the somatosensory information or the received remote control command.
  • When the control module 22 receives the somatosensory information, it first generates a remote control command according to the somatosensory information, and then generates the control instruction for the execution module 21 according to that remote control command.
  • Before acquiring the somatosensory information, the acquisition module 11 further performs the step of acquiring the posture information of the somatosensory control device 10 at the moment the VR somatosensory glasses 1 are turned on.
  • The posture includes a direction, an angle, and the like, and the posture information of the somatosensory control device 10 at the moment the VR somatosensory glasses 1 are turned on is stored as reference information.
  • The step of acquiring the somatosensory information may generate the somatosensory information according to the difference between the current posture information of the somatosensory control device 10 and the reference information. Since the somatosensory control device 10 is built into the VR somatosensory glasses 1, the posture information of the somatosensory control device 10 can also be regarded as the posture information of the VR somatosensory glasses 1.
  • The reference information and the current attitude information of the VR somatosensory glasses 1 may be acquired by inertial measurement elements; that is, the acquisition device may be implemented by an inertial measurement element, which may conventionally include a gyroscope and an accelerometer.
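As an illustration of the reference-information scheme described above, the following sketch captures the pose at power-on and reports later differences from it. This is not code from the patent; the class name and the `read_yaw_deg` accessor are hypothetical stand-ins for whatever interface the inertial measurement element exposes.

```python
class PoseReference:
    """Sketch of the reference-information scheme described in the text.

    `read_yaw_deg` is a hypothetical accessor returning the current yaw
    angle of the glasses' inertial measurement element, in degrees.
    """

    def __init__(self, read_yaw_deg):
        self._read_yaw_deg = read_yaw_deg
        # Captured once, at the moment the VR glasses are turned on.
        self.reference_yaw_deg = read_yaw_deg()

    def yaw_difference_deg(self):
        """Current yaw minus the power-on reference, wrapped to [-180, 180)."""
        diff = self._read_yaw_deg() - self.reference_yaw_deg
        return (diff + 180.0) % 360.0 - 180.0
```

Immediately after power-on the difference is zero, which is the starting condition the embodiments below rely on.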
  • When the VR somatosensory glasses 1 are turned on, the instruction generation module generates a return-to-center command, which can be sent to the execution device 20 through the transmitting module, so that the control module 22 of the execution device 20 controls the execution device to assume an initial posture.
  • The initial attitude may be one in which the three axes of the drone or the gimbal are perpendicular to each other, with the heading axis (yaw axis) in the vertical direction.
  • the control area of the somatosensory control device 10 includes a control dead zone and a control active area.
  • When the difference between the current posture and the reference information falls within the control dead zone, the instruction generation module 12 does not generate a remote control command.
  • The control region of the somatosensory control device 10 further includes a control saturation region: when the difference between the current posture and the reference information is greater than or equal to a preset threshold, the remote control command remains unchanged.
  • Figs. 5 to 8 are schematic diagrams showing the control areas of the somatosensory control device and the corresponding postures of the VR somatosensory glasses.
  • the angle of the heading axis (yaw axis) is taken as an example of the attitude information, but it should be understood that the attitude information may be other angles.
  • The acquisition module acquires the current posture information of the somatosensory control device 10 and uses the posture information acquired at this time as the reference information, the current yaw axis angle serving as the reference yaw axis angle.
  • At this moment the yaw axis angle difference is 0.
  • The present invention sets an angle range as a dead zone: when the yaw axis angle difference is in the dead zone, the instruction generation module 12 of the somatosensory control device 10 does not generate a remote control command, so that no remote control command is sent to the execution device.
  • In this way no manipulation command is given to the execution device such as the drone, so that the drone and its gimbal do not react excessively to the user's minute movements; this ensures the stability of the drone's operating state.
  • the angular range of the dead zone is, for example, between -5 degrees and +5 degrees.
  • The attitude balls in Figs. 5 to 8 can be displayed on the screen of the VR somatosensory glasses 1 so that the user can intuitively feel the magnitude of the motion range and the accuracy of the operation.
  • When the yaw axis angle difference exceeds the dead zone, the "control effective area" shown in the figures is entered.
  • The instruction generation module 12 then generates a remote control command to manipulate the execution device 20 based on the magnitude of the yaw axis deflection.
  • Specifically, the difference between the current yaw axis angle and the dead zone boundary is taken as the control angle α, a remote control command is generated based on the angle α, and the execution device is instructed to act accordingly.
  • For example, the yaw axis of the drone can be controlled to deflect by the angle α, or by an angle associated with α.
  • When the execution device is the flight control module included in the drone, a remote control command deflecting the drone's yaw axis by a certain stick amount may be generated according to the angle α, the stick amount having a corresponding relationship with α.
  • When the yaw axis angle difference exceeds the maximum boundary of the control effective area, the "saturation control zone" shown in the figures is entered.
  • The instruction generation module 12 then no longer generates the remote control command according to the magnitude of the yaw axis deflection angle, but still generates a remote control command corresponding to the yaw axis deflection angle at the maximum boundary of the effective area.
  • Thus, even when the magnitude of the operator's motion is too large, no instruction is issued that the execution device cannot execute, or that would be dangerous or cause a malfunction during execution.
  • In this way it can be ensured that the drone and its gimbal operate within the design tolerance, ensuring the stability of flight or posture.
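The three control regions above can be put together as a single mapping from yaw difference to control angle. The ±5 degree dead zone is taken from the text; the saturation boundary value and the function name are illustrative assumptions, since the patent does not fix a numerical effective-area maximum.

```python
DEAD_ZONE_DEG = 5.0     # dead zone from the text: -5 to +5 degrees
SATURATION_DEG = 45.0   # assumed boundary of the control effective area

def control_angle(yaw_diff_deg):
    """Map the yaw difference from the reference pose to a control angle.

    Dead zone: return None, i.e. no remote control command is generated.
    Effective area: the angle beyond the dead-zone boundary is the
    control angle (the patent's alpha).
    Saturation zone: the control angle is held at the effective-area
    maximum, so an over-large motion cannot command an unsafe action.
    """
    magnitude = abs(yaw_diff_deg)
    if magnitude < DEAD_ZONE_DEG:
        return None
    alpha = magnitude - DEAD_ZONE_DEG
    alpha = min(alpha, SATURATION_DEG - DEAD_ZONE_DEG)
    return alpha if yaw_diff_deg > 0 else -alpha
```

A flight-control mapping would then translate α into a stick amount through some monotonic correspondence, as the text describes.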
  • Fig. 9 is a diagram showing another embodiment of the manner of information interaction between the somatosensory control device and the executing device of the present invention.
  • The somatosensory control device 10 is included in the VR somatosensory glasses 1, and the VR somatosensory glasses 1 generate a remote control command according to the somatosensory information acquired by the somatosensory control device 10; the remote control command is sent by the VR somatosensory glasses 1 to the drone 2.
  • The flight control module or the gimbal control module of the drone 2 acts as the execution device (not shown).
  • In this embodiment, the execution device also has a remote control device 3, such as the remote controller of the drone.
  • the somatosensory information acquired by the somatosensory control device 10 or the generated remote control command may be transmitted to the remote control device 3 instead of being directly transmitted to the executing device.
  • the remote control device 3 itself has a transceiver module that can directly transfer the somatosensory information or remote control commands to the execution device 2.
  • The somatosensory control device 10 includes an acquisition module 11, an instruction generation module 12, and a transmitting module 13.
  • the acquisition module 11 is configured to acquire the somatosensory information;
  • the instruction generation module 12 is configured to generate a remote control instruction for controlling the execution device 20 according to the somatosensory information.
  • the sending module 13 is configured to send the somatosensory information acquired by the acquiring module 11 or the remote control command generated by the command generating module 12 to the executing device 20.
  • the execution device 20 of this embodiment also includes an execution module 21, a main control module 22, and a receiving module 23.
  • the execution module 21 is configured to perform a corresponding action according to an instruction of the control module 22.
  • the receiving module 23 of the executing device is configured to receive the somatosensory information or the remote control command from the transmitting module 13 of the somatosensory control device 10 by wire or wirelessly, and forward it to the control module 22.
  • The control module 22 generates the control instruction for the execution module 21 according to the somatosensory information or the received remote control command.
  • When the control module 22 receives the somatosensory information, it first generates a remote control command according to the somatosensory information, and then generates the control instruction for the execution module 21 according to that remote control command.
  • In this embodiment, the somatosensory control device 10 may not send the somatosensory information or the remote control command directly to the execution device, but may instead transmit it to the remote control device 3.
  • the remote control device 3 itself has a transceiver module 31 and a control module 32.
  • The transceiver module 31 can directly forward the received somatosensory information or remote control command to the execution device 2, or process the received somatosensory information or remote control command and then send it to the execution device 2.
  • The processing may be the conversion of the somatosensory information into a remote control command, or a fusion of remote control commands; the fusion of remote control commands is further explained below.
  • Unmanned vehicles such as drones usually have matching remote controls, and the remote control itself can generate remote commands.
  • Here, the remote control command generated by the somatosensory control device 10 is referred to as the first remote control command, and the remote control command generated by the remote control device is referred to as the second remote control command.
  • the second remote command can be generated by a user input element such as a joystick.
  • the remote control device fuses the first remote control command and the second remote control command to form a third remote control command.
  • The fusion may be a direct superposition, a superposition according to predetermined weights, or a calculation according to a predetermined linear or non-linear formula, from which the third remote control command is finally obtained.
  • the merging operation can be implemented by the remote control device 3 or directly by the execution device.
  • Figures 10 and 11 show the two cases respectively: in the case shown in Figure 10, the fusion module 33 is a portion of the control module 32 of the remote control device 3; in the case shown in Figure 11, the fusion module 34 is a portion of the control module 22 of the execution device 2.
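One of the fusion options named above, superposition according to predetermined weights, might be sketched as follows. The dict-of-axes command representation and the weight values are assumptions for illustration, not part of the patent.

```python
def fuse_commands(first_cmd, second_cmd, weight_first=0.5):
    """Fuse the somatosensory (first) and joystick (second) commands.

    Commands are dicts mapping an axis name to a stick amount in
    [-1, 1]; the result is the third remote control command.
    """
    weight_second = 1.0 - weight_first
    third_cmd = {}
    for axis in set(first_cmd) | set(second_cmd):
        fused = (weight_first * first_cmd.get(axis, 0.0)
                 + weight_second * second_cmd.get(axis, 0.0))
        # Clamp so the fused stick amount stays in the valid range.
        third_cmd[axis] = max(-1.0, min(1.0, fused))
    return third_cmd
```

Setting `weight_first = 1.0` reduces this to pure somatosensory control, one way to realize a priority of somatosensory control over the remote controller.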
  • In this way, using the posture of the VR somatosensory glasses to control the yaw of the drone and the pitch of the gimbal is realized.
  • The yaw (left-right turning) motion of the glasses controls the yaw of the aircraft, and the pitch (nodding) motion of the glasses controls the pitch of the gimbal.
  • When the VR somatosensory glasses are turned on, the glasses periodically sample the yaw axis and pitch axis data of their own IMU (Inertial Measurement Unit), calculate the angular difference (including yaw and pitch components) from the reference pose, and then use this difference as the angular offset of the gimbal or drone, generating a remote control command that is sent to the drone or its gimbal. After the drone or gimbal receives this angle-control remote command, it adjusts its yaw and pitch angles according to the yaw and pitch components of the angular difference indicated by the command.
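The periodic sampling loop just described can be sketched as below. The sampling period, the callback names, and the command format are all illustrative assumptions; the patent only states that the glasses sample their IMU periodically and send the angular difference as an offset.

```python
import time

def somatosensory_loop(sample_imu, send_command, reference,
                       period_s=0.05, glasses_on=lambda: True):
    """Periodically turn IMU readings into angle-offset remote commands.

    sample_imu() -> (yaw_deg, pitch_deg) from the glasses' IMU.
    send_command(cmd) forwards the command to the drone or gimbal.
    reference is the (yaw, pitch) pose captured at power-on.
    """
    ref_yaw, ref_pitch = reference
    while glasses_on():              # loop stops when the glasses turn off
        yaw, pitch = sample_imu()
        send_command({
            "yaw_offset_deg": yaw - ref_yaw,        # drives the drone's yaw
            "pitch_offset_deg": pitch - ref_pitch,  # drives the gimbal's pitch
        })
        time.sleep(period_s)
```

In a real system the dead-zone and saturation mapping would be applied to each component before the command is sent.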
  • In one embodiment, "somatosensory control" has a higher priority than "remote control"; it is therefore possible to set the control of the remote control device to be ineffective while somatosensory control is active. Further, as a specific embodiment, when the VR somatosensory glasses are turned off, the periodic acquisition of somatosensory information stops, and at this time a remote control command is transmitted to control the return of the drone or the gimbal serving as the execution device.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Theoretical Computer Science (AREA)
  • Selective Calling Equipment (AREA)

Abstract

The invention relates to a somatosensory remote control method, and to a somatosensory control apparatus using said method, for remotely controlling an actuator by means of somatosensation. The somatosensory remote control method comprises: acquiring somatosensory information; generating, according to said somatosensory information, a remote control instruction for controlling the actuator; and the actuator executing an action according to the remote control instruction. The somatosensory control apparatus comprises at least: an acquisition module for acquiring somatosensory information; and an instruction generation module for generating, according to the somatosensory information, a remote control instruction for controlling the execution apparatus. The present invention also relates to a corresponding gimbal and unmanned aerial vehicle.
PCT/CN2017/079790 2017-04-07 2017-04-07 Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle WO2018184232A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2017/079790 WO2018184232A1 (fr) 2017-04-07 2017-04-07 Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle
CN201780013879.5A CN108700893A (zh) 2017-04-07 2017-04-07 Somatosensory remote control method, control device, gimbal and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/079790 WO2018184232A1 (fr) 2017-04-07 2017-04-07 Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle

Publications (1)

Publication Number Publication Date
WO2018184232A1 true WO2018184232A1 (fr) 2018-10-11

Family

ID=63712994

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/079790 WO2018184232A1 (fr) 2017-04-07 2017-04-07 Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle

Country Status (2)

Country Link
CN (1) CN108700893A (fr)
WO (1) WO2018184232A1 (fr)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111247494A (zh) * 2019-01-29 2020-06-05 深圳市大疆创新科技有限公司 Control method and device for a movable platform, and movable platform
CN110162088B (zh) * 2019-05-16 2022-01-04 沈阳无距科技有限公司 Unmanned aerial vehicle control method and device, unmanned aerial vehicle, wearable device and storage medium
WO2020237429A1 (fr) * 2019-05-24 2020-12-03 深圳市大疆创新科技有限公司 Control method for a remote control device, and remote control device
CN111123965A (zh) * 2019-12-24 2020-05-08 中国航空工业集团公司沈阳飞机设计研究所 Somatosensory operation method and operation platform for aircraft control
CN114830070A (zh) * 2020-12-25 2022-07-29 深圳市大疆创新科技有限公司 Flight guidance method, motor calibration method, display device, and readable storage medium
CN116710870A (zh) * 2021-03-16 2023-09-05 深圳市大疆创新科技有限公司 Control method and device based on a somatosensory remote controller, and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN205103661U (zh) * 2015-07-24 2016-03-23 刘思成 Unmanned aerial vehicle control system based on somatosensory control technology
CN105469579A (zh) * 2015-12-31 2016-04-06 北京臻迪机器人有限公司 Somatosensory remote controller, somatosensory remote-controlled flight system and method
CN105739525A (zh) * 2016-02-14 2016-07-06 普宙飞行器科技(深圳)有限公司 System for realizing virtual flight in cooperation with somatosensory operation
WO2016168117A2 (fr) * 2015-04-14 2016-10-20 John James Daniels Wearable, multi-sensory, electrical human/human and machine/human interfaces
CN106155090A (zh) * 2016-08-29 2016-11-23 电子科技大学 Wearable unmanned aerial vehicle control device based on somatosensation

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808675B (zh) * 2015-03-03 2018-05-04 广州亿航智能技术有限公司 Somatosensory flight control system based on an intelligent terminal, and terminal device
US9836117B2 (en) * 2015-05-28 2017-12-05 Microsoft Technology Licensing, Llc Autonomous drones for tactile feedback in immersive virtual reality
CN105828062A (zh) * 2016-03-23 2016-08-03 常州视线电子科技有限公司 Unmanned aerial vehicle 3D virtual reality shooting system
CN106227231A (zh) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 Control method for an unmanned aerial vehicle, somatosensory interaction device, and unmanned aerial vehicle


Also Published As

Publication number Publication date
CN108700893A (zh) 2018-10-23

Similar Documents

Publication Publication Date Title
WO2018184232A1 (fr) Somatosensory remote control method, control apparatus, gimbal and unmanned aerial vehicle
JP6642432B2 (ja) Information processing device, information processing method, and image display system
CN108769531B (zh) Method for controlling the shooting angle of a photographing device, control device, and remote controller
WO2015154627A1 (fr) Virtual reality element system
US20120004791A1 (en) Teleoperation method and human robot interface for remote control of a machine by a human operator
JP4012749B2 (ja) Remote control system
KR20170090888A (ko) Unmanned aerial vehicle piloting apparatus using an HMD
JP7390541B2 (ja) Animation production system
US11804052B2 (en) Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path
CN106327583A (zh) Virtual reality device for realizing panoramic photography and realization method thereof
WO2019019398A1 (fr) Remote control and unmanned aerial vehicle system
JP2021193613A (ja) Animation production method
Xia et al. A 6-DOF telexistence drone controlled by a head mounted display
JP6744033B2 (ja) Mobile object maneuvering system, maneuvering signal transmission system, mobile object maneuvering method, program, and recording medium
CN108475064B (zh) Method, device, and computer-readable storage medium for device control
JP6964302B2 (ja) Animation production method
O'Keeffe et al. Oculus rift application for training drone pilots
KR101973174B1 (ko) Gesture-recognition-based apparatus and method for controlling an unmanned aerial vehicle
WO2021220407A1 (fr) Head-mounted display device and display control method
WO2024000189A1 (fr) Control method, head-mounted display device, control system, and storage medium
Martínez-Carranza et al. On combining wearable sensors and visual SLAM for remote controlling of low-cost micro aerial vehicles
JP7115698B2 (ja) Animation production system
JP7218873B2 (ja) Animation production system
WO2024125465A1 (fr) Viewing-angle-following display system, operation system, reconstruction sensing control system, and control method
KR102263227B1 (ко) Augmented-reality-based method and apparatus for controlling an unmanned vehicle using relative navigation information

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 17904817; Country of ref document: EP; Kind code of ref document: A1)

NENP Non-entry into the national phase (Ref country code: DE)

122 Ep: pct application non-entry in european phase (Ref document number: 17904817; Country of ref document: EP; Kind code of ref document: A1)