WO2018053824A1 - Drone control method, head-mounted display glasses and system - Google Patents

Drone control method, head-mounted display glasses and system

Info

Publication number
WO2018053824A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
screen
drone
head
information
Prior art date
Application number
PCT/CN2016/100072
Other languages
English (en)
French (fr)
Inventor
吴一凡
刘怀宇
吴军
龚明
高修峰
Original Assignee
深圳市大疆创新科技有限公司
Priority date
Filing date
Publication date
Application filed by 深圳市大疆创新科技有限公司 filed Critical 深圳市大疆创新科技有限公司
Priority to JP2019516208A priority Critical patent/JP6851470B2/ja
Priority to PCT/CN2016/100072 priority patent/WO2018053824A1/zh
Priority to CN201680003670.6A priority patent/CN107454947A/zh
Priority to EP16916563.6A priority patent/EP3511758A4/en
Publication of WO2018053824A1 publication Critical patent/WO2018053824A1/zh
Priority to US16/363,497 priority patent/US20190220040A1/en

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0016Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement characterised by the operator's input device
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G05D1/0816Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability
    • G05D1/085Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft to ensure stability to ensure coordination between different movements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012Head tracking input arrangements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0038Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with simple or augmented images from one or more cameras located onboard the vehicle, e.g. tele-operation
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0011Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
    • G05D1/0044Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement by providing the operator with a computer generated representation of the environment of the vehicle, e.g. virtual reality, maps
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04842Selection of displayed objects or displayed text elements
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00UAVs specially adapted for particular uses or applications
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2201/00UAVs characterised by their flight controls
    • B64U2201/20Remote controls

Definitions

  • Embodiments of the present invention relate to the field of drones, and in particular, to a drone control method, head-mounted display glasses, and a system.
  • In the prior art, control devices such as a mobile phone, a remote controller with a touch screen, and virtual reality (VR) glasses can be used to control a drone. An application for controlling the drone is installed on the mobile phone, and the user controls the drone by operating the application and touching the screen of the mobile phone; the user can also control the drone by touching the screen of the remote controller.
  • However, VR glasses lack a touch screen, so touch-based control methods for the drone, such as the pointing-flight mode, the smart-follow mode, and camera focusing, cannot be realized.
  • Embodiments of the present invention provide a drone control method, head-mounted display glasses, and a system, to achieve the purpose of somatosensory control of a drone.
  • An aspect of embodiments of the present invention provides a drone control method, including: displaying a movable marker; acquiring input information of a user; and moving the movable marker according to the input information to control the drone.
  • Another aspect of embodiments of the present invention provides head-mounted display glasses, including one or more processors and a screen, where the screen is used to display a movable marker, and the one or more processors are used to: acquire input information of a user; and move the movable marker according to the input information to control a drone.
  • Another aspect of embodiments of the present invention provides a drone control system, including: a drone, and the head-mounted display glasses, where the head-mounted display glasses are used to control the drone.
  • According to the drone control method, head-mounted display glasses, and system provided by the embodiments of the present invention, the user's input information is acquired through the head-mounted display glasses. The input information may be the user's posture information, or direction information for moving the cursor input by the user through an input device; that is, the head-mounted display glasses can control the cursor to move on the screen according to the user's posture information, or according to the direction information for moving the cursor input by the user through the input device. When the cursor moves to a target point on the screen, the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing control methods such as pointing flight, smart following, and camera focusing.
  • FIG. 1 is a flowchart of a drone control method according to an embodiment of the present invention;
  • FIG. 1A is a schematic diagram of a "pointing flight" control interface of a drone according to an embodiment of the present invention;
  • FIG. 1B is a schematic diagram of a "smart follow" control interface of a drone according to an embodiment of the present invention;
  • FIG. 2 is a flowchart of a drone control method according to another embodiment of the present invention;
  • FIG. 2A is a top view of VR glasses and a user's head according to another embodiment of the present invention;
  • FIG. 2B is a schematic diagram of a cursor moving to the left on a screen according to another embodiment of the present invention;
  • FIG. 2C is a top view of VR glasses and a user's head according to another embodiment of the present invention;
  • FIG. 2D is a schematic diagram of a cursor moving to the right on a screen according to another embodiment of the present invention;
  • FIG. 2E is a side view of VR glasses and a user's head according to another embodiment of the present invention;
  • FIG. 2F is a schematic diagram of a cursor moving downward on a screen according to another embodiment of the present invention;
  • FIG. 2G is a side view of VR glasses and a user's head according to another embodiment of the present invention;
  • FIG. 2H is a schematic diagram of a cursor moving upward on a screen according to another embodiment of the present invention;
  • FIG. 2I is a schematic diagram of the coordinates of a cursor on a screen according to another embodiment of the present invention;
  • FIG. 3 is a flowchart of a drone control method according to another embodiment of the present invention;
  • FIG. 4 is a flowchart of a drone control method according to another embodiment of the present invention;
  • FIG. 5 is a structural diagram of head-mounted display glasses according to an embodiment of the present invention;
  • FIG. 6 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention;
  • FIG. 7 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention;
  • FIG. 8 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention.
  • It should be noted that when a component is referred to as being "fixed to" another component, it can be directly on the other component, or an intervening component may also be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component, or an intervening component may be present at the same time.
  • FIG. 1 is a flowchart of a method for controlling a drone according to an embodiment of the present invention.
  • The execution body of this embodiment may be a drone control terminal for controlling the drone. The drone control terminal may include, but is not limited to, head-mounted display glasses (VR glasses, a VR helmet, etc.), a mobile phone, a remote controller (such as a remote controller with a display screen), a smart bracelet, a tablet computer, and the like.
  • The drone can operate in different modes, including but not limited to pointing flight, smart follow, and camera focus.
  • In the pointing-flight mode, the user can select a flight target by clicking a point or an area on a display device (such as a screen) of the drone control terminal, and the drone can fly toward the flight target.
  • In the smart-follow mode, the user can control the drone to follow a movable object (such as a person or an animal) by selecting the movable object on the display device (such as a screen) of the drone control terminal.
  • In the camera-focus mode, the user can control the imaging device (such as a camera) of the drone to focus by clicking a point or an area on the display device (such as a screen) of the drone control terminal.
  • the method in this embodiment may include:
  • Step S101: Display a movable marker.
  • In this embodiment, the drone control terminal may be a virtual reality head-mounted display device. The method in this embodiment is applicable to a user controlling a drone through a virtual reality head-mounted display device, which may specifically be virtual reality (VR) glasses or a VR helmet.
  • The user wears the VR glasses, and the screen of the VR glasses displays a movable marker; the movable marker is specifically a cursor, and the cursor can move on the screen of the VR glasses.
  • Step S102: Acquire input information of the user.
  • In this embodiment, the VR glasses can acquire the user's input information. The input information may be the user's posture information, or direction information input by the user through an input device for moving the movable marker.
  • The VR glasses are equipped with an inertial measurement unit (IMU), which can sense the position and posture of the VR glasses. When the user wears the VR glasses, the glasses can acquire the user's posture information through the IMU. Specifically, the IMU can be used to sense the posture information of the user's head, and the posture information includes at least one of a yaw angle and a pitch angle.
  • The VR glasses control the position of the cursor on the screen according to the posture information of the user's head sensed by the IMU. From the user's point of view, the user can control the cursor to move on the screen by turning the head.
  • In addition, the VR glasses may also be equipped with a sensor that senses the user's binocular motion and eye (such as eyeball) changes, for example an eye tracker. The VR glasses can also acquire the user's posture information through this sensor: the glasses sense the user's binocular motion and eye changes through the sensor and control the cursor to move on the screen accordingly. From the user's point of view, the user can control the cursor to move on the screen through binocular motion and eye changes.
  • In addition, the VR glasses may be connected to an input device, where the input device includes at least one of the following: a five-dimensional key, a joystick, and a touch screen. The connection between the VR glasses and the input device may be wired (such as a cable) or wireless (such as Wi-Fi or Bluetooth).
  • In some embodiments, the VR glasses may include a remote controller provided with a five-dimensional key, and the VR glasses control the cursor to move on the screen according to direction input generated when the user clicks the up, down, left, and right keys of the five-dimensional key. The VR glasses can also control the cursor to move on the screen according to direction input generated when the user moves the joystick, or according to input generated by the user's sliding or touch operations on the touch screen.
  • In addition, a touchpad is provided on the frame of the VR glasses, and the user can also control the cursor to move on the screen by operating the touchpad.
  • In addition, virtual buttons may also be displayed on the screen of the VR glasses. The virtual buttons may be similar to a five-dimensional key, including virtual buttons for the four directions of up, down, left, and right, and a virtual button for confirmation; the user can also control the cursor to move on the screen by operating the virtual buttons.
  • Step S103: Move the movable marker according to the input information to control the drone.
  • As can be seen from the above steps, the VR glasses can control the cursor to move on the screen according to the user's posture information, or according to the direction information input by the user through the input device for moving the movable marker.
  • The VR glasses can convert the cursor's movement on the screen into instructions by which the user controls the drone through the VR glasses; for example, when the cursor moves to the right on the screen, the drone is controlled to fly to the right. In some embodiments, the VR glasses can convert the cursor's movement on the screen into instructions by which the user controls an onboard device of the drone; for example, when the cursor moves to the right on the screen, the imaging device (such as a camera) of the drone is controlled to move to the right.
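As an illustration of this kind of mapping, the sketch below converts a cursor offset on the screen into a velocity command. It is a minimal sketch only, not the patent's implementation; the names send_velocity_command, DEADZONE, and GAIN are hypothetical.

```python
# Illustrative sketch only: translating cursor motion into a control command.
# All names (send_velocity_command, DEADZONE, GAIN) are hypothetical.

DEADZONE = 5   # ignore cursor jitter below this many pixels
GAIN = 0.01    # meters/second of speed per pixel of cursor offset


def cursor_to_drone_command(dx_pixels, dy_pixels, send_velocity_command):
    """Map a cursor offset on the screen to a velocity command.

    dx_pixels > 0 (cursor moved right) -> fly right, as in the example above;
    dy_pixels > 0 (cursor moved up)    -> climb.
    """
    vx = GAIN * dx_pixels if abs(dx_pixels) > DEADZONE else 0.0
    vz = GAIN * dy_pixels if abs(dy_pixels) > DEADZONE else 0.0
    send_velocity_command(lateral=vx, vertical=vz)


# Example with a stub transport: cursor moved 40 px to the right.
cursor_to_drone_command(40, 0, lambda **kw: print(kw))
```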
  • In addition, a user interface is displayed on the screen of the VR glasses, and the VR glasses can control the drone to enter the smart-follow mode, the pointing-flight mode, or the camera-focus mode according to the user's operations on the user interface. A specific implementation may be as follows: the user interface includes icons for controlling the drone, such as a "pointing flight" icon, a "smart follow" icon, and a "camera focus" icon; the VR glasses control the cursor to move on the screen according to the user's head rotation, eye movement, clicks on the up, down, left, and right keys of the five-dimensional key, movement of the joystick, or sliding/touch operations on the touch screen or touchpad; when the cursor moves onto an icon, the user confirms selection of the icon, and the VR glasses control the drone to enter the mode corresponding to the icon according to the confirmation information input by the user.
  • The user can confirm selection of the icon in any of the following ways: clicking the confirmation key of the five-dimensional key, clicking the touch screen, clicking the touchpad, keeping the eyes still for a predetermined period of time, blinking the eyes continuously, letting the cursor stay on the icon for a predetermined period of time, clicking a virtual button, or operating the joystick.
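Of these confirmation methods, the dwell-based one ("the cursor stays on the icon for a predetermined period of time") is easy to make concrete. The following is a minimal illustrative sketch, not the patent's implementation; the class name, the 1.5 s threshold, and the update-per-frame convention are all assumptions.

```python
import time

DWELL_SECONDS = 1.5  # hypothetical "predetermined period of time"


class DwellSelector:
    """Confirms selection once the cursor has stayed on one icon long enough."""

    def __init__(self, dwell_seconds=DWELL_SECONDS):
        self.dwell = dwell_seconds
        self.current_icon = None
        self.enter_time = None

    def update(self, icon_under_cursor, now=None):
        """Call once per frame with the icon under the cursor (or None).

        Returns the icon to select, or None if no selection has occurred yet.
        """
        now = time.monotonic() if now is None else now
        if icon_under_cursor != self.current_icon:
            # Cursor moved onto a different icon (or off all icons): restart timer.
            self.current_icon = icon_under_cursor
            self.enter_time = now
            return None
        if icon_under_cursor is not None and now - self.enter_time >= self.dwell:
            self.enter_time = now  # reset so the icon is not re-selected every frame
            return icon_under_cursor
        return None


# Example: the cursor rests on "pointing flight" for 2 simulated seconds.
selector = DwellSelector()
assert selector.update("pointing flight", now=0.0) is None
assert selector.update("pointing flight", now=2.0) == "pointing flight"
```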
  • For example, the VR glasses control the cursor to move on the screen according to the rotation direction of the user's head. When the cursor moves onto the "pointing flight" icon, the user confirms selection of the "pointing flight" icon by clicking the confirmation key of the five-dimensional key or by any other target-confirmation method disclosed herein, and the VR glasses control the drone to enter the pointing-flight mode according to the user's selection instruction. At this time, the user interface presents a "pointing flight" control interface; as shown in FIG. 1A, the screen of the VR glasses displays an image (such as an image captured by the drone), and the user again controls the cursor 11 to move on the screen through head rotation, eye movement, the up, down, left, and right keys of the five-dimensional key, the joystick, or sliding/touch operations on the touch screen or touchpad. When the cursor moves to the target point expected by the user, the user confirms selection of the target point; the manner of confirming the target point is the same as the manner of confirming an icon, and details are not described herein again. The VR glasses send the location information of the target point selected by the user to the drone, so that the drone flies to the location specified by the target point, realizing a smart flight mode in which the drone flies wherever the user points.
  • For another example, the VR glasses control the cursor to move on the screen according to the rotation of the user's eyes. When the cursor moves onto the "smart follow" icon, the user confirms selection of the "smart follow" icon by clicking the touchpad or by any other target-confirmation method disclosed herein, and the VR glasses control the drone to enter the smart-follow mode according to the user's selection instruction. At this time, the user interface presents a "smart follow" control interface; as shown in FIG. 1B, a candidate following target 12 is displayed on the screen of the VR glasses, and the user can select the following target by box selection. The box-selection process may be implemented as follows: the user rotates the head to control the cursor to move to the starting-point position of the desired rectangular box, such as cursor position 13 shown in FIG. 1B, and presses the confirmation key of the five-dimensional key or clicks the touchpad to select the starting point; the end point is then selected in the same manner, such as cursor position 14 shown in FIG. 1B. The dashed box 15 indicates the rectangular box during the box-selection process.
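Functionally, the box selection amounts to recording two confirmed cursor positions and normalizing them into a rectangle. A minimal sketch, with hypothetical names and coordinates chosen purely for illustration:

```python
def make_selection_box(start, end):
    """Build a rectangle from two confirmed cursor positions.

    start and end are (x, y) screen coordinates, e.g. cursor positions 13
    and 14 in FIG. 1B; the order of the two confirmations does not matter.
    Returns (left, top, width, height).
    """
    x0, y0 = start
    x1, y1 = end
    return min(x0, x1), min(y0, y1), abs(x1 - x0), abs(y1 - y0)


# Example with made-up coordinates: start at (300, 200), end at (520, 360).
print(make_selection_box((300, 200), (520, 360)))  # -> (300, 200, 220, 160)
```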
  • In addition, the implementation of the drone's camera-focus mode is the same as that of the pointing-flight mode and is not described herein again.
  • In this embodiment, the drone control terminal acquires the user's input information. The input information may be the user's posture information, or direction information for moving the cursor input by the user through an input device; that is, the drone control terminal can control the cursor to move on the screen according to the user's posture information, or according to the direction information for moving the cursor input by the user through the input device. When the cursor moves to a target point on the screen, the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing control methods such as pointing flight, smart following, and camera focusing.
  • FIG. 2 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 2, on the basis of the embodiment shown in FIG. 1, the method in this embodiment may include:
  • Step S201: Display a movable marker.
  • Step S201 is the same as step S101, and details are not described herein again.
  • Step S202: Acquire a yaw angle and a pitch angle of the user's head sensed by the inertial measurement unit.
  • The VR glasses are equipped with an inertial measurement unit (IMU), which can sense the position and posture of the VR glasses. When the user wears the VR glasses, the glasses can acquire the user's posture information through the IMU. Specifically, the IMU can be used to sense the posture information of the user's head, and the posture information includes at least one of the following: a yaw angle and a pitch angle.
  • Step S203: Determine a position of the movable marker on the screen according to the yaw angle and the pitch angle.
  • After the user wears the VR glasses, taking the case where the screen of the VR glasses is perpendicular to the ground as an example, when the user's head rotates left and right, the IMU can sense the yaw angle of the user's head, and when the user's head pitches up and down, the IMU can sense the pitch angle of the user's head. The VR glasses control the position of the movable marker (such as the cursor) on the screen according to the yaw angle and pitch angle of the user's head sensed by the IMU.
  • Controlling the position of the movable marker on the screen according to the yaw angle and pitch angle of the user's head sensed by the IMU may be specifically implemented as: determining the coordinate of the movable marker in the horizontal direction on the screen according to the yaw angle; and determining the coordinate of the movable marker in the vertical direction on the screen according to the pitch angle.
  • Optionally, the VR glasses may control the cursor to move in the horizontal direction of the screen according to the left-right rotation of the user's head, and control the cursor to move in the vertical direction of the screen according to the pitching motion of the user's head.
  • As shown in FIG. 2A, dashed box 1 indicates a top view of the VR glasses, dashed box 2 indicates a top view of the user's head, dashed box 3 indicates the rotation direction of the user's head, dashed box 4 indicates the screen of the VR glasses, and dashed box 5 indicates the movable marker on the screen.
  • As shown in FIG. 2A, when the user's head rotates to the left, the VR glasses control the cursor to move to the left in the horizontal direction of the screen; the result of the cursor moving to the left in the horizontal direction of the screen is shown in FIG. 2B.
  • In addition, as shown in FIG. 2C, when the user's head rotates to the right, the VR glasses control the cursor to move to the right in the horizontal direction of the screen; the result of the cursor moving to the right in the horizontal direction of the screen is shown in FIG. 2D.
  • As shown in FIG. 2E, dashed box 6 indicates a side view of the VR glasses, dashed box 7 indicates a side view of the user's head, dashed box 8 indicates the movement direction of the user's head, dashed box 9 indicates the screen of the VR glasses, and dashed box 10 indicates the movable marker on the screen.
  • As shown in FIG. 2E, when the user's head rotates downward, that is, the user's head is in a looking-down state, the VR glasses control the cursor to move downward in the vertical direction of the screen; the result of the cursor moving downward in the vertical direction of the screen is shown in FIG. 2F.
  • In addition, as shown in FIG. 2G, when the user's head rotates upward, that is, the user's head is in a looking-up state, the VR glasses control the cursor to move upward in the vertical direction of the screen; the result of the cursor moving upward in the vertical direction of the screen is shown in FIG. 2H.
  • Specifically, the VR glasses determine the coordinate of the cursor in the horizontal direction on the screen according to the yaw angle, and determine the coordinate of the cursor in the vertical direction on the screen according to the pitch angle. Determining the horizontal coordinate of the cursor on the screen according to the yaw angle can be realized in two ways, and determining the vertical coordinate of the cursor on the screen according to the pitch angle can also be realized in two ways. The two implementations are described in detail below:
  • Mode one: the coordinate position of the movable marker in the horizontal direction on the screen is determined according to a single yaw angle of the user's head sensed by the inertial measurement unit, and the coordinate position of the movable marker in the vertical direction on the screen is determined according to a single pitch angle of the user's head sensed by the inertial measurement unit.
  • As shown in FIG. 2I, the lower-left corner of the screen of the VR glasses is taken as the coordinate origin; the horizontal arrow through the origin indicates the X-axis direction, and the vertical arrow through the origin indicates the Y-axis direction; 11 indicates the cursor, x indicates the coordinate of the cursor 11 on the X axis of the screen, and y indicates the coordinate of the cursor 11 on the Y axis of the screen. The coordinates of the cursor on the screen are determined by the following formulas (1) and (2):
  • x = screenW/2 + yaw * factorX        (1)
  • where x is the coordinate of the cursor on the X axis of the screen, screenW is the number of pixels the screen contains in the X-axis direction (the width of the screen), yaw is the yaw angle of the user's head obtained from one IMU sample, and the constant factorX is the conversion coefficient for the X-axis direction of the screen.
  • y = screenH/2 + pitch * factorY        (2)
  • where y is the coordinate of the cursor on the Y axis of the screen, screenH is the number of pixels the screen contains in the Y-axis direction (the height of the screen), pitch is the pitch angle of the user's head obtained from one IMU sample, and the constant factorY is the conversion coefficient for the Y-axis direction of the screen.
  • Mode two: the coordinate offset of the movable marker in the horizontal direction on the screen is determined according to the angular difference between two yaw angles of the user's head sensed by the inertial measurement unit, and the coordinate offset of the movable marker in the vertical direction on the screen is determined according to the angular difference between two pitch angles. Specifically, the cursor's coordinate offsets on the screen are determined by the following formulas (3) and (4):
  • deltaX = deltaYaw * factorX        (3)
  • where deltaX is the coordinate offset of the cursor in the horizontal direction of the screen, deltaYaw is the angular difference between the yaw angle currently acquired by the IMU and the yaw angle acquired last time, and the constant factorX is the conversion coefficient for the X-axis direction of the screen.
  • deltaY = deltaPitch * factorY        (4)
  • where deltaY is the coordinate offset of the cursor in the vertical direction of the screen, deltaPitch is the angular difference between the pitch angle currently acquired by the IMU and the pitch angle acquired last time, and the constant factorY is the conversion coefficient for the Y-axis direction of the screen.
  • In addition, in other embodiments, deltaYaw may also be the angular difference between yaw angles acquired by the IMU at any two times, and deltaPitch may also be the angular difference between pitch angles acquired by the IMU at any two times.
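For concreteness, here is a short sketch implementing both mappings exactly as formulas (1)-(4) define them, with yaw/pitch in degrees and the variable names used above; the clamping to the screen bounds is an added assumption, since the patent does not specify out-of-range handling.

```python
def absolute_cursor_position(yaw, pitch, screenW, screenH, factorX, factorY):
    """Mode one: formulas (1) and (2) map one IMU sample to an absolute position."""
    x = screenW / 2 + yaw * factorX      # formula (1)
    y = screenH / 2 + pitch * factorY    # formula (2)
    # Clamping is an added assumption; the patent does not specify
    # how out-of-range angles are handled.
    return min(max(x, 0), screenW - 1), min(max(y, 0), screenH - 1)


def cursor_offset(deltaYaw, deltaPitch, factorX, factorY):
    """Mode two: formulas (3) and (4) map angle differences to an offset."""
    deltaX = deltaYaw * factorX          # formula (3)
    deltaY = deltaPitch * factorY        # formula (4)
    return deltaX, deltaY


# Example offsets: a 2.5-degree yaw change and a -1.0-degree pitch change.
print(cursor_offset(2.5, -1.0, -20, -20))  # -> (-50.0, 20.0)
```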
  • Step S204: Select a target object displayed on the screen according to the position of the movable marker on the screen.
  • Based on the foregoing steps, the VR glasses can determine the coordinate position of the cursor on the screen. When the user's head rotates left and right or pitches up and down, the cursor moves on the screen, and the target object displayed on the screen can be selected by moving the cursor on the screen.
  • In this embodiment, the IMU senses the yaw angle and pitch angle of the user's head; the VR glasses determine the horizontal coordinate of the cursor on the screen according to the yaw angle and the vertical coordinate according to the pitch angle, and the motion information of the user's head is accurately converted into the position information of the cursor on the screen. This improves the accuracy of cursor position detection and also improves the accuracy of the user's somatosensory control of the drone.
  • FIG. 3 is a flowchart of a drone control method according to another embodiment of the present invention. As shown in FIG. 3, on the basis of the embodiment shown in FIG. 2, the method of selecting the target object displayed on the screen according to the position of the movable marker on the screen can be realized through the following steps:
  • Step S301: Adjust the position of the movable marker on the screen so that the movable marker points to the target object.
  • As can be seen from the foregoing embodiments, when the user's head rotates left and right, the cursor can be controlled to move in the horizontal direction of the screen, and when the user's head pitches up and down, the cursor can be controlled to move in the vertical direction of the screen; the VR glasses adjust the position of the cursor on the screen according to the left-right rotation and up-down pitching of the user's head.
  • Specifically, a user interface is displayed on the screen of the VR glasses, and the user interface includes icons for controlling the drone, such as a "pointing flight" icon, a "smart follow" icon, and a "camera focus" icon; the VR glasses can control the drone to enter the mode corresponding to the icon selected by the user. For example, after the drone enters the pointing-flight mode, a map layer or image is displayed on the screen of the VR glasses, and the user adjusts the position of the cursor on the map layer or image through head movement so that the cursor points to a target point on the map layer or a target object in the image.
  • Step S302: Receive a selection instruction input by the user, where the selection instruction is used to select the target object.
  • When the cursor points to the target point on the map layer or the target object in the image, the VR glasses receive a selection instruction input by the user, indicating that the user confirms selection of the target point on the map layer or the target object in the image.
  • The VR glasses receiving the selection instruction input by the user for selecting the target object can be implemented in the following ways:
  • A touchpad may be provided on the frame of the VR glasses, and the user can confirm selection of the target point or target object by clicking the touchpad.
  • The VR glasses may also be equipped with a sensor (such as an eye tracker) that senses the user's binocular motion and eye changes; the user can indicate confirmation of the target point or target object on the screen through eye information such as keeping the eyes still for a predetermined time or blinking continuously.
  • The VR glasses can also be externally connected to a touch screen in a wired or wireless manner, and the user can confirm selection of the target point or target object on the screen by operating the touch screen, such as pressing the touch screen for a predetermined time. After sensing the user's touch operation, the touch screen converts the touch operation into an electrical signal and sends the electrical signal to the VR glasses, so that the VR glasses determine the selection instruction input by the user according to the electrical signal. The user can also confirm selection of the target point or target object on the screen by clicking the touch screen (such as a single click or a double click).
  • When the cursor stays on the target point or target object for a predetermined period of time, the VR glasses determine that the user has selected the target point or target object.
  • In addition, a virtual button may be displayed on the screen of the VR glasses or outside the screen, and the virtual button may also serve as a confirmation key.
  • In addition, the selection instruction may be a click or a box selection.
  • In this embodiment, the VR glasses adjust the position of the cursor on the screen according to the movement of the user's head, so that the cursor can be pointed at the target point or target object desired by the user, thereby achieving click selection or box selection of the target point or target object. This makes up for the defect that VR glasses lack a touch screen and therefore cannot realize touch-based drone control methods such as the pointing-flight mode, the smart-follow mode, and camera focusing.
  • FIG. 4 is a flowchart of a method for controlling a drone according to another embodiment of the present invention. As shown in FIG. 4, on the basis of any of the above embodiments, the drone control method provided in this embodiment may include:
  • Step S401: Acquire status information of the drone.
  • This embodiment is applicable to a scenario in which a drone has multiple control devices and the control devices control the drone simultaneously.
  • VR glasses, smart terminals, remote controllers with screens, smart bracelets, and the like can all serve as control devices for controlling a drone. Since each control device can control the drone independently, control confusion or mutual interference can arise when multiple control devices control one drone at the same time. For example, user A holds a smart terminal and controls the drone to enter the smart-follow mode through the smart terminal; after user A selects a target on the screen of the smart terminal, the drone enters the smart-follow mode and transmits the follow image in real time to the smart terminal, and user A views the follow image in real time on the screen of the smart terminal. Meanwhile, user B wears VR glasses.
  • Therefore, each control device provided in this embodiment needs to acquire the status information of the drone in real time or periodically. The status information includes the flight mode, camera setting information, video playback information, and the like.
  • Each control device can acquire the drone's status information either actively or passively. Active acquisition of the status information can be implemented as follows: the control device sends a status-information acquisition request to the drone, and the drone returns its status information to the control device according to the request. Passive acquisition of the status information can be implemented as follows: the drone broadcasts its status information to each control device; optionally, the drone broadcasts the status information periodically. Passive acquisition can also be implemented as follows: when one control device acquires the drone's status information, it sends the status information to the other control devices, and the other control devices passively receive the drone's status information.
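A minimal sketch of the broadcast-style passive acquisition described above; the JSON message format, field names, and UI hook are illustrative assumptions, not part of the patent.

```python
import json

# Hypothetical local copy of the drone's status kept by every control device.
local_status = {"flight_mode": None, "camera_settings": {}, "playback": {}}


def switch_control_interface(mode):
    # Hypothetical UI hook: e.g. passively switch to the "smart follow" screen.
    print(f"switching control interface to: {mode}")


def on_status_broadcast(message_bytes):
    """Handle one (hypothetical) status broadcast from the drone.

    Every control device overwrites its local copy, so all devices stay
    consistent with the drone's current state even when another device
    changed the flight mode.
    """
    status = json.loads(message_bytes)
    previous_mode = local_status["flight_mode"]
    local_status.update(status)
    if status.get("flight_mode") != previous_mode:
        switch_control_interface(status["flight_mode"])


# Example: another control device has put the drone into smart-follow mode.
on_status_broadcast(json.dumps({"flight_mode": "smart_follow"}).encode())
```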
  • Step S402: Synchronize the locally stored status information of the drone according to the acquired status information of the drone.
  • After acquiring the drone's status information, each control device synchronizes its locally stored status information of the drone. For example, if the intelligent flight mode of the drone stored locally by the smart terminal is "pointing flight" while the drone's current intelligent flight mode is "smart follow", the smart terminal needs to update the locally stored intelligent flight mode of the drone to "smart follow"; at the same time, the smart terminal passively switches to the "smart follow" control interface, so that the user holding the smart terminal knows that another control device has configured the drone into the smart-follow mode.
  • In this embodiment, each of the multiple control devices acquires the drone's status information and synchronizes its locally stored status information of the drone accordingly, so that the status information stored by each control device is consistent with the drone's current status information, avoiding control confusion or mutual interference when multiple control devices control one drone at the same time.
  • An embodiment of the present invention provides head-mounted display glasses. The head-mounted display glasses provided in this embodiment may specifically be the drone control terminal in the above method embodiments.
  • FIG. 5 is a structural diagram of the head-mounted display glasses according to an embodiment of the present invention. As shown in FIG. 5, the head-mounted display glasses 50 include one or more processors 51 and a screen 52, where the screen 52 is used to display a movable marker.
  • The one or more processors 51 are used to acquire input information of the user and, according to the input information, move the movable marker to control the drone.
  • Specifically, the one or more processors may include, but are not limited to, a microcontroller, a reduced instruction set computer (RISC) processor, an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), or a field-programmable gate array (FPGA).
  • The head-mounted display glasses may include VR glasses, a VR helmet, or the like.
  • the input information is posture information of the user.
  • the input information is direction information input by the user through the input device for moving the movable marker.
  • the input device includes at least one of the following: a five-dimensional key, a joystick, a touchpad, a touch screen, or a virtual button.
  • the attitude information includes at least one of the following: a yaw angle and a pitch angle.
  • In this embodiment, the head-mounted display glasses acquire the user's input information. The input information may be the user's posture information, or direction information for moving the cursor input by the user through an input device; that is, the head-mounted display glasses can control the cursor to move on the screen according to the user's posture information, or according to the direction information for moving the cursor input by the user through the input device. When the cursor moves to a target point on the screen, the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing control methods such as pointing flight, smart following, and camera focusing.
  • FIG. 6 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention. As shown in FIG. 6, on the basis of the embodiment shown in FIG. 5, the head-mounted display glasses 50 further include an inertial measurement unit 53 connected to the one or more processors 51, where the inertial measurement unit 53 is used to sense the yaw angle and pitch angle of the user's head and to transmit the yaw angle and pitch angle of the user's head to the one or more processors 51.
  • The one or more processors 51 are used to determine the coordinate of the movable marker in the horizontal direction on the screen according to the yaw angle, and to determine the coordinate of the movable marker in the vertical direction on the screen according to the pitch angle.
  • Specifically, the one or more processors 51 may determine the coordinate position of the movable marker in the horizontal direction on the screen according to a single yaw angle of the user's head sensed by the inertial measurement unit, and determine the coordinate position of the movable marker in the vertical direction on the screen according to a single pitch angle of the user's head sensed by the inertial measurement unit.
  • Alternatively, the one or more processors 51 may determine the coordinate offset of the movable marker in the horizontal direction on the screen according to the angular difference between two yaw angles of the user's head sensed by the inertial measurement unit, and determine the coordinate offset of the movable marker in the vertical direction on the screen according to the angular difference between two pitch angles of the user's head sensed by the inertial measurement unit.
  • In this embodiment, the IMU senses the yaw angle and pitch angle of the user's head, and the one or more processors determine the horizontal coordinate of the cursor on the screen according to the yaw angle and the vertical coordinate according to the pitch angle. The motion information of the user's head is accurately converted into the position information of the cursor on the screen, which improves the accuracy of cursor position detection and also improves the accuracy of the user's somatosensory control of the drone.
  • FIG. 7 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention. As shown in FIG. 7, on the basis of the embodiment shown in FIG. 6, the one or more processors 51 are further used to adjust the position of the movable marker on the screen so that the movable marker points to the target object; the head-mounted display glasses 50 further include a receiving unit 54 connected to the one or more processors 51, where the receiving unit 54 is used to receive a selection instruction input by the user, and the selection instruction is used to select the target object.
  • Specifically, the receiving unit 54 receiving the selection instruction input by the user for selecting the target object includes: receiving a selection instruction input by the user by clicking the confirmation key of the five-dimensional key; or receiving a selection instruction input by the user by clicking the touchpad; or receiving a selection instruction input by the user through eye movement; or receiving a selection instruction input by the user by clicking the touch screen; or selecting the target object after the movable marker stays on the target object for a predetermined period of time; or receiving a selection instruction input by the user by clicking the virtual button; or receiving a selection instruction input by the user by operating the joystick.
  • Further, the head-mounted display glasses 50 include an eye sensor 55 connected to the one or more processors 51, where the eye sensor 55 is used to sense motion information of the user's eyes and to transmit the motion information to the one or more processors 51; the one or more processors 51 are used to determine the selection instruction according to the motion information.
  • the selection instruction includes a click or a frame selection.
  • In this embodiment, the VR glasses adjust the position of the cursor on the screen according to the movement of the user's head, so that the cursor can be pointed at the target point or target object desired by the user, thereby achieving click selection or box selection of the target point or target object. This makes up for the defect that VR glasses lack a touch screen and therefore cannot realize touch-based drone control methods such as the pointing-flight mode, the smart-follow mode, and camera focusing.
  • An embodiment of the present invention provides head-mounted display glasses. FIG. 8 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention. As shown in FIG. 8, on the basis of any of the above embodiments of the head-mounted display glasses, taking the embodiment shown in FIG. 7 as an example, the one or more processors 51 are further used to acquire the status information of the drone, and to synchronize the locally stored status information of the drone according to the acquired status information of the drone.
  • Further, the head-mounted display glasses 50 include a transmitting unit 56 connected to the one or more processors 51, where the transmitting unit 56 is used to send a status-information acquisition request to the drone, and the receiving unit 54 is further used to receive the status information sent by the drone.
  • Optionally, the receiving unit 54 is further used to receive the status information broadcast by the drone.
  • Optionally, the receiving unit 54 is further used to receive the drone's status information sent by another drone control terminal, where the other drone control terminal is used to control the drone and receives the status information broadcast by the drone.
  • In this embodiment, each of the multiple control devices acquires the drone's status information and synchronizes its locally stored status information of the drone accordingly, so that the status information stored by each control device is consistent with the drone's current status information, avoiding control confusion or mutual interference when multiple control devices control one drone at the same time.
  • An embodiment of the present invention provides a drone control system. The drone control system provided in this embodiment includes a drone and the head-mounted display glasses described in the above embodiments, where the head-mounted display glasses are used to control the drone.
  • In this system, the head-mounted display glasses acquire the user's input information. The input information may be the user's posture information, or direction information for moving the cursor input by the user through an input device; that is, the head-mounted display glasses can control the cursor to move on the screen according to the user's posture information, or according to the direction information for moving the cursor input by the user through the input device. When the cursor moves to a target point on the screen, the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing control methods such as pointing flight, smart following, and camera focusing.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • For example, the division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, may be located in one place, or may be distributed to multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of hardware plus software functional units.
  • the above-described integrated unit implemented in the form of a software functional unit can be stored in a computer readable storage medium.
  • The above software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention.
  • The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Automation & Control Theory (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Selective Calling Equipment (AREA)

Abstract

A drone control method, head-mounted display glasses, and a system. The method includes: displaying a movable marker (10); acquiring input information of a user; and moving the movable marker (10) according to the input information to control the drone. The head-mounted display glasses (50) acquire the user's input information; specifically, the head-mounted display glasses (50) can control a cursor (11) to move on a screen (52) according to the user's posture information, or according to direction information for moving the cursor (11) input by the user through an input device. After the cursor (11) moves to a target point on the screen (52), the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing drone control methods such as pointing flight, smart following, and camera focusing.

Description

Drone control method, head-mounted display glasses and system
TECHNICAL FIELD
Embodiments of the present invention relate to the field of drones, and in particular to a drone control method, head-mounted display glasses, and a system.
BACKGROUND
In the prior art, control devices such as a mobile phone, a remote controller with a touch screen, and virtual reality (VR) glasses can all be used to control a drone. An application for controlling the drone is installed on the mobile phone, and the user controls the drone by operating the application and touching the phone's screen; in addition, the user can also control the drone by touching the screen of the remote controller.
However, VR glasses lack a touch screen, so touch-based drone control methods such as the pointing-flight mode, the smart-follow mode, and camera focusing cannot be realized.
SUMMARY
Embodiments of the present invention provide a drone control method, head-mounted display glasses, and a system, to achieve somatosensory control of a drone.
One aspect of embodiments of the present invention provides a drone control method, including:
displaying a movable marker;
acquiring input information of a user; and
moving the movable marker according to the input information to control the drone.
Another aspect of embodiments of the present invention provides head-mounted display glasses, including one or more processors and a screen, where the screen is used to:
display a movable marker;
and the one or more processors are used to:
acquire input information of a user; and
move the movable marker according to the input information to control a drone.
Another aspect of embodiments of the present invention provides a drone control system, including: a drone, and the head-mounted display glasses, where the head-mounted display glasses are used to control the drone.
According to the drone control method, head-mounted display glasses, and system provided by the embodiments of the present invention, the user's input information is acquired through the head-mounted display glasses. The input information may be the user's posture information, or direction information for moving the cursor input by the user through an input device; that is, the head-mounted display glasses can control the cursor to move on the screen according to the user's posture information, or according to the direction information for moving the cursor input by the user through the input device. When the cursor moves to a target point on the screen, the target point is selected according to a selection instruction input by the user, and the drone is controlled to enter the corresponding flight mode or control mode, thereby realizing control methods such as pointing flight, smart following, and camera focusing.
BRIEF DESCRIPTION OF THE DRAWINGS
To describe the technical solutions in the embodiments of the present invention or in the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Apparently, the accompanying drawings in the following description show some embodiments of the present invention, and a person of ordinary skill in the art may still derive other drawings from these accompanying drawings without creative efforts.
FIG. 1 is a flowchart of a drone control method according to an embodiment of the present invention;
FIG. 1A is a schematic diagram of a "pointing flight" control interface of a drone according to an embodiment of the present invention;
FIG. 1B is a schematic diagram of a "smart follow" control interface of a drone according to an embodiment of the present invention;
FIG. 2 is a flowchart of a drone control method according to another embodiment of the present invention;
FIG. 2A is a top view of VR glasses and a user's head according to another embodiment of the present invention;
FIG. 2B is a schematic diagram of a cursor moving to the left on a screen according to another embodiment of the present invention;
FIG. 2C is a top view of VR glasses and a user's head according to another embodiment of the present invention;
FIG. 2D is a schematic diagram of a cursor moving to the right on a screen according to another embodiment of the present invention;
FIG. 2E is a side view of VR glasses and a user's head according to another embodiment of the present invention;
FIG. 2F is a schematic diagram of a cursor moving downward on a screen according to another embodiment of the present invention;
FIG. 2G is a side view of VR glasses and a user's head according to another embodiment of the present invention;
FIG. 2H is a schematic diagram of a cursor moving upward on a screen according to another embodiment of the present invention;
FIG. 2I is a schematic diagram of the coordinates of a cursor on a screen according to another embodiment of the present invention;
FIG. 3 is a flowchart of a drone control method according to another embodiment of the present invention;
FIG. 4 is a flowchart of a drone control method according to another embodiment of the present invention;
FIG. 5 is a structural diagram of head-mounted display glasses according to an embodiment of the present invention;
FIG. 6 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention;
FIG. 7 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention;
FIG. 8 is a structural diagram of head-mounted display glasses according to another embodiment of the present invention.
Reference numerals:
1 - top view of the VR glasses;   2 - top view of the user's head;   3 - rotation direction of the user's head;
4 - screen of the VR glasses;     5 - movable marker;                6 - side view of the VR glasses;
7 - side view of the user's head; 8 - movement direction of the user's head;
9 - screen of the VR glasses;     10 - movable marker;               11 - cursor;
12 - following target;            13 - cursor position;              14 - cursor position;
15 - rectangular box;             50 - head-mounted display glasses; 51 - processor;
52 - screen;                      53 - inertial measurement unit;    54 - receiving unit;
55 - eye sensor;                  56 - transmitting unit
DETAILED DESCRIPTION
The technical solutions in the embodiments of the present invention are described clearly below with reference to the accompanying drawings in the embodiments of the present invention. Apparently, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative efforts shall fall within the protection scope of the present invention.
It should be noted that when a component is referred to as being "fixed to" another component, it can be directly on the other component, or an intervening component may also be present. When a component is considered to be "connected to" another component, it can be directly connected to the other component, or an intervening component may be present at the same time.
Unless otherwise defined, all technical and scientific terms used herein have the same meanings as commonly understood by those skilled in the technical field of the present invention. The terms used in the specification of the present invention are merely for the purpose of describing specific embodiments and are not intended to limit the present invention. The term "and/or" used herein includes any and all combinations of one or more of the associated listed items.
Some embodiments of the present invention are described in detail below with reference to the accompanying drawings. In the case of no conflict, the following embodiments and the features in the embodiments may be combined with each other.
An embodiment of the present invention provides a drone control method. FIG. 1 is a flowchart of the drone control method according to an embodiment of the present invention. The execution body of this embodiment may be a drone control terminal for controlling a drone, and the drone control terminal may include, but is not limited to, head-mounted display glasses (VR glasses, a VR helmet, etc.), a mobile phone, a remote controller (such as a remote controller with a display screen), a smart bracelet, a tablet computer, and the like. The drone can work in different modes, including but not limited to pointing flight, smart follow, and camera focus.
In the pointing-flight mode, the user can select a flight target by clicking a point or an area on a display device (such as a screen) of the drone control terminal, and the drone can fly toward the flight target.
In the smart-follow mode, the user can control the drone to follow a movable object (such as a person or an animal) in flight by selecting the movable object on the display device (such as a screen) of the drone control terminal.
In the camera-focus mode, the user can control the imaging device (such as a camera) of the drone to focus by clicking a point or an area on the display device (such as a screen) of the drone control terminal.
As shown in FIG. 1, the method in this embodiment may include:
Step S101: Display a movable marker.
In this embodiment, the drone control terminal may be a virtual reality head-mounted display device. The method in this embodiment is applicable to a user controlling a drone through a virtual reality head-mounted display device, which may specifically be virtual reality (VR) glasses or a VR helmet. The user wears the VR glasses, and the screen of the VR glasses displays a movable marker; the movable marker is specifically a cursor, and the cursor can move on the screen of the VR glasses.
Step S102: Acquire input information of the user.
In this embodiment, the VR glasses can acquire the user's input information. The input information may be the user's posture information, or direction information input by the user through an input device for moving the movable marker.
The VR glasses are equipped with an inertial measurement unit (IMU), which can sense the position and posture of the VR glasses. When the user wears the VR glasses, the glasses can acquire the user's posture information through the IMU. Specifically, the IMU can be used to sense the posture information of the user's head, and the posture information includes at least one of the following: a yaw angle and a pitch angle. The VR glasses control the position of the cursor on the screen according to the posture information of the user's head sensed by the IMU. From the user's point of view, the user can control the cursor to move on the screen by turning the head.
In addition, the VR glasses may also be equipped with a sensor that senses the user's binocular motion and eye (such as eyeball) changes, for example an eye tracker. The VR glasses can also acquire the user's posture information through this sensor: the glasses sense the user's binocular motion and eye changes through the sensor and control the cursor to move on the screen accordingly. From the user's point of view, the user can control the cursor to move on the screen through binocular motion and eye changes.
In addition, the VR glasses may also be connected to an input device, where the input device includes at least one of the following: a five-dimensional key, a joystick, and a touch screen. The connection between the VR glasses and the input device may be wired (such as a cable) or wireless (such as Wi-Fi or Bluetooth). In some embodiments, the VR glasses may include a remote controller provided with a five-dimensional key, and the VR glasses control the cursor to move on the screen according to direction input generated when the user clicks the up, down, left, and right keys of the five-dimensional key. The VR glasses can also control the cursor to move on the screen according to direction input generated when the user moves the joystick, or according to input generated by the user's sliding or touch operations on the touch screen.
In addition, a touchpad is provided on the frame of the VR glasses, and the user can also control the cursor to move on the screen by operating the touchpad.
In addition, virtual buttons may also be displayed on the screen of the VR glasses. The virtual buttons may be similar to a five-dimensional key, including virtual buttons for the four directions of up, down, left, and right, and a virtual button for confirmation; the user can also control the cursor to move on the screen by operating the virtual buttons.
步骤S103、根据所述输入信息,移动所述可移动标记以控制所述无人机。
从上述步骤可知,VR眼镜可根据用户的姿态信息,控制光标在屏幕上移动,也可以根据用户通过输入设备输入的用于移动所述可移动标记的 方向信息,控制光标在屏幕上移动。
VR眼镜可以将光标在屏幕上的移动信息,转换为用户通过VR眼镜控制无人机的指令,比如,光标在屏幕上向右移动,控制无人机向右飞行。在一些实施例中,VR眼镜可以将光标在屏幕上的移动信息,转换为用户通过VR眼镜控制无人机的机载设备的指令,比如,光标在屏幕上向右移动,控制所述无人机的成像装置(如相机)向右移动。
In addition, a user interface is displayed on the screen of the VR glasses, and the VR glasses may control the UAV to enter the intelligent-following mode, the pointing-flight mode, or the camera-focusing mode according to the user's operations on the user interface. A specific implementation may be as follows: the user interface includes icons for controlling the UAV, such as a "pointing flight" icon, an "intelligent following" icon, and a "camera focusing" icon. The VR glasses control the cursor to move on the screen according to the user's head rotation, eye movement, pressing of the up/down/left/right keys of the five-way key, movement of the joystick, sliding on the touch screen, or sliding on the touch pad. When the cursor moves to the corresponding icon, the user confirms the selection of that icon, and the VR glasses control the UAV to enter the mode corresponding to the icon according to the confirmation information input by the user. The user may confirm the selection of an icon in any of the following ways: pressing the confirmation key of the five-way key, tapping the touch screen, tapping the touch pad, keeping the eyes still for a predetermined period of time, blinking continuously, letting the cursor dwell on the icon for a predetermined period of time, tapping a virtual key, or operating the joystick.
For example, the VR glasses control the cursor to move on the screen according to the rotation direction of the user's head. When the cursor moves to the "pointing flight" icon, the user confirms the selection of the "pointing flight" icon by pressing the confirmation key of the five-way key or by using another target-confirmation method disclosed in the present invention, and the VR glasses control the UAV to enter the pointing-flight mode according to the user's selection instruction. At this point, the user interface presents the "pointing flight" control interface. As shown in FIG. 1A, an image (such as an image captured by the UAV) is displayed on the screen of the VR glasses. The user again controls cursor 11 to move on the screen through head rotation, eye movement, pressing the up/down/left/right keys of the five-way key, moving the joystick, sliding on the touch screen, or sliding on the touch pad. When the cursor moves to the target point expected by the user, the user confirms the selection of that target point; the way of confirming a target point is the same as the way of confirming an icon, and is not repeated here. The VR glasses send the position information of the target point selected by the user to the UAV, so that the UAV flies toward the position specified by the target point, implementing an intelligent flight mode in which the UAV flies to wherever the user points.
For another example, the VR glasses control the cursor to move on the screen according to the rotation of the user's eyes. When the cursor moves to the "intelligent following" icon, the user confirms the selection of the "intelligent following" icon by tapping the touch pad or by using another target-confirmation method disclosed in the present invention, and the VR glasses control the UAV to enter the intelligent-following mode according to the user's selection instruction. At this point, the user interface presents the "intelligent following" control interface. As shown in FIG. 1B, a following target 12 to be selected is displayed on the screen of the VR glasses, and the user may select the following target by box selection. The box-selection process may be implemented as follows: the user turns his or her head to move the cursor to the starting point of the desired rectangular box, such as cursor position 13 shown in FIG. 1B, and presses the confirmation key of the five-way key, taps the touch pad, or uses another target-confirmation method disclosed in the present invention to select the starting point; the end point, such as cursor position 14 shown in FIG. 1B, is then selected in the same way. The dashed box 15 represents the rectangular box during the box-selection process.
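The box-selection step amounts to building a rectangle from the two confirmed cursor positions (13 and 14 in FIG. 1B). A minimal sketch follows; the function name and the (x, y, width, height) return format are illustrative assumptions.

```python
def rect_from_corners(start, end):
    """Return (x, y, width, height) of the box spanned by two cursor points."""
    (x1, y1), (x2, y2) = start, end
    return (min(x1, x2), min(y1, y2), abs(x2 - x1), abs(y2 - y1))

# Example: starting point confirmed at (700, 600), end point at (1100, 300).
print(rect_from_corners((700, 600), (1100, 300)))  # -> (700, 300, 400, 300)
```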
In addition, the camera-focusing mode of the UAV is implemented in the same way as the pointing-flight mode, and is not repeated here.
In this embodiment, the UAV control terminal acquires the user's input information, which may be the user's attitude information or direction information input by the user through an input device for moving the cursor. That is, the UAV control terminal may control the cursor to move on the screen according to the user's attitude information, or according to the direction information input through the input device. After the cursor moves to a target point on the screen, the target point is selected according to the selection instruction input by the user, and the UAV is controlled to enter the corresponding flight mode or control mode, thereby implementing control modes for the UAV such as pointing flight, intelligent following, and camera focusing.
An embodiment of the present invention provides a UAV control method. FIG. 2 is a flowchart of the UAV control method provided by another embodiment of the present invention. As shown in FIG. 2, on the basis of the embodiment shown in FIG. 1, the method in this embodiment may include:
Step S201: displaying a movable marker.
Step S201 is the same as step S101 and is not repeated here.
Step S202: acquiring the yaw angle and the pitch angle of the user's head sensed by an inertial measurement unit.
The VR glasses are equipped with an inertial measurement unit (IMU), which can sense the position and attitude of the VR glasses. When the user wears the VR glasses, the VR glasses can acquire the user's attitude information through the IMU. Specifically, the IMU may be configured to sense attitude information of the user's head, the attitude information including at least one of the following: a yaw angle and a pitch angle.
Step S203: determining the position of the movable marker on the screen according to the yaw angle and the pitch angle.
After the user puts on the VR glasses, taking the case where the screen of the VR glasses is perpendicular to the ground as an example, the IMU senses the yaw angle of the user's head when the head turns left or right, and senses the pitch angle of the user's head when the head pitches up or down. The VR glasses control the position of the movable marker (such as the cursor) on the screen according to the yaw angle and the pitch angle of the user's head sensed by the IMU.
This may specifically be implemented as follows: the horizontal coordinate of the movable marker on the screen is determined according to the yaw angle, and the vertical coordinate of the movable marker on the screen is determined according to the pitch angle. Optionally, the VR glasses may control the cursor to move in the horizontal direction of the screen according to the left-right rotation of the user's head, and control the cursor to move in the vertical direction of the screen according to the pitching motion of the user's head.
As shown in FIG. 2A, dashed box 1 represents a top view of the VR glasses, dashed box 2 represents a top view of the user's head, dashed box 3 represents the rotation direction of the user's head, dashed box 4 represents the screen of the VR glasses, and dashed box 5 represents the movable marker on the screen. As shown in FIG. 2A, when the user's head turns left, the VR glasses control the cursor to move leftward in the horizontal direction of the screen; the result is shown in FIG. 2B.
Likewise, as shown in FIG. 2C, when the user's head turns right, the VR glasses control the cursor to move rightward in the horizontal direction of the screen; the result is shown in FIG. 2D.
As shown in FIG. 2E, dashed box 6 represents a side view of the VR glasses, dashed box 7 represents a side view of the user's head, dashed box 8 represents the movement direction of the user's head, dashed box 9 represents the screen of the VR glasses, and dashed box 10 represents the movable marker on the screen. As shown in FIG. 2E, when the user's head turns downward, i.e., the user looks down, the VR glasses control the cursor to move downward in the vertical direction of the screen; the result is shown in FIG. 2F.
Likewise, as shown in FIG. 2G, when the user's head turns upward, i.e., the user looks up, the VR glasses control the cursor to move upward in the vertical direction of the screen; the result is shown in FIG. 2H.
Specifically, the VR glasses determine the horizontal coordinate of the cursor on the screen according to the yaw angle, and the vertical coordinate of the cursor on the screen according to the pitch angle. Each determination can be implemented in two ways, which are detailed below:
Method 1
The coordinate position of the movable marker in the horizontal direction of the screen is determined according to a single yaw angle of the user's head sensed by the inertial measurement unit, and the coordinate position of the movable marker in the vertical direction of the screen is determined according to a single pitch angle of the user's head sensed by the inertial measurement unit.
As shown in FIG. 2I, with the lower-left corner of the screen of the VR glasses as the coordinate origin, the horizontal arrow through the origin represents the X-axis direction and the vertical arrow through the origin represents the Y-axis direction; 11 denotes the cursor, x denotes the X-axis coordinate of cursor 11 on the screen, and y denotes the Y-axis coordinate of cursor 11 on the screen. The coordinate position of the cursor on the screen is determined by the following formulas (1) and (2):
x = screenW/2 + yaw * factorX        (1)
where x denotes the X-axis coordinate of the cursor on the screen, screenW denotes the number of pixels the screen contains in the X-axis direction (the screen width), yaw denotes the yaw angle of the user's head obtained from one IMU sample, and the constant factorX denotes the conversion factor in the X-axis direction of the screen.
y = screenH/2 + pitch * factorY       (2)
where y denotes the Y-axis coordinate of the cursor on the screen, screenH denotes the number of pixels the screen contains in the Y-axis direction (the screen height), pitch denotes the pitch angle of the user's head obtained from one IMU sample, and the constant factorY denotes the conversion factor in the Y-axis direction of the screen.
According to formulas (1) and (2), assuming that the {yaw, pitch} angle values currently sampled by the IMU are {-33, 21} degrees, factorX and factorY are both -20, and the screen resolution is 1920x1080, the coordinates of the cursor on the screen are {1620, 120}.
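A short sketch of Method 1 follows. Formulas (1) and (2) are taken from the description above, while the function and constant names are illustrative assumptions; running it with the worked example reproduces {1620, 120}.

```python
SCREEN_W, SCREEN_H = 1920, 1080
FACTOR_X, FACTOR_Y = -20, -20     # conversion factors from the worked example

def cursor_from_sample(yaw_deg, pitch_deg):
    """Absolute mapping of one IMU sample to cursor coordinates."""
    x = SCREEN_W / 2 + yaw_deg * FACTOR_X     # formula (1)
    y = SCREEN_H / 2 + pitch_deg * FACTOR_Y   # formula (2)
    return x, y

print(cursor_from_sample(-33, 21))  # -> (1620.0, 120.0)
```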
Method 2
The coordinate offset of the movable marker in the horizontal direction of the screen is determined according to the angle difference between two yaw angles of the user's head sensed by the inertial measurement unit, and the coordinate offset of the movable marker in the vertical direction of the screen is determined according to the angle difference between two pitch angles of the user's head sensed by the inertial measurement unit. Specifically, the coordinate position of the cursor on the screen is determined by the following formulas (3) and (4):
deltaX = deltaYaw * factorX       (3)
where deltaX denotes the coordinate offset of the cursor in the horizontal direction of the screen, deltaYaw denotes the angle difference between the yaw angle currently sampled by the IMU and the yaw angle sampled the previous time, and the constant factorX denotes the conversion factor in the X-axis direction of the screen.
deltaY = deltaPitch * factorY        (4)
where deltaY denotes the coordinate offset of the cursor in the vertical direction of the screen, deltaPitch denotes the angle difference between the pitch angle currently sampled by the IMU and the pitch angle sampled the previous time, and the constant factorY denotes the conversion factor in the Y-axis direction of the screen.
In addition, in other embodiments, deltaYaw may also be the angle difference between any two yaw angles sampled by the IMU, and deltaPitch may also be the angle difference between any two pitch angles sampled by the IMU.
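A sketch of Method 2 follows: the cursor is offset by the angle change between two IMU samples, per formulas (3) and (4). The class wrapper, the retained last sample, and the clamp to a 1920x1080 screen are illustrative assumptions.

```python
class RelativeCursor:
    """Relative mapping: offset the cursor by the change between IMU samples."""

    def __init__(self, x, y, factor_x=-20, factor_y=-20):
        self.x, self.y = x, y
        self.factor_x, self.factor_y = factor_x, factor_y
        self.last_yaw = self.last_pitch = None

    def update(self, yaw_deg, pitch_deg):
        if self.last_yaw is not None:
            self.x += (yaw_deg - self.last_yaw) * self.factor_x       # formula (3)
            self.y += (pitch_deg - self.last_pitch) * self.factor_y   # formula (4)
            self.x = min(max(self.x, 0), 1919)   # clamp to the screen (assumed)
            self.y = min(max(self.y, 0), 1079)
        self.last_yaw, self.last_pitch = yaw_deg, pitch_deg
        return self.x, self.y
```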
Step S204: selecting a target object displayed on the screen according to the position of the movable marker on the screen.
According to the foregoing steps, the VR glasses can determine the coordinate position of the cursor on the screen. When the user's head turns left or right or pitches up or down, the cursor moves on the screen, and the target object displayed on the screen can be selected through the movement of the cursor.
In this embodiment, the IMU senses the yaw angle and the pitch angle of the user's head, and the VR glasses determine the horizontal coordinate of the cursor on the screen according to the yaw angle and the vertical coordinate according to the pitch angle. The motion information of the user's head is thus accurately converted into position information of the cursor on the screen, which improves the accuracy of cursor position detection and, at the same time, the accuracy of the user's motion-based control of the UAV.
An embodiment of the present invention provides a UAV control method. FIG. 3 is a flowchart of the UAV control method provided by another embodiment of the present invention. As shown in FIG. 3, on the basis of the embodiment shown in FIG. 2, the method of selecting the target object displayed on the screen according to the position of the movable marker on the screen may be implemented through the following steps:
Step S301: adjusting the position of the movable marker on the screen so that the movable marker points to the target object.
On the basis of the foregoing embodiments, when the user's head turns left or right, the cursor can be controlled to move in the horizontal direction of the screen; when the user's head pitches up or down, the cursor can be controlled to move in the vertical direction of the screen. The VR glasses adjust the position of the cursor on the screen according to the left-right rotation and up-down pitching of the user's head.
A user interface is displayed on the screen of the VR glasses. The user interface includes icons for controlling the UAV, such as a "pointing flight" icon, an "intelligent following" icon, and a "camera focusing" icon, and the VR glasses may control the UAV to enter the mode corresponding to the icon selected by the user. For example, after the pointing-flight mode is entered, a map layer or an image is displayed on the screen of the VR glasses, and the user adjusts the position of the cursor on the map layer or the image through head movement, so that the cursor points to a target point on the map layer or a target object in the image.
Step S302: receiving a selection instruction input by the user, the selection instruction being used to select the target object.
After the cursor points to the target point on the map layer or the target object in the image, the VR glasses receive the selection instruction input by the user, which indicates that the user confirms the selection of the target point on the map layer or the target object in the image.
Specifically, the VR glasses may receive the selection instruction input by the user for selecting the target object in the following ways:
1) Receiving a selection instruction input by the user by pressing the confirmation key of the five-way key. When the cursor points to the target point or target object, the user presses the confirmation key of the five-way key, and the VR glasses receive the selection instruction input by the user according to the pressing operation; the selection instruction indicates that the user confirms the selection of the target point or target object on the screen.
2) Receiving a selection instruction input by the user by tapping the touch pad. The touch pad may be provided on the frame of the VR glasses. When the cursor points to the target point or target object, the user taps (e.g., single-taps or double-taps) the touch pad, and the VR glasses receive the selection instruction input by the user according to the tapping operation.
3) Receiving a selection instruction input by the user through eye movement. In this embodiment, the VR glasses may also be equipped with a sensor (such as an eye tracker) that senses the movement of the user's eyes and changes of the eyes. When the cursor points to the target point or target object, the user may confirm the selection of the target point or target object on the screen through eye information such as keeping the eyes still for a predetermined period of time or blinking continuously.
4) Receiving a selection instruction input by the user by tapping the touch screen. In addition, the VR glasses may be externally connected to a touch screen in a wired or wireless manner, and the user may confirm the selection of the target point or target object on the screen by operating the touch screen, for example, by pressing the touch screen for a predetermined period of time. After the touch screen senses the user's touch operation, it converts the touch operation into an electrical signal and sends the electrical signal to the VR glasses, so that the VR glasses determine the selection instruction input by the user according to the electrical signal. The user may also confirm the selection of the target point or target object on the screen by tapping the touch screen (e.g., single-tapping or double-tapping).
5) Selecting the target object after the movable marker dwells on the target object for a predetermined period of time. When the cursor points to the target point or target object, the user keeps his or her head still so that the cursor stays on the target point or target object. When the dwell time of the cursor on the target point or target object reaches a predetermined period (e.g., 3 seconds or 5 seconds), the VR glasses determine that the user has selected the target point or target object.
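Dwell-based confirmation (way 5) amounts to timing how long the cursor stays on one target. A minimal sketch follows, assuming a per-frame update with whatever object lies under the cursor; all names and the use of time.monotonic() are illustrative assumptions.

```python
import time

class DwellSelector:
    """Select a target once the cursor has dwelled on it for a preset time."""

    def __init__(self, dwell_seconds=3.0):   # e.g., 3 or 5 seconds in the text
        self.dwell_seconds = dwell_seconds
        self.target = None
        self.since = None

    def update(self, hovered):
        """Call every frame with the object under the cursor (or None)."""
        if hovered != self.target:            # cursor moved to a new target
            self.target, self.since = hovered, time.monotonic()
            return None
        if self.target is not None and time.monotonic() - self.since >= self.dwell_seconds:
            selected, self.target, self.since = self.target, None, None
            return selected                   # dwell time reached: selected
        return None
```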
6) Receiving a selection instruction input by the user by tapping a virtual key. In addition, a virtual key may be displayed on or beyond the screen of the VR glasses, and the virtual key may also serve as a confirmation key.
7) Receiving a selection instruction input by the user by operating the joystick. For example, the user may press the confirmation key on the joystick to confirm the selection of the target point or target object on the screen.
In addition, the selection instruction may be a point selection or a box selection.
In this embodiment, the VR glasses adjust the position of the cursor on the screen according to the movement of the user's head, so that the cursor can point to the target point or target object expected by the user, thereby implementing point selection or box selection of the target point or target object. This enables the user to perform point-selection and box-selection operations through the VR glasses, and remedies the defect that, because VR glasses lack a touch screen, control modes of the UAV that rely on a touch screen, such as the pointing-flight mode, the intelligent-following mode, and camera focusing, could not be implemented.
An embodiment of the present invention provides a UAV control method. FIG. 4 is a flowchart of the UAV control method provided by another embodiment of the present invention. As shown in FIG. 4, on the basis of any of the foregoing embodiments, the UAV control method provided by this embodiment may include:
Step S401: acquiring the state information of the UAV.
This embodiment is applicable to a scenario in which one UAV corresponds to multiple control devices that control it at the same time. VR glasses, a smart terminal, a remote controller with a screen, a smart band, and similar devices can all serve as control devices for the UAV. Since multiple control devices can control the UAV independently, controlling one UAV at the same time may lead to control confusion or mutual interference. For example, user A holds a smart terminal and uses it to control the UAV to enter the intelligent-following mode. After user A box-selects a following target on the screen of the smart terminal, the UAV enters the intelligent-following mode and sends the intelligent-following footage to the smart terminal in real time, and user A watches it in real time on the smart terminal's screen. User B wears VR glasses; if user B does not know that the UAV is already in the intelligent-following mode and also controls the UAV to enter the intelligent-following mode through the VR glasses, a control conflict arises between user A's smart terminal and user B's VR glasses. For example, the intelligent-following footage watched by user A may be terminated. To avoid such problems, each control device provided by this embodiment needs to acquire the state information of the UAV in real time or periodically. The state information of the UAV includes the flight mode, the camera's setting information, video playback information, and the like.
Each control device may acquire the state information of the UAV actively or passively. Active acquisition may be implemented as follows: the control device sends a state-information acquisition request to the UAV, and the UAV returns its state information to the control device according to the request. Passive acquisition may be implemented as follows: the UAV sends its state information to each control device by broadcast; optionally, the UAV broadcasts its state information periodically. Passive acquisition may also be implemented as follows: after one control device acquires the state information of the UAV, it sends that state information to the other control devices, which passively receive it.
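The two acquisition paths can be sketched as a request/response exchange and a broadcast fan-out. In the sketch below, the in-memory "transport" and the message shapes are illustrative assumptions, not a wire protocol from the disclosure.

```python
class UAV:
    def __init__(self):
        self.state = {"flight_mode": "intelligent_following",
                      "camera": {"iso": 100}, "playback": None}
        self.controllers = []                 # registered control devices

    def handle_state_request(self):           # active path: request/response
        return dict(self.state)

    def broadcast_state(self):                # passive path: periodic broadcast
        for c in self.controllers:
            c.on_state(dict(self.state))

class ControlDevice:
    def __init__(self, uav):
        self.uav = uav
        self.local_state = {}

    def poll(self):                            # active acquisition
        self.on_state(self.uav.handle_state_request())

    def on_state(self, state):                 # synchronize local copy (step S402)
        self.local_state = state
```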
Step S402: synchronizing the locally stored state information of the UAV according to the state information of the UAV.
After each control device acquires the state information of the UAV, it synchronizes its locally stored state information of the UAV. For example, if the intelligent flight mode of the UAV stored locally by the smart terminal is "pointing flight" while the current intelligent flight mode of the UAV is "intelligent following", the smart terminal needs to update the locally stored intelligent flight mode of the UAV to "intelligent following". At the same time, the smart terminal passively switches to the "intelligent following" control interface, so that the user holding the smart terminal knows that another control device has already configured the UAV into the intelligent-following mode.
In addition, the cases of "pointing flight" and "remote playback of the video on the UAV side" are similar to that of "intelligent following" and are not repeated here.
In this embodiment, multiple control devices each acquire the state information of the UAV and synchronize their locally stored state information of the UAV accordingly, so that the state information stored by each control device is consistent with the current state of the UAV. This avoids control confusion or mutual interference when multiple control devices control one UAV at the same time.
An embodiment of the present invention provides head-mounted display glasses. The head-mounted display glasses provided by this embodiment may specifically be the UAV control terminal in the foregoing method embodiments. FIG. 5 is a structural diagram of the head-mounted display glasses provided by an embodiment of the present invention. As shown in FIG. 5, the head-mounted display glasses 50 include one or more processors 51 and a screen 52. The screen 52 is configured to display a movable marker; the one or more processors 51 are configured to acquire the user's input information and, according to the input information, move the movable marker to control the UAV.
Specifically, the one or more processors may include, but are not limited to, a microcontroller, a reduced instruction set computer (RISC), an application-specific integrated circuit (ASIC), an application-specific instruction-set processor (ASIP), a central processing unit (CPU), a physics processing unit (PPU), a digital signal processor (DSP), a field-programmable gate array (FPGA), and the like.
In some embodiments, the head-mounted display glasses may include VR glasses, a VR helmet, or the like.
In this embodiment, the input information is the user's attitude information, or direction information input by the user through an input device for moving the movable marker.
The input device includes at least one of the following: a five-way key, a joystick, a touch pad, a touch screen, or a virtual key.
The attitude information includes at least one of the following: a yaw angle and a pitch angle.
The specific principles and implementations of the head-mounted display glasses provided by this embodiment are similar to those of the embodiment shown in FIG. 1 and are not repeated here.
In this embodiment, the head-mounted display glasses acquire the user's input information, which may be the user's attitude information or direction information input by the user through an input device for moving the cursor. That is, the head-mounted display glasses may control the cursor to move on the screen according to the user's attitude information, or according to the direction information input through the input device. After the cursor moves to a target point on the screen, the target point is selected according to the selection instruction input by the user, and the UAV is controlled to enter the corresponding flight mode or control mode, thereby implementing control modes for the UAV such as pointing flight, intelligent following, and camera focusing.
An embodiment of the present invention provides head-mounted display glasses. FIG. 6 is a structural diagram of the head-mounted display glasses provided by another embodiment of the present invention. As shown in FIG. 6, on the basis of FIG. 5, the head-mounted display glasses 50 further include an inertial measurement unit 53 connected to the one or more processors 51. The inertial measurement unit 53 is configured to sense the yaw angle and the pitch angle of the user's head and send the yaw angle and the pitch angle of the user's head to the one or more processors 51.
The one or more processors 51 are configured to determine the position of the movable marker on the screen according to the yaw angle and the pitch angle, and to select the target object displayed on the screen according to the position of the movable marker on the screen.
Specifically, the one or more processors 51 are configured to determine the horizontal coordinate of the movable marker on the screen according to the yaw angle, and the vertical coordinate of the movable marker on the screen according to the pitch angle.
Optionally, the one or more processors 51 are configured to determine the horizontal coordinate position of the movable marker on the screen according to a single yaw angle of the user's head sensed by the inertial measurement unit, and the vertical coordinate position of the movable marker on the screen according to a single pitch angle of the user's head sensed by the inertial measurement unit.
Alternatively, the one or more processors 51 determine the horizontal coordinate offset of the movable marker on the screen according to the angle difference between two yaw angles of the user's head sensed by the inertial measurement unit, and the vertical coordinate offset of the movable marker on the screen according to the angle difference between two pitch angles of the user's head sensed by the inertial measurement unit.
The specific principles and implementations of the head-mounted display glasses provided by this embodiment are similar to those of the embodiment shown in FIG. 2 and are not repeated here.
In this embodiment, the IMU senses the yaw angle and the pitch angle of the user's head, and the one or more processors determine the horizontal coordinate of the cursor on the screen according to the yaw angle and the vertical coordinate according to the pitch angle. The motion information of the user's head is thus accurately converted into position information of the cursor on the screen, which improves the accuracy of cursor position detection and, at the same time, the accuracy of the user's motion-based control of the UAV.
An embodiment of the present invention provides head-mounted display glasses. FIG. 7 is a structural diagram of the head-mounted display glasses provided by another embodiment of the present invention. As shown in FIG. 7, on the basis of FIG. 6, the one or more processors 51 are configured to adjust the position of the movable marker on the screen so that the movable marker points to the target object. The head-mounted display glasses 50 further include a receiving unit 54 connected to the one or more processors 51; the receiving unit 54 is configured to receive a selection instruction input by the user, the selection instruction being used to select the target object.
Optionally, the receiving unit 54 being configured to receive the selection instruction input by the user for selecting the target object includes: receiving a selection instruction input by the user by pressing the confirmation key of the five-way key; or receiving a selection instruction input by the user by tapping the touch pad; or receiving a selection instruction input by the user through eye movement; or receiving a selection instruction input by the user by tapping the touch screen; or selecting the target object after the movable marker dwells on the target object for a predetermined period of time; or receiving a selection instruction input by the user by tapping the virtual key; or receiving a selection instruction input by the user by operating the joystick.
In addition, the head-mounted display glasses 50 further include an eye sensor 55 connected to the one or more processors 51. The eye sensor 55 is configured to sense the movement information of the user's eyes and send the movement information to the one or more processors 51; the one or more processors 51 are configured to determine the selection instruction according to the movement information.
The selection instruction includes a point selection or a box selection.
The specific principles and implementations of the head-mounted display glasses provided by this embodiment are similar to those of the embodiment shown in FIG. 3 and are not repeated here.
In this embodiment, the VR glasses adjust the position of the cursor on the screen according to the movement of the user's head, so that the cursor can point to the target point or target object expected by the user, thereby implementing point selection or box selection of the target point or target object. This enables the user to perform point-selection and box-selection operations through the VR glasses, and remedies the defect that, because VR glasses lack a touch screen, control modes of the UAV that rely on a touch screen, such as the pointing-flight mode, the intelligent-following mode, and camera focusing, could not be implemented.
An embodiment of the present invention provides head-mounted display glasses. FIG. 8 is a structural diagram of the head-mounted display glasses provided by another embodiment of the present invention. As shown in FIG. 8, on the basis of any of the foregoing head-mounted display glasses embodiments, taking the embodiment shown in FIG. 7 as an example, the one or more processors 51 are further configured to acquire the state information of the UAV, and to synchronize the locally stored state information of the UAV according to the state information of the UAV.
In addition, the head-mounted display glasses 50 further include a sending unit 56 connected to the one or more processors 51. The sending unit 56 is configured to send a state-information acquisition request to the UAV; the receiving unit 54 is further configured to receive the state information sent by the UAV.
Alternatively, the receiving unit 54 is further configured to receive the state information broadcast by the UAV.
Alternatively, the receiving unit 54 is further configured to receive the state information of the UAV sent by another UAV control terminal, where the other UAV control terminal is configured to control the UAV and to receive the state information broadcast by the UAV.
The specific principles and implementations of the head-mounted display glasses provided by this embodiment are similar to those of the embodiment shown in FIG. 4 and are not repeated here.
In this embodiment, multiple control devices each acquire the state information of the UAV and synchronize their locally stored state information of the UAV accordingly, so that the state information stored by each control device is consistent with the current state of the UAV. This avoids control confusion or mutual interference when multiple control devices control one UAV at the same time.
An embodiment of the present invention provides a UAV control system. The UAV control system provided by this embodiment includes a UAV and the head-mounted display glasses described in the foregoing embodiments, where the head-mounted display glasses are configured to control the UAV.
The specific principles and implementations of the head-mounted display glasses provided by this embodiment are similar to those of the foregoing embodiments and are not repeated here.
In this embodiment, the head-mounted display glasses acquire the user's input information, which may be the user's attitude information or direction information input by the user through an input device for moving the cursor. That is, the head-mounted display glasses may control the cursor to move on the screen according to the user's attitude information, or according to the direction information input through the input device. After the cursor moves to a target point on the screen, the target point is selected according to the selection instruction input by the user, and the UAV is controlled to enter the corresponding flight mode or control mode, thereby implementing control modes for the UAV such as pointing flight, intelligent following, and camera focusing.
In the several embodiments provided in the present invention, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative. For example, the division of the units is merely a division by logical function, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. Furthermore, the mutual couplings or direct couplings or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, apparatuses, or units, and may be electrical, mechanical, or in other forms.
The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) or a processor to perform some of the steps of the methods described in the embodiments of the present invention. The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Those skilled in the art can clearly understand that, for convenience and brevity of description, only the division of the functional modules described above is used as an example. In practical applications, the above functions may be allocated to different functional modules as needed; that is, the internal structure of the apparatus may be divided into different functional modules to complete all or some of the functions described above. For the working process of the apparatus described above, reference may be made to the corresponding process in the foregoing method embodiments, which is not repeated here.
Finally, it should be noted that the above embodiments are merely intended to describe the technical solutions of the present invention rather than to limit them. Although the present invention has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they may still modify the technical solutions described in the foregoing embodiments or make equivalent replacements to some or all of the technical features therein, and such modifications or replacements do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present invention.

Claims (52)

  1. An unmanned aerial vehicle (UAV) control method, comprising:
    displaying a movable marker;
    acquiring input information of a user; and
    moving the movable marker according to the input information to control the UAV.
  2. The method according to claim 1, wherein the input information is attitude information of the user.
  3. The method according to claim 1, wherein the input information is direction information input by the user through an input device for moving the movable marker.
  4. The method according to claim 3, wherein the input device comprises at least one of the following:
    a five-way key, a joystick, a touch pad, a touch screen, or a virtual key.
  5. The method according to claim 2, wherein the attitude information comprises at least one of the following:
    a yaw angle and a pitch angle.
  6. The method according to claim 5, wherein the acquiring input information of a user comprises:
    acquiring the yaw angle and the pitch angle of the user's head sensed by an inertial measurement unit.
  7. The method according to claim 6, wherein the moving the movable marker according to the input information to control the UAV comprises:
    determining a position of the movable marker on a screen according to the yaw angle and the pitch angle; and
    selecting a target object displayed on the screen according to the position of the movable marker on the screen.
  8. The method according to claim 7, wherein the determining the position of the movable marker on the screen according to the yaw angle and the pitch angle comprises:
    determining a horizontal coordinate of the movable marker on the screen according to the yaw angle; and
    determining a vertical coordinate of the movable marker on the screen according to the pitch angle.
  9. The method according to claim 8, wherein the determining the horizontal coordinate of the movable marker on the screen according to the yaw angle comprises:
    determining a horizontal coordinate position of the movable marker on the screen according to a single yaw angle of the user's head sensed by the inertial measurement unit;
    and the determining the vertical coordinate of the movable marker on the screen according to the pitch angle comprises:
    determining a vertical coordinate position of the movable marker on the screen according to a single pitch angle of the user's head sensed by the inertial measurement unit.
  10. The method according to claim 8, wherein the determining the horizontal coordinate of the movable marker on the screen according to the yaw angle comprises:
    determining a horizontal coordinate offset of the movable marker on the screen according to an angle difference between two yaw angles of the user's head sensed by the inertial measurement unit;
    and the determining the vertical coordinate of the movable marker on the screen according to the pitch angle comprises:
    determining a vertical coordinate offset of the movable marker on the screen according to an angle difference between two pitch angles of the user's head sensed by the inertial measurement unit.
  11. The method according to claim 7, wherein the selecting the target object displayed on the screen according to the position of the movable marker on the screen comprises:
    adjusting the position of the movable marker on the screen so that the movable marker points to the target object; and
    receiving a selection instruction input by the user, the selection instruction being used to select the target object.
  12. The method according to claim 11, wherein the receiving the selection instruction input by the user for selecting the target object comprises:
    receiving a selection instruction input by the user by pressing the confirmation key of a five-way key; or
    receiving a selection instruction input by the user by tapping a touch pad; or
    receiving a selection instruction input by the user through eye movement; or
    receiving a selection instruction input by the user by tapping a touch screen; or
    selecting the target object after the movable marker dwells on the target object for a predetermined period of time; or
    receiving a selection instruction input by the user by tapping the virtual key; or
    receiving a selection instruction input by the user by operating a joystick.
  13. The method according to claim 12, wherein the selection instruction comprises a point selection or a box selection.
  14. The method according to any one of claims 1-13, further comprising:
    acquiring state information of the UAV; and
    synchronizing locally stored state information of the UAV according to the state information of the UAV.
  15. The method according to claim 14, wherein the acquiring state information of the UAV comprises:
    sending a state-information acquisition request to the UAV; and
    receiving the state information sent by the UAV.
  16. The method according to claim 14, wherein the acquiring state information of the UAV comprises:
    receiving the state information broadcast by the UAV.
  17. Head-mounted display glasses for controlling a UAV, comprising: one or more processors and a screen, wherein the screen is configured to display a movable marker, and the one or more processors are configured to acquire input information of a user and to move the movable marker according to the input information to control the UAV.
  18. The head-mounted display glasses according to claim 17, wherein the input information is attitude information of the user.
  19. The head-mounted display glasses according to claim 17, wherein the input information is direction information input by the user through an input device for moving the movable marker.
  20. The head-mounted display glasses according to claim 19, wherein the input device comprises at least one of the following:
    a five-way key, a joystick, a touch pad, a touch screen, or a virtual key.
  21. The head-mounted display glasses according to claim 18, wherein the attitude information comprises at least one of the following:
    a yaw angle and a pitch angle.
  22. The head-mounted display glasses according to claim 21, further comprising:
    an inertial measurement unit connected to the one or more processors, the inertial measurement unit being configured to sense the yaw angle and the pitch angle of the user's head and send the yaw angle and the pitch angle of the user's head to the one or more processors.
  23. The head-mounted display glasses according to claim 22, wherein the one or more processors are configured to:
    determine a position of the movable marker on the screen according to the yaw angle and the pitch angle; and
    select a target object displayed on the screen according to the position of the movable marker on the screen.
  24. The head-mounted display glasses according to claim 23, wherein the one or more processors are configured to:
    determine a horizontal coordinate of the movable marker on the screen according to the yaw angle; and
    determine a vertical coordinate of the movable marker on the screen according to the pitch angle.
  25. The head-mounted display glasses according to claim 24, wherein the one or more processors are configured to:
    determine a horizontal coordinate position of the movable marker on the screen according to a single yaw angle of the user's head sensed by the inertial measurement unit; and
    determine a vertical coordinate position of the movable marker on the screen according to a single pitch angle of the user's head sensed by the inertial measurement unit.
  26. The head-mounted display glasses according to claim 24, wherein the one or more processors are configured to:
    determine a horizontal coordinate offset of the movable marker on the screen according to an angle difference between two yaw angles of the user's head sensed by the inertial measurement unit; and
    determine a vertical coordinate offset of the movable marker on the screen according to an angle difference between two pitch angles of the user's head sensed by the inertial measurement unit.
  27. The head-mounted display glasses according to claim 23, wherein the one or more processors are configured to adjust the position of the movable marker on the screen so that the movable marker points to the target object;
    the head-mounted display glasses further comprising:
    a receiving unit connected to the one or more processors, the receiving unit being configured to receive a selection instruction input by the user, the selection instruction being used to select the target object.
  28. The head-mounted display glasses according to claim 27, wherein the receiving unit being configured to receive the selection instruction input by the user for selecting the target object comprises:
    receiving a selection instruction input by the user by pressing the confirmation key of a five-way key; or
    receiving a selection instruction input by the user by tapping a touch pad; or
    receiving a selection instruction input by the user through eye movement; or
    receiving a selection instruction input by the user by tapping a touch screen; or
    selecting the target object after the movable marker dwells on the target object for a predetermined period of time; or
    receiving a selection instruction input by the user by tapping the virtual key; or
    receiving a selection instruction input by the user by operating a joystick.
  29. The head-mounted display glasses according to claim 27, further comprising:
    an eye sensor connected to the one or more processors, the eye sensor being configured to sense movement information of the user's eyes and send the movement information to the one or more processors;
    the one or more processors being configured to determine the selection instruction according to the movement information.
  30. The head-mounted display glasses according to claim 27, wherein the selection instruction comprises a point selection or a box selection.
  31. The head-mounted display glasses according to any one of claims 17-30, wherein the one or more processors are further configured to:
    acquire state information of the UAV; and
    synchronize locally stored state information of the UAV according to the state information of the UAV.
  32. The head-mounted display glasses according to claim 31, further comprising:
    a sending unit connected to the one or more processors, the sending unit being configured to send a state-information acquisition request to the UAV;
    the receiving unit being further configured to receive the state information sent by the UAV.
  33. The head-mounted display glasses according to claim 31, wherein the receiving unit is further configured to receive the state information broadcast by the UAV.
  34. The head-mounted display glasses according to claim 31, wherein the receiving unit is further configured to receive the state information of the UAV sent by another UAV control terminal, the other UAV control terminal being configured to control the UAV and to receive the state information broadcast by the UAV.
  35. A UAV control system, comprising a UAV and head-mounted display glasses, wherein the head-mounted display glasses comprise:
    a screen configured to display a movable marker; and
    one or more processors configured to:
    acquire input information of a user; and
    move the movable marker according to the input information to control the UAV.
  36. The UAV control system according to claim 35, wherein the input information is attitude information of the user.
  37. The UAV control system according to claim 35, wherein the input information is direction information input by the user through an input device for moving the movable marker.
  38. The UAV control system according to claim 37, wherein the input device comprises at least one of the following:
    a five-way key, a joystick, a touch pad, a touch screen, or a virtual key.
  39. The UAV control system according to claim 36, wherein the attitude information comprises at least one of the following:
    a yaw angle and a pitch angle.
  40. The UAV control system according to claim 39, further comprising:
    an inertial measurement unit connected to the one or more processors, the inertial measurement unit being configured to sense the yaw angle and the pitch angle of the user's head and send the yaw angle and the pitch angle of the user's head to the one or more processors.
  41. The UAV control system according to claim 40, wherein the one or more processors are configured to:
    determine a position of the movable marker on the screen according to the yaw angle and the pitch angle; and
    select a target object displayed on the screen according to the position of the movable marker on the screen.
  42. The UAV control system according to claim 41, wherein the one or more processors are configured to:
    determine a horizontal coordinate of the movable marker on the screen according to the yaw angle; and
    determine a vertical coordinate of the movable marker on the screen according to the pitch angle.
  43. The UAV control system according to claim 42, wherein the one or more processors are configured to:
    determine a horizontal coordinate position of the movable marker on the screen according to a single yaw angle of the user's head sensed by the inertial measurement unit; and
    determine a vertical coordinate position of the movable marker on the screen according to a single pitch angle of the user's head sensed by the inertial measurement unit.
  44. The UAV control system according to claim 42, wherein the one or more processors are configured to:
    determine a horizontal coordinate offset of the movable marker on the screen according to an angle difference between two yaw angles of the user's head sensed by the inertial measurement unit; and
    determine a vertical coordinate offset of the movable marker on the screen according to an angle difference between two pitch angles of the user's head sensed by the inertial measurement unit.
  45. The UAV control system according to claim 41, wherein the one or more processors are configured to adjust the position of the movable marker on the screen so that the movable marker points to the target object;
    the head-mounted display glasses further comprising:
    a receiving unit connected to the one or more processors, the receiving unit being configured to receive a selection instruction input by the user, the selection instruction being used to select the target object.
  46. The UAV control system according to claim 45, wherein the receiving unit being configured to receive the selection instruction input by the user for selecting the target object comprises:
    receiving a selection instruction input by the user by pressing the confirmation key of a five-way key; or
    receiving a selection instruction input by the user by tapping a touch pad; or
    receiving a selection instruction input by the user through eye movement; or
    receiving a selection instruction input by the user by tapping a touch screen; or
    selecting the target object after the movable marker dwells on the target object for a predetermined period of time; or
    receiving a selection instruction input by the user by tapping the virtual key; or
    receiving a selection instruction input by the user by operating a joystick.
  47. The UAV control system according to claim 45, further comprising:
    an eye sensor connected to the one or more processors, the eye sensor being configured to sense movement information of the user's eyes and send the movement information to the one or more processors;
    the one or more processors being configured to determine the selection instruction according to the movement information.
  48. The UAV control system according to claim 45, wherein the selection instruction comprises a point selection or a box selection.
  49. The UAV control system according to any one of claims 35-48, wherein the one or more processors are further configured to:
    acquire state information of the UAV; and
    synchronize locally stored state information of the UAV according to the state information of the UAV.
  50. The UAV control system according to claim 49, further comprising:
    a sending unit connected to the one or more processors, the sending unit being configured to send a state-information acquisition request to the UAV;
    the receiving unit being further configured to receive the state information sent by the UAV.
  51. The UAV control system according to claim 49, wherein the receiving unit is further configured to receive the state information broadcast by the UAV.
  52. The UAV control system according to claim 49, wherein the receiving unit is further configured to receive the state information of the UAV sent by another UAV control terminal, the other UAV control terminal being configured to control the UAV and to receive the state information broadcast by the UAV.
PCT/CN2016/100072 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system WO2018053824A1 (zh)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP2019516208A 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system
PCT/CN2016/100072 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system
CN201680003670.6A 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system
EP16916563.6A EP3511758A4 (en) 2016-09-26 2016-09-26 METHOD FOR CONTROLLING AN UNMANUFACTURED AIRCRAFT, GLASSES OF A HEAD-MOUNTED DISPLAY AND SYSTEM
US16/363,497 US20190220040A1 (en) 2016-09-26 2019-03-25 Unmanned aerial vehicle control method, head-mounted display glasses, and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2016/100072 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/363,497 Continuation US20190220040A1 (en) 2016-09-26 2019-03-25 Unmanned aerial vehicle control method, head-mounted display glasses, and system

Publications (1)

Publication Number Publication Date
WO2018053824A1 true WO2018053824A1 (zh) 2018-03-29

Family

ID=60485742

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/100072 WO2018053824A1 (zh) 2016-09-26 2016-09-26 Unmanned aerial vehicle control method, head-mounted display glasses, and system

Country Status (5)

Country Link
US (1) US20190220040A1 (zh)
EP (1) EP3511758A4 (zh)
JP (1) JP6851470B2 (zh)
CN (1) CN107454947A (zh)
WO (1) WO2018053824A1 (zh)


Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019082891A (ja) * 2017-10-31 2019-05-30 セイコーエプソン株式会社 頭部装着型表示装置、表示制御方法、およびコンピュータープログラム
CN107943088A (zh) * 2017-12-22 2018-04-20 广州亿航智能技术有限公司 一种控制无人机的方法及其系统
CN109992096A (zh) * 2017-12-29 2019-07-09 北京亮亮视野科技有限公司 激活智能眼镜功能图标的方法
CN110337621A (zh) * 2018-04-09 2019-10-15 深圳市大疆创新科技有限公司 运动轨迹确定、延时摄影方法、设备及机器可读存储介质
CN109316741A (zh) * 2018-07-17 2019-02-12 派视觉虚拟现实(深圳)软件技术有限公司 一种vr场景中控制角色移动的方法、装置及设备
CN109300478A (zh) * 2018-09-04 2019-02-01 上海交通大学 一种听力障碍者的辅助对话装置
CN109540834A (zh) * 2018-12-13 2019-03-29 深圳市太赫兹科技创新研究院 一种电缆老化监测方法及系统
CN110412996A (zh) * 2019-06-18 2019-11-05 中国人民解放军军事科学院国防科技创新研究院 一种基于手势和眼动的无人机操控方法、装置和系统
CN110347163B (zh) * 2019-08-07 2022-11-18 京东方科技集团股份有限公司 一种无人驾驶设备的控制方法、设备及无人驾驶控制系统
CN110764521B (zh) * 2019-10-15 2021-09-24 中国航空无线电电子研究所 面向多无人机的地面站任务飞行一体化监控系统及方法
CN110753313A (zh) * 2019-10-30 2020-02-04 深圳市道通智能航空技术有限公司 一种数据同步方法和系统
WO2021090906A1 (ja) * 2019-11-08 2021-05-14 株式会社エスイーフォー 制御装置、制御ユニット、それらを有する制御システム
US11804052B2 (en) * 2020-03-26 2023-10-31 Seiko Epson Corporation Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path
CN112771465A (zh) * 2020-04-27 2021-05-07 深圳市大疆创新科技有限公司 无人机的控制方法、系统、装置及存储介质
CN114527864B (zh) * 2020-11-19 2024-03-15 京东方科技集团股份有限公司 增强现实文字显示系统、方法、设备及介质
US20220374085A1 (en) * 2021-05-19 2022-11-24 Apple Inc. Navigating user interfaces using hand gestures
KR102605774B1 (ko) * 2021-11-24 2023-11-29 주식회사 딥파인 스마트 글래스 및 이를 포함하는 음성 인식 시스템


Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001306144A (ja) * 2000-04-21 2001-11-02 Yamaha Motor Co Ltd 無人ヘリコプタの飛行制御システム
US8521339B2 (en) * 2008-09-09 2013-08-27 Aeryon Labs Inc. Method and system for directing unmanned vehicles
FR2985329B1 (fr) * 2012-01-04 2015-01-30 Parrot Procede de pilotage intuitif d'un drone au moyen d'un appareil de telecommande.
JP2014063411A (ja) * 2012-09-24 2014-04-10 Casio Comput Co Ltd 遠隔制御システム、制御方法、及び、プログラム
JP2014212479A (ja) * 2013-04-19 2014-11-13 ソニー株式会社 制御装置、制御方法及びコンピュータプログラム
CN205139708U (zh) * 2015-10-28 2016-04-06 上海顺砾智能科技有限公司 一种无人机的动作识别远程控制装置
CN105357220B (zh) * 2015-12-04 2019-04-26 深圳一电航空技术有限公司 无人机管控方法及系统
CN105867613A (zh) * 2016-03-21 2016-08-17 乐视致新电子科技(天津)有限公司 基于虚拟现实系统的头控交互方法及装置

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102266672A (zh) * 2010-03-11 2011-12-07 鹦鹉股份有限公司 一种远程控制无人驾驶飞机、尤其是旋翼无人驾驶飞机的方法和装置
CN103620527A (zh) * 2011-05-10 2014-03-05 寇平公司 使用动作和语音命令来控制信息显示和远程设备的头戴式计算机
US20130038692A1 (en) * 2011-08-09 2013-02-14 Kabushiki Kaisha Topcon Remote Control System
CN202587217U (zh) * 2011-12-16 2012-12-05 新时代集团国防科技研究中心 用于无人飞行器作业的辅助显示装置
WO2013158050A1 (en) * 2012-04-16 2013-10-24 Airnamics, Napredni Mehatronski Sistemi D.O.O. Stabilization control system for flying or stationary platforms
CN105763790A (zh) * 2014-11-26 2016-07-13 鹦鹉股份有限公司 用于以沉浸模式来驾驶无人机的视频系统
CN105222761A (zh) * 2015-10-29 2016-01-06 哈尔滨工业大学 借助虚拟现实及双目视觉技术实现的第一人称沉浸式无人机驾驶系统及驾驶方法
CN105739525A (zh) * 2016-02-14 2016-07-06 普宙飞行器科技(深圳)有限公司 一种配合体感操作实现虚拟飞行的系统

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3511758A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109309709A (zh) * 2018-08-22 2019-02-05 北京臻迪科技股份有限公司 一种可远端控制无人装置的控制方法
CN109309709B (zh) * 2018-08-22 2021-08-10 北京臻迪科技股份有限公司 一种可远端控制无人装置的控制方法

Also Published As

Publication number Publication date
US20190220040A1 (en) 2019-07-18
CN107454947A (zh) 2017-12-08
JP2020506443A (ja) 2020-02-27
EP3511758A1 (en) 2019-07-17
JP6851470B2 (ja) 2021-03-31
EP3511758A4 (en) 2019-08-28

Similar Documents

Publication Publication Date Title
WO2018053824A1 (zh) 2018-03-29 Unmanned aerial vehicle control method, head-mounted display glasses, and system
US10674062B2 (en) Control method for photographing using unmanned aerial vehicle, photographing method using unmanned aerial vehicle, mobile terminal, and unmanned aerial vehicle
US11914370B2 (en) System and method for providing easy-to-use release and auto-positioning for drone applications
US10897567B2 (en) Systems and methods for controlling an image captured by an imaging device
WO2018214078A1 (zh) 拍摄控制方法及装置
CN104486543B (zh) 智能终端触控方式控制云台摄像头的系统
US9703288B1 (en) System and method for aerial system control
KR20180075191A (ko) 무인 이동체를 제어하기 위한 방법 및 전자 장치
WO2018133593A1 (zh) 智能终端的控制方法和装置
US11228737B2 (en) Output control apparatus, display terminal, remote control system, control method, and non-transitory computer-readable medium
CN113448343B (zh) 用于设定飞行器的目标飞行路径的方法、系统和可读介质
JP2020005146A (ja) 出力制御装置、表示端末、情報処理装置、移動体、遠隔制御システム、出力制御方法、プログラムおよび撮影制御装置
KR20120136719A (ko) 손과 눈의 3차원 위치정보를 이용한 원거리 스크린 상의 물체지목 및 제어방법
CN113677412B (zh) 信息处理装置、信息处理方法和程序
US10250813B2 (en) Methods and systems for sharing views
WO2023025203A1 (zh) 云台相机的变焦控制方法、装置及终端
WO2023025202A1 (zh) 云台方向的控制方法、装置及终端
WO2017187220A1 (en) System and method for aerial system control
KR20170093389A (ko) 무인 비행체 제어 방법
CN117891343A (zh) 用于对输入设备的输入进行消抖的方法和设备
CN115278178A (zh) 一种处理方法、处理装置和电子设备
CN115562479A (zh) 电子设备的控制方法、电子设备的控制装置和穿戴设备

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16916563; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019516208; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2016916563; Country of ref document: EP; Effective date: 20190411)