CN108475064B - Method, apparatus, and computer-readable storage medium for apparatus control - Google Patents

Method, apparatus, and computer-readable storage medium for apparatus control

Info

Publication number
CN108475064B
Authority
CN
China
Prior art keywords
space
determining
coordinate
mapping relationship
processor
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
CN201780004525.4A
Other languages
Chinese (zh)
Other versions
CN108475064A (en)
Inventor
陈喆君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
SZ DJI Technology Co Ltd
Original Assignee
SZ DJI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by SZ DJI Technology Co Ltd
Publication of CN108475064A
Application granted
Publication of CN108475064B

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G05D1/106 Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/08 Control of attitude, i.e. control of roll, pitch, or yaw
    • G05D1/0808 Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/10 Simultaneous control of position or course in three dimensions
    • G05D1/101 Simultaneous control of position or course in three dimensions specially adapted for aircraft
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

A method for a first device (100) to control a second device (200), a corresponding first device (100), and a computer-readable storage medium are provided. A method (700) performed at a first device (100) for controlling a second device (200) comprises: (step S710) determining a first space (10) associated with the first device (100) and a second space (20) associated with the second device (200); (step S720) determining a first coordinate mapping relationship between the first space (10) and the second space (20); (step S730) determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space (20) by the second device (200) from a first operation of the first device (100) in the first space (10); and (step S740) sending a control instruction to the second device (200) to instruct the second device (200) to perform the second operation.

Description

Method, apparatus, and computer-readable storage medium for apparatus control
Copyright declaration
The disclosure of this patent document contains material which is subject to copyright protection. The copyright is owned by the copyright owner, who has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure as it appears in the Patent and Trademark Office official patent file or records.
Technical Field
The present disclosure relates to the field of remote control, and more particularly, to a method, apparatus, and computer-readable storage medium for apparatus control.
Background
An Unmanned Aerial Vehicle (UAV), also commonly referred to as a "drone," an "unmanned aircraft system (UAS)," or by several other names, is an aircraft with no human pilot on board. The flight of a drone may be controlled in various ways: for example, by remote control by a human operator (sometimes referred to as a "flyer" or pilot), or by the drone flying in a semi-autonomous or fully autonomous manner.
During remote control, the flyer must be able to adjust the drone's flight attitude at any time as needed. For most people, however, this way of controlling a drone is far removed from everyday experience with driving a car or operating a remote-controlled toy, so it requires complicated and lengthy professional training. Against this background, how to simplify drone operation, or even automate or semi-automate it, has become one of the problems to be solved urgently.
Disclosure of Invention
According to a first aspect of the present disclosure, a method performed at a first device for controlling a second device is presented. The method comprises the following steps: determining a first space associated with the first device and a second space associated with the second device; determining a first coordinate mapping relationship between the first space and the second space; determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device in accordance with a first operation of the first device in the first space; and sending a control instruction to the second device to instruct the second device to perform the second operation.
According to a second aspect of the present disclosure, a first device for controlling a second device is presented. The first device includes: a space determination module configured to determine a first space associated with the first device and a second space associated with the second device; a first mapping relation determining module configured to determine a first coordinate mapping relationship between the first space and the second space; a second operation determination module configured to determine, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device according to a first operation of the first device in the first space; and an instruction sending module configured to send a control instruction to the second device to instruct the second device to perform the second operation.
According to a third aspect of the present disclosure, a first device for controlling a second device is presented. The first device includes: a processor; a memory having instructions stored therein, which when executed by the processor, cause the processor to: determining a first space associated with the first device and a second space associated with the second device; determining a first coordinate mapping relationship between the first space and the second space; determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device in accordance with a first operation of the first device in the first space; and sending a control instruction to the second device to instruct the second device to perform the second operation.
According to a fourth aspect of the present disclosure, a computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method according to the first aspect of the present disclosure is presented.
Drawings
For a more complete understanding of the disclosed embodiments and the advantages thereof, reference is now made to the following descriptions taken in conjunction with the accompanying drawings, in which:
fig. 1 is a diagram illustrating an example first space according to an embodiment of the present disclosure.
Fig. 2 is a diagram illustrating an example second space according to an embodiment of the present disclosure.
Fig. 3 is a flowchart illustrating an example synchronization process of an example first device with an example second device, according to an embodiment of the present disclosure.
Fig. 4 is a diagram illustrating an example scenario when an example first device leaves a first space, according to an embodiment of the present disclosure.
Fig. 5 is an example scenario illustrating an example second device encountering an obstacle while the example first device controls the example second device, according to an embodiment of the disclosure.
Fig. 6 is a flowchart illustrating an example resynchronization procedure when an example second space is redetermined according to an embodiment of the present disclosure.
Fig. 7 is a flowchart illustrating an example method for an example first device to control an example second device in accordance with an embodiment of the present disclosure.
Fig. 8 is a functional block diagram illustrating an example first device for controlling an example second device according to an embodiment of the present disclosure.
Fig. 9 is a hardware schematic diagram illustrating an example first device for controlling an example second device, according to an embodiment of the present disclosure.
Furthermore, the figures are not necessarily to scale, but rather are shown in a schematic manner that does not detract from the reader's understanding.
Detailed Description
Other aspects, advantages, and salient features of the disclosure will become apparent to those skilled in the art from the following detailed description, which, taken in conjunction with the annexed drawings, discloses exemplary embodiments of the disclosure.
In the present disclosure, the terms "include" and "comprise," as well as derivatives thereof, mean inclusion without limitation.
In this specification, the various embodiments described below which are used to describe the principles of the present disclosure are by way of illustration only and should not be construed in any way to limit the scope of the disclosure. The following description with reference to the accompanying drawings is provided to assist in a comprehensive understanding of exemplary embodiments of the present disclosure as defined by the claims and their equivalents. The following description includes various specific details to aid understanding, but such details are to be regarded as illustrative only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Moreover, descriptions of well-known functions and constructions are omitted for clarity and conciseness. Further, the same reference numbers are used throughout the drawings for the same or similar functions and operations. Moreover, although aspects may have been described in terms of various features in different embodiments, those skilled in the art will recognize that: all or portions of the features of the different embodiments may be combined to form new embodiments without departing from the spirit and scope of the present disclosure.
Please note that: although the following embodiments are described in detail with the drone as a manipulation object and the head-mounted display as a manipulation subject, the present disclosure is not limited thereto. In fact, the manipulation object may be any manipulation object, such as a robot, a remote control car, an airplane, etc., or any device that can be remotely controlled. Furthermore, the manipulating body may also be, for example, a fixed terminal (e.g., desktop), a mobile terminal (e.g., mobile phone, tablet), other wearable devices besides a head-mounted display, a remote controller, a handle, a joystick, etc., or any device that can issue a manipulation instruction.
Before formally describing some embodiments of the present disclosure, some of the terms that will be used herein will first be described.
Virtual Reality (VR): virtual reality technology is an important branch of simulation technology and is a collection of technologies including simulation, computer graphics, human-machine interfaces, multimedia, sensing, and networking. Virtual reality technology mainly covers the simulated environment, perception, natural skills, and sensing devices. The simulated environment is typically a real-time, dynamic, three-dimensional realistic image generated by a computer. Perception means that an ideal VR system should provide all the forms of perception a person has: in addition to the visual perception generated by computer graphics, there may be auditory, tactile, force, and motion perception, and even smell and taste, which is also referred to as multi-perception. Natural skills refer to a person's head rotation, eye movement, gestures, or other body actions; the computer processes data corresponding to the participant's actions, responds to the user's input in real time, and feeds the responses back to the user's senses. The sensing devices are three-dimensional interaction devices.
Euler angles/attitude angles: the relationship between the body coordinate system (whose three axes point, for example, from tail to nose, from left wing to right wing, and perpendicular to both of these directions, i.e., perpendicular to the plane of the aircraft and pointing below the body) and the ground coordinate system (also known as the geodetic coordinate system, whose three axes point, for example, east, north, and toward the Earth's center) is given by three Euler angles, which reflect the attitude of the aircraft relative to the ground. The three Euler angles are: pitch angle (pitch), yaw angle (yaw), and roll angle (roll).
Pitch angle θ (pitch): the angle between the X-axis of the body coordinate system (e.g., the direction from tail to nose) and the horizontal plane. The pitch angle is positive when the positive half of the X-axis is above the horizontal plane through the coordinate origin (nose up), and negative otherwise. When the pitch angle of an aircraft changes, its subsequent flying height usually changes. If the pitch angle of the image sensor changes, the picture it captures usually shifts up or down.
Yaw angle ψ (yaw): the angle between the projection of the body X-axis onto the horizontal plane and the X-axis of the ground coordinate system; yaw of the nose to the right is positive, and to the left is negative. When the yaw angle of an aircraft changes, its subsequent horizontal flight direction usually changes. If the yaw angle of the image sensor changes, the picture it captures usually shifts left or right.
Roll angle Φ (roll): the angle between the Z-axis of the body coordinate system (e.g., the direction in which the underside of the aircraft faces) and the vertical plane passing through the body X-axis; it is positive for roll to the right and negative for roll to the left. When the roll angle of an aircraft changes, its horizontal plane rotates. If the roll angle of the image sensor changes, the picture it captures usually tilts to the left or right.
As previously mentioned, a simple and easy-to-use drone control mode is needed. In the common control mode, the drone is usually controlled with a handheld remote controller; for example, the magnitude of the aircraft's velocity in each direction is controlled by the displacement applied to a joystick. Meanwhile, the flyer usually also needs to attend to the shooting angle of the camera (or, more generally, the image sensor or image sensing component) on the drone, which places certain demands on the flyer's workload and proficiency.
In the event of traffic congestion in urban areas, sudden crowd surges caused by major events or festivals, sudden disasters (earthquakes, fires, terrorist attacks, etc.), or even small-scale military conflicts, decision makers need instant and intuitive local information in order to conduct on-site command. As another example, the news industry needs to report events from as many angles as possible. With the current state of the art, the most important approach is aerial coverage and monitoring, either by fixed or portable monitoring devices on site, or by aircraft. However, these common schemes have the following problems: completely covering the monitored area requires sufficient numbers of devices and operators, while ground monitoring equipment has limited mobility and is difficult to deploy flexibly. In addition, manned aircraft or UAVs cannot quickly switch from a high-altitude view to other viewing angles. The above approaches also require multiple professional operators and a summarization step before the information can be presented to the decision maker.
In order to at least partially solve or mitigate the above problems, a virtual-reality-based drone control system according to an embodiment of the present disclosure is proposed. The system enables commanders to flexibly and intuitively grasp instant pictures from various angles on site, while reducing the number of personnel and devices required. Through the system's intuitive control mode, the drone and its camera angle can be controlled by natural human behaviors (such as standing, squatting, walking, and/or head movement), and control of the flight path and lens angle that is difficult to accomplish with both hands can be completed simply by walking and turning.
In some embodiments of the present disclosure, the hardware of the drone control system may consist essentially of an input end, a communication end, and a terminal. In some embodiments, the input end may include, for example, a Head Mounted Display (HMD) and/or a handheld controller. The main role of the input end is to provide the virtual reality picture and an operation interface, so that the operator can observe and operate the drone according to the observed virtual reality picture. Please note that, in this context, the virtual reality picture is not limited to a purely virtual picture generated by a computer; it may also include, for example, an actual picture captured by an image sensor of the drone, a combination of an actual picture and a virtual picture, and/or a purely virtual picture. In other words, in the context of the present disclosure, Virtual Reality (VR) also includes Augmented Reality (AR). In some embodiments, the communication end may include, for example: various networks (e.g., the Internet, a local area network, a mobile communication network (3G, 4G, and/or 5G, etc.), a WiMax network, a fiber-optic network, etc.), a control center, and/or a ground station, etc. The main role of the communication end is to provide a communication link, communication control, and the like between the input end and the terminal. The communication end may transfer data, signals, etc. between the input end and the terminal in a wired manner, a wireless manner, or a combination thereof. In some embodiments, the terminal may include, for example, an Unmanned Aerial Vehicle (UAV), a robot, a remotely controlled car, an airplane, etc., or any device that can be remotely controlled.
In the embodiments described below, the input end is taken to be a Head Mounted Display (HMD), the communication end a wireless communication network (e.g., a 4G network), and the terminal a drone. However, as stated above, the present disclosure is not limited thereto. Furthermore, herein the HMD (or, more generally, the manipulating subject) will often be referred to as the "first device" and the drone (or, more generally, the manipulated object) as the "second device", although the disclosure is not limited thereto. In fact, as network speeds increase (e.g., with 5G networks) and technology develops, a function performed in a single device may be distributed among a plurality of devices. For example, some of the individual steps of the method performed at the first device as described below may equally be performed at the communication end or the terminal, so that the combination of the hardware of the devices performing these steps may be regarded as equivalent to the "first device". Similarly, some of the individual steps of a method performed at the second device may equally be performed at the input end or the communication end, so that the combination of the hardware of the devices performing these steps may be regarded as equivalent to the "second device".
Next, an initialization process for controlling a second device (e.g., a drone) using a first device (e.g., a head mounted display) according to an embodiment of the present disclosure will be described in detail in conjunction with fig. 1-3.
Fig. 1 is a view illustrating an example first space 10 according to an embodiment of the present disclosure, and fig. 2 is a view illustrating an example second space 20 according to an embodiment of the present disclosure. As shown in fig. 1, the first space 10 may be a space associated with the first device 100, in which a user wearing the first device 100 performs actual operations. The user wearing the first device 100 can perform actions such as standing, walking, turning, squatting, and jumping in this space. Upon sensing these actions, the first device 100 may interpret the actions of the user and/or the first device 100 in the first space 10 and convert them accordingly into actions to be performed by a manipulated object (e.g., the drone 200 shown in fig. 3) in the second space 20 shown in fig. 2, and then send manipulation instructions indicating these actions to the drone 200, in the manner described below. Please note that, since the user always wears the first device 100, the user and the first device 100 will not be distinguished below unless otherwise stated; in other words, "user" and "first device 100" may be used interchangeably hereinafter.
To determine the extent of this first space 10, a user wearing the first device 100 may do so by specifying all or part of the vertices and/or all or part of the side lengths of the first space 10. For example, in the embodiment shown in fig. 1, the first space 10 may be a cube as shown by a dotted line, and in order to specify the cube space, the user may specify any vertex of the first space 10 as an origin and specify lengths in various directions (e.g., X, Y and the Z-axis shown in fig. 1) as side lengths. In other embodiments, the extent of the first space 10 may also be specified by determining at least one of: the positions of the two vertices of the first space 10 and the length of the first space 10 on the coordinate axis perpendicular to the line formed by the two vertices; the positions of the three non-collinear vertexes of the first space 10 and the length of the first space 10 in the direction of the perpendicular to the plane formed by the three vertexes; and the position of the non-coplanar at least four vertices of the first space 10.
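As an illustration only (the class and method names below are not from the patent), an axis-aligned cuboid space of the kind described above could be represented as follows; the same representation applies to the second space 20 discussed later:

```python
from dataclasses import dataclass

@dataclass
class BoxSpace:
    """Axis-aligned cuboid space given by an origin vertex and side lengths (meters)."""
    origin: tuple   # (x, y, z) of the chosen origin vertex
    lengths: tuple  # side lengths along the X, Y and Z axes

    @classmethod
    def from_opposite_vertices(cls, v1, v2):
        # Two diagonally opposite vertices fully determine an axis-aligned box.
        origin = tuple(min(a, b) for a, b in zip(v1, v2))
        lengths = tuple(abs(a - b) for a, b in zip(v1, v2))
        return cls(origin, lengths)

    def contains(self, point):
        # Used, e.g., to detect when the first device leaves the first space.
        return all(o <= c <= o + l for c, o, l in zip(point, self.origin, self.lengths))

# A 10 m x 5 m x 2 m first space with its origin vertex at (0, 0, 0).
first_space = BoxSpace(origin=(0.0, 0.0, 0.0), lengths=(10.0, 5.0, 2.0))
print(first_space.contains((3.0, 1.0, 1.5)))  # True
```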
Furthermore, although the first space 10 is exemplified as a cube in the embodiment shown herein for the convenience and intuition of the reader, the present disclosure is not limited thereto. The first space 10 may have other shapes including (but not limited to): a sphere, a frustum, a pyramid, a cylinder, a cone, or any other regular or irregular solid structure.
In some embodiments, the vertices may be determined, for example, by the user pressing a handheld controller upon walking to a point, by a head action (e.g., nodding or shaking the head), by any other action (e.g., jumping or squatting), or even via another device operated by an observer, to inform the first device 100 that the point is a certain vertex of the first space 10. The side lengths may be determined, for example, by manual input through an input device such as a keyboard, or by detecting the distance the user actually walks, etc. In other embodiments, the user may also determine the extent of the first space 10 by virtually framing a captured image of the venue.
Furthermore, in some embodiments, the height of the origin of the first space 10 may be altered as follows: for example, when the first device 100 detects that the user has been in a squatting state for more than a certain amount of time (e.g., 1 second, 3 seconds, or any other suitable time), or upon a triggering event after squatting (e.g., pressing a corresponding button on a remote control), or a combination of the two, the origin of the first space 10 may be raised to, for example, the height of the user's eyes; the second device can then be kept at a low height in the second space, e.g., moving along the bottom surface of the second space. Similarly, in some embodiments, the height of the top surface of the first space 10 may be lowered to the user's eye level upon detection of a fixed action or, more generally, a triggering event (e.g., pressing a corresponding button on a remote control), thereby allowing the second device to move along the top surface of the second space. Further, in some embodiments, this change to the operating height of the second device may be deactivated by a timer (e.g., after the second device has operated at the changed height for 1 second, 3 seconds, or any other suitable time) or by a triggering event (e.g., pressing a corresponding button on a remote control). When the fixed travel height of the second device is released, the second device can return to the corresponding position in the second space, for example by travelling vertically back to that position. Further, in some embodiments, in order to enable the user to manipulate the second device through a squatting motion, a point within the range in which the user can squat may be selected as the origin of the first space 10 when determining the first space 10, making the origin-height change of the above embodiments possible.
In some embodiments, the operations of the above embodiments may be prompted to the user through an output device (e.g., a display) of the first device 100. For example, when squatting is detected for more than a predetermined time or a triggering event is detected, the display of the first device 100 may prompt the user that the height-change mode has been entered, or that the second device has released the height-change mode, etc., while the operations of the foregoing embodiments are performed.
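As an illustrative sketch of the squat-triggered height-change mode described above (the hold time, class name, and function names are assumptions, not the patent's implementation):

```python
import time

class HeightModeTracker:
    """Toggles a low-altitude mode once the user has squatted long enough."""

    def __init__(self, hold_seconds=1.0):   # 1 s hold time is an illustrative choice
        self.hold_seconds = hold_seconds
        self.squat_started_at = None
        self.low_altitude_mode = False

    def update(self, is_squatting, now=None):
        now = time.monotonic() if now is None else now
        if is_squatting:
            if self.squat_started_at is None:
                self.squat_started_at = now
            elif now - self.squat_started_at >= self.hold_seconds:
                # Raise the first-space origin / let the drone move along the bottom surface.
                self.low_altitude_mode = True
        else:
            self.squat_started_at = None
        return self.low_altitude_mode

    def release(self):
        # Called on a timer expiry or a trigger event (e.g., a remote-control button).
        self.low_altitude_mode = False

# Feed per-frame posture detections:
tracker = HeightModeTracker()
tracker.update(True, now=0.0)
print(tracker.update(True, now=1.2))  # True: squat held long enough, mode entered
```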
As shown in fig. 2, the second space 20 may be a space associated with the second device 200 for the second device 200 (e.g., a drone) to actually operate therein. As shown in fig. 2, the second device 200 can perform actions such as hovering, flying, turning, descending, ascending, adjusting the angle of view with a camera, etc. in the space. The second device 200 may receive the manipulation instruction from the first device 100 and perform a corresponding action according to a manner as described below. In the embodiment shown in fig. 2, the top and bottom of the second space 20 may correspond to the highest and lowest flying heights of the second device 200, respectively, although the present disclosure is not limited thereto.
Similar to the first space 10 shown in fig. 1, in order to determine the extent of the second space 20, the user may specify all or part of the vertices and/or all or part of the side lengths of the second space 20 on the three-dimensional electronic map. For example, in the embodiment shown in fig. 2, the second space 20 may also be a cube as shown by a dotted line, and in order to specify the cube space, the user may specify any vertex of the second space 20 as an origin and specify lengths in various directions (e.g., X, Y and the Z-axis shown in fig. 2) as side lengths. In other embodiments, the extent of the second space 20 may also be specified by determining at least one of: the position of the two vertices of the second space 20 on the three-dimensional electronic map and the length of the second space on the coordinate axis perpendicular to the line formed by the two vertices; the positions of the non-collinear three vertices of the second space 20 on the three-dimensional electronic map and the lengths of the second space 20 in the direction of the perpendicular to the plane formed by the three vertices; and the position of the non-coplanar at least four vertices of the second space 20 on the three-dimensional electronic map.
Further, although the second space 20 is exemplified as a cube in the embodiment shown herein for the convenience and intuition of the reader, the present disclosure is not limited thereto. The second space 20 may also have other shapes including (but not limited to): a sphere, a frustum, a pyramid, a cylinder, a cone, or any other regular or irregular solid structure. For example, in the case of approaching a no-fly zone such as an airport, since the no-fly zone has a rounded frustum shape, a side of the second space 20 close to the airport may have an irregular solid structure having a narrow-top and wide-bottom form.
In some embodiments, the manner of determining the vertex may be, for example, the user selecting a range of the second space 20 in the three-dimensional electronic map by framing, for example, selecting each vertex, or a part of the vertices and the side length, etc. of the second space 20 in the three-dimensional electronic map. Furthermore, in other embodiments, the range of the second space 20 may also be determined by manipulating the second device 200 to fly to a certain designated point in the air, then indicating the designated point to the first device 100 as a certain vertex (e.g., origin, center point, etc.) of the second space 20, and then specifying the respective side lengths.
In the embodiment shown in fig. 2, to facilitate the operation of the second device 200, the second space 20 may generally be designated such that no object exists therein that may block the flight of the second device 200. However, the present disclosure is not limited thereto; for example, as shown in the upper right of figs. 5 and 6, permanent or temporary obstacles may exist in the second space 20 (or 20') and affect the flight of the second device 200.
Furthermore, the operations for determining the first space 10 and the second space 20 described above in conjunction with fig. 1 and 2, respectively, may be performed sequentially, simultaneously, or partially sequentially and partially simultaneously, and the order of determining the two is not limited to the order described herein (i.e., first determining the first space 10 and then determining the second space 20), but may be reversed (i.e., first determining the second space 20 and then determining the first space 10).
Next, a coordinate mapping relationship between the coordinates in the first space 10 and the coordinates in the second space 20 needs to be determined. As previously shown, the first space 10 shown in fig. 1 and the second space 20 shown in fig. 2 each have a cubic shape for convenience and intuition of explanation. In most cases, the second space 20 has a size much larger than the first space 10, for example, the size of the second space 20 may be on the order of kilometers, while the size of the first space 10 may be on the order of meters. However, the present disclosure is not limited thereto, and the sizes of the first space 10 and the second space 20 may also be substantially equivalent or the size of the first space 10 may be larger than the size of the second space 20.
In the embodiment shown in fig. 1 and 2, since the first space 10 and the second space 20 are both cubes, a linear mapping relationship can be established between the coordinates of the two. For example, the origins of the first space 10 and the second space 20 (e.g., manually determined as described above or automatically determined by the first apparatus 100 according to the range of the first space 10 and/or the second space 20) are mapped to each other, and the X-axis, the Y-axis, and the Z-axis of the two are linearly mapped, respectively.
In some embodiments, the mapping ratios between the respective side lengths of the first space 10 and the corresponding side lengths of the second space 20 may be identical, but this is not required. For example, the lengths of the X, Y and Z axes of the first space 10 may be, for example, 10 meters, 5 meters and 2 meters, respectively, and the lengths of the X, Y and Z axes of the second space 20 may be, for example, 5 kilometers, 2.5 kilometers and 1 kilometer, respectively, so that the ratio of the side lengths on each of the three axes is 1/500. For another example, the lengths of the X, Y and Z axes of the first space 10 may be, for example, 10 meters, 10 meters and 2 meters, respectively, and the lengths of the X, Y and Z axes of the second space 20 may be, for example, 5 kilometers, 2.5 kilometers and 0.75 kilometers, respectively, so that the ratios of the side lengths on the three axes are 1/500, 1/250 and 1/375, respectively.
In the former case, if the user wearing the first device 100 walks 3 meters along the X-axis of the first space 10, the second device 200 accordingly flies 1.5 kilometers along its X-axis; if the user walks 3 meters along the Y-axis of the first space 10, the second device 200 correspondingly flies 1.5 kilometers along its Y-axis. In the latter case, the second device 200 still flies 1.5 kilometers along its X-axis when the user walks 3 meters along the X-axis of the first space 10; but when the user walks 3 meters along the Y-axis of the first space 10, the second device 200 correspondingly flies 0.75 kilometers along its Y-axis instead of 1.5 kilometers, unlike the previous case.
It can be seen that, in some embodiments of the present disclosure, once the respective proportions between the lengths of the first space 10 and the second space 20 along the coordinate axes are determined, a (first) coordinate mapping relationship between coordinates in the first space 10 and coordinates in the second space 20 may be determined based on those proportions. The first device 100 can thus map actions such as a displacement of the user in the first space 10 to actions such as a displacement to be performed by the second device 200 in the second space 20. Such a mapping is intuitive and simple, facilitating operation of the second device 200 by the user (or the first device 100).
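The linear per-axis mapping just described can be sketched as follows; the function names are illustrative, and the numbers reproduce the two examples above (uniform 1/500 ratios, and non-uniform 1/500, 1/250, 1/375 ratios):

```python
def axis_ratios(first_lengths, second_lengths):
    """Per-axis ratio of first-space side length to second-space side length."""
    return tuple(f / s for f, s in zip(first_lengths, second_lengths))

def map_displacement(first_disp, ratios):
    """Map a displacement in the first space to the corresponding second-space displacement."""
    return tuple(d / r for d, r in zip(first_disp, ratios))

# Uniform case: first space 10 m x 5 m x 2 m, second space 5 km x 2.5 km x 1 km.
uniform = axis_ratios((10, 5, 2), (5000, 2500, 1000))     # 1/500 on every axis
print(map_displacement((3, 0, 0), uniform))               # (1500.0, 0.0, 0.0): 1.5 km along X
print(map_displacement((0, 3, 0), uniform))               # (0.0, 1500.0, 0.0): 1.5 km along Y

# Non-uniform case: first space 10 m x 10 m x 2 m, second space 5 km x 2.5 km x 0.75 km.
non_uniform = axis_ratios((10, 10, 2), (5000, 2500, 750)) # 1/500, 1/250, 1/375
print(map_displacement((0, 3, 0), non_uniform))           # (0.0, 750.0, 0.0): 0.75 km along Y
```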
The synchronization process in the handling initialization will be described in detail next with reference to fig. 3.
Fig. 3 is a flowchart illustrating an example synchronization process of the example first device 100 and the example second device 200 according to an embodiment of the present disclosure. After the coordinate mapping relationship between the first space 10 and the second space 20 has been established as described in connection with figs. 1 and 2, as shown in the lower part of fig. 3, the user wearing the first device 100 may walk to a certain point in the first space 10 (e.g., approximately the center, as shown in fig. 3) and indicate to the first device 100 that a synchronization process is to be started between the first device 100 and the second device 200, i.e., issue a "synchronization activation" (e.g., via a controller handle in the user's hand, or by the HMD detecting a nod, a head shake, or any other triggering action).
Upon receiving the "synchronization activation" instruction, the first device 100 may detect (first) coordinates of itself in the first space 10 and determine (second) coordinates of a position where the second device 200 is to be located in the second space 20 according to the previously determined coordinate mapping relationship. When the first device 100 determines the second coordinates, it may send a "synchronization activation" instruction to the second device 200 through, for example, the aforementioned communication terminal or directly send the second device 200 to instruct the second device 200 to hover at the second coordinates and enter a "synchronization" state. Further, in some embodiments, while the second device 200 is in flight to the second coordinate, i.e., the second device is synchronizing, the user may be prompted by the first device 100, for example, with a "in sync" word or icon or other indicator to indicate to the user to not move for a while, thereby avoiding lengthening the synchronization process.
As shown in the upper part of fig. 3, the second device 200 may initially be outside the second space 20; upon receiving the "synchronization activation" command, it may take off and enter the second space 20 at a predetermined approach altitude. The approach altitude may depend on the highest and/or lowest height of the second space 20, or may be another height specified by the user. During the approach, the second device 200 may use any of its own obstacle-avoidance means or measures to avoid obstacles while entering the second space 20. In other words, the flight path of the second device 200 need not be the broken line shown in the upper part of fig. 3, but may be a path of any form (e.g., a curve, a straight line, a random line, etc.) and length; for example, in order to bypass an obstacle directly above it, the second device 200 may even fly some distance away from the second space 20, climb to the approach altitude, and then advance to the second coordinates in the second space 20. Meanwhile, the user can observe the flight status and surroundings of the second device 200 through a real-time image presented on the first device 100 and captured by an image sensing component (e.g., a camera) mounted on the second device 200, so as to ensure that nothing unexpected happens to the second device 200 while it enters the field.
In some embodiments, when the second device 200 reaches the second coordinates, it may return a "synchronization activation" confirmation message to the first device 100, for example via the communication end or directly, to indicate that it has reached the specified location and entered the "synchronized" state. From this point on, the picture captured by the image sensing component of the second device 200 may be transmitted and displayed on the first device 100 in real time. The operator can then perform any actions within the first space 10, such as walking, turning, raising/lowering the head, squatting/jumping, etc., to act on the first device 100 and control the second device 200 accordingly.
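The "synchronization activation" handshake described above might be organized roughly as follows; the message names, state names, and API are illustrative assumptions rather than the patent's protocol:

```python
from enum import Enum

class SyncState(Enum):
    IDLE = "idle"
    SYNCING = "syncing"            # second device is flying to the target coordinate
    SYNCHRONIZED = "synchronized"  # confirmation received, live video may be displayed

class SyncSession:
    """Minimal state machine for the 'synchronization activation' flow on the first device."""

    def __init__(self, send_to_second_device):
        self.state = SyncState.IDLE
        self.send = send_to_second_device  # forwards messages via the communication end

    def activate(self, first_coord, mapping):
        # Map the first device's current coordinate to the hover point in the second space.
        target = mapping(first_coord)
        self.state = SyncState.SYNCING     # the HMD would show an "in sync" indicator here
        self.send({"type": "synchronization_activation", "hover_at": target})
        return target

    def on_message(self, msg):
        if msg.get("type") == "synchronization_activation_ack":
            self.state = SyncState.SYNCHRONIZED

# Usage with the uniform 1/500 mapping from the earlier sketch:
session = SyncSession(send_to_second_device=print)
session.activate((5.0, 2.5, 1.0), mapping=lambda p: tuple(c * 500 for c in p))
session.on_message({"type": "synchronization_activation_ack"})
print(session.state)  # SyncState.SYNCHRONIZED
```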
In some embodiments, parameters such as the instantaneous acceleration, instantaneous speed, geometric coordinates, azimuth (yaw) angle, and/or pitch angle of the user or the first device 100 may be obtained in real time by a gyroscope, an accelerometer, a magnetic sensor, and/or a positioning device (e.g., GPS) mounted on the first device 100. For example, using an accelerometer on the first device 100, the first device 100 may determine its acceleration in a certain direction over a period of time, and hence its velocity, and may thereby determine its displacement over that period and its coordinates relative to the initial position in that direction. For another example, using a gyroscope, the first device 100 may detect the magnitude of the user's head-turning and/or head-raising/lowering movements and, in conjunction with the side lengths of the first space 10, determine the resulting changes in azimuth (yaw) angle and/or pitch angle in the first space 10.
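As a simplified numeric illustration of the dead-reckoning idea in the preceding paragraph (real HMD tracking fuses several sensors and is considerably more involved), a trapezoidal-integration sketch:

```python
def integrate_acceleration(samples, dt):
    """Integrate accelerometer samples (m/s^2) along one axis into velocity and displacement.

    Trapezoidal integration, assuming zero initial velocity and position.
    """
    velocity, displacement = 0.0, 0.0
    prev_a = samples[0]
    for a in samples[1:]:
        velocity += 0.5 * (prev_a + a) * dt
        displacement += velocity * dt
        prev_a = a
    return velocity, displacement

# 1 s of constant 1 m/s^2 acceleration sampled at 100 Hz:
v, x = integrate_acceleration([1.0] * 101, dt=0.01)
print(round(v, 3), round(x, 3))  # ~1.0 m/s and ~0.505 m (analytically 0.5 m plus discretization error)
```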
Upon determining the (first) operation of the first device 100 in the first space 10, the first device 100 may determine the (second) operation to be performed in the second space 20 by the second device 200 corresponding to the first operation according to the aforementioned first coordinate mapping relationship. For example, as previously described, when the first device 100 moves 3 meters along the X-axis of the first space 10, the second operation may be determined as the second device 200 flying 1.5 kilometers along its X-axis according to the first coordinate mapping ratio of 1/500. For another example, when the first device 100 has a pitch angle of +15 degrees in the plane of the X-axis (or Y-axis) and the Z-axis (i.e., the user's line of sight is raised by 15 degrees), the second operation may be determined as the second device 200 and/or its image sensing assembly taking a pitch angle of +15 degrees in the plane of its X-axis (or Y-axis) and Z-axis, because the ratios for the coordinate axes in the first coordinate mapping relationship are identical. Further, if the ratios for the coordinate axes in the first coordinate mapping relationship are not identical, for example the ratio for the X-axis (or Y-axis) is 1/500 and the ratio for the Z-axis is 1/375, the second operation may be determined as the second device 200 and/or its image sensing assembly taking a pitch angle of about +11.3 degrees in the plane of its X-axis (or Y-axis) and Z-axis (i.e., $\arctan\bigl(\tfrac{375}{500}\tan 15^\circ\bigr) \approx 11.3^\circ$). The purpose of this is mainly to ensure that, although the first space 10 and the second space 20 are scaled differently along each axis, the maximum pitch range that the user can reach still corresponds. The azimuth angle can be processed similarly.
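More generally, writing $r_x$ and $r_z$ for the first-to-second-space ratios on the horizontal and vertical axes (here $r_x = 1/500$, $r_z = 1/375$), the mapped pitch angle follows from scaling the horizontal and vertical components of the displacement separately; this is a short derivation of the example value above, not a formula stated explicitly in the original text:

```latex
\tan\theta' = \frac{\Delta z / r_z}{\Delta x / r_x} = \frac{r_x}{r_z}\,\tan\theta
\quad\Longrightarrow\quad
\theta' = \arctan\!\Bigl(\frac{r_x}{r_z}\,\tan\theta\Bigr)
        = \arctan\!\Bigl(\tfrac{375}{500}\,\tan 15^\circ\Bigr) \approx 11.3^\circ
```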
Further, in some embodiments, when the user or the first device 100 produces a change in altitude (e.g., a jump, a squat, etc.), the first device 100 may determine the highest or lowest altitude reached while ascending or descending and compare it with a preset highest or lowest threshold altitude. The difference between the detected height and the threshold height may then be mapped to a height difference in the second space 20 according to the aforementioned first coordinate mapping relationship, and the second device 200 may be instructed to ascend or descend by that height difference accordingly. In other embodiments, the height conversion may be performed without taking the first coordinate mapping relationship into account: for example, the second device 200 may rise a fixed height, e.g., 10 meters, whenever the user jumps once, or descend a fixed height, e.g., 5 meters, whenever the user squats once. Further, the height of the second device 200 may be adjusted according to the actual height change of the first device 100 without setting any threshold; however, this is less favorable for controlling the second device, considering the slight height variations a person undergoes when walking naturally.
More generally, in some embodiments, the first device 100 may determine a first translation route in the first space 10 when performing the translation operation, map the first translation route to a second translation route in the second space 20 based on the above-described first coordinate mapping relationship, and determine the second operation as an operation indicating that the second device 200 moves along the second translation route. In other embodiments, the first device 100 may determine a first azimuth angle of the first device 100 in the first space 10 at the end of the steering operation; mapping the first azimuth to a second azimuth in the second space 20 based on the first coordinate mapping; and determining the second operation as indicating the second device 200 or the image sensing component of the second device 200 to steer to the second azimuth angle. In still other embodiments, the first device 100 may determine a first pitch angle of the first device 100 in the first space 10 at the end of the view angle changing operation, map the first pitch angle to a second pitch angle in the second space 20 based on the first coordinate mapping relationship; and determining the second operation as indicating that the image sensing assembly of the second device 200 is steered to the second pitch angle. In still other embodiments, the first device 100 may determine the highest or lowest elevation reached by the first device 100 in the first space 10 during the performance of the elevation change operation; if the highest or lowest height is above or below the highest or lowest threshold, respectively, then mapping the difference between the highest or lowest height and the corresponding highest or lowest threshold to a height difference in the second space 20 based on the first coordinate mapping relationship; and determining the second operation as indicating the second device 200 to ascend or descend the height difference.
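The four kinds of operation mapping listed in the preceding paragraph could be sketched as follows; the function names, the chosen threshold values, and the exact azimuth formula (assumed analogous to pitch) are illustrative assumptions:

```python
import math

def map_route(first_route, ratios):
    """Map a first-space translation route (list of waypoints) to a second-space route."""
    return [tuple(c / r for c, r in zip(p, ratios)) for p in first_route]

def map_pitch(first_pitch_deg, rx, rz):
    """Scale pitch so that the user's reachable pitch range still corresponds."""
    return math.degrees(math.atan((rx / rz) * math.tan(math.radians(first_pitch_deg))))

def map_azimuth(first_yaw_deg, rx, ry):
    """Same idea for azimuth when the X and Y ratios differ."""
    rad = math.radians(first_yaw_deg)
    return math.degrees(math.atan2((rx / ry) * math.sin(rad), math.cos(rad)))

def map_height_change(peak_height, threshold, rz):
    """Only the part of a jump/squat exceeding the threshold is mapped into the second space."""
    excess = peak_height - threshold
    return excess / rz if excess > 0 else 0.0

ratios = (1/500, 1/250, 1/375)  # the non-uniform example used earlier
print(map_route([(0, 0, 1), (3, 0, 1)], ratios))                 # 3 m along X maps to 1.5 km along X
print(round(map_pitch(15, rx=1/500, rz=1/375), 2))               # 11.36, the "about +11.3 degrees" example
print(round(map_azimuth(45, rx=1/500, ry=1/250), 1))             # 26.6 degrees
print(round(map_height_change(2.3, threshold=2.0, rz=1/375), 1)) # 0.3 m above threshold -> 112.5 m climb
```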
However, the present disclosure is not limited thereto. In other embodiments, angular changes such as pitch and/or azimuth may also be converted directly, without regard to the first coordinate mapping relationship. For example, if the first operation is the first device 100 rotating 45 degrees clockwise in place within the first space 10, the second operation may be determined as the second device 200 also rotating 45 degrees clockwise in place within the second space 20. As another example, if the first operation is the first device 100 lowering its pitch angle by 30 degrees in the first space 10, the second operation may be determined as the second device 200 lowering its pitch angle by 30 degrees in the second space 20. The purpose of this is mainly to ensure that, although the first space 10 and the second space 20 are scaled differently along each axis, the maximum pitch angle that the user can reach still corresponds. The azimuth angle can be processed similarly.
Further, although roll angles are not discussed herein, this is primarily because the second apparatus 200 generally does not require rolling, and the first apparatus 100 also generally does not perform a rolling operation. However, the present disclosure is not so limited and similar processing may be performed for roll angle.
In some embodiments, after the second operation is determined, a control instruction may be sent to the second device 200 to instruct the second device 200 to perform the second operation.
Thus, in an ideal situation, the coordinates of the operator or the first device 100 in the first space 10 can be synchronized in real time with the corresponding coordinates of the second device 200 in the second space 20, and the azimuth and/or pitch angles of the image sensing assembly of the second device 200 correspond to the respective angles of the first device 100. The user can therefore conveniently and intuitively operate the drone and obtain a real-time view corresponding to the user's current head posture.
Next, a process of the first device 100 and the second device 200 being desynchronized will be described in detail with reference to fig. 4.
Fig. 4 is a diagram illustrating an example scenario when the example first device 100 leaves the first space 10, according to an embodiment of the present disclosure. In the embodiment shown in fig. 4, when the user wearing the first device 100 leaves the first space 10, for example by walking out of its boundary, the second device 200 should theoretically also be at the corresponding boundary of the second space 20 and about to fly out. At this point, in some embodiments, the first device 100 may send a "synchronization cancel" instruction to the second device 200 to instruct the second device 200 to leave the synchronized state and hover in place awaiting further instructions. In other embodiments, the second device 200 may leave the synchronized state and hover in place on its own upon detecting that it is about to exit the second space 20; in this case the second device 200 may also report its "desynchronized" state to the first device 100. In addition to releasing synchronization when the user or the first device 100 leaves the first space 10, the user or the first device 100 may also actively release the synchronized state, for example by pressing a fixed button on a handheld controller, by nodding or shaking the head while wearing the head-mounted display, or by some other specified action. The second device 200 then releases the synchronized state and hovers in place.
In some embodiments, when the original user, or another user wearing the first device 100, re-enters the first space 10, the first device 100 may issue a resynchronization instruction (e.g., the aforementioned "synchronization activation" instruction or another resynchronization instruction) to the second device 200 to instruct it to enter the synchronized state and fly to the location in the second space 20 corresponding to the current location of the first device 100 in the first space 10. Furthermore, in other embodiments, the user wearing the first device 100 may also choose to manually activate the synchronized state, for example by resetting the first space 10 and sending a "synchronization activation" instruction to the second device 200 as described above.
Next, the obstacle avoidance process of the second device 200 will be described in detail with reference to fig. 5.
Fig. 5 is a diagram illustrating an example scenario in which the example second device 200 encounters an obstacle while the example first device 100 controls the example second device 200, according to an embodiment of the present disclosure. As shown in the lower part of fig. 5, the user or the first device 100 performs a displacement operation; the second device 200 should then also be displaced to the corresponding position in the manner described above. If an obstacle appears on the route along which the second device 200 is to move, as shown in the upper part of fig. 5, the second device 200 may re-plan its route and move to the corresponding location. In some embodiments, this obstacle-avoidance process may be performed autonomously by the second device 200, which allows obstacles to be avoided more quickly. In other embodiments, the obstacle-avoidance process may instead be controlled by the first device 100 after it receives a report from the second device 200. Either way, the second device 200 can reach the designated location after avoiding the obstacle.
In some embodiments, when the second device 200 re-routes to avoid the obstacle, it may release the synchronization state with the first device 100 and re-enter the synchronization state after reaching the designated location. In other embodiments, the synchronized state may be maintained at all times and only released when the second device 200 does not reach the specified location within a specified time period.
Next, a process of re-determining the second space 20 will be described in detail with reference to fig. 6.
Fig. 6 is a flowchart illustrating an example resynchronization procedure when the example second space 20 is redetermined, according to an embodiment of the present disclosure. As shown in fig. 6, when movement of the first device 100 in the first space 10 cannot completely cover all positions of the whole second space 20 (for example, because of a mistake in the initial setting), or when the user wants to observe an area outside the second space 20, or when the second device 200 needs to be displaced while the operator stays still, the user can designate a target point (or, more generally, a new second space) on the three-dimensional electronic map, for example with a handle, and remap the coordinate relationship between the first space 10 and the second space 20', so that the coordinate in the first space 10 at which the user is currently located immediately corresponds to the coordinate of the target point in the new second space 20'.
For example, as shown in fig. 6, the user may specify a position 150 to be reached in the virtual space by operating a handle or by other means (e.g., a gesture detector mounted on the user's arm detecting the arm's motion in the virtual space). The second device 200 may then fly to the corresponding spatial coordinates, and the second space 20' may be reset such that this new position corresponds to the current position of the user or the first device 100 in the first space 10, after which subsequent operations continue.
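One way to realize this remapping is to keep the axis ratios fixed and translate the second space so that the user's current first-space coordinate maps onto the chosen target point; the following sketch rests on that assumption and is not a formula given in the patent:

```python
def remap_second_space_origin(first_coord, first_origin, target_point, ratios):
    """Pick a new second-space origin so that mapping(first_coord) == target_point.

    The per-axis ratios stay unchanged; only the second space is translated.
    """
    return tuple(
        t - (c - o) / r
        for t, c, o, r in zip(target_point, first_coord, first_origin, ratios)
    )

# The user stands 3 m along X from the first-space origin and picks a target 2 km along X:
new_origin = remap_second_space_origin(
    first_coord=(3.0, 0.0, 1.0),
    first_origin=(0.0, 0.0, 0.0),
    target_point=(2000.0, 0.0, 500.0),
    ratios=(1/500, 1/500, 1/500),
)
print(new_origin)  # (500.0, 0.0, 0.0): the second space 20' is shifted so the mapping lines up
```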
Further, in some embodiments, when the second device 200 cannot be synchronized in real time with the position of the first device 100 for some reason, the first device 100 may switch its rendered view to a previously completed 3D modeling view of the flight zone (i.e., a purely virtual view). The coordinate system used in this 3D space, and the angle of view of the image sensing assembly, coincide with those of the actual flight area. In addition, conditions such as weather may be simulated from the 3D static model together with the current time, weather, and other ambient conditions of the flight area, so that a rendering close to the live view can be displayed on the first device 100 and the operator can continue to operate using reference objects even without a captured picture. In some embodiments, once the second device 200 is again synchronized with the first device 100, the view may switch back to the live captured picture.
In addition, a control center (e.g., the first device 100 or another control facility) may monitor the current environment, battery status, and return distance of the second device 200 to automatically determine when to dispatch a stand-by second device 200, so as to ensure that the mission continues as usual. For example, another first device 100 may dispatch a stand-by second device 200 to approach the current second device 200, ensuring that the replacement arrives and the hand-over completes before the current second device 200's battery drops to the return threshold, thereby enabling uninterrupted monitoring of the monitored area.
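A rough sketch of such a hand-over check (the battery model, parameter names, and thresholds are illustrative assumptions):

```python
def should_dispatch_replacement(battery_pct, return_distance_m,
                                consumption_pct_per_m, handover_margin_pct=10.0):
    """Dispatch a stand-by drone once the battery barely covers the return trip plus a margin."""
    return_cost_pct = return_distance_m * consumption_pct_per_m
    return battery_pct <= return_cost_pct + handover_margin_pct

# 28% battery, 2 km back to base at 0.01 %/m: the return needs 20%, plus a 10% margin -> dispatch.
print(should_dispatch_replacement(28.0, 2000.0, 0.01))  # True
```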
Through the device control scheme described above with reference to figs. 1 to 6, a commander can flexibly and intuitively obtain real-time pictures of the field from various angles, and the operator can obtain a desired monitoring angle simply by walking to a suitable position, to the extent that flight conditions permit. In addition, dependence on other personnel and equipment is reduced.
The method for controlling the second device 200 performed at the first device 100 according to the embodiment of the present disclosure and the functional configuration of the corresponding first device 100 will be described in detail below with reference to fig. 7 to 8.
Fig. 7 is a flowchart illustrating a method 700 performed at the first device 100 for controlling the second device 200 according to an embodiment of the present disclosure. As shown in fig. 7, the method 700 may include steps S710, S720, S730, and S740. According to the present disclosure, some of the steps of the method 700 may be performed separately or in combination, and may be performed in parallel or sequentially; the method is not limited to the specific order of operations shown in fig. 7. In some embodiments, the method 700 may be performed by the first device 100 shown in figs. 1-6, the first device 800 shown in fig. 8, or the device 900 shown in fig. 9.
Fig. 8 is a functional block diagram illustrating an example first device 800 (e.g., first device 100, head-mounted display, handheld controller, or other control device, etc.) in accordance with an embodiment of the present disclosure. As shown in fig. 8, the first device 800 may include: a space determining module 810, a first mapping relation determining module 820, a second operation determining module 830 and an instruction transmitting module 840.
The space determination module 810 may be used to determine a first space 10 associated with the first device 100 and a second space 20 associated with the second device 200. The space determining module 810 may be a central processing unit, Digital Signal Processor (DSP), microprocessor, microcontroller, etc. of the first device 100, which may cooperate with, for example, an input device of the first device 100, etc. to determine a first space 10 associated with the first device 100 and a second space 20 associated with the second device 200.
The first mapping relationship determining module 820 may be used to determine a first coordinate mapping relationship between the first space 10 and the second space 20. The first mapping relation determining module 820 may also be a central processing unit, a Digital Signal Processor (DSP), a microprocessor, a microcontroller, etc. of the first device 100, which may determine the first coordinate mapping relation between the first space 10 and the second space 20 according to the size, shape, orientation, etc. of the first space 10 and the second space 20.
The second operation determination module 830 may be configured to determine a second operation to be performed by the second device 200 in the second space 20 according to the first operation of the first device 100 in the first space 10 based on the first coordinate mapping relationship. The second operation determining module 830 may also be a central processing unit, a Digital Signal Processor (DSP), a microprocessor, a microcontroller, etc. of the first device 100, which may convert a first operation of the first device 100 into a second operation of the second device 200, thereby enabling a user to intuitively and simply manipulate the second device 200.
The instruction transmitting module 840 may be configured to transmit a control instruction to the second device 200 to instruct the second device 200 to perform the second operation. The instruction transmitting module 840 may also be a central processing unit, a Digital Signal Processor (DSP), a microprocessor, a microcontroller, etc. of the first device 100, which may cooperate with a communication part (e.g., a wired/wireless communication unit, specifically, e.g., an RF unit, a WiFi unit, a cable, an ethernet interface card) of the first device 100 to transmit a control instruction to the second device 200 to instruct the second device 200 to perform the second operation.
In addition, the first device 800 may further include other functional modules not shown in fig. 8, such as a first coordinate determination module, a second coordinate mapping module, a synchronization activation instruction sending module, a synchronization release instruction sending module, an operation stop instruction sending module, a third coordinate determination module, a fourth coordinate mapping module, a second space re-determination module, a mapping relationship re-determination module, and/or a second operation re-determination module, and the like. In some embodiments, the first coordinate determination module may be configured to determine first coordinates of the first device in the first space when a synchronization activation operation is performed. In some embodiments, the second coordinate mapping module may be configured to map the first coordinate to a second coordinate in the second space based on the first coordinate mapping relationship. In some embodiments, the synchronization activation instruction sending module may be configured to send a synchronization activation instruction to the second device to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state. In some embodiments, the synchronization release instruction sending module may be configured to send a synchronization cancellation instruction to the second device to indicate that the second device is in a "desynchronized" state. In some embodiments, the operation stop instruction sending module may be configured to send an operation stop instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at the current location if the first device leaves the first space during the first operation. In some embodiments, the third coordinate determination module may be configured to determine a third coordinate of the first device upon returning to the first space if the first device returns to the first space. In some embodiments, the fourth coordinate mapping module may be configured to map the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relationship. In some embodiments, the second operation determination module may be further configured to determine the second operation as an operation instructing the second device to move to the fourth coordinate. In some embodiments, the second space re-determination module may be configured to re-determine a second space associated with the second device. In some embodiments, the mapping relationship re-determination module may be configured to determine a second coordinate mapping relationship between the first space and the re-determined second space. In some embodiments, the second operation re-determination module may be configured to determine, based on the second coordinate mapping relationship, a second operation to be performed by the second device in the re-determined second space in accordance with the first operation of the first device in the first space.
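For illustration only, the leave-and-return behaviour handled by the operation stop instruction sending module, the third coordinate determination module, and the fourth coordinate mapping module could be organized as in the following sketch; the class SyncController and the command tuples are hypothetical names, not part of this disclosure.

# Hypothetical sketch: hover when the operator leaves the first space, and resume
# tracking from the mapped coordinate when the operator returns.
def inside(box_min, box_max, pos):
    """Check whether pos lies inside the axis-aligned box [box_min, box_max]."""
    return all(lo <= p <= hi for lo, hi, p in zip(box_min, box_max, pos))

class SyncController:
    def __init__(self, box_min, box_max, map_fn):
        self.box_min, self.box_max = box_min, box_max
        self.map_fn = map_fn            # maps a first-space coordinate to the second space
        self.synchronized = True

    def on_operator_position(self, first_pos):
        """Return the command to send to the second device for this position sample."""
        if not inside(self.box_min, self.box_max, first_pos):
            self.synchronized = False
            return ("hover",)           # operator left the first space: hold position
        # Operator is inside the first space (or has just re-entered it): map the
        # coordinate and track it, which also resumes synchronization after a return.
        self.synchronized = True
        return ("goto", self.map_fn(first_pos))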
Furthermore, the first device 800 may also include other functional modules not shown in fig. 8; however, since they do not affect the understanding of the embodiments of the present disclosure by those skilled in the art, they are omitted from fig. 8. For example, the first device 800 may further include one or more of the following functional modules: a power supply, a memory, a data bus, an antenna, a wireless transceiver, etc.
A method 700 for controlling the second device 200, performed at the first device 800 (e.g., the first device 100), and the corresponding first device 800 according to an embodiment of the present disclosure will be described in detail below with reference to figs. 7 and 8.
The method 700 begins at step S710, in which a first space 10 associated with the first device 800 and a second space 20 associated with the second device 200 may be determined by the space determination module 810 of the first device 800.
In step S720, a first coordinate mapping relationship between the first space 10 and the second space 20 may be determined by the first mapping relationship determining module 820 of the first device 800.
In step S730, a second operation to be performed in the second space 20 by the second device 200 may be determined according to the first operation of the first device 800 in the first space 10 based on the first coordinate mapping relationship by the second operation determination module 830 of the first device 800.
In step S740, a control instruction may be transmitted to the second device 200 by the instruction transmitting module 840 of the first device 800 to instruct the second device 200 to perform the second operation.
In some embodiments, step S710 may include determining at least one of: the position of a vertex of the first space and the length of the first space on each coordinate axis; the positions of the two vertices of the first space and the length of the first space on a coordinate axis perpendicular to a line formed by the two vertices; positions of three non-collinear vertexes of the first space and a length of the first space in a direction of a perpendicular to a plane formed by the three vertexes; and the positions of the at least four vertices of the first space that are not coplanar. In some embodiments, step S710 may include determining at least one of: the position of one vertex of the second space on the three-dimensional electronic map and the length of the second space on each coordinate axis of the three-dimensional electronic map; the positions of two vertexes of the second space on the three-dimensional electronic map and the length of the second space on a coordinate axis perpendicular to a line formed by the two vertexes; the positions of the three non-collinear vertexes of the second space on the three-dimensional electronic map and the length of the second space in the direction of the perpendicular to the plane formed by the three vertexes; and the positions of the non-coplanar at least four vertices of the second space on the three-dimensional electronic map. In some embodiments, step S720 may include: setting respective origins of the first space and the second space; determining the corresponding proportion of the lengths of the first space and the second space on each coordinate axis; and determining a first coordinate mapping relation of the coordinates in the first space and the coordinates in the second space based on the origin and the corresponding proportion of the first space and the second space. In some embodiments, method 700 may further include: determining first coordinates of the first device in a first space when performing a synchronization activation operation; mapping the first coordinate to a second coordinate in a second space based on the first coordinate mapping relationship; and sending a synchronization activation instruction to the second device to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state. In some embodiments, the method 700 may further include sending a synchronization cancellation instruction to the second device to indicate that the second device is in an "unsynchronized" state.
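As an illustration of step S720 only, and not a reference implementation from this disclosure, the first coordinate mapping relationship can be represented by the two origins plus per-axis length ratios; in the following sketch the names Space and CoordinateMapping and the numeric sizes are hypothetical.

# Hypothetical sketch of the first coordinate mapping relationship: each space is an
# origin plus its length on each coordinate axis, and points are mapped through the
# resulting per-axis ratios.
from dataclasses import dataclass
from typing import Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class Space:
    origin: Vec3     # one vertex of the space chosen as its origin
    lengths: Vec3    # extent of the space along each coordinate axis

@dataclass
class CoordinateMapping:
    first: Space
    second: Space

    def ratios(self) -> Vec3:
        # length of second space per unit length of first space, per axis
        return tuple(ls / lf for ls, lf in zip(self.second.lengths, self.first.lengths))

    def to_second(self, p: Vec3) -> Vec3:
        r = self.ratios()
        return tuple(so + ri * (pi - fo)
                     for so, ri, pi, fo in zip(self.second.origin, r, p, self.first.origin))

# Example: a 5 m x 5 m x 3 m operator area mapped onto a 100 m x 100 m x 30 m flight area.
mapping = CoordinateMapping(first=Space(origin=(0, 0, 0), lengths=(5, 5, 3)),
                            second=Space(origin=(300, 500, 10), lengths=(100, 100, 30)))
print(mapping.to_second((2.5, 1.0, 1.5)))    # (350.0, 520.0, 25.0)

This sketch assumes the two spaces are axis-aligned boxes with the same orientation; if the second space were oriented differently from the first space, the mapping would additionally include a rotation.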
In some embodiments, the first operation may include at least one of: a panning operation, a steering operation, a viewing angle changing operation, and a height changing operation. In some embodiments, if the first operation is a translation operation, step S730 may include: determining a first translation route of the first device in the first space when performing the translation operation; mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relationship; and determining the second operation as an operation instructing the second device to move along the second translation route. In some embodiments, if the first operation is a steering operation, step S730 may include: determining a first azimuth angle of the first device in the first space at the end of the steering operation; mapping the first azimuth to a second azimuth in the second space based on the first coordinate mapping relationship; and determining the second operation as an operation instructing the second device, or the image sensing component of the second device, to turn to the second azimuth angle. In some embodiments, if the first operation is a viewing angle changing operation, step S730 may include: determining a first pitch angle of the first device in the first space at the end of the viewing angle changing operation; mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relationship; and determining the second operation as an operation instructing the image sensing assembly of the second device to turn to the second pitch angle. In some embodiments, if the first operation is a height change operation, step S730 may include: determining the highest or lowest height reached by the first device in the first space during performance of the height change operation; if the highest or lowest height is respectively higher or lower than a highest or lowest threshold, mapping the difference between the highest or lowest height and the corresponding threshold to a height difference in the second space based on the first coordinate mapping relationship; and determining the second operation as an operation instructing the second device to ascend or descend by the height difference.
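The following sketch, again purely illustrative and reusing the per-axis-ratio idea from the previous snippet, shows how two of these operations could be converted: a walked translation route is mapped point by point, and a height change operation only takes effect for the portion that exceeds the height thresholds of the first space. The function names and threshold values are assumptions.

# Hypothetical sketch of converting first operations into second operations.
def map_translation_route(to_second, first_route):
    """Map each sampled point of the operator's walked route into the second space."""
    return [to_second(p) for p in first_route]

def map_height_change(highest, lowest, max_threshold, min_threshold, z_ratio):
    """Return the climb (+) or descent (-) to command, or 0.0 if within the thresholds."""
    if highest > max_threshold:
        return (highest - max_threshold) * z_ratio     # climb by the scaled excess
    if lowest < min_threshold:
        return -(min_threshold - lowest) * z_ratio     # descend by the scaled excess
    return 0.0

# Example: the operator raises the controller to 2.3 m where the upper threshold is
# 2.0 m; with a vertical ratio of 10 the second device is commanded to climb about 3 m.
print(map_height_change(highest=2.3, lowest=0.8,
                        max_threshold=2.0, min_threshold=0.5, z_ratio=10))   # ~3.0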
In some embodiments, method 700 may further include: if the first device leaves the first space during the first operation, a control instruction is sent to the second device to instruct the second device to stop performing the corresponding second operation and to hover at the current location. In some embodiments, method 700 may further include: determining a third coordinate of the first device when returning to the first space if the first device returns to the first space; mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relationship; and determining the second operation as an operation indicating that the second device is moved to the fourth coordinate. In some embodiments, method 700 may further include: re-determining a second space associated with a second device; determining a second coordinate mapping relationship between the first space and the re-determined second space; determining, based on the second coordinate mapping relationship, a second operation to be performed by the second device in the re-determined second space in accordance with the first operation of the first device in the first space; and sending a control instruction to the second device to instruct the second device to perform the second operation.
Fig. 9 is a block diagram illustrating an example hardware arrangement 900 of the first device 100 shown in figs. 1-6 or the first device 800 shown in fig. 8 according to an embodiment of the present disclosure. The hardware arrangement 900 may include a processor 906 (e.g., a Central Processing Unit (CPU), a Digital Signal Processor (DSP), a microcontroller unit (MCU), etc.). The processor 906 may be a single processing unit or multiple processing units for performing different actions of the processes described herein. The arrangement 900 may also comprise an input unit 902 for receiving signals from other entities, and an output unit 904 for providing signals to other entities. The input unit 902 and the output unit 904 may be arranged as a single entity or as separate entities.
Furthermore, the arrangement 900 may comprise at least one readable storage medium 908 in the form of a non-volatile or volatile memory, for example an electrically erasable programmable read-only memory (EEPROM), a flash memory, and/or a hard disk drive. The readable storage medium 908 comprises computer program instructions 910, the computer program instructions 910 comprising code/computer readable instructions that, when executed by the processor 906 in the arrangement 900, cause the hardware arrangement 900 and/or the first device 100 or the first device 800 comprising the hardware arrangement 900 to perform, for example, the procedures described above in connection with fig. 1-7 and any variations thereof.
The computer program instructions 910 may be configured as computer program instruction code having, for example, an architecture of computer program instruction modules 910A-910D. Thus, in an example embodiment in which the hardware arrangement 900 is used, for example, in the first device 100 or 800, the code in the computer program instructions of the arrangement 900 comprises: a module 910A for determining a first space associated with a first device and a second space associated with a second device. The code further comprises: a module 910B for determining a first coordinate mapping relationship between the first space and the second space. The code further comprises: a module 910C for determining, based on the first coordinate mapping relationship, a second operation to be performed by the second device in the second space according to the first operation of the first device in the first space. The code further comprises: a module 910D for sending a control instruction to the second device to instruct the second device to perform the second operation.
The computer program instruction modules may substantially perform each of the actions of the processes illustrated in figs. 1-7 to emulate the first device 100 or 800. In other words, when the different computer program instruction modules are executed in the processor 906, they may correspond to the different modules described above for the first device 100 or 800.
Although the code means in the embodiments disclosed above in connection with fig. 9 are implemented as modules of computer program instructions which, when executed in the processor 906, cause the hardware arrangement 900 to perform the actions described above in connection with fig. 1-7, in alternative embodiments at least one of the code means may be implemented at least partly as hardware circuits.
The processor may be a single CPU (central processing unit), but may also include two or more processing units. For example, the processor may include a general purpose microprocessor, an instruction set processor, and/or a related chip set, and/or a special purpose microprocessor (e.g., an Application Specific Integrated Circuit (ASIC)). The processor may also include on-board memory for caching purposes. The computer program instructions may be carried by a computer program instruction product coupled to the processor. The computer program instruction product may include a computer-readable medium having the computer program instructions stored thereon. For example, the computer program instruction product may be a flash memory, a Random Access Memory (RAM), a Read Only Memory (ROM), or an EEPROM, and in alternative embodiments the above-described modules of computer program instructions may be distributed, in the form of memory within the UE, among different computer program instruction products.
It should be noted that functions described herein as being implemented by pure hardware, pure software, and/or firmware can also be implemented by special purpose hardware, by a combination of general purpose hardware and software, and so on. For example, functions described as being implemented by dedicated hardware (e.g., a Field Programmable Gate Array (FPGA), an Application Specific Integrated Circuit (ASIC), etc.) may be implemented by a combination of general purpose hardware (e.g., a Central Processing Unit (CPU), a Digital Signal Processor (DSP)) and software, and vice versa.
While the disclosure has been shown and described with reference to certain exemplary embodiments thereof, it will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the disclosure as defined by the appended claims and their equivalents. Accordingly, the scope of the present disclosure should not be limited to the above-described embodiments, but should be defined not only by the appended claims, but also by equivalents thereof.

Claims (40)

1. A method performed at a first device for controlling a second device, comprising:
determining a first space associated with the first device and a second space associated with the second device;
determining a first coordinate mapping relationship between the first space and the second space;
determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device in accordance with a first operation of the first device in the first space; and
sending a control instruction to the second device to instruct the second device to execute the second operation;
wherein determining a first space associated with the first device comprises determining at least one of:
a position of a vertex of the first space and a length of the first space on each coordinate axis;
the positions of two vertices of the first space and the length of the first space on a coordinate axis perpendicular to a line formed by the two vertices;
positions of three non-collinear vertices of the first space and a length of the first space in a direction perpendicular to a plane formed by the three vertices; and
the positions of at least four vertices of the first space that are not coplanar;
wherein the step of determining a second space associated with the second device comprises determining at least one of:
the position of one vertex of the second space on the three-dimensional electronic map and the length of the second space on each coordinate axis of the three-dimensional electronic map;
the positions of two vertexes of the second space on the three-dimensional electronic map and the length of the second space on a coordinate axis perpendicular to a line formed by the two vertexes;
the positions of the non-collinear three vertexes of the second space on the three-dimensional electronic map and the length of the second space in the direction of the perpendicular to the plane formed by the three vertexes; and
the positions of the non-coplanar at least four vertices of the second space on the three-dimensional electronic map.
2. The method of claim 1, wherein determining the first coordinate mapping relationship between the first space and the second space comprises:
setting respective origins of the first space and the second space;
determining a respective ratio of lengths of the first space and the second space on each coordinate axis; and
determining a first coordinate mapping relationship of the coordinates in the first space and the coordinates in the second space based on the origin and the corresponding ratio of the first space and the second space.
3. The method of claim 1, wherein the method further comprises:
determining first coordinates of the first device in the first space when performing a synchronization activation operation;
mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relationship; and
sending a synchronization activation instruction to the second device to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state.
4. The method of claim 3, wherein the method further comprises:
sending a synchronization cancellation instruction to the second device to indicate that the second device is in a 'desynchronized' state.
5. The method of claim 3, wherein the first operation comprises at least one of: a panning operation, a steering operation, a viewing angle changing operation, and a height changing operation.
6. The method of claim 5, wherein if the first operation is a translation operation, determining a second operation to be performed by the second device in the second space from the first operation of the first device in the first space based on the first coordinate mapping relationship comprises:
determining a first translation route of the first device in the first space when performing the translation operation;
mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relationship; and
determining the second operation as an operation that instructs the second device to move along the second translational route.
7. The method of claim 5, wherein if the first operation is a steering operation, determining a second operation to be performed by the second device in the second space from the first operation of the first device in the first space based on the first coordinate mapping relationship comprises:
determining a first azimuth angle of the first device in the first space at the end of the steering operation;
mapping the first azimuth to a second azimuth in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device or an image sensing component of the second device to steer to the second azimuth angle.
8. The method of claim 5, wherein if the first operation is a perspective change operation, determining a second operation to be performed by the second device in the second space from the first operation of the first device in the first space based on the first coordinate mapping relationship comprises:
determining a first pitch angle of the first device in the first space at the end of the view angle changing operation;
mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating that an image sensing assembly of the second device is steered to the second pitch angle.
9. The method of claim 5, wherein if the first operation is an altitude change operation, determining a second operation to be performed by the second device in the second space from the first operation of the first device in the first space based on the first coordinate mapping relationship comprises:
determining a highest elevation or a lowest elevation reached by the first device in the first space during performance of the elevation change operation;
if the highest or lowest elevation is above or below a highest or lowest threshold, respectively, mapping a difference between the highest or lowest elevation and the corresponding highest or lowest threshold as an elevation difference in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device to ascend or descend the height difference.
10. The method of claim 3, further comprising:
if the first device leaves the first space during the first operation, sending a control instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at the current position.
11. The method of claim 10, further comprising:
determining a third coordinate of the first device when returning to the first space if the first device returns to the first space;
mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relationship; and
determining the second operation as an operation that indicates the second device to move to the fourth coordinate.
12. The method of claim 10, further comprising:
re-determining a second space associated with the second device;
determining a second coordinate mapping relationship between the first space and the re-determined second space;
determining, based on the second coordinate mapping relationship, a second operation to be performed by the second device in the re-determined second space according to a first operation of the first device in the first space; and
sending a control instruction to the second device to instruct the second device to perform the second operation.
13. The method of claim 1, wherein the first device is a head mounted display and the second device is a drone.
14. A first device for controlling a second device, comprising:
a space determination module to determine a first space associated with the first device and a second space associated with the second device;
a first mapping relation determining module, configured to determine a first coordinate mapping relation between the first space and the second space;
a second operation determination module for determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device according to a first operation of the first device in the first space; and
an instruction sending module, configured to send a control instruction to the second device to instruct the second device to execute the second operation;
wherein the space determination module is further configured to determine at least one of:
a position of a vertex of the first space and a length of the first space on each coordinate axis;
the positions of two vertices of the first space and the length of the first space on a coordinate axis perpendicular to a line formed by the two vertices;
positions of three non-collinear vertices of the first space and a length of the first space in a direction perpendicular to a plane formed by the three vertices; and
the positions of at least four vertices of the first space that are not coplanar;
wherein the space determination module is further configured to determine at least one of:
the position of one vertex of the second space on the three-dimensional electronic map and the length of the second space on each coordinate axis of the three-dimensional electronic map;
the positions of two vertexes of the second space on the three-dimensional electronic map and the length of the second space on a coordinate axis perpendicular to a line formed by the two vertexes;
the positions of the non-collinear three vertexes of the second space on the three-dimensional electronic map and the length of the second space in the direction of the perpendicular to the plane formed by the three vertexes; and
the positions of the non-coplanar at least four vertices of the second space on the three-dimensional electronic map.
15. The first device of claim 14, wherein the first mapping determination module is further to:
setting respective origins of the first space and the second space;
determining a respective ratio of lengths of the first space and the second space on each coordinate axis; and
determining a first coordinate mapping relationship of the coordinates in the first space and the coordinates in the second space based on the origin and the corresponding ratio of the first space and the second space.
16. The first device of claim 14, further comprising:
a first coordinate determination module to determine first coordinates of the first device in the first space when performing a synchronization activation operation;
a second coordinate mapping module for mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relationship; and
a synchronization activation instruction sending module, configured to send a synchronization activation instruction to the second device to instruct the second device to move to the second coordinate and to indicate that the second device is in a 'synchronized' state.
17. The first device of claim 16, further comprising:
a synchronization cancellation instruction sending module, configured to send a synchronization cancellation instruction to the second device to indicate that the second device is in a 'desynchronized' state.
18. The first device of claim 16, wherein the first operation comprises at least one of: a panning operation, a steering operation, a viewing angle changing operation, and a height changing operation.
19. The first device of claim 18, wherein if the first operation is a translation operation, the second operation determination module is further to:
determining a first translation route of the first device in the first space when performing the translation operation;
mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relationship; and
determining the second operation as an operation that instructs the second device to move along the second translational route.
20. The first device of claim 18, wherein if the first operation is a steering operation, the second operation determination module is further to:
determining a first azimuth angle of the first device in the first space at the end of the steering operation;
mapping the first azimuth to a second azimuth in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device or an image sensing component of the second device to steer to the second azimuth angle.
21. The first device of claim 18, wherein if the first operation is a view change operation, the second operation determination module is further to:
determining a first pitch angle of the first device in the first space at the end of the view angle changing operation;
mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating that an image sensing assembly of the second device is steered to the second pitch angle.
22. The first device of claim 18, wherein if the first operation is an altitude change operation, the second operation determination module is further to:
determining a highest elevation or a lowest elevation reached by the first device in the first space during performance of the elevation change operation;
if the highest or lowest elevation is above or below a highest or lowest threshold, respectively, mapping a difference between the highest or lowest elevation and the corresponding highest or lowest threshold as an elevation difference in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device to ascend or descend the height difference.
23. The first device of claim 16, further comprising:
an operation stop instruction sending module, configured to send an operation stop instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at the current location if the first device leaves the first space during the first operation.
24. The first device of claim 23, further comprising:
a third coordinate determination module for determining a third coordinate of the first device when returning to the first space if the first device returns to the first space;
a fourth coordinate mapping module for mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation,
wherein the second operation determination module is further to determine the second operation as an operation indicating that the second device moved to the fourth coordinate.
25. The first device of claim 14, further comprising:
a second space re-determination module to re-determine a second space associated with the second device;
a mapping relationship re-determination module for determining a second coordinate mapping relationship between the first space and the re-determined second space; and
a second operation re-determination module to determine, based on the second coordinate mapping relationship, a second operation to be performed by the second device in the re-determined second space according to the first operation of the first device in the first space.
26. The first device of claim 14, wherein the first device is a head mounted display and the second device is a drone.
27. A first device for controlling a second device, comprising:
a processor;
a memory having instructions stored therein, which when executed by the processor, cause the processor to:
determining a first space associated with the first device and a second space associated with the second device;
determining a first coordinate mapping relationship between the first space and the second space;
determining, based on the first coordinate mapping relationship, a second operation to be performed in the second space by the second device in accordance with a first operation of the first device in the first space; and
sending a control instruction to the second device to instruct the second device to execute the second operation;
wherein the instructions, when executed by the processor, further cause the processor to determine at least one of:
a position of a vertex of the first space and a length of the first space on each coordinate axis;
the positions of two vertices of the first space and the length of the first space on a coordinate axis perpendicular to a line formed by the two vertices;
positions of three non-collinear vertices of the first space and a length of the first space in a direction perpendicular to a plane formed by the three vertices; and
the positions of at least four vertices of the first space that are not coplanar;
wherein the instructions, when executed by the processor, further cause the processor to determine at least one of:
the position of one vertex of the second space on the three-dimensional electronic map and the length of the second space on each coordinate axis of the three-dimensional electronic map;
the positions of two vertexes of the second space on the three-dimensional electronic map and the length of the second space on a coordinate axis perpendicular to a line formed by the two vertexes;
the positions of the non-collinear three vertexes of the second space on the three-dimensional electronic map and the length of the second space in the direction of the perpendicular to the plane formed by the three vertexes; and
the positions of the non-coplanar at least four vertices of the second space on the three-dimensional electronic map.
28. The first device of claim 27, wherein the instructions, when executed by the processor, further cause the processor to:
setting respective origins of the first space and the second space;
determining a respective ratio of lengths of the first space and the second space on each coordinate axis; and
determining a first coordinate mapping relationship of the coordinates in the first space and the coordinates in the second space based on the origin and the corresponding ratio of the first space and the second space.
29. The first device of claim 27, wherein the instructions, when executed by the processor, further cause the processor to:
determining first coordinates of the first device in the first space when performing a synchronization activation operation;
mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relationship; and
sending a synchronization activation instruction to the second device to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state.
30. The first device of claim 29, wherein the instructions, when executed by the processor, further cause the processor to:
sending a synchronization cancellation instruction to the second device to indicate that the second device is in a 'desynchronized' state.
31. The first device of claim 29, wherein the first operation comprises at least one of: a panning operation, a steering operation, a viewing angle changing operation, and a height changing operation.
32. The first device of claim 31, wherein if the first operation is a translation operation, the instructions, when executed by the processor, further cause the processor to:
determining a first translation route of the first device in the first space when performing the translation operation;
mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relationship; and
determining the second operation as an operation that instructs the second device to move along the second translational route.
33. The first device of claim 31, wherein if the first operation is a steering operation, the instructions, when executed by the processor, further cause the processor to:
determining a first azimuth angle of the first device in the first space at the end of the steering operation;
mapping the first azimuth to a second azimuth in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device or an image sensing component of the second device to steer to the second azimuth angle.
34. The first device of claim 31, wherein if the first operation is a view change operation, the instructions, when executed by the processor, further cause the processor to:
determining a first pitch angle of the first device in the first space at the end of the view angle changing operation;
mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating that an image sensing assembly of the second device is steered to the second pitch angle.
35. The first device of claim 31, wherein if the first operation is an altitude change operation, the instructions, when executed by the processor, further cause the processor to:
determining a highest elevation or a lowest elevation reached by the first device in the first space during performance of the elevation change operation;
if the highest or lowest elevation is above or below a highest or lowest threshold, respectively, mapping a difference between the highest or lowest elevation and the corresponding highest or lowest threshold as an elevation difference in the second space based on the first coordinate mapping relationship; and
determining the second operation as indicating the second device to ascend or descend the height difference.
36. The first device of claim 29, wherein the instructions, when executed by the processor, further cause the processor to:
if the first device leaves the first space during the first operation, sending a control instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at the current position.
37. The first device of claim 36, wherein the instructions, when executed by the processor, further cause the processor to:
determining a third coordinate of the first device when returning to the first space if the first device returns to the first space;
mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relationship; and
determining the second operation as an operation that indicates the second device to move to the fourth coordinate.
38. The first device of claim 27, wherein the instructions, when executed by the processor, further cause the processor to:
re-determining a second space associated with the second device;
determining a second coordinate mapping relationship between the first space and the re-determined second space;
determining, based on the second coordinate mapping relationship, a second operation to be performed by the second device in the re-determined second space according to a first operation of the first device in the first space; and
sending a control instruction to the second device to instruct the second device to perform the second operation.
39. The first device of claim 27, wherein the first device is a head mounted display and the second device is a drone.
40. A computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method of any of claims 1-13.
CN201780004525.4A 2017-05-16 2017-05-16 Method, apparatus, and computer-readable storage medium for apparatus control Expired - Fee Related CN108475064B (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2017/084531 WO2018209557A1 (en) 2017-05-16 2017-05-16 Method and device for controlling device, and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN108475064A CN108475064A (en) 2018-08-31
CN108475064B true CN108475064B (en) 2021-11-05

Family

ID=63266469

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201780004525.4A Expired - Fee Related CN108475064B (en) 2017-05-16 2017-05-16 Method, apparatus, and computer-readable storage medium for apparatus control

Country Status (2)

Country Link
CN (1) CN108475064B (en)
WO (1) WO2018209557A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109395382A (en) * 2018-09-12 2019-03-01 苏州蜗牛数字科技股份有限公司 A kind of linear optimization method for rocking bar
CN109062259A (en) * 2018-10-31 2018-12-21 西安天问智能科技有限公司 A kind of unmanned plane automatic obstacle-avoiding method and device thereof
CN109799838B (en) * 2018-12-21 2022-04-15 金季春 Training method and system

Family Cites Families (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8577535B2 (en) * 2010-03-31 2013-11-05 Massachusetts Institute Of Technology System and method for providing perceived first-order control of an unmanned vehicle
CN102184572B (en) * 2011-05-19 2017-07-21 威盛电子股份有限公司 3-D graphic method of cutting out, rendering method and its graphic processing facility
US9423876B2 (en) * 2011-09-30 2016-08-23 Microsoft Technology Licensing, Llc Omni-spatial gesture input
US20140018979A1 (en) * 2012-07-13 2014-01-16 Honeywell International Inc. Autonomous airspace flight planning and virtual airspace containment system
EP4099136A1 (en) * 2013-02-22 2022-12-07 Sony Group Corporation Head- mounted display and image display device
CN107168360B (en) * 2013-07-05 2021-03-30 深圳市大疆创新科技有限公司 Flight assistance method and device for unmanned aerial vehicle
CN103499346B (en) * 2013-09-29 2016-05-11 大连理工大学 One SUAV earth station three-dimensional navigation map realization method
US9746984B2 (en) * 2014-08-19 2017-08-29 Sony Interactive Entertainment Inc. Systems and methods for providing feedback to a user while interacting with content
CN104991561B (en) * 2015-08-10 2019-02-01 北京零零无限科技有限公司 A kind of method, apparatus and unmanned plane of hand-held unmanned plane recycling
WO2017003538A2 (en) * 2015-04-14 2017-01-05 Tobin Fisher System for authoring, executing, and distributing unmanned aerial vehicle flight-behavior profiles
FR3035523B1 (en) * 2015-04-23 2017-04-21 Parrot IMMERSION DRONE DRIVING SYSTEM
KR101679741B1 (en) * 2015-05-06 2016-11-28 고려대학교 산학협력단 Method for extracting outter static structure of space from geometric data of space
KR101797208B1 (en) * 2015-09-07 2017-11-13 한국항공대학교산학협력단 Live, virtual and constructive operation system and method for experimentation and training of unmanned aircraft vehicle
CN205216197U (en) * 2015-12-07 2016-05-11 南京邮电大学 Model aeroplane and model ship aircraft safety remote control system based on active gesture detects
CN105607740A (en) * 2015-12-29 2016-05-25 清华大学深圳研究生院 Unmanned aerial vehicle control method and device based on computer vision
CN106064378A (en) * 2016-06-07 2016-11-02 南方科技大学 The control method of a kind of unmanned plane mechanical arm and device
CN106155069A (en) * 2016-07-04 2016-11-23 零度智控(北京)智能科技有限公司 UAV Flight Control device, method and remote terminal
CN106227230A (en) * 2016-07-09 2016-12-14 东莞市华睿电子科技有限公司 A kind of unmanned aerial vehicle (UAV) control method
CN106125747A (en) * 2016-07-13 2016-11-16 国网福建省电力有限公司 Based on the servo-actuated Towed bird system in unmanned aerial vehicle onboard the first visual angle mutual for VR
CN106292679B (en) * 2016-08-29 2019-04-19 电子科技大学 The control method of wearable unmanned aerial vehicle (UAV) control equipment based on body-sensing
CN106228615A (en) * 2016-08-31 2016-12-14 陈昊 Unmanned vehicle experiencing system based on augmented reality and experiential method thereof

Patent Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101000507A (en) * 2006-09-29 2007-07-18 浙江大学 Method for moving robot simultanously positioning and map structuring at unknown environment
CN103150309A (en) * 2011-12-07 2013-06-12 清华大学 Method and system for searching POI (Point of Interest) points of awareness map in space direction
CN102589544A (en) * 2012-01-10 2012-07-18 合肥工业大学 Three-dimensional attitude acquisition method based on space characteristics of atmospheric polarization mode
CN102749080A (en) * 2012-06-18 2012-10-24 北京航空航天大学 Unmanned aerial vehicle three-dimensional air route generation method based on hydrodynamics
JP2014059824A (en) * 2012-09-19 2014-04-03 Casio Comput Co Ltd Function driving device, function driving method, and function driving program
CN103226386A (en) * 2013-03-13 2013-07-31 广东欧珀移动通信有限公司 Gesture identification method and system based on mobile terminal
WO2015020540A1 (en) * 2013-08-09 2015-02-12 Fisher & Paykel Healthcare Limited Asymmetrical nasal delivery elements and fittings for nasal interfaces
CN106023657A (en) * 2015-03-30 2016-10-12 国际商业机器公司 Implementing A Restricted-Operation Region For Unmanned Vehicles
CN105424024A (en) * 2015-11-03 2016-03-23 葛洲坝易普力股份有限公司 Spatial target position and orientation calibration method based on total station
CN105786011A (en) * 2016-03-07 2016-07-20 重庆邮电大学 Control method and control equipment for remote-controlled aerial vehicle
CN205942090U (en) * 2016-04-29 2017-02-08 深圳市大疆创新科技有限公司 Wearable equipment and unmanned aerial vehicle system
CN206031749U (en) * 2016-08-31 2017-03-22 佛山世寰智能科技有限公司 Unmanned aerial vehicle's four -axis rotor fixed knot constructs
CN106569596A (en) * 2016-10-20 2017-04-19 努比亚技术有限公司 Gesture control method and equipment

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications; Zhang, JS et al.; Sensors; 30 June 2015; vol. 15, no. 6; pp. 13874-13898 *
Research on UAV Motion Planning and Guidance Methods Based on a Geometric Mechanics Model; Li Jie; China Doctoral Dissertations Full-text Database, Engineering Science and Technology II; 15 January 2016; no. 01; C031-40 *
Research on Modeling and 3D Visual Scene Simulation of an Aircraft; Zuo Shanchao; China Master's Theses Full-text Database, Engineering Science and Technology II; 15 April 2015; no. 04; C031-2 *

Also Published As

Publication number Publication date
CN108475064A (en) 2018-08-31
WO2018209557A1 (en) 2018-11-22

Similar Documents

Publication Publication Date Title
JP6228679B2 (en) Gimbal and gimbal simulation system
US11932392B2 (en) Systems and methods for adjusting UAV trajectory
US11385645B2 (en) Remote control method and terminal
JP6811336B2 (en) Multi gimbal assembly
CN105759833A (en) Immersive unmanned aerial vehicle driving flight system
WO2018187916A1 (en) Cradle head servo control method and control device
CN108475064B (en) Method, apparatus, and computer-readable storage medium for apparatus control
CN108733070A (en) Unmanned aerial vehicle (UAV) control method and control system
WO2017169841A1 (en) Display device and display control method
US11804052B2 (en) Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path
Prexl et al. User studies of a head-mounted display for search and rescue teleoperation of UAVs via satellite link
WO2020209167A1 (en) Information processing device, information processing method, and program
EP3288828B1 (en) Unmanned aerial vehicle system and method for controlling an unmanned aerial vehicle
O'Keeffe et al. Oculus rift application for training drone pilots
Cummings et al. Development and testing of a quad rotor smartphone control system for novice users
WO2024000189A1 (en) Control method, head-mounted display device, control system and storage medium
SHARMA Deployment of drone demonstators-Automatic take-off and landing of a drone on a mobile robot.
JP2021036452A (en) System and method for adjusting uav locus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20211105