CN108475064A - Method, equipment and computer readable storage medium for equipment control - Google Patents
- Publication number: CN108475064A
- Application number: CN201780004525.4A
- Authority: CN (China)
- Prior art keywords: space, equipment, coordinate, mapping relations, vertex
- Legal status: Granted (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Classifications
- G05D1/101 — Simultaneous control of position or course in three dimensions, specially adapted for aircraft
- G05D1/106 — Change initiated in response to external conditions, e.g. avoidance of elevated terrain or of no-fly zones
- G05D1/0808 — Control of attitude, i.e. control of roll, pitch, or yaw, specially adapted for aircraft
- G06F3/011 — Input arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Abstract
A method for a first device (100) to control a second device (200), a corresponding first device (100), and a computer readable storage medium. The method (700), executed at the first device (100) for controlling the second device (200), comprises: determining a first space (10) associated with the first device (100) and a second space (20) associated with the second device (200) (step S710); determining a first coordinate mapping relation between the first space (10) and the second space (20) (step S720); based on the first coordinate mapping relation, determining, from a first operation of the first device (100) in the first space (10), a second operation to be executed by the second device (200) in the second space (20) (step S730); and sending a control instruction to the second device (200) to instruct the second device (200) to execute the second operation (step S740).
Description
Copyright notice
This patent document contains material protected by copyright. The copyright belongs to the copyright holder. The copyright holder does not object to the reproduction by anyone of the patent document or the patent disclosure as it appears in the records and files of the Patent and Trademark Office.
Technical field
The present disclosure relates to the field of remote control, and more particularly to a method, a device and a computer readable storage medium for device control.
Background technology
Unmanned vehicle (UAV), generally also referred to as " unmanned plane ", " nolo flight system (UAS) " if or
Other titles are done, are a kind of not no aircraft of human pilot thereon.The flight of unmanned plane can be controlled by various modes
System:Such as remote control is subject to by human operator who (also sometimes referred to as " winged hand "), or by unmanned plane it is semi-autonomous or it is complete from
Master mode fly etc..
In remote control, need winged hand that can adjust the flight attitude of unmanned plane at any time as needed.However, for
For most people, the control mode of unmanned plane is mutually gone with driving, the adventure in daily life of remote-control toy in their daily lifes
It is very remote, it is therefore desirable to which that they carry out complicated, interminable professional training.In this case, how to simplify the operation of unmanned plane, or very
To its operation automation or semi-automation is made, just become one of urgent problem to be solved.
Summary of the invention
According to a first aspect of the present disclosure, a method executed at a first device for controlling a second device is proposed. The method comprises: determining a first space associated with the first device and a second space associated with the second device; determining a first coordinate mapping relation between the first space and the second space; based on the first coordinate mapping relation, determining, from a first operation of the first device in the first space, a second operation to be executed by the second device in the second space; and sending a control instruction to the second device to instruct the second device to execute the second operation.
According to a second aspect of the present disclosure, a first device for controlling a second device is proposed. The first device comprises: a space determining module for determining a first space associated with the first device and a second space associated with the second device; a first mapping relation determining module for determining a first coordinate mapping relation between the first space and the second space; a second operation determining module for determining, based on the first coordinate mapping relation and from a first operation of the first device in the first space, a second operation to be executed by the second device in the second space; and an instruction sending module for sending a control instruction to the second device to instruct the second device to execute the second operation.
According to a third aspect of the present disclosure, a first device for controlling a second device is proposed. The first device comprises: a processor; and a memory storing instructions which, when executed by the processor, cause the processor to: determine a first space associated with the first device and a second space associated with the second device; determine a first coordinate mapping relation between the first space and the second space; based on the first coordinate mapping relation, determine, from a first operation of the first device in the first space, a second operation to be executed by the second device in the second space; and send a control instruction to the second device to instruct the second device to execute the second operation.
According to a fourth aspect of the present disclosure, a computer readable storage medium storing instructions is proposed, the instructions, when executed by a processor, causing the processor to execute the method according to the first aspect of the disclosure.
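The data flow of the four steps of the first aspect can be illustrated with a short sketch. This is not the patented implementation — the disclosure leaves the form of the coordinate mapping relation open — but a minimal illustration assuming axis-aligned cuboid spaces and a per-axis linear (scale-and-translate) mapping; all names and values below are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned cuboid: an origin plus side lengths along X, Y, Z."""
    origin: tuple
    size: tuple

def coordinate_mapping(first: Box, second: Box):
    """Return a function mapping a point in the first space to the second space.

    A per-axis linear (scale-and-translate) relation is assumed here; the
    disclosure leaves the exact form of the mapping relation open.
    """
    scale = tuple(second.size[i] / first.size[i] for i in range(3))
    def to_second(p):
        return tuple(second.origin[i] + (p[i] - first.origin[i]) * scale[i]
                     for i in range(3))
    return to_second

# Steps S710-S740 as a data-flow sketch (all device I/O stubbed out):
first_space = Box(origin=(0, 0, 0), size=(5, 5, 3))            # operator's room, metres
second_space = Box(origin=(100, 200, 30), size=(50, 50, 30))   # UAV flight volume, metres
mapping = coordinate_mapping(first_space, second_space)        # step S720
operator_position = (2.5, 2.5, 1.5)                            # first operation (S730 input)
uav_target = mapping(operator_position)                        # second operation (S730 output)
control_instruction = {"cmd": "goto", "xyz": uav_target}       # sent in step S740
```

With the illustrative sizes above the gain is 10x per axis, so the centre of the operator's room maps to the centre of the flight volume.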
Brief description of the drawings
For a better understanding of the embodiments of the present disclosure and their advantages, reference is now made to the following description taken in conjunction with the accompanying drawings, in which:
Fig. 1 shows an example first space according to an embodiment of the present disclosure.
Fig. 2 shows an example second space according to an embodiment of the present disclosure.
Fig. 3 shows an example synchronization process between an example first device and an example second device according to an embodiment of the present disclosure.
Fig. 4 shows an example scenario in which the example first device leaves the first space, according to an embodiment of the present disclosure.
Fig. 5 shows an example scenario in which the example second device encounters an obstacle while being controlled by the example first device, according to an embodiment of the present disclosure.
Fig. 6 shows an example re-synchronization process when the example second space is redefined, according to an embodiment of the present disclosure.
Fig. 7 is a flowchart of an example method for the example first device to control the example second device, according to an embodiment of the present disclosure.
Fig. 8 is a functional module block diagram of the example first device for controlling the example second device, according to an embodiment of the present disclosure.
Fig. 9 is a hardware schematic diagram of the example first device for controlling the example second device, according to an embodiment of the present disclosure.
In addition, the drawings are not necessarily drawn to scale, but are presented illustratively in a way that does not affect the reader's understanding.
Detailed description of embodiments
Other aspects, advantages and salient features of the present disclosure will become apparent to those skilled in the art from the following detailed description of exemplary embodiments, taken in conjunction with the accompanying drawings.
In the present disclosure, the terms "comprising" and "containing" and their derivatives are meant to be inclusive and non-limiting.
In this specification, the various embodiments described below to illustrate the principles of the disclosure are illustrative only and should not be construed in any way as limiting the scope of the disclosure. The following description, with reference to the accompanying drawings, is provided to assist in a comprehensive understanding of the exemplary embodiments of the disclosure as defined by the claims and their equivalents. The description includes various details to assist understanding, but these details are to be regarded as merely exemplary. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications may be made to the embodiments described herein without departing from the scope and spirit of the disclosure. Furthermore, descriptions of well-known functions and structures are omitted for clarity and brevity. Throughout the drawings, the same reference numerals are used for the same or similar functions and operations. Moreover, although solutions with different features may be described in different embodiments, those skilled in the art should recognize that all or some features of different embodiments may be combined to form new embodiments without departing from the spirit and scope of the disclosure.
Please note that although the following embodiments are described in detail with a drone as the controlled object and a head-mounted display as the controlling body, the present disclosure is not limited thereto. In fact, the controlled object may be any manipulable object, such as a robot, a remote-control car, an aircraft, or any remotely controllable device. Likewise, the controlling body may also be, besides a head-mounted display, a fixed terminal (e.g. a desktop computer), a mobile terminal (e.g. a mobile phone or tablet computer), another wearable device, a remote controller, a handle, a joystick, or any device capable of issuing manipulation instructions.
Before formally describing some embodiments of the present disclosure, some of the terms to be used herein are first described.
Virtual reality (VR): virtual reality technology is an important branch of simulation technology — a combination of simulation technology with computer graphics, human-machine interface technology, multimedia technology, sensing technology and network technology. Virtual reality mainly involves a simulated environment, perception, natural skills and sensing devices. The simulated environment is typically a dynamic, real-time, three-dimensional photorealistic image generated by a computer. Perception means that an ideal VR system should possess all the senses a human has: in addition to the visual perception generated by computer graphics, there may also be auditory, tactile, force and motion perception, and even smell and taste — so-called multi-perception. Natural skills refer to head rotation, eye movements, gestures or other human behaviors; the computer processes the data corresponding to the participant's actions, responds to the user's input in real time, and feeds the result back to the user's senses. Sensing devices refer to three-dimensional interaction devices.
Euler angles / attitude angles: the relationship between the body coordinate system (a three-axis coordinate system with, for example, the tail-to-nose direction, the left-wing-to-right-wing direction, and the direction perpendicular to both and pointing below the body, i.e. below the aircraft's horizontal plane) and the earth coordinate system (also called the geodetic coordinate system, a three-axis system with, for example, east, north, and the direction toward the earth's center) is described by three Euler angles, which reflect the attitude of the aircraft relative to the ground. The three Euler angles are: pitch, yaw and roll.
Pitch angle θ (pitch): the angle between the body X-axis (e.g. the tail-to-nose direction) and the earth's horizontal plane. When the positive half of the X-axis lies above the horizontal plane through the origin (nose up), the pitch angle is positive; otherwise it is negative. When an aircraft's pitch angle changes, its subsequent flight altitude generally changes. If it is the pitch angle of an image sensor that changes, the captured picture generally shifts up or down.
Yaw angle ψ (yaw): the angle, in the horizontal plane, between the projection of the body X-axis and the earth X-axis (with the direction of the earth X-axis taken as reference). When the projection of the body X-axis reaches the earth X-axis by a counterclockwise rotation, the yaw angle is positive — that is, a rightward yaw of the nose is positive, and a leftward yaw is negative. When an aircraft's yaw angle changes, its subsequent horizontal heading generally changes. If it is the yaw angle of an image sensor that changes, the captured picture generally shifts left or right.
Roll angle Φ (roll): the angle between the body Z-axis (e.g. pointing downward from the aircraft's horizontal plane) and the vertical plane through the body X-axis; rolling to the right is positive, and to the left negative. When an aircraft's roll angle changes, its horizontal plane generally rotates. If it is the roll angle of an image sensor that changes, the captured picture generally tilts to the left or right.
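The three attitude angles above are commonly combined into a single rotation (direction cosine) matrix mapping body-frame vectors into the earth frame. The following sketch uses the conventional aerospace Z-Y-X (yaw-pitch-roll) rotation order; the text does not prescribe a specific convention, and sign conventions vary between flight controllers.

```python
import math

def euler_to_matrix(yaw, pitch, roll):
    """Rotation matrix (body frame -> earth frame) for Z-Y-X Euler angles
    given in radians, following the common aerospace yaw-pitch-roll order."""
    cy, sy = math.cos(yaw), math.sin(yaw)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cr, sr = math.cos(roll), math.sin(roll)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

# Zero attitude gives the identity matrix; a pure 90-degree yaw rotates the
# body X-axis (nose direction) onto the earth Y-axis.
R0 = euler_to_matrix(0.0, 0.0, 0.0)
R90 = euler_to_matrix(math.pi / 2, 0.0, 0.0)
```

The first column of the matrix is where the nose points in the earth frame, which is why a pitch change (the `-sp` term) translates directly into an altitude change as described above.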
As mentioned above, an easy-to-use drone control scheme is needed. Common drone control schemes usually rely on a hand-held remote controller: for example, the speed of the aircraft in each direction can be controlled by the amount of displacement applied to a control stick. At the same time, the pilot usually also has to attend to the shooting angle of the video camera (or, more generally, the image sensor or image sensing assembly) on the drone, which places certain demands on the pilot's workload and qualifications.
In scenarios such as urban traffic congestion, major events or sudden holiday traffic peaks, sudden disasters (earthquakes, fires, terrorist attacks, etc.), or even small-scale military conflicts, decision-makers need the ability to grasp immediate and intuitive on-site information in order to direct operations on the spot. As another example, the news industry needs to report on events from as many angles as possible. Under existing technical conditions, the main approaches are fixed or portable monitoring devices at the scene, or aircraft providing coverage from the air. These common schemes, however, suffer from the following problems: sufficient numbers of devices and operators are needed to fully cover the monitored area; monitoring devices on the ground have limited flexibility and are difficult to deploy flexibly; and manned aircraft or UAVs in the air cannot quickly switch from high altitude to other viewing angles. The approaches described above also require operation by professionals in many respects, with the results then summarized for presentation to decision-makers.
To at least partly solve or mitigate the above problems, a virtual-reality-based drone control system according to embodiments of the present disclosure is proposed. This system allows a commander to flexibly and intuitively grasp live pictures from various angles at the scene while reducing the demand for personnel and equipment. Through the system's intuitive manipulation mode, habitual human actions (e.g. standing, squatting, walking and/or head movements) can be used to control the drone and its camera angle, and flight trajectories and lens-angle controls that are difficult to accomplish by hand can be completed simply by walking and turning the head.
In some embodiments of the present disclosure, the hardware of the drone control system essentially consists of the following parts: an input end, a communication end and a terminal end. In some embodiments, the input end may include, for example, a head-mounted display (HMD) and/or a handle controller. The main function of the input end is to provide the operator with a virtual reality picture and an operation interface, so that the operator can carry out the corresponding observation of and operations on the drone according to the observed virtual reality picture. Please note that the virtual reality picture here is not limited to a purely computer-generated virtual picture; it may also include, for example, a real picture captured by the drone's image sensor, a combination of real and virtual pictures, and/or a purely virtual picture. In other words, in the context of the present disclosure, virtual reality (VR) also encompasses augmented reality (AR). In some embodiments, the communication end may include, for example: various networks (e.g. the Internet, LANs, mobile communication networks (3G, 4G and/or 5G, etc.), WiMax networks, fiber-optic networks, etc.), a control center, and/or a ground station. The main function of the communication end is to provide the communication link and communication control between the input end and the terminal end. The communication end may transfer data, signals and the like between the input end and the terminal end in a wired manner, a wireless manner, or a combination thereof. In some embodiments, the terminal end may include any remotely controllable device, such as an unmanned aerial vehicle (UAV), a robot, a remote-control car, or an aircraft.
The embodiments described below are detailed with the input end being a head-mounted display (HMD), the communication end being a wireless communication network (e.g. a 4G network), and the terminal end being a drone; however, as stated above, the present disclosure is not limited thereto. In addition, herein the HMD (or, more generally, the controlling body) is often referred to as the "first device", and the "second device" is used to refer to the drone (or, more generally, the controlled object), although the disclosure is not limited thereto. In fact, as network speeds increase (e.g. with 5G networks) and technology develops, functions performed in a single device may be distributed across multiple devices. For example, some of the steps of the method executed at the first device, described below, may be executed entirely at the communication end or the terminal end, so that the combination of the hardware parts across these devices that executes these steps can be regarded as equivalent to the "first device". Similarly, some of the steps of the method executed at the second device, described below, may be executed entirely at the input end or the communication end, so that the combination of the hardware parts across these devices that executes these steps can be regarded as equivalent to the "second device".
Next, the initialization process in which a first device (e.g. a head-mounted display) according to an embodiment of the present disclosure controls a second device (e.g. a drone) will be described in detail in conjunction with Figs. 1-3.
Fig. 1 shows an example first space 10 according to an embodiment of the present disclosure, and Fig. 2 shows an example second space 20 according to an embodiment of the present disclosure. As shown in Fig. 1, the first space 10 may be a space associated with the first device 100, in which a user wearing the first device 100 performs actual operations. As shown in Fig. 1, the user wearing the first device 100 may stand, walk, turn, squat, jump, turn the head, and so on within this space. When the first device 100 senses these actions, it can interpret these actions of the user and/or the first device 100 in the first space 10 in the manner described below, convert them correspondingly into actions to be executed by the controlled object (e.g. the drone 200 shown in Fig. 3) in the second space 20 shown in Fig. 2, and then send manipulation instructions indicating these actions to the drone 200. Please note that, since the user wears the first device 100 throughout, unless otherwise stated no distinction is made below between the user and the first device 100; in other words, unless otherwise stated, "user" and "first device 100" are used interchangeably hereinafter.
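As a hedged illustration of this conversion — the disclosure does not fix the exact rule — a relative displacement of the user can be scaled per axis into a drone displacement, while head orientation is passed through unscaled as a camera command. The gain values below are purely illustrative.

```python
def map_displacement(delta_first, scale):
    """Scale an operator displacement in the first space into a UAV
    displacement in the second space (per-axis linear gain; illustrative)."""
    return tuple(d * s for d, s in zip(delta_first, scale))

# With a purely illustrative 10x spatial gain per axis, a 1 m step forward
# by the operator becomes a 10 m UAV translation:
scale = (10.0, 10.0, 10.0)
print(map_displacement((1.0, 0.0, 0.0), scale))   # (10.0, 0.0, 0.0)

# Orientation (e.g. a head turn sensed by the HMD) would typically be passed
# on unscaled as a camera/gimbal yaw command:
head_yaw_deg = 15.0
gimbal_yaw_cmd = head_yaw_deg
```

Scaling displacements rather than absolute positions lets the drone keep flying smoothly even if the user's absolute position estimate drifts, which is one plausible reason to express operations relatively.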
To determine the range of the first space 10, the user wearing the first device 100 may specify all or some of the vertices and/or all or some of the side lengths of the first space 10. For example, in the embodiment shown in Fig. 1, the first space 10 may be the cuboid indicated by the dashed lines; to specify this cuboid space, the user may designate any vertex of the first space 10 as the origin and specify the length in each direction (e.g. along the X, Y and Z axes shown in Fig. 1) as the side lengths. In other embodiments, the range of the first space 10 may also be determined by specifying at least one of the following: the positions of two vertices of the first space 10 together with the length of the first space 10 along the coordinate axis perpendicular to the plane formed through the line connecting these two vertices; the positions of three non-collinear vertices of the first space 10 together with the length of the first space 10 in the direction perpendicular to the plane formed by these three vertices; and the positions of at least four non-coplanar vertices of the first space 10.
In addition, although the first space 10 is a cuboid in the embodiments illustrated herein to make the reader's understanding easier and more intuitive, the present disclosure is not limited thereto. The first space 10 may also have other shapes, including but not limited to: a sphere, a frustum, a pyramid, a cylinder, a cone, or any other regular or irregular solid structure.
In some embodiments, a vertex may be determined, for example, when the user walks to a certain point and activates a controller, or by a head action (e.g. nodding or shaking the head) or any other action (e.g. jumping or squatting), or even via another device operated by an onlooker, so as to notify the first device 100 that this point is a vertex of the first space 10. A side length may be determined, for example, by the user entering it manually through an input device such as a keyboard, or by detecting the distance the user actually walks. In other embodiments, the user may also determine the range of the first space 10 by virtually box-selecting within a captured image of the site.
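One of the specification options listed above — two opposite vertices of a cuboid — can be sketched as follows, together with a containment test of the kind that would be needed to detect the user stepping outside the first space 10 (cf. Fig. 4). Axis-aligned geometry is an illustrative assumption, not the patent's prescribed method.

```python
def box_from_vertices(v_a, v_b):
    """Axis-aligned cuboid from two opposite vertices: returns an origin
    (minimum corner) and per-axis side lengths."""
    origin = tuple(min(a, b) for a, b in zip(v_a, v_b))
    size = tuple(abs(a - b) for a, b in zip(v_a, v_b))
    return origin, size

def contains(origin, size, p):
    """True if point p lies inside the cuboid."""
    return all(o <= x <= o + s for o, s, x in zip(origin, size, p))

origin, size = box_from_vertices((0, 0, 0), (5, 5, 3))
print(contains(origin, size, (2, 2, 1)))   # True
print(contains(origin, size, (6, 2, 1)))   # False
```

The same construction would apply to the second space 20 when its vertices are picked on an electronic three-dimensional map.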
In addition, in some embodiments, the origin height of the first space 10 may be changed in the following manner. For example, when the first device 100 detects that the user has been in a squatting state for more than a certain time (e.g. 1 second, 3 seconds, or any other appropriate time), or upon a trigger event after squatting (e.g. pressing a corresponding button on a remote controller), or through a combination of the two, the origin of the first space 10 may be raised to, for example, the height of the user's eyes; the second device may then be kept at a low altitude in the second space, for example moving along the bottom surface of the second space. Likewise, in some embodiments, a fixed action or, more generally, a trigger event (e.g. pressing a corresponding button on the remote controller) may be detected so as to lower the height of the top surface of the first space 10 to the user's eye height, so that the second device can correspondingly move along the top surface of the second space. Moreover, in some embodiments, a timer (e.g. expiring 1 second, 3 seconds, or any other appropriate time after the second device starts operating at the changed height) or a trigger event (e.g. pressing a corresponding button on the remote controller) may be used to cancel the change to the second device's operating height in the foregoing embodiments. When the fixed operating height of the second device is cancelled, the second device may return to the corresponding position in the second space, for example by vertical movement back to the corresponding position. In addition, in some embodiments, to enable the user to manipulate the second device by squatting, the user may, when determining the first space 10, select as its origin a point within the range in which squatting is possible, making the origin-height change of the above embodiments possible.
In some embodiments, the operations of the above embodiments may be indicated to the user through an output device (e.g. a display) of the first device 100. For example, when a squat lasting longer than the predetermined time, or a trigger event, is detected, the display of the first device 100 may — while the operation of the foregoing embodiment is being executed — prompt the user that the second device has entered the height-change mode, or that the second device has left the height-change mode, and so on.
As shown in Fig. 2, the second space 20 may be a space associated with the second device 200, in which the second device 200 (for example, an unmanned aerial vehicle) actually operates. As shown in Fig. 2, the second device 200 can hover, fly, turn, descend, climb, and adjust the viewing angle of its camera within this space. The second device 200 can receive manipulation instructions from the first device 100 in the manner described below and execute the corresponding actions. In the embodiment shown in Fig. 2, the top and bottom of the second space 20 may correspond respectively to the maximum and minimum flight altitudes of the second device 200; however, the present disclosure is not limited thereto.
Similarly to the first space 10 shown in Fig. 1, in order to determine the range of the second space 20, the user may specify on an electronic three-dimensional map all or some of the vertices and/or all or some of the edge lengths of the second space 20. For example, in the embodiment shown in Fig. 2, the second space 20 may also be a cuboid, as indicated by the dashed lines; to specify this cuboid space, the user may designate any vertex of the second space 20 as the origin and specify the lengths along each direction (for example, the X, Y, and Z axes shown in Fig. 2) as the edge lengths. In other embodiments, the range of the second space 20 may also be determined by specifying at least one of the following: the positions of two vertices of the second space 20 on the electronic three-dimensional map, together with the length of the second space along the coordinate axis perpendicular to the plane containing the line formed by these two vertices; the positions of three non-collinear vertices of the second space 20 on the electronic three-dimensional map, together with the length of the second space 20 along the direction perpendicular to the plane formed by these three vertices; or the positions of at least four non-coplanar vertices of the second space 20 on the electronic three-dimensional map.
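The simplest of these specification variants, designating vertices of a cuboid, can be sketched as follows, under the assumption that two specified vertices are diagonally opposite corners of an axis-aligned cuboid; the function name and coordinate convention are illustrative, not taken from the disclosure:

```python
def box_from_opposite_vertices(v_a, v_b):
    """Return (origin, edge_lengths) of the axis-aligned cuboid
    spanned by two diagonally opposite vertices given as (x, y, z)."""
    origin = tuple(min(a, b) for a, b in zip(v_a, v_b))
    lengths = tuple(abs(a - b) for a, b in zip(v_a, v_b))
    return origin, lengths

# A 5 km x 2.5 km x 1 km second space whose floor is at 100 m altitude:
origin, lengths = box_from_opposite_vertices((0, 0, 100), (5000, 2500, 1100))
# origin == (0, 0, 100), lengths == (5000, 2500, 1000)
```

The other variants (three non-collinear vertices plus a height, or four non-coplanar vertices) would determine the box, or a more general solid, analogously.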
In addition, although the second space 20 is illustrated here as a cuboid to make the description easier and more intuitive for the reader, the present disclosure is not limited thereto. The second space 20 may also have other shapes, including but not limited to: a sphere, a frustum, a pyramid, a cylinder, a cone, or any other regular or irregular solid. For example, near a no-fly zone such as an airport, since the no-fly zone has the shape of an inverted frustum, the side of the second space 20 close to the airport may be an irregular solid that is narrow at the top and wide at the bottom.
In some embodiments, the vertices may be determined, for example, by having the user select the range of the second space 20 in the electronic three-dimensional map, for example by picking each vertex, or some of the vertices together with the edge lengths, of the second space 20 in the map. In other embodiments, the range of the second space 20 may instead be determined by flying the second device 200 to a specified point in the air, indicating to the first device 100 that this point is a particular vertex (for example, the origin or the center point) of the second space 20, and then specifying each edge length.
In the embodiment shown in Fig. 2, for the convenience of the second device 200's operation, the second space 20 may usually be specified so that it contains no objects that could obstruct the flight of the second device 200. However, the present disclosure is not limited thereto; as shown in the upper parts of Figs. 5 and 6, obstacles that permanently or temporarily affect the flight of the second device 200 may exist in the second space 20 (or 20').
In addition, the operations of determining the first space 10 and the second space 20, described above in conjunction with Figs. 1 and 2 respectively, may be performed sequentially, simultaneously, or partly sequentially and partly simultaneously, and the order in which the two are determined is not limited to the order described herein (that is, determining the first space 10 before the second space 20); the opposite order (that is, determining the second space 20 before the first space 10) is also possible.
Next, the coordinate mapping relationship between coordinates in the first space 10 and coordinates in the second space 20 needs to be determined. As noted above, for ease and intuitiveness of description, the first space 10 shown in Fig. 1 and the second space 20 shown in Fig. 2 are both cuboids. In most cases, the second space 20 is much larger than the first space 10; for example, the dimensions of the second space 20 may be on the order of kilometers while those of the first space 10 are on the order of meters. However, the present disclosure is not limited thereto: the first space 10 and the second space 20 may also be of roughly the same size, or the first space 10 may be larger than the second space 20.
Since the first space 10 and the second space 20 are cuboids in the embodiments shown in Figs. 1 and 2, a linear mapping relationship can be established between their coordinates. For example, the origins of the first space 10 and the second space 20 (determined manually as described above, or automatically by the first device 100 from the ranges of the first space 10 and/or the second space 20) may be mapped to each other, and a linear mapping applied separately to each of the X, Y, and Z axes.
In some embodiments, the mapping ratios between the edge lengths of the first space 10 and the corresponding edge lengths of the second space 20 may all be equal, although this is not required. For example, the X-, Y-, and Z-axis lengths of the first space 10 may be, say, 10 meters, 5 meters, and 2 meters, and those of the second space 20 may be, say, 5 kilometers, 2.5 kilometers, and 1 kilometer, in which case the ratio of corresponding edge lengths is 1/500 on all three axes. As another example, the X-, Y-, and Z-axis lengths of the first space 10 may be, say, 10 meters, 10 meters, and 2 meters, and those of the second space 20 may be, say, 5 kilometers, 2.5 kilometers, and 0.75 kilometers, in which case the ratios of corresponding edge lengths on the three axes are 1/500, 1/250, and 1/375 respectively.
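The per-axis linear mapping described here can be sketched as follows, using the second set of example dimensions; the interface is illustrative, with each space represented by its origin and its edge lengths:

```python
def make_linear_map(origin1, lengths1, origin2, lengths2):
    """Build a per-axis linear map from first-space coordinates
    to second-space coordinates; origins map onto each other."""
    scales = [l2 / l1 for l1, l2 in zip(lengths1, lengths2)]
    def to_second(p):
        return tuple(o2 + (c - o1) * s
                     for c, o1, o2, s in zip(p, origin1, origin2, scales))
    return to_second

# First space: 10 m x 10 m x 2 m; second space: 5 km x 2.5 km x 0.75 km
to_second = make_linear_map((0, 0, 0), (10, 10, 2), (0, 0, 0), (5000, 2500, 750))
to_second((3, 3, 0))  # -> (1500.0, 750.0, 0.0)
```

Walking 3 meters along X thus maps to 1.5 kilometers, while 3 meters along Y maps to only 0.75 kilometers, matching the second example above.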
In the former case, when the user wearing the first device 100 walks 3 meters along the X axis of the first space 10, the second device 200 correspondingly flies 1.5 kilometers along its X axis; when the user walks 3 meters along the Y axis of the first space 10, the second device 200 likewise flies 1.5 kilometers along its Y axis. In the latter case, when the user wearing the first device 100 walks 3 meters along the X axis of the first space 10, the second device 200 still flies 1.5 kilometers along its X axis; but when the user walks 3 meters along the Y axis of the first space 10, then, unlike the former case, the second device 200 correspondingly flies 0.75 kilometers, not 1.5 kilometers, along its Y axis.
Thus, in some embodiments of the present disclosure, once the ratios between the lengths of the first space 10 and the second space 20 along each coordinate axis have been determined, a (first) coordinate mapping relationship between coordinates in the first space 10 and coordinates in the second space 20 can be determined based on those ratios. The first device 100 can thereby map actions such as the user's displacement in the first space 10 to actions such as the displacement to be executed by the second device 200 in the second space 20. Such a mapping is intuitive and simple, and facilitates the user's (or the first device 100's) operation of the second device 200.
Next, the synchronization process during manipulation initialization is described in detail in conjunction with Fig. 3.
Fig. 3 illustrates an example synchronization process between the example first device 100 and the example second device 200 according to an embodiment of the present disclosure. After the coordinate mapping relationship between the first space 10 and the second space 20 has been established as described in conjunction with Figs. 1 and 2, the user wearing the first device 100 may, as shown in the lower part of Fig. 3, step to some point in the first space 10 (for example, near its center as shown in Fig. 3) and instruct the first device 100 to perform "synchronization activation" (for example, via a controller handle held in the user's hand, or by having the HMD detect a nod, a head shake, or any other trigger action), indicating that the synchronization process between the first device 100 and the second device 200 should start.
Upon receiving the "synchronization activation" instruction, the first device 100 can detect its own (first) coordinate in the first space 10 and, according to the previously determined coordinate mapping relationship, determine the (second) coordinate of the position the second device 200 should occupy in the second space 20. Once the first device 100 has determined the second coordinate, it can send a "synchronization activation" instruction to the second device 200, for example through the aforementioned communication terminal or directly, instructing the second device 200 to fly to the second coordinate, hover there, and enter the "synchronized" state. In addition, in some embodiments, while the second device 200 is en route to the second coordinate, that is, while it is synchronizing, the first device 100 can show the user a prompt such as the words "synchronizing", an icon, or another indicator, asking the user not to move for the moment so as not to prolong the synchronization process.
As shown in the upper part of Fig. 3, the second device 200 may initially be outside the second space 20, take off upon receiving the "synchronization activation" instruction, and enter the second space 20 at a predetermined approach altitude. The approach altitude may depend on the maximum and/or minimum height of the second space 20, or may be another height specified by the user. While entering, the second device 200 may use any of its own obstacle-avoidance devices or measures to avoid any obstacle encountered on the way into the second space 20. In other words, the flight path of the second device 200 need not be the polyline shown in the upper part of Fig. 3, but may be a route of any form (for example, curved, straight, or randomized) and any length. For example, in order to get around an obstacle directly above it, the second device 200 may even first fly some distance away from the second space 20, climb to the approach altitude, and then proceed to the second coordinate inside the second space 20. Meanwhile, the user can observe the flight status and surroundings of the second device 200 through the real-time images captured by the image sensing component (for example, a camera) carried on the second device 200 and presented on the first device 100, to ensure that no accident occurs while the second device 200 is approaching.
In some embodiments, when the second device 200 reaches the second coordinate, it can return a "synchronization activation" confirmation message to the first device 100, for example via the communication terminal or directly, to indicate that it has arrived at the designated position and entered the "synchronized" state. From this point on, the picture captured by the image sensing component of the second device 200 can be transmitted in real time and displayed on the first device 100. The operator can freely walk, turn, look up or down, squat, jump, and so on within the first space 10; these actions act on the first device 100 and correspondingly control the second device 200.
In some embodiments, devices installed in the first device 100, such as a gyroscope, an accelerometer, a magnetic sensor, or a positioning device (for example, GPS), can be used to measure in real time parameters of the user or the first device 100 such as instantaneous acceleration, instantaneous velocity, geometric coordinates, azimuth (yaw) angle, and/or pitch angle. For example, using its accelerometer, the first device 100 can determine its own acceleration in a certain direction over a period of time, from that determine its velocity during that period, and thereby determine its displacement over that period in that direction, that is, its coordinate relative to the initial position. As another example, using its gyroscope, the first device 100 can detect the amplitude of actions such as the user turning the head or raising/lowering the head and, in combination with the edge lengths of the first space 10, determine the change in azimuth (yaw) angle and/or pitch angle of that action within the first space 10.
Once the (first) operation of the first device 100 in the first space 10 has been determined, the first device 100 can use the aforementioned first coordinate mapping relationship to determine the (second) operation, corresponding to the first operation, to be executed by the second device 200 in the second space 20. For example, as described above, when the first device 100 moves 3 meters along the X axis of the first space 10, the second operation can be determined, according to the 1/500 first coordinate mapping relationship, as the second device 200 flying 1.5 kilometers along its X axis. As another example, when the pitch angle of the first device 100 in the plane containing the X (or Y) axis and the Z axis is +15 degrees (that is, the user's line of sight is raised by 15 degrees), then, if the ratios between the coordinate axes in the first coordinate mapping relationship are all equal, the second operation can be determined as a pitch angle of +15 degrees for the second device 200 and/or its image sensing component in the plane containing its X (or Y) axis and Z axis. If, however, the ratios between the coordinate axes in the first coordinate mapping relationship are not all equal, for example the X-axis (or Y-axis) ratio is 1/500 and the Z-axis ratio is 1/375, the second operation can be determined as a pitch angle for the second device 200 and/or its image sensing component in the plane containing its X (or Y) axis and Z axis of about +11.3 degrees (that is, arctan((375/500) × tan 15°) ≈ 11.3°). The main purpose of doing so is to ensure that, although the first space 10 and the second space 20 have different ratios on each axis, the maximum pitch range achievable by the user corresponds to that of the second device. The azimuth angle can be handled similarly.
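The anisotropic pitch conversion just described follows from mapping the horizontal and vertical legs of the line-of-sight triangle separately: tan θ₂ = (s_z / s_xy) · tan θ₁, where s_xy and s_z are the horizontal and vertical scale factors. A short sketch (function name illustrative):

```python
import math

def map_pitch(theta1_deg, scale_xy, scale_z):
    """Map a first-space pitch angle to the second space when the
    horizontal and vertical scale factors differ:
    tan(theta2) = (scale_z / scale_xy) * tan(theta1)."""
    t = math.tan(math.radians(theta1_deg)) * (scale_z / scale_xy)
    return math.degrees(math.atan(t))

map_pitch(15.0, 500.0, 375.0)  # -> about +11.3 degrees, as in the text
```

When the two scale factors are equal, the formula reduces to the identity, matching the equal-ratio case above.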
In addition, in some embodiments, when the user or the first device 100 produces a height change (for example, by jumping or squatting), the first device 100 can determine the maximum or minimum height to which the first device 100 rose or fell and compare it with a preset maximum or minimum threshold height. When the difference between the detected height and the threshold height is obtained, it can be mapped, according to the aforementioned first coordinate mapping relationship, to a height difference in the second space 20, and the second device 200 can be instructed accordingly to climb or descend by that height difference. In other embodiments, the height conversion can also be performed without considering the first coordinate mapping relationship. For example, each time the user jumps, the second device 200 may climb a fixed height, such as 10 meters; each time the user squats, the second device 200 may descend a fixed height, such as 5 meters. Furthermore, the height of the second device 200 could be adjusted according to the actual height change of the first device 100 without setting any threshold; however, considering the small height variations that naturally occur when a person walks, this is unhelpful for manipulating the second device.
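The threshold-based variant can be sketched as a small dead-band filter: height excursions within the band are ignored (suppressing the natural bobbing of walking), and only the portion beyond a threshold is scaled into the second space. Thresholds and the scale factor below are illustrative values, not taken from the disclosure:

```python
def height_command(detected, high_thresh, low_thresh, z_scale):
    """Return the climb (+) or descent (-) in the second space when the
    first device's height crosses a threshold, else 0.0 (dead band)."""
    if detected > high_thresh:
        return (detected - high_thresh) * z_scale
    if detected < low_thresh:
        return (detected - low_thresh) * z_scale
    return 0.0

height_command(2.1, 1.9, 0.9, 375.0)  # jump past 1.9 m -> climb ~75 m
height_command(1.5, 1.9, 0.9, 375.0)  # normal walking height -> 0.0
```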
More generally, in some embodiments, the first device 100 can determine a first translation route along which it moves while executing a translation in the first space 10, map the first translation route to a second translation route in the second space 20 based on the above first coordinate mapping relationship, and determine the second operation as an instruction for the second device 200 to move along the second translation route. In other embodiments, the first device 100 can determine a first azimuth angle of the first device 100 at the end of a turning operation in the first space 10, map the first azimuth angle to a second azimuth angle in the second space 20 based on the above first coordinate mapping relationship, and determine the second operation as an instruction for the second device 200 or its image sensing component to turn to the second azimuth angle. In yet other embodiments, the first device 100 can determine a first pitch angle of the first device 100 in the first space 10 at the end of a viewing-angle change operation, map the first pitch angle to a second pitch angle in the second space 20 based on the first coordinate mapping relationship, and determine the second operation as an instruction for the image sensing component of the second device 200 to turn to the second pitch angle. In still other embodiments, the first device 100 can determine the maximum or minimum height reached by the first device 100 in the first space 10 during a height change operation; if that height is correspondingly above the maximum threshold or below the minimum threshold, map the difference between it and the corresponding threshold to a height difference in the second space 20 based on the first coordinate mapping relationship; and determine the second operation as an instruction for the second device 200 to climb or descend by that height difference.
However, the present disclosure is not limited thereto.In further embodiments, for the angle at pitch angle and/or azimuth etc
Variation, can also directly be converted under the premise of not considering the first coordinate mapping relations.For example, if the first operation is the
One equipment 100 rotates clockwise 45 degree in original place in the first space 10, then can the second operation be determined as the second equipment 200 exists
Also original place rotates clockwise 45 degree in second space 20.By for example, if the first operation is the first equipment 100 in the first space 10
It inside bows 30 degree, then can the second operation be determined as pitch angle of second equipment 200 in second space 20 also declines 30 degree.
The purpose done so mainly ensures:Although the first space 10 and second space 20 ratio on each axis are different, user institute
The maximum pitch angle that can be realized is corresponding.In addition, azimuth can also similar process.
In addition, the roll angle is not discussed here, mainly because the second device 200 typically does not roll and the first device 100 does not usually perform rolling actions either. However, the present disclosure is not limited thereto; the roll angle can be processed similarly.
In some embodiments, after the second operation has been determined, a control instruction can be sent to the second device 200 instructing it to execute the second operation.
Thus, in the ideal case, the coordinate of the operator or the first device 100 in the first space 10 can be synchronized in real time with the corresponding coordinate of the second device 200 in the second space 20, and the azimuth and/or pitch angle of the image sensing component of the second device 200 can correspond to the respective angles of the first device 100. This allows the user to operate the unmanned aerial vehicle conveniently and intuitively, and to obtain a real-time view corresponding to the user's current head pose.
Next, the process of releasing synchronization between the first device 100 and the second device 200 is described in detail in conjunction with Fig. 4.
Fig. 4 illustrates an example scenario in which the example first device 100 leaves the first space 10 according to an embodiment of the present disclosure. In the embodiment shown in Fig. 4, when the user wearing the first device 100 leaves the first space 10, for example by walking out of its boundary, the second device 200 should in theory also be about to fly out of the corresponding boundary of the second space 20. At this point, in some embodiments, the first device 100 can send a "synchronization cancel" instruction to the second device 200, instructing it to release the synchronized state and hover in place awaiting further instructions. In other embodiments, the second device 200 can also release the synchronized state on its own and hover in place when it detects that it is about to leave the second space 20, in which case it may also choose to report its "synchronization released" state to the first device 100. Besides releasing synchronization when the user or the first device 100 leaves the first space 10, the user or the first device 100 can also actively release the synchronized state, for example by pressing some fixed button on a handheld controller, or through a nod, a head shake, or some other specified action detected by the head-mounted display. The second device 200 then releases the synchronized state and hovers in place.
In some embodiments, when the original user wearing the first device 100, or another user, returns to the first space 10, the first device 100 can accordingly issue a resynchronization instruction to the second device 200 (for example, the aforementioned "synchronization activation" instruction or a separate resynchronization instruction), instructing the second device 200 to enter the synchronized state and fly to the position in the second space 20 corresponding to the current position of the first device 100 in the first space 10. In addition, in other embodiments, the user wearing the first device 100 can also choose to trigger the synchronized state manually, for example by redefining the first space 10 and sending a "synchronization activation" instruction to the second device 200 as described above.
Next, the obstacle-avoidance process of the second device 200 is described in detail in conjunction with Fig. 5.
Fig. 5 illustrates an example scenario in which the example second device 200 encounters an obstacle while being controlled by the example first device 100, according to an embodiment of the present disclosure. As shown in the lower part of Fig. 5, the user or the first device 100 performs a displacement operation, so the second device 200 should move to the corresponding position in the manner described above. If an obstacle appears on the displacement route of the second device 200, as shown in the upper part of Fig. 5, the second device 200 can choose to re-plan the route on its own and move to the corresponding position. In some embodiments, this re-planning can be carried out by the second device 200 autonomously; in this way, the obstacle can be avoided quickly. In other embodiments, the re-planning can instead be controlled by the first device 100 after it receives the second device 200's report. Either way, the second device 200 can reach the designated position after avoiding the obstacle.
In some embodiments, when the second device 200 re-plans its route to avoid an obstacle, it can release the synchronized state with the first device 100 and re-enter the synchronized state after reaching the designated position. In other embodiments, the synchronized state can instead be maintained and released only if the second device 200 fails to reach the designated place within a specified time.
Next, the process of redefining the second space 20 is described in detail in conjunction with Fig. 6.
Fig. 6 illustrates an example resynchronization process when the example second space 20 is redefined, according to an embodiment of the present disclosure. As shown in Fig. 6, when the movement of the first device 100 within the first space 10 cannot cover all positions in the second space 20 (for example, because the initial setup was wrong), or when the user wants to observe some region outside the second space 20, or when the second device 200 needs to be displaced while the operator remains stationary, the user can, for example with a handle, specify a target point (or, more generally, a new second space) on the three-dimensional electronic map. The coordinate relationship between the first space 10 and the second space 20' is then remapped so that the coordinate in the first space 10 where the user currently stands immediately corresponds to the coordinate of the target point in the second space 20'.
For example, as shown in Fig. 6, the user can specify a position 150 to be reached in the virtual space by means of a control handle or otherwise (for example, by a gesture detector mounted on the user's arm detecting the movement of the user's arm in the virtual space). The second device 200 then flies to the corresponding spatial coordinate in the second space 20, the second space 20' is redefined so that this new position corresponds to the current position of the user or the first device 100 in the first space 10, and subsequent operations continue.
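One way to realize the remapping step, assuming the per-axis scale factors are kept and only the second-space origin is shifted, is to solve origin₂ = target₂ − p₁ · scales so that the user's current first-space coordinate lands exactly on the target point. This is a sketch of one possibility, not the disclosure's prescribed method:

```python
def remap_to_target(p1, target2, scales):
    """Choose a new second-space origin so that the user's current
    first-space coordinate p1 maps onto the target point target2,
    keeping the per-axis scale factors unchanged."""
    return tuple(t - c * s for c, t, s in zip(p1, target2, scales))

# User stands at (3, 3, 0); target point is (4000, 1000, 200); scales 500/250/375
origin2 = remap_to_target((3, 3, 0), (4000, 1000, 200), (500, 250, 375))
# origin2 == (2500, 250, 200): then origin2 + p1 * scales == target2
```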
In addition, in some embodiments, when the second device 200 is for some reason not synchronized in real time with the position of the first device 100, the first device 100 can switch the picture it presents to a pre-built 3D model of the flight area (that is, a purely virtual picture). The coordinate system used in this 3D space and the viewing angle of the image sensing component are consistent with those of the actual flight area. Furthermore, ambient conditions such as lighting and weather can be simulated from the static 3D model together with the current time and weather of the flight area, so that the first device 100 displays a rendered picture of the flight area approximating live footage, allowing the operator to use reference objects for operation even without a live camera feed. In some embodiments, once the second device 200 is again synchronized with the first device 100, the display can be switched back to the live picture.
In addition, a control center (for example, the first device 100 or another control facility) can monitor data such as the current surroundings, battery level, and return-flight distance of the second device 200, and automatically decide when to dispatch a standby second device 200 to ensure uninterrupted operation. For example, another first device 100 can dispatch a standby second device 200 to the vicinity of the current second device 200, ensuring that the replacement arrives and takes over before the battery of the current second device 200 drops to the return-flight level, thereby achieving uninterrupted monitoring of the monitored area.
With the device control scheme described above in conjunction with Figs. 1 to 6, a commander can flexibly and intuitively observe live pictures from various angles: provided flight conditions allow, the operator merely needs to walk to a suitable position to obtain the desired monitoring angle. Dependence on other personnel and equipment is also reduced.
The method, executed at the first device 100 according to an embodiment of the present disclosure, for controlling the second device 200, and the functional configuration of the corresponding first device 100, are described in detail below with reference to Figs. 7 and 8.
Fig. 7 is a flowchart of a method 700, executed at the first device 100, for controlling the second device 200 according to an embodiment of the present disclosure. As shown in Fig. 7, the method 700 may include steps S710, S720, S730, and S740. According to the present disclosure, some steps of the method 700 may be executed separately or in combination, and may be executed in parallel or in sequence; the method is not limited to the specific operation order shown in Fig. 7. In some embodiments, the method 700 may be executed by the first device 100 shown in Figs. 1 to 6, the first device 800 shown in Fig. 8, or the device 900 shown in Fig. 9.
Fig. 8 is a functional block diagram of an example first device 800 (for example, the first device 100, a head-mounted display, a handheld controller, or another control device) according to an embodiment of the present disclosure. As shown in Fig. 8, the first device 800 may include: a space determining module 810, a first mapping relations determining module 820, a second operation determining module 830, and an instruction sending module 840.
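The division of labor among the four modules of Fig. 8 can be illustrated with a small class sketch; the class and method names below are illustrative stand-ins for modules 810 to 840, not an implementation taken from the disclosure:

```python
class FirstDevice:
    """Sketch of the functional decomposition shown in Fig. 8."""
    def __init__(self, link):
        self.link = link          # communication channel to the second device
        self.mapping = None       # first coordinate mapping relationship

    def determine_spaces(self, space1, space2):       # module 810
        self.space1, self.space2 = space1, space2

    def determine_mapping(self, fn):                  # module 820
        self.mapping = fn         # e.g. a per-axis linear map

    def determine_second_operation(self, first_op):   # module 830
        return {"route": [self.mapping(p) for p in first_op["route"]]}

    def send_instruction(self, second_op):            # module 840
        self.link.append(second_op)   # stand-in for RF/WiFi/Ethernet send
```

A control instruction thus flows 810 → 820 → 830 → 840, matching the order of the modules described below.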
The space determining module 810 may be used to determine the first space 10 associated with the first device 100 and the second space 20 associated with the second device 200. The space determining module 810 may be a central processing unit, digital signal processor (DSP), microprocessor, or microcontroller of the first device 100, and may cooperate with, for example, an input device of the first device 100 to determine the first space 10 associated with the first device 100 and the second space 20 associated with the second device 200.
The first mapping relations determining module 820 may be used to determine the first coordinate mapping relationship between the first space 10 and the second space 20. The first mapping relations determining module 820 may also be a central processing unit, digital signal processor (DSP), microprocessor, or microcontroller of the first device 100, and may determine the first coordinate mapping relationship between the first space 10 and the second space 20 according to the size, shape, orientation, and so on of the first space 10 and the second space 20.
The second operation determining module 830 may be used to determine, based on the first coordinate mapping relationship and according to the first operation of the first device 100 in the first space 10, the second operation to be executed by the second device 200 in the second space 20. The second operation determining module 830 may also be a central processing unit, digital signal processor (DSP), microprocessor, or microcontroller of the first device 100, and can convert the first operation of the first device 100 into the second operation of the second device 200 so that the user can manipulate the second device 200 intuitively and simply.
The instruction sending module 840 may be used to send a control instruction to the second device 200, to instruct the second device 200 to execute the second operation. The instruction sending module 840 may likewise be a central processing unit, a digital signal processor (DSP), a microprocessor, a microcontroller, or the like of the first device 100, and may cooperate with a communication portion of the first device 100 (for example, a wired/wireless communication unit, such as an RF unit, a WiFi unit, a cable, or an Ethernet network card) to send the control instruction to the second device 200, to instruct the second device 200 to execute the second operation.
In addition, the first device 800 may further include other functional modules not shown in Fig. 8, such as a first coordinate determining module, a second coordinate mapping module, a synchronization activation instruction sending module, a synchronization release instruction sending module, an operation stop instruction sending module, a third coordinate determining module, a fourth coordinate mapping module, a second space redetermining module, a mapping relation redetermining module, and/or a second operation redetermining module. In some embodiments, the first coordinate determining module may be used to determine a first coordinate of the first device in the first space when a synchronization activation operation is executed. In some embodiments, the second coordinate mapping module may be used to map the first coordinate to a second coordinate in the second space based on the first coordinate mapping relation. In some embodiments, the synchronization activation instruction sending module may be used to send a synchronization activation instruction to the second device, to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state. In some embodiments, the synchronization release instruction sending module may be used to send a synchronization release instruction to the second device, to indicate that the second device is in a "synchronization released" state. In some embodiments, the operation stop instruction sending module may be used to send, if the first device leaves the first space during the first operation, an operation stop instruction to the second device, to instruct the second device to stop executing the corresponding second operation and hover at its current position. In some embodiments, the third coordinate determining module may be used to determine, if the first device returns to the first space, a third coordinate of the first device when returning to the first space. In some embodiments, the fourth coordinate mapping module may be used to map the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation. In some embodiments, the second operation determining module may further be used to determine the second operation as an operation instructing the second device to move to the fourth coordinate. In some embodiments, the second space redetermining module may be used to redetermine the second space associated with the second device. In some embodiments, the mapping relation redetermining module may be used to determine a second coordinate mapping relation between the first space and the redetermined second space. In some embodiments, the second operation redetermining module may be used to determine, based on the second coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the redetermined second space.
In addition, the first device 800 may further include other functional modules not shown in Fig. 8; since they do not affect the understanding of the embodiments of the present disclosure by those skilled in the art, they are omitted from Fig. 8. For example, the first device 800 may further include one or more of the following functional modules: a power supply, a memory, a data bus, an antenna, a radio transceiver, and the like.
Below, with reference to Fig. 7 and Fig. 8, the method 700 for controlling the second device 200, executed at the first device 800 (for example, the first device 100) according to an embodiment of the present disclosure, and the first device 800 (for example, the first device 100) itself, are described in detail.
The method 700 starts at step S710, in which the first space 10 associated with the first device 800 and the second space 20 associated with the second device 200 may be determined by the space determining module 810 of the first device 800.
In step S720, the first coordinate mapping relation between the first space 10 and the second space 20 may be determined by the first mapping relation determining module 820 of the first device 800.
In step S730, the second operation to be executed by the second device 200 in the second space 20 may be determined by the second operation determining module 830 of the first device 800, based on the first coordinate mapping relation and according to the first operation of the first device 800 in the first space 10.
In step S740, a control instruction may be sent from the instruction sending module 840 of the first device 800 to the second device 200, to instruct the second device 200 to execute the second operation.
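The four steps S710 to S740 can be sketched as a simple pipeline. The following Python sketch is illustrative only: all function names, the space dimensions, and the instruction format are assumptions for the example, not part of the disclosure.

```python
# Minimal sketch of the S710-S740 pipeline; all names are illustrative
# assumptions, not identifiers from the disclosure.
def determine_spaces():
    # S710: first space (e.g. the operator's room) and second space
    # (e.g. a flight region), each as (origin, lengths per axis).
    return ((0, 0, 0), (4, 4, 2)), ((0, 0, 0), (400, 400, 200))

def determine_mapping(first, second):
    # S720: one scale factor per coordinate axis.
    return tuple(s / f for f, s in zip(first[1], second[1]))

def determine_second_operation(first_op, mapping):
    # S730: scale each point of the first operation into the second space.
    return [tuple(c * k for c, k in zip(p, mapping)) for p in first_op]

def send_control_instruction(second_op):
    # S740: stand-in for the communication portion (e.g. an RF/WiFi unit).
    return {"command": "move", "route": second_op}

first_space, second_space = determine_spaces()
mapping = determine_mapping(first_space, second_space)
second_op = determine_second_operation([(1, 1, 0.5)], mapping)
instruction = send_control_instruction(second_op)
print(instruction)  # {'command': 'move', 'route': [(100.0, 100.0, 50.0)]}
```

In this sketch a 1 m displacement in the room maps to a 100 m displacement in the flight region, which is the intuitive "scaled teleoperation" behavior the modules above provide.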
In some embodiments, step S710 may include determining at least one of the following: a position of one vertex of the first space and lengths of the first space along each coordinate axis; positions of two vertices of the first space and lengths of the first space along the coordinate axes of a plane perpendicular to the line formed by the two vertices; positions of three non-collinear vertices of the first space and a length of the first space in the direction of the normal of the plane formed by the three vertices; and positions of at least four non-coplanar vertices of the first space. In some embodiments, step S710 may also include determining at least one of the following: a position of one vertex of the second space on a three-dimensional electronic map and lengths of the second space along each coordinate axis of the three-dimensional electronic map; positions of two vertices of the second space on the three-dimensional electronic map and lengths of the second space along the coordinate axes of a plane perpendicular to the line formed by the two vertices; positions of three non-collinear vertices of the second space on the three-dimensional electronic map and a length of the second space in the direction of the normal of the plane formed by the three vertices; and positions of at least four non-coplanar vertices of the second space on the three-dimensional electronic map. In some embodiments, step S720 may include: setting respective origins of the first space and the second space; determining corresponding proportions of the lengths of the first space and the second space along each coordinate axis; and determining, based on the origins and the corresponding proportions of the first space and the second space, the first coordinate mapping relation between coordinates in the first space and coordinates in the second space. In some embodiments, the method 700 may further include: determining a first coordinate of the first device in the first space when a synchronization activation operation is executed; mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relation; and sending a synchronization activation instruction to the second device, to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state. In some embodiments, the method 700 may further include sending a synchronization release instruction to the second device, to indicate that the second device is in a "synchronization released" state.
In some embodiments, the first operation may include at least one of the following: a translation operation, a steering operation, a visual angle change operation, and a height change operation. In some embodiments, if the first operation is a translation operation, step S730 may include: determining a first translation route of the first device in the first space when executing the translation operation; mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relation; and determining the second operation as an operation instructing the second device to move along the second translation route. In some embodiments, if the first operation is a steering operation, step S730 may include: determining a first azimuth angle of the first device in the first space when the steering operation ends; mapping the first azimuth angle to a second azimuth angle in the second space based on the first coordinate mapping relation; and determining the second operation as instructing the second device, or an image sensing component of the second device, to turn to the second azimuth angle. In some embodiments, if the first operation is a visual angle change operation, step S730 may include: determining a first pitch angle of the first device in the first space when the visual angle change operation ends; mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relation; and determining the second operation as instructing the image sensing component of the second device to turn to the second pitch angle. In some embodiments, if the first operation is a height change operation, step S730 may include: determining the maximum height or minimum height reached by the first device in the first space during the height change operation; if the maximum height or minimum height is respectively higher or lower than a highest threshold or a lowest threshold, mapping the difference between the maximum height or minimum height and the corresponding highest threshold or lowest threshold to a height difference in the second space based on the first coordinate mapping relation; and determining the second operation as instructing the second device to rise or descend by the height difference.
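The threshold logic of the height change branch can be sketched as follows. The threshold values, the vertical scale factor, and the return format are illustrative assumptions; only the excess beyond a threshold is mapped, which lets the operator's normal head height produce no vertical motion of the second device.

```python
# Sketch of the height-change branch of S730: only the portion of the
# first device's height above the highest threshold (or below the
# lowest threshold) becomes a height difference for the second device.
def height_change_operation(reached, high_thr, low_thr, z_scale):
    if reached > high_thr:
        return ("rise", (reached - high_thr) * z_scale)
    if reached < low_thr:
        return ("descend", (low_thr - reached) * z_scale)
    return ("hold", 0.0)  # within thresholds: no second operation needed

# Assumed example: the first device rose to 2.5 m against a highest
# threshold of 2.0 m; with a 1:100 vertical mapping the second device
# is instructed to rise 50 m.
op = height_change_operation(2.5, high_thr=2.0, low_thr=0.5, z_scale=100.0)
print(op)  # ('rise', 50.0)
```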
In some embodiments, the method 700 may further include: if the first device leaves the first space during the first operation, sending a control instruction to the second device, to instruct the second device to stop executing the corresponding second operation and hover at its current position. In some embodiments, the method 700 may further include: if the first device returns to the first space, determining a third coordinate of the first device when returning to the first space; mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation; and determining the second operation as an operation instructing the second device to move to the fourth coordinate. In some embodiments, the method 700 may further include: redetermining the second space associated with the second device; determining a second coordinate mapping relation between the first space and the redetermined second space; determining, based on the second coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the redetermined second space; and sending a control instruction to the second device, to instruct the second device to execute the second operation.
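The leave/return behavior can be illustrated with a minimal bounds check: outside the first space the second device is told to hover; on return, the third coordinate is mapped and the second device is moved to the resulting fourth coordinate. The instruction dictionaries, the assumed 1:100 mapping, and all names are hypothetical.

```python
# Sketch of the leave/return handling; illustrative names and formats.
def in_first_space(p, lengths):
    # First space assumed axis-aligned with its origin at one vertex.
    return all(0 <= c <= l for c, l in zip(p, lengths))

def control_for(position, lengths, to_second):
    if not in_first_space(position, lengths):
        # Operation stop instruction: hover at the current position.
        return {"command": "hover"}
    # Return case: map the third coordinate to the fourth coordinate.
    return {"command": "move_to", "target": to_second(position)}

to_second = lambda p: tuple(c * 100.0 for c in p)  # assumed 1:100 mapping
room = (5.0, 5.0, 2.0)

print(control_for((6.0, 1.0, 1.0), room, to_second))  # {'command': 'hover'}
print(control_for((1.0, 1.0, 1.0), room, to_second))
# {'command': 'move_to', 'target': (100.0, 100.0, 100.0)}
```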
Fig. 9 is a block diagram of an example hardware arrangement 900 of the first device 100 shown in Figs. 1 to 6 or the first device 800 shown in Fig. 8, according to an embodiment of the present disclosure. The hardware arrangement 900 may include a processor 906 (for example, a central processing unit (CPU), a digital signal processor (DSP), a micro controller unit (MCU), or the like). The processor 906 may be a single processing unit or multiple processing units for executing the different actions of the flows described herein. The arrangement 900 may also include an input unit 902 for receiving signals from other entities, and an output unit 904 for providing signals to other entities. The input unit 902 and the output unit 904 may be arranged as a single entity or as separate entities.
In addition, the arrangement 900 may include at least one readable storage medium 908 in the form of non-volatile or volatile memory, for example an electrically erasable programmable read-only memory (EEPROM), a flash memory, and/or a hard disk drive. The readable storage medium 908 contains computer program instructions 910, which include code/computer-readable instructions that, when executed by the processor 906 in the arrangement 900, cause the hardware arrangement 900, and/or the first device 100 or the first device 800 including the hardware arrangement 900, to execute, for example, the flows described above in conjunction with Figs. 1 to 7 and any variations thereof.
The computer program instructions 910 may be configured as computer program instruction code having an architecture of, for example, computer program instruction modules 910A to 910D. Thus, in an example embodiment in which the hardware arrangement 900 is used in, for example, the first device 100 or 800, the code in the computer program instructions of the arrangement 900 includes: a module 910A, for determining a first space associated with the first device and a second space associated with the second device. The code in the computer program instructions further includes: a module 910B, for determining the first coordinate mapping relation between the first space and the second space. The code in the computer program instructions further includes: a module 910C, for determining, based on the first coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the second space. The code in the computer program instructions further includes: a module 910D, for sending a control instruction to the second device, to instruct the second device to execute the second operation.
The computer program instruction modules may substantially execute the respective actions in the flows shown in Figs. 1 to 7 to emulate the first device 100 or 800. In other words, when the different computer program instruction modules are executed in the processor 906, they may correspond to the different modules described above in the first device 100 or 800.
Although the code means in the embodiments disclosed above in conjunction with Fig. 9 are implemented as computer program instruction modules which, when executed in the processor 906, cause the hardware arrangement 900 to execute the actions described above in conjunction with Figs. 1 to 7, in alternative embodiments at least one of the code means may be implemented at least partly as a hardware circuit.
The processor may be a single CPU (central processing unit), but may also include two or more processing units. For example, the processor may include a general-purpose microprocessor, an instruction set processor, and/or a related chipset, and/or a special-purpose microprocessor (for example, an application-specific integrated circuit (ASIC)). The processor may also include onboard memory for caching purposes. The computer program instructions may be carried by a computer program instruction product connected to the processor. The computer program instruction product may include a computer-readable medium on which the computer program instructions are stored. For example, the computer program instruction product may be a flash memory, a random access memory (RAM), a read-only memory (ROM), or an EEPROM, and in alternative embodiments the computer program instruction modules described above may be distributed, in the form of memory within the UE, among different computer program instruction products.
It should be noted that functions described herein as being realized by pure hardware, pure software, and/or firmware may also be realized by other means, such as a combination of dedicated hardware and general-purpose hardware with software. For example, functions described as being realized by dedicated hardware (for example, a field programmable gate array (FPGA), an application-specific integrated circuit (ASIC), etc.) may be realized by a combination of general-purpose hardware (for example, a central processing unit (CPU), a digital signal processor (DSP)) and software, and vice versa.
Although the present disclosure has been shown and described with reference to certain exemplary embodiments thereof, those skilled in the art should understand that various changes in form and detail may be made without departing from the spirit and scope of the present disclosure as defined by the following claims and their equivalents. Therefore, the scope of the present disclosure should not be limited to the above embodiments, but should be determined not only by the appended claims but also by the equivalents of the appended claims.
Claims (46)
1. A method for controlling a second device, executed at a first device, comprising:
determining a first space associated with the first device and a second space associated with the second device;
determining a first coordinate mapping relation between the first space and the second space;
based on the first coordinate mapping relation, determining, according to a first operation of the first device in the first space, a second operation to be executed by the second device in the second space; and
sending a control instruction to the second device, to instruct the second device to execute the second operation.
2. The method according to claim 1, wherein the step of determining the first space associated with the first device comprises determining at least one of the following:
a position of one vertex of the first space and lengths of the first space along each coordinate axis;
positions of two vertices of the first space and lengths of the first space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions of three non-collinear vertices of the first space and a length of the first space in the direction of the normal of the plane formed by the three vertices; and
positions of at least four non-coplanar vertices of the first space.
3. The method according to claim 1, wherein the step of determining the second space associated with the second device comprises determining at least one of the following:
a position of one vertex of the second space on a three-dimensional electronic map and lengths of the second space along each coordinate axis of the three-dimensional electronic map;
positions of two vertices of the second space on the three-dimensional electronic map and lengths of the second space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions of three non-collinear vertices of the second space on the three-dimensional electronic map and a length of the second space in the direction of the normal of the plane formed by the three vertices; and
positions of at least four non-coplanar vertices of the second space on the three-dimensional electronic map.
4. The method according to claim 1, wherein the step of determining the first coordinate mapping relation between the first space and the second space comprises:
setting respective origins of the first space and the second space;
determining corresponding proportions of the lengths of the first space and the second space along each coordinate axis; and
determining, based on the origins and the corresponding proportions of the first space and the second space, the first coordinate mapping relation between coordinates in the first space and coordinates in the second space.
5. The method according to claim 1, wherein the method further comprises:
determining a first coordinate of the first device in the first space when a synchronization activation operation is executed;
mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relation; and
sending a synchronization activation instruction to the second device, to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state.
6. The method according to claim 5, wherein the method further comprises:
sending a synchronization release instruction to the second device, to indicate that the second device is in a "synchronization released" state.
7. The method according to claim 5, wherein the first operation comprises at least one of the following: a translation operation, a steering operation, a visual angle change operation, and a height change operation.
8. The method according to claim 7, wherein, if the first operation is a translation operation, the step of determining, based on the first coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the second space comprises:
determining a first translation route of the first device in the first space when executing the translation operation;
mapping the first translation route to a second translation route in the second space based on the first coordinate mapping relation; and
determining the second operation as an operation instructing the second device to move along the second translation route.
9. The method according to claim 7, wherein, if the first operation is a steering operation, the step of determining, based on the first coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the second space comprises:
determining a first azimuth angle of the first device in the first space when the steering operation ends;
mapping the first azimuth angle to a second azimuth angle in the second space based on the first coordinate mapping relation; and
determining the second operation as instructing the second device, or an image sensing component of the second device, to turn to the second azimuth angle.
10. The method according to claim 7, wherein, if the first operation is a visual angle change operation, the step of determining, based on the first coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the second space comprises:
determining a first pitch angle of the first device in the first space when the visual angle change operation ends;
mapping the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relation; and
determining the second operation as instructing the image sensing component of the second device to turn to the second pitch angle.
11. The method according to claim 7, wherein, if the first operation is a height change operation, the step of determining, based on the first coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the second space comprises:
determining the maximum height or minimum height reached by the first device in the first space during the height change operation;
if the maximum height or the minimum height is respectively higher or lower than a highest threshold or a lowest threshold, mapping the difference between the maximum height or the minimum height and the corresponding highest threshold or lowest threshold to a height difference in the second space based on the first coordinate mapping relation; and
determining the second operation as instructing the second device to rise or descend by the height difference.
12. The method according to claim 5, further comprising:
if the first device leaves the first space during the first operation, sending a control instruction to the second device, to instruct the second device to stop executing the corresponding second operation and hover at its current position.
13. The method according to claim 12, further comprising:
if the first device returns to the first space, determining a third coordinate of the first device when returning to the first space;
mapping the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation; and
determining the second operation as an operation instructing the second device to move to the fourth coordinate.
14. The method according to claim 12, further comprising:
redetermining the second space associated with the second device;
determining a second coordinate mapping relation between the first space and the redetermined second space;
determining, based on the second coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be executed by the second device in the redetermined second space; and
sending a control instruction to the second device, to instruct the second device to execute the second operation.
15. The method according to claim 1, wherein the first device is a head-mounted display, and the second device is an unmanned aerial vehicle.
16. A first device for controlling a second device, comprising:
a space determining module, for determining a first space associated with the first device and a second space associated with the second device;
a first mapping relation determining module, for determining a first coordinate mapping relation between the first space and the second space;
a second operation determining module, for determining, based on the first coordinate mapping relation and according to a first operation of the first device in the first space, a second operation to be executed by the second device in the second space; and
an instruction sending module, for sending a control instruction to the second device, to instruct the second device to execute the second operation.
17. The first device according to claim 16, wherein the space determining module is further used to determine at least one of the following:
a position of one vertex of the first space and lengths of the first space along each coordinate axis;
positions of two vertices of the first space and lengths of the first space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions of three non-collinear vertices of the first space and a length of the first space in the direction of the normal of the plane formed by the three vertices; and
positions of at least four non-coplanar vertices of the first space.
18. The first device according to claim 16, wherein the space determining module is further used to determine at least one of the following:
a position of one vertex of the second space on a three-dimensional electronic map and lengths of the second space along each coordinate axis of the three-dimensional electronic map;
positions of two vertices of the second space on the three-dimensional electronic map and lengths of the second space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions of three non-collinear vertices of the second space on the three-dimensional electronic map and a length of the second space in the direction of the normal of the plane formed by the three vertices; and
positions of at least four non-coplanar vertices of the second space on the three-dimensional electronic map.
19. The first device according to claim 16, wherein the first mapping relation determining module is further used to:
set respective origins of the first space and the second space;
determine corresponding proportions of the lengths of the first space and the second space along each coordinate axis; and
determine, based on the origins and the corresponding proportions of the first space and the second space, the first coordinate mapping relation between coordinates in the first space and coordinates in the second space.
20. The first device according to claim 16, further comprising:
a first coordinate determining module, for determining a first coordinate of the first device in the first space when a synchronization activation operation is executed;
a second coordinate mapping module, for mapping the first coordinate to a second coordinate in the second space based on the first coordinate mapping relation; and
a synchronization activation instruction sending module, for sending a synchronization activation instruction to the second device, to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state.
21. The first device according to claim 20, further comprising:
a synchronization release instruction sending module, for sending a synchronization release instruction to the second device, to indicate that the second device is in a "synchronization released" state.
22. The first device according to claim 20, wherein the first operation comprises at least one of the following: a translation operation, a steering operation, a visual angle change operation, and a height change operation.
23. The first device according to claim 22, wherein, if the first operation is a translation operation, the second operation determining module is further used to:
determine a first translation route of the first device in the first space when executing the translation operation;
map the first translation route to a second translation route in the second space based on the first coordinate mapping relation; and
determine the second operation as an operation instructing the second device to move along the second translation route.
24. The first device according to claim 22, wherein, if the first operation is a steering operation, the second operation determining module is further used to:
determine a first azimuth angle of the first device in the first space when the steering operation ends;
map the first azimuth angle to a second azimuth angle in the second space based on the first coordinate mapping relation; and
determine the second operation as instructing the second device, or an image sensing component of the second device, to turn to the second azimuth angle.
25. The first device according to claim 22, wherein, if the first operation is a visual angle change operation, the second operation determining module is further used to:
determine a first pitch angle of the first device in the first space when the visual angle change operation ends;
map the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relation; and
determine the second operation as instructing the image sensing component of the second device to turn to the second pitch angle.
26. The first device according to claim 22, wherein, if the first operation is a height change operation, the second operation determining module is further used to:
determine the maximum height or minimum height reached by the first device in the first space during the height change operation;
if the maximum height or the minimum height is respectively higher or lower than a highest threshold or a lowest threshold, map the difference between the maximum height or the minimum height and the corresponding highest threshold or lowest threshold to a height difference in the second space based on the first coordinate mapping relation; and
determine the second operation as instructing the second device to rise or descend by the height difference.
27. The first device according to claim 20, further comprising:
an operation halt instruction sending module configured to, if the first device leaves the first space during the first operation, send an operation halt instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at its current position.
28. The first device according to claim 27, further comprising:
a third coordinate determining module configured to, if the first device returns to the first space, determine a third coordinate of the first device upon its return to the first space; and
a fourth coordinate mapping module configured to map the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation,
wherein the second operation determining module is further configured to determine, as the second operation, an operation of moving the second device to the fourth coordinate.
29. The first device according to claim 16, further comprising:
a second space redefining module configured to redefine a second space associated with the second device;
a mapping relation redefining module configured to determine a second coordinate mapping relation between the first space and the redefined second space; and
a second operation redefining module configured to determine, based on the second coordinate mapping relation and according to the first operation of the first device in the first space, the second operation to be performed by the second device in the redefined second space.
30. The first device according to claim 16, wherein the first device is a head-mounted display and the second device is an unmanned aerial vehicle.
31. A first device for controlling a second device, comprising:
a processor; and
a memory storing instructions that, when executed by the processor, cause the processor to:
determine a first space associated with the first device and a second space associated with the second device;
determine a first coordinate mapping relation between the first space and the second space;
based on the first coordinate mapping relation, determine, according to a first operation of the first device in the first space, a second operation to be performed by the second device in the second space; and
send a control instruction to the second device to instruct the second device to perform the second operation.
32. The first device according to claim 31, wherein the instructions, when executed by the processor, further cause the processor to determine at least one of:
a position of one vertex of the first space, and lengths of the first space along each coordinate axis;
positions of two vertices of the first space, and lengths of the first space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions of three non-collinear vertices of the first space, and a length of the first space along the normal of the plane formed by the three vertices; and
positions of at least four non-coplanar vertices of the first space.
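The alternatives in claim 32 each pin down a rectangular space; the simplest gives one vertex plus per-axis lengths, while the last gives four non-coplanar vertices. A minimal sketch of recovering the former from the latter, assuming the three edges leaving the first vertex run along the coordinate axes (a simplification; all names are illustrative, not from the patent):

```python
from dataclasses import dataclass

@dataclass
class Box:
    """An axis-aligned rectangular space: one vertex plus per-axis lengths."""
    origin: tuple
    lengths: tuple

def box_from_vertices(v0, vx, vy, vz):
    """Recover the Box from four non-coplanar vertices, assuming the three
    edges leaving v0 are axis-aligned (the claim itself does not require
    axis alignment)."""
    return Box(origin=v0,
               lengths=(abs(vx[0] - v0[0]), abs(vy[1] - v0[1]), abs(vz[2] - v0[2])))

# A 4 m x 5 m x 3 m room described by its corner vertices:
room = box_from_vertices((0, 0, 0), (4, 0, 0), (0, 5, 0), (0, 0, 3))
assert room.lengths == (4, 5, 3)
```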
33. The first device according to claim 31, wherein the instructions, when executed by the processor, further cause the processor to determine at least one of:
a position, on an electronic three-dimensional map, of one vertex of the second space, and lengths of the second space along each coordinate axis of the electronic three-dimensional map;
positions, on the electronic three-dimensional map, of two vertices of the second space, and lengths of the second space along the coordinate axes of a plane perpendicular to the line formed by the two vertices;
positions, on the electronic three-dimensional map, of three non-collinear vertices of the second space, and a length of the second space along the normal of the plane formed by the three vertices; and
positions, on the electronic three-dimensional map, of at least four non-coplanar vertices of the second space.
34. The first device according to claim 31, wherein the instructions, when executed by the processor, further cause the processor to:
set respective origins of the first space and the second space;
determine the ratio of the lengths of the first space and the second space along each coordinate axis; and
determine, based on the origins and the ratios, the first coordinate mapping relation between coordinates in the first space and coordinates in the second space.
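The construction in claim 34, two origins plus per-axis length ratios, amounts to a per-axis affine map between the spaces. A sketch under the assumption that the two spaces' axes are parallel (function names and the numeric values are illustrative):

```python
def make_mapping(origin1, origin2, scale):
    """Build a coordinate mapping in the manner of claim 34: a point's
    offset from the first space's origin is scaled per axis and added to
    the second space's origin."""
    def map_point(p):
        return tuple(o2 + (c - o1) * s
                     for c, o1, o2, s in zip(p, origin1, origin2, scale))
    return map_point

# A 4 m x 5 m x 3 m room scaled 100x onto a flight zone whose origin sits
# at map coordinate (1000, 2000, 0):
to_flight_space = make_mapping((0, 0, 0), (1000, 2000, 0), (100, 100, 100))
assert to_flight_space((2, 1, 1.5)) == (1200, 2100, 150)
```

With this construction, a head-mounted display moving 2 m in the room commands a 200 m displacement of the drone, which is how a small first space can drive motion across a much larger second space.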
35. The first device according to claim 31, wherein the instructions, when executed by the processor, further cause the processor to:
determine a first coordinate of the first device in the first space when a synchronization activation operation is performed;
map the first coordinate to a second coordinate in the second space based on the first coordinate mapping relation; and
send a synchronization activation instruction to the second device, to instruct the second device to move to the second coordinate and to indicate that the second device is in a "synchronized" state.
36. The first device according to claim 35, wherein the instructions, when executed by the processor, further cause the processor to:
send a synchronization cancellation instruction to the second device, to indicate that the second device is in a "desynchronized" state.
37. The first device according to claim 35, wherein the first operation comprises at least one of: a translation operation, a steering operation, a visual angle change operation, and a height change operation.
38. The first device according to claim 37, wherein if the first operation is a translation operation, the instructions, when executed by the processor, further cause the processor to:
determine a first translation route of the first device in the first space when the translation operation is performed;
map the first translation route to a second translation route in the second space based on the first coordinate mapping relation; and
determine, as the second operation, an operation of moving the second device along the second translation route.
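Claim 38's route mapping can be applied point-by-point to a sampled translation route. In this sketch a uniform per-axis scale stands in for the first coordinate mapping relation (the patent leaves the relation's exact form open; the function name and scale are illustrative):

```python
def map_route(route, scale=100.0):
    """Map each sampled position of the first device's translation route
    into the second space; a uniform per-axis scale stands in for the
    first coordinate mapping relation."""
    return [tuple(c * scale for c in p) for p in route]

# A short walk in the room becomes a proportionally larger flight path:
walk = [(0, 0, 0), (1, 0, 0), (1, 1, 0)]
assert map_route(walk) == [(0, 0, 0), (100, 0, 0), (100, 100, 0)]
```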
39. The first device according to claim 37, wherein if the first operation is a steering operation, the instructions, when executed by the processor, further cause the processor to:
determine a first azimuth angle of the first device in the first space when the steering operation ends;
map the first azimuth angle to a second azimuth angle in the second space based on the first coordinate mapping relation; and
determine, as the second operation, an operation of steering the second device, or an image sensing component of the second device, to the second azimuth angle.
40. The first device according to claim 37, wherein if the first operation is a visual angle change operation, the instructions, when executed by the processor, further cause the processor to:
determine a first pitch angle of the first device in the first space when the visual angle change operation ends;
map the first pitch angle to a second pitch angle in the second space based on the first coordinate mapping relation; and
determine, as the second operation, an operation of steering the image sensing component of the second device to the second pitch angle.
41. The first device according to claim 37, wherein if the first operation is a height change operation, the instructions, when executed by the processor, further cause the processor to:
determine a maximum height or a minimum height reached by the first device in the first space while the height change operation is performed;
if the maximum height is above an upper threshold, or the minimum height is below a lower threshold, map the difference between the maximum height or the minimum height and the corresponding threshold to a height difference in the second space based on the first coordinate mapping relation; and
determine, as the second operation, an operation of raising or lowering the second device by the height difference.
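The threshold logic of claims 26 and 41 maps only the overshoot beyond a height band, so head movement inside the band leaves the second device's altitude untouched. A sketch with illustrative thresholds and scale (none of these values come from the patent):

```python
def height_command(h_reached, upper=2.2, lower=0.5, scale=100.0):
    """Height change mapping in the manner of claim 41: only the part of
    the maximum/minimum height that exceeds the band [lower, upper] is
    scaled into a height difference for the second device."""
    if h_reached > upper:
        return ("ascend", (h_reached - upper) * scale)
    if h_reached < lower:
        return ("descend", (lower - h_reached) * scale)
    return ("hold", 0.0)

# Reaching 2.5 m against a 2.2 m upper threshold commands roughly a 30 m
# climb; any height inside the band commands nothing:
cmd, dh = height_command(2.5)
assert cmd == "ascend" and abs(dh - 30.0) < 1e-9
assert height_command(1.0) == ("hold", 0.0)
```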
42. The first device according to claim 35, wherein the instructions, when executed by the processor, further cause the processor to:
if the first device leaves the first space during the first operation, send a control instruction to the second device to instruct the second device to stop performing the corresponding second operation and hover at its current position.
43. The first device according to claim 42, wherein the instructions, when executed by the processor, further cause the processor to:
if the first device returns to the first space, determine a third coordinate of the first device upon its return to the first space;
map the third coordinate to a fourth coordinate in the second space based on the first coordinate mapping relation; and
determine, as the second operation, an operation of moving the second device to the fourth coordinate.
44. The first device according to claim 31, wherein the instructions, when executed by the processor, further cause the processor to:
redefine a second space associated with the second device;
determine a second coordinate mapping relation between the first space and the redefined second space;
based on the second coordinate mapping relation, determine, according to the first operation of the first device in the first space, the second operation to be performed by the second device in the redefined second space; and
send a control instruction to the second device to instruct the second device to perform the second operation.
45. The first device according to claim 31, wherein the first device is a head-mounted display and the second device is an unmanned aerial vehicle.
46. A computer-readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the method according to any one of claims 1 to 15.
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/084531 WO2018209557A1 (en) | 2017-05-16 | 2017-05-16 | Method and device for controlling device, and computer readable storage medium |
Publications (2)
Publication Number | Publication Date |
---|---|
CN108475064A true CN108475064A (en) | 2018-08-31 |
CN108475064B CN108475064B (en) | 2021-11-05 |
Family
ID=63266469
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201780004525.4A Expired - Fee Related CN108475064B (en) | 2017-05-16 | 2017-05-16 | Method, apparatus, and computer-readable storage medium for apparatus control |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108475064B (en) |
WO (1) | WO2018209557A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109062259A (en) * | 2018-10-31 | 2018-12-21 | 西安天问智能科技有限公司 | A kind of unmanned plane automatic obstacle-avoiding method and device thereof |
CN109395382A (en) * | 2018-09-12 | 2019-03-01 | 苏州蜗牛数字科技股份有限公司 | A kind of linear optimization method for rocking bar |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109799838B (en) * | 2018-12-21 | 2022-04-15 | 金季春 | Training method and system |
Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101000507A (en) * | 2006-09-29 | 2007-07-18 | 浙江大学 | Method for moving robot simultaneously positioning and map structuring at unknown environment
CN102589544A (en) * | 2012-01-10 | 2012-07-18 | 合肥工业大学 | Three-dimensional attitude acquisition method based on space characteristics of atmospheric polarization mode |
CN102749080A (en) * | 2012-06-18 | 2012-10-24 | 北京航空航天大学 | Unmanned aerial vehicle three-dimensional air route generation method based on hydrodynamics |
US20120293512A1 (en) * | 2011-05-19 | 2012-11-22 | Via Technologies, Inc. | Three-dimensional graphics clipping method, three-dimensional graphics displaying method, and graphics processing apparatus using the same |
US20130082978A1 (en) * | 2011-09-30 | 2013-04-04 | Microsoft Corporation | Omni-spatial gesture input |
CN103150309A (en) * | 2011-12-07 | 2013-06-12 | 清华大学 | Method and system for searching POI (Point of Interest) points of awareness map in space direction |
CN103226386A (en) * | 2013-03-13 | 2013-07-31 | 广东欧珀移动通信有限公司 | Gesture identification method and system based on mobile terminal |
CN103499346A (en) * | 2013-09-29 | 2014-01-08 | 大连理工大学 | Implementation method of ground station three-dimensional navigation map of small unmanned air vehicle |
EP2685336A1 (en) * | 2012-07-13 | 2014-01-15 | Honeywell International Inc. | Autonomous airspace flight planning and virtual airspace containment system |
JP2014059824A (en) * | 2012-09-19 | 2014-04-03 | Casio Comput Co Ltd | Function driving device, function driving method, and function driving program |
WO2015020540A1 (en) * | 2013-08-09 | 2015-02-12 | Fisher & Paykel Healthcare Limited | Asymmetrical nasal delivery elements and fittings for nasal interfaces |
CN104991561A (en) * | 2015-08-10 | 2015-10-21 | 北京零零无限科技有限公司 | Hand-held unmanned aerial vehicle recovery method and device and unmanned aerial vehicle |
CN105424024A (en) * | 2015-11-03 | 2016-03-23 | 葛洲坝易普力股份有限公司 | Spatial target position and orientation calibration method based on total station |
CN105607740A (en) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | Unmanned aerial vehicle control method and device based on computer vision |
CN105786011A (en) * | 2016-03-07 | 2016-07-20 | 重庆邮电大学 | Control method and control equipment for remote-controlled aerial vehicle |
CN106023657A (en) * | 2015-03-30 | 2016-10-12 | 国际商业机器公司 | Implementing A Restricted-Operation Region For Unmanned Vehicles |
CN106064378A (en) * | 2016-06-07 | 2016-11-02 | 南方科技大学 | The control method of a kind of unmanned plane mechanical arm and device |
CN106094863A (en) * | 2015-04-23 | 2016-11-09 | 鹦鹉无人机股份有限公司 | The system of unmanned plane is driven for immersion |
US20160328862A1 (en) * | 2015-05-06 | 2016-11-10 | Korea University Research And Business Foundation | Method for extracting outer space feature information from spatial geometric data |
CN106125747A (en) * | 2016-07-13 | 2016-11-16 | 国网福建省电力有限公司 | Based on the servo-actuated Towed bird system in unmanned aerial vehicle onboard the first visual angle mutual for VR |
CN106228615A (en) * | 2016-08-31 | 2016-12-14 | 陈昊 | Unmanned vehicle experiencing system based on augmented reality and experiential method thereof |
CN106227230A (en) * | 2016-07-09 | 2016-12-14 | 东莞市华睿电子科技有限公司 | A kind of unmanned aerial vehicle (UAV) control method |
CN106292679A (en) * | 2016-08-29 | 2017-01-04 | 电子科技大学 | The control method of wearable unmanned aerial vehicle (UAV) control equipment based on body-sensing |
CN205942090U (en) * | 2016-04-29 | 2017-02-08 | 深圳市大疆创新科技有限公司 | Wearable equipment and unmanned aerial vehicle system |
WO2017043687A1 (en) * | 2015-09-07 | 2017-03-16 | 한국항공대학교산학협력단 | L-v-c operation system, and unmanned air vehicle training/experimental method using same |
CN206031749U (en) * | 2016-08-31 | 2017-03-22 | 佛山世寰智能科技有限公司 | Unmanned aerial vehicle's four -axis rotor fixed knot constructs |
CN106569596A (en) * | 2016-10-20 | 2017-04-19 | 努比亚技术有限公司 | Gesture control method and equipment |
CN106664401A (en) * | 2014-08-19 | 2017-05-10 | 索尼互动娱乐股份有限公司 | Systems and methods for providing feedback to a user while interacting with content |
CN107065914A (en) * | 2013-07-05 | 2017-08-18 | 深圳市大疆创新科技有限公司 | The flight householder method and device of unmanned vehicle |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8577535B2 (en) * | 2010-03-31 | 2013-11-05 | Massachusetts Institute Of Technology | System and method for providing perceived first-order control of an unmanned vehicle |
EP4099136A1 (en) * | 2013-02-22 | 2022-12-07 | Sony Group Corporation | Head- mounted display and image display device |
WO2017003538A2 (en) * | 2015-04-14 | 2017-01-05 | Tobin Fisher | System for authoring, executing, and distributing unmanned aerial vehicle flight-behavior profiles |
CN205216197U (en) * | 2015-12-07 | 2016-05-11 | 南京邮电大学 | Model aeroplane and model ship aircraft safety remote control system based on active gesture detects |
CN106155069A (en) * | 2016-07-04 | 2016-11-23 | 零度智控(北京)智能科技有限公司 | UAV Flight Control device, method and remote terminal |
2017
- 2017-05-16 CN CN201780004525.4A patent/CN108475064B/en not_active Expired - Fee Related
- 2017-05-16 WO PCT/CN2017/084531 patent/WO2018209557A1/en active Application Filing
Non-Patent Citations (6)
Title |
---|
MAIER, M et al.: "Landing of VTOL UAVs Using a Stationary Robot Manipulator: A New Approach for Coordinated Control", 54th IEEE Conference on Decision and Control *
ZHANG, JS et al.: "A Space-Time Network-Based Modeling Framework for Dynamic Unmanned Aerial Vehicle Routing in Traffic Incident Monitoring Applications", Sensors *
左善超: "Modeling and 3D Visual Scene Simulation of an Aircraft" (in Chinese), China Master's Theses Full-text Database, Engineering Science and Technology II *
康丽坤 et al.: "Practical Course in Advanced Mathematics" (in Chinese), Beijing University of Technology Press, 30 September 2016 *
张腾: "Attitude Control of a Small Unmanned Aerial Vehicle Based on Smartphone Motion Sensing" (in Chinese), China Master's Theses Full-text Database, Engineering Science and Technology II *
李杰: "Research on UAV Motion Planning and Guidance Methods Based on a Geometric Mechanics Model" (in Chinese), China Doctoral Dissertations Full-text Database, Engineering Science and Technology II *
Also Published As
Publication number | Publication date |
---|---|
CN108475064B (en) | 2021-11-05 |
WO2018209557A1 (en) | 2018-11-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN104808675B (en) | Body-sensing flight control system and terminal device based on intelligent terminal | |
US9947230B2 (en) | Planning a flight path by identifying key frames | |
US9928649B2 (en) | Interface for planning flight path | |
CN108351653B (en) | System and method for UAV flight control | |
EP3783454B1 (en) | Systems and methods for adjusting uav trajectory | |
JP6228679B2 (en) | Gimbal and gimbal simulation system | |
CN110300938A (en) | System and method for exempting from the interaction of controller formula user's unmanned plane | |
CN106125747A (en) | Based on the servo-actuated Towed bird system in unmanned aerial vehicle onboard the first visual angle mutual for VR | |
CN107223199A (en) | Air navigation aid and equipment based on three-dimensional map | |
CN106716272A (en) | Systems and methods for flight simulation | |
CN109388150A (en) | Multi-sensor environment map structuring | |
CN106227231A (en) | The control method of unmanned plane, body feeling interaction device and unmanned plane | |
US11804052B2 (en) | Method for setting target flight path of aircraft, target flight path setting system, and program for setting target flight path | |
WO2018187916A1 (en) | Cradle head servo control method and control device | |
CN108475064A (en) | Method, equipment and computer readable storage medium for equipment control | |
CN110209202A (en) | A kind of feas ible space generation method, device, aircraft and aerocraft system | |
KR20170090888A (en) | Apparatus for unmanned aerial vehicle controlling using head mounted display | |
CN205983222U (en) | Unmanned aerial vehicle machine carries hardware connection structure of first visual angle nacelle device | |
WO2020209167A1 (en) | Information processing device, information processing method, and program | |
Mahayuddin et al. | Comparison of human pilot (remote) control systems in multirotor unmanned aerial vehicle navigation | |
O'Keeffe et al. | Oculus rift application for training drone pilots | |
WO2024000189A1 (en) | Control method, head-mounted display device, control system and storage medium | |
WO2022094808A1 (en) | Photographing control method and apparatus, unmanned aerial vehicle, device, and readable storage medium | |
CN108332738A (en) | Unmanned plane blind guiding system and blind guiding system | |
KR102263227B1 (en) | Method and apparatus for control unmanned vehicle using artificial reality with relative navigation information |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||
CF01 | Termination of patent right due to non-payment of annual fee |
Granted publication date: 20211105 |