CN106227231A - Control method for an unmanned aerial vehicle, motion-sensing interaction device, and unmanned aerial vehicle - Google Patents
Control method for an unmanned aerial vehicle, motion-sensing interaction device, and unmanned aerial vehicle Download PDF Info
- Publication number
- CN106227231A CN106227231A CN201610562193.9A CN201610562193A CN106227231A CN 106227231 A CN106227231 A CN 106227231A CN 201610562193 A CN201610562193 A CN 201610562193A CN 106227231 A CN106227231 A CN 106227231A
- Authority
- CN
- China
- Prior art keywords
- motion-sensing interaction
- instruction
- unmanned aerial vehicle
- motion sensing
- image data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The invention discloses a control method for an unmanned aerial vehicle (UAV), a motion-sensing interaction device, and a UAV. The control method includes: a motion-sensing interaction control device captures an image of a target human body and performs posture or action recognition on the image; obtains the motion-sensing interaction command corresponding to the recognition result; and sends the motion-sensing interaction command to the UAV, so that the UAV executes the command. In this way, the invention not only enriches the existing ways of controlling a UAV, but also frees the user from controlling the UAV through a physical device, making operation more convenient, reducing the operating error rate, and giving the user a richer experience.
Description
Technical field
The present invention relates to the technical field of intelligent terminals, and in particular to a control method for an unmanned aerial vehicle, a motion-sensing interaction device, and an unmanned aerial vehicle.
Background art
With the emergence of virtual reality, augmented reality, and unmanned aerial vehicles, people have gained ever more new visual experiences from different viewing angles.
At present, however, a UAV is generally controlled through a remote controller or by manual touch. This brings considerable inconvenience to the user's operation, and the operating error rate is high.
Summary of the invention
The technical problem mainly solved by the present invention is to provide a control method for a UAV, a motion-sensing interaction device, and a UAV that not only enrich the existing ways of controlling a UAV, but also free the user from controlling the UAV through a physical device, making operation more convenient, reducing the operating error rate, and giving the user a richer experience.
To solve the above technical problem, the technical solution adopted by the present invention is to provide a control method for a UAV, the control method including:
the motion-sensing interaction control device captures an image of a target human body and performs posture or action recognition on the image;
obtains the motion-sensing interaction command corresponding to the recognition result;
sends the motion-sensing interaction command to the UAV, so that the UAV executes the command.
The control method further includes: receiving first image data and second image data returned by the UAV after it executes the motion-sensing interaction command, where the first image data is captured by a first camera of the UAV and the second image data by a second camera of the UAV; and synthesizing the first and second image data into a 3D image and displaying the 3D image.
After the step of sending the motion-sensing interaction command to the UAV, the method further includes: converting the motion-sensing interaction command into human-body auxiliary instructions; and, according to the auxiliary instructions, providing the target human body with a synchronized motion experience matching the action the UAV performs in the 3D image while executing the command.
The step in which the motion-sensing interaction control device captures the target human body image and performs posture or action recognition specifically includes:
capturing the target human body image;
performing denoising, background separation, skeleton extraction, and gesture recognition on the target human body image to obtain a recognition result of the posture or action.
The motion-sensing interaction command includes at least one of a flight attitude adjustment, a speed adjustment, a heading adjustment, and an adjustment of the aperture, focal length, shooting direction, or shooting mode of the first camera and/or the second camera.
To solve the above technical problem, another technical solution adopted by the present invention is to provide a control method for a UAV, the control method including:
the UAV receives a motion-sensing interaction command sent by a motion-sensing interaction device, where the command is obtained by the device after capturing an image of a target human body and performing posture or action recognition on the image;
executes the motion-sensing interaction command.
The control method further includes: obtaining first image data and second image data of the current environment through a first camera and a second camera, respectively; and sending the first and second image data to the motion-sensing interaction device, so that the device synthesizes them into a 3D image and displays it.
To solve the above technical problem, yet another technical solution adopted by the present invention is to provide a motion-sensing interaction device including an image acquisition unit, a control unit, and a sending unit. The image acquisition unit captures an image of a target human body and performs posture or action recognition on the image; the control unit obtains the motion-sensing interaction command corresponding to the recognition result; and the sending unit sends the motion-sensing interaction command to the UAV.
The motion-sensing interaction device further includes a receiving unit and a display unit. The receiving unit receives first image data and second image data returned by the UAV after executing the motion-sensing interaction command, where the first image data is captured by a first camera of the UAV and the second image data by a second camera of the UAV. The display unit synthesizes the first and second image data into a 3D image and displays the 3D image.
To solve the above technical problem, yet another technical solution adopted by the present invention is to provide a UAV including a receiving unit and a control unit. The receiving unit receives the motion-sensing interaction command sent by a motion-sensing interaction device, where the command is obtained by the device after capturing an image of a target human body and performing posture or action recognition on the image; the control unit executes the motion-sensing interaction command.
The beneficial effects of the invention are as follows. Unlike the prior art, the motion-sensing interaction control device of this embodiment captures an image of a target human body, performs posture or action recognition on it, obtains the motion-sensing interaction command corresponding to the recognition result, and sends the command to the UAV so that the UAV executes it. Controlling the UAV through body motion in this way not only enriches the existing control modes, but also frees the user from operating a physical device, making operation more convenient, reducing the operating error rate, and giving the user a richer experience. Moreover, while improving the user experience, it can effectively simplify the hardware required by the UAV control system and reduce production cost.
Brief description of the drawings
Fig. 1 is a structural diagram of an embodiment of the UAV control system of the present invention;
Fig. 2 is a flow diagram of an embodiment of the UAV control method of the present invention;
Fig. 3 is a flow diagram of another embodiment of the UAV control method of the present invention;
Fig. 4 is a flow diagram of a further embodiment of the UAV control method of the present invention;
Fig. 5 is a flow diagram of a further embodiment of the UAV control method of the present invention;
Fig. 6 is a structural diagram of an embodiment of the motion-sensing interaction device of the present invention;
Fig. 7 is a structural diagram of another embodiment of the motion-sensing interaction device of the present invention;
Fig. 8 is a structural diagram of an embodiment of the UAV of the present invention;
Fig. 9 is a structural diagram of another embodiment of the UAV of the present invention.
Detailed description of the invention
Referring to Fig. 1, Fig. 1 is a structural diagram of an embodiment of the UAV control system of the present invention. As shown in Fig. 1, the control system of this embodiment includes a motion-sensing interaction device 101 and a UAV 102, connected to each other wirelessly.
The motion-sensing interaction device 101 can establish a one-to-one wireless connection with a UAV 102, or it can establish wireless connections with multiple UAVs 102 by assigning an ID to each UAV.
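The one-to-many pairing described above can be sketched in code. This is an illustrative Python sketch only; the class and method names are assumptions, not part of the patent:

```python
# Hypothetical sketch: one motion-sensing controller paired with several
# UAVs by assigning each a unique ID, as the embodiment describes.

class MotionController:
    def __init__(self):
        self.links = {}      # uav_id -> paired UAV
        self._next_id = 1

    def pair(self, uav):
        """Assign a fresh ID and register the wireless link."""
        uav_id = self._next_id
        self._next_id += 1
        self.links[uav_id] = uav
        return uav_id

    def send(self, uav_id, command):
        """Route a motion-sensing command to one paired UAV."""
        self.links[uav_id].execute(command)

class UAV:
    def __init__(self):
        self.last_command = None

    def execute(self, command):
        self.last_command = command

controller = MotionController()
drone_a, drone_b = UAV(), UAV()
id_a = controller.pair(drone_a)
id_b = controller.pair(drone_b)
controller.send(id_b, "hover")   # only drone_b receives the command
```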
To improve the realistic visual experience a UAV brings to the user, and to give the user more active control while watching, the control system of this embodiment combines the motion-sensing interaction device 101 and the UAV over a wireless connection and controls the UAV through body motion, improving the user experience.
Specifically, the motion-sensing interaction device 101 captures an image of a target human body, performs posture or action recognition on the image, obtains the motion-sensing interaction command corresponding to the recognition result, and sends the command to the UAV 102. The UAV 102 receives and executes the command.
In addition, the UAV 102 obtains first image data and second image data of the current environment through a first camera and a second camera, respectively, and sends both sets of image data to the motion-sensing interaction device 101.
The motion-sensing interaction device 101 receives the first and second image data, synthesizes them into a 3D image, and displays the 3D image.
To explain the above working process more clearly, refer further to Fig. 2, which is a flow diagram of an embodiment of the UAV control method of the present invention. The control method of this embodiment is executed by the motion-sensing interaction device in Fig. 1 and comprises the following steps:
201: the motion-sensing interaction control device captures an image of a target human body and performs posture or action recognition on the image.
To realize motion-sensing interaction, the device first captures an image of the target human body, i.e. the user. Specifically, it first obtains the user's image information, which is depth image information comprising at least one of a depth photograph and a video. The image information is then filtered, denoised, and smoothed, after which the background is separated from the target human body image and the target human body image is extracted from the image information.
After the image information is obtained, skeleton extraction is performed on the target human body image, and the posture, gesture, or action is recognized. The device first discriminates the effective information, which includes recognizing effective postures or actions, such as the user swinging the left or right arm or the angle reached by a shoulder, and obtains a recognition result of the posture or action; no limitation is imposed here.
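The pipeline above (denoise, background separation, skeleton extraction, pose recognition) can be sketched as a chain of stages. This is a minimal illustrative Python sketch with placeholder stage implementations; the function names, the toy depth frame, and the pose labels are all assumptions, not the patent's actual algorithms:

```python
# Minimal sketch of the recognition pipeline, operating on a tiny
# depth frame (a 2D list of depth samples). Each stage is a placeholder.

def denoise(depth_frame):
    # Placeholder smoothing: clamp obviously invalid (negative) samples.
    return [[max(d, 0) for d in row] for row in depth_frame]

def separate_background(depth_frame, threshold=100):
    # Keep only samples closer than the threshold (the user's body);
    # everything else is treated as background and zeroed out.
    return [[d if 0 < d < threshold else 0 for d in row] for row in depth_frame]

def extract_skeleton(body_frame):
    # Placeholder: treat every foreground sample as a "joint" coordinate.
    return [(r, c) for r, row in enumerate(body_frame)
            for c, d in enumerate(row) if d > 0]

def recognize_pose(joints):
    # Placeholder classifier: arm raised if any joint sits in the top row.
    return "arm_raised" if any(r == 0 for r, _ in joints) else "neutral"

frame = [[0, 50, 0],
         [0, 60, 0],
         [0, 70, 0]]
pose = recognize_pose(extract_skeleton(separate_background(denoise(frame))))
```

A production system would of course use a real depth sensor SDK and a trained classifier for the last stage; the sketch only shows how the stages compose.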
202: obtain the motion-sensing interaction command corresponding to the recognition result.
UAV types include multi-rotor UAVs, fixed-wing UAVs, and others. Different UAVs have different power plants and different control methods, so the motion-sensing interaction commands they can execute may differ, and a given UAV type can recognize multiple commands, such as flight attitude adjustment, speed adjustment, and heading adjustment commands. Specifically, a multi-rotor UAV may support hover, retreat, fly-along-track, and autonomous navigation commands, while a fixed-wing UAV may support acceleration/deceleration, pitch, yaw, and roll commands; no limitation is imposed here.
After obtaining the recognition result of the posture or action, the motion-sensing interaction device looks up the motion-sensing interaction command corresponding to the recognition result according to a preset correspondence.
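The preset correspondence between a recognition result and a command can be as simple as a lookup table. This is an illustrative Python sketch; the gesture names and command strings are assumptions, not defined by the patent:

```python
# Hypothetical preset correspondence: recognized pose -> UAV command.

GESTURE_TO_COMMAND = {
    "both_arms_raised": "take_off",
    "arms_level":       "hover",
    "lean_left":        "yaw_left",
    "lean_right":       "yaw_right",
    "arms_lowered":     "land",
}

def command_for(recognition_result):
    # Unrecognized poses map to no command rather than a guess.
    return GESTURE_TO_COMMAND.get(recognition_result)

cmd = command_for("lean_left")
ignored = command_for("shrug")
```

Keeping the mapping in a table makes it easy to customize per UAV type (multi-rotor vs. fixed-wing), as the paragraph above notes.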
In other embodiments, to further improve the accuracy of command acquisition, motion sensing can be combined with touch control to issue the command: after the device obtains a motion-sensing interaction command from the recognition result, it can first display the command on its screen, and the user can confirm it through a touch key to achieve more accurate control; no limitation is imposed here.
203: send the motion-sensing interaction command to the UAV, so that the UAV executes the command.
For example, if the command currently sent by the motion-sensing interaction device is a circling command, the UAV, on receiving it, changes its current flight mode to the circling mode. If the command sent is an acceleration command, the UAV accelerates from its current flight speed after receiving it.
In another embodiment, before the UAV has taken off, it can also be commanded to take off directly through a motion-sensing interaction command. For example, the motion-sensing interaction device and the UAV are connected wirelessly. After both the UAV and the device are powered on and enter the working state, the device captures image information of the user's operation, extracts the target human body from the image information, performs posture or action recognition on the target human body image, determines that the recognition result corresponds to a take-off command for the UAV, and wirelessly sends the take-off command to the corresponding UAV. After receiving the take-off command, the UAV starts its power plant and takes off.
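The UAV-side handling just described can be sketched as a small state machine. This is an illustrative Python sketch under assumed state and command names; it is not the patent's actual flight controller:

```python
# Hypothetical UAV-side command execution: the UAV adjusts its flight
# state on receiving a motion-sensing command over the wireless link.

class UAVController:
    def __init__(self):
        self.state = "idle"    # on the ground, power plant stopped
        self.speed = 0.0

    def execute(self, command):
        if command == "take_off" and self.state == "idle":
            self.state = "flying"   # start the power plant, lift off
            self.speed = 1.0
        elif command == "hover" and self.state == "flying":
            self.speed = 0.0        # hold position (circling/hover mode)
        elif command == "accelerate" and self.state == "flying":
            self.speed += 1.0       # speed up along the current heading

uav = UAVController()
uav.execute("take_off")
uav.execute("accelerate")
```

Guarding each transition on the current state mirrors the embodiment: a take-off command only has effect before flight, while speed commands only apply in flight.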
In another embodiment, to let the user view the flight picture more intuitively, that is, to obtain the scenery or environmental information around the UAV through the UAV itself and feel the thrill the UAV brings under motion-sensing control in different attitudes, on different paths, or at different speeds, thereby achieving a first-person visual experience, refer further to Fig. 3. After step 303 (sending the motion-sensing interaction command to the UAV so that the UAV executes it), the method further comprises the following steps:
304: receive first image data and second image data returned by the UAV after it executes the motion-sensing interaction command; the first image data is captured by a first camera of the UAV, and the second image data by a second camera of the UAV.
Specifically, after receiving the command sent by the device, the UAV shoots the current environment through the first and second cameras mounted in parallel on its gimbal. The motion-sensing interaction command can also include at least one of an aperture adjustment, a focal-length adjustment, or a shooting-mode adjustment for the first and/or second camera, so as to obtain clearer image information or image information that better meets the user's requirements.
After the UAV obtains the first image data through the first camera and the second image data through the second camera, it wirelessly returns both to the motion-sensing interaction device.
It should be noted that the UAV's shooting of the first and second image data and their return to the motion-sensing interaction device are synchronized in real time.
Accordingly, the motion-sensing interaction device receives the first and second image data.
305: synthesize the first and second image data into a 3D image and display the 3D image.
Since data shot by a camera is generally 2D, in order to further provide the user with a more realistic and striking effect, the motion-sensing interaction device, after receiving the first and second image data, performs stereo fusion on them, synthesizing the two into a 3D image. It stores the 3D image and displays it on its screen.
In another specific embodiment, the motion-sensing interaction device takes the form of a VR headset capable of showing 3D video. The two sides of the headset carry a left screen and a right screen: the left screen displays the first image data and delivers it to the user's left eye, and the right screen displays the second image data and delivers it to the user's right eye, so that the user perceives a 3D effect; no limitation is imposed here.
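The headset-style presentation above amounts to routing the first camera's frame to the left half of the display and the second camera's frame to the right half. This is a minimal illustrative Python sketch; real devices would also rectify and time-align the two frames, which is omitted here:

```python
# Hypothetical side-by-side stereo composition of the two camera frames.
# Images are represented as 2D lists (rows of pixel values).

def side_by_side(left_img, right_img):
    """Concatenate two equally sized 2D images horizontally, row by row."""
    assert len(left_img) == len(right_img), "frames must have equal height"
    return [l_row + r_row for l_row, r_row in zip(left_img, right_img)]

left  = [[1, 2],
         [3, 4]]
right = [[5, 6],
         [7, 8]]
stereo = side_by_side(left, right)
```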
In any of the above embodiments, so that the user not only has a visual experience but at the same time also a tactile bodily experience, that is, a true first-person flight perspective, the motion-sensing interaction device, after sending the command to the UAV, further converts the command into human-body auxiliary instructions and, according to these instructions, provides the target human body (the user) with a synchronized motion experience matching the action the UAV performs in the 3D image while executing the command.
The human-body auxiliary instructions include at least one of a turning-flight instruction, a hyper-gravity/weightlessness instruction, and an airflow instruction.
For example, if the command the device currently sends to the UAV is a landing command, the device converts the landing command into two human-body auxiliary instructions: a weightlessness instruction and an airflow instruction. Through corresponding hardware, or through VR or AR technology, it then gives the user the sensations of weightlessness and airflow that the UAV presents during its descent, synchronized with the 3D image the user is watching. As another example, when the command sent to the UAV is a left-turn command, the device converts it into a left-turn auxiliary instruction applied to the user, and while the UAV executes the command, hardware makes the user feel the left-turn motion; no limitation is imposed here.
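The conversion from a motion-sensing command to human-body auxiliary instructions can likewise be expressed as a lookup. This is an illustrative Python sketch; the command and cue names are assumptions drawn from the examples above, not an exhaustive mapping from the patent:

```python
# Hypothetical conversion: UAV command -> human-body auxiliary cues
# (the landing example above maps to weightlessness plus airflow).

COMMAND_TO_CUES = {
    "land":       ["weightlessness", "airflow"],
    "turn_left":  ["turn_left_motion"],
    "accelerate": ["airflow"],
}

def haptic_cues(command):
    # Commands with no physical analogue produce no cues.
    return COMMAND_TO_CUES.get(command, [])

cues = haptic_cues("land")
```

The cue list would then drive the haptic hardware or the VR/AR presentation in sync with the displayed 3D image.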
Unlike the prior art, the motion-sensing interaction control device of this embodiment captures an image of a target human body, performs posture or action recognition on it, obtains the motion-sensing interaction command corresponding to the recognition result, and sends the command to the UAV so that the UAV executes it. Controlling the UAV through body motion in this way not only enriches the existing control modes, but also frees the user from operating a physical device, making operation more convenient, reducing the operating error rate, and giving the user a richer experience. Moreover, while improving the user experience, it can effectively simplify the hardware required by the UAV control system and reduce production cost.
In addition, by receiving the first image data returned by the UAV's first camera and the second image data returned by its second camera and synthesizing them into a 3D display, the device lets the user view the flight picture more intuitively and feel the thrill the UAV brings under motion-sensing control in different attitudes, on different paths, or at different speeds, improving the realism of the picture and further improving the user experience.
In addition, after sending the motion-sensing interaction command to the UAV, the device further converts the command into human-body auxiliary instructions and, according to these instructions, provides the user with a synchronized motion experience matching the action the UAV performs in the 3D image while executing the command. This gives the user a tactile bodily experience alongside the visual one, achieving a true first-person flight perspective.
Referring to Fig. 4, Fig. 4 is a flow diagram of a further embodiment of the UAV control method of the present invention. The control method of this embodiment is implemented by the UAV 102 in Fig. 1 and specifically comprises the following steps:
401: the UAV receives a motion-sensing interaction command sent by a motion-sensing interaction device; the command is obtained by the device after capturing an image of a target human body and performing posture or action recognition on the image.
To realize motion-sensing interaction, the device first captures an image of the target human body, i.e. the user. Specifically, it first obtains the user's image information, which is depth image information comprising at least one of a depth photograph and a video. The image information is then filtered, denoised, and smoothed, after which the background is separated from the target human body image and the target human body image is extracted from the image information.
After the image information is obtained, skeleton extraction is performed on the target human body image, and the posture, gesture, or action is recognized. The device first discriminates the effective information, which includes recognizing effective postures or actions, such as the user swinging the left or right arm or the angle reached by a shoulder, and obtains a recognition result of the posture or action; no limitation is imposed here.
After obtaining the recognition result of the posture or action, the motion-sensing interaction device looks up the motion-sensing interaction command corresponding to the recognition result according to a preset correspondence.
In other embodiments, to further improve the accuracy of command acquisition, motion sensing can be combined with touch control to issue the command: after the device obtains a motion-sensing interaction command from the recognition result, it can first display the command on its screen, and the user can confirm it through a touch key to achieve more accurate control; no limitation is imposed here.
After obtaining the motion-sensing interaction command, the motion-sensing interaction device sends the command to the UAV, and accordingly the UAV receives it.
402: perform the instruction of described body feeling interaction.
Such as, the body feeling interaction instruction that current body feeling interaction device sends is spiraling, and unmanned plane is receiving this body
After sense interactive instruction, change existing flying method, be adjusted to the pattern of spiraling.If current body feeling interaction device sends
Body feeling interaction instruction for assisted instruction, unmanned plane, after receiving the instruction of this body feeling interaction, adds in existing flight speed
Speed flight.
In another embodiment, before the unmanned plane has taken off, it can also be controlled to take off directly by a body feeling interaction instruction. For example, the body feeling interaction device and the unmanned plane are wirelessly connected. After the unmanned plane and the body feeling interaction device are powered on and enter a working state, the body feeling interaction device collects image information of the user's operation, extracts a target human body image from the image information, performs posture or action recognition on the target human body image, and determines that the recognition result corresponds to a take-off instruction for the unmanned plane. The body feeling interaction device wirelessly transmits the take-off instruction to the corresponding unmanned plane; after receiving it, the unmanned plane starts its power plant and takes off.
In another embodiment, to let the user view the flight picture more intuitively, that is, to obtain the scenery and environment information around the unmanned plane through the unmanned plane itself and to experience the excitement brought by its different attitudes, paths, and speeds under the control of body feeling interaction instructions (a first-person visual experience), refer further to Fig. 5. After step 502, executing the body feeling interaction instruction, the method further includes the following steps:
503: Obtain first image data and second image data of the current environment through a first camera and a second camera, respectively.
Specifically, after receiving the body feeling interaction instruction sent by the body feeling interaction device, the unmanned plane photographs the current environment through the first camera and the second camera arranged in parallel on its gimbal. The body feeling interaction instruction may also include at least one of an aperture adjustment instruction, a focus adjustment instruction, or a shooting-mode adjustment instruction for the first camera and/or the second camera, so as to obtain clearer image information or image information that better meets the user's requirements.
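An instruction that bundles aperture, focus, and shooting-mode adjustments for either camera could be applied as sketched below. The field names and default values are assumptions; the patent does not define a concrete message format.

```python
from dataclasses import dataclass

@dataclass
class CameraSettings:
    aperture: float = 2.8   # f-number (illustrative default)
    focus_mm: float = 35.0  # focus setting (illustrative default)
    mode: str = "photo"     # shooting mode, e.g. "photo" or "video"

def apply_adjustments(settings, instruction):
    """Apply any aperture/focus/mode adjustments carried by a body
    feeling interaction instruction; unrelated fields are ignored."""
    if "aperture" in instruction:
        settings.aperture = instruction["aperture"]
    if "focus_mm" in instruction:
        settings.focus_mm = instruction["focus_mm"]
    if "mode" in instruction:
        settings.mode = instruction["mode"]
    return settings
```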
504: Send the first image data and the second image data to the body feeling interaction device, so that the body feeling interaction device synthesizes the first image data and the second image data into a 3D image and displays it.
After the unmanned plane obtains the first image data through the first camera and the second image data through the second camera, it wirelessly returns the first image data and the second image data to the body feeling interaction device.
It should be noted that the process in which the unmanned plane shoots the first and second image data and the process in which it returns them to the body feeling interaction device are synchronized in real time.
The body feeling interaction device receives the first image data and the second image data, synthesizes them into a 3D image, and displays the 3D image.
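The patent does not specify how the two camera views are fused into a 3D image. One common presentation for a left/right pair is side-by-side packing, an input format many stereoscopic and VR viewers accept; a minimal sketch with images as row-major lists:

```python
def side_by_side(left, right):
    """Pack two equal-height images (row-major lists of pixel rows)
    into one side-by-side stereo frame. This is only one possible
    synthesis; the patent leaves the 3D method unspecified."""
    if len(left) != len(right):
        raise ValueError("left and right frames must have equal height")
    return [lrow + rrow for lrow, rrow in zip(left, right)]
```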
Different from the prior art, the unmanned plane of this embodiment receives a body feeling interaction instruction sent by a body feeling interaction device, where the instruction is obtained by the body feeling interaction device by collecting a target human body image and performing posture or action recognition on it, and then executes the instruction. Controlling the unmanned plane through body-sensing actions not only enriches the control modes available in the prior art but also relieves the user of controlling the unmanned plane through a physical device, which makes operation easier, reduces the rate of operational errors, and brings a richer experience to the user. Moreover, while improving the user experience, it can effectively simplify the hardware required by the unmanned plane control system and reduce production cost.
In addition, by returning the first image data shot by the first camera and the second image data shot by the second camera to the body feeling interaction device, so that the device synthesizes the two into a 3D display, the user can view the flight picture more intuitively and experience the excitement brought by the unmanned plane's different attitudes, paths, and speeds under the control of body feeling interaction instructions, which improves the realism of the picture and further enhances the user experience.
Referring to Fig. 6, Fig. 6 is a structural schematic diagram of an embodiment of the body feeling interaction device of the present invention. As shown in Fig. 6, the body feeling interaction device of this embodiment includes an image acquisition unit 601, a control unit 602, and a transmitting unit 603.
The image acquisition unit 601 is configured to collect a target human body image and perform posture or action recognition on the target human body image.
To implement a body feeling interaction operation, the image acquisition unit 601 collects a target human body image of the user. Specifically, the image acquisition unit 601 first obtains image information of the user, where the image information is depth image information that includes at least one of a depth photograph and a video. The image information is then filtered, denoised, and smoothed, after which the background is separated from the target human body image and the target human body image is extracted from the image information.
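The preprocessing chain described above (filtering and smoothing the depth image, then separating the target body from the background) can be sketched in miniature. The 3x3 mean filter and the depth band are illustrative assumptions; a real pipeline would use proper denoising and segmentation.

```python
def smooth3x3(depth):
    """3x3 mean filter over a depth map given as a list of rows; border
    pixels are left unchanged. Stands in for the filter/denoise step."""
    h, w = len(depth), len(depth[0])
    out = [row[:] for row in depth]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(depth[y + dy][x + dx]
                            for dy in (-1, 0, 1)
                            for dx in (-1, 0, 1)) / 9.0
    return out

def foreground_mask(depth, near, far):
    """Background separation: mark pixels whose depth lies in the band
    where the target body is expected (near/far are assumed bounds)."""
    return [[near <= d <= far for d in row] for row in depth]
```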
After obtaining the image information, the image acquisition unit 601 further performs skeleton extraction on the target human body image and recognizes postures, gestures, or actions. The body feeling interaction device first screens the image information for valid information, which includes recognizing valid postures or actions, for example the user swinging the left or right arm or the angle to which a shoulder is raised, and thereby obtains a recognition result of the posture or action; no limitation is imposed here.
The control unit 602 is configured to obtain the body feeling interaction instruction corresponding to the recognition result.
Because unmanned planes come in many types, including multi-rotor unmanned planes, fixed-wing unmanned planes, and others, different unmanned planes have different power plants and different control methods, so the body feeling interaction instructions they can execute may differ. An unmanned plane of a given type can also recognize multiple body feeling interaction instructions, such as flight attitude adjustment instructions, speed adjustment instructions, and heading adjustment instructions. Specifically, a multi-rotor unmanned plane may support hovering instructions, backward instructions, track-following flight instructions, automatic navigation instructions, and the like; a fixed-wing unmanned plane may support acceleration and deceleration instructions, pitch instructions, yaw instructions, roll instructions, and the like; no limitation is imposed here.
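The point that different unmanned-plane types execute different instruction sets can be made concrete with a small validation table built from the examples in the text; the exact contents are assumptions.

```python
# Instruction sets per unmanned-plane type, based on the examples in
# the description (illustrative, not exhaustive).
SUPPORTED_INSTRUCTIONS = {
    "multi_rotor": {"hover", "backward", "track_flight", "auto_navigate"},
    "fixed_wing": {"accelerate", "decelerate", "pitch", "yaw", "roll"},
}

def can_execute(uav_type, instruction):
    """Return True if the given unmanned-plane type recognizes the
    body feeling interaction instruction."""
    return instruction in SUPPORTED_INSTRUCTIONS.get(uav_type, set())
```

A control unit could consult such a table before sending an instruction, rejecting gestures the connected aircraft cannot act on.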
After obtaining the recognition result of the posture or action, the control unit 602 obtains the body feeling interaction instruction corresponding to that recognition result according to a preset correspondence between recognition results and instructions.
In other embodiments, in order to further improve the accuracy of obtaining the body feeling interaction instruction, the control unit 602 may also issue the instruction in a manner that combines body sensing with touch control. For example, after the body feeling interaction device obtains a body feeling interaction instruction according to the recognition result, it may first display that instruction on its display screen, and the user may confirm it through a touch key to achieve more precise control; no limitation is imposed here.
The transmitting unit 603 is configured to send the body feeling interaction instruction to the unmanned plane.
For example, if the body feeling interaction instruction currently sent by the body feeling interaction device is a hovering instruction, the unmanned plane, after receiving it, changes its current flight mode and switches to hovering. If the body feeling interaction instruction currently sent is an acceleration instruction, the unmanned plane accelerates from its current flight speed after receiving it.
In another embodiment, to let the user view the flight picture more intuitively, that is, to obtain the scenery and environment information around the unmanned plane through the unmanned plane itself and to experience the excitement brought by its different attitudes, paths, and speeds under the control of body feeling interaction instructions (a first-person visual experience), refer further to Fig. 7. The body feeling interaction device of this embodiment further includes a receiving unit 704 and a display unit 705.
The receiving unit 704 is configured to receive the first image data and the second image data returned by the unmanned plane after it executes the body feeling interaction instruction, where the first image data is obtained by the first camera of the unmanned plane and the second image data is obtained by the second camera of the unmanned plane.
Specifically, after receiving the body feeling interaction instruction sent by the body feeling interaction device, the unmanned plane photographs the current environment through the first camera and the second camera arranged in parallel on its gimbal. The body feeling interaction instruction may also include at least one of an aperture adjustment instruction, a focus adjustment instruction, or a shooting-mode adjustment instruction for the first camera and/or the second camera, so as to obtain clearer image information or image information that better meets the user's requirements.
After the unmanned plane obtains the first image data through the first camera and the second image data through the second camera, it wirelessly returns the first image data and the second image data to the body feeling interaction device.
It should be noted that the process in which the unmanned plane shoots the first and second image data and the process in which it returns them to the body feeling interaction device are synchronized in real time.
Accordingly, the receiving unit 704 receives the first image data and the second image data.
The display unit 705 is configured to synthesize the first image data and the second image data into a 3D image and display the 3D image.
Because the data shot by a camera are generally 2D data, in order to further provide the user with a more realistic and striking effect, the display unit 705, after receiving the first image data and the second image data, performs stereoscopic fusion on them and synthesizes the two into a 3D image. It stores the 3D image and displays it on its screen.
In any of the above embodiments, in order to give the user not only a visual experience but also, at the same time, a bodily tactile experience, that is, to realize a true first-person flight perspective, after the body feeling interaction device sends the body feeling interaction instruction to the unmanned plane, the control unit further converts the body feeling interaction instruction into a human body auxiliary instruction and, according to that auxiliary instruction, provides the target human body, namely the user, with a motion experience synchronized with the action performed by the unmanned plane in the 3D image while it executes the body feeling interaction instruction.
The human body auxiliary instruction includes at least one of a turning flight instruction, a hypergravity/weightlessness flight instruction, and a pneumatic instruction.
For example, if the body feeling interaction instruction currently sent by the transmitting unit to the unmanned plane is a landing instruction, the control unit converts the landing instruction into two human body auxiliary instructions: a weightlessness instruction and a pneumatic instruction. Through corresponding hardware devices or VR or AR technology, the user is given the weightlessness and airflow sensations exhibited by the unmanned plane during descent, and this experience is synchronized with the 3D image the user is viewing. As another example, when the body feeling interaction instruction currently sent by the transmitting unit to the unmanned plane is a left-turn instruction, the control unit converts the left-turn instruction sent to the unmanned plane into a human body auxiliary instruction for a left turn applied to the user, and while the unmanned plane executes the left-turn instruction, the hardware device makes the user feel the left-turn action; no limitation is imposed here.
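The conversion of a flight instruction into human body auxiliary instructions, for example landing into weightlessness plus pneumatic feedback, can be sketched as another lookup. Only the landing and turn entries follow the examples in the text; any other entries in such a table would be assumptions.

```python
# Mapping from body feeling interaction instructions to human body
# auxiliary instructions, following the landing and left/right turn
# examples given in the description (names are illustrative).
AUXILIARY_INSTRUCTIONS = {
    "land": ["weightlessness", "pneumatic"],
    "turn_left": ["left_turn_motion"],
    "turn_right": ["right_turn_motion"],
}

def to_auxiliary(instruction):
    """Return the auxiliary instructions to play back on the user's
    hardware in sync with the 3D image; empty list if none apply."""
    return AUXILIARY_INSTRUCTIONS.get(instruction, [])
```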
Different from the prior art, the image acquisition unit of this embodiment collects a target human body image and performs posture or action recognition on it; the control unit obtains the body feeling interaction instruction corresponding to the recognition result; and the transmitting unit sends the body feeling interaction instruction to the unmanned plane so that the unmanned plane executes it. Controlling the unmanned plane through body-sensing actions not only enriches the control modes available in the prior art but also relieves the user of controlling the unmanned plane through a physical device, which makes operation easier, reduces the rate of operational errors, and brings a richer experience to the user. Moreover, while improving the user experience, it can effectively simplify the hardware required by the unmanned plane control system and reduce production cost.
In addition, the receiving unit receives the first image data returned by the first camera of the unmanned plane and the second image data returned by the second camera, and the display unit synthesizes the first image data and the second image data into a 3D display. In this way the user can view the flight picture more intuitively and experience the excitement brought by the unmanned plane's different attitudes, paths, and speeds under the control of body feeling interaction instructions, which improves the realism of the picture and further enhances the user experience.
In addition, after the transmitting unit sends the body feeling interaction instruction to the unmanned plane, the control unit further converts the body feeling interaction instruction into a human body auxiliary instruction and, according to that auxiliary instruction, provides the target human body, namely the user, with a motion experience synchronized with the action performed by the unmanned plane in the 3D image while it executes the body feeling interaction instruction. This gives the user a bodily tactile experience at the same time as the visual experience, realizing a true first-person flight perspective.
Referring to Fig. 8, Fig. 8 is a structural schematic diagram of an embodiment of the unmanned plane of the present invention. The unmanned plane of this embodiment includes a receiving unit 801 and a control unit 802.
The receiving unit 801 is configured to receive a body feeling interaction instruction sent by a body feeling interaction device, where the interaction instruction is obtained by the body feeling interaction device by collecting a target human body image and performing posture or action recognition on the target human body image.
To implement a body feeling interaction operation, the body feeling interaction device first collects a target human body image of the user. Specifically, it first obtains image information of the user, where the image information is depth image information that includes at least one of a depth photograph and a video. The image information is then filtered, denoised, and smoothed, after which the background is separated from the target human body image and the target human body image is extracted from the image information.
After obtaining the image information, the device performs skeleton extraction on the target human body image and recognizes postures, gestures, or actions. The body feeling interaction device first screens the image information for valid information, which includes recognizing valid postures or actions, for example the user swinging the left or right arm or the angle to which a shoulder is raised, and thereby obtains a recognition result of the posture or action; no limitation is imposed here.
After obtaining the recognition result of the posture or action, the body feeling interaction device obtains the body feeling interaction instruction corresponding to that recognition result according to a preset correspondence between recognition results and instructions.
In other embodiments, in order to further improve the accuracy of obtaining the body feeling interaction instruction, the instruction may also be issued in a manner that combines body sensing with touch control. For example, after the body feeling interaction device obtains a body feeling interaction instruction according to the recognition result, it may first display that instruction on its display screen, and the user may confirm it through a touch key to achieve more precise control; no limitation is imposed here.
After the body feeling interaction instruction is obtained, the body feeling interaction device sends it to the unmanned plane, and the receiving unit 801 receives the body feeling interaction instruction accordingly.
The control unit 802 is configured to execute the body feeling interaction instruction.
For example, if the body feeling interaction instruction currently sent by the body feeling interaction device is a hovering instruction, the control unit 802, after the receiving unit 801 receives the instruction, changes the current flight mode and switches to hovering. If the body feeling interaction instruction sent by the body feeling interaction device is an acceleration instruction, the unmanned plane accelerates from its current flight speed after receiving it.
In another embodiment, to let the user view the flight picture more intuitively, that is, to obtain the scenery and environment information around the unmanned plane through the unmanned plane itself and to experience the excitement brought by its different attitudes, paths, and speeds under the control of body feeling interaction instructions (a first-person visual experience), refer further to Fig. 9. The unmanned plane of this embodiment further includes an image acquisition unit 903 and an image transmitting unit 904.
The image acquisition unit 903 is configured to obtain first image data and second image data of the current environment through a first camera and a second camera, respectively.
Specifically, after the body feeling interaction instruction sent by the body feeling interaction device is received, the image acquisition unit 903 photographs the current environment through the first camera and the second camera arranged in parallel on the gimbal. The body feeling interaction instruction may also include at least one of an aperture adjustment instruction, a focus adjustment instruction, or a shooting-mode adjustment instruction for the first camera and/or the second camera, so as to obtain clearer image information or image information that better meets the user's requirements.
The image transmitting unit 904 is configured to send the first image data and the second image data to the body feeling interaction device, so that the body feeling interaction device synthesizes the first image data and the second image data into a 3D image and displays it.
After the image acquisition unit 903 obtains the first image data through the first camera and the second image data through the second camera, the image transmitting unit 904 wirelessly returns the first image data and the second image data to the body feeling interaction device. It should be noted that the process in which the unmanned plane shoots the first and second image data and the process in which it returns them to the body feeling interaction device are synchronized in real time.
The body feeling interaction device receives the first image data and the second image data, synthesizes them into a 3D image, and displays the 3D image.
Different from the prior art, the receiving unit of the unmanned plane of this embodiment receives a body feeling interaction instruction sent by a body feeling interaction device, where the interaction instruction is obtained by the body feeling interaction device by collecting a target human body image and performing posture or action recognition on it, and the control unit executes the body feeling interaction instruction. Controlling the unmanned plane through body-sensing actions not only enriches the control modes available in the prior art but also relieves the user of controlling the unmanned plane through a physical device, which makes operation easier, reduces the rate of operational errors, and brings a richer experience to the user. Moreover, while improving the user experience, it can effectively simplify the hardware required by the unmanned plane control system and reduce production cost.
In addition, the image transmitting unit of the unmanned plane returns to the body feeling interaction device the first image data shot by the first camera and the second image data shot by the second camera of the image acquisition unit, so that the body feeling interaction device synthesizes the first image data and the second image data into a 3D display. In this way the user can view the flight picture more intuitively and experience the excitement brought by the unmanned plane's different attitudes, paths, and speeds under the control of body feeling interaction instructions, which improves the realism of the picture and further enhances the user experience.
The above are only embodiments of the present invention and do not thereby limit the scope of the claims of the present invention. Any equivalent structural or equivalent process transformation made using the contents of the description and drawings of the present invention, whether applied directly or indirectly in other related technical fields, is likewise included within the patent protection scope of the present invention.
Claims (10)
1. A control method of an unmanned plane, characterized in that the control method comprises:
a body feeling interaction control device collecting a target human body image and performing posture or action recognition on the target human body image;
obtaining a body feeling interaction instruction corresponding to a recognition result; and
sending the body feeling interaction instruction to an unmanned plane, so that the unmanned plane executes the body feeling interaction instruction.
2. The control method according to claim 1, characterized in that the control method further comprises:
receiving first image data and second image data returned by the unmanned plane after it executes the body feeling interaction instruction, wherein the first image data is obtained by a first camera of the unmanned plane and the second image data is obtained by a second camera of the unmanned plane; and
synthesizing the first image data and the second image data into a 3D image and displaying the 3D image.
3. The control method according to claim 1, characterized in that after the step of sending the body feeling interaction instruction to the unmanned plane, the method further comprises:
converting the body feeling interaction instruction into a human body auxiliary instruction; and
providing, according to the human body auxiliary instruction, the target human body with a motion experience synchronized with the action performed by the unmanned plane in the 3D image while it executes the body feeling interaction instruction.
4. The control method according to claim 1, characterized in that the step of the body feeling interaction control device collecting a target human body image and performing posture or action recognition on the target human body image specifically comprises:
collecting the target human body image; and
obtaining a recognition result of a posture or action after performing denoising, background separation, skeleton extraction, and posture recognition processing on the target human body image.
5. The control method according to claim 1, characterized in that the body feeling interaction instruction comprises at least one of flight attitude adjustment, speed adjustment, heading adjustment, and adjustment of the aperture, focal length, shooting direction, or shooting mode of the first camera and/or the second camera.
6. A control method of an unmanned plane, characterized in that the control method comprises:
an unmanned plane receiving a body feeling interaction instruction sent by a body feeling interaction device, wherein the interaction instruction is obtained by the body feeling interaction device by collecting a target human body image and performing posture or action recognition on the target human body image; and
executing the body feeling interaction instruction.
7. The control method according to claim 6, characterized in that the control method further comprises:
obtaining first image data and second image data of the current environment through a first camera and a second camera, respectively; and
sending the first image data and the second image data to the body feeling interaction device, so that the body feeling interaction device synthesizes the first image data and the second image data into a 3D image and displays it.
8. A body feeling interaction device, characterized in that the body feeling interaction device comprises an image acquisition unit, a control unit, and a transmitting unit;
the image acquisition unit is configured to collect a target human body image and perform posture or action recognition on the target human body image;
the control unit is configured to obtain a body feeling interaction instruction corresponding to a recognition result; and
the transmitting unit is configured to send the body feeling interaction instruction to an unmanned plane.
9. The body feeling interaction device according to claim 8, characterized in that the body feeling interaction device further comprises a receiving unit and a display unit;
the receiving unit is configured to receive first image data and second image data returned by the unmanned plane after it executes the body feeling interaction instruction, wherein the first image data is obtained by a first camera of the unmanned plane and the second image data is obtained by a second camera of the unmanned plane; and
the display unit is configured to synthesize the first image data and the second image data into a 3D image and display the 3D image.
10. An unmanned plane, characterized in that the unmanned plane comprises a receiving unit and a control unit;
the receiving unit is configured to receive a body feeling interaction instruction sent by a body feeling interaction device, wherein the interaction instruction is obtained by the body feeling interaction device by collecting a target human body image and performing posture or action recognition on the target human body image; and
the control unit is configured to execute the body feeling interaction instruction.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610562193.9A CN106227231A (en) | 2016-07-15 | 2016-07-15 | The control method of unmanned plane, body feeling interaction device and unmanned plane |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201610562193.9A CN106227231A (en) | 2016-07-15 | 2016-07-15 | The control method of unmanned plane, body feeling interaction device and unmanned plane |
Publications (1)
Publication Number | Publication Date |
---|---|
CN106227231A true CN106227231A (en) | 2016-12-14 |
Family
ID=57520120
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610562193.9A Pending CN106227231A (en) | 2016-07-15 | 2016-07-15 | The control method of unmanned plane, body feeling interaction device and unmanned plane |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106227231A (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102043410A (en) * | 2010-09-30 | 2011-05-04 | Tsinghua University | Servo system for directing the pan-tilt system of an unmanned aerial vehicle (UAV) with the operator's head movement |
CN202632581U (en) * | 2012-05-28 | 2012-12-26 | Dai Zhenyu | Flight simulation control and experience device based on real air environment |
CN104811615A (en) * | 2015-04-17 | 2015-07-29 | Liu Yao | Motion-controlled camera shooting system and method |
CN104834249A (en) * | 2015-03-16 | 2015-08-12 | Zhang Shimian | Wearable remote controller |
CN104950902A (en) * | 2015-06-10 | 2015-09-30 | Yang Shanshan | Multi-rotor aircraft and control method thereof |
CN105677300A (en) * | 2016-02-04 | 2016-06-15 | Prodrone Technology (Shenzhen) Co., Ltd. | Gesture-recognition-based unmanned aerial vehicle control method and system, and unmanned aerial vehicle |
- 2016-07-15 CN CN201610562193.9A patent/CN106227231A/en active Pending
Cited By (23)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10409276B2 (en) | 2016-12-21 | 2019-09-10 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
WO2018116028A1 (en) * | 2016-12-21 | 2018-06-28 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
US11340606B2 (en) | 2016-12-21 | 2022-05-24 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
CN110300938A (en) * | 2016-12-21 | 2019-10-01 | Hangzhou Zero Zero Technology Co., Ltd. | System and method for controller-free user drone interaction |
CN106878651A (en) * | 2016-12-31 | 2017-06-20 | Goertek Technology Co., Ltd. | Three-dimensional video communication method based on unmanned aerial vehicle, communication device, and unmanned aerial vehicle |
WO2018145650A1 (en) * | 2017-02-08 | 2018-08-16 | EHang Intelligent Equipment (Guangzhou) Co., Ltd. | Aircraft and control method therefor |
CN108700893A (en) * | 2017-04-07 | 2018-10-23 | SZ DJI Technology Co., Ltd. | Somatosensory remote control method, control device, gimbal, and unmanned aerial vehicle |
CN109121434A (en) * | 2017-04-17 | 2019-01-01 | Inventec Appliances (Shanghai) Co., Ltd. | Unmanned aerial vehicle interactive shooting system and method |
CN109121434B (en) * | 2017-04-17 | 2021-07-27 | Inventec Appliances (Shanghai) Co., Ltd. | Unmanned aerial vehicle interactive shooting system and method |
CN107516451A (en) * | 2017-10-08 | 2017-12-26 | Jingyao (Shanghai) Information Technology Co., Ltd. | Fixed-wing UAV intelligent flight training system |
CN108181922A (en) * | 2017-12-01 | 2018-06-19 | Beijing PowerVision Technology Co., Ltd. | Unmanned aerial vehicle landing control method, apparatus and system |
WO2019144300A1 (en) * | 2018-01-23 | 2019-08-01 | SZ DJI Technology Co., Ltd. | Target detection method and apparatus, and movable platform |
CN109196438A (en) * | 2018-01-23 | 2019-01-11 | SZ DJI Technology Co., Ltd. | Flight control method, device, aircraft, system, and storage medium |
CN108769531A (en) * | 2018-06-21 | 2018-11-06 | Shenzhen Autel Intelligent Aviation Technology Co., Ltd. | Method, control device, and wearable device for controlling the shooting angle of a photographing apparatus |
WO2020042186A1 (en) * | 2018-08-31 | 2020-03-05 | SZ DJI Technology Co., Ltd. | Control method for movable platform, movable platform, terminal device and system |
CN109270954A (en) * | 2018-10-30 | 2019-01-25 | Southwest University of Science and Technology | Gesture-recognition-based unmanned aerial vehicle interactive system and control method |
CN111610850A (en) * | 2019-02-22 | 2020-09-01 | Dongxi Heyi (Zhuhai) Data Technology Co., Ltd. | Method for human-machine interaction based on unmanned aerial vehicle |
CN112740226A (en) * | 2020-04-28 | 2021-04-30 | SZ DJI Technology Co., Ltd. | System and method for operating a movable object based on human-body indications |
CN111526295A (en) * | 2020-04-30 | 2020-08-11 | Beijing PowerVision Technology Co., Ltd. | Audio and video processing system, acquisition method, device, equipment and storage medium |
CN111526295B (en) * | 2020-04-30 | 2023-02-28 | PowerVision Technology Co., Ltd. | Audio and video processing system, acquisition method, device, equipment and storage medium |
CN112162630A (en) * | 2020-09-14 | 2021-01-01 | Shanghai Qinlong Industry and Trade Qidong Co., Ltd. | Interactive experience system, experience platform, and interactive experience method |
CN114578858A (en) * | 2022-03-16 | 2022-06-03 | SIYI Technology (Shenzhen) Co., Ltd. | Remote control system for unmanned aerial vehicle remote controller |
CN114578858B (en) * | 2022-03-16 | 2022-09-20 | SIYI Technology (Shenzhen) Co., Ltd. | Remote control system for unmanned aerial vehicle remote controller |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106227231A (en) | The control method of unmanned plane, body feeling interaction device and unmanned plane | |
CN105739525B (en) | System for realizing virtual flight through cooperative somatosensory operation | |
US10628675B2 (en) | Skeleton detection and tracking via client-server communication | |
CN110300938A (en) | System and method for controller-free user drone interaction | |
CN104699247B (en) | Machine-vision-based virtual reality interactive system and method | |
CN108769531B (en) | Method for controlling shooting angle of shooting device, control device and remote controller | |
CN107074348A (en) | Control method, device, equipment and unmanned aerial vehicle | |
CN105892474A (en) | Unmanned aerial vehicle and control method thereof | |
CN107438804B (en) | Wearable device and UAV system for controlling an unmanned aerial vehicle | |
CN106708074A (en) | Method and device for controlling unmanned aerial vehicle based on VR glasses | |
CN109164829A (en) | Flying robotic-arm system and control method based on force-feedback device and VR perception | |
CN105898346A (en) | Control method, electronic equipment and control system | |
CN108521812A (en) | Unmanned aerial vehicle control method, unmanned aerial vehicle, and machine-readable storage medium | |
CN204614276U (en) | Omnidirectional flight simulator with mixed-reality function | |
CN105700543B (en) | Flight device control system, control method, and aerial-photography unmanned aerial vehicle | |
CN105847684A (en) | Unmanned aerial vehicle | |
CN105912980A (en) | Unmanned aerial vehicle and unmanned aerial vehicle system | |
CN104759095A (en) | Virtual reality head-mounted display system | |
CN107639620A (en) | Robot control method, somatosensory interaction device, and robot | |
CN108731681(en) | Navigation method for rotary-wing unmanned aerial vehicle, related computer program, electronic device, and unmanned aerial vehicle | |
CN109062407A (en) | Remote mobile terminal three-dimensional display and control system and method based on VR technology | |
CN108475074A (en) | Gimbal follow control method and control device | |
WO2018184232A1 (en) | Body sensing remote control method, control apparatus, gimbal and unmanned aerial vehicle | |
WO2017166723A1 (en) | Unmanned aerial vehicle system and flight control method thereof | |
CN105847682A (en) | Panoramic image photographing method, device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
C06 | Publication | |
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20161214 |