CN105334959A - System and method for controlling gesture motion in virtual reality environment - Google Patents


Info

Publication number
CN105334959A
CN105334959A (application CN201510695303.4A; granted publication CN105334959B)
Authority
CN
China
Prior art keywords
data
gesture motion
virtual reality environment
gesture
action
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201510695303.4A
Other languages
Chinese (zh)
Other versions
CN105334959B (en)
Inventor
李为
张瑞生
张建
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Pico Technology Co Ltd
Original Assignee
Beijing Pico Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Pico Technology Co Ltd filed Critical Beijing Pico Technology Co Ltd
Priority to CN201510695303.4A priority Critical patent/CN105334959B/en
Publication of CN105334959A publication Critical patent/CN105334959A/en
Application granted granted Critical
Publication of CN105334959B publication Critical patent/CN105334959B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017: Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00: Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01: Indexing scheme relating to G06F 3/01
    • G06F 2203/012: Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

The invention discloses a system and method for controlling gesture motion in a virtual reality environment. The system comprises a gesture capture module, a data transmission module, a parsing and recognition module, and an interaction control module. According to the disclosed technical scheme, motion data are collected at the major joints of the hand and transmitted to the parsing and recognition module via a combination of wired and wireless connections, which improves the efficiency and accuracy of gesture recognition. A mapping table between gesture motions and action commands in the virtual reality environment is established by combining the Android system with the Unity game engine, and the action command for a given gesture motion is obtained by querying this table, thereby realizing gesture-based interaction control of the virtual reality environment. The user can thus control the virtual reality environment immersively and enjoy a true, comfortable, and accurate operating experience.

Description

Gesture motion control system and method in a virtual reality environment
Technical field
The present invention relates to the field of human-computer interaction, and in particular to a gesture motion control system and method in a virtual reality environment.
Background technology
In recent years, as virtual reality technology has gradually matured, a variety of virtual-reality hardware and software products have come to market. However, the interaction techniques for virtual reality remain immature: the dominant approach is still traditional input devices such as keyboards, mice, and gamepads, none of which achieves a good interaction effect or user experience in a virtual reality environment. Although emerging interaction techniques for virtual reality environments, such as gesture recognition, have been developed to some extent, there is as yet no mature interaction control system on the market that tightly integrates gesture recognition with virtual reality.
Summary of the invention
In view of the prior art's lack of a mature interaction control system that tightly integrates gesture recognition with virtual reality, the present invention proposes a gesture motion control system and method in a virtual reality environment, so as to overcome the above problem or at least partly solve it.
According to one aspect of the present invention, a gesture motion control system in a virtual reality environment is provided. The system comprises: a gesture capture module, a data transmission module, a parsing and recognition module, and an interaction control module;
the gesture capture module uses motion-sensing capture devices mounted at the major joints of the user's hand to capture gesture motions within an effective region in real time and record the motion data;
the data transmission module transmits the recorded real-time motion data to the parsing and recognition module via a combination of wired and wireless connections;
the parsing and recognition module is configured to parse the recorded real-time motion data and recognize the corresponding gesture motion;
the interaction control module is preset with a mapping table between gesture motions and action commands in the virtual reality environment; it queries this table to obtain the action command of the recognized gesture motion in the virtual reality environment and performs interaction control with the virtual reality environment according to that command.
Optionally, the parsing and recognition module comprises a culling unit;
the culling unit, before the recorded motion data are parsed, eliminates redundant and invalid data, the redundant and invalid data comprising duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
Optionally, the parsing and recognition module further comprises a grouping unit and a parsing unit;
the grouping unit groups, according to the available computing power, the data packets processed by the culling unit into a gesture-data buffer sequence;
the parsing unit parses the grouped gesture data in the buffer sequence and recognizes a valid gesture motion.
Optionally, the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Optionally, the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments.
According to another aspect of the present invention, a gesture motion control method in a virtual reality environment is provided. The method comprises:
using motion-sensing capture devices mounted at the major joints of the user's hand to capture gesture motions within an effective region in real time and record the motion data;
transmitting the recorded real-time motion data via a combination of wired and wireless connections;
parsing the recorded real-time motion data and recognizing the corresponding gesture motion;
querying a preset mapping table between gesture motions and action commands in the virtual reality environment, obtaining the action command of the recognized gesture motion in the virtual reality environment, and performing interaction control with the virtual reality environment according to the action command.
Optionally, parsing the recorded real-time motion data and recognizing the corresponding gesture motion comprises:
before parsing the recorded real-time motion data, eliminating redundant and invalid data, the redundant and invalid data comprising duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
Optionally, parsing the recorded real-time motion data and recognizing the corresponding gesture motion further comprises:
grouping, according to the available computing power, the data packets remaining after the culling process into a gesture-data buffer sequence;
parsing the grouped gesture data in the buffer sequence and recognizing a valid gesture motion.
Optionally, the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Optionally, the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
In summary, the technical scheme of the present invention collects motion data at the major joints of the hand with motion-sensing capture devices and transmits the recorded real-time data via a combination of wired and wireless connections, improving the speed and precision of gesture motion recognition. By querying a preset mapping table between gesture motions and action commands in the virtual reality environment, the action command for a gesture motion is obtained and interaction control is performed accordingly, so the user can operate in the virtual reality environment as if personally present, with a true, comfortable, and accurate operating experience.
Brief description of the drawings
Fig. 1 is a schematic diagram of a gesture motion control system in a virtual reality environment provided by an embodiment of the present invention;
Fig. 2 is a schematic diagram of the parsing and recognition module of a gesture motion control system in a virtual reality environment provided by an embodiment of the present invention;
Fig. 3 is a flowchart of a gesture motion control method in a virtual reality environment provided by an embodiment of the present invention.
Embodiments
To make the objects, technical solutions, and advantages of the present invention clearer, the embodiments of the present invention are described below in further detail with reference to the accompanying drawings.
Fig. 1 is a schematic diagram of a gesture motion control system in a virtual reality environment provided by an embodiment of the present invention. As shown in Fig. 1, the gesture motion control system 100 in the virtual reality environment comprises: a gesture capture module 110, a data transmission module 120, a parsing and recognition module 130, and an interaction control module 140.
The gesture capture module 110 uses motion-sensing capture devices mounted at the major joints of the user's hand to capture gesture motions within the effective region in real time and record the motion data.
Because human hand movements are very complex, with every part of the hand changing greatly in position, angle, size, and shape within a short time, it is impractical at the present stage to use the complete data of the whole hand as gesture data. The present invention instead samples key points: capture devices are mounted at several major joints of the hand, and since the segments between joints are rigid bodies, the overall motion of the whole hand can be derived from the data changes at the joints. In addition, a hand motion is defined as valid only within an effective operable region, and only valid hand motions are recorded. Without degrading action recognition, this reduces the amount of data collected, allows more genuine and valid data to be gathered in a shorter time, and improves the speed and precision of gesture motion capture.
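The patent publishes no code, but the key-point capture idea just described, sampling only the major joints and recording a motion only while it stays inside the effective operable region, can be sketched as follows. The `JointSample` fields and the region bounds are invented for illustration and are not part of the patent.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class JointSample:
    """One reading from a capture device at a major hand joint (fields illustrative)."""
    joint_id: int   # which major joint the device is mounted on
    t: float        # capture timestamp in seconds
    x: float        # position coordinates, metres
    y: float
    z: float

# Hypothetical effective operable region: a box in front of the user.
EFFECTIVE_REGION = {"x": (-0.5, 0.5), "y": (-0.5, 0.5), "z": (0.2, 1.0)}

def in_effective_region(s: JointSample) -> bool:
    """A motion counts as valid only inside the effective region."""
    return all(lo <= getattr(s, axis) <= hi
               for axis, (lo, hi) in EFFECTIVE_REGION.items())

def record_valid(samples):
    """Record only the samples captured inside the effective region."""
    return [s for s in samples if in_effective_region(s)]
```

Because the joints bound rigid segments, these few per-joint records suffice to reconstruct the whole hand's motion while keeping the captured data volume small.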
The data transmission module 120 transmits the recorded real-time motion data to the parsing and recognition module via a combination of wired and wireless connections.
Transmission over a physical cable offers high speed, stability, and interference resistance; it raises the speed of data transmission and can handle the concurrent transmission of large volumes of data. However, a physical cable is limited in length and occupies a certain physical region, restricting the user's range of motion. Therefore, wireless data transmission, such as Bluetooth or ZigBee, is provided as well. Compared with cable transmission, wireless transmission sacrifices some stability and speed, but it greatly improves flexibility of use, extends the operating and movement range, and can provide a better experience in specific virtual reality application scenarios.
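A minimal sketch of the combined-transport choice described above; the threshold value and link labels are assumptions for the example, not details from the patent.

```python
def choose_link(payload_bytes: int, cable_attached: bool,
                bulk_threshold: int = 64_000) -> str:
    """Pick a transport for one batch of motion data.

    The cable wins on speed, stability, and interference resistance whenever
    it is attached; wireless (e.g. Bluetooth or ZigBee) trades some of that
    for freedom of movement when the user is untethered.
    """
    if cable_attached:
        # Large concurrent batches especially benefit from the wired path.
        return "wired-bulk" if payload_bytes >= bulk_threshold else "wired"
    return "wireless"
```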
The parsing and recognition module 130 is configured to parse the recorded real-time motion data and recognize the corresponding gesture motion.
The interaction control module 140 is preset with a mapping table between gesture motions and action commands in the virtual reality environment; it queries this table to obtain the action command of the gesture motion in the virtual reality environment and performs interaction control with the virtual reality environment according to the action command.
The gesture motion captured in real time is modeled as a concrete action of a character or object in the virtual reality environment, so the user can perform interaction control in the virtual reality environment as if personally present, which brings the user a true, comfortable, and accurate operating experience.
Fig. 2 is a schematic diagram of the parsing and recognition module of the gesture motion control system provided by an embodiment of the present invention. As shown in Fig. 2, the parsing and recognition module 130 comprises a culling unit 131, a grouping unit 132, and a parsing unit 133.
The culling unit 131 is configured to eliminate redundant and invalid data before the recorded real-time motion data are parsed; the redundant and invalid data comprise duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
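An illustrative sketch of such a culling pass. The packet shape and the deviation threshold are invented, and "distance from the previous kept sample" stands in for the patent's "deviates strongly from the existing data curve":

```python
from collections import namedtuple

Packet = namedtuple("Packet", "t x y z")  # shape invented for the example

def cull(packets, max_step=0.3):
    """Drop redundant and invalid packets before parsing:
    exact duplicates, packets whose timestamp does not advance (a symptom
    of transmission error), and packets jumping far off the running curve."""
    kept, seen = [], set()
    for p in packets:
        if p in seen:                      # duplicate data
            continue
        if kept and p.t <= kept[-1].t:     # mismatched timestamp
            continue
        if kept:
            prev = kept[-1]
            if max(abs(p.x - prev.x), abs(p.y - prev.y), abs(p.z - prev.z)) > max_step:
                continue                   # far off the existing data curve
        seen.add(p)
        kept.append(p)
    return kept
```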
The grouping unit 132 is configured to group, according to the available computing power, the data packets processed by the culling unit 131 into a gesture-data buffer sequence.
The parsing unit 133 is configured to parse the grouped gesture data in the buffer sequence and recognize a valid gesture motion.
Because a large amount of redundant and invalid data is eliminated before parsing and recognition, the computational load is greatly reduced and computing efficiency is improved; moreover, only the motion data currently in the buffer sequence need to be processed each time, ensuring that the current action can be parsed and recognized quickly and efficiently. As computing power continues to grow, only the maximum recognition time of a single gesture needs to be adjusted, so the system can be extended to ever more complex gesture motions, even combined actions and multi-person cooperative actions, which facilitates still finer-grained interaction control by the control system of the present invention.
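The buffer-sequence idea, grouping surviving packets into a queue sized to the available computing power and parsing only what is currently buffered, might be sketched like this; the capacity value is an invented tuning knob.

```python
from collections import deque

class GestureBuffer:
    """Bounded buffer sequence for culled gesture packets (sketch)."""

    def __init__(self, capacity=128):
        # Capacity would be chosen from the available computing power;
        # when full, the oldest packets fall out automatically.
        self._q = deque(maxlen=capacity)

    def push(self, packet):
        self._q.append(packet)

    def drain(self):
        """Hand the parser exactly the currently buffered packets."""
        batch = list(self._q)
        self._q.clear()
        return batch
```

Since the parser only ever touches the drained batch, the worst-case recognition time of a single gesture is bounded by the buffer capacity, matching the remark that only the maximum per-gesture recognition time needs adjusting as gestures grow more complex.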
In one embodiment of the invention, the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
Unity, developed by Unity Technologies, is a multi-platform integrated game development tool that lets creators easily build interactive content such as 3D video games, architectural visualizations, and real-time 3D animations; it is a comprehensively integrated professional game engine. Combining the Android system with the Unity game engine to realize the correspondence between gesture motions and action commands in the virtual reality environment improves the usability of the present invention and makes the interaction control system easy to understand and flexible to use.
On this basis, in one embodiment of the invention, the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments. The interaction control system of the present invention can therefore map the same gesture to different action commands in different virtual reality environments; and by extending the gesture vocabulary, correspondingly more interactive actions can be derived in the virtual reality environment.
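A small sketch of the preset mapping table and its per-environment modification. The environment names, gestures, and commands are all invented; in the actual system the table would be defined on the Unity side.

```python
# One mapping table per virtual reality environment (entries illustrative).
COMMAND_TABLES = {
    "archery_range": {"fist": "grab_bow", "release": "shoot"},
    "art_gallery":   {"fist": "select_exhibit", "release": "drop"},
}

def command_for(environment, gesture):
    """Query the preset table: gesture -> action command (None if unmapped)."""
    return COMMAND_TABLES.get(environment, {}).get(gesture)

def remap(environment, gesture, command):
    """Modify one environment's table, e.g. to extend the gesture vocabulary."""
    COMMAND_TABLES.setdefault(environment, {})[gesture] = command
```

Note that the same gesture ("fist") yields different commands in different environments, which is exactly the per-environment remapping the paragraph above describes.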
Fig. 3 is a flowchart of a gesture motion control method in a virtual reality environment provided by an embodiment of the present invention. As shown in Fig. 3, the method comprises:
Step S310: using motion-sensing capture devices mounted at the major joints of the user's hand, capture gesture motions within the effective region in real time and record the motion data.
Step S320: transmit the recorded real-time motion data via a combination of wired and wireless connections.
Step S330: parse the recorded real-time motion data and recognize the corresponding gesture motion.
Step S340: query the preset mapping table between gesture motions and action commands in the virtual reality environment, obtain the action command of the gesture motion in the virtual reality environment, and perform interaction control with the virtual reality environment according to the action command.
In one embodiment of the invention, parsing the recorded real-time motion data and recognizing the corresponding gesture motion comprises:
Step S331: before parsing the recorded real-time motion data, eliminate redundant and invalid data, the redundant and invalid data comprising duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
Step S332: according to the available computing power, group the data packets remaining after the culling process into a gesture-data buffer sequence.
Step S333: parse the grouped gesture data in the buffer sequence and recognize a valid gesture motion.
In one embodiment of the invention, the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
In one embodiment of the invention, the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
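Steps S310 to S340 can be strung together as a single control step. The stub recognizer and table below stand in for the recognition algorithm and the Unity-side preset, neither of which the patent specifies:

```python
def control_step(raw_packets, environment, *, cull, recognize, tables):
    """S310/S320 (capture and transmission) deliver raw_packets; S330 culls
    and recognizes a gesture; S340 looks it up and returns the action command."""
    gesture = recognize(cull(raw_packets))
    return tables.get(environment, {}).get(gesture)
```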
It should be noted that the method embodiment shown in Fig. 3 corresponds to the system embodiments of Fig. 1 and Fig. 2 described in detail above, so the details are not repeated here.
In summary, the technical scheme of the present invention uses motion-sensing capture devices mounted at the major joints of the user's hand to quickly collect key gesture data, transmits the data via a combination of wired and wireless links, culls the collected data, parses the remaining valid data to recognize the gesture motion, queries the mapping table between gesture motions and action commands in the virtual reality environment to obtain the action command, and performs interaction control with the virtual reality environment accordingly.
The advantages of the invention are: 1. Key gesture data are collected at the major joints of the hand, improving the speed and precision of gesture capture. 2. Combining wired and wireless transmission handles the concurrent transmission of large volumes of data on the one hand, and on the other hand enlarges the user's range of movement when transmission-speed requirements are low, improving the user experience. 3. Redundant and invalid data are eliminated before parsing and recognition, reducing the computational load and improving the speed and accuracy of parsing and recognition; only the maximum recognition time of a single gesture needs adjusting to handle increasingly complex gesture motions, making extension convenient. 4. Hand motions are captured accurately and transmitted and parsed quickly, then reflected in the virtual reality environment, making use of the virtual reality environment smoother and the experience excellent; different gesture interaction strategies can be provided for different virtual reality environments, improving the user's interactive experience, with strong applicability and a wide range of applications.
The foregoing are only preferred embodiments of the present invention and are not intended to limit its scope of protection. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in the scope of protection of the present invention.

Claims (10)

1. A gesture motion control system in a virtual reality environment, characterized in that the system comprises: a gesture capture module, a data transmission module, a parsing and recognition module, and an interaction control module;
the gesture capture module uses motion-sensing capture devices mounted at the major joints of the user's hand to capture gesture motions within an effective region in real time and record the motion data;
the data transmission module transmits the recorded real-time motion data to the parsing and recognition module via a combination of wired and wireless connections;
the parsing and recognition module is configured to parse the recorded real-time motion data and recognize the corresponding gesture motion;
the interaction control module is preset with a mapping table between gesture motions and action commands in the virtual reality environment, queries this table to obtain the action command of the recognized gesture motion in the virtual reality environment, and performs interaction control with the virtual reality environment according to the action command.
2. The control system according to claim 1, characterized in that the parsing and recognition module comprises a culling unit;
the culling unit is configured to eliminate redundant and invalid data before the recorded real-time motion data are parsed, the redundant and invalid data comprising duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
3. The control system according to claim 2, characterized in that the parsing and recognition module further comprises a grouping unit and a parsing unit;
the grouping unit is configured to group, according to the available computing power, the data packets processed by the culling unit into a gesture-data buffer sequence;
the parsing unit is configured to parse the grouped gesture data in the buffer sequence and recognize a valid gesture motion.
4. The control system according to claim 1, characterized in that the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
5. The control system according to any one of claims 1 to 4, characterized in that the mapping table between gesture motions and action commands in the virtual reality environment can be modified for different virtual reality environments.
6. A gesture motion control method in a virtual reality environment, characterized in that the method comprises:
using motion-sensing capture devices mounted at the major joints of the user's hand to capture gesture motions within an effective region in real time and record the motion data;
transmitting the recorded real-time motion data via a combination of wired and wireless connections;
parsing the recorded real-time motion data and recognizing the corresponding gesture motion;
querying a preset mapping table between gesture motions and action commands in the virtual reality environment, obtaining the action command of the recognized gesture motion in the virtual reality environment, and performing interaction control with the virtual reality environment according to the action command.
7. The control method according to claim 6, characterized in that parsing the recorded real-time motion data and recognizing the corresponding gesture motion comprises:
before parsing the recorded real-time motion data, eliminating redundant and invalid data, the redundant and invalid data comprising duplicate data, data that deviate strongly from the existing data curve, erroneous data produced during transmission, and data with mismatched timestamps.
8. The control method according to claim 7, characterized in that parsing the recorded real-time motion data and recognizing the corresponding gesture motion further comprises:
grouping, according to the available computing power, the data packets remaining after the culling process into a gesture-data buffer sequence;
parsing the grouped gesture data in the buffer sequence and recognizing a valid gesture motion.
9. The control method according to claim 6, characterized in that the gesture motion is acquired under the Android system, and the mapping table between gesture motions and action commands in the virtual reality environment is preset using the Unity game engine.
10. The control method according to any one of claims 6 to 9, characterized in that the mapping table between gesture motions and action commands in the virtual reality environment is modified for different virtual reality environments.
CN201510695303.4A 2015-10-22 2015-10-22 Gesture motion control system and method in a virtual reality environment Active CN105334959B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510695303.4A CN105334959B (en) 2015-10-22 2015-10-22 Gesture motion control system and method in a virtual reality environment


Publications (2)

Publication Number Publication Date
CN105334959A true CN105334959A (en) 2016-02-17
CN105334959B CN105334959B (en) 2019-01-15

Family

ID=55285558

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510695303.4A Active CN105334959B (en) 2015-10-22 2015-10-22 Gesture motion control system and method in a virtual reality environment

Country Status (1)

Country Link
CN (1) CN105334959B (en)


Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6452584B1 (en) * 1997-04-23 2002-09-17 Modern Cartoon, Ltd. System for data management based on hand gestures
US20020140674A1 (en) * 2001-03-13 2002-10-03 Canon Kabushiki Kaisha Position/posture sensor or marker attachment apparatus
US7084884B1 (en) * 1998-11-03 2006-08-01 Immersion Corporation Graphical object interactions
US20070132722A1 (en) * 2005-12-08 2007-06-14 Electronics And Telecommunications Research Institute Hand interface glove using miniaturized absolute position sensors and hand interface system using the same
CN104238738A (en) * 2013-06-07 2014-12-24 索尼电脑娱乐美国公司 Systems and Methods for Generating an Augmented Virtual Reality Scene Within A Head Mounted System
CN104756045A (en) * 2012-10-04 2015-07-01 微软公司 Wearable sensor for tracking articulated body-parts
US20150241969A1 (en) * 2013-09-13 2015-08-27 Nod, Inc. Methods and Systems for Integrating One or More Gestural Controllers into a Head Mounted Wearable Display or Other Wearable Devices


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105955469A (en) * 2016-04-26 2016-09-21 乐视控股(北京)有限公司 Control method and device of virtual image
CN106095068A (en) * 2016-04-26 2016-11-09 乐视控股(北京)有限公司 The control method of virtual image and device
CN106371602A (en) * 2016-09-14 2017-02-01 惠州Tcl移动通信有限公司 Method and system for controlling virtual reality device based on intelligent wearable device
CN107885316A (en) * 2016-09-29 2018-04-06 阿里巴巴集团控股有限公司 A kind of exchange method and device based on gesture
TWI742079B (en) * 2016-09-29 2021-10-11 香港商阿里巴巴集團服務有限公司 Gesture-based interactive method and device
CN106683528A (en) * 2017-01-13 2017-05-17 北京黑晶科技有限公司 Teaching method and system based on VR/AR
CN107281750A (en) * 2017-05-03 2017-10-24 深圳市恒科电子科技有限公司 VR aobvious action identification methods and VR show
CN110058673A (en) * 2018-01-17 2019-07-26 广西米克尔森科技股份有限公司 A kind of virtual reality and augmented reality show exchange technology
CN110209451A (en) * 2019-05-28 2019-09-06 南京南方电讯有限公司 A kind of horse race lamp display system and method based on the superposition of different display engines

Also Published As

Publication number Publication date
CN105334959B (en) 2019-01-15


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant