CN106371607A - Man-machine interaction method and system based on cooperative game - Google Patents

Man-machine interaction method and system based on cooperative game

Info

Publication number
CN106371607A
CN106371607A (application CN201610833471.XA)
Authority
CN
China
Prior art keywords
player
operating
game
man
true player
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201610833471.XA
Other languages
Chinese (zh)
Inventor
黄源浩
刘龙
肖振中
许星
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Orbbec Co Ltd
Original Assignee
Shenzhen Orbbec Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Orbbec Co Ltd filed Critical Shenzhen Orbbec Co Ltd
Priority to CN201610833471.XA priority Critical patent/CN106371607A/en
Publication of CN106371607A publication Critical patent/CN106371607A/en
Pending legal-status Critical Current


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/847Cooperative playing, e.g. requiring coordinated actions from several players to achieve a common goal
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a man-machine interaction method and system based on a cooperative game. The method comprises the following steps: A1, acquiring depth images of a real player and the environment where the player is located with a depth image acquisition unit; A2, prompting the player with the end time of the cooperative action via a node control unit; A3, the player making the cooperative action according to the prompt of the node control unit; A4, tracking and recognizing throughout the process and extracting the real player and the cooperative action from the depth images with an extraction unit; A5, judging with an execution unit whether the cooperative action made by the real player meets the requirement, and if so executing the next step A6, otherwise executing step A1 again; A6, judging the cooperation result according to the game rules with a judgment unit, and displaying the cooperation result on a display. By acquiring depth images of the real player, the method and system capture three-dimensional motion; by extracting the real player and the cooperative action, they overcome the occlusion problem that arises during play as well as the real-time and accuracy problems of an ordinary RGB camera.

Description

Man-machine interaction method and system based on a cooperative game
Technical field
The present invention relates to the fields of computing and digital image processing, and more particularly to a man-machine interaction method and system based on a cooperative game.
Background art
There are many types of cooperative games, including hand-guessing games, cooperation games, cooperative rescue games, and so on. With the development of computer technology, man-machine cooperation and remote cooperative games are becoming a new mode of play. Among cooperative games, guessing games themselves come in many varieties, such as finger-guessing games based on hand gestures and hand-guessing games based on the posture of both feet.
Both man-machine guessing systems and remote hand-guessing systems need a vision device, such as a camera, to recognize the player's gesture or action and then compare it with the gesture or action of the machine or of a remote player in order to decide the winner. In the prior art, man-machine guessing games rely mainly on an RGB camera together with an associated robot or computer program, and the key point of such a system is whether the camera can correctly recognize the player's gesture or action. Because gestures and actions vary considerably from player to player, and because a certain degree of occlusion occurs while a player performs a gesture or action, the real-time performance and accuracy of these games are poor. Remote hand-guessing, for its part, is still limited to selecting a virtual gesture with a mouse or keyboard and comparing it with that of another player. Whether it is a man-machine guessing system or a remote hand-guessing system, the resulting interaction is merely mechanical: reality and virtuality are not combined, the visual experience lacks intuitive interactivity, and the user is given no real sense of interaction.
Summary of the invention
The present invention provides a man-machine interaction method and system based on a cooperative game that can recognize a player's cooperative action accurately and in real time.
The present invention provides a man-machine interaction method based on a cooperative game, comprising the following steps: A1, a depth image acquisition device obtains depth images of the real player and the environment in which the player is located; A2, a node control unit prompts the player with the end time of the cooperative action; A3, the player makes the cooperative action according to the prompt of the node control unit; A4, an extraction unit tracks and recognizes throughout the process and extracts the real player and the cooperative action from the depth images; A5, an execution unit judges whether the cooperative action made by the real player meets the requirement; if it does, the next step A6 is executed, otherwise step A2 is re-executed; A6, a judgment unit judges the cooperation result according to the game rules, and the cooperation result is then shown on a display.
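Purely for illustration, the A1-A6 loop described above can be sketched as follows; every object and method name here (camera, node_control, extractor, and so on) is a hypothetical placeholder for the corresponding unit, not an API defined by the patent.

```python
# Minimal sketch of the A1-A6 flow; all helpers are hypothetical placeholders.

def cooperative_game_round(camera, node_control, extractor, executor, judge, display):
    # A1: the depth image acquisition device continuously captures depth images
    # of the real player and the surrounding environment
    stream = camera.start_stream()

    while True:
        # A2: the node control unit prompts the end time of the cooperative action
        end_time = node_control.prompt_end_time()

        # A3: the player performs the cooperative action in front of the camera
        frames = stream.frames_until(end_time)

        # A4: track throughout the process and extract the player and the action
        player, action = extractor.extract(frames)

        # A5: does the action meet the requirement (stationary at the end time,
        # conforming to the standard)? If not, go back to A2.
        if executor.meets_requirement(action, end_time):
            break

    # A6: judge the cooperation result according to the game rules and display it
    result = judge.evaluate(player, action)
    display.show(result)
    return result
```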
Preferably, the cooperative game includes a hand-guessing game, and the cooperative action includes a guessing action.
Preferably, the cooperative action includes a human skeleton posture and/or a detail shape; further preferably, the detail shape includes one or more of a hand shape, a head shape and a foot shape.
Preferably, the player in steps A2 and A3 includes at least two parties; the player includes one or both of a real player and a first virtual player set by the system itself.
Preferably, the end time in step A2 includes an end time set by the system itself or an end time dictated orally by the real player.
Preferably, in step A4, the extraction unit tracking and recognizing throughout the process and extracting the real player and the cooperative action from the depth images comprises: the extraction unit extracts the human body contour from the depth image with an edge detection algorithm, and then recognizes the human skeleton posture and/or detail shape with a random forest (a brief sketch of the contour-extraction step is given after the preferred features below).
Preferably, in step A5, the execution unit judging whether the cooperative action made by the real player meets the requirement comprises: judging whether the cooperative action is stationary at the end time and/or judging whether the cooperative action conforms to the standard.
Preferably, the method further includes: a mapping module maps the real player and the cooperative action, in their entirety, onto a second virtual player.
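As an illustration of the contour-extraction part of step A4, the sketch below applies a standard edge detector to a normalized depth map with OpenCV; the threshold values and the assumption that the player is the largest foreground contour are illustrative choices, not taken from the patent.

```python
import cv2
import numpy as np

def extract_player_contour(depth_mm: np.ndarray, max_range_mm: int = 4000):
    """Extract the human body contour from a single-channel depth image (in mm)."""
    # Normalize depth to 8 bit so a standard edge detector can be applied
    depth_8u = cv2.convertScaleAbs(depth_mm, alpha=255.0 / max_range_mm)

    # Edge detection on the depth map; thresholds are illustrative
    edges = cv2.Canny(depth_8u, 30, 90)

    # Close small gaps and take the largest contour as the player silhouette
    edges = cv2.morphologyEx(edges, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(edges, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)
```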
The present invention also provides a human-computer interaction system based on a cooperative game. The system includes a depth image acquisition device, a processor and a display, and the processor includes a node control unit, an extraction unit, an execution unit and a judgment unit. The depth image acquisition device is used to obtain depth images of the real player and the environment in which the player is located; the node control unit is used to prompt the player with the end time of the cooperative action; the extraction unit is used to track and recognize throughout the process and to extract the real player and the cooperative action from the depth images; the execution unit judges whether the cooperative action made by the real player meets the requirement; the judgment unit is used to judge the cooperation result; and the display is used to show the cooperative scene, process and result.
Preferably, the depth image acquisition device includes one or more of a depth camera based on binocular vision, a depth camera based on structured light, and a depth camera based on TOF (time of flight).
Preferably, the cooperative game includes a hand-guessing game, and the cooperative action includes a guessing action.
Preferably, the cooperative action includes a human skeleton posture and/or a detail shape; further preferably, the detail shape includes one or more of a hand shape, a head shape and a foot shape.
Preferably, the system further includes a voice module; the voice module is used to recognize the voice of the real player and/or to issue prompts such as the end-time prompt.
Preferably, the system further includes a network interface module; the network interface module is used to connect to the Internet and to establish connections between multiple cooperative game systems.
Preferably, the system further includes a mapping module; the mapping module is used to map the real player and the cooperative action, in their entirety, onto a second virtual player.
The beneficial effects of the invention are as follows: a man-machine interaction method and system based on a cooperative game are proposed in which the depth image acquisition device captures depth images of the real player, so that three-dimensional motion is acquired; the extraction unit then extracts the real player and the cooperative action, which effectively overcomes the occlusion problem during play as well as the real-time and accuracy problems of an ordinary RGB camera.
Preferred schemes have the following further beneficial effect: the mapping module maps the real player and the cooperative action onto a second virtual player, and the display then shows the second virtual player in full, synchronizing the motion of the real player and the second virtual player. Reality is thus combined with virtuality, the real player's sense of interaction within the game system is strengthened, human-computer interaction is achieved, and the player is given a genuine gaming experience.
Brief description of the drawings
Fig. 1 is a schematic flowchart of a man-machine interaction method based on a hand-guessing game according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a human-computer interaction system based on a hand-guessing game according to an embodiment of the present invention.
Specific embodiments
The present invention is described in further detail below with reference to specific embodiments and the accompanying drawings. It should be emphasized that the following description is merely exemplary and is not intended to limit the scope of the present invention or its applications.
Cooperative games come in many varieties, including cooperation games, hand-guessing games, cooperative rescue games, and so on; all of them require two or more parties to act in coordination within a certain period of time.
The most common case, the hand-guessing game, is taken as an example. A man-machine interaction method based on a hand-guessing game, as shown in Fig. 1, comprises the following concrete steps: the depth image acquisition device obtains depth images of the real player and the environment in which the player is located; the node control unit prompts the player with the end time of the guessing action; the real player makes the guessing action according to the prompt of the node control unit; the extraction unit tracks and recognizes throughout the process and extracts the real player and the guessing action from the depth images; the mapping module maps the real player and the guessing action, in their entirety, onto the second virtual player; the execution unit judges whether the guessing action made by the real player meets the requirement; if it does, the next step is executed, otherwise the display indicates that the guessing action does not meet the requirement and depth images are acquired again; the judgment unit judges the guessing result according to the game rules, and the guessing result is then shown on the display.
Hand-guessing games also come in many varieties, including finger-guessing games based on hand gestures and guessing games based on the posture of both feet. In the present invention, "guessing" is understood in the broad sense: it includes both the traditional two-party guessing between two opponents and guessing among more than two people.
Embodiment 1
The present embodiment is described using the most common hand-guessing game judged by the details of the hand, namely the finger-guessing game.
As shown in Fig. 2, the finger-guessing game system consists of a depth image acquisition device, a processor, a display, a mapping module and a voice module. In practical applications, these parts may be integrated into a single device to form a dedicated finger-guessing game machine, or the finger-guessing game system may be formed by connecting to a computer through a network interface module.
The depth image acquisition device is used to obtain depth images of the real player and the environment in which the player is located. Depth image acquisition devices currently come in three main forms: depth cameras based on binocular vision, depth cameras based on structured light, and depth cameras based on TOF (time of flight). Any of these forms may be used in this finger-guessing game system.
A depth camera based on binocular vision uses binocular vision technology: two cameras at different viewing angles photograph the same space. The difference in the pixel positions of the same object in the two images is directly related to the depth at which the object is located, so the depth information can be obtained through image processing by computing the pixel disparity.
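For reference, the depth of a point under this binocular model follows directly from the pixel disparity: with focal length f (in pixels), baseline B between the two cameras and disparity d, the depth is Z = f·B/d. A one-function sketch with illustrative numbers:

```python
def depth_from_disparity(disparity_px: float, focal_px: float, baseline_m: float) -> float:
    """Binocular depth: Z = f * B / d for the same point seen by both cameras."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_px * baseline_m / disparity_px

# Example (illustrative numbers): f = 600 px, baseline = 7.5 cm, disparity = 15 px
print(depth_from_disparity(15.0, 600.0, 0.075))  # 3.0 metres
```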
A depth camera based on structured light projects a coded structured-light pattern into the target space, captures an image of the target space containing the structured-light pattern with a camera, and then obtains the depth information directly by processing this image, for example by matching it against a reference structured-light image.
A depth camera based on TOF emits laser pulses into the target space; after a pulse is reflected by the target, it is received by a receiving unit, the round-trip time of the pulse is recorded, and the depth information of the target is computed from this time.
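The time-of-flight relation is equally simple: with round-trip time Δt of the laser pulse and the speed of light c, the target depth is d = c·Δt/2. A brief sketch:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def tof_depth(round_trip_time_s: float) -> float:
    """Depth from a ToF measurement: the pulse travels to the target and back."""
    return SPEED_OF_LIGHT * round_trip_time_s / 2.0

# Example: a 20 ns round trip corresponds to roughly 3 metres
print(tof_depth(20e-9))  # ~3.0
```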
Of these three approaches, the first usually employs color cameras and is therefore strongly affected by illumination, and the computation required to obtain the depth information is comparatively heavy. The latter two usually use infrared light, are not affected by illumination, and require relatively little computation, so they are better suited to this finger-guessing game system.
After receiving the depth images transmitted by the depth image acquisition device, the processor processes them, controls the game flow and judges the result. The processor includes the node control unit, the extraction unit, the execution unit and the judgment unit.
The node control unit is used to prompt the players with the end time of the guessing action. There are at least two players, including one or both of a real player and a first virtual player set by the system itself. In a man-machine match, one side is the first virtual player set by the system itself, so there is no need to judge whether the throw of this first virtual player meets the requirement; the processor simply makes the first virtual player throw at the end time. In a finger-guessing game in which two or more real players take part, the players' throw times must first be nearly simultaneous. To achieve this, a throw time point, i.e. the end time of the guessing action, can be set by the system itself. This end time can be announced by the voice module with a reminder such as "ready, go" or "3, 2, 1, go", or it can be determined by a voice command dictated by the real player. Since it cannot be guaranteed that different real players throw at exactly the same moment, throws that follow one another within a minimum period that the human eye cannot distinguish are regarded as meeting the requirement; the end time here can therefore be regarded as a short time window.
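A sketch of the "end time as a short window" idea: the throws of several real players count as simultaneous when all of their timestamps fall within a tolerance below what the human eye can distinguish. The window length used here is an illustrative assumption.

```python
def throws_are_simultaneous(throw_times_s, end_time_s, window_s=0.1):
    """All players' throws count as valid if each lands within a short
    window around the announced end time (window_s is illustrative)."""
    return all(abs(t - end_time_s) <= window_s for t in throw_times_s)

# Example: end time announced at t = 5.0 s
print(throws_are_simultaneous([4.97, 5.03, 5.06], 5.0))  # True
print(throws_are_simultaneous([4.97, 5.30], 5.0))        # False
```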
The extraction unit tracks and recognizes the real player throughout the game and extracts the real player and the guessing action. Specifically, the extraction unit first extracts the human body contour from the depth image with an edge detection algorithm; it then uses a random forest, a combination of decision-tree models trained on a large amount of data, to recognize the human skeleton posture and/or detail shape. For example, the finger model of the hand is recognized following the principle used to recognize the human skeleton model, and the finger model serves as the basis for judging the finger-guessing game.
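The per-pixel random-forest stage could look roughly like the sketch below, which feeds simple depth-difference features to a trained forest; the feature definition, the label set and the use of scikit-learn are assumptions made for illustration — the patent only states that a random forest trained on a large amount of data is used.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Illustrative label set: 0 = background, 1 = torso, 2 = head, 3 = hand, ...
N_FEATURES = 32                      # depth-difference features per pixel
OFFSETS = np.random.default_rng(0).integers(-40, 40, size=(N_FEATURES, 2))

def pixel_features(depth: np.ndarray, y: int, x: int) -> np.ndarray:
    """Simple depth-difference features around pixel (y, x)."""
    h, w = depth.shape
    center = float(depth[y, x])
    feats = np.empty(N_FEATURES, dtype=np.float32)
    for i, (dy, dx) in enumerate(OFFSETS):
        yy = int(np.clip(y + dy, 0, h - 1))
        xx = int(np.clip(x + dx, 0, w - 1))
        feats[i] = float(depth[yy, xx]) - center
    return feats

# The forest would be trained offline on a large labelled depth-image dataset
forest = RandomForestClassifier(n_estimators=50, max_depth=12)

def classify_pixels(depth: np.ndarray, pixels) -> np.ndarray:
    """Predict a body-part label for each (y, x) pixel inside the body contour
    (forest.fit(...) must have been called on training data beforehand)."""
    X = np.stack([pixel_features(depth, y, x) for (y, x) in pixels])
    return forest.predict(X)
```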
The execution unit judges whether the real player's throw meets the requirement by judging whether the real player's gesture is stationary at the end time: once the end time is reached, a real player whose gesture is stationary at that moment is considered to have thrown in time and to meet the requirement. There are several concrete ways of judging whether a throw meets the requirement; for example, if the point where the hand is located in the human skeleton does not move noticeably across several consecutive frames near the end time, the gesture can be regarded as stationary, i.e. the throw meets the requirement; otherwise it is regarded as not meeting the requirement.
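A sketch of that stillness test: track the hand-joint position over the few frames around the end time and accept the throw if the joint has not moved noticeably. The frame count and the distance threshold are illustrative assumptions.

```python
import numpy as np

def hand_is_stationary(hand_positions_mm, max_displacement_mm: float = 20.0) -> bool:
    """hand_positions_mm: (x, y, z) hand-joint positions, in mm, for the
    consecutive frames closest to the end time. The throw counts as
    stationary if no frame-to-frame displacement exceeds the threshold."""
    pts = np.asarray(hand_positions_mm, dtype=float)
    if len(pts) < 2:
        return True
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)
    return bool(np.all(steps <= max_displacement_mm))

# Example: three frames around the end time, hand essentially still
print(hand_is_stationary([(100, 200, 900), (101, 199, 902), (100, 201, 901)]))  # True
```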
Once the throws of all the real players meet the requirement, the execution unit judges whether the players' guessing actions conform to the standard, using a guessing-action library or a learned model as the basis for judgment. For the finger-guessing game, the valid finger model of each real player is identified first, the finger models are then classified with a trained classifier, and finally the classified model of each player is checked for conformity to the standard. The valid finger model here is the finger model presented by the player at the end time.
The mapping module is used to map the real player and the guessing action onto the second virtual player. The second virtual player is the role corresponding to the real player created in the virtual game; after mapping, the human skeleton posture and detail shape of the second virtual player correspond one-to-one with those of the real player, so the second virtual player becomes the real player's avatar in the hand-guessing game. Several second virtual players may exist in the same virtual game environment. For remote hand-guessing between real players, the connection between multiple hand-guessing systems is established through the network interface module.
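A sketch of that one-to-one mapping onto the second virtual player: each joint of the avatar simply mirrors the corresponding tracked joint of the real player. The joint names and the avatar interface are illustrative assumptions.

```python
# Illustrative joint names; the patent only requires a one-to-one
# correspondence between the real player's skeleton and the avatar.
JOINTS = ("head", "neck", "left_hand", "right_hand", "left_foot", "right_foot")

class SecondVirtualPlayer:
    """In-game avatar whose pose mirrors the tracked real player."""
    def __init__(self):
        self.pose = {j: (0.0, 0.0, 0.0) for j in JOINTS}
        self.hand_shape = None   # e.g. the recognized finger model

    def update_from(self, skeleton: dict, hand_shape=None):
        # One-to-one mapping: copy every tracked joint to the avatar
        for joint in JOINTS:
            if joint in skeleton:
                self.pose[joint] = skeleton[joint]
        self.hand_shape = hand_shape
```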
The judgment unit judges the guessing result according to the game rules. Besides the guessing result, the display also shows the guessing scene, the whole hand-guessing process and its result; the whole hand-guessing process includes the complete game played by the first virtual player and the second virtual player.
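For the finger-guessing case, the game-rule judgment reduces to the familiar rock-paper-scissors relation. The sketch below handles two or more players and reports a draw when the round cannot be decided; the shape labels are illustrative.

```python
BEATS = {"rock": "scissors", "scissors": "paper", "paper": "rock"}

def judge_round(throws: dict) -> str:
    """throws maps each player name to 'rock', 'paper' or 'scissors'.
    Returns the winner's name, or 'draw' if the round is undecided."""
    shapes = set(throws.values())
    # A round is decided only when exactly two distinct shapes appear
    if len(shapes) != 2:
        return "draw"
    a, b = shapes
    winning_shape = a if BEATS[a] == b else b
    winners = [p for p, s in throws.items() if s == winning_shape]
    # With several winners the round would simply be replayed in multiplayer guessing
    return winners[0] if len(winners) == 1 else "draw"

# Example: real player vs. first virtual player
print(judge_round({"real_player": "rock", "virtual_player": "scissors"}))  # real_player
```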
To provide a better gaming experience, the system also includes a voice module, which is used to recognize the voice of the real player and to issue voice reminders for the guessing system, such as the end-time reminder.
Embodiment 2
The difference from Embodiment 1 is that the hand-guessing game is judged by the posture of the whole body, for example the combined shape of both hands and both feet. For this kind of hand-guessing game, the judgment unit does not need to consider the shape of the human hand or other details when deciding the winner.
The above content further describes the present invention with reference to specific/preferred embodiments, but it should not be concluded that the specific implementation of the present invention is limited to these descriptions. A person of ordinary skill in the technical field of the present invention may also make several substitutions or modifications to the described embodiments without departing from the inventive concept, and such substitutions or variants shall all be regarded as falling within the scope of protection of the present invention.

Claims (10)

1. A man-machine interaction method based on a cooperative game, characterized in that it comprises the following steps:
A1, a depth image acquisition device obtains depth images of the real player and the environment in which the player is located;
A2, a node control unit prompts the player with the end time of the cooperative action;
A3, the player makes the cooperative action according to the prompt of the node control unit;
A4, an extraction unit tracks and recognizes throughout the process and extracts the real player and the cooperative action from the depth images;
A5, an execution unit judges whether the cooperative action made by the real player meets the requirement; if it does, the next step A6 is executed, otherwise step A2 is re-executed;
A6, a judgment unit judges the cooperation result, and the cooperation result is then shown on a display.
2. The man-machine interaction method based on a cooperative game according to claim 1, characterized in that the cooperative game includes a hand-guessing game and the cooperative action includes a guessing action.
3. The man-machine interaction method based on a cooperative game according to claim 1, characterized in that the cooperative action includes a human skeleton posture and/or a detail shape, the detail shape including one or more of a hand shape, a head shape and a foot shape.
4. The man-machine interaction method based on a cooperative game according to claim 1, characterized in that the player in steps A2 and A3 includes at least two parties, the player including one or both of a real player and a first virtual player set by the system itself.
5. The man-machine interaction method based on a cooperative game according to claim 1, characterized in that the end time in step A2 includes an end time set by the system itself or an end time dictated orally by the real player.
6. The man-machine interaction method based on a cooperative game according to claim 1, characterized in that, in step A5, the execution unit judging whether the cooperative action made by the real player meets the requirement includes: judging whether the cooperative action is stationary at the end time and/or judging whether the cooperative action conforms to the standard.
7. The man-machine interaction method based on a cooperative game according to any one of claims 1 to 6, characterized in that the method further includes: a mapping module maps the real player and the cooperative action, in their entirety, onto a second virtual player.
8. A human-computer interaction system based on a cooperative game, characterized in that it includes a depth image acquisition device, a processor and a display, the processor including a node control unit, an extraction unit, an execution unit and a judgment unit; wherein the depth image acquisition device is used to obtain depth images of the real player and the environment in which the player is located; the node control unit is used to prompt the player with the end time of the cooperative action; the extraction unit is used to track and recognize throughout the process and to extract the real player and the cooperative action from the depth images; the execution unit is used to judge whether the cooperative action made by the real player meets the requirement; the judgment unit is used to judge the cooperation result; and the display is used to show the cooperative scene, process and result.
9. The human-computer interaction system based on a cooperative game according to claim 8, characterized in that the system further includes a mapping module, the mapping module being used to map the real player and the cooperative action, in their entirety, onto a second virtual player.
10. The human-computer interaction system based on a cooperative game according to claim 8, characterized in that the system further includes a voice module and/or a network interface module, the voice module being used to recognize the voice of the real player and/or to issue the end-time prompt, and the network interface module being used to connect to the Internet and to establish connections between multiple cooperative game systems.
CN201610833471.XA 2016-09-19 2016-09-19 Man-machine interaction method and system based on cooperative game Pending CN106371607A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610833471.XA CN106371607A (en) 2016-09-19 2016-09-19 Man-machine interaction method and system based on cooperative game


Publications (1)

Publication Number Publication Date
CN106371607A (en) 2017-02-01

Family

ID=57898429

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610833471.XA Pending CN106371607A (en) 2016-09-19 2016-09-19 Man-machine interaction method and system based on cooperative game

Country Status (1)

Country Link
CN (1) CN106371607A (en)


Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102279644A (en) * 2010-06-08 2011-12-14 华宝通讯股份有限公司 Interactive system
CN102614663A (en) * 2011-01-30 2012-08-01 德信互动科技(北京)有限公司 Device for achieving multiplayer game
CN103116686A (en) * 2011-11-17 2013-05-22 苏州蜗牛数字科技股份有限公司 Method of competing movements in network games
CN103419204A (en) * 2012-05-22 2013-12-04 林其禹 Finger guessing game robot system
CN104063573A (en) * 2013-03-20 2014-09-24 玩酷科技股份有限公司 Method for conducting cooperative behavior through joint efforts of multiple players
CN105468249A (en) * 2014-09-09 2016-04-06 联胜(中国)科技有限公司 Intelligent interaction system and control method therefor
CN104606882A (en) * 2014-12-31 2015-05-13 南宁九金娃娃动漫有限公司 Motion sensing game interaction method and system

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107463659A (en) * 2017-07-31 2017-12-12 广东欧珀移动通信有限公司 Object search method and its device
CN107463659B (en) * 2017-07-31 2020-07-17 Oppo广东移动通信有限公司 Object searching method and device
CN107551549A (en) * 2017-08-09 2018-01-09 广东欧珀移动通信有限公司 Video game image method of adjustment and its device
CN107643890A (en) * 2017-08-09 2018-01-30 广东欧珀移动通信有限公司 Scene of game construction method and device
CN110119547A (en) * 2019-04-28 2019-08-13 腾讯科技(深圳)有限公司 A kind of prediction group defeats negative method, apparatus and control equipment
CN110119547B (en) * 2019-04-28 2021-07-30 腾讯科技(深圳)有限公司 Method, device and control equipment for predicting group war victory or defeat

Similar Documents

Publication Publication Date Title
US9245177B2 (en) Limiting avatar gesture display
CN102449576B (en) Gesture shortcuts
US8542252B2 (en) Target digitization, extraction, and tracking
CN103347437B (en) Gaze detection in 3D mapping environment
CN102301311B (en) Standard gestures
US8451278B2 (en) Determine intended motions
EP2409277B1 (en) Chaining animations
JP5943913B2 (en) User tracking feedback
US8744121B2 (en) Device for identifying and tracking multiple humans over time
CA2753051C (en) Virtual object manipulation
US20110025689A1 (en) Auto-Generating A Visual Representation
CN106125903B (en) Multi-person interaction system and method
CN102301315A (en) gesture recognizer system architecture
CN102129293A (en) Tracking groups of users in motion capture system
CN102314595A (en) Be used to improve the RGB/ degree of depth camera of speech recognition
CN102665838A (en) Methods and systems for determining and tracking extremities of a target
CN102207771A (en) Intention deduction of users participating in motion capture system
CN103038727A (en) Skeletal joint recognition and tracking system
CN106371607A (en) Man-machine interaction method and system based on cooperative game
WO2010126714A2 (en) Show body position
CN102222431A (en) Hand language translator based on machine
WO2013059751A1 (en) Calculating metabolic equivalence with a computing device
CN103207667A (en) Man-machine interaction control method and application thereof
CN102591456A (en) Detection of body and props
Peng et al. Design and Implementation of Multi-mode Natural Interaction of Game Animation Characters in Mixed Reality: A Novel User Experience Method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 11-13 / F, joint headquarters building, high tech Zone, 63 Xuefu Road, Yuehai street, Nanshan District, Shenzhen, Guangdong 518000

Applicant after: Obi Zhongguang Technology Group Co., Ltd

Address before: A808, Zhongdi building, industry university research base, China University of Geosciences, No.8, Yuexing Third Road, Nanshan District, Shenzhen, Guangdong 518000

Applicant before: SHENZHEN ORBBEC Co.,Ltd.

CB02 Change of applicant information
RJ01 Rejection of invention patent application after publication

Application publication date: 20170201

RJ01 Rejection of invention patent application after publication