CN206224385U - Motion capture system with positioning function for virtual reality environments - Google Patents

Motion capture system with positioning function for virtual reality environments

Publication number
CN206224385U
Authority
CN
China
Prior art keywords
inertial sensor
user
motion capture
capture system
positioning
Prior art date
Legal status
Active
Application number
CN201621168446.6U
Other languages
Chinese (zh)
Inventor
周言明
黄昌正
王磊
韦伟
陈曦
Current Assignee
Guangzhou Science And Technology Co Ltd
Original Assignee
Guangzhou Science And Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangzhou Science And Technology Co Ltd
Priority to CN201621168446.6U
Application granted
Publication of CN206224385U
Legal status: Active
Anticipated expiration

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

A motion capture system with positioning function for virtual reality environments, comprising an arm section, a wrist section, a glove body, and first, second, and third fingerstalls connected to the glove body, characterised in that the arm section includes a power module, a positioning-and-mapping module, and a communication module. With this motion capture system, a virtual reality system can obtain motion information for the user's upper arm, forearm, wrist, palm, and three fingers, and can thus determine the attitude of the arm wearing the glove; the user then sees the complete arm motion reproduced on the display in virtual reality, improving the user experience. In addition, the positioning-and-mapping module supplies spatial position and obstacle information about the user's surroundings, so the user can move within the virtual reality scene while avoiding collisions with nearby objects. This improves safety and further enhances the user experience.

Description

A motion capture system with positioning function for virtual reality environments
Technical field
The utility model relates to the technical field of virtual reality, and in particular to a motion capture system with positioning function for virtual reality environments.
Background technology
Five-finger motion capture gloves in virtual reality systems are usually based on inertial sensors: sensors placed on the dorsal side of the wrist, the back of the hand, the second knuckle of the thumb, the second and third knuckles of the index finger, and the second knuckles of the middle finger, ring finger, and little finger determine the position and attitude of the forearm, palm, and corresponding fingers.
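Each of these inertial units tracks its segment's attitude chiefly by integrating its gyroscope output. The sketch below illustrates that step with a minimal first-order quaternion integration; it is an illustrative assumption, not the patent's algorithm, and real capture gloves additionally fuse accelerometer and magnetometer data to limit drift.

```python
import numpy as np

def integrate_gyro(q, omega, dt):
    """Advance orientation quaternion q = [w, x, y, z] by one gyroscope
    sample omega = [wx, wy, wz] (rad/s, body frame) over dt seconds,
    using first-order integration of q_dot = 0.5 * q (x) (0, omega)."""
    w, x, y, z = q
    ox, oy, oz = omega
    q_dot = 0.5 * np.array([
        -x * ox - y * oy - z * oz,
         w * ox + y * oz - z * oy,
         w * oy - x * oz + z * ox,
         w * oz + x * oy - y * ox,
    ])
    q_new = q + q_dot * dt
    return q_new / np.linalg.norm(q_new)   # keep it a unit quaternion

# Rotate at 90 deg/s about z for one second, sampled at 1 kHz
q = np.array([1.0, 0.0, 0.0, 0.0])
omega = np.array([0.0, 0.0, np.pi / 2])
for _ in range(1000):
    q = integrate_gyro(q, omega, 0.001)
# q ≈ [cos(pi/4), 0, 0, sin(pi/4)] ≈ [0.707, 0, 0, 0.707]
```

Pure gyro integration like this accumulates error without bound, which is exactly the drift problem the patent later addresses with optical correction.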
Prior-art motion capture gloves can only determine the position of the wrist, palm, and fingers relative to the elbow; they cannot determine the position and attitude of the upper arm, so human-computer interaction in an immersive virtual reality environment feels unnatural. Moreover, existing motion capture gloves require an inertial sensor on every finger, which hinders wearing and free finger movement and raises cost. The prior art also assumes that the shoulder is fixed relative to the head, so in practice actions such as turning or tilting the head make the interaction data inaccurate and introduce large position errors; prolonged sensor use further degrades the accuracy of the detected limb motion. In addition, the prior art cannot virtualise the user's actual surroundings, so the user experience is poor.
Content of the utility model
In view of the shortcomings of the prior art, the utility model provides a motion capture system with positioning function for virtual reality environments.
The technical solution of the utility model is: a motion capture system with positioning function for virtual reality environments, including an arm section, a wrist section, a glove body, and first, second, and third fingerstalls connected to the glove body. A first inertial sensor is provided in the first fingerstall, a second inertial sensor in the second fingerstall, and a third inertial sensor in the third fingerstall; a fourth inertial sensor is provided on the glove body, and a fifth inertial sensor on the dorsal side of the wrist section. The system is characterised in that the arm section includes a power module, a positioning-and-mapping module, and a communication module; the positioning-and-mapping module is connected to the power module and the communication module, the communication module is wirelessly connected to a terminal, and the terminal is connected to a head-mounted display. An infrared filter is provided on the head-mounted display, and an infrared LED matched to the infrared filter is additionally provided on the glove body; the infrared light emitted by the LED is captured through the filter to determine the position of the glove body, further improving the accuracy of the captured limb motion.
The positioning-and-mapping module includes a processor, an SoC, a motion-tracking camera, a 3D depth camera, and a sixth inertial sensor; the processor is connected to each of them. The processor obtains image information from the motion-tracking camera and depth information about objects in front of the user from the 3D depth camera, performs the computation on the SoC, transmits the resulting spatial information about the user's surroundings to the terminal through the communication module, and displays the user's spatial position on the head-mounted display.
The processor is also connected to the first, second, third, fourth, and fifth inertial sensors. It receives user limb-motion information collected by the first through sixth inertial sensors, determines from it the limb motion of the first, second, and third fingers, palm, forearm, and upper arm of the wearer, and generates from the determined limb motion the attitude, position, and gesture information of the virtual arm and back of the hand. The generated attitude, position, and gesture information is transmitted to the terminal through the wireless module and synchronously reproduced on the head-mounted display.
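To place the arm in space from those per-segment attitudes, the processor can chain segment orientations outward from the shoulder. The following is a hypothetical forward-kinematics sketch of that step; the segment lengths, rest pose, and all function names are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def quat_rotate(q, v):
    """Rotate vector v by unit quaternion q = [w, x, y, z]
    (v' = v + 2*u x (u x v + w*v), the standard expansion)."""
    w, x, y, z = q
    u = np.array([x, y, z])
    return v + 2.0 * np.cross(u, np.cross(u, v) + w * v)

def arm_positions(shoulder, q_upper, q_fore, upper_len=0.30, fore_len=0.25):
    """Chain the upper-arm and forearm sensor orientations from the
    shoulder position to obtain elbow and wrist positions (metres)."""
    bone = np.array([0.0, 0.0, -1.0])   # bones hang along -z in the rest pose
    elbow = shoulder + upper_len * quat_rotate(q_upper, bone)
    wrist = elbow + fore_len * quat_rotate(q_fore, bone)
    return elbow, wrist

# Identity orientations: the arm hangs straight down from the shoulder
shoulder = np.array([0.0, 0.0, 1.5])
identity = np.array([1.0, 0.0, 0.0, 0.0])
elbow, wrist = arm_positions(shoulder, identity, identity)
# elbow ≈ [0, 0, 1.2], wrist ≈ [0, 0, 0.95]
```

Extending the chain with the hand and finger sensors follows the same pattern, one segment at a time.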
The power module is also connected to the first, second, third, fourth, fifth, and sixth inertial sensors.
The terminal is a mobile phone terminal or a PC terminal.
The beneficial effects of the utility model are: through the above motion capture system, the virtual reality system can obtain motion information for the user's upper arm, forearm, wrist, palm, and three fingers, and can thereby determine the attitude of the arm wearing the glove; the user then sees the complete arm motion on the display in virtual reality, improving the user experience. The positioning-and-mapping module provides spatial position and obstacle information, so the user can move within the virtual reality scene while avoiding collisions with nearby objects, improving safety. Meanwhile, at a fixed frequency, an infrared image of the back of the hand is acquired using the infrared camera on the helmet and the infrared LED, and an image-processing algorithm yields the spatial position and attitude of the back of the hand relative to the head-mounted camera. This optical result is used to dynamically calibrate the arm and hand attitude model obtained by the motion capture system described above, solving the attitude-offset problem caused by accumulated drift error in the inertial sensors, further improving the precision of the virtualised limb motion, and further enhancing the user experience.
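The dynamic calibration described above — an absolute optical fix periodically cancelling accumulated gyroscope drift — is in spirit a complementary filter. A minimal one-axis sketch follows; the blend factor, update rates, and drift magnitude are invented for illustration and are not values from the patent.

```python
def fuse_yaw(imu_yaw, optical_yaw, alpha=0.9):
    """Blend a smooth-but-drifting IMU yaw estimate with an absolute but
    intermittent optical yaw estimate; both in radians."""
    if optical_yaw is None:            # IR marker not visible this frame
        return imu_yaw
    return alpha * imu_yaw + (1.0 - alpha) * optical_yaw

# The gyro accumulates 0.01 rad of drift per frame while the true yaw
# is 0; the optical fix is only available every 10th frame.
est = 0.0
for frame in range(500):
    est += 0.01                                   # drift accumulates
    optical = 0.0 if frame % 10 == 0 else None    # intermittent IR fix
    est = fuse_yaw(est, optical)
# est stays near 1 rad; without fusion it would have drifted to 5 rad
```

The drift is no longer unbounded: each optical update pulls the estimate back toward the true value, so the error settles at a small steady state instead of growing with time.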
Brief description of the drawings
Fig. 1 is a structural diagram of the utility model;
Fig. 2 is a structural diagram of the positioning-and-mapping module of the utility model.
In the figures: 1 - first inertial sensor; 2 - second inertial sensor; 3 - third inertial sensor; 4 - glove body; 41 - fourth inertial sensor; 42 - infrared LED; 5 - wrist section; 51 - fifth inertial sensor; 6 - arm section; 61 - power module; 62 - communication module; 63 - positioning-and-mapping module; 7 - head-mounted display.
Specific embodiment
Specific embodiments of the utility model are described further below in conjunction with the accompanying drawings:
As shown in Fig. 1 and Fig. 2, a motion capture system with positioning function for virtual reality environments includes an arm section 6, a wrist section 5, a glove body 4, and first, second, and third fingerstalls connected to the glove body 4. A first inertial sensor 1 is provided in the first fingerstall, a second inertial sensor 2 in the second fingerstall, and a third inertial sensor 3 in the third fingerstall; a fourth inertial sensor 41 is provided on the glove body 4, and a fifth inertial sensor 51 on the dorsal side of the wrist section 5. The arm section 6 includes a power module 61, a positioning-and-mapping module 63, and a communication module 62; the positioning-and-mapping module 63 is connected to the power module 61 and the communication module 62, the communication module 62 is wirelessly connected to a terminal, and the terminal is connected to a head-mounted display 7. An infrared filter is provided on the head-mounted display 7, and an infrared LED 42 matched to the filter is additionally provided on the glove body 4; the infrared light emitted by the LED 42 is captured through the filter to determine the position of the glove body 4, further improving the accuracy of the captured limb motion.
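Because the filter on the head-mounted display passes only the LED's wavelength, locating the glove in the camera image essentially reduces to finding one bright blob. A sketch of that step follows; the frame format, threshold, and function name are assumptions for illustration, not the patent's method.

```python
import numpy as np

def led_centroid(ir_frame, threshold=200):
    """Return the (u, v) pixel centroid of the bright IR blob in a
    band-pass-filtered camera frame, or None if the LED is not visible."""
    ys, xs = np.nonzero(ir_frame > threshold)
    if len(xs) == 0:
        return None
    return float(xs.mean()), float(ys.mean())

# Synthetic 8-bit frame with a 3x3 saturated blob centred at (12, 7)
frame = np.zeros((20, 20), dtype=np.uint8)
frame[6:9, 11:14] = 255
print(led_centroid(frame))   # → (12.0, 7.0)
```

In a real system this 2-D image position, together with the known camera geometry, constrains the glove's position relative to the head.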
The positioning-and-mapping module 63 includes a processor, an SoC, a motion-tracking camera, a 3D depth camera, and a sixth inertial sensor; the processor is connected to each of them. The infrared light emitted by the 3D depth camera is reflected when it meets an obstacle, and from the time difference between emission and return the spatial information around the user is calculated. The processor obtains image information from the motion-tracking camera and depth information about objects in front of the user from the 3D depth camera, performs the computation on the SoC, transmits the resulting spatial information to the terminal through the communication module, and displays the user's spatial position on the head-mounted display 7.
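The time-difference ranging just described is ordinary time-of-flight: half the round-trip time of the reflected infrared pulse, multiplied by the speed of light, gives the distance to the obstacle. A short numeric sketch:

```python
C = 299_792_458.0   # speed of light in m/s

def tof_distance(round_trip_seconds):
    """Distance to the reflecting obstacle from the round-trip time of
    an emitted IR pulse: the light travels the path twice, so halve it."""
    return C * round_trip_seconds / 2.0

# A round trip of about 13.34 ns corresponds to an obstacle ~2 m away
print(round(tof_distance(13.34e-9), 2))   # → 2.0
```

Commercial depth cameras resolve these sub-nanosecond timings per pixel (often via phase shift of a modulated signal rather than a raw pulse timer), producing a full depth image per frame.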
The processor is also connected to the first inertial sensor 1, the second inertial sensor 2, the third inertial sensor 3, the fourth inertial sensor 41, and the fifth inertial sensor 51. It receives user limb-motion information collected by these sensors and the sixth inertial sensor, determines from it the limb motion of the first, second, and third fingers, palm, forearm, and upper arm of the wearer, generates from the determined limb motion the attitude, position, and gesture information of the virtual objects, transmits the generated information to the terminal through the wireless module, and displays the generated attitude, position, and gesture on the head-mounted display 7.
The power module 61 is also connected to the first inertial sensor 1, the second inertial sensor 2, the third inertial sensor 3, the fourth inertial sensor 41, the fifth inertial sensor 51, and the sixth inertial sensor.
The terminal is a mobile phone terminal or a PC terminal.
The above embodiments and description merely illustrate the principle and preferred embodiments of the utility model. Without departing from the spirit and scope of the utility model, various changes and improvements are possible, and all such changes and improvements fall within the scope of the claimed utility model.

Claims (5)

1. A motion capture system with positioning function for virtual reality environments, including an arm section, a wrist section, a glove body, and first, second, and third fingerstalls connected to the glove body, wherein a first inertial sensor is provided in the first fingerstall, a second inertial sensor in the second fingerstall, a third inertial sensor in the third fingerstall, a fourth inertial sensor on the glove body, and a fifth inertial sensor on the dorsal side of the wrist section; characterised in that the arm section includes a power module, a positioning-and-mapping module, and a communication module; the positioning-and-mapping module is connected to the power module and the communication module; the communication module is wirelessly connected to a terminal; the terminal is connected to a head-mounted display; an infrared filter is provided on the head-mounted display; and an infrared LED matched to the infrared filter is additionally provided on the glove body.
2. The motion capture system with positioning function for virtual reality environments according to claim 1, characterised in that the positioning-and-mapping module includes a processor, an SoC, a motion-tracking camera, a 3D depth camera, and a sixth inertial sensor, the processor being connected to each of the SoC, the motion-tracking camera, the 3D depth camera, and the sixth inertial sensor.
3. The motion capture system with positioning function for virtual reality environments according to claim 1, characterised in that the power module is connected to the first, second, third, fourth, fifth, and sixth inertial sensors.
4. The motion capture system with positioning function for virtual reality environments according to claim 1, characterised in that the terminal is a mobile phone terminal or a PC terminal.
5. The motion capture system with positioning function for virtual reality environments according to claim 2, characterised in that the processor is also connected to the first, second, third, fourth, and fifth inertial sensors, and receives user limb-motion information collected by the first through sixth inertial sensors.
CN201621168446.6U 2016-11-02 2016-11-02 Motion capture system with positioning function for virtual reality environments Active CN206224385U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201621168446.6U CN206224385U (en) 2016-11-02 2016-11-02 Motion capture system with positioning function for virtual reality environments


Publications (1)

Publication Number Publication Date
CN206224385U (en) 2017-06-06

Family

ID=58788540

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201621168446.6U Active Motion capture system with positioning function for virtual reality environments

Country Status (1)

Country Link
CN (1) CN206224385U (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108509024A * 2018-01-25 2018-09-07 北京奇艺世纪科技有限公司 Data processing method and device based on a virtual reality device
CN109066416A * 2018-08-09 2018-12-21 深圳供电局有限公司 VR-based substation inspection system and method
CN109102731A * 2018-08-09 2018-12-28 深圳供电局有限公司 VR-based substation simulation platform and method
CN109116977A * 2017-06-22 2019-01-01 韩国电子通信研究院 Virtual experience content providing method and device therefor
CN109542210A * 2017-09-21 2019-03-29 福建天晴数码有限公司 Virtual-engine-based arm motion simulation and restoration method, and storage medium
CN110162172A * 2019-04-29 2019-08-23 太平洋未来科技（深圳）有限公司 Device for recognising motion postures
CN111338481A * 2020-02-28 2020-06-26 武汉灏存科技有限公司 Data interaction system and method based on whole-body motion capture


Similar Documents

Publication Publication Date Title
CN206224385U (en) Motion capture system with positioning function for virtual reality environments
CN104317403B (en) Wearable device for sign language recognition
CN206162394U (en) Motion capture system for virtual reality environments based on haptic feedback
EP2984541B1 (en) Near-plane segmentation using pulsed light source
CN103279186B (en) Multi-target motion capture system combining optical positioning and inertial sensing
EP3545385B1 (en) Wearable motion tracking system
CN106326881B (en) Gesture recognition method and gesture recognition device for human-computer interaction
CN104461013A (en) Human body movement reconstruction and analysis system and method based on inertial sensing units
WO2014071254A4 (en) Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing
KR20200074609A (en) Supporting method and system for home fitness
US20200201460A1 (en) Universal handheld controller of a computer system
CN103677289A (en) Intelligent interactive glove and interaction method
CN206162395U (en) Motion capture system for virtual reality environments
CN104771175B (en) Wearable smart ring for capturing the three-dimensional attitude of human limbs
CN104002307A (en) Wearable rescue robot control method and system
JP2023507241A (en) Proxy controller suit with arbitrary dual-range kinematics
CN113268141A (en) Motion capture method and device based on inertial sensors and fabric electronics
Jovanov et al. Avatar — a multi-sensory system for real-time body position monitoring
US20190339768A1 (en) Virtual reality interaction system and method
CN115777091A (en) Detection device and detection method
CN205450966U (en) Prop for a virtual reality system
Shi et al. Human motion capture system and its sensor analysis
CN111870249A (en) Human body posture tracking system based on micro inertial sensors and method of use
US20230140030A1 (en) Method, system and recording medium for accessory pairing
WO2022179279A1 (en) Interaction method, electronic device, and interaction system

Legal Events

Date Code Title Description
GR01 Patent grant