CN207198800U - A kind of VR three-dimensional experiencing systems of gesture identification - Google Patents

A kind of VR three-dimensional experiencing systems of gesture identification

Info

Publication number
CN207198800U
CN207198800U CN201721329642.1U CN207198800U
Authority
CN
China
Prior art keywords
gesture
unit
dimensional
ray receiver
identification
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201721329642.1U
Other languages
Chinese (zh)
Inventor
韩明
张培
赵科
裴学邦
段赫奎
徐玉静
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hebei Ya Heng Engineering Technology Co Ltd
Shijiazhuang University
Original Assignee
Hebei Ya Heng Engineering Technology Co Ltd
Shijiazhuang University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hebei Ya Heng Engineering Technology Co Ltd, Shijiazhuang University filed Critical Hebei Ya Heng Engineering Technology Co Ltd
Priority to CN201721329642.1U priority Critical patent/CN207198800U/en
Application granted granted Critical
Publication of CN207198800U publication Critical patent/CN207198800U/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)

Abstract

The utility model discloses a gesture-recognition VR three-dimensional experience system comprising a ray receiver unit, a gesture initial-state unit, and a gesture positioning unit. The gesture positioning unit comprises a gesture-information acquisition and analysis unit and a gesture recognition and application unit; the ray receiver unit comprises a ray receiver; and the gesture-information acquisition and analysis unit comprises a structured-light projector, an optical time-of-flight sensor, and multi-angle imaging cameras. The utility model directly obtains the three-dimensional position of the hand in space and the motion of the fingers, recognizes many kinds of gestures in real time, is simple to use and comfortable to operate, covers a wide working range, yields high-quality data, and is applicable to many fields.

Description

A gesture-recognition VR three-dimensional experience system
Technical field
The utility model relates to the field of VR technology, and specifically to a gesture-recognition VR three-dimensional experience system.
Background technology
Ever since humans first made tools, a means of contact between user and tool has been necessary; the handle of a tool is one such interface. In the age of electronics, interaction has become even more important: the remote control of an air conditioner, the keyboard and mouse of a computer, the gamepad of a game console. This is especially true of VR, which emphasizes immersion. Immersion arises from isolation from the outside world, above all visual and auditory isolation, which deceives the brain into a sense of presence detached from the real world. This creates a new problem: users cannot see their own bodies, in particular the hands, which matter most for interaction, and therefore cannot interact with the virtual environment, becoming mere spectators in the virtual world.
VR handles in the prior art cannot reproduce the fine movements of the hand joints or precisely locate hand motions, and their accuracy is easily degraded by ferromagnetic interference from the surrounding environment.
Utility model content
The purpose of the utility model is to provide a gesture-recognition VR three-dimensional experience system that solves the problems raised in the background art above.
To achieve the above object, the utility model provides the following technical scheme:
A gesture-recognition VR three-dimensional experience system comprises a ray receiver unit, a gesture initial-state unit, and a gesture positioning unit, wherein the gesture positioning unit comprises a gesture-information acquisition and analysis unit and a gesture recognition and application unit. The ray receiver unit comprises a ray receiver. The gesture-information acquisition and analysis unit comprises a structured-light projector, an optical time-of-flight sensor, and multi-angle imaging cameras. The structured-light projector is a laser projector fitted externally with a grating engraved with a specific pattern. The optical time-of-flight sensor comprises a light-emitting element and a CMOS image sensor.
As a further scheme of the utility model: the gesture-information acquisition and analysis unit collects, through the ray receiver, the gesture motions that the user makes with the VR equipment and transmits them to a computer for data analysis.
As a further scheme of the utility model: the gesture-information acquisition and analysis unit further comprises a pattern-acquisition camera for detecting the pattern that the structured-light projector casts onto the object surface.
As a further scheme of the utility model: the gesture recognition and application unit transfers the data analyzed by the computer into the virtual-scene program, carries out gesture recognition and verification on the transferred data, and then interacts within the scene.
As a further scheme of the utility model: there are at least two multi-angle imaging cameras.
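The recognize, verify, then apply flow described by the recognition and application unit can be sketched as follows. This is a hypothetical illustration: the feature fields, gesture labels, and confidence threshold are invented for demonstration and are not specified in the patent.

```python
# Hypothetical sketch of the gesture recognition and application unit:
# analyzed data arrives, a gesture is recognized, the recognition is
# verified, and only then is it handed to the virtual-scene program.
# Feature fields, labels, and the threshold are illustrative assumptions.

def recognize(features: dict) -> tuple[str, float]:
    """Map analyzed hand data to a (gesture label, confidence) pair."""
    if features.get("hands") == 2 and features.get("closing"):
        return ("shrink_scene", 0.9)
    if features.get("pointing") == "left":
        return ("move_left", 0.8)
    return ("none", 0.0)

def verify_and_apply(features: dict, threshold: float = 0.7) -> str:
    """Verify the recognition before the scene program acts on it."""
    gesture, confidence = recognize(features)
    if confidence >= threshold and gesture != "none":
        return f"scene applies: {gesture}"  # hand off to the virtual scene
    return "rejected: no interaction"

print(verify_and_apply({"pointing": "left"}))  # -> scene applies: move_left
```

The verification step acts as a gate, so that low-confidence detections never reach the scene, matching the patent's "recognition and verification, then interaction" ordering.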
Compared with prior art, the beneficial effects of the utility model are:
The utility model passes the posture of a gesture motion to the virtual environment accurately and in real time, and feeds back to the operator the contact between the virtual hand and virtual objects, so that operator and virtual environment interact in a more natural and more immersive way. In summary, the utility model directly obtains the three-dimensional position of the hand in space and the motion of the fingers, recognizes many kinds of gestures in real time, is simple to use and comfortable to operate, covers a wide working range, yields high-quality data, and is applicable to many fields.
Brief description of the drawings
Fig. 1 is a flow chart of the gesture-recognition VR three-dimensional experience system.
Embodiment
The technical schemes in the embodiments of the utility model are described clearly and completely below with reference to the accompanying drawing. Obviously, the described embodiments are only some, not all, of the embodiments of the utility model. All other embodiments obtained by those of ordinary skill in the art from the embodiments of the utility model without creative work fall within the scope of protection of the utility model.
Referring to Fig. 1, in an embodiment of the utility model, a gesture-recognition VR three-dimensional experience system comprises a ray receiver unit, a gesture initial-state unit, and a gesture positioning unit, wherein the gesture positioning unit comprises a gesture-information acquisition and analysis unit and a gesture recognition and application unit. The ray receiver unit comprises a ray receiver for receiving signals from the VR head-mounted display and gesture equipment the user wears. The gesture-information acquisition and analysis unit comprises a structured-light projector, an optical time-of-flight sensor, and multi-angle imaging cameras. The structured-light projector is a laser projector fitted externally with a grating engraved with a specific pattern; the grating refracts the laser during projection, displacing the final landing point of the laser on the object surface. When the object is close to the laser projector, the refraction-induced displacement is small; when the object is farther away, the displacement grows correspondingly. The gesture-information acquisition and analysis unit further comprises a pattern-acquisition camera for detecting the pattern the structured-light projector casts onto the object surface; by detecting the displacement of the pattern, an algorithm computes the position and depth of the object and thereby reconstructs the whole three-dimensional space. The optical time-of-flight sensor comprises a light-emitting element and a CMOS image sensor; the CMOS image sensor captures photons emitted by the light-emitting element and reflected off the object surface, yielding the photon time of flight, from which the flight distance, and hence the depth of the object, can be derived. There are at least two multi-angle imaging cameras, which capture images simultaneously; by comparing the differences between the images the cameras capture at the same instant, an algorithm computes depth information, enabling multi-angle three-dimensional imaging. The gesture initial-state unit serves chiefly to distinguish gesture motions; the initial state can be customized to the user's needs, making use more comfortable. The gesture-information acquisition and analysis unit collects, through the ray receiver, the gesture motions the user makes with the VR equipment and transmits them to a computer for data analysis. The gesture recognition and application unit transfers the analyzed data into the virtual-scene program, carries out gesture recognition and verification on the transferred data, and then interacts within the scene.
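The depth-recovery principles described above, structured-light pattern displacement, optical time of flight, and multi-camera disparity, all reduce to simple geometry. A minimal illustrative sketch in Python; the function names and all numeric parameters (focal length, baseline, flight time) are invented for demonstration and are not taken from the patent.

```python
# Illustrative geometry behind the depth-sensing methods the
# specification describes. All numeric parameters are arbitrary examples.

C = 299_792_458.0  # speed of light, m/s

def tof_depth(flight_time_s: float) -> float:
    """Time of flight: light travels to the object and back,
    so depth is half the round-trip distance."""
    return C * flight_time_s / 2.0

def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Two cameras a known baseline apart: depth follows from the
    disparity between the positions of the object in the two images."""
    return focal_px * baseline_m / disparity_px

def structured_light_depth(focal_px: float, baseline_m: float, shift_px: float) -> float:
    """Structured light is triangulation as well: the projected pattern's
    observed shift plays the role of stereo disparity. How the shift
    varies with distance depends on the projector-camera geometry."""
    return focal_px * baseline_m / shift_px

# A photon returning after ~13.34 ns corresponds to roughly 2 m depth:
print(round(tof_depth(13.34e-9), 2))  # -> 2.0
```

With a 600 px focal length and a 0.1 m baseline, a 30 px disparity likewise yields a 2 m depth, which is the kind of computation the patent attributes to its algorithms.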
The gesture-recognition VR three-dimensional experience system of the utility model works as follows:
Step 1: Start the equipment, including the computer running Unity3D; receive data from the virtual-reality hardware (hereinafter, the VR equipment) and pass it to the computer; connect the VR equipment once it communicates normally; and open the display system of the virtual scene. Bring the VR equipment into the virtual scene while confirming that data transfer between the VR equipment and the computer is normal, completing the preparatory work.
Step 2: The user moves the equipment to a suitable position to carry out simulated actions; a relatively open space is usually needed to keep the wireless WiFi network communication unobstructed. Start the VR equipment, open its operation settings on the computer, calibrate the VR, and have the user operate the VR equipment to correct posture and position, completing the initialization of the VR equipment.
Step 3: Run Steam and Unity3D on the computer and complete their initialization, then import the virtual scene into Unity3D. The user operates the VR equipment and observes through the display screen; the VR equipment is associated with the virtual human model in the virtual environment so that the model can be operated.
Step 4: Control through VR gestures, which divide into one-handed and two-handed operations. One-handed operations: point left and the avatar moves left; point right and it moves right; point forward and it moves forward; point backward and it moves backward; rotate one hand and the avatar rotates by the same angle as the hand; point ahead with one finger to call up a dynamic panel for scene settings; and so on, realizing one-handed control. Two-handed operations: bring both hands together to shrink the scene; open both hands to enlarge it; hold both hands parallel to jump between scenes.
Step 5: After the gesture information is analyzed and recognized, it is applied to the scene, realizing VR-gesture control of the virtual scene.
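The one-handed and two-handed mappings in the steps above amount to a lookup from a recognized gesture to a scene action. A minimal sketch, assuming string gesture labels; the label names are invented here, since the patent describes the actions only in prose.

```python
# Sketch of the gesture-to-action mapping described in the steps above.
# Gesture label strings are illustrative; the patent names only the actions.

GESTURE_ACTIONS = {
    # one-handed
    "point_left":             "move avatar left",
    "point_right":            "move avatar right",
    "point_forward":          "move avatar forward",
    "point_back":             "move avatar backward",
    "rotate_hand":            "rotate avatar by the same angle",
    "point_ahead_one_finger": "open the dynamic settings panel",
    # two-handed
    "hands_close":            "shrink the scene",
    "hands_open":             "enlarge the scene",
    "hands_parallel":         "jump between scenes",
}

def apply_gesture(label: str) -> str:
    """Look up a recognized gesture and return the scene action to perform."""
    return GESTURE_ACTIONS.get(label, "no-op")

print(apply_gesture("hands_open"))  # -> enlarge the scene
```

A flat table like this keeps the recognition stage (producing labels) decoupled from the scene-manipulation stage (consuming them), which mirrors the patent's separation of the acquisition/analysis unit from the recognition/application unit.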
It is obvious to those skilled in the art that the utility model is not limited to the details of the exemplary embodiments above and can be realized in other specific forms without departing from its spirit or essential attributes. The embodiments should therefore be regarded in every respect as exemplary and non-restrictive; the scope of the utility model is defined by the appended claims rather than by the description above, and all changes falling within the meaning and range of equivalency of the claims are intended to be embraced by the utility model. No reference sign in a claim should be construed as limiting the claim concerned.
Moreover, although the specification is described in terms of embodiments, not every embodiment contains only one independent technical scheme; this manner of narration is adopted only for clarity. Those skilled in the art should take the specification as a whole; the technical schemes in the various embodiments may be suitably combined to form other embodiments that those skilled in the art can understand.

Claims (5)

1. A gesture-recognition VR three-dimensional experience system, comprising a ray receiver unit, a gesture initial-state unit, and a gesture positioning unit, wherein the gesture positioning unit comprises a gesture-information acquisition and analysis unit and a gesture recognition and application unit, characterized in that the ray receiver unit comprises a ray receiver; the gesture-information acquisition and analysis unit comprises a structured-light projector, an optical time-of-flight sensor, and multi-angle imaging cameras; the structured-light projector is a laser projector fitted externally with a grating engraved with a specific pattern; and the optical time-of-flight sensor comprises a light-emitting element and a CMOS image sensor.
2. The gesture-recognition VR three-dimensional experience system according to claim 1, characterized in that the gesture-information acquisition and analysis unit collects, through the ray receiver, the gesture motions the user makes with the VR equipment and transmits them to a computer for data analysis.
3. The gesture-recognition VR three-dimensional experience system according to claim 1, characterized in that the gesture-information acquisition and analysis unit further comprises a pattern-acquisition camera for detecting the pattern the structured-light projector casts onto the object surface.
4. The gesture-recognition VR three-dimensional experience system according to claim 1, characterized in that the gesture recognition and application unit transfers the data analyzed by the computer into the virtual-scene program, carries out gesture recognition and verification on the transferred data, and interacts within the scene after recognition and verification.
5. The gesture-recognition VR three-dimensional experience system according to claim 1, characterized in that there are at least two multi-angle imaging cameras.
CN201721329642.1U 2017-10-17 2017-10-17 A kind of VR three-dimensional experiencing systems of gesture identification Active CN207198800U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201721329642.1U CN207198800U (en) 2017-10-17 2017-10-17 A kind of VR three-dimensional experiencing systems of gesture identification

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201721329642.1U CN207198800U (en) 2017-10-17 2017-10-17 A kind of VR three-dimensional experiencing systems of gesture identification

Publications (1)

Publication Number Publication Date
CN207198800U true CN207198800U (en) 2018-04-06

Family

ID=61788034

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201721329642.1U Active CN207198800U (en) 2017-10-17 2017-10-17 A kind of VR three-dimensional experiencing systems of gesture identification

Country Status (1)

Country Link
CN (1) CN207198800U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109917921A (*) 2019-03-28 2019-06-21 长春光华学院 A mid-air gesture recognition method for the VR field


Similar Documents

Publication Publication Date Title
US10739861B2 (en) Long distance interaction with artificial reality objects using a near eye display interface
US10712901B2 (en) Gesture-based content sharing in artificial reality environments
CN109754471B (en) Image processing method and device in augmented reality, storage medium and electronic equipment
CN104102343B (en) Interactive input system and method
US10068547B2 (en) Augmented reality surface painting
CN103793060B (en) A kind of user interactive system and method
CN102959616B (en) Interactive reality augmentation for natural interaction
CN103443742B (en) For staring the system and method with gesture interface
CN104423578B (en) Interactive input system and method
TW202004421A (en) Eye tracking with prediction and late update to GPU for fast foveated rendering in an HMD environment
CN110647239A (en) Gesture-based projection and manipulation of virtual content in an artificial reality environment
CN106133649B (en) It is tracked using the eye gaze that binocular gaze constrains
US20150245010A1 (en) Apparatus and system for interfacing with computers and other electronic devices through gestures by using depth sensing and methods of use
CN106055090A (en) Virtual reality and augmented reality control with mobile devices
US20140354602A1 (en) Interactive input system and method
US10652525B2 (en) Quad view display system
JP2017531221A (en) Countering stumbling when immersed in a virtual reality environment
KR20170031733A (en) Technologies for adjusting a perspective of a captured image for display
CN110018736A (en) The object via near-eye display interface in artificial reality enhances
CA2981208A1 (en) Method and system for implementing a multi-user virtual environment
Valkov et al. Touching Floating Objects in Projection-based Virtual Reality Environments.
WO2015027574A1 (en) 3d glasses, 3d display system, and 3d display method
Greenwald et al. Eye gaze tracking with google cardboard using purkinje images
CN207198800U (en) A kind of VR three-dimensional experiencing systems of gesture identification
US20220256137A1 (en) Position calculation system

Legal Events

Date Code Title Description
GR01 Patent grant