CN114281193A - Virtual reality interaction device based on gesture recognition under circular screen scene - Google Patents


Info

Publication number
CN114281193A
CN114281193A
Authority
CN
China
Prior art keywords
hand
gesture
user
circular screen
pose information
Prior art date
Legal status
Pending
Application number
CN202111597804.0A
Other languages
Chinese (zh)
Inventor
王毅刚
张明威
尹学松
李仕
Current Assignee
Hangzhou Dianzi University
Original Assignee
Hangzhou Dianzi University
Priority date
Filing date
Publication date
Application filed by Hangzhou Dianzi University filed Critical Hangzhou Dianzi University
Priority application: CN202111597804.0A
Publication: CN114281193A
Legal status: Pending


Abstract

The invention discloses a virtual reality interaction device based on gesture recognition for circular-screen scenes. The device lets users interact with a virtual scene through gestures alone, supports roaming, and can be operated by multiple people. It tracks each user's position with a positioning device, tracks hand pose information with a hand tracking module, and recognizes the user's gesture information, so that the gestures a user makes are reflected in the virtual world in real time and finally displayed on the circular screen. The device comprises a positioning device, a hand tracking module, a circular screen, 3D glasses, a graphics workstation, and a communication module. The invention realizes gesture-only virtual reality interaction and roaming with high recognition accuracy and lightweight hardware, and supports multi-person cooperative operation.

Description

Virtual reality interaction device based on gesture recognition under circular screen scene
Technical Field
The invention relates to the fields of virtual reality and human-computer interaction, and in particular to a virtual reality interaction device based on gesture recognition in a circular screen scene. The invention is applied to circular-screen projection display systems and realizes a virtual reality interaction function that relies only on gestures, offers high recognition accuracy and lightweight equipment, and supports multi-person cooperative operation.
Background
Virtual Reality (VR), also called a virtual environment, uses computer simulation to generate a three-dimensional visual virtual world in which the user can perceive and explore as if physically present. That is, a virtual reality device generates images corresponding to the user's position and viewing angle, so that the user retains a sense of presence while moving.
At present, commonly used VR display devices include tethered head-mounted displays, all-in-one headsets, phone-box viewers, and circular-screen projection display systems. The circular-screen projection display system is widely adopted because it can be viewed by multiple people at the same time, and is common in application scenes such as exhibitions and virtual simulation.
The circular-screen projection display system, also called a circular-screen projection system, provides an immersive virtual simulation environment within a virtual three-dimensional projection display system; it adopts a circular projection screen as the display carrier of the simulation application.
A common virtual reality interaction method for circular-screen projection display systems uses a positioning device to collect the user's position and a handheld controller to collect user commands. The controller offers high recognition accuracy, but it is a foreign object between the user and the virtual world: the user remains aware that the world is virtual, so this interaction mode cannot satisfy the immersion that a virtual reality experience demands.
Interaction by gesture recognition lets the user issue commands to the virtual reality scene through gestures alone, preserving the immersion of the experience. In practical applications, however, many problems arise: on one hand, gesture design has no clear standard; on the other, gestures are often misrecognized because of self-occlusion or insufficient recognition accuracy.
In addition, to recognize gestures, common virtual reality devices adopt one of two methods: (i) configuring infrared emitters in the scene; or (ii) processing captured images with computer vision. In either case, data acquisition and processing must be completed, which makes the configuration of the whole device complicated and expensive. Neither method achieves sufficient recognition accuracy: infrared emitters are limited by recognition distance in a circular-screen projection environment, and computer vision methods are limited by their algorithms and cannot recognize gestures accurately.
In the circular-screen projection environment, the user's position is obtained by the positioning device, so that the user's walking can be fed back to the virtual world, achieving a roaming effect. In practice, however, environment configuration and hardware conditions mean that when the user gets too close to the circular screen the view becomes badly blurred and the user must step back; roaming in the virtual world is therefore limited. Roaming via controller commands can solve this problem, but, as noted above, it diminishes the user's immersion during the virtual reality experience.
In a traditional application scenario, only one user in the circular-screen projection environment interacts with the virtual world, sending commands to the processing program through a controller. Other users can watch the virtual world through 3D glasses but cannot interact with it. In the real world, every participant in an environment can interact with it, so this limitation reduces the immersion of the virtual reality and restricts the application environment. Supporting interaction for everyone would require a controller or other command device for each user, making the configuration of the whole system expensive and complicated.
Therefore, how to solve the above-mentioned deficiencies of the prior art is the problem faced by the present invention.
Disclosure of Invention
The invention provides a virtual reality interaction device based on gesture recognition in a circular screen scene, applicable to circular-screen scenes and built around a hand tracking module.
The technical scheme adopted by the invention for solving the technical problems is as follows:
the device is suitable for circular-screen scenes; it lets the user interact with the virtual scene through gestures, realizes roaming, and can be operated by multiple people; it tracks the user's position with the positioning device, tracks hand pose information with the hand tracking module, and recognizes the user's gesture information, so that the gestures the user makes are reflected in the virtual world in real time and finally displayed on the circular screen; the device comprises a positioning device, a hand tracking module, a circular screen, 3D glasses, a graphics workstation, and a communication module;
the hand tracking module and the locator of the positioning device are mounted on the 3D glasses worn by the user, and the positioning base station of the positioning device is placed in the central area of the circular screen; the positioning device acquires the user's position; the hand tracking module acquires the pose information of the user's hands and sends it to the graphics workstation through the communication module; the graphics workstation unifies the user's local coordinate system with the world coordinate system of the virtual world displayed on the circular screen; the graphics workstation processes the hand pose information and renders a simulated reconstruction on the circular screen; if the processed information satisfies the pose requirements of a command gesture, the corresponding command issued by the graphics workstation is fed back in the virtual world according to the meaning the gesture represents, and the scene is re-rendered and displayed;
furthermore, the hand tracking module is assembled with the 3D glasses, and the infrared emitter in the hand tracking module is used for acquiring the spatial position information in the local coordinate system of the module;
the hand tracking module can recognize the pose information of multiple hands; the pose information of a hand comprises the spatial positions of the joints, palm, and back of the hand, and the graphics workstation samples this information at millisecond-level frequency; the hand tracking module can track the pose information of multiple hands simultaneously and distinguishes them by unique IDs;
the unique ID is assigned in the graphics workstation: when the graphics workstation receives the hand pose information sent by the hand tracking module, it adds a unique ID to each hand according to the spatial position relationship of the hands.
Further, coordinate unification is implemented as follows: the local coordinate system containing the pose information of the user's hands is synchronized to the world coordinate system of the virtual world through the spatial coordinate mapping from the locator of the positioning device to the positioning base station.
Further, the gesture information comprises static gestures and dynamic gestures; when the graphics workstation recognizes gestures, dynamic gestures take priority over static gestures.
Furthermore, the interaction and the roaming in the virtual scene are realized by recognizing gesture information of the user.
Furthermore, the multi-user collaborative interaction mode is realized by identifying the pose information of multiple hands through the hand tracking module, so that multiple users can carry out collaborative interaction on the same virtual world.
Furthermore, the multi-user collaborative interaction mode can be realized by adding a set of hand tracking module and a positioning device locator for each user, so that multiple users can collaboratively interact with the same virtual world.
Further, the multi-person cooperative interaction is realized as follows: the graphic workstation can be unified through a coordinate system for the hand pose information sent by the hand tracking modules; the pose information of all the hands is unified to a world coordinate system of a virtual world displayed on a circular screen, and a unique ID is added to each hand according to the spatial position relation, so that each hand in the real world can be uniquely expressed in different hand tracking modules.
Furthermore, tracking hand pose information through multiple hand tracking modules avoids recognition errors caused by self-occlusion. The principle is as follows: the graphics workstation receives the hand pose information sent by all hand tracking modules; after each hand is given a unique ID, the pose information obtained for that hand by the different modules is compared by ID, the pose reported by the majority of modules is taken as the hand's final pose, and the other results are treated as artifacts of self-occlusion and discarded.
Further, when multiple hand tracking modules are used to avoid self-occlusion and their analyses of the same hand yield several pose results with equal support, one of those poses is used at random as the pose of the hand, and it is not judged as representing any command gesture.
The invention has the following beneficial effects:
In some specific scenes, such as simulation environment displays or virtual venue roaming, a viewpoint is often designated for the user to improve the virtual reality experience. When the invention is used in this way and hardware must be configured for multiple users to realize multi-person cooperative interaction, no additional positioning device locator needs to be configured per user. Moreover, under this application condition the technique does not produce the blurred view and degraded experience caused by standing too close to the circular screen.
In terms of hardware configuration, the interaction method requires only ordinary 3D glasses to display the virtual reality scene. The whole device is lightweight, so the user feels no obvious discomfort even after prolonged use.
In use, the device requires only a simple environment configuration; compared with traditional gesture-recognition-based virtual reality interaction in a circular-screen environment, the user's burden is greatly reduced.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the invention and together with the description serve to explain the principles of the invention. In the drawings:
Fig. 1 is a hardware diagram of the device configured for a user, including 3D glasses, a positioning device locator, and a hand tracking module.
Fig. 2 is a bottom schematic view of the apparatus shown in fig. 1.
FIG. 3 is a schematic view of the positioning base station of the positioning device.
Fig. 4 is a schematic diagram of a single person using the interactive device designed by the invention.
Fig. 5 is a schematic diagram of a multi-user cooperative use of the interactive device designed by the present invention.
FIG. 6 is a schematic diagram of a static gesture in a set of command gestures for interaction according to the present invention.
FIG. 7 is a schematic diagram of a portion of a dynamic gesture in a set of command gestures for interaction according to the present invention.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present invention more apparent, the invention is described in further detail below with reference to the examples and the accompanying drawings, covering its specific functional modules, modes of operation, and implementation methods. In this embodiment, the positioning device is a Nolo Cv1 Pro positioning interaction device and the hand tracking module is an Ultraleap Stereo IR 170. It should be emphasized that the following examples and descriptions, while indicating a preferred embodiment, are intended to illustrate but not to limit the scope of the invention. In practice, different positioning interaction devices and hand tracking modules may be selected according to actual conditions; as long as the hardware fulfills the required functions, it can be configured using the method designed by the invention.
The working process of the virtual reality interaction device designed by the invention is as follows: the positioning device acquires the user's position; the hand tracking module acquires the pose information of the user's hands and sends it to the graphics workstation through the communication module; the graphics workstation unifies the user's local coordinate system with the world coordinate system of the virtual world displayed on the circular screen; the graphics workstation processes the hand pose information and renders a simulated reconstruction on the circular screen; and if the processed information satisfies the pose requirements of a command gesture, the corresponding command issued by the graphics workstation is fed back in the virtual world according to the meaning the gesture represents, and the scene is re-rendered and displayed.
Fig. 1 shows the hardware configured for the user: 3D glasses, the locator of the positioning device, and the hand tracking module. The device connects the locator 100, the hand tracking module 101, and the 3D glasses 102 with a bracket. The whole assembly is lightweight, and the user feels no discomfort even after wearing it for a long time.
The hand tracking module selected for this embodiment is the Ultraleap Stereo IR 170, an optical hand-tracking module designed to be integrated into enterprise-grade hardware solutions, displays, installations, and virtual/augmented reality headsets. The IR 170 uses the same core software as its predecessor, the Leap Motion Controller; both can identify 27 distinct hand elements, including bones and joints, and can track them even when they are occluded by other parts of the hand. The IR 170 offers a wider field of view, longer tracking range, lower power consumption, and a smaller form factor: it can track hands over a 3D interaction zone from 10 centimeters (4 inches) to 75 centimeters (29.5 inches) or more, with a wide field of view of 170°×170° (160°×160° minimum).
When the device is used, the positioning base station 103 acquires the spatial position of the device through the locator 100. The hand tracking module 101 emits infrared light through its infrared emitter toward the area in front of and below the user's face, acquiring spatial depth data of the hands; this data is extracted per finger joint to obtain a set of data, the pose information of the hand, expressed as spatial coordinates in the hand tracking module's local coordinate system. From this data a computer program can reconstruct a simulated image of the hand. Compared with traditional methods, configuring the hand tracking module in this way greatly improves gesture recognition accuracy, and only a simple coordinate-system transform is needed to unify the local coordinate system with the world coordinate system of the virtual world.
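The coordinate unification mentioned above amounts to chaining two rigid transforms: from the tracking module's local frame to the locator (a fixed mount offset on the glasses), and from the locator to the world frame established by the positioning base station. The sketch below is a generic rigid-transform chain under that assumption; the function and parameter names are illustrative, not part of any vendor API.

```python
import numpy as np

def tracker_to_world(points_local,
                     R_tracker_to_locator, t_tracker_to_locator,
                     R_locator_to_world, t_locator_to_world):
    """Map hand joint positions (N, 3) from the hand tracking module's local
    coordinate system into the world coordinate system of the virtual world,
    applying tracker->locator then locator->world rigid transforms."""
    p = np.asarray(points_local, dtype=float)
    p = p @ R_tracker_to_locator.T + t_tracker_to_locator  # into locator frame
    p = p @ R_locator_to_world.T + t_locator_to_world      # into world frame
    return p

# Example: locator frame coincides with the tracker frame (identity mount),
# and the base station reports the locator 1 m along world x.
joints = np.array([[0.1, 0.2, 0.3]])
world = tracker_to_world(joints,
                         np.eye(3), np.zeros(3),
                         np.eye(3), np.array([1.0, 0.0, 0.0]))
```

Because each frame of joint data passes through the same pair of transforms, the per-frame cost is two matrix multiplications, well within the millisecond-level sampling budget described above.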
Fig. 2 shows the structure of fig. 1 from the bottom.
Fig. 3 is a schematic diagram of the positioning base station of the positioning device. The positioning device selected for the invention is the Nolo Cv1 Pro positioning interaction device, which comprises a base station, a locator, and a handle; during use, however, interaction with the virtual world is carried out entirely by gestures.
Fig. 4 is a schematic diagram of a single user operating the interaction device designed by the invention. In concrete use, only a simple environment configuration is needed; compared with traditional gesture-recognition-based virtual reality interaction in a circular-screen environment, the user's burden is greatly reduced. The environment is configured as follows: (i) the user wears the 3D glasses, hand tracking module, and positioning device locator, which are extremely lightweight compared with conventional VR head-mounted displays; (ii) the positioning base station of the positioning device is placed in the central area of the circular screen.
Fig. 5 is a schematic diagram of multiple users cooperatively operating the interaction device designed by the invention. With this interaction technique, multi-person cooperative interaction can be realized without additional hardware: the hand tracking module can recognize the data of multiple hands, and the data of different hands are distinguished by each hand's unique ID.
In the case shown in fig. 5, a set consisting of a hand tracking module and a positioning device locator is added for each user to realize multi-person cooperative interaction. Although this requires extra hardware, it has the advantage of avoiding recognition errors caused by self-occlusion. The principle is as follows: the graphics workstation receives the hand pose information sent by all hand tracking modules; after each hand is given a unique ID, the pose information obtained for that hand by the different modules is compared by ID, the pose reported by the majority of modules is taken as the hand's final pose, and the other results are discarded as artifacts of self-occlusion. When several pose results for the same hand receive equal top support, one of them is chosen at random as the hand's pose, and it is not judged as representing any command gesture.
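The majority-vote rule just described can be sketched as follows. This is an illustrative simplification in which each module's report for one hand ID has already been reduced to a discrete pose label; the patent does not specify the data representation.

```python
from collections import Counter
import random

def fuse_hand_pose(candidates):
    """Fuse per-module pose reports for one hand ID.

    candidates: list of hashable pose labels, one per tracking module.
    Returns (pose, unambiguous). The pose reported by the majority wins and
    minority reports are dropped as self-occlusion artifacts. On a tie among
    the top-voted poses, one is kept at random and `unambiguous` is False,
    signalling that no command gesture should be triggered.
    """
    votes = Counter(candidates)
    ranked = votes.most_common()
    best_count = ranked[0][1]
    tied = [pose for pose, n in ranked if n == best_count]
    if len(tied) == 1:
        return tied[0], True           # clear majority: may trigger a command
    return random.choice(tied), False  # tie: keep a pose, suppress commands
```

For example, if two modules report "fist" and one occluded module reports "palm_open", the fused result is "fist" and the minority report is discarded.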
It should be noted that the Nolo Cv1 Pro positioning interaction device used in this embodiment comprises a base station, a locator, and a handle, and its base station can obtain the spatial position coordinates of only one locator, so multi-person cooperative interaction cannot be achieved by assigning a locator to every user. However, the base station can also acquire the spatial position coordinates of the handle, so in a concrete implementation the handle can be given to a user and the base station can position users by acquiring the coordinates of both the locator and the handle; this approach supports multi-person cooperative interaction for at most three people. Note that during this implementation the handle serves only for positioning; the interaction between the user and the virtual world still relies on gesture recognition. Moreover, the hardware selected in this embodiment is not mandatory, so in a concrete implementation users may select different hardware devices according to their own requirements to implement the technique designed by the invention.
In some specific scenarios, such as simulation environment displays or virtual venue roaming, an observation point or observation route is often designated for the user to improve the virtual reality experience. When the invention is used in this way and hardware must be configured for multiple users to realize multi-person cooperative interaction, no additional positioning device locator needs to be configured per user. Moreover, under this application condition the technique does not produce the blurred view and degraded experience caused by standing too close to the circular screen.
FIG. 6 is a schematic diagram of the static gestures in the set of command gestures designed for interaction. The invention designs a series of gestures from the positions, normal vectors, and velocities of the palm and finger joints, comprising static gestures and dynamic gestures; when judging which gesture the hand's spatial pose data acquired by the hand tracking module belongs to, dynamic gestures take priority over static gestures, i.e. once the data matches a dynamic gesture it is not also treated as a static gesture. The designed gestures are highly distinguishable and easy to recognize, and users can assign different function commands to them as needed. As shown in FIG. 6, the first row is a palm-open gesture; the second row contains an index-finger "L" gesture, an "I love you" gesture, and a fist gesture; the third row contains a thumbs-up ("like") gesture, an index-finger tap gesture, and an index-and-middle-finger tap gesture; the fourth row contains a "V" gesture, an "OK" gesture, and an index-and-middle-finger "L" gesture.
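A minimal sketch of static gesture classification is shown below. It reduces the joint data to which fingers are extended and looks the pattern up in a table; the gesture names and finger patterns are illustrative guesses at the FIG. 6 set, not the patent's exact decision rules (which also use palm normals and joint positions).

```python
def classify_static_gesture(finger_extended):
    """Classify a static gesture from a 5-tuple of booleans
    (thumb, index, middle, ring, pinky), each True if that finger is
    extended, as derived from the tracked joint positions."""
    table = {
        (True,  True,  True,  True,  True):  "palm_open",
        (False, False, False, False, False): "fist",
        (True,  True,  False, False, False): "index_L",
        (False, True,  True,  False, False): "V",
        (True,  False, False, False, False): "like",        # thumbs-up
        (True,  True,  False, False, True):  "i_love_you",
    }
    return table.get(tuple(bool(f) for f in finger_extended), "unknown")
```

Keeping the mapping in a table makes it easy for a user to reassign function commands to gestures, as the description above allows.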
FIG. 7 is a schematic diagram of part of the dynamic gestures in the set of command gestures designed for interaction. As shown in fig. 7, the left side is an index-finger click gesture and the right side an index-finger slide gesture; the remaining dynamic gestures include translation of the hand, rotation of the hand, circular motion of the hand, and the like, and users can assign different function commands to them as needed. The dynamic gestures are detected by computing the positional relationship and movement speed of the fingers and palm: if the total movement exceeds a user-defined threshold, the pose is deemed to have changed and a dynamic command gesture is generated; otherwise a static gesture is recognized.
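The movement-threshold test that gates dynamic versus static recognition can be sketched as follows. The window length, use of the palm trajectory alone, and the threshold value are assumptions for illustration; the invention also considers finger motion and rotation.

```python
import numpy as np

def is_dynamic_gesture(palm_positions, dt, speed_threshold=0.25):
    """Return True if a short window of palm positions (T, 3), sampled every
    `dt` seconds, moves fast enough to count as a dynamic gesture. When this
    returns True, static classification is skipped, implementing the rule
    that dynamic gestures take priority over static ones."""
    p = np.asarray(palm_positions, dtype=float)
    total_dist = np.linalg.norm(np.diff(p, axis=0), axis=1).sum()
    mean_speed = total_dist / (dt * (len(p) - 1))   # metres per second
    return mean_speed > speed_threshold             # user-defined threshold
```

A stationary hand yields near-zero speed and falls through to the static classifier, while a sweeping hand exceeds the threshold and is handled as a dynamic command gesture.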

Claims (10)

1. The virtual reality interaction device based on gesture recognition under the circular screen scene is suitable for the circular screen scene; the invention enables the user to interact with the virtual scene by utilizing the gesture, realizes roaming and can be operated by multiple persons; the device can track the position of a user through the positioning equipment, track the pose information of a hand through the hand tracking module, and identify the gesture information of the user, so that the gesture made by the user can be fed back in a virtual world in real time correspondingly, and finally displayed on a circular screen; the device comprises positioning equipment, a hand tracking module, a circular screen, 3D glasses, a graphic workstation and a communication module;
the hand tracking module and the positioner of the positioning equipment are arranged on the 3D glasses and worn by a user, and the positioning base station of the positioning equipment is placed in the central area of the circular screen; acquiring the position of a user by using positioning equipment; acquiring pose information of a hand of a user by using a hand tracking module, and sending the pose information to a graphic workstation through a communication module; the graphic workstation unifies the local coordinate system of the user with the world coordinate system of the virtual world displayed by the circular screen; processing the pose information of the hand by the graphic workstation, and performing simulation restoration on the processed information on a circular screen; and if the processed information meets the requirements of a certain command gesture on the pose, feeding back the corresponding command in the virtual world according to the command sent by the graphic workstation according to the meaning represented by the gesture, and re-rendering and displaying.
2. The virtual reality interaction device based on gesture recognition under the circular screen scene as claimed in claim 1, wherein the hand tracking module is assembled with the 3D glasses, and the infrared emitter in the hand tracking module is used to obtain the spatial position information in the local coordinate system of the module;
the hand tracking module can identify the pose information of a plurality of hands, the pose information of the hands comprises the spatial position information of joints of left and right hands, palms and backs of hands, and the graphic workstation acquires the spatial position information according to the frequency of millisecond level; the hand tracking module can simultaneously track the pose information of multiple hands and distinguish the poses with unique IDs;
the unique ID is added in the graphic workstation; and when the graphic workstation receives the hand pose information sent by the hand tracking module, adding a unique ID to each hand according to the spatial position relation of the hands.
3. The virtual reality interaction device based on gesture recognition under the circular screen scene according to claim 1 or 2, characterized in that the coordinates are uniformly realized as follows: and the local coordinate system where the pose information of the hand of the user is positioned is synchronized to the world coordinate system of the virtual world through the space coordinate mapping relation from the positioner of the positioning equipment to the positioning base station.
4. The virtual reality interaction device based on gesture recognition in the circular screen scene according to claim 1 or 2, wherein the gesture information includes a static gesture and a dynamic gesture; in the process of recognizing the gestures by the graphic workstation, the recognition sequence is that the priority of the dynamic gestures is higher than that of the static gestures.
5. The virtual reality interaction device based on gesture recognition in the circular screen scene as claimed in claim 4, wherein the interaction and roaming in the virtual scene are performed by recognizing gesture information of the user.
6. The virtual reality interaction device based on gesture recognition in the circular screen scene as claimed in claim 1, wherein the multi-person collaborative interaction mode is realized by recognizing pose information of multiple hands through a hand tracking module, so that multiple persons can collaboratively interact with the same virtual world.
7. The virtual reality interaction device based on gesture recognition in the circular screen scene as claimed in claim 1, wherein the multi-user collaborative interaction mode can be implemented by adding a set of hand tracking module and positioning device locator to each user, so that multiple users can collaboratively interact with the same virtual world.
8. The virtual reality interaction device based on gesture recognition in the circular screen scene according to claim 1, wherein the multi-person collaborative interaction is realized as follows: the graphic workstation unifies, through the coordinate system, the hand pose information sent by the plurality of hand tracking modules; the pose information of all hands is unified into the world coordinate system of the virtual world displayed on the circular screen, and a unique ID is assigned to each hand according to the spatial position relationship, so that each hand in the real world is uniquely identified across the different hand tracking modules.
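The ID assignment in claim 8 can be sketched as distance-based merging: once all detections are in the shared world frame, detections of the same physical hand from different modules land close together and can be given the same ID. The 10 cm merge radius below is an assumed value for illustration; the patent does not specify one.

```python
import numpy as np

MERGE_RADIUS = 0.10  # metres; assumed threshold, not from the patent

def assign_ids(detections):
    """Assign an ID to each world-space hand position; nearby detections share an ID."""
    ids, centers = [], []              # centers[i] is the reference point for ID i
    for pos in detections:
        pos = np.asarray(pos, dtype=float)
        for hand_id, center in enumerate(centers):
            if np.linalg.norm(pos - center) < MERGE_RADIUS:
                ids.append(hand_id)    # close to a known hand: reuse its ID
                break
        else:
            centers.append(pos)        # far from every known hand: new ID
            ids.append(len(centers) - 1)
    return ids
```

A production system would also track IDs over time rather than re-clustering each frame, but the spatial-proximity rule is the core of what the claim describes.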
9. The virtual reality interaction device based on gesture recognition in the circular screen scene according to claim 8, wherein the pose information of the hands is tracked by a plurality of hand tracking modules, so that recognition errors caused by self-occlusion during gesture recognition can be avoided; the principle is that the graphic workstation receives the hand pose information sent by all the hand tracking modules and, after assigning a unique ID to each hand, analyzes by ID the pose information of each hand obtained from the different hand tracking modules; the pose information reported by the majority of the hand tracking modules is taken as the final pose information of that hand, and the remaining results are regarded as estimates corrupted by self-occlusion and discarded.
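The majority rule in claim 9 is a straightforward vote across modules. In this illustrative sketch, each module's pose estimate for one hand ID is simplified to a hashable label; the label names are assumptions.

```python
from collections import Counter

def fuse_pose(estimates):
    """Return the pose reported by the most hand tracking modules.

    Minority estimates are treated as self-occlusion artifacts and discarded.
    """
    counts = Counter(estimates)
    pose, _votes = counts.most_common(1)[0]
    return pose
```

For example, if two of three modules see a fist and the third (whose view of the fingers is blocked by the back of the hand) reports an open palm, the fist wins and the outlier is dropped.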
10. The virtual reality interaction device based on gesture recognition in the circular screen scene according to claim 9, wherein self-occlusion is avoided by using a plurality of hand tracking modules; when several candidate pose estimates for the same hand receive the same highest score across the plurality of hand tracking modules, one of the highest-scoring estimates is selected at random as the pose information of that hand, and it is not determined whether that pose information represents a command gesture.
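The tie-break in claim 10 extends the majority vote: when the vote is unambiguous the winning pose may trigger a command, but when several poses tie at the top, one is picked at random for display only and no command is issued. This sketch assumes simplified hashable pose labels; the two-value return convention is an illustrative choice, not the patent's.

```python
import random
from collections import Counter

def fuse_pose_with_tiebreak(estimates, rng=random):
    """Return (pose, is_command_candidate) fused from the modules' estimates.

    A tied vote yields a randomly chosen top pose that is shown but never
    interpreted as a command gesture for this frame.
    """
    counts = Counter(estimates)
    best = max(counts.values())
    winners = [pose for pose, votes in counts.items() if votes == best]
    if len(winners) == 1:
        return winners[0], True           # clear majority: may trigger a command
    return rng.choice(winners), False     # tie: display only, no command
```

Suppressing commands on ties is a conservative design: a spurious random pose can flicker on screen harmlessly, but it should never fire an irreversible interaction.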
CN202111597804.0A 2021-12-24 2021-12-24 Virtual reality interaction device based on gesture recognition under circular screen scene Pending CN114281193A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111597804.0A CN114281193A (en) 2021-12-24 2021-12-24 Virtual reality interaction device based on gesture recognition under circular screen scene

Publications (1)

Publication Number Publication Date
CN114281193A true CN114281193A (en) 2022-04-05

Family

ID=80874820

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111597804.0A Pending CN114281193A (en) 2021-12-24 2021-12-24 Virtual reality interaction device based on gesture recognition under circular screen scene

Country Status (1)

Country Link
CN (1) CN114281193A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101231752A (en) * 2008-01-31 2008-07-30 北京航空航天大学 True three-dimensional panoramic display and interactive apparatus without calibration
CN102654955A (en) * 2011-08-15 2012-09-05 上海华博信息服务有限公司 Gesture-recognition-based interactive sand table system and application thereof
CN206639510U (en) * 2017-03-08 2017-11-14 天津梅迪亚科技有限公司 Sand table system based on VR interactions
CN110444066A (en) * 2019-07-15 2019-11-12 贵州电网有限责任公司 The insulation of electrical installation interacted based on holographic teacher and ring curtain tests training system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination