CN110850976A - Virtual reality projection and retrieval system based on environment perception - Google Patents
- Publication number
- CN110850976A (application CN201911071668.4A)
- Authority
- CN
- China
- Prior art keywords
- user
- module
- unit
- model
- interaction
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
- G01C21/206—Instruments for performing navigational calculations specially adapted for indoor navigation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04L—TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
- H04L67/00—Network arrangements or protocols for supporting network services or applications
- H04L67/01—Protocols
- H04L67/131—Protocols for games, networked simulations or virtual reality
Abstract
The invention belongs to the technical field of virtual reality, and in particular relates to a virtual reality projection and retrieval system based on environment perception. The system comprises a plurality of user sides and a server side connected through a network; the user sides communicate with one another through the network, and the server side and the user sides are connected to a cloud analysis platform through the network. Each user side comprises a user information unit, a user interaction unit, an environment sensing unit and a voice communication unit; the user information unit records the detailed information of the user and logs in to the server side. With the invention, a user can read books and watch videos within the virtual library model, completing these activities at home without leaving the house. This enriches the functionality of the virtual library and supports interactive operations with it, such as browsing books and fast-forwarding videos, making the whole experience more lifelike and effectively combining virtual reality technology with the library.
Description
Technical Field
The invention relates to the technical field of virtual reality, in particular to a virtual reality projection and retrieval system based on environment perception.
Background
Virtual reality technology is an important branch of simulation technology. It combines simulation technology, computer graphics, man-machine interface technology, multimedia technology, sensing technology, network technology and others, and is a challenging, cross-disciplinary frontier of research. Virtual reality (VR) technology mainly covers the simulated environment, perception, natural skills and sensing equipment. The simulated environment is a real-time, dynamic, three-dimensional realistic image generated by a computer. Perception means that an ideal VR system should provide all the perceptions a person has: in addition to the visual perception generated by computer graphics, it includes hearing, touch, force and motion, and even smell and taste, which together are called multi-perception. Natural skills refer to a person's head rotation, eye movements, gestures and other body actions; the computer processes data matching the participant's actions, responds to the user's input in real time and feeds the results back to the user's senses. The sensing equipment refers to three-dimensional interaction devices.
The development of virtual reality technology can improve people's lives and accelerate scientific and technological progress. By modeling different scenes and adding interactive functions, a user wearing VR glasses can perform interactive operations inside the modeled environment and feel personally present in it. VR technology has already been applied to virtual libraries, but existing virtual libraries offer poor interactivity: they provide only a simple indoor browsing function, allow the user merely to tour the library environment, and offer no way to look up books or audio and video. Their environment perception is also weak and their flexibility insufficient, so they cannot meet users' actual needs.
Disclosure of Invention
Technical problem to be solved
Aiming at the defects of the prior art, the invention provides a virtual reality projection and retrieval system based on environment perception, which solves the problems identified in the background art.
(II) technical scheme
In order to achieve the purpose, the invention provides the following technical scheme: a virtual reality projection and retrieval system based on environment perception comprises a plurality of user sides and a server side which are connected through a network, wherein the user sides are communicated with each other through the network, and the server side and the user sides are connected with a cloud analysis platform through the network;
the user side comprises a user information unit, a user interaction unit, an environment sensing unit and a voice communication unit;
the user information unit is used for recording the detailed information of the user and for logging in to the server side;
the user interaction unit is used for interacting with the server side;
the environment sensing unit is used for sensing the environment where the user is located and transmitting the sensed and acquired data to the cloud analysis platform;
the voice communication unit is used for mutual communication among users;
the server side comprises a scene model unit, a database resource unit, a route guiding unit and a retrieval unit;
the scene model unit is used for providing a library scene model for a user;
the database resource unit is used for storing video, book and audio resources of a library and carrying out corresponding interactive operation;
the route guidance unit is used for being matched with the environment sensing unit and the cloud analysis platform to plan a route for a user to travel in the scene model unit;
the retrieval unit is used for providing the user with a function for retrieving the resource data in the database resource unit;
the cloud analysis platform is used for analyzing the information transmitted by the environment sensing unit and the information of the scene model unit, making a route and transmitting the route to the route guidance unit.
As a preferred technical scheme of the invention, the user information unit comprises a user registration module, a user login module, a user friend module, a basic setting module and a user database;
the user registration module is used for registering a specific account of a user;
the user login module is used for logging in to the server side as a registered member;
the user friend module is used for adding friends by searching user accounts and then performing voice communication through the voice communication unit;
the basic setting module allows the user to modify basic parameters according to preference, including but not limited to the brightness of the projected picture, the voice call volume and the keyboard type of the virtual keyboard;
the user database is used for storing the user's relevant information, including but not limited to login records, total login duration and browsing records.
As a preferred technical scheme of the invention, the user interaction unit comprises VR glasses, an interaction rod, an interaction instruction selection module and an interaction operation setting module;
the VR glasses are used for displaying data such as the library scene, books, videos and user pages as image information and for performing interactive operations;
the interaction bar is used for finishing interaction operation with the image;
the interactive instruction selection module sets a corresponding interactive instruction, and a user can complete interactive operation by selecting the corresponding instruction through the interactive bar;
the interactive operation setting module is used for modifying the instruction of the interactive instruction selection module.
As a preferred technical scheme of the invention, the environment sensing unit comprises a video acquisition module, an infrared sensing module and a motion sensing detection module;
the video acquisition module is used for acquiring picture information of the environment where the user is located and transmitting the picture information to the cloud analysis platform;
the infrared sensing module is used for detecting the distance between the user and objects in the surrounding environment and transmitting the data to the cloud analysis platform;
the motion sensing detection module is used for detecting the body state change of a user and transmitting data to the cloud analysis platform.
As a preferred technical scheme of the invention, the scene model unit comprises a 3D model modeling module, a model rendering module, a model optimization module, an interactive function module I and a model maintenance module;
the 3D model modeling module is used for modeling the appearance, the internal structure, the internal equipment facilities and the internal layout of the library according to a certain proportion;
the model rendering module is used for rendering the built model to make the model more vivid;
the model optimization module is used for re-optimizing the rendered model;
the interactive function module I is used for adding interaction points to the optimized model to facilitate interactive operations with the user;
the model maintenance module is used for maintaining the library model.
As a preferred technical scheme of the invention, the database resource unit comprises a book resource module, an audio and video module and an interactive function module II;
the book resource module comprises the books, magazines, newspapers and other materials held by the library, which can be operated interactively;
the audio and video module comprises the audio and video held by the library, which can also be operated interactively;
and the interactive function module II is used for endowing the book resource module and the audio and video module with interactive functions.
(III) advantageous effects
Compared with the prior art, the invention provides a virtual reality projection and retrieval system based on environment perception, which has the following beneficial effects:
1. With this virtual reality projection and retrieval system based on environment perception, the user can read books and watch videos within the library model, completing these activities at home without leaving the house. This enriches the functionality of the virtual library and enables interactive operations with it, such as browsing books and fast-forwarding videos, making the whole experience more lifelike and effectively combining virtual reality technology with the library.
2. Through the design of the environment sensing unit, the system can learn the user's environmental information in real time. The cloud analysis platform then performs an integrated analysis of the user's environmental information together with the information of the scene model unit, formulates the best travel route for the user and transmits it to the route guidance unit. This improves the user's safety while moving indoors and gives the user a more intuitive, immersive experience.
3. The system offers a high degree of interaction, supports voice communication with different friends, is rich in functions, and can meet users' actual needs for a virtual library.
Drawings
FIG. 1 is a schematic diagram of the structural system of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Examples
Referring to fig. 1, the present invention provides the following technical solutions: a virtual reality projection and retrieval system based on environment perception comprises a plurality of user sides and a server side which are connected through a network, wherein the user sides are communicated with each other through the network, and the server side and the user sides are connected with a cloud analysis platform through the network.
The network may be wired or wireless; the invention preferably uses a wireless network, through which the user side, the server side and the cloud analysis platform are connected so that the operations and functions of the invention can be realized.
The user side comprises a user information unit, a user interaction unit, an environment sensing unit and a voice communication unit.
The user information unit is used for recording the detailed information of the user and for logging in to the server side.
The user information unit comprises a user registration module, a user login module, a user friend module, a basic setting module and a user database.
The user registration module is used for registering a specific account of a user.
The user's account is designed by the user and consists of English letters and numbers. After the user designs an account, the cloud analysis platform searches its internal database: if no one has registered the account, it can be used; if someone has, the user must design a new account. A password must also be set for the account.
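The uniqueness check described above can be sketched as follows. This is a hedged, minimal illustration: the account format rule (letters and digits only) comes from the text, but the `UserDatabase` class, its in-memory dictionary, and all names are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical sketch of the account-uniqueness check performed by the
# cloud analysis platform against the user database.
import re

ACCOUNT_RE = re.compile(r"^[A-Za-z0-9]+$")  # English letters and numbers only

class UserDatabase:
    def __init__(self):
        self._accounts = {}  # account -> password (illustrative storage)

    def register(self, account: str, password: str) -> bool:
        """Return True if the account was free and is now registered."""
        if not ACCOUNT_RE.match(account):
            return False          # illegal characters: reject
        if account in self._accounts:
            return False          # already registered: user must redesign
        self._accounts[account] = password
        return True

db = UserDatabase()
assert db.register("reader01", "s3cret")      # free account: accepted
assert not db.register("reader01", "other")   # duplicate: rejected
assert not db.register("bad name!", "pw")     # illegal characters: rejected
```

A real system would of course hash the password and query a persistent store rather than a dictionary.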
The user login module is used for logging in to the server side as a registered member.
After entering the correct account and password in the user login module, the user enters the server side and can then perform the corresponding interactive operations.
The user friend module is used for adding friends by searching user accounts and then performing voice communication through the voice communication unit.
After entering the server side, the user can open the user friend module and enter the account of the person to be found; the system then locates the corresponding user, to whom a friend request can be sent. Once the friend is added, the user can talk with them through the voice communication unit and can also browse their basic information.
The basic setting module allows the user to modify basic parameters according to preference, including but not limited to the brightness of the projected picture, the voice call volume and the keyboard type of the virtual keyboard.
The basic settings are configured entirely according to the user's preference and can be changed at will.
The user database is used for storing the user's relevant information, including but not limited to login records, total login duration and browsing records.
The user database holds the user's own detailed data; for example, the user can check the previous browsing session through it and resume browsing. Through the basic setting module the records can be set to a consultable or non-consultable state, protecting the user's privacy.
And the user interaction unit is used for interacting with the server side.
The user interaction unit comprises VR glasses, an interaction rod, an interaction instruction selection module and an interaction operation setting module.
The VR glasses are used for displaying data such as the library scene, books, videos and user pages as image information and for performing interactive operations.
Through the VR glasses the user can view pictures such as the library scene; the interactive instructions can also be projected, making it convenient for the user to operate with the interaction rod.
The interactive bar is used for finishing interactive operation with the image.
The interaction rod is used for completing interactive work. For example, when the user puts on the VR glasses, the user registration interface is displayed as an image, and the user clicks with the interaction rod to complete the registration. When the user reaches the login step, a virtual keyboard appears, and the user clicks the virtual keyboard to enter the login account and password into the server side. By analogy, all interactive operations are completed with the interaction rod.
The interactive instruction selection module sets a corresponding interactive instruction, and the user can complete interactive operation by selecting the corresponding instruction through the interactive bar.
The interactive instruction selection module is configured through the interactive operation setting module. For example, if the user draws a circle with the interaction rod, the user friend module is called up; if the user draws a cross, the user setting module is called up. This improves the authenticity and richness of the interactive operation.
The interactive operation setting module is used for modifying the instruction of the interactive instruction selection module.
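The gesture-to-instruction mapping and its reconfiguration can be sketched as below. The circle and cross bindings follow the example in the text; the module names, the `InstructionSelector` class and the `rebind` method are illustrative assumptions standing in for the interactive operation setting module.

```python
# Hypothetical sketch: mapping interaction-rod gestures to modules, with
# the interactive operation setting module modeled as rebind().
from typing import Optional

class InstructionSelector:
    def __init__(self):
        self._bindings = {
            "circle": "user_friend_module",   # draw a circle -> friends
            "cross": "user_setting_module",   # draw a cross  -> settings
        }

    def dispatch(self, gesture: str) -> Optional[str]:
        """Map a gesture drawn with the interaction rod to a module name."""
        return self._bindings.get(gesture)

    def rebind(self, gesture: str, module: str) -> None:
        """Modify an instruction (interactive operation setting module)."""
        self._bindings[gesture] = module

sel = InstructionSelector()
assert sel.dispatch("circle") == "user_friend_module"
sel.rebind("circle", "basic_setting_module")   # user changes the binding
assert sel.dispatch("circle") == "basic_setting_module"
```

Keeping the bindings in a single table is what lets the setting module change them without touching the dispatch logic.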
The environment sensing unit is used for sensing the environment where the user is located and transmitting the sensed and collected data to the cloud analysis platform.
The environment sensing unit comprises a video acquisition module, an infrared sensing module and a motion sensing detection module.
The video acquisition module is used for acquiring picture information of the environment where the user is located and transmitting the picture information to the cloud analysis platform.
The infrared sensing module is used for detecting the distance between the user and objects in the surrounding environment and transmitting the data to the cloud analysis platform.
The motion sensing detection module is used for detecting changes in the user's body posture and transmitting the data to the cloud analysis platform.
Working together, the video acquisition module, the infrared sensing module and the motion sensing detection module transmit the user's environmental information to the cloud analysis platform. The platform then combines this with the scene model unit to determine the user's safe range of movement, ensuring the user's safety while moving indoors wearing the VR glasses.
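One way the platform might combine the sensor inputs into a safe movable range is sketched below. This is an assumption-laden illustration: the grid-cell representation, the 50 cm clearance threshold and the function name are invented for the sketch, not specified by the patent.

```python
# Hypothetical fusion of infrared distance readings and camera-reported
# clear cells into the set of positions the user may safely enter.
def movable_range(infrared_cm, camera_clear, min_clearance_cm=50):
    """Return grid cells that every sensor agrees are safe.

    infrared_cm:  dict mapping cell -> distance to nearest obstacle (cm)
    camera_clear: set of cells the video module reports as unobstructed
    """
    return {
        cell for cell, dist in infrared_cm.items()
        if dist >= min_clearance_cm and cell in camera_clear
    }

ir = {(0, 0): 120, (0, 1): 30, (1, 0): 90}
cam = {(0, 0), (0, 1)}
# (0,1) is too close to an obstacle; (1,0) is not confirmed by the camera.
assert movable_range(ir, cam) == {(0, 0)}
```

Requiring agreement from both sensors errs on the side of safety, which matches the stated goal of protecting a user who cannot see the real room.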
The voice communication unit is used for mutual communication among users.
The server side comprises a scene model unit, a database resource unit, a route guiding unit and a retrieval unit.
The scene model unit is used for providing the library scene model for the user.
The scene model unit comprises a 3D model modeling module, a model rendering module, a model optimization module, an interactive function module I and a model maintenance module.
The 3D model modeling module is used for modeling the appearance, the internal structure, the internal equipment facilities and the internal layout of the library according to a certain proportion.
The model rendering module is used for rendering the built model to make the model more vivid.
And the model optimization module is used for re-optimizing the rendered model.
The interactive function module I is used for adding interaction points to the optimized model, making it convenient to complete interactive operations with the user.
Interaction points are placed at selected positions inside the library model so that the user can operate them with the interaction rod. For example, a floor-selection interaction point is placed at the library entrance: the user selects a specific floor with the interaction rod, the user's view is switched to that floor, and the next interactive operation then continues.
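The floor-selection interaction point can be sketched as follows. The `UserSession` and `FloorSelector` classes and the floor names are illustrative assumptions; the only behavior taken from the text is that selecting a floor switches the user's view to it.

```python
# Hypothetical sketch of an interaction point at the library entrance
# that teleports the user's view to the chosen floor.
class UserSession:
    def __init__(self):
        self.location = "entrance"  # where the user's view currently is

class FloorSelector:
    def __init__(self, floors):
        self.floors = floors

    def select(self, session: UserSession, floor: str) -> bool:
        """Switch the user's view to the chosen floor if it exists."""
        if floor not in self.floors:
            return False            # unknown floor: view unchanged
        session.location = floor
        return True

session = UserSession()
point = FloorSelector(["floor-1", "floor-2", "floor-3"])
assert point.select(session, "floor-2")
assert session.location == "floor-2"
assert not point.select(session, "floor-9")   # no such floor
```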
The model maintenance module is used for maintaining the library model.
The database resource unit is used for storing video, book and audio resources of the library and carrying out corresponding interactive operation.
The database resource unit comprises a book resource module, an audio and video module and an interactive function module II.
The book resource module comprises the books, magazines, newspapers, periodicals and other materials held by the library, which can be operated interactively.
The audio and video module comprises the audio and video held by the library, which can also be operated interactively.
And the interactive function module II is used for endowing the book resource module and the audio and video module with interactive functions.
These include interactive operations such as page turning and fast-forwarding videos.
The route guiding unit is used for being matched with the environment sensing unit and the cloud analysis platform to plan a route for the user to travel in the scene model unit.
The retrieval unit is used for providing the user with a function for retrieving the resource data in the database resource unit.
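A minimal sketch of such a retrieval function is given below, assuming a simple keyword match over the catalogue of the database resource unit. The record fields and sample titles are invented for illustration; the patent does not specify the matching scheme.

```python
# Hypothetical retrieval over the database resource unit's catalogue:
# a case-insensitive keyword search on resource titles.
def retrieve(catalogue, keyword):
    """Return resources whose title contains the keyword."""
    kw = keyword.lower()
    return [r for r in catalogue if kw in r["title"].lower()]

catalogue = [
    {"title": "Introduction to Virtual Reality", "kind": "book"},
    {"title": "Library History", "kind": "book"},
    {"title": "VR Modeling Tutorial", "kind": "video"},
]
hits = retrieve(catalogue, "vr")
assert [r["title"] for r in hits] == ["VR Modeling Tutorial"]
```

A production retrieval unit would likely use an inverted index or a database full-text search rather than a linear scan.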
The cloud analysis platform is used for analyzing the information transmitted by the environment sensing unit and the information of the scene model unit, making a route and transmitting the route to the route guidance unit.
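The route formulation performed by the cloud analysis platform can be illustrated with a breadth-first search over the cells that the environment sensing unit reported as passable. The grid model, endpoints and BFS choice are assumptions for the sketch; the patent does not name a path-planning algorithm.

```python
# Hypothetical route planning: shortest path over passable grid cells,
# as the cloud analysis platform might compute for the route guidance unit.
from collections import deque

def plan_route(passable, start, goal):
    """Return the shortest list of cells from start to goal, or None."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        x, y = path[-1]
        if (x, y) == goal:
            return path
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if nxt in passable and nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None   # goal unreachable within the safe area

cells = {(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)}
route = plan_route(cells, (0, 0), (2, 2))
assert route == [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
assert plan_route(cells, (0, 0), (5, 5)) is None
```

Because the search is restricted to cells the sensing modules marked safe, the resulting route never leads the user toward an obstacle.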
Finally, it should be noted that: although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made in the embodiments and/or equivalents thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (6)
1. A virtual reality projection and retrieval system based on environmental perception, characterized by: the cloud analysis system comprises a plurality of user sides and a server side which are connected through a network, wherein the user sides are communicated with each other through the network, and the server side and the user sides are connected with a cloud analysis platform through the network;
the user side comprises a user information unit, a user interaction unit, an environment sensing unit and a voice communication unit;
the user information unit is used for recording the detailed information of the user and for logging in to the server side;
the user interaction unit is used for interacting with the server side;
the environment sensing unit is used for sensing the environment where the user is located and transmitting the sensed and acquired data to the cloud analysis platform;
the voice communication unit is used for mutual communication among users;
the server side comprises a scene model unit, a database resource unit, a route guiding unit and a retrieval unit;
the scene model unit is used for providing a library scene model for a user;
the database resource unit is used for storing video, book and audio resources of a library and carrying out corresponding interactive operation;
the route guidance unit is used for cooperating with the environment sensing unit and the cloud analysis platform to plan a route for a user to travel in the scene model unit;
the retrieval unit is used for providing the user with a function for retrieving the resource data in the database resource unit;
the cloud analysis platform is used for analyzing the information transmitted by the environment sensing unit and the information of the scene model unit, making a route and transmitting the route to the route guidance unit.
2. The virtual reality projection and retrieval system based on environmental perception of claim 1, wherein: the user information unit comprises a user registration module, a user login module, a user friend module, a basic setting module and a user database;
the user registration module is used for registering a specific account of a user;
the user login module is used for logging in to the server side as a registered member;
the user friend module is used for adding friends by searching user accounts and then performing voice communication through the voice communication unit;
the basic setting module allows the user to modify basic parameters according to preference, including but not limited to the brightness of the projected picture, the voice call volume and the keyboard type of the virtual keyboard;
the user database is used for storing the user's relevant information, including but not limited to login records, total login duration and browsing records.
3. The virtual reality projection and retrieval system based on environmental perception of claim 1, wherein: the user interaction unit comprises VR glasses, an interaction rod, an interaction instruction selection module and an interaction operation setting module;
the VR glasses are used for displaying data such as the library scene, books, videos and user pages as image information and for performing interactive operations;
the interaction bar is used for completing interactive operations on the displayed images;
the interactive instruction selection module sets corresponding interactive instructions, and the user completes an interactive operation by selecting the corresponding instruction with the interaction bar;
the interactive operation setting module is used for modifying the instruction of the interactive instruction selection module.
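The relationship between the interactive instruction selection module (a gesture-to-command mapping) and the interactive operation setting module (which modifies that mapping) could be sketched as below; the gesture names and commands are hypothetical:

```python
class InstructionSelector:
    """Maps interaction-bar gestures to commands; rebind() plays the
    role of the interactive operation setting module of claim 3."""

    DEFAULTS = {"press": "select", "swipe_left": "back",
                "swipe_right": "forward", "hold": "menu"}

    def __init__(self):
        # copy so per-user rebinding never alters the defaults
        self.bindings = dict(self.DEFAULTS)

    def dispatch(self, gesture):
        # unknown gestures are ignored rather than raising
        return self.bindings.get(gesture, "ignored")

    def rebind(self, gesture, command):
        self.bindings[gesture] = command
```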
4. The virtual reality projection and retrieval system based on environmental perception of claim 1, wherein: the environment sensing unit comprises a video acquisition module, an infrared sensing module and a motion sensing detection module;
the video acquisition module is used for acquiring picture information of the environment where the user is located and transmitting the picture information to the cloud analysis platform;
the infrared sensing module is used for detecting distances to objects around the user's environment and transmitting the data to the cloud analysis platform;
the motion sensing detection module is used for detecting the body state change of a user and transmitting data to the cloud analysis platform.
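The three sensing modules of claim 4 each feed the cloud analysis platform, so one way to picture the data flow is a single bundled frame per upload. The structure and field names below are assumptions for illustration:

```python
from dataclasses import dataclass, asdict

@dataclass
class SensorFrame:
    """One snapshot of the three sensing modules, bundled for
    upload to the cloud analysis platform (hypothetical layout)."""
    frame_jpeg: bytes            # video acquisition module
    obstacle_distances_m: list   # infrared sensing module
    posture: str                 # motion sensing detection module

def to_upload_payload(frame: SensorFrame) -> dict:
    # flatten the frame and precompute the nearest obstacle so the
    # platform can react to collisions quickly
    payload = asdict(frame)
    payload["nearest_obstacle_m"] = min(frame.obstacle_distances_m)
    return payload
```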
5. The virtual reality projection and retrieval system based on environmental perception of claim 1, wherein: the scene model unit comprises a 3D model modeling module, a model rendering module, a model optimization module, an interactive function module I and a model maintenance module;
the 3D model modeling module is used for modeling the library's appearance, internal structure, internal equipment and facilities, and internal layout to scale;
the model rendering module is used for rendering the built model to make the model more vivid;
the model optimization module is used for re-optimizing the rendered model;
the interactive function module I is used for adding interaction points to the optimized model so as to facilitate interactive operations with the user;
the model maintenance module is used for maintaining the library model.
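Claim 5 describes a sequential pipeline (model, then render, then optimize, then add interaction points). A minimal sketch of that ordering follows; the stage names, dictionary layout and interaction points are invented:

```python
def render(model):
    # model rendering module: mark the geometry as rendered
    model["stage"] = "rendered"
    return model

def optimize(model):
    # model optimization module: re-optimize the rendered model
    model["stage"] = "optimized"
    return model

def add_interaction_points(model, points):
    # interactive function module I: attach interaction points
    model["interaction_points"] = list(points)
    return model

def build_scene_model(raw_geometry):
    """Run the claimed pipeline end to end on raw 3D geometry."""
    model = {"geometry": raw_geometry, "stage": "modeled"}
    return add_interaction_points(optimize(render(model)),
                                  ["bookshelf", "reading_desk"])
```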
6. The virtual reality projection and retrieval system based on environmental perception of claim 1, wherein: the database resource unit comprises a book resource module, an audio and video module and an interactive function module II;
the book resource module comprises the books, magazines, newspapers and other materials held by the library, on which interactive operations can be performed;
the audio and video module comprises the audio and video held by the library, on which interactive operations can be performed;
the interactive function module II is used for endowing the book resource module and the audio and video module with interactive functions.
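The retrieval unit of claim 1 searches the resources held by the database resource unit of claim 6. A simple way to sketch that is a case-insensitive title search across all three media types; the `search_resources` function and the catalog entries are hypothetical:

```python
def search_resources(resources, query):
    """Case-insensitive substring match over titles of all
    resource types held by the database resource unit."""
    q = query.lower()
    return [r for r in resources if q in r["title"].lower()]

# Hypothetical catalog mixing the three media types of claim 6.
catalog = [
    {"title": "Introduction to VR", "type": "book"},
    {"title": "Library Tour", "type": "video"},
    {"title": "Morning News Broadcast", "type": "audio"},
]
```

For example, `search_resources(catalog, "library")` would return only the video entry titled "Library Tour".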
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911071668.4A CN110850976A (en) | 2019-11-05 | 2019-11-05 | Virtual reality projection and retrieval system based on environment perception |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201911071668.4A CN110850976A (en) | 2019-11-05 | 2019-11-05 | Virtual reality projection and retrieval system based on environment perception |
Publications (1)
Publication Number | Publication Date |
---|---|
CN110850976A true CN110850976A (en) | 2020-02-28 |
Family
ID=69598676
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201911071668.4A Pending CN110850976A (en) | 2019-11-05 | 2019-11-05 | Virtual reality projection and retrieval system based on environment perception |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN110850976A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111400234A (en) * | 2020-03-12 | 2020-07-10 | 深圳捷径观察科技有限公司 | Multimedia reader based on VR equipment and reading method |
CN113010646A (en) * | 2021-03-31 | 2021-06-22 | 苏州乐智永成智能科技有限公司 | Library's wisdom audiovisual management system |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6489979B1 (en) * | 1998-10-30 | 2002-12-03 | International Business Machines Corporation | Non-computer interface to a database and digital library |
CN103823818A (en) * | 2012-11-19 | 2014-05-28 | 大连鑫奇辉科技有限公司 | Book system on basis of virtual reality |
CN105323129A (en) * | 2015-12-04 | 2016-02-10 | 上海弥山多媒体科技有限公司 | Home virtual reality entertainment system |
CN106648069A (en) * | 2016-11-17 | 2017-05-10 | 安徽华博胜讯信息科技股份有限公司 | Digital library system based on virtual reality technology |
CN107393016A (en) * | 2017-07-28 | 2017-11-24 | 安徽华博胜讯信息科技股份有限公司 | Virtual reality library system based on Virtools |
CN108959595A (en) * | 2018-07-12 | 2018-12-07 | 腾讯科技(深圳)有限公司 | Based on virtual and real Website construction and experiential method and its device |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR101855639B1 (en) | Camera navigation for presentations | |
US11983837B2 (en) | Cheating deterrence in VR education environments | |
CN111880659A (en) | Virtual character control method and device, equipment and computer readable storage medium | |
CN103258338A (en) | Method and system for driving simulated virtual environments with real data | |
CN111897431B (en) | Display method and device, display equipment and computer readable storage medium | |
CN110850976A (en) | Virtual reality projection and retrieval system based on environment perception | |
CN113687720A (en) | Multi-person online virtual reality education system and use method thereof | |
CN112071130A (en) | Knowledge education system and education method based on VR technology | |
CN108614872A (en) | Course content methods of exhibiting and device | |
WO2022131148A1 (en) | Information processing device, information processing method, and information processing program | |
Liu et al. | An interactive spiraltape video summarization | |
US20210366302A1 (en) | Selecting lesson asset information | |
Li et al. | Exploring the implementation of web, mobile, and VR technology for cultural heritage presentation | |
Wang et al. | Design and Implementation of Children's Games Based on Mixed Reality | |
Ariffin et al. | Enhancing tourism experiences via mobile augmented reality by superimposing virtual information on artefacts | |
US11922595B2 (en) | Redacting content in a virtual reality environment | |
CN111522439B (en) | Revision method, device and equipment of virtual prototype and computer storage medium | |
US20230334790A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation | |
US20230334792A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation | |
US20230334791A1 (en) | Interactive reality computing experience using multi-layer projections to create an illusion of depth | |
Prakash et al. | Embodying a narrative: Spatializing abstract narrative themes for forensic exploration | |
Schäfer | Improving Essential Interactions for Immersive Virtual Environments with Novel Hand Gesture Authoring Tools | |
Gonçalves et al. | Tag around: a 3D gesture game for image annotation | |
Gong | Comprehensive Data Collection and Feedback Algorithm for Smart Music Online Guidance based on Immersive Virtual Media Environment | |
WO2023215637A1 (en) | Interactive reality computing experience using optical lenticular multi-perspective simulation |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
RJ01 | Rejection of invention patent application after publication | ||
Application publication date: 20200228 |