US11843764B2 - Virtual reality headsets and method of managing user experience with virtual reality headsets - Google Patents
Virtual reality headsets and method of managing user experience with virtual reality headsets
- Publication number
- US11843764B2 (application US 17/507,012; US202117507012A)
- Authority
- US
- United States
- Prior art keywords
- user
- headset
- server
- negative effect
- data
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/366—Image reproducers using viewer tracking
- H04N13/383—Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N20/00—Machine learning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
Definitions
- Virtual Reality is a technology in which various devices are used to stimulate a user's senses to simulate a desired setting and perhaps course of events.
- the user wears a headset that displays images and produces the sounds of the virtual environment for the user's eyes and ears.
- the user is then able to interact with the virtual environment using additional system components that detect the user's actions, which are reflected in the output of the VR system.
- Virtual Reality is becoming an important technology in a variety of fields and applications. For example, VR is used for training simulations, gaming and remote consultation and collaboration. As VR technology becomes more effective and useful, users naturally spend longer amounts of time using a VR system, e.g., wearing a VR headset.
- FIG. 2 is a flowchart showing some illustrative responses a system may take to manage user experience, consistent with the disclosed implementations.
- FIG. 3 is a flowchart showing some additional illustrative actions the system may take to manage user experience, consistent with the disclosed implementations.
- FIG. 5 is a computer readable storage medium containing instructions to reduce VR health impacts, consistent with the disclosed implementations.
- FIG. 6 is an illustrative system, including a VR headset, for reducing VR side effects, consistent with the disclosed implementations.
- FIG. 7 is a flowchart illustrating a method of reducing VR health impacts, consistent with the disclosed implementations.
- VR Virtual Reality
- various devices are used to stimulate a user's senses to simulate a desired setting and perhaps a course of events.
- the user wears a headset with a display device mounted over the user's eyes and speakers for the user's ears through which images and sounds of the virtual environment are provided for the user.
- the user is then able to interact with the virtual environment using additional system components that detect the user's actions. For example, movement of the user's head will change the images being displayed as though the user were looking around within the simulated reality.
- the user may also interact with objects that are displayed in the virtual environment. Both the actions of the user and the results of those actions may be represented in the output, e.g., images and sound, of the VR system.
- As VR technology becomes more effective and useful, users naturally spend longer amounts of time using a VR system, e.g., wearing a VR headset. This extended usage of the VR system may cause negative effects for the user, such as fatigue, eye strain, headaches and others.
- a non-transitory computer-readable medium comprising instructions that, when executed, cause a server to: receive, via a network interface, data from a number of sensors that are detecting parameters associated with a user during use of a VR headset; operate an artificial intelligence unit to analyze the data from the sensors, the data from the number of sensors detecting parameters associated with the user during use of the VR headset including detection of a condition or action of the user while using the VR headset, the data being applied to machine learning of the artificial intelligence unit to predict a negative effect on the user from the use of the VR headset; and instruct the VR headset to take specific action based on a prediction made by the artificial intelligence unit to minimize a predicted negative effect on the user.
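The server-side flow described above — receive sensor data, predict a negative effect, instruct the headset — can be sketched as follows. This is an illustrative sketch, not the patent's actual implementation; the function names, the 120-minute threshold, and the action labels are all assumptions.

```python
# Illustrative sketch of the server flow: receive sensor data, run a
# (stand-in) predictor, and return a mitigation instruction for the headset.

def predict_negative_effect(sensor_data):
    """Stand-in for the AI unit: flag likely eye strain after a long session.
    The 120-minute threshold is an assumed value, not from the patent."""
    if sensor_data.get("session_minutes", 0) > 120:
        return "eye_strain"
    return None

def choose_instruction(effect):
    """Map a predicted effect to a mitigation instruction (labels assumed)."""
    actions = {
        "eye_strain": "reduce_blue_light",
        "headache": "lower_volume",
        "anxiety": "recommend_calmer_program",
    }
    return actions.get(effect, "no_action")

def handle_sensor_message(sensor_data):
    """One server step: predict, then pick the instruction to send back."""
    return choose_instruction(predict_negative_effect(sensor_data))

print(handle_sensor_message({"session_minutes": 150}))  # reduce_blue_light
```

In a deployed system the predictor would be the trained model described later in this document rather than a fixed threshold.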
- the sensor may include an electro-magnetic field sensor to detect an amount of electro-magnetic field exposure experienced by the user. Any sensor detecting a condition of the user, the operation of the VR system or the user's environment may be included to provide information that may be useful in quantifying or predicting aspects of the user's experience with the VR system.
- the method includes, with the server, analyzing the data from the sensors 104 to predict a negative effect on the user from the use of the VR headset. For example, exceeding a particular amount of time using the VR system may indicate that the user is likely to experience a headache, disorientation when discontinuing the VR environment to return to reality, eye strain, concern over the amount of time spent and other negative effects.
- detecting an amount of radiation output to the user's eyes over time by the VR system may indicate likely eye strain, sense of fatigue or other negative effects.
- detecting unusual vital signs in the user may indicate a strongly negative emotional response to something being portrayed in the virtual environment. Any potentially negative effect that can be predicted may be associated with corresponding sensors, the output of which is analyzed to ascertain or predict the negative effect being considered.
- the method includes, with the server, taking action to minimize the negative effect on the user that is predicted. For example, if an excessive amount of VR system usage has elapsed, the user may be prompted to discontinue using the system or the system may be automatically deactivated. In other examples, parameters of the system, such as color tone, audio volume or radiation intensity may be adjusted to mitigate potentially negative effects on the user.
- FIG. 2 is a flowchart showing some illustrative responses 200 a system may take to manage user experience, consistent with the disclosed implementations. Each block in FIG. 2 describes a different action that might be taken to mitigate a predicted negative effect on the user from using the VR system.
- the server may instruct the VR headset to shift the display in the VR headset to emit less blue light 208 .
- the prediction of eye strain may be based on the amount or color of radiation that has been output to the user by the VR headset, the amount of time the user has been operating the VR headset, the activity of the user's eyes, ambient conditions such as humidity and other factors. Displays that emit less light in the blue part of the visible spectrum are known to cause less strain to human eyes.
- the system may respond to a prediction of eye strain by the server instructing the VR headset to decrease the brightness of the headset's display ( 210 ).
- a less bright display can also mitigate eye strain.
- the system may predict that the user is experiencing excessive anxiety, perhaps due to the content being displayed in the VR headset. This condition may be predicted, for example, based on user heart activity, user temperature, user perspiration and/or user breathing patterns and other similar parameters.
- the server may instruct the VR headset to recommend a different program for viewing by the user ( 212 ).
- the system may predict that the user is experiencing or will experience a headache. This prediction may be based, for example, on the elapsed time the user has been operating the VR headset, user eye activity and other user conditions.
- the server of the system may decrease a volume level of the audio in the VR headset.
- the server may instruct the VR headset to provide a prompt to the user, visual or audio, recommending a break in headset usage 214 .
- the server may deactivate the VR headset automatically and may disable the VR headset for at least a minimum amount of time 214 .
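The headache responses above can escalate from gentler to stronger interventions. A minimal sketch, assuming illustrative thresholds (the patent does not specify the elapsed-time values used):

```python
def headache_response(elapsed_minutes, prompt_threshold=90, hard_limit=150,
                      minimum_off_minutes=15):
    """Escalating response to a predicted headache. Thresholds are assumed
    values for illustration: first lower the volume, then prompt a break,
    finally deactivate and disable the headset for a minimum off time."""
    if elapsed_minutes >= hard_limit:
        return ("deactivate_headset", minimum_off_minutes)
    if elapsed_minutes >= prompt_threshold:
        return ("prompt_break", None)
    return ("decrease_volume", None)

print(headache_response(160))  # ('deactivate_headset', 15)
```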
- FIG. 3 is a flowchart showing some additional illustrative actions the system may take to manage user experience, consistent with the disclosed implementations. As shown in FIG. 3 , the system may also react to the time of day in which the VR headset is used to mitigate potential negative effects on the user.
- the VR headset may be used during nighttime hours, defined as after sundown and before sunup. This period of the day is typically characterized by lower ambient temperatures. Accordingly, in response to the VR headset being used during nighttime hours, the server may instruct the VR headset to adjust a screen temperature of the display device in the VR headset ( 316 ). The screen temperature may be made warmer, with fewer blues and more reds. As with other actions described herein, this may mitigate negative effects on the user from operating the VR headset.
- At least some of the parameters of the responses taken by the system when predicting a negative effect may be set in advance by the user.
- the user may know that, when experiencing eye strain or headache, a particular response by the system is most helpful.
- the system may accept user input that specifies that, in the event of a prediction of eye strain, the user prefers the system to react by reducing blue light emissions, by reducing display brightness or a combination of both.
- the user input may specify that, in the event of VR headset usage exceeding a set amount of time or a prediction of headache or eye strain, the system is to automatically deactivate the VR headset and, in some examples, deactivate the VR headset for a minimum amount of time.
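The user-configurable preferences described above could be represented as a simple settings structure. The field names below are assumptions chosen for illustration, not identifiers from the patent:

```python
# Assumed preference schema: the user picks in advance how the system
# should react to each predicted effect.
DEFAULT_PREFERENCES = {
    "eye_strain": {"reduce_blue_light": True, "reduce_brightness": False},
    "time_limit_minutes": 120,
    "auto_deactivate": True,
    "minimum_off_minutes": 15,
}

def apply_user_input(preferences, updates):
    """Merge user choices over the defaults, ignoring unknown keys."""
    merged = dict(preferences)
    for key, value in updates.items():
        if key in merged:
            merged[key] = value
    return merged

prefs = apply_user_input(DEFAULT_PREFERENCES, {"auto_deactivate": False})
```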
- the method of managing user experience may include surveying the user to better understand and quantify the user experience.
- the method may include expressly surveying the user after use of the VR headset to determine an extent to which any actions taken by the server were perceived to actually benefit user experience 318 .
- This survey may be administered through the VR headset.
- the VR headset may display survey questions and record user input in response using the VR system components that track user action and create electronic user input.
- the survey could be administered to the user on another device, such as a computer or smartphone that the user has access to after the VR session.
- FIG. 4 is a diagram of a VR headset, consistent with the disclosed implementations.
- the VR headset 400 includes a display 420 for displaying a virtual reality program to a user.
- This display 420 may be a single display device or may be two separate display devices, one for each eye of the user.
- the VR headset also includes an eye sensor 430 for sensing parameters of an eye of the user during use of the headset.
- This sensor 430 may be a single sensor or multiple sensors.
- the eye sensor 430 may include a camera for capturing eye movement, pupil dilation, blink rate, widening of the eyes and other parameters of the eye or eyes of the user.
- the VR headset also includes a wireless transceiver 440 and a processor 450 .
- the wireless transceiver 440 can communicate with other peripheral devices, including user input devices and sensors trained on the user.
- the wireless transceiver 440 can also communicate with the system server described above to provide sensor data and receive instructions for the VR headset from the server, as described herein.
- the processor 450 is programmed to use the wireless transceiver to communicate with a number of peripheral devices 452 that provide sensor data indicative of user activity or condition.
- the processor 450 is further programmed to use the wireless transceiver to transmit sensor data 454 from the peripheral devices and eye sensor to a server for predicting negative effects of using the headset on the user; and receive instructions 456 from the server to take an action to mitigate a negative effect on the user from use of the headset.
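The headset-side role of the processor — gather readings, forward them, apply the returned instruction — can be sketched as below. The class, the transport callable, and the message format are illustrative assumptions; a real headset would use the wireless transceiver and adjust the display, audio, or power state.

```python
class HeadsetController:
    """Sketch of the headset processor's loop (names are assumptions)."""

    def __init__(self, send_to_server, peripherals):
        self.send_to_server = send_to_server  # callable: readings -> instruction
        self.peripherals = peripherals        # dict of name -> reader callable

    def collect(self):
        """Poll the eye sensor and wireless peripherals for readings."""
        return {name: read() for name, read in self.peripherals.items()}

    def execute(self, instruction):
        """A real headset would adjust display, audio, or power here."""
        return f"applied:{instruction}"

    def step(self):
        """One cycle: collect, transmit to server, apply the reply."""
        return self.execute(self.send_to_server(self.collect()))

# Usage with a fake server that reacts to a low blink rate:
ctrl = HeadsetController(
    lambda d: "reduce_blue_light" if d["blink_rate"] < 10 else "no_action",
    {"blink_rate": lambda: 5},
)
print(ctrl.step())  # applied:reduce_blue_light
```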
- FIG. 5 is a non-transitory computer readable storage medium containing instructions for a server of the system being described, consistent with the disclosed implementations.
- the computer-readable medium comprising instructions that, when executed, cause a server to: receive 562 , via a network interface, data from a number of sensors that are detecting parameters associated with a user during use of a VR headset.
- the instructions also cause the server to operate an artificial intelligence unit 564 to analyze the data from the sensors, the data from the number of sensors detecting parameters associated with the user during use of the VR headset including detection of a condition or action of the user while using the VR headset, the data being applied to machine learning of the artificial intelligence unit to predict a negative effect on the user from the use of the VR headset.
- the medium includes instructions causing the server to instruct 566 the VR headset to take specific action based on a prediction made by the artificial intelligence unit to minimize a predicted negative effect on the user.
- the actions may include any actions within control of the server or VR headset/system that may mitigate a predicted negative effect on the user from use of the VR system.
- FIG. 6 is an illustrative system, including a VR headset, for reducing VR side effects, consistent with the disclosed implementations.
- the VR headset 400 communicates with a server 600 . This communication may be over the Internet or some other data network.
- the VR headset transmits sensor data 601 to the server 600 and receives from the server instructions 602 including instructions to mitigate potentially negative effects on the user.
- the recommended best usage practices 680 for that user can be stored in the cloud data storage 672 to guide which mitigation instructions are issued to the VR headset 400 and when.
- the cloud data storage 672 may also store other user data, such as data related to games and gameplay for different individual users.
- Suggesting content according to heart rate: suggest comedy content if the heart rate exceeds a threshold or limit.
- the system may connect wirelessly to a smart band which can report the user's pulse rate. For example, the user is watching a war scene, and the system notes that the user's pulse rate has increased over the established limit. The system may then prompt the user to switch to comedic or other light content to help the user normalize.
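The heart-rate rule described above reduces to a simple check. A minimal sketch, where the 100 bpm default limit and the suggestion text are illustrative assumptions:

```python
def suggest_content(pulse_bpm, limit_bpm=100):
    """If the wearable reports a pulse above the user's established limit
    (default is an assumed value), prompt lighter content; else do nothing."""
    if pulse_bpm > limit_bpm:
        return "Suggestion: switch to comedy or other light content"
    return None

print(suggest_content(120))  # Suggestion: switch to comedy or other light content
```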
- FIG. 7 is a flowchart illustrating a method of reducing VR health impacts, consistent with the disclosed implementations.
- FIG. 7 illustrates the initial training of the artificial intelligence unit 610 to develop the prediction models described in FIG. 6 .
- a database 772 of Virtual Reality system usage may be used to train the AI unit 610 .
- This database may include data from a large number of VR sessions in which parameters of the use and conditions of the user were tracked.
- the database 772 may also include user survey data describing or quantifying negative effects that the users reported experiencing during the VR sessions. The larger this database 772 , the better for training the AI unit 610 .
- the AI unit 610 may use the data from the database 772 to generate a number of scenarios in which a corrective action is taken.
- the AI unit 610 may recognize idleness 776 - 1 of the user or the VR unit for a threshold period of time, e.g., 15 minutes.
- the AI unit 610 will issue an instruction 778 - 1 to deactivate the VR headset.
- the AI unit 610 may recognize that the usage of the VR unit has exceeded some threshold, e.g., a time limit. In response to this over usage 776 - 2 , the AI unit 610 may issue an instruction to notify the user to discontinue use 778 - 2 , e.g. for a minimum period of time such as 15 minutes. This prompt to the user may be made through the VR headset. Alternatively, as noted above, the AI unit 610 could issue an instruction automatically deactivating the VR system.
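The idleness and over-usage rules above can be sketched as a single rule check. The 15-minute idle threshold comes from the text; the 120-minute usage limit is an assumed example, since the text says only "some threshold, e.g., a time limit":

```python
def usage_instruction(idle_minutes, session_minutes,
                      idle_threshold=15, usage_limit=120):
    """Rule sketch for FIG. 7: deactivate on idleness (778-1), notify the
    user to discontinue use on over-usage (778-2), else continue."""
    if idle_minutes >= idle_threshold:
        return "deactivate_headset"       # idleness 776-1
    if session_minutes >= usage_limit:
        return "notify_discontinue_use"   # over-usage 776-2
    return "continue"

print(usage_instruction(0, 130))  # notify_discontinue_use
```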
- the AI unit 610 may recognize conditions indicative of, or likely to cause, eyestrain 776 - 3 .
- the AI unit 610 may shift the display to avoid blue light or may decrease the brightness of the display 778 - 3 .
- the AI unit 610 may recognize that the use is occurring at night 776 - 4 and shift the color to warmer colors or reduce the blue in the colors displayed to the user to adjust the screen temperature 778 - 4 .
- the AI unit 610 may recognize conditions indicative of, or likely to cause, headache 776 - 5 , and respond with corresponding actions 778 - 5 .
- the headache mitigation may include reducing the brightness of the images in the VR headset, reducing the amount of blue light in the displayed images, or pausing use of the VR headset for a period of time.
- FIG. 8 A is an illustrative user interface for conducting a survey 800 , consistent with the disclosed implementations.
- the survey 800 provides questions on health impacts from the VR system and solicits feedback from the user about which effects the user experienced. For example, the survey 800 may inquire about eye strain, headache, nausea, and disorientation.
- the survey 800 may provide for a ranking of the severity of the health impacts. In the example survey 800 shown in FIG. 8 A , the rankings are none, mild, moderate, and severe. Other rankings may be used including number rankings, e.g. 1-5.
- the feedback from the survey provides information on the symptoms the user experienced and the effectiveness of mitigation strategies provided by the AI unit 610 .
- FIG. 8 B is a survey 800 , consistent with the disclosed implementations.
- the system includes a survey 800 of the user's experience with the content displayed by the system.
- the system may provide a survey 800 to determine a user's experience. This information may be provided to developers or others to improve the content.
- the comments of the survey may be parsed using natural language analysis to extract health impacts from VR usage. For example, a comment which includes the term headache or anxiety may prompt a symptom survey from the system.
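The comment-parsing step above can be sketched with simple keyword matching. A production system would use a real natural-language pipeline; the symptom term list below is an assumption:

```python
# Assumed symptom vocabulary for illustration.
SYMPTOM_TERMS = {"headache", "anxiety", "eye strain", "nausea", "disorientation"}

def symptoms_mentioned(comment):
    """Return the symptom terms found in a free-text survey comment."""
    text = comment.lower()
    return sorted(term for term in SYMPTOM_TERMS if term in text)

def needs_symptom_survey(comment):
    """True when a comment should trigger the follow-up symptom survey."""
    return bool(symptoms_mentioned(comment))

print(symptoms_mentioned("Got a mild headache and some anxiety"))
```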
- the VR headset 400 may also receive input from various VR controllers or other devices 986 such as a joystick, trackball, wand or the like. These peripheral devices may also be described as part of the Internet of Things (IoT). Both these peripheral devices 986 and the sensors 987 may communicate with the VR headset 400 wirelessly, for example, via Bluetooth®.
- FIG. 10 is a flowchart for generating a time series regression model, consistent with the disclosed implementations.
- data is collected in the VR usage database 772 .
- the data may be cleansed 1090 by removing outliers, imputing missing data, and dividing the data into frames.
- Data cleansing may include determining outliers and removing outliers from the data set.
- Data cleansing may include normalizing or smoothing the data.
- Data cleansing may include interpolating missing data points and providing interpolated data points into the subsequent analysis.
- the data cleansing may be automated, semi-automated, or curated by an expert.
- the data cleansing may be reduced as the size of the dataset grows.
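The cleansing steps above — outlier removal and interpolation of missing points — can be sketched with the standard library. The 2-standard-deviation cutoff is an illustrative assumption, and the gap filling here uses nearest known neighbours (sequentially, so earlier fills feed later ones) rather than a particular imputation method from the patent:

```python
import statistics

def drop_outliers(values, k=2.0):
    """Replace points more than k population standard deviations from the
    mean with None (to be imputed). None inputs pass through unchanged."""
    present = [v for v in values if v is not None]
    mean = statistics.fmean(present)
    stdev = statistics.pstdev(present)
    return [v if v is None or stdev == 0 or abs(v - mean) <= k * stdev else None
            for v in values]

def interpolate(values):
    """Fill each None with the mean of its nearest known neighbours."""
    result = list(values)
    for i, v in enumerate(result):
        if v is None:
            prev = next((result[j] for j in range(i - 1, -1, -1)
                         if result[j] is not None), None)
            nxt = next((result[j] for j in range(i + 1, len(result))
                        if result[j] is not None), None)
            if prev is not None and nxt is not None:
                result[i] = (prev + nxt) / 2
            else:
                result[i] = prev if prev is not None else nxt
    return result

print(interpolate([1.0, None, 3.0]))  # [1.0, 2.0, 3.0]
```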
- feature selection by statistical correlation 1092 is then performed. This includes creating a feature list, checking collinearity and validating the data through an elbow graph.
- the data with revised features is then fed to a time series regression model 1094 .
- the model provides differencing of the data and calculates a Disparity Change Index (e.g., with Pearson's correlation or Time Series Regression).
- the model then splits the data into training, validation, and test sets; then performs tuning and model validation, including prediction with a test dataset.
- the dataset is split into a modeling dataset and a test data set. For example, the dataset may be split 80:20 or 70:30 between modeling data points and test data points.
- the modeling data points are used to generate the model and the test data points are used to test and validate the model.
- the model 1094 receives sensor data 1096 as an input, as described above. Using the sensor data 1096 , the model 1094 makes a prediction 1098 as to a negative effect that is or will be experienced by the user. The model adds all new data to the training set and continues to improve the predictions as more user data is added.
- this approach can be outlined as follows.
Landscapes
- Engineering & Computer Science (AREA)
- Signal Processing (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Data Mining & Analysis (AREA)
- Evolutionary Computation (AREA)
- Medical Informatics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Artificial Intelligence (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
-
- 1. VR_User: User_ID, Address, Zip, User_Since
- 2. VR_Usage_History: User_ID, Usage_ID, Date, Duration, Movie Name, Game Name, Story Name, Education, Automotive
- 3. Health_Issue: User_ID, Usage_ID, Headache(Y/N), Eye Pain(Y/N), Nausea(Y/N), General Discomfort(Y/N), Radiation_Exposure(duration)
- 4. User_Experience Use_ID, Usage_ID, Rating (rating range e.g. 1 to 5), Gaming_Control_Exp (rating range e.g. 1 to 5), Emotions_Exp (rating range e.g. 1 to 5)
-
- 1. Collect the data from VR using the eye tracking sensors.
- a. Data related to the eye movement
- b. Eye blinking rate
- c. Pupil dilation
- d. Widening of eyes
- e. Previous state of eye fatigue (EPprev), which is incorporated in the proposed model
- 2. After this, create a dataset using all the collected features.
- 3. The Pearson correlation is used between the disparity change of the current and previous frame.
- 4. Data preprocessing is then performed. This includes, but is not limited to, feature scaling and replacing null, invalid, and missing data.
- 5. Post data preprocessing, the dataset is mostly ready. It may also be helpful to perform a reduction in the dimensions of the dataset and retain only those columns/factors that make significant contributions to the result.
- 6. Divide the dataset into training and test datasets. In some examples, the ratio is 70:30 or 80:20 training to test.
- 7. The training dataset is then supplied to the algorithm to train on. In some examples, Time Series Regression is the model of choice as results may be based on the time frames. However, other models may be used without departing from the scope of the claimed invention.
- 8. Results are calculated for all eye fatigue, anxiety and radiation exposure issues, and a cumulative result is given.
- 9. Some tuning may be required to be done after getting the recommendation results.
- 10. Once the tuning is done, the model is tested on the test dataset.
- 11. Finally, the model is tested with several new and full datasets.
- 12. The model is ready and keeps learning as it grows and accommodates more and more users.
Claims (20)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| IN202041045941 | 2020-10-21 | ||
| IN202041045941 | 2020-10-21 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20220124304A1 US20220124304A1 (en) | 2022-04-21 |
| US11843764B2 true US11843764B2 (en) | 2023-12-12 |
Family
ID=81185247
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US17/507,012 Active 2041-12-22 US11843764B2 (en) | 2020-10-21 | 2021-10-21 | Virtual reality headsets and method of managing user experience with virtual reality headsets |
Country Status (1)
| Country | Link |
|---|---|
| US (1) | US11843764B2 (en) |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CA3265353A1 (en) * | 2022-09-03 | 2024-03-07 | Haazoyi Investments Ltd. | Digital content moderation |
| US12057036B2 (en) | 2022-09-13 | 2024-08-06 | Qualcomm Incorporated | Predicting thermal states in connected devices to provide edge processing |
| CN116203729A (en) * | 2023-01-10 | 2023-06-02 | 北京太一数科技术有限公司 | Realize synchronous head-mounted VR equipment with cell-phone |
Citations (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6012926A (en) | 1996-03-27 | 2000-01-11 | Emory University | Virtual reality system for treating patients with anxiety disorders |
| US20160026253A1 (en) | 2014-03-11 | 2016-01-28 | Magic Leap, Inc. | Methods and systems for creating virtual and augmented reality |
| US20160077547A1 (en) * | 2014-09-11 | 2016-03-17 | Interaxon Inc. | System and method for enhanced training using a virtual reality environment and bio-signal data |
| US20180364801A1 (en) * | 2017-06-19 | 2018-12-20 | Kt Corporation | Providing virtual reality experience service |
| US20190046859A1 (en) * | 2017-08-14 | 2019-02-14 | International Business Machines Corporation | Sport training on augmented/virtual reality devices by measuring hand-eye coordination-based measurements |
| US10268888B2 (en) | 2010-02-28 | 2019-04-23 | Microsoft Technology Licensing, Llc | Method and apparatus for biometric data capture |
| US20190282910A1 (en) * | 2016-10-26 | 2019-09-19 | Hashilus Co., Ltd. | Vr amusement ride device |
| US20190328305A1 (en) * | 2016-01-21 | 2019-10-31 | Carl Zeiss Meditec, Inc. | System and method for testing a condition of the nervous system using virtual reality technology |
| US20190366030A1 (en) | 2014-03-06 | 2019-12-05 | Virtual Reality Medical Applications, Inc. | Virtual Reality Medical Application System |
| KR102113634B1 (en) | 2018-02-07 | 2020-05-21 | 장준석 | Virtual reality head mounted display for showing user's status and user status display method and content control method using the system |
| TW202027042A (en) | 2019-01-02 | 2020-07-16 | 見臻科技股份有限公司 | Method of monitoring eye strain and related optical system |
| US20200320592A1 (en) * | 2018-08-06 | 2020-10-08 | Olive Seed Industries, Llc | Methods and systems for personalizing visitor experience at a non-profit venue using machine learning to generate selection or sequence of non-profit venue location recommendations |
| US11128636B1 (en) * | 2020-05-13 | 2021-09-21 | Science House LLC | Systems, methods, and apparatus for enhanced headsets |
| US20210383912A1 (en) * | 2020-06-03 | 2021-12-09 | At&T Intellectual Property I, L.P. | System for extended reality visual contributions |
| US11451758B1 (en) * | 2020-02-12 | 2022-09-20 | Meta Platforms Technologies, Llc | Systems, methods, and media for colorizing grayscale images |
- 2021-10-21: US application US 17/507,012 filed; patent US11843764B2; status: Active
Non-Patent Citations (2)
| Title |
|---|
| Aimed; Why We Should Embed AI Into Virtual Reality Headsets; Dec. 27, 2018. |
| Anderson Augusto Simiscuka; Real-Virtual World Device Synchronization in a Cloud-enabled Social Virtual Reality IoT Network; Dublin City University; Aug. 5, 2019. |
Also Published As
| Publication number | Publication date |
|---|---|
| US20220124304A1 (en) | 2022-04-21 |
Similar Documents
| Publication | Title |
|---|---|
| US12002180B2 (en) | Immersive ecosystem |
| US11545046B2 (en) | Neuroadaptive intelligent virtual reality learning system and method |
| EP3688897B1 (en) | Digitally representing user engagement with directed content based on biometric sensor data |
| US20220262504A1 (en) | Electronic arrangement for therapeutic interventions utilizing virtual or augmented reality and related method |
| US11843764B2 (en) | Virtual reality headsets and method of managing user experience with virtual reality headsets |
| CN115004308A (en) | Method and system for providing an interface for activity recommendations |
| Huang et al. | Virtual reality safety training using deep EEG-net and physiology data |
| US20220020474A1 (en) | Dynamic Multi-Sensory Simulation System for Effecting Behavior Change |
| US20240164672A1 (en) | Stress detection |
| US11393252B2 (en) | Emotion sensing artificial intelligence |
| KR20190027354A (en) | Method and system for acquiring, analyzing and generating vision performance data and modifying media based on vision performance data |
| US11404156B2 (en) | Methods for managing behavioral treatment therapy and devices thereof |
| US20210401339A1 (en) | Adaptive behavioral training, and training of associated physiological responses, with assessment and diagnostic functionality |
| WO2023283161A1 (en) | Enhanced meditation experience based on bio-feedback |
| CN119181482A (en) | Cognitive health monitoring system for elderly people based on virtual reality |
| KR20210100393A (en) | Counseling environment control system using virtual reality and artificial intelligence, control method thereof, and computer-readable medium for storing a control program for the same |
| US11904179B2 (en) | Virtual reality headset and system for delivering an individualized therapy session |
| US20250069725A1 (en) | System and method for training, tagging, recommending, and generating digital content based on biometric data |
| US12539080B2 (en) | Biofeedback system |
| CN121614838A (en) | Interactive system based on emotion recognition |
| JP2026019118A (en) | System |
| WO2025147488A1 (en) | Virtual reality training system and method |
| JP2026029772A (en) | System |
| JP2026019121A (en) | System |
| JP2026017929A (en) | System |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
| | AS | Assignment | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KULKARNI, ANURADHA;PAL, TULIT;CHINCHOLIKAR, NARENDRA KUMAR;REEL/FRAME:061518/0057; Effective date: 20201020 |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: AWAITING TC RESP., ISSUE FEE NOT PAID |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
| | STPP | Information on status: patent application and granting procedure in general | Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
| | STCF | Information on status: patent grant | Free format text: PATENTED CASE |