KR20170104846A - Method and apparatus for analyzing virtual reality content - Google Patents
- Publication number
- KR20170104846A (application KR1020160027790A)
- Authority
- KR
- South Korea
- Prior art keywords
- information
- virtual reality
- value
- reality content
- fps
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/144—Processing image signals for flicker reduction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
Abstract
Description
The present invention relates to a method and apparatus for analyzing virtual reality content and, more particularly, to a method and apparatus for analyzing virtual reality content that can generate dizziness numerical information about the content.
With the development of user interface technology, input/output devices providing various audiovisual stimuli to users have been developed. For example, input devices that conventionally provided input signals only through a keyboard, mouse, or joystick have been developed to provide input signals in various ways, for example using changes in gravity or acceleration.
In addition, output devices have been developed not only to provide the user with a larger screen or an ultra-high-quality image, but also to provide stereoscopic images and more realistic audiovisual stimulation. For example, head-worn output devices that allow users to experience virtual reality, such as the Oculus Rift, Galaxy VR, and Morpheus, are being developed.
Unlike general audiovisual output devices, these virtual reality terminals are configured such that the user's visual field is completely masked and the user visually perceives only the image displayed on the output screen. Because of this characteristic, there is a problem that the user may experience motion sickness.
Various methods have been studied to reduce the dizziness the user feels in the virtual reality environment. Korean Patent No. 10-1564964 is a related prior art document.
An object of the present invention is to provide a method and apparatus for analyzing virtual reality content that can generate dizziness numerical information about the content.
Another object of the present invention is to provide a method and apparatus for analyzing virtual reality content that provide a developer with dizziness information about the content, so that the developer can predict the potential probability that the content induces dizziness and thereby reduce the time and economic costs required for development.
According to an aspect of the present invention, there is provided a method of analyzing virtual reality content, the method comprising: receiving information about virtual reality content reproduced by a user; and analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result, wherein the information about the virtual reality content includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about the object, and the analysis result includes at least one of a frames-per-second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.
According to another aspect of the present invention, there is provided an apparatus comprising: a storage unit for storing information about virtual reality content reproduced by a user; and a dizziness numerical information generating unit for analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result, wherein the information about the virtual reality content includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about the object, and the analysis result includes at least one of a frames-per-second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.
According to the present invention, dizziness numerical information can be generated by analyzing information about virtual reality content reproduced by a user.
Further, according to the present invention, since the dizziness numerical information about the virtual reality content is provided to the developer of the content and the developer can predict the potential probability that the content induces dizziness, it is possible to provide a method and apparatus for analyzing virtual reality content that can reduce the time and economic costs required for development.
FIG. 1 is a view for explaining a virtual reality content analysis system according to an embodiment of the present invention.
FIG. 2 is a view for explaining a dizziness numerical information generating unit according to an embodiment of the present invention.
FIG. 3 is a view for explaining a dizziness numerical information generating unit according to a specific embodiment of the present invention.
FIG. 4 is a view for explaining information about virtual reality content according to an embodiment of the present invention.
FIG. 5 is a view for explaining an FPS according to an embodiment of the present invention.
FIG. 6 is a diagram showing pseudo code of an FPS analysis method using a macroscopic filter.
FIG. 7 is a diagram for explaining an FPS analysis method using a microscopic filter.
FIG. 8 is a diagram showing pseudo code of an FPS analysis method using a microscopic filter.
FIG. 9 is a diagram illustrating interval segmentation for frequency analysis according to an embodiment of the present invention.
FIG. 10 is a diagram showing the results of FFT analysis.
FIG. 11 is a diagram showing pseudo code of a frequency analysis method according to the present invention.
FIG. 12 is a view for explaining a distance analysis method according to an embodiment of the present invention.
FIG. 13 is a view for explaining a report message according to an embodiment of the present invention.
FIG. 14 is a diagram for explaining a method of analyzing virtual reality content according to an embodiment of the present invention.
While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.
The present invention analyzes virtual reality content and quantifies how dizzy a user becomes when viewing it, providing the result as dizziness numerical information. That is, the present invention analyzes the degree of dizziness that can be caused by the software, that is, the virtual reality content itself, rather than by a hardware device such as a virtual reality terminal. The dizziness numerical information is information expressed by quantifying the degree of dizziness that a user can feel when viewing the virtual reality content.
The dizziness numerical information can be provided to the developer who produces the virtual reality content. Using the dizziness numerical information according to the present invention, the developer can develop virtual reality content that causes the user less dizziness.
Hereinafter, embodiments according to the present invention will be described in detail with reference to the accompanying drawings.
FIG. 1 is a view for explaining a virtual reality content analysis system according to an embodiment of the present invention.
Referring to FIG. 1, a virtual reality content analysis system according to the present invention includes a virtual reality terminal, a virtual reality content analyzing apparatus, and a client terminal.
The virtual reality terminal reproduces virtual reality content for a user and provides information about the reproduced content to the virtual reality content analyzing apparatus.
The information about the virtual reality content provided by the virtual reality terminal includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about the object.
For example, when the virtual reality content is a game, an object in the virtual reality world responds to the user's movement or input, and information about the responding object can be extracted through a data extraction script and provided to the virtual reality content analyzing apparatus.
The virtual reality content analyzing apparatus stores the received information about the virtual reality content and analyzes it to generate dizziness numerical information.
The dizziness numerical information generating unit 123 analyzes the information about the virtual reality content and generates an analysis result that includes at least one of a frames-per-second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.
The dizziness numerical information generating unit 123 may generate dizziness numerical information by comparing the analysis result with a preset threshold value. For example, when the FPS value is smaller than the threshold value, the dizziness numerical information generating unit 123 can generate dizziness numerical information indicating that the greater the difference between the FPS value and the threshold value, the greater the degree of dizziness the user feels.
Also, the dizziness numerical information generating unit 123 may calculate an average dizziness value over the reproduction time of the virtual reality content, or calculate a dizziness value for each measurement interval.
The virtual reality content analyzing apparatus transmits the generated dizziness numerical information to the client terminal.
As a result, according to the present invention, dizziness numerical information can be generated by analyzing information about the virtual reality content reproduced by a user, and this information can help developers create virtual reality content that causes the user less dizziness.
Although FIG. 1 shows the case where the virtual reality content analyzing apparatus is implemented as a separate apparatus, the virtual reality content analyzing apparatus may be included in the virtual reality terminal or in a server, according to an embodiment.
FIG. 2 is a view for explaining a dizziness numerical information generating unit according to an embodiment of the present invention, and FIG. 3 is a view for explaining a dizziness numerical information generating unit according to a specific embodiment of the present invention.
Referring to FIG. 2, the dizziness numerical information generating unit includes an FPS analyzing unit, an object analyzing unit, and a distance analyzing unit.
The FPS analyzing unit generates first dizziness numerical information by comparing the FPS value with a first threshold value, the object analyzing unit generates second dizziness numerical information by comparing at least one of the speed and acceleration values of the object and the frequency component for the object with a second threshold value, and the distance analyzing unit generates third dizziness numerical information by comparing the distance information with a third threshold value.
Referring to FIG. 3, each of the analyzing units can operate in the form of a library existing in a different layer.
The dizziness numerical information generating unit analyzes the information about the virtual reality content through the respective analyzing units and generates the dizziness numerical information using the analysis results.
Hereinafter, the information about the virtual reality contents will be described in detail, and a specific analysis method of each analyzing unit will be described.
<Information about virtual reality contents>
FIG. 4 is a view for explaining information about virtual reality content according to an embodiment of the present invention.
Information about the virtual reality content can be extracted through the data extraction script, as described above, and can be provided to the virtual reality content analyzing apparatus in the XML data format shown in FIG. 4.
Of the information about the virtual reality content, the frame information during the reproduction time includes reproduction time information, index information of the frames output during the reproduction time, and inter-frame output time difference information. For example, the frame information can be designated as the attribute values <deltaTime>, <frameCount> and <time> in FIG. 4, and each attribute value is as shown in [Table 2].
The total number of frames corresponds to the last <frameCount> attribute value, and <deltaTime> represents the difference between the current frame time t1 and the previous frame time t0.
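As an illustration, the frame information above could be read from an XML log in the shape FIG. 4 suggests. The fragment below is hypothetical: the attribute names <deltaTime>, <frameCount> and <time> come from the description, but the element layout and their placement as attributes are assumptions.

```python
import xml.etree.ElementTree as ET

# Hypothetical XML fragment shaped like the log FIG. 4 suggests.
SAMPLE = """
<log>
  <time deltaTime="0.011" frameCount="1" time="0.011"/>
  <time deltaTime="0.012" frameCount="2" time="0.023"/>
  <time deltaTime="0.010" frameCount="3" time="0.033"/>
</log>
"""

def parse_frame_info(xml_text):
    """Return (frame_index, delta_time, play_time) per frame, in seconds."""
    root = ET.fromstring(xml_text)
    return [(int(t.get("frameCount")),
             float(t.get("deltaTime")),
             float(t.get("time")))
            for t in root.iter("time")]

frames = parse_frame_info(SAMPLE)
total_frames = frames[-1][0]  # total frame count = last <frameCount> value
```

The last line mirrors the rule above that the total number of frames corresponds to the final <frameCount> attribute value.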
Of the information about the virtual reality content, the object information included in the virtual reality content being reproduced can be designated as the attribute value <list_gameObject> in FIG. 4. For example, the object information can be as shown in [Table 3].
Among the information about the virtual reality content, the geometric information about the object includes position information of the object included in the virtual reality content being reproduced and rotation information of the object. The geometric information can be designated, for example, as the attribute value <transform> in FIG. 4, and each attribute value is as shown in [Table 4].
<eulerAngles> is a three-dimensional vector in degrees (°) representing the rotation angle (tilt) about the X, Y, and Z axes, based on a calibration reference that is the direction the user is looking at when the virtual reality terminal is calibrated.
<FPS Analysis>
FIG. 5 is a view for explaining FPS according to an embodiment of the present invention, and FIG. 6 is a diagram showing pseudo code of an FPS analysis method using a macroscopic filter. FIG. 7 is a view for explaining an FPS analysis method using a microscopic filter, and FIG. 8 is a diagram showing pseudo code of an FPS analysis method using a microscopic filter.
Generally, frames are processed at 30 frames per second for TV broadcasts and at 60 frames per second for typical PC and console games, whereas virtual reality technology processes frames at 90 frames per second so that the human brain recognizes the virtual reality as real. The lower the number of frames per second (FPS), the worse the motion sickness experienced by the user viewing the virtual reality content.
The FPS analyzing unit analyzes the FPS value during the reproduction time using the frame information and generates dizziness numerical information.
More specifically, the FPS analyzing unit can analyze the FPS value using a macroscopic filter or a microscopic filter.
When a macroscopic filter is used, the FPS analyzing unit calculates a first FPS value for each frame during the reproduction time using the frame information, and compares one of the first FPS values with a first threshold value to generate dizziness numerical information.
FIG. 5 shows the timeline of frame output times during the reproduction time of the virtual reality content. The output time difference between consecutive frames S included in the virtual reality content is obtained from the frame information, and the FPS analyzing unit can calculate the first FPS value as the reciprocal of that difference, as shown in Equation (1):

FPS = 1 / (t_i - t_f)

where t_f is the output time of the previous frame and t_i is the output time of the current frame.
The FPS analyzing unit compares the calculated first FPS value with the first threshold value, and generates dizziness numerical information indicating a higher degree of dizziness as the first FPS value falls further below the threshold value.
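A minimal sketch of the macroscopic filter, assuming a 90 FPS threshold (the value the description cites for VR) and a score that grows with the gap below the threshold; the scoring scale itself is an assumption.

```python
VR_FPS_THRESHOLD = 90.0  # assumed first threshold; the text cites 90 FPS for VR

def first_fps_values(delta_times):
    """First FPS value per frame: reciprocal of the inter-frame output time,
    i.e. Equation (1): FPS = 1 / (t_i - t_f)."""
    return [1.0 / dt for dt in delta_times]

def macroscopic_dizziness(delta_times, threshold=VR_FPS_THRESHOLD):
    """Per-frame dizziness score: 0 when the frame meets the threshold,
    otherwise growing with the gap between the FPS value and the threshold."""
    return [max(0.0, threshold - fps) for fps in first_fps_values(delta_times)]

# two frames at 90 FPS, then one at 45 FPS and one at 30 FPS
scores = macroscopic_dizziness([1.0 / 90, 1.0 / 90, 1.0 / 45, 1.0 / 30])
```

Frames that meet the threshold score zero; a frame rendered at 30 FPS scores twice as high as one rendered at 60 FPS, matching the rule that a larger gap below the threshold means more dizziness.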
Meanwhile, the FPS analyzing unit may also divide the reproduction time into predetermined measurement intervals and generate dizziness numerical information for each measurement interval.
When a microscopic filter is used, the FPS analyzing unit divides the reproduction time into predetermined measurement intervals, calculates a second FPS value for each measurement interval using the first FPS values, and compares the second FPS value with a second threshold value to generate dizziness numerical information for each measurement interval.
More specifically, the FPS analyzing unit calculates the size K of a kernel window using the number of frames during the reproduction time and the reproduction time, overlaps a portion of the kernel window for each measurement interval, calculates a third FPS value for the frames included in each kernel window, and calculates the second FPS value by averaging the third FPS values.
At this time, R can be calculated as shown in Equation (3), where T corresponds to the reproduction time and ceil is the ceiling (round-up) function.
The FPS analyzing unit then averages the third FPS values of the kernel windows included in a measurement interval to obtain the second FPS value for that interval.
When the number of frames in a measurement interval is greater than K, the FPS analyzing unit uses a plurality of partially overlapping kernel windows within the interval.
That is, the FPS analyzing unit generates dizziness numerical information for each measurement interval by comparing the second FPS value of the interval with the threshold value.
For example, if there is no section in which the second FPS value is smaller than the threshold value in the first measurement interval, the dizziness value for the first measurement interval may be calculated to be low. Alternatively, if the second FPS value is greater than the threshold value in the first measurement interval, the dizziness value of the corresponding interval may be calculated to be high or low depending on the difference from the threshold value.
Referring to FIG. 8, the FPS analysis method using the microscopic filter can be expressed as pseudo code.
On the other hand, a section (I_Remainder) of the measurement period in which the number of frames is smaller than K is preferably excluded from the analysis.
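The kernel-window averaging can be sketched as follows; the 50% overlap ratio and passing the kernel size in directly (rather than deriving it via Equation (3)) are assumptions.

```python
import math

def third_fps(window):
    """Third FPS value for one kernel window: frames divided by the time they span."""
    return len(window) / sum(window)

def second_fps(delta_times, kernel_size, overlap=0.5):
    """Second FPS value for a measurement interval: average of the third FPS
    values of partially overlapping kernel windows. Trailing frames that
    cannot fill a whole window (I_Remainder) are excluded, as recommended."""
    step = max(1, math.ceil(kernel_size * (1 - overlap)))
    windows = [delta_times[i:i + kernel_size]
               for i in range(0, len(delta_times) - kernel_size + 1, step)]
    return sum(third_fps(w) for w in windows) / len(windows)

# 180 frames rendered at a steady 90 FPS give a second FPS value of 90
steady = second_fps([1.0 / 90.0] * 180, kernel_size=90)
```

For a steady frame stream the second FPS value equals the true frame rate, so the averaging only changes the result when the rate fluctuates within the interval.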
Using the dizziness numerical information from the FPS analysis, the virtual reality content developer can develop the content so that the FPS is increased in intervals where the user is likely to feel dizziness.
<Speed and Acceleration Analysis>
In the real world, the vestibular organs of the inner ear, which judge the sense of balance, do not sense discomfort when the rate of change is low or when a constant speed or acceleration acts on the human body. However, when the user views virtual reality content in which an object moves with a large speed or acceleration while the user's body remains stationary, a sensory mismatch occurs and the user can feel dizziness.
The object analyzing unit calculates the speed and acceleration of the object using the geometric information, and compares the speed and acceleration with a threshold value to generate dizziness numerical information.
As described above, since the geometric information includes the position information of the object and the frame information includes the inter-frame output time difference information, the object analyzing unit can calculate the speed of the object by dividing the change in the object's position between frames by the inter-frame output time difference.
The object analyzing unit can likewise calculate the acceleration of the object by dividing the change in speed between frames by the inter-frame output time difference.
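The finite-difference computation described above can be sketched as follows; the 3D position tuples and the Euclidean distance are assumptions consistent with the position information of the geometric data.

```python
def speeds_and_accels(positions, delta_times):
    """Finite-difference speed and acceleration of an object from its per-frame
    3D positions (geometric information) and the inter-frame output time
    differences (frame information)."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # speed between frames = position change / output time difference
    speeds = [dist(p0, p1) / dt
              for p0, p1, dt in zip(positions, positions[1:], delta_times[1:])]
    # acceleration = speed change / output time difference
    accels = [(v1 - v0) / dt
              for v0, v1, dt in zip(speeds, speeds[1:], delta_times[2:])]
    return speeds, accels

# an object moving 0.1 units every 0.01 s: constant speed 10, acceleration 0
positions = [(0.1 * i, 0.0, 0.0) for i in range(5)]
speeds, accels = speeds_and_accels(positions, [0.01] * 5)
```

An object translating a fixed amount per frame yields a constant speed and zero acceleration, which would fall below any reasonable threshold in the comparison above.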
On the other hand, the object analyzing unit may also generate dizziness numerical information for each measurement interval, as in the FPS analysis.
Using the dizziness numerical information from the speed and acceleration analysis, the virtual reality content developer can reduce the dizziness the user feels by developing the content so that the speed and acceleration of objects responding to the user are reduced.
<Frequency analysis>
FIG. 9 is a diagram illustrating interval segmentation for frequency analysis according to an embodiment of the present invention, and FIG. 10 is a diagram illustrating an FFT analysis result. FIG. 11 is a diagram showing pseudo code of a frequency analysis method according to the present invention.
Among the causes of symptoms such as motion sickness, dizziness, and vomiting while experiencing virtual reality content is mismatched motion. Motion mismatch means an inconsistency between the motion in the simulation and the motion intended by the user. Therefore, while viewing virtual reality content, the user is advised to refrain from actions such as periodic head movement.
The object analyzing unit analyzes the frequency component of the object that moves according to the user's movement, and generates dizziness numerical information.
The object analyzing unit calculates the angular acceleration of the object using the geometric information. Here, the angular velocity is obtained by dividing Δθ, the change in the rotation angle of the object between the previous frame and the current frame, by Δt, the time difference between the previous frame and the current frame, and the angular acceleration is obtained by dividing the change in angular velocity by Δt. Then, the object analyzing unit calculates the magnitude of the angular acceleration.
The object analyzing unit performs frequency analysis by applying a Fast Fourier Transform (FFT) to the calculated magnitude.
When the FFT is applied to the calculated magnitude, complex-valued data is output. The absolute value of the complex-valued data is the amplitude in the frequency domain.
As a result, the object analyzing unit compares the amplitude of a predetermined frequency band with a threshold value to generate dizziness numerical information.
As described above, the motion mismatch occurs due to the user's repeated periodic operation, and the frequency component for the movement of the object according to the motion of the user is represented in the low frequency region. In one embodiment, the frequency band that is compared to the threshold may be a frequency band that is selected in the band between 0.05 Hz and 0.8 Hz.
FIG. 10 shows the result of performing the FFT with the measurement period set to 5 seconds while the user periodically moves the head. FIG. 10(a) shows the interval from the start of reproduction to 5 seconds, and FIG. 10(b) shows the interval from 5 seconds to 10 seconds. As shown in FIG. 10, when the user periodically moves the head, the amplitude is high in the low frequency region close to zero.
Accordingly, the object analyzing unit can generate dizziness numerical information indicating a higher degree of dizziness as the amplitude of the low frequency band exceeds the threshold value by a larger amount.
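The band check can be sketched as follows, using a naive DFT in place of the FFT the description names. The sampling rate and series length are assumptions; only the 0.05 to 0.8 Hz band comes from the text.

```python
import cmath
import math

def dft_amplitudes(samples, sample_rate):
    """Naive DFT (standing in for the FFT): (frequency_hz, amplitude) pairs
    for the non-negative frequency bins; amplitude = |complex output|."""
    n = len(samples)
    result = []
    for k in range(n // 2 + 1):
        bin_sum = sum(x * cmath.exp(-2j * cmath.pi * k * i / n)
                      for i, x in enumerate(samples))
        result.append((k * sample_rate / n, abs(bin_sum)))
    return result

def band_amplitude(samples, sample_rate, lo=0.05, hi=0.8):
    """Peak amplitude inside the 0.05-0.8 Hz band singled out by the text."""
    return max((amp for freq, amp in dft_amplitudes(samples, sample_rate)
                if lo <= freq <= hi), default=0.0)

# 10 s of angular-acceleration magnitude sampled at 10 Hz (assumed rates):
# a 0.5 Hz oscillation lands inside the band, a 2 Hz one falls outside it
low_freq = [math.sin(2 * math.pi * 0.5 * i / 10.0) for i in range(100)]
high_freq = [math.sin(2 * math.pi * 2.0 * i / 10.0) for i in range(100)]
in_band = band_amplitude(low_freq, sample_rate=10.0)
out_of_band = band_amplitude(high_freq, sample_rate=10.0)
```

A 0.5 Hz oscillation, like periodic head movement, produces a large amplitude inside the band, while a 2 Hz signal contributes almost nothing to it, so only the former would trip the threshold comparison.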
Using the dizziness numerical information from the frequency analysis, the virtual reality content developer can reduce the dizziness the user feels by developing the content so that objects react less sensitively to unnecessary movements of the user.
<Distance Analysis>
FIG. 12 is a view for explaining a distance analysis method according to an embodiment of the present invention.
When a plurality of objects exist in virtual reality content reproduced in a virtual reality terminal such as an HMD and an object is located too close to the user, that is, when a specific object occupies most of the playback screen, the user becomes more likely to feel dizziness.
Accordingly, the distance analyzing unit calculates the distance between the user and the object using the geometric information, and compares the calculated distance with a threshold value to generate dizziness numerical information.
As shown in FIG. 12, when four rabbits exist within the user's viewing volume (between the clipping plane and the far plane), the distance analyzing unit calculates the distance between the user and each rabbit.
The distance analyzing unit generates dizziness numerical information indicating a higher degree of dizziness as the distance between the user and the object becomes smaller than the threshold value.
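The distance check can be sketched as follows; the threshold distance of 1.0 and the object names are hypothetical.

```python
def distance_dizziness(user_pos, object_positions, min_distance=1.0):
    """Flag objects closer to the user than an assumed threshold distance.
    Positions are 3D tuples taken from the geometric information."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return [(name, dist(user_pos, pos), dist(user_pos, pos) < min_distance)
            for name, pos in object_positions.items()]

# one object well inside the threshold, one comfortably outside it
report = distance_dizziness((0.0, 0.0, 0.0),
                            {"rabbit1": (0.5, 0.0, 0.0),
                             "rabbit2": (3.0, 4.0, 0.0)})
```

Each tuple carries the object name, its distance from the user, and a flag marking whether it sits closer than the threshold.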
Using the dizziness numerical information from the distance analysis, the virtual reality content developer can reduce the dizziness the user feels by developing the content so that the distance between the user and objects included in the content remains greater than a specific distance.
FIG. 13 is a view for explaining a report message according to an embodiment of the present invention.
As described above, the virtual reality content analyzing apparatus transmits the dizziness numerical information to the client terminal, and at this time the information and analysis results used to generate the dizziness numerical information can be transmitted together, in the same XML data format.
The topmost header <AnalysisReportRoot> stores the dizziness numerical information generated according to the analysis method applied to each frame or measurement interval. FIG. 13 shows a case where the FPS value and the corresponding dizziness numerical information (Low FPS) are stored for the 730th frame, and the acceleration value and the corresponding dizziness numerical information (High Acceleration) are stored for the 714th frame.
The <TargetObjName> existing as an attribute in the header stores information about the object to be analyzed.
<Message> is an attribute for adding a descriptive message, and this part is output as a log message at the client terminal.
<nStackLevel> represents the layer information of the corresponding dizziness information, <strType> is a string type, and defines the type of module in which the current log is generated.
After defining the attributes for the log itself, information about the point at which the log was created is stored, which is stored in the <time> part under the <AnalysisReportRoot>. The most important attribute values are <deltaTime>, <frameCount>, and <time>.
<deltaTime> stores the time taken from the output of the previous frame to the output of the current frame, and <frameCount> specifies the frame number of the current frame during playback of the current scene. Finally, <time> stores the playback time. All <time>-related data are stored in units of seconds and can include a decimal fraction so that per-frame time accuracy is maintained.
On the other hand, a null value (0) is stored in an attribute in which a specific value is not stored, and various values may be stored according to the embodiment.
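A sketch of emitting such a report with the standard library. The <AnalysisReportRoot> header, the TargetObjName and Message attributes, and the <time> element with <deltaTime>, <frameCount> and <time> come from the description; the <Log> wrapper per entry and the dict keys are assumptions, since the exact nesting is not specified.

```python
import xml.etree.ElementTree as ET

def build_report(entries):
    """Assemble a report in the shape FIG. 13 suggests."""
    root = ET.Element("AnalysisReportRoot")
    for entry in entries:
        log = ET.SubElement(root, "Log", {
            "TargetObjName": entry.get("object", ""),
            "Message": entry["message"],  # echoed as a log message client-side
        })
        ET.SubElement(log, "time", {
            "deltaTime": str(entry.get("deltaTime", 0)),  # seconds, fractional
            "frameCount": str(entry["frameCount"]),
            "time": str(entry.get("time", 0)),
        })
    return ET.tostring(root, encoding="unicode")

xml_text = build_report([{"object": "Camera", "message": "Low FPS",
                          "frameCount": 730, "deltaTime": 0.033, "time": 24.1}])
```

Missing values default to 0, mirroring the rule above that a null value (0) is stored when no specific value applies.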
FIG. 14 is a diagram for explaining a method of analyzing virtual reality content according to an embodiment of the present invention.
FIG. 14 illustrates a method of analyzing virtual reality content performed by the virtual reality content analyzing apparatus described above; descriptions overlapping those given above are omitted.
The apparatus for analyzing virtual reality content according to the present invention receives information about virtual reality content reproduced by a user (S1410), analyzes the received information, and generates dizziness numerical information about the virtual reality content using the analysis result (S1420). The virtual reality content analyzing apparatus can generate the dizziness numerical information by comparing the analysis result with predetermined threshold values.
At this time, the information about the virtual reality content includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about the object. The analysis result includes at least one of a frames-per-second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.
The virtual reality content analyzing apparatus may calculate a first FPS value for each frame during the reproduction time using the frame information, and compare one of the first FPS values with a first threshold value to generate dizziness numerical information.
Alternatively, the virtual reality content analyzing apparatus may divide the reproduction time into predetermined measurement intervals, calculate a second FPS value for each measurement interval using the first FPS values, and compare the second FPS value with a second threshold value to generate dizziness numerical information for each measurement interval.
Alternatively, the virtual reality content analyzing apparatus may use the geometric information to calculate the speed and acceleration of the object, and compare the speed and acceleration with a third threshold value to generate dizziness numerical information, according to an embodiment.
Alternatively, the virtual reality content analyzing apparatus according to an embodiment calculates the magnitude of the angular acceleration of the object moving according to the user's movement using the geometric information, calculates the amplitude in the frequency domain through frequency analysis of the magnitude, and compares the amplitude of a predetermined frequency band with a fourth threshold value to generate the dizziness numerical information.
Alternatively, the virtual reality content analyzing apparatus may calculate the distance between the user and the object using the geometric information, and compare the calculated distance with a fifth threshold value to generate dizziness numerical information, according to an embodiment.
The technical features described above may be implemented in the form of program instructions that can be executed by various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be specially designed and constructed for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, and flash memory. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware device may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.
As described above, the present invention has been described with reference to particular embodiments, specific elements, and drawings. However, the present invention is not limited to the above embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains. Accordingly, the spirit of the present invention should not be construed as being limited to the described embodiments, and the following claims as well as all equivalents thereof belong to the scope of the present invention.
Claims (19)
Analyzing information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result,
Wherein the information about the virtual reality contents includes:
Frame information during reproduction time, object information included in the virtual reality content, and geometric information about the object,
The results of the analysis
A frames-per-second (FPS) value during the playback time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object,
Virtual reality content analysis method.
The step of generating the dizzy numerical information
And comparing the analysis result with a predetermined threshold value to generate the dizzy numerical value information
Virtual reality content analysis method.
The frame information during the reproduction time
Includes reproduction time information, index information of the frames output during the reproduction time, and inter-frame output time difference information,
Virtual reality content analysis method.
The step of generating the dizzy numerical information
Calculating a first FPS value for each frame during the reproduction time using the frame information; And
Comparing the threshold value with one of the first FPS values to generate the dizziness numerical information
And analyzing the virtual reality content.
The step of generating the dizzy numerical information
Calculating a first FPS value for each frame during the reproduction time using the frame information;
Dividing the reproduction time into a predetermined measurement period;
Calculating a second FPS value for each measurement interval using the first FPS value; And
Comparing the second FPS value with a threshold value, and generating the dizziness value information for each measurement interval
And analyzing the virtual reality content.
The step of calculating the second FPS value for each measurement interval
Calculating a size of a kernel window using the number of frames during the playback time and the playback time;
Overlapping a portion of the kernel window for each measurement interval and calculating a third FPS value for a frame included in the kernel window for each kernel window; And
Calculating the second FPS value by averaging the third FPS value
And analyzing the virtual reality content.
The threshold value
One of the second FPS values per measurement interval
Virtual reality content analysis method.
The geometric information
Includes position information of the object and rotation information of the object,
Virtual reality content analysis method.
The step of generating the dizzy numerical information
Calculating the speed and acceleration of the object using the geometric information; And
Comparing the speed and acceleration with a threshold value, and generating the dizziness value information
And analyzing the virtual reality content.
The step of generating the dizzy numerical information
Analyzes the frequency component of the moving object according to the movement of the user, and generates the dizzy numerical value information
Virtual reality content analysis method.
The step of generating the dizzy numerical information
Calculating an angular acceleration of the object using the geometric information;
Calculating a magnitude of the angular acceleration;
Calculating an amplitude in the frequency domain through frequency analysis for the magnitude; And
Comparing the amplitude of the predetermined frequency band with a threshold value, and generating the dizzy numerical value information
And analyzing the virtual reality content.
The frequency band
The frequency band selected in the band between 0.05 Hz and 0.8 Hz
Virtual reality content analysis method.
The step of generating the dizzy numerical information
Using the geometric information to calculate a distance between the user and the object; And
Comparing the distance with a threshold value, and generating the dizziness value information
And analyzing the virtual reality content.
Further comprising the step of transmitting the analysis result and the dizziness numerical information to a client terminal,
Virtual reality content analysis method.
And a dizziness numerical information generating unit for analyzing the information about the virtual reality contents and generating dizziness numerical information about the virtual reality contents using the analysis result,
Wherein the information about the virtual reality contents includes:
Frame information during reproduction time, object information included in the virtual reality content, and geometric information about the object,
The results of the analysis
A frames-per-second (FPS) value during the playback time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object,
Virtual reality content analysis device.
The dizzy numerical value information generating unit
An FPS analyzer for comparing the FPS value with a first threshold value to generate first dizziness value information;
An object analyzer for comparing at least one of a speed and an acceleration value of the object and a frequency component for the object with a second threshold to generate second dizziness numerical information; And
A distance analyzing unit for comparing the distance information with a third threshold value and generating third dizziness value information,
Wherein the virtual reality content analyzing apparatus comprises:
Each of the analysis units
Operates in the form of a library existing in a different layer,
Virtual reality content analysis device.
An information receiving unit configured to receive the information about the virtual reality content from a virtual reality terminal; and
an information transmitting unit configured to transmit the dizziness numerical information to the client terminal,
wherein the virtual reality content analysis apparatus further comprises the receiving and transmitting units, and is included in a virtual reality terminal or a server.
A virtual reality content analysis apparatus.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020160027790A KR20170104846A (en) | 2016-03-08 | 2016-03-08 | Method and apparatus for analyzing virtual reality content |
Publications (1)
Publication Number | Publication Date |
---|---|
KR20170104846A (en) | 2017-09-18 |
Family
ID=60034385
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020160027790A KR20170104846A (en) | 2016-03-08 | 2016-03-08 | Method and apparatus for analyzing virtual reality content |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR20170104846A (en) |
Cited By (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101958263B1 (en) | 2018-08-03 | 2019-03-14 | (주)이노시뮬레이션 | The control method for VR contents and UI templates |
KR20190045532A (en) * | 2017-10-24 | 2019-05-03 | (주)플레이솔루션 | Experience system of virtual reality contents based on motion simulator |
KR20190069684A (en) * | 2017-12-12 | 2019-06-20 | 한국과학기술원 | Apparatus for sickness assessment of vr contents using deep learning based analysis of visualvestibular mismatch and the method thereof |
KR102103430B1 (en) * | 2018-11-08 | 2020-04-22 | 서울과학기술대학교 산학협력단 | Method and system for measuring latency in cloud based virtual reallity services |
KR20200052693A (en) * | 2018-11-07 | 2020-05-15 | 주식회사 인디고엔터테인먼트 | Virtual reality player and integrated management system for monitoring thereof |
US10832483B2 (en) | 2017-12-05 | 2020-11-10 | Electronics And Telecommunications Research Institute | Apparatus and method of monitoring VR sickness prediction model for virtual reality content |
US11366520B2 (en) | 2018-12-07 | 2022-06-21 | Electronics And Telecommunications Research Institute | Method for analyzing element inducing motion sickness in virtual-reality content and apparatus using the same |
History
- 2016-03-08: Application KR1020160027790A filed in KR; published as KR20170104846A (not active, application discontinued)
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR20170104846A (en) | Method and apparatus for analyzing virtual reality content | |
US20230414899A1 (en) | Classifying a discomfort level of a user when interacting with virtual reality (vr) content | |
US10255715B2 (en) | Field of view (FOV) throttling of virtual reality (VR) content in a head mounted display | |
CN110227266B (en) | Building virtual reality game play environments using real world virtual reality maps | |
US20170084084A1 (en) | Mapping of user interaction within a virtual reality environment | |
KR102055481B1 (en) | Method and apparatus for quantitative evaluation assessment of vr content perceptual quality using deep running analysis of vr sickness factors | |
US11302049B2 (en) | Preventing transition shocks during transitions between realities | |
KR20170105905A (en) | Method and apparatus for analyzing virtual reality content | |
US10671151B2 (en) | Mitigating digital reality leakage through session modification | |
CN116474378A (en) | Artificial Intelligence (AI) controlled camera perspective generator and AI broadcaster | |
CN108257205B (en) | Three-dimensional model construction method, device and system | |
Gonçalves et al. | Systematic review of comparative studies of the impact of realism in immersive virtual experiences | |
KR20180013892A (en) | Reactive animation for virtual reality | |
Tseng | Intelligent augmented reality system based on speech recognition | |
US20220210523A1 (en) | Methods and Systems for Dynamic Summary Queue Generation and Provision | |
KR20120042467A (en) | Method for reusing physics simulation results and game service apparatus using the same | |
US20230199420A1 (en) | Real-world room acoustics, and rendering virtual objects into a room that produce virtual acoustics based on real world objects in the room | |
KR102427085B1 (en) | Electronic apparatus for providing education service and method for operation thereof | |
US10203833B2 (en) | Recommending application modifications using image area counts | |
CN114341944A (en) | Computer generated reality recorder | |
KR20200033645A (en) | Method and program for providing the cause of dizzness in vr contents | |
US11842455B1 (en) | Synchronizing physical and virtual environments using quantum entanglement | |
US11863956B2 (en) | Methods and systems for balancing audio directed to each ear of user | |
CN116483208B (en) | Anti-dizzy method and device for virtual reality equipment, computer equipment and medium | |
US20240033640A1 (en) | User sentiment detection to identify user impairment during game play providing for automatic generation or modification of in-game effects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
A201 | Request for examination | ||
E902 | Notification of reason for refusal | ||
E601 | Decision to refuse application |