KR20170104846A - Method and apparatus for analyzing virtual reality content - Google Patents

Method and apparatus for analyzing virtual reality content Download PDF

Info

Publication number
KR20170104846A
Authority
KR
South Korea
Prior art keywords
information
virtual reality
value
reality content
fps
Prior art date
Application number
KR1020160027790A
Other languages
Korean (ko)
Inventor
정준교
이상호
Original Assignee
주식회사 그루크리에이티브랩
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 주식회사 그루크리에이티브랩 filed Critical 주식회사 그루크리에이티브랩
Priority to KR1020160027790A priority Critical patent/KR20170104846A/en
Publication of KR20170104846A publication Critical patent/KR20170104846A/en

Links

Images

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10 Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/106 Processing image signals
    • H04N13/144 Processing image signals for flicker reduction
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N2013/0074 Stereoscopic image analysis

Abstract

A method and apparatus for analyzing virtual reality content, capable of generating dizziness numerical information for the content, are disclosed. The method for analyzing virtual reality content includes a step of receiving information about virtual reality content reproduced by a user, and a step of analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result. The information about the virtual reality content includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about an object. The analysis result includes at least one of a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.

Description

METHOD AND APPARATUS FOR ANALYZING VIRTUAL REALITY CONTENT

BACKGROUND OF THE INVENTION [0001]

The present invention relates to a method and apparatus for analyzing virtual reality content, and more particularly, to a method and apparatus for analyzing virtual reality content capable of generating dizziness numerical information about the content.

With the development of user interface technology, input/output devices that provide various audiovisual stimuli to users have been developed. For example, input devices, which conventionally produced input signals only through a keyboard, a mouse, a joystick, and the like, have evolved to produce input signals in various ways, such as by sensing changes in gravity or acceleration.

In addition, output devices have evolved beyond providing a large screen or an ultra-high-quality image to providing stereoscopic images and more realistic audiovisual stimulation. For example, output devices that let users experience virtual reality by wearing them on their heads, such as the Oculus Rift, Galaxy VR, and Morpheus, are being developed.

Unlike general audiovisual output devices, these virtual reality terminals are configured such that the user's field of view is completely covered and the user visually perceives only the image displayed on the output screen. Because of this characteristic, there is a problem that the user may feel motion sickness.

Various methods have been studied to reduce the dizziness the user feels in the virtual reality environment. Korean Patent No. 10-1564964 is a related prior art document.

An object of the present invention is to provide a method and apparatus for analyzing virtual reality content capable of generating dizziness numerical information about virtual reality content.

Another object of the present invention is to provide a method and apparatus for analyzing virtual reality content that provide a virtual reality content developer with dizziness information about the content, so that the developer can predict the probability that the content will induce dizziness and thereby reduce the time and economic costs required for development.

According to an aspect of the present invention, there is provided a method of analyzing virtual reality content, the method comprising: receiving information about virtual reality content reproduced by a user; and analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result, wherein the information about the virtual reality content includes at least one of frame information during a reproduction time, object information included in the virtual reality content, and geometric information about an object, and the analysis result includes at least one of a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.

According to another aspect of the present invention, there is provided an apparatus for analyzing virtual reality content, the apparatus comprising: a storage unit for storing information about virtual reality content reproduced by a user; and a dizziness numerical information generating unit for analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result, wherein the information about the virtual reality content includes at least one of frame information during a reproduction time, object information included in the virtual reality content, and geometric information about an object, and the analysis result includes at least one of a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.

According to the present invention, dizziness numerical information can be generated by analyzing information about virtual reality content reproduced by a user.

Further, according to the present invention, the dizziness numerical information about the virtual reality content is provided to the developer of the content, and the developer can predict the probability that the content will induce dizziness; the present invention thus provides a method and apparatus for analyzing virtual reality content that can reduce the time and economic costs required for development.

FIG. 1 is a view for explaining a virtual reality content analysis system according to an embodiment of the present invention.
FIG. 2 is a view for explaining a dizziness numerical information generating unit according to an embodiment of the present invention.
FIG. 3 is a view for explaining a dizziness numerical information generating unit according to a specific embodiment of the present invention.
FIG. 4 is a view for explaining information about virtual reality content according to an embodiment of the present invention.
FIG. 5 is a view for explaining FPS according to an embodiment of the present invention.
FIG. 6 is a diagram showing pseudo code of an FPS analysis method using a macroscopic filter.
FIG. 7 is a diagram for explaining an FPS analysis method using a microscopic filter.
FIG. 8 is a diagram showing pseudo code of an FPS analysis method using a microscopic filter.
FIG. 9 is a diagram illustrating interval division for frequency analysis according to an embodiment of the present invention.
FIG. 10 is a diagram showing the results of an FFT analysis.
FIG. 11 is a diagram showing pseudo code of a frequency analysis method according to the present invention.
FIG. 12 is a view for explaining a distance analysis method according to an embodiment of the present invention.
FIG. 13 is a view for explaining a report message according to an embodiment of the present invention.
FIG. 14 is a diagram for explaining a method of analyzing virtual reality content according to an embodiment of the present invention.

While the invention is susceptible to various modifications and alternative forms, specific embodiments thereof are shown by way of example in the drawings and will herein be described in detail. It should be understood, however, that the invention is not intended to be limited to the particular embodiments, but includes all modifications, equivalents, and alternatives falling within the spirit and scope of the invention. Like reference numerals are used for like elements in describing each drawing.

The present invention analyzes virtual reality content and calculates how dizzy a user would be when viewing it, providing the result as dizziness numerical information. That is, the present invention analyzes the degree of dizziness that can be induced in the user by the software, that is, the virtual reality content itself, rather than by a hardware device such as a virtual reality terminal. The dizziness numerical information is information that quantifies the degree of dizziness a user may feel when viewing the virtual reality content.

The dizziness numerical information can be provided to the developer who produces the virtual reality content. Using the dizziness numerical information according to the present invention, the developer can develop the virtual reality content so that the user feels less dizziness.

Hereinafter, embodiments according to the present invention will be described in detail with reference to the accompanying drawings.

FIG. 1 is a view for explaining a virtual reality content analysis system according to an embodiment of the present invention.

Referring to FIG. 1, a virtual reality content analysis system according to the present invention includes a virtual reality terminal 110, a virtual reality content analysis apparatus 120, and a client terminal 130.

The virtual reality terminal 110 may be a glasses-type terminal that reproduces virtual reality content, or a mobile terminal mounted in a headset-type device. The virtual reality terminal 110 reproduces the virtual reality content and provides the information the virtual reality content analysis apparatus 120 needs to generate dizziness numerical information.

The information about the virtual reality content that the virtual reality terminal 110 provides to the virtual reality content analysis apparatus 120 may include at least one of frame information during the reproduction time, object information included in the virtual reality content being reproduced, and geometric information about an object. A data extraction script for extracting this information from the virtual reality content being reproduced can be loaded on the virtual reality terminal 110; the data extraction script can be created using a graphics engine such as a game engine and extracts the information about the virtual reality content.

For example, when the virtual reality content is a game, objects in the virtual world respond to the user's movements or inputs, and the virtual reality terminal 110 transmits the frame information recorded while the user plays the game, together with the information about the objects, to the virtual reality content analysis apparatus 120.

The virtual reality content analysis apparatus 120 includes a storage unit 121 and a dizziness numerical information generating unit 123. The storage unit 121 stores the information about the virtual reality content reproduced by the user, transmitted from the virtual reality terminal 110. The dizziness numerical information generating unit 123 analyzes the information about the virtual reality content and generates dizziness numerical information for the virtual reality content using the analysis result.

The dizziness numerical information generating unit 123 may generate an analysis result that includes at least one of a frames per second (FPS) value during the reproduction time, speed and acceleration values of an object, a frequency component for an object, and distance information between the user and an object.

The dizziness numerical information generating unit 123 may generate the dizziness numerical information by comparing the analysis result with a preset threshold value. For example, when the FPS value is smaller than the threshold value, the dizziness numerical information generating unit 123 can generate dizziness numerical information indicating that the user feels greater dizziness as the difference between the FPS value and the threshold value increases.

Also, the dizziness numerical information generating unit 123 may calculate an average dizziness value over the reproduction time of the virtual reality content, or calculate a dizziness value for each measurement interval.

The client terminal 130 receives and outputs the dizziness numerical information. The client terminal 130 may be a terminal of the virtual reality content developer, and visualizing and outputting the dizziness numerical information can help in developing the virtual reality content.

As a result, according to the present invention, dizziness numerical information can be generated by analyzing information about virtual reality content reproduced by a user, and the dizziness numerical information can help developers produce virtual reality content that causes the user less dizziness.

In FIG. 1, the case where the virtual reality content analysis apparatus 120 generates the dizziness numerical information is described as an embodiment; depending on the embodiment, however, the virtual reality terminal 110 may generate the dizziness numerical information.

FIG. 2 is a view for explaining a dizziness numerical information generating unit according to an embodiment of the present invention, and FIG. 3 is a view for explaining a dizziness numerical information generating unit according to a specific embodiment of the present invention.

Referring to FIG. 2, the dizziness numerical information generating unit 120 includes an FPS analyzer 210, an object analyzer 220, and a distance analyzer 230.

The FPS analyzer 210 analyzes the FPS of the virtual reality content and compares the FPS value with a first threshold value to generate first dizziness numerical information. The object analyzer 220 compares at least one of the speed and acceleration values and the frequency component of an object included in the virtual reality content with a second threshold value to generate second dizziness numerical information. The distance analyzer 230 compares the distance information between the user and an object with a third threshold value to generate third dizziness numerical information.

Referring to FIG. 3, the FPS analyzer 210, the object analyzer 220, and the distance analyzer 230 may operate in the form of libraries existing in different layers. For example, as shown in Table 1, the FPS analyzer 210 belongs to a first layer (Layer 1), the object analyzer 220 to a second layer (Layer 2), and the distance analyzer 230 to a third layer (Layer 3); depending on the embodiment, additional modules for analyzing information about the virtual reality content may be added to each layer.

[Table 1]
Layer # | Detailed module number | Module function
1 | 1-1 | FPS analysis
2 | 2-1 | Speed and acceleration analysis
2 | 2-2 | Frequency analysis
3 | 3-1 | Distance analysis

The dizziness numerical information generating unit 120 may include at least one of the FPS analyzer 210, the object analyzer 220, and the distance analyzer 230, depending on the embodiment.

Hereinafter, the information about the virtual reality content will be described in detail, and the specific analysis method of each analyzer will then be described.

<Information about virtual reality contents>

FIG. 4 is a view for explaining information about virtual reality content according to an embodiment of the present invention.

Information about the virtual reality content can be extracted through the data extraction script, as described above, and can be provided to the virtual reality content analysis apparatus 120 in XML form as shown in FIG. 4. FIG. 4 shows the information of the virtual reality content for a frame included in a specific scene, and XML data may be generated for each frame.

Of the information about the virtual reality content, the frame information during the reproduction time includes reproduction time information, index information of the frames output during the reproduction time, and inter-frame output time difference information. For example, the frame information can be designated as the attribute values of <deltaTime>, <frameCount>, and <time> in FIG. 4, and each attribute value is as shown in [Table 2].

The total number of frames corresponds to the last <frameCount> attribute value, and <deltaTime> represents the difference between the current frame time t1 and the previous frame time t0.

[Table 2]
Type | Name | Data Type | Description
Static Variables | deltaTime | float | Difference in output time between frames (unit: seconds)
Static Variables | frameCount | int | Index of the corresponding frame
Static Variables | time | float | Elapsed time since the content started playing (unit: seconds)

Of the information about the virtual reality content, the object information included in the virtual reality content being reproduced can be designated as the attribute values of <list_gameObject> in FIG. 4. For example, the object information can be as shown in [Table 3].

[Table 3]
Type | Name | Data Type | Description
Inherited Member (Variable) | name | string | Name of the content object
Variables | active | bool | Whether the content object is active
Variables | isStatic | bool | Whether the content object is static
Variables | layer | int | The layer number to which the content object belongs
Variables | tag | string | Tags for the content object

Among the information about the virtual reality content, the geometric information about an object includes the position information and the rotation information of the object included in the virtual reality content being reproduced. The geometric information can be designated, for example, as the attribute values of <transform> in FIG. 4, and each attribute value is as shown in [Table 4].

<eulerAngles> is a three-dimensional vector representing, in degrees (°), the rotation angle (tilt) about the X, Y, and Z axes relative to a calibration reference, which is the direction the user is looking when the virtual reality terminal 110 is first started. <position> is a three-dimensional vector expressing the coordinates (X, Y, Z) of an object in the virtual reality world (VR world).

[Table 4]
Type | Name | Data Type | Description
Variables | eulerAngles | Vector3 (float) | The rotation angle of the object with respect to the X, Y, and Z axes (Euler angles, in degrees (°))
Variables | position | Vector3 (float) | X, Y, Z coordinates of the object in the virtual world
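Since FIG. 4 itself is not reproduced in this text, the following Python sketch shows one plausible way to read a per-frame XML record using only the tags named above; the element nesting and the comma-separated vector encoding are assumptions, not the patent's actual schema.

```python
# Hypothetical parser for one per-frame XML record. Only the tag names
# (<deltaTime>, <frameCount>, <time>, <list_gameObject>, <transform>,
# <eulerAngles>, <position>) come from the text; the nesting and the
# comma-separated vector encoding are assumptions.
import xml.etree.ElementTree as ET

def parse_frame(frame_xml: str) -> dict:
    root = ET.fromstring(frame_xml)
    frame = {
        "deltaTime": float(root.findtext("deltaTime")),  # seconds between frames
        "frameCount": int(root.findtext("frameCount")),  # index of this frame
        "time": float(root.findtext("time")),            # elapsed playback time (s)
        "objects": [],
    }
    for obj in root.iter("list_gameObject"):
        transform = obj.find("transform")
        frame["objects"].append({
            "name": obj.findtext("name"),
            "position": [float(v) for v in
                         transform.findtext("position").split(",")],
            "eulerAngles": [float(v) for v in
                            transform.findtext("eulerAngles").split(",")],
        })
    return frame
```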

<FPS Analysis>

FIG. 5 is a view for explaining FPS according to an embodiment of the present invention, and FIG. 6 is a diagram showing pseudo code of an FPS analysis method using a macroscopic filter. FIG. 7 is a view for explaining an FPS analysis method using a microscopic filter, and FIG. 8 is a diagram showing pseudo code of an FPS analysis method using a microscopic filter.

Generally, frames are processed at 30 frames per second for TV broadcasts and at 60 frames per second for typical PC and console games, whereas virtual reality technology processes frames at 90 frames per second so that the human brain recognizes the virtual reality as real. The lower the number of frames per second (FPS), the worse the motion sickness experienced by the user viewing the virtual reality content.

The FPS analyzer 210 calculates the FPS value for the virtual reality content using the frame information, and compares the FPS value with the threshold value to generate the dizziness numerical information. The threshold may be set to, for example, 90.

More specifically, the FPS analyzer 210 uses two methods: analyzing the FPS over the entire virtual reality content, and dividing the virtual reality content into measurement intervals and calculating and analyzing the FPS for each interval. In this specification, the former is referred to as the method using a macroscopic filter and the latter as the method using a microscopic filter.

When a macroscopic filter is used, the FPS analyzer 210 calculates a first FPS value for each frame during the reproduction time using the frame information, compares one of the first FPS values with the threshold value, and generates the dizziness numerical information.

Referring to FIG. 5, which shows a timeline of frame output times during the reproduction time of the virtual reality content, the output time t of each frame S included in the virtual reality content is obtained from the frame information, and the FPS analyzer 210 can calculate the first FPS value for each frame using Equation (1).

fps_i = 1 / (t_i - t_f)   (Equation 1)

where t_f is the output time of the previous frame and t_i is the output time of the current frame.

The FPS analyzer 210 sorts the calculated first FPS values in order of magnitude, takes their median value (fps_Median), and compares it with the threshold value (fps_Threshold) to generate the dizziness numerical information. As described above, when the median value is smaller than the threshold value, the FPS analyzer 210 can generate dizziness numerical information indicating that the user feels greater dizziness as the difference between the median value and the threshold value increases.
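Since the pseudocode of FIG. 6 is not reproduced in this text, the following is a minimal Python sketch of the macroscopic filter as described above. The linear dizziness score is an assumption; the text states only that dizziness grows as the median FPS falls further below the threshold (90 FPS in the example above).

```python
import statistics

def macroscopic_fps_score(frame_times, fps_threshold=90.0):
    # First FPS value per frame, Equation (1): fps_i = 1 / (t_i - t_f)
    fps_values = [1.0 / (t_i - t_f)
                  for t_f, t_i in zip(frame_times, frame_times[1:])]
    # Median of the first FPS values (fps_Median) vs. the threshold.
    fps_median = statistics.median(fps_values)
    # Assumed linear score: larger shortfall below the threshold means more dizzy.
    dizziness = max(0.0, fps_threshold - fps_median)
    return dizziness, fps_median
```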

Meanwhile, the FPS analyzer 210 can analyze the FPS using a microscopic filter regardless of the analysis result obtained with the macroscopic filter. Alternatively, depending on the embodiment, the FPS analyzer 210 may analyze the FPS using a microscopic filter according to the magnitude of the dizziness value obtained through the macroscopic filter.

When a microscopic filter is used, the FPS analyzer 210 calculates the first FPS value for each frame during the reproduction time using the frame information, divides the reproduction time into predetermined measurement intervals, calculates a second FPS value for each measurement interval using the first FPS values, and compares the second FPS value with the threshold value to generate dizziness numerical information for each measurement interval. When the first FPS values have already been calculated with the macroscopic filter, the FPS analyzer 210 may reuse them to calculate the second FPS values.

More specifically, the FPS analyzer 210 divides the reproduction time of the virtual reality content into a plurality of measurement intervals I, as shown in FIG. 7, and calculates a second FPS value for each measurement interval. The FPS analyzer 210 can divide the entire set of N frames output during the reproduction time of the virtual reality content into a plurality of measurement intervals using Equation (2).

(Equation (2) is presented as an image in the original publication.)

At this time, R can be calculated as shown in Equation (3), where T corresponds to the reproduction time and ceil is the ceiling function (rounding up to the nearest integer).

R = ceil(N / T)   (Equation 3)

The FPS analyzer 210 calculates the second FPS value using a kernel window K. The kernel window size is calculated from the number of frames N during the reproduction time and the reproduction time T; then, as shown in FIG. 7, the kernel windows are partially overlapped within each measurement interval, and a third FPS value is calculated for the frames included in each kernel window. The FPS analyzer 210 then calculates the second FPS value by averaging the third FPS values. For example, the second FPS value (fps_AverageInterval) may be calculated for each measurement interval using Equation (4).

fps_AverageInterval = (1/K) * (fps_1 + fps_2 + ... + fps_K)   (Equation 4)

where fps_k is the third FPS value of the k-th kernel window in the measurement interval.

When the number of kernel windows in a measurement interval is greater than K, the FPS analyzer 210 calculates a plurality of second FPS values in units of K kernel windows, uses one of the second FPS values as a threshold, and analyzes the change of the FPS values. For example, the threshold may be one of the second FPS values, such as their median (fps_MedianInterval).

That is, the FPS analyzer 210 calculates a second FPS value for each of a plurality of measurement intervals, compares the second FPS value with a threshold value determined within the measurement interval to analyze the FPS change rate within the interval, and can generate dizziness numerical information for the measurement interval using the analysis result.

For example, if there is no point in the first measurement interval at which the second FPS value is smaller than the threshold value, the dizziness value for the first measurement interval may be calculated to be low. Alternatively, if the second FPS value deviates from the threshold value in the first measurement interval, the dizziness value of the corresponding interval may be calculated to be high or low depending on the difference from the threshold value.

Referring to FIG. 8, the FPS analyzer 210 may compare the second FPS values with the threshold value (fps_Threshold) to identify measurement intervals with a high degree of dizziness, and then, for each identified measurement interval, compare the second FPS values with the interval threshold (fps_MedianInterval) to identify the sections within it that have a high degree of dizziness.

Meanwhile, a remainder interval (I_Remainder) whose number of frames is smaller than K is preferably excluded from the analysis.
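The pseudocode of FIG. 8 is likewise not reproduced here, so the Python sketch below fills in details the description leaves open and should be read as an assumption-laden illustration: the kernel window size is taken from Equation (3) as reconstructed above, windows are assumed to overlap by half a window, each second FPS value is assumed to average K kernel windows, and any trailing remainder (I_Remainder) is dropped.

```python
import math
import statistics

def microscopic_fps_scores(fps_values, playback_time, k=10, fps_threshold=90.0):
    n = len(fps_values)                       # number of first FPS values
    r = math.ceil(n / playback_time)          # kernel window size, Equation (3)
    stride = max(1, r // 2)                   # assumed 50% window overlap
    # Third FPS values: mean FPS within each kernel window; a trailing
    # remainder shorter than one window (I_Remainder) is excluded.
    third = [statistics.mean(fps_values[i:i + r])
             for i in range(0, n - r + 1, stride)]
    # Second FPS values: average over groups of K kernel windows (Equation (4)).
    second = [statistics.mean(third[j:j + k])
              for j in range(0, len(third) - k + 1, k)]
    if not second:
        return []
    fps_median_interval = statistics.median(second)  # interval threshold
    # Flag intervals below the global threshold, noting whether they also
    # fall below the interval median (fps_MedianInterval).
    return [(idx, fps_threshold - s, s < fps_median_interval)
            for idx, s in enumerate(second)
            if s < fps_threshold]
```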

Using the dizziness numerical information from the FPS analysis, the virtual reality content developer can develop the content so that the FPS is increased in the intervals where the user is likely to feel dizziness.

<Speed and Acceleration Analysis>

In the real world, the human inner ear, which governs the sense of balance, does not sense discomfort when the rate of change is low or when a constant speed or acceleration acts on the body. However, when wearing a virtual reality terminal 110 such as an HMD, the user is sensitive to changes in the speed and acceleration within the virtual reality content, unlike in the real world. Therefore, virtual reality content containing objects that move at high speed along an elliptical orbit, such as a roller coaster, may cause dizziness in the user and is accordingly avoided even as demo content.

The object analyzer 220 calculates the speed and acceleration of objects included in the reproduced virtual reality content using the geometric information, and generates dizziness numerical information by comparing the speed and acceleration with threshold values.

As described above, since the geometric information includes the position information of an object and the frame information includes the inter-frame output time difference information, the object analyzer 220 can calculate the speed of the object from the change of its position over time. Also, the acceleration of the object can be calculated from the change of its speed over time.

When the calculated speed and acceleration are greater than the threshold values, the object analyzer 220 can generate dizziness numerical information indicating that the user feels greater dizziness as the difference between them and the thresholds increases. At this time, the object analyzer 220 can compare the speed and the acceleration with different thresholds, and can generate dizziness numerical information indicating that the user feels dizzy if either the speed or the acceleration exceeds its threshold value.

Meanwhile, for efficiency, the object analyzer 220 does not calculate the acceleration of an object whose speed is 0. For the last frame, since there is no next frame, the position change and speed change of the object cannot be measured; therefore, the object analyzer 220 can insert a virtual frame having the same position information as the last frame after it, so that the speed and acceleration for the last frame can be calculated.
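The speed and acceleration analysis above can be illustrated with a brief sketch. The virtual frame duplicating the last position and the skipped acceleration for stationary objects follow the description; the function name, the duplicated frame time, and the distance units are illustrative assumptions.

```python
import math

def speed_and_acceleration(positions, frame_times):
    # Virtual frame: same position as the last frame, one frame-time later,
    # so that speed and acceleration can also be produced for the last frame.
    dt_last = frame_times[-1] - frame_times[-2]
    positions = positions + [positions[-1]]
    frame_times = frame_times + [frame_times[-1] + dt_last]

    # Speed: positional change between consecutive frames over elapsed time.
    speeds = [math.dist(p0, p1) / (t1 - t0)
              for p0, p1, t0, t1 in zip(positions, positions[1:],
                                        frame_times, frame_times[1:])]
    # Acceleration: change of speed over time; not computed (recorded as 0.0)
    # for a stationary object, per the description above.
    accels = [0.0 if v0 == 0.0 else (v1 - v0) / (t1 - t0)
              for v0, v1, t0, t1 in zip(speeds, speeds[1:],
                                        frame_times[1:], frame_times[2:])]
    return speeds, accels
```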

Using the dizziness numerical information from the speed and acceleration analysis, the virtual reality content developer can reduce the dizziness the user feels by developing the content so that the speed and acceleration of objects responding to the user are reduced.

<Frequency analysis>

FIG. 9 is a diagram illustrating interval division for frequency analysis according to an embodiment of the present invention, FIG. 10 is a diagram illustrating FFT analysis results, and FIG. 11 is a diagram showing pseudo code of a frequency analysis method according to the present invention.

Among the causes of symptoms such as motion sickness, dizziness, and vomiting while experiencing virtual reality content is motion mismatch. Motion mismatch means an inconsistency between the motion in the simulation and the motion the user intended. Therefore, while viewing virtual reality content, the user is advised to refrain from inappropriate actions such as periodic head movements.

The object analyzer 220 analyzes the frequency component of an object that moves according to the user's movement, and generates the dizziness numerical information. The object analyzer 220 may divide the virtual reality content into a plurality of measurement intervals (I_FFT), as shown in FIG. 9, and perform a Fourier transform on each measurement interval to carry out the frequency analysis. The length of the measurement interval may be set variously depending on the embodiment.

The object analyzer 220 calculates the angular acceleration of the object in order to analyze its frequency component. More specifically, the object analyzer 220 can calculate the angular velocity as shown in Equation (5), using the rotation information of the object included in the geometric information and the inter-frame output time difference information.

ω = Δθ / Δt   (Equation 5)

where Δt represents the time change between the previous frame and the current frame, and Δθ represents the change in the rotation angle of the object between the previous frame and the current frame.

Then, the object analyzer 220 can calculate the angular acceleration as shown in Equation (6) using the change in the angular velocity, where Δω represents the change in the angular velocity of the object between the previous frame and the current frame.

α = Δω / Δt   (Equation 6)

The object analyzer 220 calculates the magnitude of the calculated angular acceleration as shown in Equation (7), and obtains the amplitude in the frequency domain through frequency analysis of that magnitude.

|α| = sqrt(α_x^2 + α_y^2 + α_z^2)   (Equation 7)

When an FFT is applied to the calculated magnitude, complex-valued data are output. The absolute value of the complex data is the amplitude in the frequency domain.

As a result, the object analyzer 220 can obtain the amplitude in the frequency domain as shown in FIG. 10 through the frequency analysis, and compares the amplitude of a predetermined frequency band with the threshold value to generate the dizziness numerical information.

As described above, motion mismatch occurs due to the user's repeated periodic movements, and the frequency component of the movement of an object following the user's motion appears in the low-frequency region. In one embodiment, the frequency band compared with the threshold value may be a band selected between 0.05 Hz and 0.8 Hz.

FIG. 10 shows the result of performing the FFT with the measurement interval set to 5 seconds while the user periodically moves the head; FIG. 10(a) shows the interval from the start of reproduction to 5 seconds, and FIG. 10(b) shows the interval from 5 seconds to 10 seconds. As shown in FIG. 10, when the user periodically moves the head, the amplitude is high in the low-frequency region close to zero.

Accordingly, the object analyzer 220 can generate the dizziness numerical information by comparing the amplitude with the threshold value. For example, when the amplitude is larger than the threshold value, it can generate dizziness numerical information indicating that the user feels greater dizziness as the difference between the amplitude and the threshold value increases.
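The frequency analysis reduces to a standard FFT over each measurement interval. The sketch below assumes a 90 Hz sampling rate and uses the maximum amplitude within the 0.05-0.8 Hz band as the statistic compared against the threshold; the text does not specify how the band amplitude is aggregated, so that choice is an assumption.

```python
import numpy as np

def low_frequency_amplitude(ang_accel_mag, sample_rate=90.0, band=(0.05, 0.8)):
    # FFT of the angular-acceleration magnitude over one measurement interval;
    # the absolute value of the complex output is the amplitude (see above).
    spectrum = np.fft.rfft(ang_accel_mag)
    amplitude = np.abs(spectrum)
    freqs = np.fft.rfftfreq(len(ang_accel_mag), d=1.0 / sample_rate)
    # Amplitude within the predetermined low-frequency band.
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(amplitude[mask].max()) if mask.any() else 0.0

# Usage sketch: flag an interval whose low-frequency amplitude exceeds a
# threshold (the threshold value itself is an assumption).
# dizzy = low_frequency_amplitude(magnitudes) > amplitude_threshold
```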

Using the dizziness numerical information from the frequency analysis, the virtual reality content developer can reduce the dizziness the user feels by developing the content so that objects react less sensitively to unnecessary movements of the user.

<Distance Analysis>

FIG. 12 is a view for explaining a distance analysis method according to an embodiment of the present invention.

When a plurality of objects exist in virtual reality content played back on a virtual reality terminal such as an HMD, and an object is located too close to the user, that is, when a specific object occupies most of the playback screen, the user becomes more likely to feel dizziness.

Accordingly, the distance analyzer 230 may calculate the distance between the user and each object included in the user's viewing angle, and generate the dizziness numerical information by comparing the calculated distance with the threshold value.

As shown in FIG. 12, when four rabbits exist within the user's viewing volume (between the clipping plane and the far plane), the distance analyzer 230 calculates the distance between each rabbit and the user. The distance analyzer 230 may calculate the distance between the user and an object using the position information of the object included in the geometric information. The position of the user can be set to an arbitrary point when the virtual reality content is produced, and the camera shown in FIG. 12 can correspond to the user's eyes.

When the distance between the user and an object is smaller than the threshold value, the distance analyzer 230 may generate dizziness numerical information indicating that the user feels greater dizziness as the difference between the distance and the threshold value increases.
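A brief sketch of the distance analysis follows. The linear score and the names are assumptions; the text states only that dizziness grows as an object comes closer to the user than the threshold distance.

```python
import math

def distance_dizziness(user_pos, object_positions, dist_threshold):
    scores = {}
    for name, pos in object_positions.items():
        d = math.dist(user_pos, pos)         # Euclidean distance in VR-world units
        if d < dist_threshold:               # object is too close to the user
            scores[name] = dist_threshold - d  # larger shortfall means more dizzy
    return scores
```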

Using the dizziness numerical information from the distance analysis, the virtual reality content developer can reduce the dizziness the user experiences by developing the content so that objects stay more than a specific distance away from the user.

FIG. 13 is a view for explaining a report message according to an embodiment of the present invention.

As described above, the virtual reality content analysis apparatus transmits the dizziness numerical information to the client terminal; at this time, the information and analysis results used to generate the dizziness numerical information can be transmitted to the client terminal together, in the same XML data format.

The topmost header, <AnalysisReportRoot>, stores the dizziness numerical information generated according to the analysis method applied to each frame or measurement interval. FIG. 13 shows a case where the FPS value and the corresponding dizziness numerical information (Low FPS) are stored for the 730th frame, and the acceleration value and the corresponding dizziness numerical information (High Acceleration) are stored for the 714th frame.

The <TargetObjName> attribute in the header stores information about the object being analyzed.

<Message> is an attribute for adding a descriptive message, and this part is output as a log message at the client terminal.

<nStackLevel> represents the layer information of the corresponding dizziness information, and <strType>, a string type, defines the type of module in which the current log was generated.

After the attributes for the log itself are defined, information about the point at which the log was created is stored in the <time> element under <AnalysisReportRoot>. The most important attribute values are <deltaTime>, <frameCount>, and <time>.

<deltaTime> stores the time taken from the output of the previous frame to the output of the current frame, and <frameCount> specifies the frame number of the current frame during playback of the current scene. Finally, <time> stores the playback time. All <time>-related data are stored in units of seconds and can include decimal fractions so that frame-level time accuracy can be maintained.

Meanwhile, a null value (0) is stored in any attribute for which no specific value is stored, and various values may be stored depending on the embodiment.
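Since FIG. 13 itself is not reproduced in this text, the sketch below builds one report entry from the tag names described above; the exact nesting and attribute spellings are assumptions.

```python
import xml.etree.ElementTree as ET

def build_report(obj_name, message, layer, module_type,
                 delta_time, frame_count, play_time):
    # Header attributes described above; "0" would stand in for unused values.
    root = ET.Element("AnalysisReportRoot", {
        "TargetObjName": obj_name,       # object being analyzed
        "Message": message,              # e.g. "Low FPS", "High Acceleration"
        "nStackLevel": str(layer),       # layer of the analysis module
        "strType": module_type,          # module that produced the log
    })
    # <time> element: when the log was created, in seconds with decimals.
    time_el = ET.SubElement(root, "time")
    time_el.set("deltaTime", f"{delta_time:.4f}")  # previous-to-current frame time
    time_el.set("frameCount", str(frame_count))    # frame number in the scene
    time_el.set("time", f"{play_time:.4f}")        # elapsed playback time
    return ET.tostring(root, encoding="unicode")
```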

FIG. 14 is a diagram for explaining a method of analyzing virtual reality content according to an embodiment of the present invention.

In FIG. 14, a method for analyzing virtual reality content performed by the virtual reality content analysis apparatus 120 of FIG. 1 is described as an embodiment. Depending on the embodiment, the method for analyzing virtual reality content according to the present invention may also be performed in a virtual reality terminal.

The virtual reality content analysis apparatus according to the present invention receives information about virtual reality content reproduced by a user (S1410), analyzes the information about the virtual reality content, and generates dizziness numerical information about the virtual reality content using the analysis result (S1420). The virtual reality content analysis apparatus can generate the dizziness numerical information by comparing the analysis result with predetermined threshold values.

At this time, the information about the virtual reality content includes at least one of frame information during the reproduction time, object information included in the virtual reality content, and geometric information about an object. The analysis result includes at least one of a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object.

The virtual reality content analysis apparatus may calculate a first FPS value for each frame during the reproduction time using the frame information, and compare one of the first FPS values with a first threshold value to generate the dizziness numerical information.

Alternatively, the virtual reality content analysis apparatus may divide the reproduction time into predetermined measurement intervals, calculate a second FPS value for each measurement interval using the first FPS values, and compare the second FPS value with a second threshold value to generate dizziness numerical information for each measurement interval.

Alternatively, depending on the embodiment, the virtual reality content analysis apparatus may use the geometric information to calculate the speed and acceleration of an object and compare the speed and acceleration with a third threshold value to generate dizziness numerical information.

Alternatively, depending on the embodiment, the virtual reality content analysis apparatus may calculate the magnitude of the angular acceleration of an object moving according to the user's movement using the geometric information, calculate the amplitude in the frequency domain through frequency analysis of that magnitude, and compare the amplitude of a predetermined frequency band with a fourth threshold value to generate the dizziness numerical information.

Alternatively, depending on the embodiment, the virtual reality content analysis apparatus may calculate the distance between the user and an object using the geometric information and compare the calculated distance with a fifth threshold value to generate dizziness numerical information.

The above-described technical features may be implemented in the form of program instructions that can be executed through various computer means and recorded on a computer-readable medium. The computer-readable medium may include program instructions, data files, data structures, and the like, alone or in combination. The program instructions recorded on the medium may be those specially designed and constructed for the embodiments, or may be known and available to those skilled in the art of computer software. Examples of computer-readable media include magnetic media such as hard disks, floppy disks, and magnetic tape; optical media such as CD-ROMs and DVDs; magneto-optical media such as floptical disks; and hardware devices specifically configured to store and execute program instructions, such as ROM, RAM, flash memory, and the like. Examples of program instructions include machine language code such as that produced by a compiler, as well as high-level language code that can be executed by a computer using an interpreter or the like. The hardware device may be configured to operate as one or more software modules to perform the operations of the embodiments, and vice versa.

As described above, the present invention has been described with reference to particular embodiments, specific elements, and drawings. However, the present invention is not limited to the above embodiments, and various modifications and changes may be made by those skilled in the art to which the present invention pertains. Accordingly, the spirit of the present invention should not be construed as being limited to the described embodiments; the following claims and all equivalents thereof belong to the scope of the present invention.

Claims (19)

Receiving information about virtual reality content reproduced by a user; and
Analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result,
Wherein the information about the virtual reality content includes at least one of:
frame information during a reproduction time, object information included in the virtual reality content, and geometric information about an object,
And wherein the analysis result includes at least one of:
a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object,
A virtual reality content analysis method.
The method according to claim 1,
Wherein the generating of the dizziness numerical information comprises
comparing the analysis result with a predetermined threshold value to generate the dizziness numerical information,
A virtual reality content analysis method.
The method according to claim 1,
Wherein the frame information during the reproduction time includes
reproduction time information, index information of the frames output during the reproduction time, and inter-frame output time difference information,
A virtual reality content analysis method.
The method of claim 3,
Wherein the generating of the dizziness numerical information comprises:
calculating a first FPS value for each frame during the reproduction time using the frame information; and
comparing one of the first FPS values with a threshold value to generate the dizziness numerical information,
A virtual reality content analysis method.
The method of claim 3,
Wherein the generating of the dizziness numerical information comprises:
calculating a first FPS value for each frame during the reproduction time using the frame information;
dividing the reproduction time into predetermined measurement intervals;
calculating a second FPS value for each measurement interval using the first FPS values; and
comparing the second FPS value with a threshold value to generate the dizziness numerical information for each measurement interval,
A virtual reality content analysis method.
6. The method of claim 5,
Wherein the calculating of the second FPS value for each measurement interval comprises:
calculating a size of a kernel window using the number of frames during the reproduction time and the reproduction time;
partially overlapping the kernel windows within each measurement interval and calculating a third FPS value for the frames included in each kernel window; and
calculating the second FPS value by averaging the third FPS values,
A virtual reality content analysis method.
6. The method of claim 5,
Wherein the threshold value is
one of the second FPS values of the corresponding measurement interval,
A virtual reality content analysis method.
The method according to claim 1,
Wherein the geometric information includes
position information of the object and rotation information of the object,
A virtual reality content analysis method.
9. The method of claim 8,
Wherein the generating of the dizziness numerical information comprises:
calculating the speed and acceleration of the object using the geometric information; and
comparing the speed and acceleration with a threshold value to generate the dizziness numerical information,
A virtual reality content analysis method.
9. The method of claim 8,
Wherein the generating of the dizziness numerical information comprises
analyzing a frequency component of an object that moves according to the movement of the user to generate the dizziness numerical information,
A virtual reality content analysis method.
11. The method of claim 10,
Wherein the generating of the dizziness numerical information comprises:
calculating an angular acceleration of the object using the geometric information;
calculating a magnitude of the angular acceleration;
calculating an amplitude in the frequency domain through frequency analysis of the magnitude; and
comparing the amplitude of a predetermined frequency band with a threshold value to generate the dizziness numerical information,
A virtual reality content analysis method.
12. The method of claim 11,
Wherein the frequency band is
a frequency band selected within the band between 0.05 Hz and 0.8 Hz,
A virtual reality content analysis method.
9. The method of claim 8,
Wherein the generating of the dizziness numerical information comprises:
calculating a distance between the user and the object using the geometric information; and
comparing the distance with a threshold value to generate the dizziness numerical information,
A virtual reality content analysis method.
The method according to claim 1,
Further comprising transmitting the analysis result and the dizziness numerical information to a client terminal,
A virtual reality content analysis method.
A storage unit for storing information about virtual reality content reproduced by a user; and
A dizziness numerical information generating unit for analyzing the information about the virtual reality content and generating dizziness numerical information about the virtual reality content using the analysis result,
Wherein the information about the virtual reality content includes at least one of:
frame information during a reproduction time, object information included in the virtual reality content, and geometric information about an object,
And wherein the analysis result includes at least one of:
a frames per second (FPS) value during the reproduction time, speed and acceleration values of the object, a frequency component for the object, and distance information between the user and the object,
A virtual reality content analysis apparatus.
16. The apparatus of claim 15,
Wherein the dizziness numerical information generating unit comprises:
an FPS analyzer for comparing the FPS value with a first threshold value to generate first dizziness numerical information;
an object analyzer for comparing at least one of the speed and acceleration values of the object and the frequency component for the object with a second threshold value to generate second dizziness numerical information; and
a distance analyzer for comparing the distance information with a third threshold value to generate third dizziness numerical information,
A virtual reality content analysis apparatus.
17. The apparatus of claim 16,
Wherein each of the analyzers
operates in the form of a library existing in a different layer,
A virtual reality content analysis apparatus.
18. The apparatus of claim 15,
Further comprising:
an information receiving unit for receiving the information about the virtual reality content from a virtual reality terminal; and
an information transmitting unit for transmitting the dizziness numerical information to a client terminal,
A virtual reality content analysis apparatus.
19. The apparatus of claim 15,
Wherein the virtual reality content analysis apparatus is
a virtual reality terminal or a server,
A virtual reality content analysis apparatus.
KR1020160027790A 2016-03-08 2016-03-08 Method and apparatus for analyzing virtual reality content KR20170104846A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
KR1020160027790A KR20170104846A (en) 2016-03-08 2016-03-08 Method and apparatus for analyzing virtual reality content

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
KR1020160027790A KR20170104846A (en) 2016-03-08 2016-03-08 Method and apparatus for analyzing virtual reality content

Publications (1)

Publication Number Publication Date
KR20170104846A true KR20170104846A (en) 2017-09-18

Family

ID=60034385

Family Applications (1)

Application Number Title Priority Date Filing Date
KR1020160027790A KR20170104846A (en) 2016-03-08 2016-03-08 Method and apparatus for analyzing virtual reality content

Country Status (1)

Country Link
KR (1) KR20170104846A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101958263B1 (en) 2018-08-03 2019-03-14 (주)이노시뮬레이션 The control method for VR contents and UI templates
KR20190045532A (en) * 2017-10-24 2019-05-03 (주)플레이솔루션 Experience system of virtual reality contents based on motion simulator
KR20190069684A (en) * 2017-12-12 2019-06-20 한국과학기술원 Apparatus for sickness assessment of vr contents using deep learning based analysis of visual­vestibular mismatch and the method thereof
KR102103430B1 (en) * 2018-11-08 2020-04-22 서울과학기술대학교 산학협력단 Method and system for measuring latency in cloud based virtual reallity services
KR20200052693A (en) * 2018-11-07 2020-05-15 주식회사 인디고엔터테인먼트 Virtual reality player and integrated management system for monitoring thereof
US10832483B2 (en) 2017-12-05 2020-11-10 Electronics And Telecommunications Research Institute Apparatus and method of monitoring VR sickness prediction model for virtual reality content
US11366520B2 (en) 2018-12-07 2022-06-21 Electronics And Telecommunications Research Institute Method for analyzing element inducing motion sickness in virtual-reality content and apparatus using the same


Similar Documents

Publication Publication Date Title
KR20170104846A (en) Method and apparatus for analyzing virtual reality content
US20230414899A1 (en) Classifying a discomfort level of a user when interacting with virtual reality (vr) content
US10255715B2 (en) Field of view (FOV) throttling of virtual reality (VR) content in a head mounted display
CN110227266B (en) Building virtual reality game play environments using real world virtual reality maps
US20170084084A1 (en) Mapping of user interaction within a virtual reality environment
KR102055481B1 (en) Method and apparatus for quantitative evaluation assessment of vr content perceptual quality using deep running analysis of vr sickness factors
US11302049B2 (en) Preventing transition shocks during transitions between realities
KR20170105905A (en) Method and apparatus for analyzing virtual reality content
US10671151B2 (en) Mitigating digital reality leakage through session modification
CN116474378A (en) Artificial Intelligence (AI) controlled camera perspective generator and AI broadcaster
CN108257205B (en) Three-dimensional model construction method, device and system
Gonçalves et al. Systematic review of comparative studies of the impact of realism in immersive virtual experiences
KR20180013892A (en) Reactive animation for virtual reality
Tseng Intelligent augmented reality system based on speech recognition
US20220210523A1 (en) Methods and Systems for Dynamic Summary Queue Generation and Provision
KR20120042467A (en) Method for reusing physics simulation results and game service apparatus using the same
US20230199420A1 (en) Real-world room acoustics, and rendering virtual objects into a room that produce virtual acoustics based on real world objects in the room
KR102427085B1 (en) Electronic apparatus for providing education service and method for operation thereof
US10203833B2 (en) Recommending application modifications using image area counts
CN114341944A (en) Computer generated reality recorder
KR20200033645A (en) Method and program for providing the cause of dizzness in vr contents
US11842455B1 (en) Synchronizing physical and virtual environments using quantum entanglement
US11863956B2 (en) Methods and systems for balancing audio directed to each ear of user
CN116483208B (en) Anti-dizzy method and device for virtual reality equipment, computer equipment and medium
US20240033640A1 (en) User sentiment detection to identify user impairment during game play providing for automatic generation or modification of in-game effects

Legal Events

Date Code Title Description
A201 Request for examination
E902 Notification of reason for refusal
E601 Decision to refuse application