CN114630182A - Virtual reality video playing method and equipment

Virtual reality video playing method and equipment

Info

Publication number
CN114630182A
Authority
CN
China
Prior art keywords
video
target
frame rate
camera
determining
Prior art date
Legal status
Pending
Application number
CN202210191645.2A
Other languages
Chinese (zh)
Inventor
郭红
于全夫
王大勇
Current Assignee
Hisense Visual Technology Co Ltd
Original Assignee
Hisense Visual Technology Co Ltd
Priority date
2022-02-28
Filing date
2022-02-28
Publication date
2022-06-14
Application filed by Hisense Visual Technology Co Ltd
Priority to CN202210191645.2A
Publication of CN114630182A

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/44 Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44012 Processing of video elementary streams involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312 Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations

Abstract

The application relates to the technical field of virtual reality (VR) and provides a virtual reality video playing method and device. To prevent the rendering frame rate from being limited by the screen hardware, the vertical synchronization function is turned off after a three-dimensional virtual scene is entered. While the target video is playing, pose data of the camera are compared with at least one preset parameter threshold corresponding to the video frame rate of the target video, a target rendering frame rate is determined flexibly from the comparison result, and the target video is then rendered and displayed at the target rendering frame rate. Because the pose data of the camera reflect the state of the user's head, the target rendering frame rate adapts to the head state while keeping the target video picture smooth, so the target rendering frame rate is neither too high nor too low and the playback power consumption of the VR device is reduced. Moreover, each preset parameter threshold is the upper limit of a motion parameter of the camera at which the target video is displayed normally, so picture jitter of the target video caused by head motion is reduced.

Description

Virtual reality video playing method and equipment
Technical Field
The present application relates to the field of virtual reality (VR) technologies, and in particular, to a method and an apparatus for playing a virtual reality video.
Background
Compared with an ordinary playback device, a VR device combines computer technology with display technology, can create the feeling of being placed in a real environment, and immerses the user in a lifelike atmosphere. When watching a video played by a VR device, the user can become immersed in the video and enjoy it more fully.
Generally, a VR device has a fixed screen refresh rate, and the video being played has its own video frame rate. When a VR device plays a video, the rendering frame rate of the virtual scene is generally set to be greater than the video frame rate in order to keep the video picture smooth.
However, when a user wears a VR device to watch a video, the head passes through different states, and different states place different quality requirements on the video picture. If the video is always played at a fixed rendering frame rate, the power consumption of the VR device increases and its service life is shortened; picture jitter also occurs easily, which makes the user dizzy and reduces the immersive experience.
Disclosure of Invention
The embodiments of the present application provide a virtual reality video playing method and device, which are used to reduce the power consumption of a VR device and to solve the problem of picture jitter caused by movement of the user's head.
In a first aspect, an embodiment of the present application provides a method for playing a virtual reality video, which is applied to a VR device, and includes:
entering a three-dimensional virtual scene, and turning off a vertical synchronization function;
acquiring a target video and a video frame rate of the target video;
acquiring pose data of a camera according to a preset period;
determining a target rendering frame rate according to an orientation of the camera and a comparison result of the pose data with at least one preset parameter threshold corresponding to the video frame rate, wherein the preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally;
and rendering and displaying the target video according to the target rendering frame rate.
In a second aspect, embodiments of the present application provide a VR device comprising a controller, a memory, a processor, and an inertial measurement unit (IMU), wherein the processor, the memory, and the controller are connected by a bus;
the memory stores a computer program according to which the controller performs the following operations:
entering a three-dimensional virtual scene, and turning off a vertical synchronization function;
acquiring a target video and a video frame rate of the target video;
acquiring pose data of its camera through the IMU according to a preset period;
determining a target rendering frame rate according to an orientation of the camera and a comparison result of the pose data with at least one preset parameter threshold corresponding to the video frame rate, wherein the preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally;
and rendering and displaying the target video according to the target rendering frame rate through the processor.
In a third aspect, the present application provides a computer-readable storage medium, where computer-executable instructions are stored in the computer-readable storage medium, and the computer-executable instructions are configured to enable a computer to execute a playing method of a virtual reality video provided in an embodiment of the present application.
In the embodiments of the present application, the vertical synchronization function is turned off after the three-dimensional virtual scene is entered, so that the rendering frame rate can be adjusted dynamically without being limited by the screen hardware. While the target video is playing, pose data of the camera are acquired according to a preset period and compared with at least one preset parameter threshold corresponding to the video frame rate of the target video, the target rendering frame rate is determined flexibly from the comparison result, and the target video is finally rendered and displayed at the target rendering frame rate. Because the pose data of the camera reflect the state of the user's head, the target rendering frame rate adapts to the head state while keeping the target video picture smooth, so the target rendering frame rate is neither too high nor too low and the playback power consumption of the VR device is reduced. In addition, each preset parameter threshold is the upper limit of a motion parameter of the camera at which the target video is displayed normally, so picture jitter of the target video caused by head motion is reduced.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. It is obvious that the drawings in the following description show only some embodiments of the present application, and that those skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 schematically shows a flowchart of a playing method of a virtual reality video provided by an embodiment of the present application;
fig. 2 is a flowchart illustrating a method for determining a target rendering frame rate according to an embodiment of the present application;
FIG. 3 is a flowchart illustrating a method for determining a type of a virtual display screen according to an embodiment of the present application;
fig. 4 is a flowchart illustrating a playing method of a complete virtual reality video provided by an embodiment of the present application;
fig. 5 schematically shows a block diagram of a VR device provided in an embodiment of the present application.
Detailed Description
To make the objects, embodiments, and advantages of the present application clearer, exemplary embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described exemplary embodiments are only some, rather than all, of the embodiments of the present application.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort fall within the scope of the appended claims. In addition, although the disclosure herein is presented in terms of one or more exemplary examples, it should be appreciated that each aspect of the disclosure may also constitute a complete embodiment on its own.
At present, when a VR device plays a video, the rendering frame rate of the virtual scene is generally set to be greater than the video frame rate in order to keep the video picture smooth. As a result, when the user's head remains still (that is, attention is focused on the video picture), such high rendering efficiency is not needed, and a rendering frame rate greater than the video frame rate wastes the power of the VR device, raises its temperature, and shortens its service life; when the user's head is moving (that is, attention is not on the video picture), a rendering frame rate that is not high enough causes video stutter or picture jitter and reduces the user's immersive experience.
In view of this, embodiments of the present application provide a virtual reality video playing method and device that flexibly adjust the rendering frame rate according to the state of the user and the virtual scene. Specifically, when the user's head remains still, the rendering frame rate is set equal to the video frame rate, which reduces the power consumption of the VR device while keeping the played video picture smooth; when the user's head is moving (for example, the gaze follows a fast-moving 3D object), or the gaze rests on the edge of a 3D object or of the virtual display screen, the rendering frame rate is set equal to the screen refresh rate. Because the screen refresh rate is greater than the video frame rate, the rendering frame rate is then high enough to avoid the video stutter or picture jitter caused by head motion.
In the embodiments of the present application, to ensure the smoothness of the target video, a frame rate comparison table is generated in advance by measuring, for each rendering frame rate of the VR device, the maximum motion parameters of the head (that is, of the VR camera), including the moving speed and the rotating speed, at which the video is still displayed normally, that is, without stutter or jitter in the picture. The measurement results are shown in Table 1.
TABLE 1  Maximum head motion parameters corresponding to different rendering frame rates

Rendering frame rate | Maximum moving speed | Maximum rotating speed
F1                   | V1                   | R1
F2                   | V2                   | R2
F3                   | V3                   | R3
...                  | ...                  | ...
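For illustration only, such a comparison table can be held in a simple lookup structure on the device. The following is a minimal sketch in Unity C# (the playback engine named later in this description); the struct and class names, and the numeric values standing in for F1/V1/R1 and so on, are illustrative assumptions rather than values specified by this disclosure.

```csharp
using System.Collections.Generic;

// Upper limits of head (camera) motion under which a given rendering frame
// rate still displays the video without stutter or jitter (cf. Table 1).
public struct MotionLimits
{
    public float maxMoveSpeed;   // metres per second (V1, V2, ...)
    public float maxRotateSpeed; // degrees per second (R1, R2, ...)
}

public static class FrameRateTable
{
    // Keyed by rendering frame rate in fps (F1, F2, ...). The numbers below are
    // placeholders; real values come from the offline measurements described above.
    public static readonly Dictionary<int, MotionLimits> Limits =
        new Dictionary<int, MotionLimits>
        {
            { 24, new MotionLimits { maxMoveSpeed = 0.10f, maxRotateSpeed = 15f } },
            { 30, new MotionLimits { maxMoveSpeed = 0.15f, maxRotateSpeed = 20f } },
            { 60, new MotionLimits { maxMoveSpeed = 0.30f, maxRotateSpeed = 40f } },
        };
}
```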
Embodiments of the present application are described in detail below with reference to the accompanying drawings.
Referring to Fig. 1, which shows the flow of the virtual reality video playing method provided in the embodiment of the present application, the method is executed by a VR device and mainly includes the following steps:
S101: entering a three-dimensional virtual scene and turning off the vertical synchronization function.
In an optional implementation, the user wears the VR device and clicks a video playing application in the VR device, and the VR device enters the three-dimensional virtual scene in response to the user's click operation. Generally, the screen refresh rate of the VR device is fixed; in order to dynamically adjust the rendering frame rate during playback of the target video without being limited by the VR device screen, the vertical synchronization function is turned off after the three-dimensional virtual scene is entered in S101. Specifically, QualitySettings.vSyncCount = 0 is set in the playback engine Unity.
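As a minimal sketch, assuming an illustrative component name, the setting can be applied in a script that runs when the three-dimensional virtual scene starts:

```csharp
using UnityEngine;

public class SceneSetup : MonoBehaviour
{
    void Start()
    {
        // Untie the rendering frame rate from the screen refresh rate so it can
        // be adjusted dynamically while the target video plays.
        QualitySettings.vSyncCount = 0;
    }
}
```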
S102: acquiring the target video and the video frame rate of the target video.
The target video may be a local video stored on the VR device or an online video. For an online video, the VR device loads and plays the target video from the server according to the URL of the target video selected by the user, and obtains the video frame rate of the target video.
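As an illustrative sketch only, the target video and its video frame rate could be obtained in Unity with the built-in VideoPlayer component; the class, field, and method names below are assumptions, not part of the disclosure.

```csharp
using UnityEngine;
using UnityEngine.Video;

public class TargetVideoLoader : MonoBehaviour
{
    public VideoPlayer videoPlayer;   // assigned in the three-dimensional virtual scene
    public float videoFrameRate;      // video frame rate of the target video

    // Works for either a local file path or the URL of an online video.
    public void Load(string videoUrl)
    {
        videoPlayer.source = VideoSource.Url;
        videoPlayer.url = videoUrl;
        videoPlayer.prepareCompleted += OnPrepared;
        videoPlayer.Prepare();
    }

    private void OnPrepared(VideoPlayer source)
    {
        videoFrameRate = source.frameRate; // available once the video is prepared
        source.Play();
    }
}
```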
S103: and acquiring the attitude data of the camera according to a preset setting period.
In the embodiments of the application, the VR device moves with the movement of the user's head, and the camera of the VR device is used to simulate the user's eyes: the orientation of the camera represents the line-of-sight direction of the eyes, and the pose data of the camera represent the pose data of the head. The left eye and the right eye lie on the same horizontal line and have the same viewing-angle range, so the pose data of the first camera simulating the left eye and of the second camera simulating the right eye are identical, and the two cameras can be regarded as a single camera.
Generally, an inertial measurement unit (IMU) is provided in the VR device and can measure the pose data of the camera. When S103 is executed, the VR device tracks the head according to a preset period and acquires the pose data of the camera through the IMU. The pose data include the moving speed and the rotating speed of the camera.
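A minimal sketch of such periodic sampling is shown below. On a real VR device the moving speed and rotating speed would come from the IMU; here they are derived from the tracked camera transform purely for illustration, and the component name and default period are assumptions.

```csharp
using UnityEngine;

public class CameraPoseSampler : MonoBehaviour
{
    public Transform cameraTransform;  // the camera that follows the user's head
    public float samplePeriod = 0.1f;  // preset period in seconds (illustrative)

    public float moveSpeed;            // metres per second
    public float rotateSpeed;          // degrees per second

    private Vector3 lastPosition;
    private Quaternion lastRotation;

    void Start()
    {
        lastPosition = cameraTransform.position;
        lastRotation = cameraTransform.rotation;
        InvokeRepeating(nameof(Sample), samplePeriod, samplePeriod);
    }

    void Sample()
    {
        // Speeds are computed from the change of the camera pose over one period.
        moveSpeed = Vector3.Distance(cameraTransform.position, lastPosition) / samplePeriod;
        rotateSpeed = Quaternion.Angle(cameraTransform.rotation, lastRotation) / samplePeriod;

        lastPosition = cameraTransform.position;
        lastRotation = cameraTransform.rotation;
    }
}
```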
S104: and determining the target rendering frame rate according to the comparison result of the gesture data and at least one preset parameter threshold corresponding to the video frame rate and the orientation of the camera.
In the embodiments of the application, after the video frame rate of the target video is obtained, the frame rate comparison table generated in advance is queried for the at least one preset parameter threshold whose rendering frame rate equals the video frame rate, where each preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally.
For example, when the video frame rate is F1, the obtained preset parameter thresholds are V1 and R1; when the video frame rate is F2, the obtained preset parameter thresholds are V2 and R2.
The target rendering frame rate is then set dynamically by comparing the acquired pose data with the corresponding preset parameter thresholds and by determining the orientation of the camera. For a specific implementation, see Fig. 2:
s1041: determining whether the orientation of the camera stays in the area of the virtual display screen, if so, executing S1042, otherwise, executing S1044.
In the embodiments of the application, the cameras comprise a first camera simulating the left eye and a second camera simulating the right eye, which are positioned on the same horizontal line; the central point of the horizontal line is taken as the starting point of the camera orientation, and the direction perpendicular to the horizontal line is taken as the camera orientation.
In an alternative embodiment, when performing S1041, the object in the three-dimensional scene intersected by a ray cast from this starting point along the orientation of the camera is determined and its identifier (such as its name) is obtained; if the identifier indicates that the object is the virtual display screen playing the target video, it is determined that the orientation of the camera stays in the area of the virtual display screen.
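A minimal sketch of this check in Unity is shown below. Identifying the virtual display screen by a tag (rather than by its name, as in the description), the ray length, and the class name are illustrative assumptions.

```csharp
using UnityEngine;

public static class GazeCheck
{
    // Returns true when a ray cast from the midpoint between the two eye
    // cameras, along their shared forward direction, hits the virtual display
    // screen (identified here by a tag).
    public static bool IsLookingAtScreen(Transform leftEye, Transform rightEye,
                                         string screenTag = "VirtualScreen")
    {
        Vector3 origin = (leftEye.position + rightEye.position) * 0.5f;
        Vector3 direction = leftEye.forward; // both eye cameras share the same orientation

        if (Physics.Raycast(new Ray(origin, direction), out RaycastHit hit, 100f))
        {
            return hit.collider.CompareTag(screenTag);
        }
        return false;
    }
}
```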
S1042: and determining whether the moving speed and the rotating speed of the camera are both less than or equal to corresponding preset parameter thresholds, if so, executing S1043, otherwise, executing S1044.
When the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds, rendering at the video frame rate corresponding to the current preset parameter thresholds is sufficient; when the moving speed or the rotating speed of the camera exceeds the corresponding preset parameter threshold, that video frame rate is too low.
S1043: and determining the video frame rate as a target rendering frame rate.
When the orientation of the camera stays in the area of the virtual display screen and the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds, rendering and displaying the target video at the video frame rate corresponding to the current preset parameter thresholds is enough to keep the target video picture smooth, and the rendering frame rate of the VR device does not need to be higher than the video frame rate of the target video.
S1044: and determining the screen refresh rate of the VR device as a target rendering frame rate.
When the orientation of the camera leaves the area of the virtual display screen, or at least one of the moving speed and the rotating speed of the camera exceeds the corresponding preset parameter threshold, rendering the target video at the video frame rate corresponding to the current preset parameter threshold would cause picture stutter and jitter, so the rendering frame rate needs to be increased. To achieve the three-dimensional effect, the screen refresh rate of the VR device is generally higher than the video frame rate; the screen refresh rate of the VR device can therefore be determined as the target rendering frame rate, which avoids the picture stutter and jitter caused by head movement and further improves the immersive experience of the user.
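The decision of S1041 to S1044 can be summarized in a short helper. The sketch below reuses the MotionLimits struct from the earlier table sketch and applies the chosen rate through Application.targetFrameRate, which takes effect because vSyncCount was set to 0 in S101; the class and parameter names are assumptions.

```csharp
using UnityEngine;

public static class FrameRateSelector
{
    // Chooses and applies the target rendering frame rate as in S1041-S1044.
    // 'limits' holds the preset parameter thresholds looked up for the current
    // video frame rate; 'screenRefreshRate' is the refresh rate of the VR screen.
    public static void Apply(bool gazeOnScreen, float moveSpeed, float rotateSpeed,
                             int videoFrameRate, MotionLimits limits, float screenRefreshRate)
    {
        bool withinLimits = moveSpeed <= limits.maxMoveSpeed &&
                            rotateSpeed <= limits.maxRotateSpeed;

        int targetRenderingFrameRate = (gazeOnScreen && withinLimits)
            ? videoFrameRate                        // S1043: the video frame rate is sufficient
            : Mathf.RoundToInt(screenRefreshRate);  // S1044: fall back to the screen refresh rate

        // With vSyncCount set to 0, this controls the rendering frame rate.
        Application.targetFrameRate = targetRenderingFrameRate;
    }
}
```

Such a helper could be called from a periodic update together with the pose sampler and gaze check sketched earlier.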
S105: and rendering and displaying the target video according to the target rendering frame rate.
In S105, the target rendering frame rate matches the head motion state of the user, so rendering the target video at the target rendering frame rate ensures that the target video is displayed normally.
Generally, there are many types of video, the user may switch videos while the target video is playing, and different types of video require different display screens. Therefore, in the embodiment of the present application, before acquiring the pose data of its camera, the VR device further performs the following steps, referring to Fig. 3:
S1021: determining the video type of the target video.
Optionally, the video types include, but are not limited to, 2D/3D video, fisheye video, 180° video, and panoramic video.
S1022: determining the target type of the virtual display screen according to the determined video type.
In specific implementation, when the video type of the target video is 2D or 3D, the target type of the virtual display screen is determined to be a plane screen; when the video type of the target video is fisheye video or 180° video, the target type of the virtual display screen is determined to be a semi-enclosed hemispherical screen; and when the video type of the target video is panoramic video, the target type of the virtual display screen is determined to be a spherical surrounding screen.
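A minimal sketch of this mapping is shown below; the enum and method names are illustrative assumptions.

```csharp
public enum VideoType { Video2D, Video3D, Fisheye, Video180, Panoramic }
public enum ScreenType { Plane, Hemisphere, SphericalSurround }

public static class ScreenSelector
{
    // Maps the video type of the target video to the target type of virtual
    // display screen described in S1022.
    public static ScreenType ForVideoType(VideoType type)
    {
        switch (type)
        {
            case VideoType.Video2D:
            case VideoType.Video3D:
                return ScreenType.Plane;             // plane screen
            case VideoType.Fisheye:
            case VideoType.Video180:
                return ScreenType.Hemisphere;        // semi-enclosed hemispherical screen
            default:
                return ScreenType.SphericalSurround; // spherical surrounding screen
        }
    }
}
```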
In this way, in performing S105, the VR device may render the target video for display on the virtual display screen of the target type at the target rendering frame rate.
Referring to Fig. 4, the complete flow of the virtual reality video playing method provided by the embodiment of the present application mainly includes the following steps:
S401: entering a three-dimensional virtual scene and turning off the vertical synchronization function.
Each time the video is switched, the following steps are executed:
S402: acquiring the target video and the video frame rate of the target video.
S403: determining the video type of the target video.
S404: determining the target type of the virtual display screen according to the determined video type.
S405: acquiring pose data of the camera according to a preset period.
S406: determining whether the orientation of the camera stays in the area of the virtual display screen; if so, executing S407, otherwise, executing S409.
S407: determining whether the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds; if so, executing S408, otherwise, executing S409.
S408: determining the video frame rate as the target rendering frame rate.
S409: determining the screen refresh rate of the VR device as the target rendering frame rate.
S410: rendering the target video on the virtual display screen of the target type for display according to the target rendering frame rate.
In the virtual reality video playing method provided by the embodiments of the present application, the VR device turns off the vertical synchronization function after entering the three-dimensional virtual scene, so that the rendering frame rate can be adjusted dynamically without being limited by the screen hardware. After the target video is acquired, a virtual display screen of the corresponding type is determined according to the video type of the target video. While the target video is playing, the orientation, moving speed, and rotating speed of the camera are acquired according to a preset period. When the orientation of the camera stays in the area of the virtual display screen and the moving speed and the rotating speed of the camera are both less than or equal to the preset parameter thresholds corresponding to the video frame rate of the target video, the video frame rate is determined as the target rendering frame rate; otherwise, the screen refresh rate of the VR device is determined as the target rendering frame rate. The target video is then rendered on the virtual display screen of the target type for display according to the target rendering frame rate. Because the orientation, moving speed, and rotating speed of the camera reflect the user's gaze and head motion state, the target rendering frame rate is determined flexibly from these acquired values; it adapts to the head state while keeping the target video picture smooth, so it is neither too high nor too low, the playback power consumption of the VR device is reduced, and picture jitter of the target video caused by head motion is reduced.
Based on the same technical concept, an embodiment of the present application provides a VR device that can implement the steps of the virtual reality video playing method in the above embodiments and achieve the same technical effect.
Referring to Fig. 5, the device comprises a controller 501, a memory 502, a processor 503, and an IMU 504; the processor 503, the memory 502, and the controller 501 are connected by a bus 505.
the memory 502 stores a computer program, and the controller 501 performs the following operations according to the computer program stored in the memory 502:
entering a three-dimensional virtual scene, and turning off a vertical synchronization function;
acquiring a target video and a video frame rate of the target video;
acquiring pose data of its camera through the IMU 504 according to a preset period;
determining a target rendering frame rate according to an orientation of the camera and a comparison result of the pose data with at least one preset parameter threshold corresponding to the video frame rate, wherein the preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally;
rendering and displaying, by the processor 503, the target video at the target rendering frame rate.
Optionally, the pose data include the moving speed and the rotating speed of the camera, and the controller 501 determines the target rendering frame rate according to the orientation of the camera and the comparison result of the pose data with the at least one preset parameter threshold corresponding to the video frame rate, specifically:
determining whether the orientation of the camera stays in the area of the virtual display screen, and whether the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds;
if so, determining the video frame rate as the target rendering frame rate; otherwise, determining the screen refresh rate of the VR device as the target rendering frame rate.
Optionally, before acquiring the pose data of its camera, the controller 501 further performs the following operations:
determining a video type of the target video;
determining the target type of a virtual display screen according to the video type;
rendering and displaying the target video according to the target rendering frame rate through the processor, wherein the specific operations are as follows:
rendering, by the processor 503, the target video on the virtual display screen of the target type for display at the target rendering frame rate.
Optionally, the controller determines the target type of the virtual display screen according to the video type, and the specific operations are as follows:
when the video type of the target video is 2D or 3D, determining that the target type of the virtual display screen is a plane screen;
when the video type of the target video is a fisheye video or a 180-degree video, determining that the target type of the virtual display screen is a semi-enclosed hemispherical screen;
and when the video type of the target video is the panoramic video, determining that the target type of the virtual display screen is a spherical surrounding screen.
Optionally, the screen refresh rate is greater than the video frame rate.
It should be noted that Fig. 5 is only an example and shows the hardware necessary for the VR device to execute the steps of the virtual reality video playing method provided in the embodiments of the present application. The VR device further includes common display-device hardware that is not shown, such as speakers and the left and right lenses.
In the embodiment of the present application, the controller 501 may be a Central Processing Unit (CPU), and the processor 503 may be a Graphics Processing Unit (GPU). The memory 502 may be integrated in the controller 501, or may be provided separately from the controller 501.
Embodiments of the present application also provide a computer-readable storage medium for storing instructions that, when executed, may implement the methods of the foregoing embodiments.
The embodiments of the present application also provide a computer program product for storing a computer program, where the computer program is used to execute the method of the foregoing embodiments.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A virtual reality video playing method applied to a VR device, the method comprising:
entering a three-dimensional virtual scene, and turning off a vertical synchronization function;
acquiring a target video and a video frame rate of the target video;
acquiring pose data of a camera according to a preset period;
determining a target rendering frame rate according to an orientation of the camera and a comparison result of the pose data with at least one preset parameter threshold corresponding to the video frame rate, wherein the preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally;
and rendering and displaying the target video according to the target rendering frame rate.
2. The method of claim 1, wherein the pose data comprise a moving speed and a rotating speed of the camera, and determining the target rendering frame rate according to the orientation of the camera and the comparison result of the pose data with the at least one preset parameter threshold corresponding to the video frame rate comprises:
determining whether the orientation of the camera stays in the area of a virtual display screen, and whether the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds;
if so, determining the video frame rate as the target rendering frame rate; otherwise, determining a screen refresh rate of the VR device as the target rendering frame rate.
3. The method of claim 1, wherein before acquiring the pose data of the camera, the method further comprises:
determining a video type of the target video;
determining the target type of a virtual display screen according to the video type;
rendering and displaying the target video according to the target rendering frame rate, including:
and rendering the target video on a virtual display screen of the target type for display according to the target rendering frame rate.
4. The method of claim 3, wherein determining the target type of the virtual display screen according to the video type comprises:
when the video type of the target video is 2D or 3D, determining that the target type of the virtual display screen is a plane screen;
when the video type of the target video is a fisheye video or a 180-degree video, determining that the target type of the virtual display screen is a semi-enclosed hemispherical screen;
and when the video type of the target video is the panoramic video, determining that the target type of the virtual display screen is a spherical surrounding screen.
5. The method of any of claims 1-4, wherein the screen refresh rate is greater than the video frame rate.
6. A VR device, comprising a controller, a memory, a processor, and an inertial measurement unit (IMU), wherein the processor, the memory, and the controller are connected by a bus;
the memory stores a computer program according to which the controller performs the following operations:
entering a three-dimensional virtual scene, and turning off a vertical synchronization function;
acquiring a target video and a video frame rate of the target video;
acquiring pose data of its camera through the IMU according to a preset period;
determining a target rendering frame rate according to an orientation of the camera and a comparison result of the pose data with at least one preset parameter threshold corresponding to the video frame rate, wherein the preset parameter threshold is an upper limit of a motion parameter of the camera when the target video is displayed normally;
and rendering and displaying the target video according to the target rendering frame rate through the processor.
7. The VR device of claim 6, wherein the pose data include a moving speed and a rotating speed of the camera, and the controller determines the target rendering frame rate according to the orientation of the camera and the comparison result of the pose data with the at least one preset parameter threshold corresponding to the video frame rate by:
determining whether the orientation of the camera stays in the area of a virtual display screen, and whether the moving speed and the rotating speed of the camera are both less than or equal to the corresponding preset parameter thresholds;
if so, determining the video frame rate as the target rendering frame rate; otherwise, determining a screen refresh rate of the VR device as the target rendering frame rate.
8. The VR device of claim 6, wherein before acquiring the pose data of the camera, the controller is further configured to:
determining a video type of the target video;
determining the target type of a virtual display screen according to the video type;
rendering and displaying the target video according to the target rendering frame rate through the processor, wherein the specific operations are as follows:
rendering, by the processor, the target video on the target type of virtual display screen for display at the target rendering frame rate.
9. The VR device of claim 8, wherein the controller determines the target type of the virtual display screen according to the video type by:
when the video type of the target video is 2D or 3D, determining that the target type of the virtual display screen is a plane screen;
when the video type of the target video is a fisheye video or a 180-degree video, determining that the target type of the virtual display screen is a semi-enclosed hemispherical screen;
and when the video type of the target video is the panoramic video, determining that the target type of the virtual display screen is a spherical surrounding screen.
10. The VR device of any one of claims 6-9, wherein the screen refresh rate is greater than the video frame rate.
CN202210191645.2A 2022-02-28 2022-02-28 Virtual reality video playing method and equipment Pending CN114630182A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210191645.2A CN114630182A (en) 2022-02-28 2022-02-28 Virtual reality video playing method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210191645.2A CN114630182A (en) 2022-02-28 2022-02-28 Virtual reality video playing method and equipment

Publications (1)

Publication Number Publication Date
CN114630182A true CN114630182A (en) 2022-06-14

Family

ID=81900656

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210191645.2A Pending CN114630182A (en) 2022-02-28 2022-02-28 Virtual reality video playing method and equipment

Country Status (1)

Country Link
CN (1) CN114630182A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1918881A2 (en) * 2005-04-19 2008-05-07 Digitalfish, Inc. Techniques and workflows for computer graphics animation system
CN103930817A (en) * 2011-06-20 2014-07-16 谷歌公司 Systems and methods for adaptive transmission of data
CN106127843A (en) * 2016-06-16 2016-11-16 福建数博讯信息科技有限公司 The rendering intent of three-dimensional virtual scene and device
CN110121885A (en) * 2016-12-29 2019-08-13 索尼互动娱乐股份有限公司 For having recessed video link using the wireless HMD video flowing transmission of VR, the low latency of watching tracking attentively
US20200120322A1 (en) * 2017-05-18 2020-04-16 Sony Interactive Entertainment Inc. Image generating device, image display system, and image generating method
US20210065652A1 (en) * 2019-09-04 2021-03-04 Samsung Display Co., Ltd. Electronic device and method of driving the same
CN111228797A (en) * 2020-01-13 2020-06-05 腾讯科技(深圳)有限公司 Data processing method, data processing device, computer and readable storage medium
CN111711811A (en) * 2020-06-29 2020-09-25 京东方科技集团股份有限公司 VR image processing method, device and system, VR equipment and storage medium
CN113206993A (en) * 2021-04-13 2021-08-03 聚好看科技股份有限公司 Method for adjusting display screen and display device
WO2023093776A1 (en) * 2021-11-25 2023-06-01 华为技术有限公司 Interface generation method and electronic device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WU Yi (吴翌), "游戏硬件100Q 关于游戏硬件的大百科" [Game Hardware 100Q: an encyclopedia of game hardware], 现代计算机 (普及版) [Modern Computer (Popular Edition)], no. 03, 5 March 2010 (2010-03-05) *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114913471A (en) * 2022-07-18 2022-08-16 深圳比特微电子科技有限公司 Image processing method and device and readable storage medium
CN114913471B (en) * 2022-07-18 2023-09-12 深圳比特微电子科技有限公司 Image processing method, device and readable storage medium

Similar Documents

Publication Publication Date Title
CN106502427B (en) Virtual reality system and scene presenting method thereof
US10957104B2 (en) Information processing device, information processing system, and information processing method
US11282264B2 (en) Virtual reality content display method and apparatus
TWI669635B (en) Method and device for displaying barrage and non-volatile computer readable storage medium
US10185145B2 (en) Display apparatus and operating method of display apparatus
CN108319362B (en) Panoramic information display method, electronic equipment and computer storage medium
US10712817B1 (en) Image re-projection for foveated rendering
CN106780674B (en) Lens moving method and device
US20200120380A1 (en) Video transmission method, server and vr playback terminal
US10477198B2 (en) Display control method and system for executing the display control method
US11244427B2 (en) Image resolution processing method, system, and apparatus, storage medium, and device
CN113286138A (en) Panoramic video display method and display equipment
CN113206993A (en) Method for adjusting display screen and display device
CN114630182A (en) Virtual reality video playing method and equipment
US11557087B2 (en) Image processing apparatus and image processing method for generating a strobe image using a three-dimensional model of an object
US20140306958A1 (en) Stereoscopic rendering system
JP6915165B2 (en) Equipment and methods for generating view images
US9628770B2 (en) System and method for stereoscopic 3-D rendering
CN108027646B (en) Anti-shaking method and device for terminal display
CN108737721B (en) Camera limiting adjustment method and device
US11423516B2 (en) Gaze enhanced natural motion blur
TWI817335B (en) Stereoscopic image playback apparatus and method of generating stereoscopic images thereof
CN105630170B (en) Information processing method and electronic equipment
JP2019121072A (en) Viewing condition interlocking system, method and program for 3dcg space
CN109410306B (en) Image rendering method, device, storage medium, equipment and virtual reality system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination