CN110384921B - VR application anti-dizziness technology based on self-adaptive edge view shielding


Info

Publication number
CN110384921B
Authority
CN
China
Prior art keywords
acceleration
angular velocity
different
screen
incremental
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810366865.8A
Other languages
Chinese (zh)
Other versions
CN110384921A (en)
Inventor
吴亚光
李熠
芦宏川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Wuyi Vision Digital Twin Technology Co ltd
Original Assignee
Beijing Wuyi Vision Digital Twin Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Wuyi Vision Digital Twin Technology Co ltd filed Critical Beijing Wuyi Vision Digital Twin Technology Co ltd
Priority to CN201810366865.8A
Publication of CN110384921A
Application granted
Publication of CN110384921B
Legal status: Active

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50 Controlling the output signals based on the game progress
    • A63F13/52 Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80 Special adaptations for executing a specific game genre or game mode
    • A63F13/803 Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8017 Driving on land or water; Flying
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082 Virtual reality

Abstract

The invention discloses a VR application anti-dizziness technology based on adaptive edge view masking, which applies edge masking of different intensities according to the different scenes that may make the user dizzy. The acceleration and angular velocity of the VR scene are acquired; the acceleration and angular velocity of the user's body are acquired, which can be obtained directly from the VR headset's sensors; the VR screen is divided into different regions, each region corresponding to its own acceleration and angular velocity; the degree of differentiation between the acceleration and angular velocity of each region and those of the user's body is calculated; and different peripheral view masks are applied according to the different degrees of differentiation of each region. The method preserves VR immersion to the maximum extent, greatly relieves and can even avoid VR motion sickness, and provides different anti-dizziness intensities for users to choose freely, ensuring the best VR experience.

Description

VR application anti-dizziness technology based on self-adaptive edge view shielding
Technical Field
The invention relates to the field of VR (virtual reality), in particular to a VR application anti-dizziness technology based on self-adaptive edge view shielding.
Background
Motion sickness in VR applications has long been a major bottleneck limiting VR adoption and popularization. Many current VR applications induce car-sickness-like dizziness after a period of continuous use (roughly 20 minutes). The symptom is caused by a mismatch between the motion seen on screen and the motion perceived inside the body. For example, if the body's internal perception registers a step forward but the VR visuals do not respond as expected, such inconsistencies accumulated over time produce symptoms of dizziness.
At present, the common solution to this problem is to use as little continuous movement or rotation as possible in VR applications, substituting instantaneous (teleport-style) movement or rotation; where continuous movement or rotation is necessary, a very low speed is used to prevent the user from becoming dizzy. This greatly limits the scenarios VR applications can support.
During the development of VR applications, we found that reducing the user's edge (peripheral) view in VR is very helpful in reducing vertigo. A report by Columbia researchers likewise proposed that the symptoms produced by VR can be reduced by adjusting the visual range. However, to limit the edge view without reducing the immersion of the game, a method is needed that calculates in real time under what conditions edge masking should be performed, and that applies different degrees of edge masking matched to the level of vertigo.
The invention provides an algorithm that performs edge masking adaptively, applying edge masking of different intensities according to the different scenes (different movement speeds, different rotation speeds) that may make the user dizzy.
Disclosure of Invention
The invention provides a VR anti-dizziness technology based on adaptive edge view masking, which preserves VR immersion to the maximum extent and greatly relieves VR motion sickness.
In order to achieve this purpose, the invention provides the following technical scheme: a VR application anti-dizziness technology based on adaptive edge view masking, comprising the following steps:
Step one: acquire the acceleration and angular velocity of the VR scene.
Input:
(1) Full-screen depth maps (Depth) of 2 consecutive frames;
(2) The corresponding camera orientation, position, and field of view for the 2 consecutive frames (Camera Position, Camera Orientation, FOV);
(3) The difference Delta (in seconds) between the absolute times of the 2 consecutive frames.
Output: per-pixel scene acceleration and angular velocity.
The algorithm is as follows:
(1) The depth of each pixel can be converted into its world position (World Position) using the camera orientation, position, and FOV; performing this operation on the 2 consecutive frames yields the world positions of both frames (World Position 1 and World Position 2; a consolidated sketch follows this algorithm):
a) Obtain the inverse projection transform (Inverse Proj Transform) from the FOV;
b) Obtain the clip-space position (Clip Space Position) from the depth and the pixel's screen position (UV):
i. float z = Depth * 2.0 - 1.0;
ii. float4 ClipSpacePosition = float4(UV * 2.0 - 1.0, z, 1.0);
c) Transform the clip-space position by the inverse projection transform to obtain the view-space (camera-coordinate) position:
i. float4 ViewSpacePosition = mul(InverseProjTransform, ClipSpacePosition);
d) Obtain the inverse view transform (Inverse View Transform) from the camera orientation and position;
e) Obtain the world position of the final screen pixel from the inverse view transform and the view-space position:
i. ViewSpacePosition /= ViewSpacePosition.w;
ii. float4 WorldPosition = mul(InverseViewTransform, ViewSpacePosition);
(2) The acceleration of a pixel between the 2 consecutive frames can be found from the change in the pixel's position between the frames and the difference Delta of their absolute times. The formulas are as follows:
a) Velocity = (World Position 2 - World Position 1) / Delta
b) Acceleration = (Velocity 2 - Velocity 1) / Delta
(3) From the change in pixel velocity, the change in camera orientation, and the camera position between the 2 consecutive frames, the angular velocity of each pixel can be calculated. The formula is as follows:
a) Radius = distance from the pixel's world position to the camera position
b) Angular Velocity = Velocity / Radius
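The following is a minimal HLSL sketch consolidating sub-steps a) through e) above; the constant-buffer layout and the identifiers InverseProjTransform and InverseViewTransform are illustrative names for the matrices described in the text, not part of any standard API:

    // Reconstruct a pixel's world position from its depth and screen UV.
    // The two inverse matrices are assumed to be precomputed on the CPU
    // from the camera FOV, orientation and position for each frame.
    cbuffer CameraConstants
    {
        float4x4 InverseProjTransform; // inverse of the projection matrix (from FOV)
        float4x4 InverseViewTransform; // inverse of the view matrix (from orientation/position)
    };

    float3 ReconstructWorldPosition(float Depth, float2 UV)
    {
        // b) Screen UV and depth to clip space (both remapped to [-1, 1]).
        float z = Depth * 2.0 - 1.0;
        float4 ClipSpacePosition = float4(UV * 2.0 - 1.0, z, 1.0);

        // c) Clip space to view (camera) space.
        float4 ViewSpacePosition = mul(InverseProjTransform, ClipSpacePosition);

        // e) Perspective divide, then view space to world space.
        ViewSpacePosition /= ViewSpacePosition.w;
        float4 WorldPosition = mul(InverseViewTransform, ViewSpacePosition);
        return WorldPosition.xyz;
    }

Running this on the depth maps of both frames yields World Position 1 and World Position 2 for every pixel.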
Step two: acquiring the acceleration and the angular velocity of the head movement of the user: data is typically obtained directly from the VR headset. Hereinafter referred to as Head Accelation and Head Angular Velocity
Step three: the VR screen is divided into different regions, and each region corresponds to a different acceleration and angular velocity.
(1) The VR screen is firstly divided into a left eye and a right eye, each single eye is divided into four rectangular areas (respectively, upper left, upper right, lower left and lower right) in equal proportion without loss of generality
(2) Averaging the acceleration and the angular velocity of each pixel obtained in the step 1 in one area to obtain the acceleration and the angular velocity for four rectangular areas. Hereinafter referred to as Accelation and Angular Velocity
Step four: the degree of differentiation is calculated for the acceleration angular velocity of each region and the user's body.
(1)Delta Acceleration = Absolute(Acceleration - Head Acceleration)
(2)Delta Angular Velocity = Absolute(Head Acceleration - Head Angular Velocity)
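A compute-shader sketch of steps three and four together, assuming the per-pixel scene motion from step one has been written to a texture; the resource names, the one-thread-per-region reduction, and the reading of Absolute as a vector magnitude are assumptions rather than anything specified in the text:

    // One thread per rectangular region: average the per-pixel motion inside
    // the region, then compute its differentiation from the head motion.
    // A Dispatch(1, 1, 1) call covers 4 x 2 tiles: four regions per eye, two eyes.
    Texture2D<float4>   PixelMotion; // .xyz = scene acceleration, .w = angular velocity
    RWTexture2D<float2> TileDeltas;  // .x = Delta Acceleration, .y = Delta Angular Velocity

    cbuffer HeadMotion
    {
        float3 HeadAcceleration;     // from the VR headset sensors
        float  HeadAngularVelocity;
        uint2  TileSize;             // pixel extent of one rectangular region
    };

    [numthreads(4, 2, 1)]
    void AverageTileAndDelta(uint3 TileId : SV_DispatchThreadID)
    {
        float3 AccelSum  = 0;
        float  AngVelSum = 0;
        uint2  Origin = TileId.xy * TileSize;

        // Step three (2): average the per-pixel values over the region.
        for (uint y = 0; y < TileSize.y; ++y)
            for (uint x = 0; x < TileSize.x; ++x)
            {
                float4 M = PixelMotion[Origin + uint2(x, y)];
                AccelSum  += M.xyz;
                AngVelSum += M.w;
            }
        float N = TileSize.x * TileSize.y;

        // Step four: Delta Acceleration and Delta Angular Velocity
        // (Absolute taken as magnitude for the vector quantity).
        float DeltaAcceleration    = length(AccelSum / N - HeadAcceleration);
        float DeltaAngularVelocity = abs(AngVelSum / N - HeadAngularVelocity);
        TileDeltas[TileId.xy] = float2(DeltaAcceleration, DeltaAngularVelocity);
    }

A parallel reduction would be faster than the plain loop, but the loop keeps the sketch close to the text.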
Step five: different degrees of differentiation (Delta Acceleration, delta Angular Velocity) are made for each region to make different edge view masks.
(1) Store the acceleration and angular velocity differences in a 2D texture (Tile Delta Acceleration And Delta Angular Velocity), render a polygon inscribed in the VR screen (generally a 128- or 256-sided approximation of a circle), and perform the masking by looking up the corresponding values in the 2D texture and moving the vertices of the inscribed polygon toward the center of the screen accordingly.
(2) Through the Delta Acceleration of each divided region shown in the map, the case where the user's body (head) does not move in the real world but suddenly moves forward in VR (such as a vehicle suddenly accelerating in a racing game) can be detected; positions with larger Delta Acceleration receive a correspondingly higher degree of masking. Likewise, if the user is suddenly rotated in the VR world (such as a sudden spin in a racing game), positions with larger Delta Angular Velocity require a higher degree of masking.
Here VR refers to a VR application program. Virtual reality, also called a virtual environment, uses computer simulation to generate a virtual three-dimensional world, providing the user with simulated senses such as vision, making the user feel present in the scene, and allowing objects in the three-dimensional space to be observed in real time without restriction.
The acceleration can be calculated from the rendered depth of the scene, and the distinct acceleration of each scene region can be calculated from the acceleration difference between two consecutive frames of the VR camera;
the shading process is to store the difference value of the acceleration and the angular velocity in a 2D texture, render a polygon (generally a 128-edge type or a 256-edge type) inscribed in the VR screen, and perform shading by inquiring corresponding values in the 2D texture to move the distances from different vertexes of the inscribed polygon to the center of the screen.
The invention has the beneficial effects that:
1. The invention reduces VR dizziness by masking the edges with different intensities according to the different scenes (different movement speeds, different rotation speeds) that may make the user dizzy.
2. It preserves VR immersion to the greatest extent while greatly relieving, and even avoiding, VR motion sickness.
3. The invention also provides different anti-dizziness strengths for users to choose freely, ensuring the best VR experience.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention and are incorporated in and constitute a part of this specification, illustrate embodiments of the invention and together with the description serve to explain the principles of the invention and not to limit the invention.
In the drawings:
FIG. 1 is a flow chart of the present invention.
Detailed Description
The preferred embodiments of the present invention are described below in conjunction with the accompanying drawings; it should be understood that they are presented here only to illustrate and explain the invention, not to limit it.
A VR application anti-dizziness technology based on adaptive edge view masking, comprising the following steps:
Step one: acquire the acceleration and angular velocity of the VR scene.
Input:
(1) Full-screen depth maps (Depth) of 2 consecutive frames;
(2) The corresponding camera orientation, position, and field of view for the 2 consecutive frames (Camera Position, Camera Orientation, FOV);
(3) The difference Delta (in seconds) between the absolute times of the 2 consecutive frames.
Output: per-pixel scene acceleration and angular velocity.
The algorithm is as follows:
(1) The depth of each pixel can be converted into its world position (World Position) using the camera orientation, position, and FOV; performing this operation on the 2 consecutive frames yields the world positions of both frames:
a) Obtain the inverse projection transform (Inverse Proj Transform) from the FOV;
b) Obtain the clip-space position (Clip Space Position) from the depth and the pixel's screen position (UV):
i. float z = Depth * 2.0 - 1.0;
ii. float4 ClipSpacePosition = float4(UV * 2.0 - 1.0, z, 1.0);
c) Transform the clip-space position by the inverse projection transform to obtain the view-space (camera-coordinate) position:
i. float4 ViewSpacePosition = mul(InverseProjTransform, ClipSpacePosition);
d) Obtain the inverse view transform (Inverse View Transform) from the camera orientation and position;
e) Obtain the world position of the final screen pixel from the inverse view transform and the view-space position:
i. ViewSpacePosition /= ViewSpacePosition.w;
ii. float4 WorldPosition = mul(InverseViewTransform, ViewSpacePosition);
(2) The acceleration of a pixel between the 2 consecutive frames can be found from the change in the pixel's position between the frames and the difference Delta of their absolute times. The formulas are as follows:
a) Velocity = (World Position 2 - World Position 1) / Delta
b) Acceleration = (Velocity 2 - Velocity 1) / Delta
(3) From the change in pixel velocity, the change in camera orientation, and the camera position between the 2 consecutive frames, the angular velocity of each pixel can be calculated (see the sketch following this step). The formula is as follows:
a) Radius = distance from the pixel's world position to the camera position
b) Angular Velocity = Velocity / Radius
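A short HLSL sketch of sub-steps (2) and (3), assuming the reconstructed world positions of the pixel for the current and two previous frames are available (for example from history textures; that storage scheme and all variable names here are assumptions):

    // Per-pixel scene motion from reconstructed world positions.
    // WorldPos0/1/2 are this pixel's world positions at frames t-2, t-1, t.
    float  Delta;          // absolute-time difference between frames (seconds)
    float3 CameraPosition; // camera world position at the current frame

    void ComputePixelMotion(float3 WorldPos0, float3 WorldPos1, float3 WorldPos2,
                            out float3 Acceleration, out float AngularVelocity)
    {
        // (2) Velocity from position change, acceleration from velocity change.
        float3 Velocity1 = (WorldPos1 - WorldPos0) / Delta;
        float3 Velocity2 = (WorldPos2 - WorldPos1) / Delta;
        Acceleration = (Velocity2 - Velocity1) / Delta;

        // (3) Angular velocity about the camera: w = v / r.
        float Radius = distance(WorldPos2, CameraPosition);
        AngularVelocity = length(Velocity2) / max(Radius, 1e-4f);
    }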
Step two: acquiring the acceleration and the angular velocity of the head movement of the user: data is typically obtained directly from the VR headset. Hereinafter referred to as Head Accelation and Head Angular Velocity
Step three: the VR screen is divided into different regions, and each region corresponds to a different acceleration and angular velocity.
(1) The VR screen is firstly divided into a left eye and a right eye, each single eye is divided into four rectangular areas (respectively, upper left, upper right, lower left and lower right) in equal proportion without loss of generality
(2) Averaging the acceleration and the angular velocity of each pixel obtained in the step 1 in one area to obtain the acceleration and the angular velocity for four rectangular areas. Hereinafter referred to as Accelation and Angular Velocity
Step four: the degree of differentiation is calculated for the acceleration angular velocity of each region and the user's body.
(2)Delta Acceleration = Absolute(Acceleration - Head Acceleration)
(1)Delta Angular Velocity = Absolute(Head Acceleration - Head Angular Velocity)
Step five: different degrees of differentiation (Delta Acceleration, delta Angular Velocity) are made for each region to make different edge view masks.
(1) In the specific embodiment, the acceleration and angular velocity differences are stored in a 2D texture (Tile Delta Acceleration And Delta Angular Velocity), a polygon inscribed in the VR screen (generally a 128- or 256-sided approximation of a circle) is rendered, and the masking is performed by looking up the corresponding values in the 2D texture and moving the vertices of the inscribed polygon toward the center of the screen accordingly.
(2) Through the Delta Acceleration of each divided region shown in the map, the case where the user's body (head) does not move in the real world but suddenly moves forward in VR (such as a vehicle suddenly accelerating in a racing game) can be detected; positions with larger Delta Acceleration receive a correspondingly higher degree of masking. Likewise, if the user is suddenly rotated in the VR world (such as a sudden spin in a racing game), positions with larger Delta Angular Velocity require a higher degree of masking.
(3) In the concrete implementation, the above algorithm is implemented in a vertex shader (a consolidated sketch follows this list):
a) UV is the screen-space position of the circular mask's vertices; it is converted to a per-eye coordinate system with the monocular center at (0,0), giving Final UV:
i. float2 FinalUV = UV * float2(0.7f, -0.7f) + float2(0.5f, 0.5f);
b) The Delta Acceleration And Delta Angular Velocity map is sampled with Final UV, where the sampler (Sampler) uses bilinear filtering to achieve softer edges:
i. float2 DeltaAccelerationAndDeltaAngularVelocity = TileDeltaAccelerationAndDeltaAngularVelocity.SampleLevel(Sampler, FinalUV, 0).xy;
ii. float DeltaAcceleration = DeltaAccelerationAndDeltaAngularVelocity.x;
iii. float DeltaAngularVelocity = DeltaAccelerationAndDeltaAngularVelocity.y;
c) Use the obtained Delta Acceleration and Delta Angular Velocity to move the vertex position (Position) of the circular mask toward the visual center (0,0), and output the final transformed vertex position Shrink Position:
i. float ShrinkAmount = 2.0f - DeltaAcceleration - DeltaAngularVelocity;
ii. ShrinkPosition = Position * float4(ShrinkAmount, ShrinkAmount, 1.0f, 0.0f);
d) Finally, the transition of the edge mask is softened by linear interpolation, achieving an edge-masking effect that is difficult to perceive.
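The following consolidated HLSL sketch puts steps a) through d) together with a procedural construction of the inscribed polygon ring. Only the Final UV mapping, the SampleLevel lookup, and the shrink formula come from the text; the register bindings, the SV_VertexID ring generation, the saturate() clamp, and the w component of 1.0f (the text multiplies w by 0.0f, which would not rasterize) are assumptions:

    // Vertex shader for the adaptive edge mask: a screen-space N-gon ring
    // whose inner vertices are pulled toward the view center when the
    // per-region deltas are large.
    Texture2D<float2> TileDeltaAccelerationAndDeltaAngularVelocity : register(t0);
    SamplerState Sampler : register(s0); // bilinear, for softer mask edges

    static const uint  N  = 128;         // 128-sided inscribed polygon
    static const float PI = 3.14159265f;

    float4 MaskVertexShader(uint VertexId : SV_VertexID) : SV_Position
    {
        // Build one ring vertex: even ids on the inscribed circle, odd ids
        // pushed past the screen border.
        uint  Segment = VertexId / 2;
        float Angle   = Segment * 2.0f * PI / N;
        float Radius  = (VertexId & 1) ? 2.0f : 1.0f;
        float4 Position = float4(cos(Angle) * Radius, sin(Angle) * Radius, 0.0f, 1.0f);

        // a) Screen position to per-eye UV with the eye center mapped to (0.5, 0.5).
        float2 FinalUV = Position.xy * float2(0.7f, -0.7f) + float2(0.5f, 0.5f);

        // b) Sample the per-region deltas with bilinear filtering.
        float2 Deltas = TileDeltaAccelerationAndDeltaAngularVelocity
                            .SampleLevel(Sampler, FinalUV, 0).xy;

        // c) Shrink toward the visual center (0,0); larger deltas shrink more.
        float ShrinkAmount = saturate(2.0f - Deltas.x - Deltas.y);
        return Position * float4(ShrinkAmount, ShrinkAmount, 1.0f, 1.0f);
    }

Drawn as a triangle strip of 2*(N+1) vertices, the outer ring stays off screen while the inner ring contracts toward the view center as the deltas grow, producing the adaptive edge mask.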
After the masking is applied in VR, when the user suddenly moves forward in VR while the body (head) does not move in the real world, the adaptive masking at positions with a larger acceleration difference achieves a good anti-dizziness effect. Likewise, when the user is suddenly rotated in the VR world while the body (head) does not rotate in the real world, the adaptive masking at positions with a larger angular velocity difference achieves a very good anti-dizziness effect.
Compared with the prior art, the invention reduces VR dizziness by masking the edges with different intensities according to the different scenes (different movement speeds, different rotation speeds) that may make the user dizzy; it preserves VR immersion to the greatest extent, greatly relieves VR motion sickness, and can even avoid it.
Finally, it should be noted that although the present invention has been described in detail with reference to the foregoing embodiments, it will be apparent to those skilled in the art that changes may be made to the embodiments or equivalents substituted for elements thereof without departing from the spirit and scope of the invention. Any modification, equivalent replacement, or improvement made within the spirit and principles of the present invention shall be included in its scope of protection.

Claims (3)

1. A VR application anti-dizziness technology based on adaptive edge view occlusion, comprising the following steps:
Step one: acquiring the acceleration and angular velocity of the VR scene:
Input:
(1) full-screen depth maps of 2 consecutive frames;
(2) the corresponding camera orientation, position, and field angle for the 2 consecutive frames;
(3) the difference of the absolute times of the 2 consecutive frames;
Output: per-pixel scene acceleration and angular velocity;
Step two: acquiring the acceleration and angular velocity of the user's head movement, the data being obtained directly from the VR headset sensor;
Step three: dividing the VR screen into different regions, each region corresponding to its own acceleration and angular velocity:
(1) the VR screen is first divided into a left eye and a right eye, and each eye is divided into four equal rectangular regions, namely upper left, upper right, lower left, and lower right;
(2) averaging the per-pixel accelerations and angular velocities obtained in step one within each region to obtain the acceleration and angular velocity of the four rectangular regions;
Step four: calculating the degree of differentiation between each region's acceleration and angular velocity and those of the user's body, the degrees of differentiation comprising an incremental acceleration and an incremental angular velocity;
Step five: applying different edge view masks for the different degrees of differentiation of each region:
(1) storing the acceleration and angular velocity differences in a 2D texture, rendering a polygon inscribed in the VR screen, and performing the masking by looking up the corresponding values in the 2D texture and moving the vertices of the inscribed polygon toward the center of the screen;
(2) detecting, through the incremental acceleration of each divided region displayed in the map, that the user's body does not move in the real world but suddenly moves forward in VR, wherein positions with larger incremental acceleration correspond to higher masking degrees, and, if the user is suddenly rotated in the VR world, positions with larger incremental angular velocity require higher masking degrees;
(3) in a specific implementation, the above algorithm is implemented in a vertex processor:
a) UV is the screen-space position of the circular mask's vertices and is converted into a coordinate system with the monocular center at (0,0) to obtain the final UV;
b) the incremental acceleration and incremental angular velocity map is sampled with the final UV, the sampling using bilinear filtering to achieve soft edges;
c) the obtained incremental acceleration and incremental angular velocity are used to move the vertex position of the circular mask toward the visual center (0,0), and the final transformed vertex position is output;
d) the transition of the edge mask is softened by linear interpolation, achieving an edge-masking effect that is difficult to perceive.
2. The VR application anti-dizziness technology of claim 1, wherein the acceleration is calculated from the rendered depth of the scene, and the distinct acceleration of each scene region is calculated from the acceleration difference between two consecutive frames of the VR camera.
3. The VR application anti-dizziness technology of claim 1, wherein the masking is performed by storing the acceleration and angular velocity differences in a 2D texture and rendering a polygon inscribed in the VR screen, the polygon being 128- or 256-sided, the masking being performed by looking up the corresponding values in the 2D texture and moving the vertices of the inscribed polygon toward the center of the screen.
CN201810366865.8A 2018-04-23 2018-04-23 VR application anti-dizziness technology based on self-adaptive edge view shielding Active CN110384921B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810366865.8A CN110384921B (en) 2018-04-23 2018-04-23 VR application anti-dizziness technology based on self-adaptive edge view shielding

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810366865.8A CN110384921B (en) 2018-04-23 2018-04-23 VR application anti-dizziness technology based on self-adaptive edge view shielding

Publications (2)

Publication Number Publication Date
CN110384921A CN110384921A (en) 2019-10-29
CN110384921B (en) 2023-03-28

Family

ID=68284590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810366865.8A Active CN110384921B (en) 2018-04-23 2018-04-23 VR application anti-dizziness technology based on self-adaptive edge view shielding

Country Status (1)

Country Link
CN (1) CN110384921B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20220308341A1 (en) 2021-03-29 2022-09-29 Tencent America LLC Towards subsiding motion sickness for viewport sharing for teleconferencing and telepresence for remote terminals

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104539929A (en) * 2015-01-20 2015-04-22 刘宛平 Three-dimensional image coding method and coding device with motion prediction function
CN106658148A (en) * 2017-01-16 2017-05-10 深圳创维-Rgb电子有限公司 Virtual reality (VR) playing method, VR playing apparatus and VR playing system
CN106902513A (en) * 2017-03-02 2017-06-30 苏州蜗牛数字科技股份有限公司 A method for optimizing VR game images
CN206601680U (en) * 2016-11-15 2017-10-31 北京当红齐天国际文化发展集团有限公司 Virtual reality anti-dizziness system based on spatial positioning

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9586141B2 (en) * 2011-09-08 2017-03-07 Paofit Holdings Pte. Ltd. System and method for visualizing synthetic objects within real-world video clip
US10395428B2 (en) * 2016-06-13 2019-08-27 Sony Interactive Entertainment Inc. HMD transitions for focusing on specific content in virtual-reality environments

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104539929A (en) * 2015-01-20 2015-04-22 刘宛平 Three-dimensional image coding method and coding device with motion prediction function
CN206601680U (en) * 2016-11-15 2017-10-31 北京当红齐天国际文化发展集团有限公司 Virtual reality anti-dizziness system based on spatial positioning
CN106658148A (en) * 2017-01-16 2017-05-10 深圳创维-Rgb电子有限公司 Virtual reality (VR) playing method, VR playing apparatus and VR playing system
CN106902513A (en) * 2017-03-02 2017-06-30 苏州蜗牛数字科技股份有限公司 A method for optimizing VR game images

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A Brief Analysis of Virtual Reality Technology (虚拟现实技术浅析); Wu Ruifeng (武瑞峰); China New Technologies and Products (《中国新技术新产品》); 2017-10-15 (No. 19); pp. 32-33 *

Also Published As

Publication number Publication date
CN110384921A (en) 2019-10-29

Similar Documents

Publication Publication Date Title
EP3471410B1 (en) Image generation device and image generation method
US10721456B2 (en) Image generation apparatus and image generation method
KR101923562B1 (en) Method for efficient re-rendering objects to vary viewports and under varying rendering and rasterization parameters
EP3057066B1 (en) Generation of three-dimensional imagery from a two-dimensional image using a depth map
US9626790B1 (en) View-dependent textures for interactive geographic information system
JP5818773B2 (en) Image processing apparatus, image processing method, and program
JP4555722B2 (en) 3D image generator
US20050219239A1 (en) Method and apparatus for processing three-dimensional images
EP1391846A1 (en) Image processing method, image processing apparatus, and program for emphasizing object movement
JP2012079291A (en) Program, information storage medium and image generation system
JP2008287696A (en) Image processing method and device
US11417060B2 (en) Stereoscopic rendering of virtual 3D objects
US6529194B1 (en) Rendering method and apparatus, game system, and computer readable program product storing program for calculating data relating to shadow of object in virtual space
CN109461197B (en) Cloud real-time drawing optimization method based on spherical UV and re-projection
WO2021146451A1 (en) Creating action shot video from multi-view capture data
CN110384921B (en) VR application anti-dizziness technology based on self-adaptive edge view shielding
KR100381817B1 (en) Generating method of stereographic image using Z-buffer
WO2009068942A1 (en) Method and system for processing of images
US20220406003A1 (en) Viewpoint path stabilization
CN114494545A (en) Implementation method and system for simulating foggy day in 3D scene
WO2019026388A1 (en) Image generation device and image generation method
JP7377014B2 (en) Image display device, image display system, and image display method
JP6503098B1 (en) Image processing apparatus, image processing program and image processing method
KR101227183B1 (en) Apparatus and method for stereoscopic rendering 3-dimension graphic model
CN112037313A (en) VR scene optimization method based on tunnel visual field

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
  Address after: Room 307, 3/F, supporting public building, Mantingfangyuan community, Qingyanli, Haidian District, Beijing 100086
  Applicant after: Beijing Wuyi Vision digital twin Technology Co.,Ltd.
  Address before: Room 307, 3/F, public building, Mantingfangyuan community, Qingyunli, Haidian District, Beijing
  Applicant before: DANGJIA MOBILE GREEN INTERNET TECHNOLOGY GROUP Co.,Ltd.
GR01 Patent grant