CN113347402A - Improved method, device and storage medium for rendering immersive content based on Unity - Google Patents
- Publication number
- CN113347402A CN113347402A CN202110719117.5A CN202110719117A CN113347402A CN 113347402 A CN113347402 A CN 113347402A CN 202110719117 A CN202110719117 A CN 202110719117A CN 113347402 A CN113347402 A CN 113347402A
- Authority
- CN
- China
- Prior art keywords
- camera
- rendering
- screen
- unity
- immersive content
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/122—Improving the 3D impression of stereoscopic images by modifying image signal contents, e.g. by filtering or adding monoscopic depth cues
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/302—Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
Abstract
The invention relates to a Unity-based method, device and storage medium for improved rendering of immersive content. The rendering resolution matches the effective resolution, so a higher-resolution immersive picture can be rendered and the detail of the final immersive content is preserved. The algorithm is simple and easy to understand, letting technical artists quickly adjust parameters to the actual size of a naked-eye 3D immersive digital showroom and configure the content to fit the display venue. Rendering efficiency is improved, hardware performance requirements are lowered, and cost is effectively reduced.
Description
Technical Field
The invention relates to an improved method, device and storage medium for rendering immersive content based on Unity.
Background
In producing digital content for naked-eye 3D immersive digital showrooms, a CAVE folded-screen system renders and displays the produced content, which is divided into front, left, right and bottom view pictures.
In the actual rendering process, with known plug-in algorithms the image rendered by the camera for each view is far larger than the image actually projected in the immersive space, so low-performance configurations cannot run normally, or the related software crashes under sustained high load.
Because the camera's rendering resolution in the engine exceeds the resolution of the picture projected in the actual scene, obtaining a sharp picture in a real project requires rendering a larger image in the engine, splitting it with shaders, and then outputting it to the real display device.
For example: to project a 4000 × 4000 picture onto a 4 m × 4 m projection area, the plug-in algorithm renders at 4000 × 7000, and the known algorithm's rendered resolution varies greatly with the camera's distance from, and height relative to, the viewport.
The farther the camera is from the viewport, the closer the rendered resolution is to the actual projection resolution; the closer the camera, the more the rendered resolution exceeds the projection resolution. The camera's lateral position matters as well: with the camera at the viewport center the rendered resolution is closest to the projection resolution, and the farther the camera shifts left or right, the larger the rendered resolution and the more rendering is wasted.
The prior art (which targets only the four-screen naked-eye 3D immersive image algorithm) has the following defects: the actual rendered resolution does not match the effective resolution; under certain conditions the rendered resolution is extremely high, placing heavy demands on the rendering configuration; and the algorithm has many calculation parameters.
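The waste described above can be sketched numerically. This is a reconstruction from the figures given in this document, not the plug-in's actual code: assuming the legacy plug-in keeps a symmetric frustum, the rendered image must widen until the farther screen edge is covered, so rendered width grows linearly with the camera's lateral offset.

```python
def symmetric_render_width_px(screen_w_cm, camera_offset_cm, px_per_cm=10):
    """Width a symmetric (non-offset) frustum must render to cover a screen
    whose center is laterally offset from the camera axis (assumed model)."""
    return int(2 * (screen_w_cm / 2 + abs(camera_offset_cm)) * px_per_cm)

# Centered camera: no waste (4 m screen at 10 px/cm -> 4000 px).
print(symmetric_render_width_px(400, 0))    # 4000
# Camera 1.5 m off-center: 7000 px rendered for a 4000 px screen.
print(symmetric_render_width_px(400, 150))  # 7000
```

This matches the document's 4000 × 7000 example for a 4 m screen once the viewpoint sits off the screen's center line.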
Disclosure of Invention
To solve the above technical problem, the present invention provides an improved method, apparatus and storage medium for rendering immersive content based on Unity.
An improved method of Unity-based rendering of immersive content, comprising:
acquiring related parameters according to the actual scene size;
creating a camera array with one camera each for the front, left, right and bottom screens;
setting each camera's focal length according to the related parameters, wherein: focal length = perpendicular distance from the camera to the corresponding screen;
calculating each camera's lens shift parameter according to the camera's position relative to the corresponding screen, wherein: vertical offset = (screen height / 2 - camera height) / screen height; horizontal offset = (screen-center horizontal coordinate - camera horizontal coordinate) / screen width;
and obtaining the final, correctly rendered picture.
Further, the related parameters include: the screen length, width and height, the distance from the observation point to the front screen, and the vertical distance from the observation point to the bottom screen.
An improved apparatus for Unity-based rendering of immersive content, comprising a processor, a memory, and a computer program stored in the memory and executable on the processor, the processor implementing the improved method for Unity-based rendering of immersive content as described above when executing the computer program.
A computer readable storage medium storing a computer program which, when executed by a processor, implements an improved method of Unity-based rendering of immersive content as described above.
The technical effects of the invention include: the rendering resolution matches the effective resolution, so a higher-resolution immersive picture can be rendered and the detail of the final immersive content is preserved; the algorithm is simple and easy to understand, letting technical artists quickly adjust parameters to the actual size of a naked-eye 3D immersive digital showroom and configure the content to fit the display venue; rendering efficiency is improved, hardware performance requirements are lowered, and cost is effectively reduced.
Drawings
FIG. 1 is an exemplary diagram of a standard immersive projection space;
FIG. 2 is a schematic diagram of a conventional computing approach;
FIG. 3 is a flow diagram of an improved method of rendering immersive content based on Unity;
FIG. 4 is a schematic diagram of the calculation method provided by the present invention.
Detailed Description
In this embodiment, a standard immersive projection space is configured as follows: front projection screen: 4 m × 2.51 m; left and right projection screens: 4 m × 2.51 m each; bottom projection screen: 4 m × 3 m; distance from the observation point to the front screen: 3.5 m; observation-point height: 1.6 m. An example is shown in FIG. 1.
The conventional calculation, as shown in FIG. 2, proceeds as follows:
(1) Acquire the related parameters according to the actual scene size: the screen length, width and height, the distance from the observation point (the camera) to the front screen, and the vertical distance from the observation point to the bottom screen.
(2) Create a camera array with one camera each for the front, left, right and bottom screens.
(3) Calculate the camera's field of view (FOV): FOV = atan((screen width / 2 + abs(camera's lateral offset from the screen center)) / (distance from the camera to the screen)) × 2. Taking the space above as an example (in centimeters), the front camera's FOV = atan((400 / 2 + 0) / 350) × 2 = 59.4898; the left and right cameras' FOV is 120.51; the bottom camera's FOV is 102.68.
(4) With the field of view known, the front camera's rendering resolution works out to 4000 × 3200, the left and right cameras' to 7000 × 3200, and the bottom camera's to 4000 × 7000.
(5) According to the actual scene size, the final resolutions actually required are: front, left and right: 4000 × 2510; bottom: 4000 × 3000. Cropping parameters are then computed from the camera and scene parameters, and the picture is cropped to obtain the final projected picture.
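The FOV figures above can be reproduced numerically. A minimal sketch (Python stands in for the engine here, and the offsets are inferred from the example space rather than stated in the document: the viewpoint sits 1.5 m from the midpoint of each 4 m side wall and 2 m from the wall itself, and 1.6 m above the 4 m-wide bottom screen):

```python
import math

def legacy_fov_deg(extent_cm, offset_cm, dist_cm):
    # FOV = 2 * atan((half screen extent + |lateral offset|) / camera-to-screen distance)
    return 2 * math.degrees(math.atan((extent_cm / 2 + abs(offset_cm)) / dist_cm))

print(round(legacy_fov_deg(400, 0, 350), 2))    # front camera:      59.49
print(round(legacy_fov_deg(400, 150, 200), 2))  # left/right camera: 120.51
print(round(legacy_fov_deg(400, 0, 160), 2))    # bottom camera:     102.68
```

The three results match the document's 59.4898, 120.51 and 102.68 degrees, confirming the parenthesization used in step (3).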
This embodiment provides an improved method for rendering immersive content based on Unity, as shown in FIG. 3, comprising:
(1) Acquire the related parameters according to the actual scene size; in this embodiment, they include: the screen length, width and height, the distance from the observation point to the front screen, and the vertical distance from the observation point to the bottom screen;
(2) Create a camera array with one camera each for the front, left, right and bottom screens;
(3) Set each camera's focal length according to the related parameters, wherein: focal length = perpendicular distance from the camera to the corresponding screen;
(4) Calculate each camera's lens shift parameter (LensShift) according to the camera's position relative to the corresponding screen:
LenShift_Y = (screenCenterPos.y-eyesPos.y) / screen.y
namely: vertical offset = (screen height/2-camera height)/screen height
LenShift_X = (screenCenterPos.z-eyesPos.z) / screen.x
Namely: horizontal offset = (screen-center horizontal coordinate - camera horizontal coordinate) / screen width
The code within the Unity engine is as follows (taking the front camera as an example):
// World-space center of the front screen; screenLength is the distance to the front screen along z
Vector3 screenFrontCenterPos = new Vector3(0, screenFront.y / 2, screenLength);
float eyesToScreenDis_Z = screenFrontCenterPos.z - eyesPos.z; // perpendicular distance to the screen
float eyesToScreenDis_Y = screenFrontCenterPos.y - eyesPos.y; // vertical offset from the screen center
float eyesToScreenDis_X = eyesPos.x;                          // horizontal offset from the screen center
float lenShift_Y = eyesToScreenDis_Y / screenFront.y;         // shift as a fraction of screen height
float lenShift_X = eyesToScreenDis_X / screenFront.x;         // shift as a fraction of screen width
cam_Front.focalLength = eyesToScreenDis_Z;
cam_Front.lensShift = new Vector2(-lenShift_X, lenShift_Y);
(5) Obtain the final, correctly rendered picture, as shown in FIG. 4.
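The improved calculation can be sketched outside the engine. A minimal sketch in Python, under the assumption (as in the Unity snippet above) that the focal length is taken as the perpendicular eye-to-screen distance and the shifts are fractions of the screen dimensions, i.e. the camera's sensor size is set equal to the screen size:

```python
def improved_camera_params(screen_w, screen_h, eye_x, eye_y, eye_to_screen):
    """Off-axis camera parameters for one screen of the immersive space."""
    focal_length = eye_to_screen                 # perpendicular distance to the screen
    shift_y = (screen_h / 2 - eye_y) / screen_h  # vertical lens shift
    shift_x = -eye_x / screen_w                  # horizontal lens shift (sign as in the snippet)
    return focal_length, shift_x, shift_y

# Front screen of the example space: 4 m x 2.51 m, eye at height 1.6 m, 3.5 m away.
f, sx, sy = improved_camera_params(4.0, 2.51, 0.0, 1.6, 3.5)
# f = 3.5, sx = 0.0, sy is about -0.1375: the lens shifts slightly downward
# because the eye sits above the screen's vertical center.
```

The same routine would apply per screen, with eye_x and eye_y reinterpreted along each screen's own axes.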
The key points of the improved Unity-based method for rendering immersive content provided by this embodiment are: the calculation is simpler and more efficient; there is no redundant rendering area; higher-definition immersive content can be rendered; and the method meets the rendering requirements of low-end machines, improving rendering efficiency when the hardware configuration is left unchanged.
Technical effects of the improved Unity-based method for rendering immersive content provided by this embodiment include: the rendering resolution matches the effective resolution, so a higher-resolution immersive picture can be rendered and the detail of the final immersive content is preserved; the algorithm is simple and easy to understand, letting technical artists quickly adjust parameters to the actual size of a naked-eye 3D immersive digital showroom and configure the content to fit the display venue; rendering efficiency is improved, hardware performance requirements are lowered, and cost is effectively reduced.
The embodiment also provides an improved apparatus for rendering immersive content based on Unity, which includes a processor, a memory, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the improved method for rendering immersive content based on Unity is implemented.
The present embodiment also provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the improved Unity-based immersive content rendering method.
Claims (4)
1. An improved Unity-based method of rendering immersive content, comprising:
acquiring related parameters according to the actual scene size;
creating a camera array which respectively corresponds to the front, the left, the right and the bottom;
and setting each camera's focal length according to the related parameters, wherein: focal length = perpendicular distance from the camera to the corresponding screen;
calculating each camera's lens shift parameter according to the camera's position relative to the corresponding screen, wherein: vertical offset = (screen height / 2 - camera height) / screen height; horizontal offset = (screen-center horizontal coordinate - camera horizontal coordinate) / screen width;
and obtaining a final correct rendering picture.
2. The improved method of Unity-based rendering of immersive content of claim 1, wherein the related parameters comprise: the screen length, width and height, the distance from the observation point to the front screen, and the vertical distance from the observation point to the bottom screen.
3. An improved apparatus for Unity-based rendering of immersive content, comprising a processor, a memory, and a computer program stored in said memory and executable on said processor, said processor when executing said computer program implementing the improved method for Unity-based rendering immersive content of any of claims 1-2.
4. A computer readable storage medium storing a computer program which when executed by a processor implements the improved method of Unity-based rendering of immersive content of any of claims 1-2.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202110719117.5A CN113347402A (en) | 2021-06-28 | 2021-06-28 | Improved method, device and storage medium for rendering immersive content based on Unity |
Publications (1)
Publication Number | Publication Date |
---|---|
CN113347402A true CN113347402A (en) | 2021-09-03 |
Family
ID=77479056
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202110719117.5A Pending CN113347402A (en) | 2021-06-28 | 2021-06-28 | Improved method, device and storage medium for rendering immersive content based on Unity |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN113347402A (en) |
Citations (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102300043A (en) * | 2010-06-23 | 2011-12-28 | 中兴通讯股份有限公司 | Method for adjusting meeting place camera of remote presentation meeting system |
CN102957935A (en) * | 2012-04-05 | 2013-03-06 | 深圳艾特凡斯智能科技有限公司 | Tracking imaging method and device |
US20170206625A1 (en) * | 2016-01-18 | 2017-07-20 | Advanced Micro Devices, Inc. | Method and apparatus to accelerate rendering of graphics images |
CN107333121A (en) * | 2017-06-27 | 2017-11-07 | 山东大学 | The immersion solid of moving view point renders optical projection system and its method on curve screens |
CN108377381A (en) * | 2017-01-03 | 2018-08-07 | 黑帆科技有限公司 | Immersion VR Video Rendering method and devices |
CN108616729A (en) * | 2016-12-15 | 2018-10-02 | 北京阿吉比科技有限公司 | Enhance the method and system of user interface three-dimensional stereo effect |
CN110133958A (en) * | 2019-05-21 | 2019-08-16 | 广州悦享环球文化科技有限公司 | A kind of tracking system and method for three-dimensional film |
CN112770018A (en) * | 2020-12-07 | 2021-05-07 | 深圳市大富网络技术有限公司 | Three-dimensional display method and device for 3D animation and computer readable storage medium |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114449169A (en) * | 2022-01-27 | 2022-05-06 | 中影电影数字制作基地有限公司 | Cutting method and system for displaying panoramic video in CAVE space |
CN114449169B (en) * | 2022-01-27 | 2023-11-17 | 中影电影数字制作基地有限公司 | Clipping method and system for showing panoramic video in CAVE space |
Legal Events
Date | Code | Title | Description
---|---|---|---
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |
RJ01 | Rejection of invention patent application after publication | Application publication date: 20210903 |