CN106600672B - A kind of network-based distributed synchronization rendering system and method - Google Patents
- Publication number
- CN106600672B (grant) · CN201611075346A / CN201611075346.3A (application)
- Authority
- CN
- China
- Prior art keywords
- camera
- display terminal
- display
- information
- terminal computer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
Landscapes
- Engineering & Computer Science (AREA)
- Computer Graphics (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Processing Or Creating Images (AREA)
- Controls And Circuits For Display Device (AREA)
Abstract
The invention discloses a network-based distributed synchronous rendering method comprising the steps of: calculating projection matrices, setting up the display terminal computers, setting up the master control server, updating scene information, processing synchronization-object and camera information, and adjusting the camera view angle. The network-based distributed synchronous rendering system includes: a multi-screen display terminal, terminal computers, a high-speed Ethernet, a master control server, a router, and a camera system; the multi-screen display terminal is connected to the terminal computers by DP cables, and the high-speed Ethernet connects the terminal computers, the master control server, and the router. Compared with the prior art, each computer in the present invention drives only one display screen, and the whole scene is stitched together from the contents of multiple screens, which greatly reduces the performance demanded of any single computer and gives the system good scalability.
Description
Technical field
The present invention relates to an image rendering technique in the low-voltage electronics field, and in particular to a system that controls distributed display terminal computers over a network to synchronously render ultra-high-resolution pictures.
Background technique
With the rapid development of computer hardware, the processing capacity, memory size, and networking technology of computers have all advanced by leaps and bounds, yet the scale of scientific computation keeps growing and people's expectations of computers rise ever higher. Immersive virtual-reality applications such as large-scale virtual scene walkthrough systems and virtual battlefields, as well as scientific-visualization applications in numerous fields such as meteorology, already use computational grids of millions, tens of millions, or even billions of cells, with data volumes reaching gigabytes or even terabytes. These applications require not only very high 3D drawing speed from the computer graphics system but also a large-screen display capable of high-resolution output.
With the development of hardware and systems, combining software with hardware has become the mainstream of current rendering techniques, using external hardware for information capture and projection so that images are resolved according to coordinates. But this approach consumes substantial resources, is expensive, and requires large workstations. At the technical level it divides the screen area directly and maps each sub-screen region to one final display area; because primitives are unevenly distributed across the screen, this causes unbalanced task distribution, which becomes the bottleneck of the whole system. Moreover, if a technical adjustment is needed, the whole algorithm must be revised in step, which is extremely difficult.
Therefore, a series of improvements has been made to solve the above problems.
Summary of the invention
The object of the present invention is to provide a network-based distributed synchronous rendering system that overcomes the above disadvantages and deficiencies of the prior art.
A network-based distributed synchronous rendering method, characterized in that its steps include:
Step 1: calculate the projection matrices. From the size of the real display screens and the position of the observer's head, calculate the camera position and projection matrix in Unity 3D, thereby determining the positional relationship and orientation between camera and screen. Based on the real environment, set up the configuration file of the multi-projection camera module on each machine, so that the viewing angle of the displayed scene can be adjusted according to the position of the observer's head;
Step 2: set up the display terminal computers. The system contains a plurality of display terminal computers running Unity3D, each corresponding to one of an equal number of display terminals;
Step 3: set up the master control server, which synchronously controls the display content of the plurality of display terminal computers;
Step 4: update scene information. The master control server processes the preset scene display information and manipulates the corresponding objects; the system draws one image whose complete picture is jointly composed by the multiple display screens, and the master control server continuously updates the change information of the current image;
Step 5: process synchronization-object and camera information. The master control server updates the camera information according to the position coordinates of the current image, then sends the synchronization-object and camera information together with the current image information to the plurality of display terminal computers;
Step 6: adjust the camera view angle and synchronize the scene information. Each display terminal computer adjusts its own camera according to the received image information and camera information and renders the corresponding scene;
Specifically, in step 1 the multi-screen display terminal displays concurrently on a plurality of terminal computers, forming a large display scene with a wide viewing angle. The camera position and projection matrix in Unity3D are calculated from the size of the real display screens and the position of the observer's head. Let pa, pb, pc be three vertices of the projection plane; they define a rectangle in 3D space and determine its size, aspect ratio, position, and orientation. Let pe denote the camera position. Having determined the positional relationship and orientation between camera and screen, the configuration file of the multi-projection camera module is set up on each machine according to the real environment;
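The projection-matrix computation that step 1 describes, from the three screen vertices pa, pb, pc and the eye position pe, can be sketched as a generalized off-axis perspective projection. This is a minimal illustration, not the patent's own code; the function name, the use of NumPy, and the OpenGL-style frustum convention are assumptions:

```python
import numpy as np

def off_axis_projection(pa, pb, pc, pe, near, far):
    """Off-axis projection from screen corners and eye position.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners in
    world space (the patent's pa, pb, pc). pe: the eye/camera position.
    """
    pa, pb, pc, pe = (np.asarray(v, dtype=float) for v in (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal

    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye -> corner vectors
    d = -(va @ vn)                                    # eye-to-screen distance
    l = (vr @ va) * near / d                          # frustum extents scaled
    r = (vr @ vb) * near / d                          # to the near plane
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d

    # OpenGL-style asymmetric frustum matrix
    P = np.array([[2*near/(r-l), 0,            (r+l)/(r-l),          0],
                  [0,            2*near/(t-b), (t+b)/(t-b),          0],
                  [0,            0,            -(far+near)/(far-near),
                                               -2*far*near/(far-near)],
                  [0,            0,            -1,                    0]])
    M = np.eye(4)                                     # rotate world into the
    M[0, :3], M[1, :3], M[2, :3] = vr, vu, vn         # screen's basis
    T = np.eye(4); T[:3, 3] = -pe                     # move the eye to origin
    return P @ M @ T
```

For a symmetric case (eye on the screen's center line) this reduces to an ordinary perspective matrix, which gives a quick sanity check.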
The screens of the multi-screen display terminal are arranged on an arc; the radius of the circle corresponding to the arc is R. Let H denote the long-side length of a display screen, W its short-side length, w` its bezel width, and α the angle between two adjacent display screens.
Assume a point P in three-dimensional space has coordinates P (x, y, z) in coordinate system 1; then its coordinates P` (x`, y`, z`) in coordinate system 2 can be expressed as P` = R·P + T, where T is the translation matrix and R = RxRyRz is the rotation matrix.
pe lies on the perpendicular through the center of the multi-screen display terminal, and the center of the first display screen is the origin of coordinate system 1; pe also lies on the perpendicular through the center of the adjacent display screen, whose center is the origin of coordinate system 3; pe itself is the origin of coordinate system 2. Let pb` have coordinates pb` (x3, y3, z3) in coordinate system 3, pb` (x2, y2, z2) in coordinate system 2, and pb` (x1, y1, z1) in coordinate system 1. Then:
(x2, y2, z2)ᵀ = R·(x1, y1, z1)ᵀ + T, where T = [0 0 R]ᵀ and R = RxRyRz;
(x2, y2, z2)ᵀ = R`·(x3, y3, z3)ᵀ + T`, where T` = [0 0 −R]ᵀ and R` = Rx`Ry`Rz`;
so (x1, y1, z1)ᵀ = R⁻¹·(R`·(x3, y3, z3)ᵀ + T` − T).
Taking the display bezel width into account, tan(α/2) = (W/2 + w`)/R, hence the radius R = (W/2 + w`)/tan(α/2). Substituting into the above formula yields the coordinates of pb` in coordinate system 1, and likewise the coordinates of every screen vertex in coordinate system 1.
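The chain of coordinate systems 3 → 2 → 1 described above can be sketched in code. This is a reconstruction under the assumption that the two screen frames differ only by a yaw of α about the vertical axis together with the translations T = [0 0 R]ᵀ and T` = [0 0 −R]ᵀ given in the text; the function names are illustrative:

```python
import numpy as np

def rot_y(a):
    """Rotation matrix about the vertical (y) axis by angle a (radians)."""
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s],
                     [0.0, 1.0, 0.0],
                     [-s, 0.0, c]])

def frame3_to_frame1(p3, alpha, R):
    """Map a point from the adjacent screen's frame (3) into the first
    screen's frame (1) via the camera frame (2), mirroring the patent's
    T = [0 0 R]^T, T` = [0 0 -R]^T structure. Only the yaw component of
    R = RxRyRz is nonzero here, an assumption for adjacent screens that
    differ by a rotation of alpha about the vertical axis."""
    # frame 3 -> camera frame 2: rotate by alpha, then undo T` = [0, 0, -R]
    p2 = rot_y(alpha) @ np.asarray(p3, dtype=float) - np.array([0.0, 0.0, -R])
    # camera frame 2 -> frame 1: axes aligned, origins offset by T = [0, 0, R]
    p1 = p2 - np.array([0.0, 0.0, R])
    return p1
```

With α = 0 (a flat wall) the two translations cancel and a point keeps its coordinates, which is the expected degenerate case.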
Further, in step 1 the projection matrix computed for each display screen is saved as an XML file; the resulting XML configuration file is then deployed on the corresponding display terminal computer and loaded by the Unity 3D program on that computer.
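A minimal sketch of this save/load step, assuming simple element names (`screen`, `projectionMatrix`) since the patent does not specify the file schema; in the real system the loader would be the Unity 3D program rather than Python:

```python
import xml.etree.ElementTree as ET
import numpy as np

def save_projection_xml(path, screen_id, matrix):
    """Persist one display screen's 4x4 projection matrix as XML.
    Element and attribute names are illustrative assumptions."""
    root = ET.Element("screen", id=str(screen_id))
    m = ET.SubElement(root, "projectionMatrix")
    m.text = " ".join(f"{v:.9g}" for v in np.asarray(matrix).ravel())
    ET.ElementTree(root).write(path)

def load_projection_xml(path):
    """Read the config back: returns (screen_id, 4x4 matrix)."""
    root = ET.parse(path).getroot()
    vals = [float(v) for v in root.find("projectionMatrix").text.split()]
    return int(root.get("id")), np.array(vals).reshape(4, 4)
```

One such file per screen is deployed to the matching terminal computer, so each terminal only ever reads its own matrix.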
A network-based distributed synchronous rendering system, characterized by comprising: a multi-screen display terminal, terminal computers, a high-speed Ethernet, a master control server, a router, and a camera system; the multi-screen display terminal is connected to the terminal computers by DP cables, and the high-speed Ethernet connects the terminal computers, the master control server, and the router.
Further, the multi-screen display terminal consists of a plurality of display screens arranged horizontally in sequence, with an angle between adjacent screens.
Beneficial effects of the present invention:
Compared with the prior art, the present invention simulates the same scene on multiple computers, each of which, under the synchronization service, displays one part of the scene. The algorithm forms a rectangle in 3D space from three vertices of the projection plane, determines its size, aspect ratio, position, and orientation together with the camera position, and computes the optimal viewpoint; the picture is then presented through the multi-screen display terminal and stitched into the whole scene. The rendering of the entire scene is completed jointly by multiple computers, each driving only one display screen, so the performance demanded of any single computer is greatly reduced and the system has good scalability.
Detailed description of the invention
Fig. 1 is the system structure diagram of the invention.
Fig. 2 is the coordinate diagram of the algorithm of the invention.
Fig. 3 is the demonstration system diagram of the algorithm of the invention.
Reference numerals:
Multi-screen display terminal 100, terminal computer 200, high-speed Ethernet 300, master control server 400, router 500, and camera system 600.
Specific embodiment
The invention will be further described below with reference to specific embodiments. It should be understood that the following embodiments are merely illustrative and are not intended to limit the scope of the invention.
Embodiment 1
Fig. 1 is the system structure diagram of the invention. Fig. 2 is the coordinate diagram of the algorithm of the invention. Fig. 3 is the demonstration diagram of the algorithm of the invention.
A network-based distributed synchronous rendering method, whose steps include:
Step 1: calculate the projection matrices. From the size of the real display screens and the position of the observer's head, calculate the camera position and projection matrix in Unity 3D, thereby determining the positional relationship and orientation between camera and screen. Based on the real environment, set up the configuration file of the multi-projection camera module on each machine, so that the viewing angle of the displayed scene can be adjusted according to the position of the observer's head;
Step 2: set up the display terminal computers. The system contains a plurality of display terminal computers running Unity3D, each corresponding to one of an equal number of display terminals;
Step 3: set up the master control server, which synchronously controls the display content of the plurality of display terminal computers;
Step 4: update scene information. The master control server processes the preset scene display information and manipulates the corresponding objects; the system draws one image whose complete picture is jointly composed by the multiple display screens, and the master control server continuously updates the change information of the current image;
Step 5: process synchronization-object and camera information. The master control server updates the camera information according to the position coordinates of the current image, then sends the synchronization-object and camera information together with the current image information to the plurality of display terminal computers;
Step 6: adjust the camera view angle and synchronize the scene information. Each display terminal computer adjusts its own camera according to the received image information and camera information and renders the corresponding scene;
Specifically, in step 1 the multi-screen display terminal displays concurrently on a plurality of terminal computers, forming a large display scene with a wide viewing angle. The camera position and projection matrix in Unity3D are calculated from the size of the real display screens and the position of the observer's head. Let pa, pb, pc be three vertices of the projection plane; they define a rectangle in 3D space and determine its size, aspect ratio, position, and orientation. Let pe denote the camera position. Having determined the positional relationship and orientation between camera and screen, the configuration file of the multi-projection camera module is set up on each machine according to the real environment;
The screens of the multi-screen display terminal are arranged on an arc; the radius of the circle corresponding to the arc is R. Let H denote the long-side length of a display screen, W its short-side length, w` its bezel width, and α the angle between two adjacent display screens.
Assume a point P in three-dimensional space has coordinates P (x, y, z) in coordinate system 1; then its coordinates P` (x`, y`, z`) in coordinate system 2 can be expressed as P` = R·P + T, where T is the translation matrix and R = RxRyRz is the rotation matrix.
pe lies on the perpendicular through the center of the multi-screen display terminal, and the center of the first display screen is the origin of coordinate system 1; pe also lies on the perpendicular through the center of the adjacent display screen, whose center is the origin of coordinate system 3; pe itself is the origin of coordinate system 2. Let pb` have coordinates pb` (x3, y3, z3) in coordinate system 3, pb` (x2, y2, z2) in coordinate system 2, and pb` (x1, y1, z1) in coordinate system 1. Then:
(x2, y2, z2)ᵀ = R·(x1, y1, z1)ᵀ + T, where T = [0 0 R]ᵀ and R = RxRyRz;
(x2, y2, z2)ᵀ = R`·(x3, y3, z3)ᵀ + T`, where T` = [0 0 −R]ᵀ and R` = Rx`Ry`Rz`;
so (x1, y1, z1)ᵀ = R⁻¹·(R`·(x3, y3, z3)ᵀ + T` − T).
Taking the display bezel width into account, tan(α/2) = (W/2 + w`)/R, hence the radius R = (W/2 + w`)/tan(α/2). Substituting into the above formula yields the coordinates of pb` in coordinate system 1, and likewise the coordinates of every screen vertex in coordinate system 1.
In step 1, the projection matrix computed for each display screen is saved as an XML file; the resulting XML configuration file is then deployed on the corresponding display terminal computer and loaded by the Unity 3D program on that computer.
The system comprises multi-screen display terminal 100, terminal computers 200, high-speed Ethernet 300, master control server 400, router 500, and camera system 600; multi-screen display terminal 100 is connected to terminal computers 200 by DP cables, and high-speed Ethernet 300 connects terminal computers 200, master control server 400, and router 500.
In the present embodiment, multi-screen display terminal 100 consists of five 4K display screens of 3840 × 2160 resolution, mounted in portrait orientation and stitched into one large screen with a resolution of 10800 × 3840, used to display the scene information;
Terminal computer 200: runs Unity3D, processes the scene data, and renders the scene. Each terminal computer is configured as follows: Intel Core i5 CPU, 4 GB memory, 1 TB hard disk, GeForce GTX 750 graphics card, Windows 10 operating system;
High-speed Ethernet 300: connects the distributed display terminal computers 200, master control server 400, and gigabit router 500 for data transmission;
Master control server 400: synchronously controls the rendering content of the distributed display terminal computers 200 and can manipulate the position, rotation, etc. of objects displayed in the scene. The master control server is configured as follows: Intel Core i5 CPU, 8 GB memory, 1 TB hard disk, GeForce GTX 750 graphics card, Windows 10 operating system;
Router 500: connects terminal computers 200 and master control server 400 through high-speed Ethernet 300 for data transmission. This embodiment uses an 8-port TP-LINK gigabit wireless router.
Multi-screen display terminal 100 consists of a plurality of display screens arranged horizontally in sequence, with an angle between adjacent screens.
In the present embodiment five 55-inch screens are used. Actual measurement gives H = 1.24, W = 0.72, w` = 0.015 (unit: metres), and the angle between neighbouring screens is α = 30 degrees. Then pb` has coordinates pb` (0.345, −0.605, 0) in coordinate system 3, and by the above formula its coordinates in coordinate system 1 are pb` (−0.3730, −0.605, 0.3525). The coordinates of every screen vertex in coordinate system 1 can be obtained in the same way. Writing the XML configuration file: the projection matrix computed for each display screen is saved as an XML file.
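The embodiment's numbers can be checked directly: the visible corner pb` in the screen's own frame is half the panel size minus the bezel on each side, and the arc radius follows from the chord geometry. The radius formula used here is a reconstruction (the printed formula is garbled in the source text):

```python
import numpy as np

# Embodiment values: 55-inch portrait screens (metres), 30-degree hinge angle
H, W, w_bezel = 1.24, 0.72, 0.015
alpha = np.radians(30.0)

# Visible corner pb` of a screen in its own frame (coordinate system 3):
# half the panel size minus the bezel width w` on each side
pb3 = np.array([W / 2 - w_bezel, -(H / 2 - w_bezel), 0.0])

# Arc radius (reconstructed): a screen including its bezel spans a chord of
# half-width W/2 + w` subtending half the hinge angle alpha/2 at the centre
R = (W / 2 + w_bezel) / np.tan(alpha / 2)
```

Running this reproduces pb` (0.345, −0.605, 0) in coordinate system 3 and gives R of roughly 1.4 metres for the five-screen arc.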
Importing the XML configuration file: the resulting XML configuration file is deployed on the corresponding display terminal computer and loaded by the Unity 3D program on that computer;
With the above hardware configuration in place and the system ready, the operational flow is as follows:
Set up the display terminal computers 200: they run Unity3D; the present embodiment contains five display terminal computers.
Set up master control server 400: it synchronously controls the display content of the five display terminal computers 200.
Loop: the following loop runs continuously from system start-up until the system exits.
Update scene information: master control server 400 processes the preset scene display information and manipulates the corresponding objects, for example moving, scaling, or rotating them. In the present embodiment the system draws an image of the Earth in space; the Earth keeps rotating, the complete picture is jointly composed by the five display screens, and master control server 400 continuously updates the motion information of the current Earth model.
Process synchronization-object and camera information: in the present embodiment, master control server 400 updates the camera information according to the position coordinates of the current Earth model.
Send synchronization-object and camera information: in the present embodiment, master control server 400 sends the current Earth-model motion information and camera information to the five display terminal computers.
Adjust the camera view angle and synchronize the scene information: in the present embodiment, each display terminal computer adjusts its own camera according to the received Earth-model motion information and camera information and renders the corresponding scene.
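The per-frame loop above (update scene, process and send synchronization-object and camera information, terminals adjust and render) can be sketched as a simple message protocol. The JSON encoding, field names, and terminal addresses below are assumptions for illustration; the patent does not specify the wire format:

```python
import json
import socket

# Illustrative terminal addresses; not from the patent
TERMINALS = [("192.168.1.10", 9000), ("192.168.1.11", 9000)]

def pack_sync(frame, obj_pos, obj_rot, cam_pos, cam_rot):
    """Serialize one per-frame sync message: the object state the master
    updates (step 4) plus the camera info derived from it (step 5)."""
    msg = {"frame": frame,
           "object": {"pos": list(obj_pos), "rot": list(obj_rot)},
           "camera": {"pos": list(cam_pos), "rot": list(cam_rot)}}
    return json.dumps(msg).encode("utf-8")

def unpack_sync(data):
    """A display terminal decodes the message before adjusting its camera
    and rendering its slice of the scene (step 6)."""
    return json.loads(data.decode("utf-8"))

def broadcast(sock, payload):
    """Step 5 on the master: send the same state to every terminal."""
    for addr in TERMINALS:
        sock.sendto(payload, addr)
```

In the embodiment the master would call `pack_sync` with the Earth model's motion data once per frame and `broadcast` it over a UDP socket, and all five terminals would render from the identical state.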
A specific embodiment of the invention has been described above, but the invention is not limited thereto; various changes may be made without departing from the spirit of the invention.
Claims (4)
1. A network-based distributed synchronous rendering method, characterized in that its steps include:
Step 1: calculate the projection matrices: from the size of the real display screens and the position of the observer's head, calculate the camera position and projection matrix in Unity 3D, thereby determining the positional relationship and orientation between camera and screen; based on the real environment, set up the configuration file of the multi-projection camera module on each machine, so that the viewing angle of the displayed scene can be adjusted according to the position of the observer's head;
Step 2: set up the display terminal computers: the system contains a plurality of display terminal computers running Unity3D, each corresponding to one of an equal number of display terminals;
Step 3: set up the master control server, which synchronously controls the display content of the plurality of display terminal computers;
Step 4: update scene information: the master control server processes the preset scene display information and manipulates the corresponding objects; the system draws one image whose complete picture is jointly composed by the multiple display screens, and the master control server continuously updates the change information of the current image;
Step 5: process synchronization-object and camera information: the master control server updates the camera information according to the position coordinates of the current image, then sends the synchronization-object and camera information together with the current image information to the plurality of display terminal computers;
Step 6: adjust the camera view angle and synchronize the scene information: each display terminal computer adjusts its own camera according to the received image information and camera information and renders the corresponding scene;
wherein step 1 specifically includes: the multi-screen display terminal displays concurrently on a plurality of terminal computers, forming a large display scene with a wide viewing angle; the camera position and projection matrix in Unity3D are calculated from the size of the real display screens and the position of the observer's head; pa, pb, pc are three vertices of the projection plane, which define a rectangle in 3D space and determine its size, aspect ratio, position, and orientation, and pe denotes the camera position; having determined the positional relationship and orientation between camera and screen, the configuration file of the multi-projection camera module is set up on each machine according to the real environment;
wherein the screens of the multi-screen display terminal are arranged on an arc whose corresponding circle has radius R; H denotes the long-side length of a display screen, W its short-side length, w` its bezel width, and α the angle between two adjacent display screens;
assume a point P in three-dimensional space has coordinates P (x, y, z) in coordinate system 1; then its coordinates P` (x`, y`, z`) in coordinate system 2 can be expressed as P` = R·P + T, where T is the translation matrix and R = RxRyRz is the rotation matrix;
pe lies on the perpendicular through the center of the multi-screen display terminal, and the center of the first display screen is the origin of coordinate system 1; pe lies on the perpendicular through the center of the adjacent display screen, whose center is the origin of coordinate system 3; pe is the origin of coordinate system 2; the coordinates of pb` in coordinate system 3 are pb` (x3, y3, z3), in coordinate system 2 pb` (x2, y2, z2), and in coordinate system 1 pb` (x1, y1, z1); then:
(x2, y2, z2)ᵀ = R·(x1, y1, z1)ᵀ + T, where T = [0 0 R]ᵀ and R = RxRyRz;
(x2, y2, z2)ᵀ = R`·(x3, y3, z3)ᵀ + T`, where T` = [0 0 −R]ᵀ and R` = Rx`Ry`Rz`;
so (x1, y1, z1)ᵀ = R⁻¹·(R`·(x3, y3, z3)ᵀ + T` − T);
taking the display bezel width into account, tan(α/2) = (W/2 + w`)/R, hence the radius R = (W/2 + w`)/tan(α/2); substituting into the above formula gives the coordinates of pb` in coordinate system 1, and likewise the coordinates of every screen vertex in coordinate system 1.
2. The network-based distributed synchronous rendering method according to claim 1, characterized in that in step 1 the projection matrix computed for each display screen is saved as an XML file, and the resulting XML configuration file is deployed on the corresponding display terminal computer and loaded by the Unity3D program on that computer.
3. A network-based distributed synchronous rendering system implementing the method according to claim 1, characterized by comprising: multi-screen display terminal (100), terminal computer (200), high-speed Ethernet (300), master control server (400), router (500), and camera system (600); the multi-screen display terminal (100) is connected to the terminal computer (200) by a DP cable, and the high-speed Ethernet (300) connects the terminal computer (200), master control server (400), and router (500).
4. The network-based distributed synchronous rendering system according to claim 3, characterized in that the multi-screen display terminal (100) consists of a plurality of display screens arranged horizontally in sequence, with an angle between adjacent screens.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201611075346.3A CN106600672B (en) | 2016-11-29 | 2016-11-29 | A kind of network-based distributed synchronization rendering system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
CN106600672A CN106600672A (en) | 2017-04-26 |
CN106600672B true CN106600672B (en) | 2019-09-10 |
Family
ID=58593809
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201611075346.3A Active CN106600672B (en) | 2016-11-29 | 2016-11-29 | A kind of network-based distributed synchronization rendering system and method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN106600672B (en) |
Families Citing this family (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111124326A (en) * | 2018-10-31 | 2020-05-08 | 中兴通讯股份有限公司 | Picture display method, terminal and computer readable storage medium |
CN109725728B (en) * | 2018-12-29 | 2022-02-08 | 三星电子(中国)研发中心 | Display correction method and device of AR equipment |
CN109727315B (en) * | 2018-12-29 | 2023-08-22 | 上海曼恒数字技术股份有限公司 | One-to-many cluster rendering method, device, equipment and storage medium |
CN110471772B (en) * | 2019-08-19 | 2022-03-15 | 上海云绅智能科技有限公司 | Distributed system, rendering method thereof and client |
CN110868617B (en) * | 2019-11-27 | 2022-03-22 | 烟台职业学院 | Synchronous display method based on distributed system |
CN111240625B (en) * | 2020-01-09 | 2022-03-18 | 盾钰(上海)互联网科技有限公司 | Method and system for calculating image dynamic rendering of infinite visual boundary |
CN117149121A (en) * | 2023-09-01 | 2023-12-01 | 上海昇瑭智能科技有限公司 | Method and device for displaying panoramic multimedia resource of circular screen and electronic equipment |
CN117130573B (en) * | 2023-10-26 | 2024-02-20 | 北京世冠金洋科技发展有限公司 | Multi-screen control method, device, equipment and storage medium |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102880288A (en) * | 2012-08-20 | 2013-01-16 | 深圳市维尚视界立体显示技术有限公司 | Three-dimensional (3D) display human-machine interaction method, device and equipment |
CN102945564A (en) * | 2012-10-16 | 2013-02-27 | 上海大学 | True 3D modeling system and method based on video perspective type augmented reality |
CN202854704U (en) * | 2012-08-20 | 2013-04-03 | 深圳市维尚视界立体显示技术有限公司 | Three-dimensional (3D) displaying man-machine interaction equipment |
CN103080882A (en) * | 2010-09-09 | 2013-05-01 | 索尼公司 | Information processing device, method of processing information, and program |
CN103106679A (en) * | 2013-01-05 | 2013-05-15 | 广东威创视讯科技股份有限公司 | Method, system and platform for distributed type three-dimensional (3D) multichannel rendering |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2639690B1 (en) * | 2012-03-16 | 2017-05-24 | Sony Corporation | Display apparatus for displaying a moving object traversing a virtual display region |
- 2016-11-29: CN application CN201611075346.3A granted as patent CN106600672B (active)
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103080882A (en) * | 2010-09-09 | 2013-05-01 | 索尼公司 | Information processing device, method of processing information, and program |
CN102880288A (en) * | 2012-08-20 | 2013-01-16 | 深圳市维尚视界立体显示技术有限公司 | Three-dimensional (3D) display human-machine interaction method, device and equipment |
CN202854704U (en) * | 2012-08-20 | 2013-04-03 | 深圳市维尚视界立体显示技术有限公司 | Three-dimensional (3D) displaying man-machine interaction equipment |
CN102945564A (en) * | 2012-10-16 | 2013-02-27 | 上海大学 | True 3D modeling system and method based on video perspective type augmented reality |
CN103106679A (en) * | 2013-01-05 | 2013-05-15 | 广东威创视讯科技股份有限公司 | Method, system and platform for distributed type three-dimensional (3D) multichannel rendering |
Non-Patent Citations (2)
Title |
---|
Irregular multi-screen projection transformation matrices in CAVE systems; Liu Jinsong et al.; Computer and Modernization; 2016-09-14; pp. 40-44
Research and application of a 3D virtual city based on the Unity3D platform; Wang Xingjie et al.; Computer Technology and Development; April 2013; Vol. 23, No. 4; pp. 241-244
Also Published As
Publication number | Publication date |
---|---|
CN106600672A (en) | 2017-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN106600672B (en) | A kind of network-based distributed synchronization rendering system and method | |
CN103049927B (en) | Real time ray tracing rendering intent based on GPU cluster | |
CN102495712A (en) | Map splicing display method based on a plurality of display terminals | |
WO2021135320A1 (en) | Video generation method and apparatus, and computer system | |
CN103918012A (en) | Rendering system, rendering server, control method thereof, program, and recording medium | |
CN101189643A (en) | 3D image forming and displaying system | |
CN111754614A (en) | Video rendering method and device based on VR (virtual reality), electronic equipment and storage medium | |
US20200137267A1 (en) | Virtual video environment display systems | |
WO2013185516A1 (en) | Animation display method and apparatus for three-dimensional curve | |
JPH09244522A (en) | Method and device for undergoing virtual building | |
CN105025281B (en) | Large-size spherical screen super-definition film playing and interactive application splicing and fusing method | |
WO2023207001A1 (en) | Image rendering method and apparatus, and electronic device and storage medium | |
Zhu et al. | SAVE: shared augmented virtual environment for real-time mixed reality applications | |
CN117132699A (en) | Cloud rendering system and method based on computer | |
CN115103134A (en) | LED virtual shooting cutting synthesis method | |
Mueller et al. | Distributed Force-Directed Graph Layout and Visualization. | |
JP6719596B2 (en) | Image generation device and image display control device | |
JP2015534299A (en) | Automatic correction method of video projection by inverse transformation | |
CN106547557A (en) | A kind of multi-screen interactive exchange method based on virtual reality and bore hole 3D | |
Teubl et al. | Fastfusion: A scalable multi-projector system | |
WO2012173304A1 (en) | Graphical image processing device and method for converting a low-resolution graphical image into a high-resolution graphical image in real time | |
JP4733757B2 (en) | Polygon processing apparatus, program, and information recording medium | |
CN110837297B (en) | Information processing method and AR equipment | |
Liao et al. | Gpu parallel computing of spherical panorama video stitching | |
CN116075860A (en) | Information processing apparatus, information processing method, video distribution method, and information processing system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||