CN111354085A - Immersive interactive Box image production method - Google Patents
- Publication number: CN111354085A
- Authority: CN (China)
- Prior art keywords: panoramic, images, cameras, host, frame
- Prior art date: 2020-02-26
- Legal status: Pending (an assumption, not a legal conclusion; no legal analysis has been performed)
Classifications
- G06T17/10 — Three-dimensional [3D] modelling: constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
- G06T15/04 — 3D [Three-dimensional] image rendering: texture mapping
- G06T19/00 — Manipulating 3D models or images for computer graphics

(all under G — Physics / G06 — Computing; calculating or counting / G06T — Image data processing or generation, in general)
Abstract
The invention relates to the technical field of interactive image processing and discloses an immersive interactive Box image production method. The method comprises: shooting several groups of spherical panoramic images; converting the spherical panoramic images into panoramic sequence frames and exporting them; naming the sequence frames according to a rule; storing them as a material ball with an image-sequence map; building a cube model and subdividing it to the state the project requires; assigning three-dimensional texture coordinates by spherical mapping; creating six cameras and six groups of simulated parallel light; setting the attributes of the six cameras; rendering the six camera lenses to obtain six groups of sequence frames; synthesizing the six groups into six videos and naming them according to the rule; and writing programs so that a host computer and auxiliary computers play the six videos synchronously, with the host providing interactive selection and the auxiliary machines following the host's selection. The invention lets the user watch an optimal picture at any position inside the equipment, and at the same time realizes an interactive function under program control.
Description
Technical Field
The invention relates to the technical field of interactive image processing, in particular to an immersive interactive Box image production method.
Background
The immersive CAVE image production method creates a visually immersive environment by using video-information-processing equipment to fuse a single source or image and project the fused picture onto several display devices. This not-truly-immersive virtual demonstration environment is composed of two or more hard back-projection walls, and the user watches a three-dimensional perspective picture from an optimal observation point. Its production principle is to deform and cut a picture or film source and project it onto the screens.
Although the traditional immersive CAVE method can give the picture a three-dimensional feel, it cannot deliver a truly immersive experience: the picture can only be watched from outside the equipment; it suffers from stretching, breaking, distortion and similar artifacts; and it cannot be interacted with, so it can only be viewed, never used functionally. In terms of expression, the CAVE film source is a pseudo-stereo image produced from non-stereo material. Stretching, breaking and distortion are unavoidable, the visual effect is poor, and although flat images can present a degree of stereo effect, the result is not a true three-dimensional immersive image. In terms of viewing, the CAVE method is limited to an optimal viewing point that lies outside the equipment, so the viewer cannot freely and genuinely enter the picture; it relies on visual deception to achieve pseudo-stereoscopic imaging, can display at most five surfaces, and cannot achieve enclosed stereoscopic display (six surfaces or more). In terms of interactivity, because the CAVE method is inherently limited and cannot be programmed, it supports only playing images forward in a single direction, and user and equipment cannot interact with each other.
From the perspective of hardware, because its display principle is simple image deformation, the CAVE method must rely on edge-blending (fusion) technology, which adds substantial cost; this high production cost has kept it from being widely adopted.
For these reasons, the immersive CAVE image production method has many limitations, fails to give users a true immersive experience, and limits its own future development.
Disclosure of Invention
The invention aims to provide an immersive interactive Box image production method that gives the user a truly immersive experience, is not limited to a fixed optimal observation point, allows the optimal picture to be watched at any position inside the equipment, and at the same time realizes an interactive function under program control.
To solve this technical problem, the invention provides an immersive interactive Box image production method comprising the following steps:
s1, making or shooting a plurality of groups of spherical panoramic images;
s2, importing one of the spherical panoramic images into multimedia video editing software, converting the spherical panoramic image into a panoramic sequence frame and exporting the panoramic sequence frame;
s3, naming the obtained panoramic sequence frame regularly;
s4, importing the regularly named panoramic sequence frame into three-dimensional software, and storing the panoramic sequence frame as a material ball in an image sequence form map;
s5, establishing a cube model in three-dimensional software;
s6, subdividing the cube model to a project requirement state;
s7, giving three-dimensional texture coordinates of the subdivision cube model through spherical mapping;
s8, establishing six cameras in the three-dimensional software, wherein the axes of the cameras are perpendicular to six surfaces of the subdivision cube;
s9, establishing six groups of simulated parallel light in three-dimensional software, wherein each group of parallel light is respectively in the axial direction of six cameras;
s10, setting attributes of six cameras;
s11, rendering six camera lenses to obtain six groups of sequence frames;
s12, importing six groups of sequence frames into multimedia video editing software to synthesize six images by taking the groups as units, and exporting the images;
s13, naming the six images regularly;
s14, writing a C++ program so that the host computer and the auxiliary computers synchronously play the six regularly named videos;
and S15, writing a C++ program to realize the host's interactive selection-and-play function and the auxiliary machines' function of synchronously following the host's selection.
Preferably, the plurality of spherical panoramic images in step S1 are processed in steps S2-S13.
Preferably, in step S2 the multimedia video editing software is Adobe After Effects; in step S4 the three-dimensional software is Autodesk Maya; and in step S14 the computers are computers with discrete graphics cards.
Preferably, in step S3, the number of a panoramic sequence-frame set is defined as the combination of the front (starting) camera-position letter and the rear (ending) camera-position letter.
Preferably, in step S6, the height, width and depth of the cube model are subdivided to the number the project requires, ensuring that there are enough texture coordinates to keep the cube shape when projected.
Preferably, in step S7, the mapping adjusts the projected horizontal and vertical scan values according to the height-width-depth ratio of the cube model; coordinate points that fall outside the texture-coordinate creation area during mapping are corrected so that all points lie inside the creation area while the overall proportion of the texture coordinates is preserved; and the vertices on the four sides are snapped to the outer frame edge of the creation area so that the four edges butt seamlessly left, right, up and down.
Preferably, in step S10, the camera frame rate is set: the frame rates of the six cameras are adjusted to match the frame rate of the captured video so that all six are consistent. The camera attributes are set so that each camera's orthographic-view lens frame exceeds the corresponding box face; the render resolution is set one quarter higher than that of the captured image; the render duration equals the duration of the captured footage; and production-level rendering quality is selected.
Preferably, in step S13, each image is named by taking the name of the panoramic video or single-frame panorama and appending _x, where x is the first letter of the English word for the corresponding azimuth.
Preferably, in step S14, six computers are required: one serves as the control host and the other five as auxiliary machines controlled by the host. The host runs a parent program and each auxiliary machine runs a subprogram; the host's parent program makes the decisions, while the subprograms synchronize with it without deciding anything themselves — they only upload data for the parent program to evaluate and then carry out the commands it issues. The parent program checks whether the frame number of the video each auxiliary machine is playing matches the host's frame number, and corrects it if not.
Preferably, in step S15, four basic commands are executed — forward play, reverse play, select and pause — and playback starts from the starting point.
The immersive interactive Box image production method integrates computer coding, virtual reality and three-dimensional spatial image-processing technology. It gives the user a truly immersive experience, is not limited to a fixed optimal observation point, allows the optimal picture to be watched at any position inside the equipment, and realizes an interactive function under program control. It brings a better experience to the user and greatly extends the possibilities of the technology, making it expandable in the future.
Drawings
FIG. 1 is a schematic diagram of the rule naming of video sequence frames in step S3 of the immersive interactive Box image production method according to the embodiment of the present invention;
FIG. 2 is a schematic diagram of the rule naming of single-frame panoramas in step S3 of the immersive interactive Box image production method according to the embodiment of the present invention;
FIG. 3 is a schematic diagram of the texture coordinates in step S7 of the immersive interactive Box image production method according to the embodiment of the present invention;
FIG. 4 is a schematic diagram of the camera arrangement in step S8 of the immersive interactive Box image production method according to the embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described in further detail below with reference to the drawings and examples. The following examples illustrate the invention but are not intended to limit its scope.
The immersive interactive Box image production method of this embodiment is an overall technical scheme based on the combined application of computer coding, virtual reality and stereoscopic-space image processing. A panoramic video or image, produced or shot, is taken through three-dimensional modelling, mapping, light simulation, rendering, classification and integration, and program control, in the following steps:
s1, making or shooting a plurality of groups of spherical panoramic images;
for video production, several video sections must be shot or produced along a preset linear route, with preset camera positions and a main-lens (front-lens) direction;
the plurality of spherical panoramic images are processed in steps S2-S13.
S2, importing one of the spherical panoramic images into multimedia video editing software, converting the spherical panoramic image into a panoramic sequence frame and exporting the panoramic sequence frame;
in this embodiment, the multimedia video editing software is Adobe After Effects, the three-dimensional software in step S4 is Autodesk Maya, and the computers in step S14 are computers with discrete graphics cards.
S3, naming the obtained panoramic sequence frame regularly;
the panoramic sequence frame set number is defined as a combination of a front lens position number and a rear lens position number;
as shown in fig. 1, the video segment edited from camera position A to position B is named AB; the panoramic sequence frames from B to C and from B to D are named BC and BD, and so on. The shooting frame rate (25-60 frames/second) is then set according to the computer hardware used in the later stages of the project, and the video is produced after this setting. Single-frame images are named with the letter of the preset camera position: as shown in fig. 2, the single-frame panorama ball obtained at position A is named A, and those obtained at positions B, C and D are named B, C and D. All finished material is placed in the same folder.
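The naming rule above can be sketched as two tiny helpers — a minimal illustration in C++ (the language the patent itself uses); the function names are our own, not the patent's:

```cpp
#include <string>

// Segment naming rule from step S3: a video shot from camera position A
// toward position B is named "AB" (front-position letter + rear-position letter).
std::string segmentName(const std::string& fromPos, const std::string& toPos) {
    return fromPos + toPos;   // e.g. "B" + "C" -> "BC"
}

// A single-frame panorama ball keeps the letter of the position where it was taken.
std::string stillName(const std::string& pos) {
    return pos;               // e.g. position "A" -> "A"
}
```

Under this rule the "and so on" in the text simply means every route edge gets the concatenation of its endpoint letters.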
S4, importing the regularly named panoramic sequence frame into three-dimensional software, and storing the panoramic sequence frame as a material ball in an image sequence form map;
the obtained panoramic sequence-frame set, or fixed-point panorama-ball picture, is imported into the three-dimensional software and stored in a named material ball as an image-sequence map.
S5, establishing a cube model in three-dimensional software;
a cube model is built in the three-dimensional software (the model sits at the world centre point; the length-width-height proportions are not restricted, nor is each angle required to be a 90-degree right angle).
S6, subdividing the cube model to a project requirement state;
the height, width and depth of the cube model are subdivided to the number the project requires, ensuring there are enough texture coordinates to keep the cube shape when projected (the texture-coordinate creation range of a three-dimensional model is a square area).
S7, giving three-dimensional texture coordinates of the subdivision cube model through spherical mapping;
as shown in fig. 3, the mapping adjusts the projected horizontal and vertical scan values according to the height-width-depth ratio of the cube model. Coordinate points that fall outside the texture-coordinate creation area during mapping are corrected so that all points lie inside the creation area while the overall proportion of the texture coordinates is preserved, and the vertices on the four sides are snapped to the outer frame edge line of the creation area so that the four edges butt seamlessly left, right, up and down.
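The two corrections described here — pulling stray UV points back into the creation area, and snapping border vertices exactly onto the frame edge — can be sketched as follows. This is an illustrative simplification, not the patent's actual implementation; the snap threshold is an assumption:

```cpp
#include <algorithm>

// A texture coordinate in the creation area [0,1] x [0,1].
struct UV { double u; double v; };

// Correct one UV point: clamp it into the creation area, then snap
// near-border vertices onto the outer frame edge so the four sides
// butt seamlessly when the faces are placed next to each other.
UV correctUV(UV p, double snapEps = 0.001) {
    // Pull points projected outside the unit square back inside.
    p.u = std::min(1.0, std::max(0.0, p.u));
    p.v = std::min(1.0, std::max(0.0, p.v));
    // Snap vertices that land within snapEps of a border exactly onto it.
    if (p.u < snapEps)       p.u = 0.0;
    if (p.u > 1.0 - snapEps) p.u = 1.0;
    if (p.v < snapEps)       p.v = 0.0;
    if (p.v > 1.0 - snapEps) p.v = 1.0;
    return p;
}
```

Preserving the overall proportion of the texture coordinates (the other requirement in the text) would additionally scale all points uniformly before this per-point correction.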
S8, establishing six cameras in the three-dimensional software, wherein the axes of the cameras are perpendicular to six surfaces of the subdivision cube;
as shown in FIG. 4, six cameras are created in the three-dimensional software; each camera lens is perpendicular to one face of the square box, with the camera axis passing through the centre point of that face.
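Geometrically, the six camera axes are simply the six face normals of the cube. A minimal sketch (names are illustrative; the patent does not prescribe an axis convention):

```cpp
#include <array>

struct Vec3 { double x, y, z; };

// One camera per cube face, each aimed along a face normal so its axis
// is perpendicular to that face and passes through the face centre.
std::array<Vec3, 6> sixCameraAxes() {
    return {{
        Vec3{ 1, 0, 0}, Vec3{-1, 0, 0},   // right / left faces
        Vec3{ 0, 1, 0}, Vec3{ 0,-1, 0},   // top / bottom faces
        Vec3{ 0, 0, 1}, Vec3{ 0, 0,-1}    // front / back faces
    }};
}

// Two axes are perpendicular exactly when their dot product is zero.
double dot(const Vec3& a, const Vec3& b) {
    return a.x * b.x + a.y * b.y + a.z * b.z;
}
```

Any axis is perpendicular to the four neighbouring faces' axes and anti-parallel to the opposite face's axis, which is why six cameras cover the box with no overlap in orthographic view.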
S9, establishing six groups of simulated parallel light in three-dimensional software, wherein each group of parallel light is respectively in the axial direction of six cameras;
as shown in fig. 4, six parallel lights are created, each one perpendicular to a camera axis removed — that is, each light shines along a camera axis and illuminates the corresponding face one-to-one.
S10, setting attributes of six cameras;
the camera frame rate is set: the frame rates of the six cameras are adjusted to match the frame rate of the captured video so that all six are consistent. The camera attributes are set so that each camera's orthographic-view (parallel picture without perspective) lens frame exceeds the corresponding box face; the render resolution is set one quarter higher than the pixels of the captured image; the render duration equals the duration of the captured and produced footage; and production-level rendering quality is selected.
And S11, rendering six camera lenses to obtain six groups of sequence frames.
S12, importing six groups of sequence frames into multimedia video editing software to synthesize six images by taking the groups as units, and exporting the images;
the six groups of sequence frames rendered by the six cameras are imported, group by group, into the multimedia video editing software; each is flipped horizontally and vertically, the redundant parts are cropped away, and the images are synthesized.
S13, naming the six images regularly;
each image is named by taking the name of the panoramic video or single-frame panorama and appending _x, where x is the first letter of the English word for the corresponding azimuth.
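As a sketch, the suffix rule is just string concatenation; which azimuth words are used (front, back, up, and so on) is our assumption, since the patent only says "the first letter of the English word for the azimuth":

```cpp
#include <string>

// Step-S13 naming rule: base name of the source panorama plus "_x",
// where x is the first letter of the azimuth's English word.
std::string faceImageName(const std::string& baseName, const std::string& azimuth) {
    return baseName + "_" + azimuth.substr(0, 1);   // "AB" + "front" -> "AB_f"
}
```

So the six face videos rendered from segment AB would be named AB_f, AB_b, AB_l, AB_r, AB_u and AB_d under this assumed azimuth vocabulary.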
S14, writing a C++ program so that the host computer and the auxiliary computers synchronously play the six regularly named videos;
six computers with discrete graphics cards are needed: one serves as the control host and the other five as auxiliary machines controlled by the host. The host runs a parent program and each auxiliary machine runs a subprogram. The host's parent program makes the decisions; the subprograms stay synchronized with it but decide nothing themselves — they only upload data for the parent program to evaluate and then execute the commands it issues. The parent program checks whether the frame number of the video each auxiliary machine is playing matches the host's frame number, and corrects it if not. For example: if the host is playing frame 15 at 1 minute 30 seconds while auxiliary machine L has only reached frame 14, the host commands machine L to skip its frame 15 and directly play frame 16 on the next frame, in sync with the host's frame 16, so that the correction stays below what the naked eye can detect.
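The correction in the example reduces to one decision per tick on the host side — skip ahead when a slave lags, advance normally otherwise. A minimal sketch (function name and protocol details are illustrative, not the patent's program):

```cpp
// Given the host's current frame and a lagging/aligned slave's current frame,
// return the frame the slave should display next. A slave that has fallen
// behind skips the missed frames and resynchronizes on the host's next frame,
// rather than playing them late.
long nextSlaveFrame(long hostFrame, long slaveFrame) {
    if (slaveFrame < hostFrame) {
        // e.g. host at 15, slave at 14: skip ahead, play 16 together.
        return hostFrame + 1;
    }
    // Already synchronized: advance normally.
    return slaveFrame + 1;
}
```

In practice the slave would also upload its frame counter each tick so the parent program has the data to make this decision, matching the upload-then-command flow described above.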
S15, writing a C++ program to realize the host's interactive selection-and-play function and the auxiliary machines' function of synchronously following the host's selection;
four basic commands are executed — forward play, reverse play, select and pause — and playback starts from the starting point. Take video set AB: before playback, the host picture displays an interactive icon asking whether to go to point B; choosing B starts playing video AB. When AB finishes, it freezes on the last frame and the host picture displays interactive icons for going to point C, going to point D, or returning to point A: selecting C plays BC, selecting D plays BD, and so on. If, having selected C, the user wants to return to point B, BC is played in reverse; when the reverse playback ends, the system switches back to the choice shown when forward playback of AB ended. If the user continues back to point A, AB is played in reverse, returning to the starting point, where the host again displays the icon asking whether to go to point B. During forward or reverse playback, pause can be selected; when play is selected again, the previous action continues — a paused forward playback resumes forward from the pause position, and reverse playback follows the same principle.
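The interaction loop above is a small state machine over named segments. A minimal sketch of the four basic commands — the state layout is illustrative, not the patent's actual program:

```cpp
#include <string>

enum class Mode { Forward, Reverse, Paused };

// Toy player over named video segments ("AB", "BC", ...). Pause remembers
// the previous direction so resume continues the previous action, matching
// the behaviour described in the text.
struct Player {
    std::string segment;              // currently loaded segment, e.g. "AB"
    long frame = 0;                   // current frame within the segment
    Mode mode = Mode::Paused;
    Mode beforePause = Mode::Forward; // direction to restore on resume

    void playForward(const std::string& seg) { segment = seg; mode = Mode::Forward; }
    void playReverse() { mode = Mode::Reverse; } // e.g. returning from C to B plays BC backwards
    void pause()  { beforePause = mode; mode = Mode::Paused; }
    void resume() { mode = beforePause; }        // continue the previous action
    void tick() {                                // advance one frame per tick
        if (mode == Mode::Forward) ++frame;
        else if (mode == Mode::Reverse && frame > 0) --frame;
    }
};
```

The "select" command corresponds to calling playForward with the segment the user's icon choice names (BC, BD, ...); freezing on the last frame is simply the player reaching the segment's end while the host shows the next icons.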
All of the basic technical principles above can be applied in various visual-information-processing applications to demonstrate the effects the invention can achieve. The user can have a realistic immersive experience at any location inside the cubic display medium.
The immersive interactive Box image production method integrates computer coding, virtual reality and three-dimensional spatial image-processing technology. It gives the user a truly immersive experience, is not limited to a fixed optimal observation point, allows the optimal picture to be watched at any position inside the equipment, and realizes an interactive function under program control. It brings a better experience to the user and greatly extends the possibilities of the technology, making it expandable in the future.
The immersive interactive Box image production method breaks through the two-dimensional production limit. It adopts a mainly three-dimensional production mode, abandoning the two-dimensional mode of the immersive CAVE method, and truly realizes three-dimensional spatial display and an absolute immersive experience. Compared with the CAVE method, the Box method produces a real three-dimensional immersive image in a three-dimensional way, whereas the CAVE method deforms flat images — an essential difference: the Box method makes stereo images, while the CAVE method performs flat visual deception. The Box method has no optimal-viewpoint limit and no limit on the number of projection planes; the user can stand inside the image and view it in every direction without dead angles, and because there is no viewpoint limit, any direction and any position inside the display system gives the same effect, so multiple people can watch simultaneously. Three-dimensional technology, two-dimensional technology and the C++ language are combined in production: the image-making process uses several functional systems — the three-dimensional model system, material system, texture-coordinate system, light-simulation system and rendering system, plus two-dimensional graphics and video technology — together with C++ programming, making the production method a multi-step, multi-system cooperative mode that lays a technical foundation for truly achieving three-dimensional spatial display. The immersive Box image is interactive: it solves the long-standing inability of traditional cave-type display systems to offer interaction, fully connects the immersive display system with interactivity, gives future non-wearable immersive display systems an interactive function applicable to many more industries, breaks the bottleneck that restricted traditional cave-type systems to entertainment and publicity display, and opens a wider market.
In terms of economic benefit, because the invention needs no edge-blending (fusion) technology, a non-wearable immersive interactive display system can be built even from several small screens or the lowest-grade projectors, directly reducing hardware cost and bringing more users, cooperation and business opportunities to the field.
The embodiments of the present invention have been presented for purposes of illustration and description; they are not intended to be exhaustive or to limit the invention to the forms disclosed. Many modifications and variations will be apparent to those of ordinary skill in the art. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, and to enable others of ordinary skill in the art to understand the invention in its various embodiments with the various modifications suited to the particular use contemplated.
Claims (10)
1. An immersive interactive Box image production method, characterized by comprising the following steps:
s1, making or shooting a plurality of groups of spherical panoramic images;
s2, importing one of the spherical panoramic images into multimedia video editing software, converting the spherical panoramic image into a panoramic sequence frame and exporting the panoramic sequence frame;
s3, naming the obtained panoramic sequence frame regularly;
s4, importing the regularly named panoramic sequence frame into three-dimensional software, and storing the panoramic sequence frame as a material ball in an image sequence form map;
s5, establishing a cube model in three-dimensional software;
s6, subdividing the cube model to a project requirement state;
s7, giving three-dimensional texture coordinates of the subdivision cube model through spherical mapping;
s8, establishing six cameras in the three-dimensional software, wherein the axes of the cameras are perpendicular to six surfaces of the subdivision cube;
s9, establishing six groups of simulated parallel light in three-dimensional software, wherein each group of parallel light is respectively in the axial direction of six cameras;
s10, setting attributes of six cameras;
s11, rendering six camera lenses to obtain six groups of sequence frames;
s12, importing six groups of sequence frames into multimedia video editing software to synthesize six images by taking the groups as units, and exporting the images;
s13, naming the six images regularly;
s14, writing a C++ program so that the host computer and the auxiliary computers synchronously play the six regularly named videos;
and S15, writing a C++ program to realize the host's interactive selection-and-play function and the auxiliary machines' function of synchronously following the host's selection.
2. The method for immersive interactive Box movie production according to claim 1, wherein the sets of spherical panoramic images in step S1 are processed through steps S2-S13.
3. The immersive interactive Box image production method of claim 1, wherein in step S2 the multimedia video editing software is Adobe After Effects, in step S4 the three-dimensional software is Autodesk Maya, and in step S14 the computers are computers with discrete graphics cards.
4. The immersive interactive Box image production method of claim 1, wherein in step S3 the number of a panoramic sequence-frame set is defined as the combination of the front camera-position letter and the rear camera-position letter.
5. The immersive interactive Box image production method of claim 1, wherein in step S6 the height, width and depth of the cube model are subdivided to the number the project requires, ensuring that there are enough texture coordinates to maintain the cube shape when projected.
6. The immersive interactive Box image production method of claim 1, wherein in step S7 the mapping adjusts the projected horizontal and vertical scan values according to the height-width-depth ratio of the cube model, corrects coordinate points that exceed the texture-coordinate creation area during mapping so that all coordinate points fall within the creation area while the overall proportion of the texture coordinates is preserved, and snaps the vertices of the four sides to the outer frame edge of the creation area so that the four edges butt seamlessly left, right, up and down.
7. The immersive interactive Box image production method of claim 1, wherein in step S10 the camera frame rate is set, the frame rates of the six cameras being adjusted to match the frame rate of the captured video so that all six are consistent; the camera attributes are set so that each camera's orthographic-view lens frame exceeds the corresponding box face; the render resolution is set one quarter higher than the pixels of the captured image; the render duration equals the duration of the captured footage; and production-level rendering quality is selected.
8. The immersive interactive Box movie production method of claim 1, wherein in step S13, the naming appends "_x" to the original name of the panoramic video or single-frame panorama, where x is the first letter of the English azimuth word.
9. The method of claim 1, wherein in step S14, six computers are required: one serves as the control host and five as auxiliary machines. The host controls the auxiliaries through a parent program, while each auxiliary runs a subprogram. The host parent program is authoritative; the auxiliary subprograms stay synchronized with it but make no decisions of their own, only uploading data for the parent program to evaluate and to issue subprogram commands. The parent program checks whether the playback frame number of each auxiliary-controlled video matches the host's playback frame number and corrects any drift.
10. The immersive interactive Box movie production method of claim 1, wherein in step S15, the basic commands of forward play, reverse play, select, pause, and play-from-start are executed.
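The UV correction in claims 5-6 can be illustrated with a minimal sketch: texture-coordinate points that fall outside the "creation area" (assumed here to be the unit square) are clamped back inside, so that every point lies within the area and boundary vertices sit exactly on its outer border. The function name and the unit-square assumption are illustrative, not taken from the patent text.

```python
# Minimal sketch of the texture-coordinate correction described in the
# claims: any (u, v) point outside the assumed [0, 1] creation area is
# clamped onto its border so all points lie inside the area.

def clamp_uv(points):
    """Clamp each (u, v) pair into the unit square."""
    return [(min(max(u, 0.0), 1.0), min(max(v, 0.0), 1.0))
            for u, v in points]
```

Clamping (rather than wrapping) keeps boundary vertices attached to the outer border, which is what allows the left/right and top/bottom edges to butt seamlessly.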
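The naming rule in claim 8 (original name plus "_x", where x is the first letter of the English azimuth word) can be sketched as below. The specific azimuth set (front/back/left/right/up/down, matching the six Box faces) is an assumption for illustration.

```python
# Hypothetical sketch of the claim-8 naming rule: append "_" plus the
# first letter of the English azimuth word to the panorama's base name.
# The azimuth vocabulary below is assumed, not quoted from the patent.

AZIMUTHS = ("front", "back", "left", "right", "up", "down")

def panorama_name(base: str, azimuth: str) -> str:
    """Return e.g. 'scene01' + 'front' -> 'scene01_f'."""
    if azimuth not in AZIMUTHS:
        raise ValueError(f"unknown azimuth: {azimuth}")
    return f"{base}_{azimuth[0]}"
```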
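The host/auxiliary synchronization in claim 9 amounts to a simple authoritative-master loop: each auxiliary subprogram only reports its current playback frame, and the host parent program decides whether to issue a correction command. The function below is a sketch under that reading; the command format and drift tolerance are assumptions.

```python
# Hypothetical sketch of the claim-9 synchronization check: the host
# parent program compares an auxiliary's reported frame against its own
# and, if the drift exceeds a tolerance, issues a seek command telling
# the auxiliary to jump to the host's frame.

def sync_command(host_frame: int, aux_frame: int, tolerance: int = 1):
    """Return None if the auxiliary is in sync, else a correction command."""
    if abs(host_frame - aux_frame) <= tolerance:
        return None
    return {"cmd": "seek", "frame": host_frame}
```

Keeping all decision logic in the host matches the claim's statement that subprograms are "synchronized but not determined": they upload data and execute commands, nothing more.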
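The basic playback commands in claim 10 (forward play, reverse play, pause, play-from-start) can be modeled as a tiny state machine; the class and method names are illustrative only.

```python
# Hypothetical sketch of the claim-10 playback commands as a state
# machine: direction is +1 (forward), -1 (reverse) or 0 (paused), and
# tick() advances the current frame, clamped to the video's range.

class Player:
    def __init__(self, total_frames: int):
        self.total = total_frames
        self.frame = 0
        self.direction = 0  # +1 forward, -1 reverse, 0 paused

    def play_forward(self):
        self.direction = 1

    def play_reverse(self):
        self.direction = -1

    def pause(self):
        self.direction = 0

    def play_from_start(self):
        self.frame, self.direction = 0, 1

    def tick(self):
        """Advance one frame in the current direction, staying in range."""
        self.frame = min(max(self.frame + self.direction, 0), self.total - 1)
```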
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010118481.1A CN111354085A (en) | 2020-02-26 | 2020-02-26 | Immersive interactive Box image production method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010118481.1A CN111354085A (en) | 2020-02-26 | 2020-02-26 | Immersive interactive Box image production method |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111354085A true CN111354085A (en) | 2020-06-30 |
Family
ID=71195862
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010118481.1A Pending CN111354085A (en) | 2020-02-26 | 2020-02-26 | Immersive interactive Box image production method |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111354085A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112184907A (en) * | 2020-10-27 | 2021-01-05 | 中图云创智能科技(北京)有限公司 | Space moving method of three-dimensional scene |
CN114168101A (en) * | 2021-12-01 | 2022-03-11 | 北京凯视达科技股份有限公司 | Method and device for pasting spherical panoramic video to cubic screen for display |
CN114449169A (en) * | 2022-01-27 | 2022-05-06 | 中影电影数字制作基地有限公司 | Cutting method and system for displaying panoramic video in CAVE space |
CN117173378A (en) * | 2023-11-03 | 2023-12-05 | 成都泰盟软件有限公司 | CAVE environment-based WebVR panoramic data display method, device, equipment and medium |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101938599A (en) * | 2009-06-30 | 2011-01-05 | 爱国者全景(北京)网络科技发展有限公司 | Method for generating interactive dynamic panoramic image |
CN105939481A (en) * | 2016-05-12 | 2016-09-14 | 深圳市望尘科技有限公司 | Interactive three-dimensional virtual reality video program recorded broadcast and live broadcast method |
CN106527857A (en) * | 2016-10-10 | 2017-03-22 | 成都斯斐德科技有限公司 | Virtual reality-based panoramic video interaction method |
US20170092008A1 (en) * | 2015-09-24 | 2017-03-30 | California Institute Of Technology | Systems and Methods for Data Visualization Using Three-Dimensional Displays |
CN108174174A (en) * | 2017-12-29 | 2018-06-15 | 暴风集团股份有限公司 | VR image display methods, device and terminal |
CN111047711A (en) * | 2019-12-16 | 2020-04-21 | 山东东艺数字科技有限公司 | Immersive interactive Box image manufacturing method |
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101938599A (en) * | 2009-06-30 | 2011-01-05 | 爱国者全景(北京)网络科技发展有限公司 | Method for generating interactive dynamic panoramic image |
US20170092008A1 (en) * | 2015-09-24 | 2017-03-30 | California Institute Of Technology | Systems and Methods for Data Visualization Using Three-Dimensional Displays |
CN105939481A (en) * | 2016-05-12 | 2016-09-14 | 深圳市望尘科技有限公司 | Interactive three-dimensional virtual reality video program recorded broadcast and live broadcast method |
CN106527857A (en) * | 2016-10-10 | 2017-03-22 | 成都斯斐德科技有限公司 | Virtual reality-based panoramic video interaction method |
CN108174174A (en) * | 2017-12-29 | 2018-06-15 | 暴风集团股份有限公司 | VR image display methods, device and terminal |
CN111047711A (en) * | 2019-12-16 | 2020-04-21 | 山东东艺数字科技有限公司 | Immersive interactive Box image manufacturing method |
Non-Patent Citations (1)
Title |
---|
苏柳: "Frame Rendering in Panoramic Playback Technology" (全景播放技术中的帧渲染) * |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112184907A (en) * | 2020-10-27 | 2021-01-05 | 中图云创智能科技(北京)有限公司 | Space moving method of three-dimensional scene |
CN114168101A (en) * | 2021-12-01 | 2022-03-11 | 北京凯视达科技股份有限公司 | Method and device for pasting spherical panoramic video to cubic screen for display |
CN114449169A (en) * | 2022-01-27 | 2022-05-06 | 中影电影数字制作基地有限公司 | Cutting method and system for displaying panoramic video in CAVE space |
CN114449169B (en) * | 2022-01-27 | 2023-11-17 | 中影电影数字制作基地有限公司 | Clipping method and system for showing panoramic video in CAVE space |
CN117173378A (en) * | 2023-11-03 | 2023-12-05 | 成都泰盟软件有限公司 | CAVE environment-based WebVR panoramic data display method, device, equipment and medium |
CN117173378B (en) * | 2023-11-03 | 2024-02-02 | 成都泰盟软件有限公司 | CAVE environment-based WebVR panoramic data display method, device, equipment and medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN111354085A (en) | Immersive interactive Box image production method | |
US10121284B2 (en) | Virtual camera control using motion control systems for augmented three dimensional reality | |
US7868847B2 (en) | Immersive environments with multiple points of view | |
JP3992045B2 (en) | Video signal processing apparatus and method, and virtual reality generation apparatus | |
US20050185711A1 (en) | 3D television system and method | |
US20090195643A1 (en) | Medial axis decomposition of 2d objects to synthesize binocular depth | |
CN106527857A (en) | Virtual reality-based panoramic video interaction method | |
CN111047711B (en) | Immersive interactive Box image manufacturing method | |
CN102692808A (en) | Large-scene 360-degree panorama dynamic display method, and display system | |
JP2006094458A (en) | Video signal processor, virtual reality creating apparatus, and recording medium | |
KR101340598B1 (en) | Method for generating a movie-based, multi-viewpoint virtual reality and panoramic viewer using 3d surface tile array texture mapping | |
Naimark | Elements of real-space imaging: a proposed taxonomy | |
Yoshida | fVisiOn: glasses-free tabletop 3D display to provide virtual 3D media naturally alongside real media | |
JP5016648B2 (en) | Image processing method for 3D display device having multilayer structure | |
Balogh et al. | HoloVizio-True 3D display system | |
Schratt et al. | The potential of three-dimensional display technologies for the visualization of geo-virtual environments | |
CN103024414A (en) | Three dimensional (3D) display method based on WinXP system | |
WO2010098159A1 (en) | Stereoscopic display device | |
JPH01295296A (en) | Production of stereoscopic variable picture element forming sheet | |
Chung | A study of immersive display technologies | |
Sun et al. | Combining 360° video and camera mapping for virtual reality: an innovative solution |
CN110430454B (en) | Multi-device real-time interactive display method and device | |
CN113676731A (en) | Method for compressing VR video data | |
US20220122216A1 (en) | Generating and processing an image property pixel structure | |
CN111338095A (en) | Naked eye 3D system based on multi-screen folding screen |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | | |
SE01 | Entry into force of request for substantive examination | | |
RJ01 | Rejection of invention patent application after publication | | Application publication date: 20200630 |