CN112305766A - Immersive visual experience method and system - Google Patents

Immersive visual experience method and system

Info

Publication number
CN112305766A
CN112305766A
Authority
CN
China
Prior art keywords
fov
matrix
virtual
virtual camera
asymmetric
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011237447.2A
Other languages
Chinese (zh)
Inventor
王珏
代开天
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai Yueying Technology Co ltd
Original Assignee
Shanghai Yueying Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai Yueying Technology Co ltd filed Critical Shanghai Yueying Technology Co ltd
Priority to CN202011237447.2A priority Critical patent/CN112305766A/en
Publication of CN112305766A publication Critical patent/CN112305766A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses an immersive visual experience method, which comprises the following steps: generating an FOV matrix according to the position of a virtual camera in a virtual environment, the FOV matrix being an asymmetric matrix; and projecting material in the virtual environment into the virtual camera using the FOV matrix. The invention also discloses an immersive visual experience system, which comprises a virtual camera and a virtual environment. The virtual camera is arranged in the virtual environment and projects material from the virtual environment into the virtual camera using an FOV matrix. The system also comprises an asymmetric FOV matrix generation module, which is used for generating the FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix. The technical effect of the invention is that it avoids the original situation in which the picture is symmetric while the person's position is asymmetric, so that the picture and the position do not match, thereby improving the immersion of the visual experience.

Description

Immersive visual experience method and system
Technical Field
The invention relates to the field of virtual reality, in particular to an immersive visual experience method and system.
Background
With the continuous development and improvement of panoramic experience technology, the problems of the original panoramic experience have been increasingly exposed, and the sense of incongruity produced by the original FOV is particularly prominent.
For example, to achieve a 1:1 restoration of a room, the picture should change according to the viewer's position as the person moves about the room. However, the FOV in the current UE4 is only 90 degrees and assumes that the viewer stands at the center of the picture; if the viewer does not, the picture and the person's position do not match, and there is no sense of immersion. Moreover, because the FOVs are all symmetric, the picture is symmetric while the person stands asymmetrically; that is, the picture and the position do not match.
Disclosure of Invention
In order to solve the technical problems, the invention provides an immersive visual experience method and system, and the specific technical scheme is as follows:
in one aspect, a method of immersive visual experience is provided, comprising:
generating a FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix;
projecting material in the virtual environment into the virtual camera using the FOV matrix.
Preferably, the virtual camera and the virtual environment are initialized using the Unreal Engine.
Preferably, the method further comprises the following steps: setting the bUseCustomProjectionMatrix variable of USceneCaptureComponent2D to true;
the default FOV matrix is replaced with an asymmetric FOV matrix.
Preferably, when the position of the virtual camera in the virtual environment changes, the FOV matrix is updated according to the new position of the virtual camera.
Preferably, the generating the FOV matrix according to the position of the virtual camera in the virtual environment comprises:
acquiring the position of the virtual camera to obtain a virtual position;
calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment;
and constructing the FOV matrix according to the left distance, the right distance, the upper distance and the lower distance.
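The patent does not give the exact entries of the FOV matrix, but the construction it describes corresponds to a standard off-axis (glFrustum-style) perspective projection. The following is a minimal sketch under that assumption; the function name and the near/far plane parameters are illustrative, not from the patent:

```cpp
#include <array>

// Row-major 4x4 matrix.
using Mat4 = std::array<std::array<double, 4>, 4>;

// Hypothetical sketch: build an off-axis projection matrix from the signed
// left/right/bottom/top extents (l, r, b, t) of the view frustum at the
// near plane n, with far plane f. When l == -r and b == -t the off-center
// terms vanish and the matrix reduces to the ordinary symmetric FOV matrix.
Mat4 MakeAsymmetricFovMatrix(double l, double r, double b, double t,
                             double n, double f) {
    Mat4 m{};  // all entries start at zero
    m[0][0] = 2.0 * n / (r - l);
    m[0][2] = (r + l) / (r - l);       // vanishes when l == -r
    m[1][1] = 2.0 * n / (t - b);
    m[1][2] = (t + b) / (t - b);       // vanishes when b == -t
    m[2][2] = -(f + n) / (f - n);
    m[2][3] = -2.0 * f * n / (f - n);
    m[3][2] = -1.0;
    return m;
}
```

A centered viewer produces zero off-center terms, matching the symmetric special case the description treats as L + R = 0, T + B = 0.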
The invention also provides an immersive visual experience system comprising: a virtual camera and a virtual environment;
the virtual camera is arranged in the virtual environment and projects material from the virtual environment into the virtual camera using an FOV matrix. The system also comprises an asymmetric FOV matrix generation module, which is used for generating the FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix.
Preferably, the virtual camera and the virtual environment are constructed based on the Unreal Engine.
Preferably, an FOV matrix replacement module is further included, which sets the bUseCustomProjectionMatrix variable of USceneCaptureComponent2D to true and replaces the default FOV matrix with the asymmetric FOV matrix generated by the asymmetric FOV matrix generation module.
Preferably, the system further comprises a virtual camera position monitoring module, configured to notify the asymmetric FOV matrix generation module to update the FOV matrix according to the new position of the virtual camera when the position of the virtual camera in the virtual environment changes.
Preferably, the asymmetric FOV matrix generation module comprises:
the virtual camera position acquisition module is used for acquiring the virtual position of the virtual camera in the virtual environment;
the distance calculation module is used for calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment;
and the matrix construction module is used for constructing the FOV matrix according to the left distance, the right distance, the upper distance and the lower distance.
The invention has the technical effects that: the asymmetric FOV matrix is used for replacing the original symmetric FOV matrix so as to realize the original technical scheme, and the FOV is asymmetric, so that the visual field is asymmetric relative to the original symmetric FOV matrix, the situations of original symmetric pictures, asymmetric human station positions and unmatched pictures and positions are effectively avoided, and the visual experience immersion is improved.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without inventive exercise.
FIG. 1 is a schematic flow chart of an embodiment 1 of the immersive visual experience method of the present invention;
FIG. 2 is a flow chart of an embodiment 2 of the immersive visual experience method of the present invention;
FIG. 3 is a schematic structural diagram of an embodiment 3 of an immersive visual experience system of the present invention;
FIGS. 4 and 5 are schematic diagrams of a conventional symmetric matrix;
fig. 6 is a schematic diagram of the principle of the asymmetric matrix of the present invention.
FIGS. 7 and 8 are schematic diagrams illustrating effects of a conventional symmetric matrix;
fig. 9 is a schematic diagram of the effect of the asymmetric matrix of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. However, it will be apparent to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
For the sake of simplicity, the drawings only schematically show the parts relevant to the present invention, and they do not represent the actual structure as a product. In addition, in order to make the drawings concise and understandable, components having the same structure or function in some of the drawings are only schematically depicted, or only one of them is labeled. In this document, "one" means not only "only one" but also a case of "more than one".
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
In addition, in the description of the present application, the terms "first", "second", and the like are used only for distinguishing the description, and are not intended to indicate or imply relative importance.
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the following description will be made with reference to the accompanying drawings. It is obvious that the drawings in the following description are only some examples of the invention, and that for a person skilled in the art, other drawings and embodiments can be derived from them without inventive effort.
Example 1:
as shown in fig. 1, the present embodiment provides an immersive visual experience method, including:
s2: generating a FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix;
s4: projecting material in the virtual environment into the virtual camera using the FOV matrix.
In this embodiment, an asymmetric FOV matrix is used instead of the original symmetric FOV matrix. Since the FOV is asymmetric, the field of view is asymmetric compared with the original symmetric FOV matrix. This effectively avoids the situation shown in figs. 4, 5, 7, and 8, in which the original picture is symmetric while the person, that is, the virtual camera, stands in an asymmetric position so that the picture and the position do not match. The situation when the asymmetric FOV matrix is used is shown in figs. 6 and 9, and the immersion of the visual experience is improved.
Example 2:
as shown in fig. 2, the present embodiment provides an immersive visual experience method, including:
s1: initializing the virtual camera and the virtual environment using the Unreal Engine; the virtual camera and the virtual environment are constructed based on the Unreal Engine.
S2-1: acquiring the position of the virtual camera to obtain a virtual position;
s2-2: calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment;
s2-3: constructing the FOV matrix according to the left distance, the right distance, the upper distance and the lower distance;
s3: replacing the default FOV matrix with the asymmetric FOV matrix;
s4: projecting material in the virtual environment into the virtual camera using the FOV matrix;
s5: when the position of the virtual camera in the virtual environment changes, go back to S2-1.
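Step S5 above describes a change-driven update: the matrix is regenerated only when the camera actually moves. A minimal sketch of that loop follows; the Position struct and the rebuild callback are illustrative placeholders, not Unreal Engine types:

```cpp
#include <functional>
#include <utility>

// Placeholder for the virtual camera's position in the virtual environment.
struct Position {
    double x, y, z;
    bool operator==(const Position& o) const {
        return x == o.x && y == o.y && z == o.z;
    }
};

// Monitors the camera position (step S5); when it changes, invokes the
// supplied callback, which stands in for re-running steps S2-1..S2-3.
class FovMatrixUpdater {
public:
    explicit FovMatrixUpdater(std::function<void(const Position&)> rebuild)
        : rebuild_(std::move(rebuild)) {}

    // Called once per frame; returns true if the matrix was rebuilt.
    bool Tick(const Position& current) {
        if (hasLast_ && current == last_) return false;  // unchanged: skip
        last_ = current;
        hasLast_ = true;
        rebuild_(current);  // regenerate the asymmetric FOV matrix
        return true;
    }

private:
    std::function<void(const Position&)> rebuild_;
    Position last_{};
    bool hasLast_ = false;
};
```

Rebuilding only on movement avoids recomputing the projection every frame for a stationary viewer.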
In this embodiment, the Unreal Engine is generally used to implement the construction of the virtual camera and the virtual environment; alternatively, Unity3D, the Source Engine, the Frostbite Engine, and the like can be substituted to achieve the same or similar technical effects.
The bUseCustomProjectionMatrix variable of USceneCaptureComponent2D is then set to true so that a customized projection matrix can be used, and the FOV matrix is constructed from the specific position of the virtual camera in the virtual environment, that is, from the left distance, right distance, upper distance, and lower distance from the virtual position to the edges of the virtual environment. Specifically, the symmetric matrix of the conventional art is a special case of the asymmetric matrix in which the left and right distances to the edges of the virtual environment are equal. With L denoting the left distance, R the right distance, T the upper distance, and B the lower distance, the matrix is centrally symmetric when L = R and T = B. Taking the center point as the origin, L + R = 0 and T + B = 0 gives the centrally symmetric case, while L + R ≠ 0 or T + B ≠ 0 gives the asymmetric case; the left, right, upper, and lower distances can be obtained dynamically according to the position of the character in the UE4 scene. Taking the matrices shown in figs. 4-6 as an example, the asymmetric FOV matrix can be obtained as
[Formula image in the original (see fig. 6): the asymmetric FOV matrix.]
Correspondingly, corresponding asymmetric matrices can also be obtained in other directions.
For example, if the projection area is 4 meters wide and 2.5 meters high, we take the center point, 2 meters across and 1.25 meters up, as the origin, with the left side on the negative axis, the right side on the positive axis, and heights above 1.25 meters on the positive axis. We can then modify the central matrix of the Unreal Engine, substituting L + R and T + B into it instead of writing 0 directly, so as to obtain the asymmetric FOV matrix; setting the corresponding asymmetric FOV matrix into the FOV completes the whole process.
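The worked example above can be sketched in code. The helper below is hypothetical (not from the patent): given the viewer's offset from the center of the stated 4 m by 2.5 m surface, it computes the signed left/right/bottom/top extents whose sums L + R and T + B become nonzero exactly when the viewer is off-center:

```cpp
// Extents of the view frustum relative to the viewer, for a projection
// surface whose center is the origin (left edge on the negative axis).
struct FrustumExtents { double l, r, b, t; };

// Hypothetical helper: screenW/screenH default to the 4 m x 2.5 m surface
// of the example; offsetX/offsetY are the viewer's offset from the center.
FrustumExtents ExtentsForViewer(double offsetX, double offsetY,
                                double screenW = 4.0, double screenH = 2.5) {
    FrustumExtents e;
    e.l = -(screenW / 2.0 + offsetX);  // distance to the left edge (negative)
    e.r =  screenW / 2.0 - offsetX;    // distance to the right edge
    e.b = -(screenH / 2.0 + offsetY);  // distance to the bottom edge
    e.t =  screenH / 2.0 - offsetY;    // distance to the top edge
    return e;
}
```

A viewer standing 0.5 m right of center gets l = -2.5 and r = 1.5, so L + R = -1 rather than 0, which is the asymmetric case the description substitutes into the projection matrix.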
Example 3:
as shown in fig. 3, the present embodiment provides an immersive visual experience system comprising: a virtual camera and a virtual environment. The virtual camera is arranged in the virtual environment and projects material from the virtual environment into the virtual camera using an FOV matrix. The system also comprises an asymmetric FOV matrix generation module, which is used for generating the FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix.
In this embodiment, the asymmetric FOV matrix is used to replace the original symmetric FOV matrix. Since the FOV is asymmetric, the field of view is asymmetric compared with the original symmetric FOV matrix, which effectively avoids the original situation of a symmetric picture, an asymmetric standing position, and a mismatch between the picture and the position, and improves the immersion of the visual experience.
Example 4:
the embodiment further includes a FOV matrix replacement module, configured to set a variable of the bousecorestimoprojectionmatrix of the usenecapturecompontent 2D to true, and replace the default FOV matrix with the asymmetric FOV matrix generated by the asymmetric FOV matrix generation module. The system further comprises a virtual camera position monitoring module used for informing the asymmetric FOV matrix generation module to update the FOV matrix according to the new position of the virtual camera when the position of the virtual camera in the virtual environment changes. The asymmetric FOV matrix generation module includes: the virtual camera position acquisition module is used for acquiring the virtual position of the virtual camera in the virtual environment; the distance calculation module is used for calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment; a matrix construction module, configured to construct the FOV matrix according to the left distance, the right distance, the upper distance, and the lower distance, taking the matrices shown in fig. 4-6 as an example, an asymmetric FOV matrix can be obtained as
[Formula image in the original (see fig. 6): the asymmetric FOV matrix.]
Correspondingly, corresponding asymmetric matrices can also be obtained in other directions.
In this embodiment, the Unreal Engine is generally used to implement the construction of the virtual camera and the virtual environment; alternatively, Unity3D, the Source Engine, the Frostbite Engine, and the like can be substituted to achieve the same or similar technical effects.
The bUseCustomProjectionMatrix variable of USceneCaptureComponent2D is then set to true so that a customized projection matrix can be used, and the FOV matrix is constructed from the specific position of the virtual camera in the virtual environment, that is, from the left distance, right distance, upper distance, and lower distance from the virtual position to the edges of the virtual environment. Specifically, the symmetric matrix of the conventional art is a special case of the asymmetric matrix in which the left and right distances to the edges of the virtual environment are equal. With L denoting the left distance, R the right distance, T the upper distance, and B the lower distance, the matrix is centrally symmetric when L = R and T = B. Taking the center point as the origin, L + R = 0 and T + B = 0 gives the centrally symmetric case, while L + R ≠ 0 or T + B ≠ 0 gives the asymmetric case; the left, right, upper, and lower distances can be obtained dynamically according to the position of the character in the UE4 scene.
for example, if the projected area is 4 meters wide and 2.5 meters high, we use 2 meters wide and 1.25 meters high as the origin, the left side is the negative axis, the right side is the positive axis, and the height is more than 1.25 meters as the positive axis, and we can modify according to the central matrix of the unitary Engine, and bring L + R and T + B into it, instead of writing directly to 0, so as to obtain the asymmetric FOV matrix, and then set the corresponding asymmetric FOV matrix into the FOV, thereby completing the whole process.
While preferred embodiments of the present invention have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all such alterations and modifications as fall within the scope of the invention.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present invention without departing from the spirit and scope of the invention. Thus, if such modifications and variations of the present invention fall within the scope of the claims of the present invention and their equivalents, the present invention is also intended to include such modifications and variations.

Claims (10)

1. A method of immersive visual experience, comprising:
generating a FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix;
projecting material in the virtual environment into the virtual camera using the FOV matrix.
2. The immersive visual experience method of claim 1, wherein generating the FOV matrix based on the position of the virtual camera in the virtual environment further comprises: initializing the virtual camera and the virtual environment using the Unreal Engine.
3. The method of immersive visual experience of claim 2, wherein generating the FOV matrix based on the position of the virtual camera in the virtual environment comprises:
the default FOV matrix is replaced with an asymmetric FOV matrix.
4. The method of immersive visual experience of claim 1, further comprising: updating the FOV matrix according to the new position of the virtual camera when the position of the virtual camera in the virtual environment changes.
5. The method of immersive visual experience of claim 1, wherein said generating a FOV matrix based on the position of the virtual camera in the virtual environment further comprises:
acquiring the position of the virtual camera to obtain a virtual position;
calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment;
and constructing the FOV matrix according to the left distance, the right distance, the upper distance and the lower distance.
6. An immersive visual experience system comprising: a virtual camera and a virtual environment;
the virtual camera is arranged in the virtual environment, and projects materials in the virtual environment into the virtual camera by using an FOV matrix;
the system is characterized by further comprising an asymmetric FOV matrix generation module, wherein the asymmetric FOV matrix generation module is used for generating the FOV matrix according to the position of the virtual camera in the virtual environment; the FOV matrix is an asymmetric matrix.
7. The immersive visual experience system of claim 6, wherein said virtual camera and said virtual environment are constructed based on the Unreal Engine.
8. The immersive visual experience system of claim 7, further comprising an FOV matrix replacement module for setting the bUseCustomProjectionMatrix variable of USceneCaptureComponent2D to true and replacing the default FOV matrix with the asymmetric FOV matrix generated by the asymmetric FOV matrix generation module.
9. The immersive visual experience system of claim 6, further comprising a virtual camera position monitoring module that notifies the asymmetric FOV matrix generation module to update the FOV matrix according to the new position of the virtual camera when the position of the virtual camera in the virtual environment changes.
10. The immersive visual experience system of claim 6, wherein the asymmetric FOV matrix generation module comprises:
the virtual camera position acquisition module is used for acquiring the virtual position of the virtual camera in the virtual environment;
the distance calculation module is used for calculating the left distance, the right distance, the upper distance and the lower distance from the virtual position to the edge of the virtual environment;
and the matrix construction module is used for constructing the FOV matrix according to the left distance, the right distance, the upper distance and the lower distance.
CN202011237447.2A 2020-11-09 2020-11-09 Immersive visual experience method and system Pending CN112305766A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011237447.2A CN112305766A (en) 2020-11-09 2020-11-09 Immersive visual experience method and system


Publications (1)

Publication Number Publication Date
CN112305766A true CN112305766A (en) 2021-02-02

Family

ID=74326590

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011237447.2A Pending CN112305766A (en) 2020-11-09 2020-11-09 Immersive visual experience method and system

Country Status (1)

Country Link
CN (1) CN112305766A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140066178A1 (en) * 2012-08-28 2014-03-06 Wms Gaming, Inc. Presenting autostereoscopic gaming content according to viewer position
CN109829981A (en) * 2019-02-16 2019-05-31 深圳市未来感知科技有限公司 Three-dimensional scenic rendering method, device, equipment and storage medium
CN110610454A (en) * 2019-09-18 2019-12-24 上海云绅智能科技有限公司 Method and device for calculating perspective projection matrix, terminal device and storage medium



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210202