CN116664752B - Method, system and storage medium for realizing panoramic display based on patterned illumination - Google Patents


Info

Publication number
CN116664752B
CN116664752B (application CN202310957933.9A)
Authority
CN
China
Prior art keywords
coordinate
illumination
light source
point light
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310957933.9A
Other languages
Chinese (zh)
Other versions
CN116664752A (en)
Inventor
王赞
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nanjing Weisaike Network Technology Co ltd
Original Assignee
Nanjing Weisaike Network Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nanjing Weisaike Network Technology Co ltd filed Critical Nanjing Weisaike Network Technology Co ltd
Priority to CN202310957933.9A priority Critical patent/CN116664752B/en
Publication of CN116664752A publication Critical patent/CN116664752A/en
Application granted granted Critical
Publication of CN116664752B publication Critical patent/CN116664752B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/50: Lighting effects
    • G06T15/506: Illumination models
    • G06T19/00: Manipulating 3D models or images for computer graphics
    • G06T19/006: Mixed reality
    • G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T2210/00: Indexing scheme for image generation or computer graphics
    • G06T2210/61: Scene description
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02B: CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO BUILDINGS, e.g. HOUSING, HOUSE APPLIANCES OR RELATED END-USER APPLICATIONS
    • Y02B20/00: Energy efficient lighting technologies, e.g. halogen lamps or gas discharge lamps
    • Y02B20/40: Control techniques providing energy savings, e.g. smart controller or presence detection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)
  • Image Generation (AREA)

Abstract

The invention discloses a method, a system and a storage medium for realizing panoramic display based on patterned illumination, belonging to the technical field of three-dimensional panorama construction. The method comprises the following steps: establishing a space model of a scene, arranging a point light source in the space model, and arranging a patterned illumination layer surrounding the surface of the point light source; acquiring a panoramic picture and filling it into the shape of the patterned illumination layer to generate an illumination pattern; setting the area of the inner surface of the space model covered by the point light source illumination as the area to be projected, and performing illumination display on that area by: converting the area to be projected from the model coordinate space to the point light source coordinate space; calculating a first UV coordinate of the illumination pattern in the point light source coordinate space; horizontally flipping the first UV coordinate to obtain a second UV coordinate; and mapping the illumination pattern to the second UV coordinate for rendering. The invention projects the panoramic picture onto the space model through the point light source to form a virtual scene, and the construction is simple, fast and realistic.

Description

Method, system and storage medium for realizing panoramic display based on patterned illumination
Technical Field
The invention relates to the technical field of three-dimensional panorama construction, in particular to a method, a system and a storage medium for realizing panorama display based on patterned illumination.
Background
Three-dimensional panorama combines panoramic imaging, virtual reality and computer vision. Panoramic image: a panoramic image is a special type of image that can display a 360-degree field of view. Panoramic images are typically produced by capturing multiple overlapping images with a panoramic camera and then stitching them into a single large image with dedicated software. Virtual reality: virtual reality is a computer technology that creates a simulated environment in which the user feels immersed. It is typically implemented with devices such as head-mounted displays and handheld controllers, together with dedicated software and hardware. Computer vision: computer vision is an artificial intelligence technique that enables computers to process images and video; it can be used to identify and track objects, measure their size and position, and so on.
At present, the technology of constructing a three-dimensional panorama from panoramic images is widely applied, but most traditional approaches rely on texture mapping (decals). This requires decal processing of the picture and places high demands on model precision; otherwise the constructed panorama is easily distorted. In addition, the brightness of the scene must be adjusted after the panorama is finished, with lighting added afterwards or difficult to adjust before the decal step. The traditional three-dimensional panorama construction method therefore still has many shortcomings.
Disclosure of Invention
The invention aims to solve the problems that three-dimensional panorama construction is complex to operate and inconvenient to use, and provides a method, a system and a storage medium for realizing panoramic display based on patterned illumination.
In a first aspect, the present invention achieves the above object by a method for realizing panoramic display based on patterned illumination, the method comprising the following steps:
establishing a space model of a scene;
a point light source is arranged in the space model, and a patterned illumination layer surrounding the surface of the point light source is arranged;
acquiring a panoramic picture, and filling the panoramic picture into the shape of the patterned illumination layer to generate an illumination pattern;
setting the area of the inner surface of the space model covered by the point light source illumination as an area to be projected, and carrying out illumination display on the area to be projected, wherein the illumination display method comprises the following steps:
converting the region to be projected from a model coordinate space to a point light source coordinate space;
calculating a first UV coordinate of the illumination pattern in the point light source coordinate space;
horizontally overturning the first UV coordinates to obtain second UV coordinates;
and mapping the illumination pattern into the second UV coordinates for rendering.
Preferably, the method comprises setting the illumination distance of the point light source to be greater than the largest of the three dimensions of the space model.
Preferably, the method comprises setting the illumination intensity of the point light source to remain constant as the illumination distance changes.
Preferably, the method for generating the illumination pattern by filling the panoramic picture into the shape of the patterned illumination layer comprises the following steps:
converting the panoramic picture into a cube map, wherein the cube map comprises six texture surfaces;
mapping each texture surface of the cube map in sequence onto the six surfaces of the patterned illumination layer, and setting each as an ambient illumination texture;
the six ambient illumination textures together form the illumination pattern.
Preferably, the method for converting the region to be projected from the model coordinate space to the point light source coordinate space comprises the following steps:
firstly, converting the model coordinate space into the world coordinate space and storing the result;
extracting the world coordinate space and converting it into the homogeneous clip coordinate space;
extracting the world coordinate space and converting it into the point light source coordinate space.
Preferably, the first UV coordinate is flipped horizontally to obtain the second UV coordinate: the horizontal coordinate of the first UV coordinate is flipped to give a new coordinate, which is used as the horizontal coordinate of the second UV coordinate. With the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y).
In a second aspect, the present invention achieves the above object by a system for realizing panoramic display based on patterned illumination, comprising:
the model unit is used for establishing a space model of the scene;
the point light source unit is used for arranging a point light source in the space model and arranging a patterned illumination layer surrounding the surface of the point light source;
the pattern filling unit is used for obtaining a panoramic picture and filling it into the shape of the patterned illumination layer to generate an illumination pattern;
the illumination display unit is used for setting the area of the inner surface of the space model covered by the point light source as the area to be projected and carrying out illumination display on it, wherein the illumination display method comprises the following steps:
converting the region to be projected from a model coordinate space to a point light source coordinate space;
calculating a first UV coordinate of the illumination pattern in the point light source coordinate space;
horizontally overturning the first UV coordinates to obtain second UV coordinates;
and mapping the illumination pattern into the second UV coordinates for rendering.
Preferably, the point light source unit further includes an attribute setting module, configured to set the illumination distance of the point light source to be greater than the largest of the three dimensions of the space model, and to set the illumination intensity of the point light source to remain constant as the illumination distance changes.
Preferably, the illumination display unit further includes a coordinate flipping module, configured to flip the first UV coordinate horizontally to obtain the second UV coordinate: the horizontal coordinate of the first UV coordinate is flipped to give a new coordinate, which is used as the horizontal coordinate of the second UV coordinate. With the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y).
In a third aspect, the present invention achieves the above object by a storage medium having stored thereon a computer program which, when executed by a processor, implements a method for achieving panoramic display based on patterned illumination as described in the first aspect.
Compared with the prior art, the invention has the beneficial effects that:
1. The invention projects the panoramic picture onto the inner surface of the space model by means of point light source illumination, thereby displaying a virtual scene; the scene is built simply and quickly, which improves the efficiency of scene construction.
2. The invention performs panoramic display by means of point light source illumination, and the illumination distance and illumination intensity of the point light source can be set so that the panoramic picture is displayed accurately and clearly on the inner surface of the space model, giving a better restoration of the panoramic picture.
Drawings
Fig. 1 is a flowchart of a method for realizing panoramic display based on patterned illumination according to the present invention.
FIG. 2 is a flowchart of a lighting display method according to the present invention.
Fig. 3 is a schematic diagram of a system for implementing panoramic display based on patterned illumination according to the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present invention, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
Example 1
In order to facilitate operation and allow the result to be viewed intuitively, the three-dimensional scene is displayed by patterned illumination. Operations such as model building, point light source setup and illumination rendering can be performed on the Unity 3D platform. Because there are many virtual-scene platforms, use is not limited to Unity 3D; other platforms are not described here, and the scheme is explained by implementing patterned illumination on the Unity 3D platform. As shown in Fig. 1, the method for realizing panoramic display based on patterned illumination comprises the following steps:
step S1, a space model of a scene is established, the space model can be manually established in a unit 3D platform, or other third party software is imported to generate a model by scanning a real scene, and the space model is used as illumination for receiving point light sources.
Step S2: set a point light source in the space model, and arrange a patterned illumination layer surrounding the surface of the point light source. Light generated by the point light source placed in the space model can illuminate the model's inner surface. In a virtual scene, a point light source emits light from one point in all directions and is usually used to simulate natural light or local illumination effects. If the structure of the space model is complex, several point light sources may be needed so that no dead-angle areas are left uncovered by the light. The principle of the patterned illumination layer is that of placing shaped occluders in front of a light source: the shadows and highlights produced after the light passes through the occluder form a special artistic effect. The point light source can therefore project the content of the patterned illumination layer onto the inner surface of the space model by illumination.
Step S3: acquire a panoramic picture and fill it into the shape of the patterned illumination layer to generate an illumination pattern. The panoramic picture may be a 360-degree panorama of a real scene captured directly with a third-party tool, or a panorama stitched and fused later from six views taken by an ordinary camera. The method for generating the illumination pattern by filling the panoramic picture into the shape of the patterned illumination layer comprises the following steps:
converting the panoramic picture into a cube map, wherein the cube map comprises six texture surfaces: front, back, left, right, top and bottom;
mapping each texture surface of the cube map in sequence onto the six surfaces of the patterned illumination layer and setting it as an ambient illumination texture; the patterned illumination layer is usually a cube mesh, each surface of which corresponds to one surface of the cube map, so the cube map can be mapped onto the mesh according to the propagation direction of the light rays;
the six ambient illumination textures together form the illumination pattern, so that the point light source maps it correctly into the scene; a minimal sketch of this conversion is given below.
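The following Python sketch illustrates this conversion under stated assumptions: the panoramic picture is an equirectangular image held in a numpy array, and the face names, axis conventions and the function name equirect_to_cube_faces are illustrative rather than taken from the patent.

```python
import numpy as np

def equirect_to_cube_faces(pano, face_size=512):
    """Sample six cube-map faces from an equirectangular panorama (H x W x 3 array)."""
    h, w, _ = pano.shape
    faces = {}
    grid = np.linspace(-1.0, 1.0, face_size)
    a, b = np.meshgrid(grid, -grid)                 # horizontal / vertical face coords
    axes = {
        "front": ( a,               b,               np.ones_like(a)),
        "back":  (-a,               b,              -np.ones_like(a)),
        "right": ( np.ones_like(a), b,              -a),
        "left":  (-np.ones_like(a), b,               a),
        "up":    ( a,               np.ones_like(a), -b),
        "down":  ( a,              -np.ones_like(a),  b),
    }
    for name, (x, y, z) in axes.items():
        norm = np.sqrt(x * x + y * y + z * z)
        dx, dy, dz = x / norm, y / norm, z / norm
        lon = np.arctan2(dx, dz)                    # direction -> spherical angles
        lat = np.arcsin(np.clip(dy, -1.0, 1.0))
        u = (lon / (2.0 * np.pi) + 0.5) * (w - 1)   # angles -> panorama pixel coords
        v = (0.5 - lat / np.pi) * (h - 1)
        faces[name] = pano[v.astype(int), u.astype(int)]
    return faces
```

Each returned face can then be assigned to the corresponding surface of the patterned illumination layer as its ambient illumination texture.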
Step S4: set the area of the inner surface of the space model covered by the point light source as the area to be projected, and perform illumination display on it. Through this illumination display, the illumination pattern projected by the point light source in step S3 can be displayed on the inner surface of the space model. As shown in Fig. 2, the illumination display method comprises:
Step S401: convert the area to be projected from the model coordinate space to the point light source coordinate space. To map the patterned illumination effect correctly into the scene space, a coordinate conversion between the two coordinate spaces is needed. Coordinate conversion transforms a point or vector in one coordinate system into the corresponding point or vector in another coordinate system, and is generally performed with transformation matrices; since matrix transformation is a mature mathematical technique, the specific principle is not repeated here.
Step S402: calculate the first UV coordinate of the illumination pattern in the point light source coordinate space. The first UV coordinate is obtained by multiplying a pattern vector of the illumination pattern by the transformation matrix of the point light source coordinate space; it is the coordinate of the illumination pattern as it surrounds the point light source, so the light of the point light source passes through the back of the illumination pattern to cover the area to be projected. If only this step were performed, the final rendering would show the back of the illumination pattern on the area to be projected, and the displayed result would be a mirror image of the actual panoramic picture; hence the further operation of step S403 is required. A sketch of this UV computation is given below.
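A minimal Python sketch of this step, under assumed conventions: given a surface point already expressed in the point light source coordinate space, the light-to-surface direction selects the dominant-axis cube face and is projected onto it to give the first UV coordinate. The face names and sign conventions are illustrative; real engines differ.

```python
import numpy as np

def first_uv(p_light):
    """First UV coordinate of the illumination pattern for a point in light space."""
    x, y, z = p_light / np.linalg.norm(p_light)      # light-to-surface direction
    ax, ay, az = abs(x), abs(y), abs(z)
    if az >= ax and az >= ay:                        # dominant axis picks the face
        face, u, v = ("front" if z > 0 else "back"), x / az, y / az
    elif ax >= ay:
        face, u, v = ("right" if x > 0 else "left"), z / ax, y / ax
    else:
        face, u, v = ("up" if y > 0 else "down"), x / ay, z / ay
    # Remap face coordinates from [-1, 1] to texture space [0, 1].
    return face, (0.5 * (u + 1.0), 0.5 * (v + 1.0))

# A point straight ahead of the light lands at the centre of the "front" face.
print(first_uv(np.array([0.0, 0.0, 2.0])))           # -> ('front', (0.5, 0.5))
```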
Step S403: flip the first UV coordinate horizontally to obtain the second UV coordinate, changing the illumination pattern from the coordinates that surround the point light source to their horizontal mirror image. The horizontal coordinate of the first UV coordinate is flipped to give a new coordinate, which is used as the horizontal coordinate of the second UV coordinate. With the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y).
For example, if a vertex of a triangle has the first UV coordinate (0.2, 0.3), then after horizontal mirroring the second UV coordinate becomes U = 1.0 - 0.2 = 0.8, Y = 0.3. The second UV coordinate of that vertex is therefore (0.8, 0.3), which corresponds to a mirror image in the horizontal direction.
According to this formula, only the U coordinate is flipped during horizontal mirroring and the V coordinate is kept unchanged, so the vertical direction of the illumination pattern's texture does not change. Horizontal flipping is not the only option: the second UV coordinate can be obtained from the first UV coordinate with the formula above, or a horizontally mirrored texture can be generated with tools such as graphics editing software or texture tools. A small sketch of the flip follows.
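A small Python sketch of the flip, reusing the numbers from the example above; the function name is illustrative.

```python
def flip_uv_horizontally(uv):
    """Step S403: mirror only the horizontal coordinate; V stays unchanged."""
    x, y = uv
    return (1.0 - x, y)

print(flip_uv_horizontally((0.2, 0.3)))   # -> (0.8, 0.3)
```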
Step S404: map the illumination pattern to the second UV coordinate for rendering. Texture rendering is a common technique in virtual scenes; in the Unity 3D platform, for example, rendering can be implemented with tools such as shaders, so the specific rendering process is not repeated here.
During rendering, note that: 1. the influence of ambient light on the projection is removed; 2. the influence of other light sources on the projection is removed; 3. all reflections of the point light source are ignored. In point 1, ambient light generally refers to the skybox of the space model; in point 2, other light sources are light sources other than the point light source; in point 3, reflection refers to diffuse and specular reflection.
Because a point light source has basic properties such as position, colour, intensity and range, the parameters of the point light source need to be adjusted in step S2 so that it can accurately illuminate and display the panoramic picture. The method comprises setting the illumination distance of the point light source to be greater than the largest of the three dimensions of the space model, so that the illumination of the point light source can cover the inner surface of the space model from any position within it: the illumination distance must exceed the largest of the model's length, width and height, so that even a point light source placed in a corner can still illuminate the opposite wall of the space model. A small helper for this bound is sketched below.
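A tiny assumed helper for this lower bound; the name minimum_light_range and the example dimensions are illustrative.

```python
def minimum_light_range(length, width, height):
    # The illumination distance must exceed the largest dimension of the space model,
    # so that a corner-placed point light source still reaches the opposite wall.
    return max(length, width, height)

# Example: a 10 x 6 x 3 room needs an illumination distance greater than 10.
print(minimum_light_range(10.0, 6.0, 3.0))   # -> 10.0
```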
Besides the illumination distance, the illumination intensity of the point light source must also be set, because the light of a point light source is attenuated; when the illumination pattern is projected onto a wall of the space model far from the point light source, this attenuation darkens the picture and reduces the accuracy of the panorama restoration. The method therefore comprises setting the illumination intensity of the point light source to remain constant as the illumination distance changes. In the Unity 3D platform, for example, an attenuation function is used to describe how light changes during transmission in order to simulate the fall-off of a point light source. Such a function generally has three parameters: a constant term, a linear term and a quadratic term, representing constant, linear and quadratic attenuation of the light during transmission; the constant term controls attenuation over short distances, the linear term over medium distances and the quadratic term over long distances. To keep the illumination intensity constant, a constant attenuation function is used in place of the usual three-term attenuation function. A constant attenuation function means the light keeps a constant intensity during transmission and does not attenuate with increasing distance; its form is:
attenuation = 1.0
Here 1.0 means that the intensity of the light is always a constant value and does not vary with distance; a short comparison with the usual three-term fall-off is sketched below.
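For contrast, a short Python sketch of a constant attenuation function next to the usual three-term fall-off; the coefficient values are illustrative defaults, not values from the patent.

```python
def constant_attenuation(distance):
    # Constant attenuation: intensity never falls off with distance.
    return 1.0

def three_term_attenuation(distance, kc=1.0, kl=0.09, kq=0.032):
    # The usual constant + linear + quadratic model, shown only for comparison.
    return 1.0 / (kc + kl * distance + kq * distance * distance)

for d in (1.0, 5.0, 20.0):
    print(d, constant_attenuation(d), round(three_term_attenuation(d), 3))
```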
Sometimes one coordinate space cannot be converted directly into another; in step S401, for example, the conversion must pass through the world coordinate space. The method for converting the area to be projected from the model coordinate space to the point light source coordinate space comprises:
firstly, converting the model coordinate space into the world coordinate space and storing the result;
extracting the world coordinate space and converting it into the homogeneous clip coordinate space; by this further conversion into the clip coordinate space, the user can observe, through a terminal device, the virtual scene formed by the panoramic picture projected by the point light source;
extracting the world coordinate space and converting it into the point light source coordinate space; the world coordinate space thus serves as a transit space, completing the conversion from the model coordinate space to the point light source coordinate space. A sketch of this conversion chain follows.
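A minimal Python sketch of this conversion chain, assuming column-vector homogeneous coordinates; the 4x4 matrices are illustrative placeholders that an engine would supply per object, camera and light.

```python
import numpy as np

def model_to_world(p_model, model_matrix):
    return model_matrix @ np.append(p_model, 1.0)        # homogeneous point

def world_to_clip(p_world, view_matrix, projection_matrix):
    # World -> homogeneous clip space: what the viewer's camera ultimately sees.
    return projection_matrix @ (view_matrix @ p_world)

def world_to_light(p_world, light_view_matrix):
    # World -> point light source space: used to project the illumination pattern.
    return light_view_matrix @ p_world

# The world-space position is computed once, stored, and reused for both targets.
model_matrix = view_matrix = projection_matrix = light_view_matrix = np.eye(4)
p_world = model_to_world(np.array([1.0, 2.0, 3.0]), model_matrix)
p_clip = world_to_clip(p_world, view_matrix, projection_matrix)
p_light = world_to_light(p_world, light_view_matrix)
```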
Example 2
As shown in fig. 3, a system for implementing panoramic display based on patterned illumination includes:
the model unit is used for establishing a space model of the scene;
a point light source unit for setting a point light source in the space model and arranging a patterned illumination layer surrounding the surface of the point light source; the point light source unit further comprises an attribute setting module, which sets the illumination distance of the point light source to be greater than the largest of the three dimensions of the space model and keeps the illumination intensity of the point light source constant as the illumination distance changes;
the pattern filling unit is used for obtaining a panoramic picture and filling it into the shape of the patterned illumination layer to generate an illumination pattern;
the illumination display unit is used for setting the area of the inner surface of the space model covered by the point light source as the area to be projected and carrying out illumination display on it, wherein the illumination display method comprises the following steps:
converting the region to be projected from a model coordinate space to a point light source coordinate space;
calculating a first UV coordinate of the illumination pattern in the point light source coordinate space;
the illumination display unit further comprises a coordinate flipping module for flipping the first UV coordinate horizontally to obtain the second UV coordinate: the horizontal coordinate of the first UV coordinate is flipped to give a new coordinate, which is used as the horizontal coordinate of the second UV coordinate. With the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y);
and mapping the illumination pattern into the second UV coordinates for rendering.
Since embodiment 2 is substantially the same as embodiment 1, the operating principle of each unit module is not described in detail again.
Example 3
This embodiment provides a storage medium comprising a program storage area and a data storage area. The program storage area can store an operating system, the programs required to run an instant messaging function, and the like; the data storage area can store various instant messaging information, sets of operation instructions, and the like. A computer program is stored in the program storage area; when executed by a processor, it implements the method for realizing panoramic display based on patterned illumination described in embodiment 1. The processor may comprise one or more central processing units (CPUs), a digital processing unit, or the like.
It will be evident to those skilled in the art that the invention is not limited to the details of the foregoing illustrative embodiments, and that the present invention may be embodied in other specific forms without departing from the spirit or essential characteristics thereof. The present embodiments are, therefore, to be considered in all respects as illustrative and not restrictive, the scope of the invention being indicated by the appended claims rather than by the foregoing description, and all changes which come within the meaning and range of equivalency of the claims are therefore intended to be embraced therein.
Furthermore, it should be understood that although this specification is described in terms of embodiments, not every embodiment contains only a single independent technical solution. This manner of description is adopted for clarity only; the specification should be taken as a whole, and the technical solutions in the embodiments may be combined appropriately to form other implementations that will be understood by those skilled in the art.

Claims (10)

1. A method for realizing panoramic display based on patterned illumination, which is characterized by comprising the following steps:
establishing a space model of a scene;
a point light source is arranged in the space model, and a patterned illumination layer surrounding the surface of the point light source is arranged;
acquiring a panoramic picture, and filling the panoramic picture into the patterned illumination layer shape to generate an illumination pattern;
setting an area of the inner surface of the space model, which is covered by point light source illumination, as an area to be projected, and carrying out illumination display on the area to be projected, wherein the illumination display method comprises the following steps:
converting the region to be projected from a model coordinate space to a point light source coordinate space;
calculating a first UV coordinate of the illumination pattern in the point light source coordinate space;
horizontally overturning the first UV coordinates to obtain second UV coordinates;
and mapping the illumination pattern into the second UV coordinates for rendering.
2. The method for realizing panoramic display based on patterned illumination according to claim 1, wherein the method comprises setting the illumination distance of the point light source to be greater than the largest of the three dimensions of the spatial model.
3. The method for realizing panoramic display based on patterned illumination according to claim 2, wherein the method comprises setting the illumination intensity of the point light source to remain constant as the illumination distance changes.
4. The method for realizing panoramic display based on patterned illumination according to claim 1, wherein the method for generating illumination patterns by filling the panoramic picture into the patterned illumination layer shape comprises the following steps:
converting the panoramic picture into a cube map, wherein the cube map comprises six texture surfaces;
mapping each texture surface of the cube map to six surfaces of the patterned illumination layer in sequence, and setting the texture surface as an ambient illumination texture;
six ambient light textures make up the light pattern.
5. The method for realizing panoramic display based on patterned illumination according to claim 1, wherein the method for converting the region to be projected from the model coordinate space to the point light source coordinate space comprises the following steps:
firstly, converting the model coordinate space into the world coordinate space and storing the result;
extracting the world coordinate space and converting it into the homogeneous clip coordinate space;
extracting the world coordinate space and converting it into the point light source coordinate space.
6. The method for realizing panoramic display based on patterned illumination according to claim 1, wherein horizontally flipping the first UV coordinate to obtain the second UV coordinate comprises flipping the horizontal coordinate of the first UV coordinate to obtain a new coordinate that is used as the horizontal coordinate of the second UV coordinate; with the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y).
7. A system for implementing panoramic display based on patterned illumination, comprising:
the model unit is used for establishing a space model of the scene;
the point light source unit is used for arranging a point light source in the space model and arranging a patterned illumination layer surrounding the surface of the point light source;
the pattern filling unit is used for obtaining a panoramic picture and filling the panoramic picture into the patterned illumination layer shape to generate an illumination pattern;
the illumination display unit is used for setting the area, covered by the point light source, of the inner surface of the space model as an area to be projected, and carrying out illumination display on the area to be projected, and the illumination display method comprises the following steps:
converting the region to be projected from a model coordinate space to a point light source coordinate space;
calculating a first UV coordinate of the illumination pattern in the point light source coordinate space;
horizontally overturning the first UV coordinates to obtain second UV coordinates;
and mapping the illumination pattern into the second UV coordinates for rendering.
8. The system for realizing panoramic display based on patterned illumination according to claim 7, wherein the point light source unit further comprises an attribute setting module for setting the illumination distance of the point light source to be greater than the largest of the three dimensions of the spatial model, and for setting the illumination intensity of the point light source to remain constant as the illumination distance changes.
9. The system for realizing panoramic display based on patterned illumination according to claim 7, wherein the illumination display unit further comprises a coordinate flipping module configured to flip the first UV coordinate horizontally to obtain the second UV coordinate by flipping the horizontal coordinate of the first UV coordinate to obtain a new coordinate that is used as the horizontal coordinate of the second UV coordinate; with the first UV coordinate set to (X, Y), the flipping formula is as follows:
U = 1.0 - X, where U is the horizontal direction coordinate of the second UV coordinate;
the second UV coordinate is then calculated as (U, Y).
10. A storage medium having stored thereon a computer program which, when executed by a processor, implements a method of implementing panoramic display based on patterned illumination as claimed in any one of claims 1-6.
CN202310957933.9A 2023-08-01 2023-08-01 Method, system and storage medium for realizing panoramic display based on patterned illumination Active CN116664752B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310957933.9A CN116664752B (en) 2023-08-01 2023-08-01 Method, system and storage medium for realizing panoramic display based on patterned illumination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310957933.9A CN116664752B (en) 2023-08-01 2023-08-01 Method, system and storage medium for realizing panoramic display based on patterned illumination

Publications (2)

Publication Number Publication Date
CN116664752A CN116664752A (en) 2023-08-29
CN116664752B true CN116664752B (en) 2023-10-17

Family

ID=87715786

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310957933.9A Active CN116664752B (en) 2023-08-01 2023-08-01 Method, system and storage medium for realizing panoramic display based on patterned illumination

Country Status (1)

Country Link
CN (1) CN116664752B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117689557A (en) * 2024-02-02 2024-03-12 南京维赛客网络科技有限公司 OpenCV-based method, system and storage medium for converting orthoscopic panorama into hexahedral panorama

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN101882323A (en) * 2010-05-19 2010-11-10 北京航空航天大学 Microstructure surface global illumination real-time rendering method based on height map
CN103426199A (en) * 2013-08-09 2013-12-04 中国科学院自动化研究所 Low-noise real-time global illumination drawing method for three-dimensional geometric scene
CN107240065A (en) * 2017-04-19 2017-10-10 中科院微电子研究所昆山分所 A kind of 3D full view image generating systems and method
CN113538704A (en) * 2021-07-13 2021-10-22 海信视像科技股份有限公司 Method and equipment for drawing virtual object shadow based on light source position
CN116228986A (en) * 2023-03-22 2023-06-06 南京大学 Indoor scene illumination estimation method based on local-global completion strategy
CN116485984A (en) * 2023-06-25 2023-07-25 深圳元戎启行科技有限公司 Global illumination simulation method, device, equipment and medium for panoramic image vehicle model

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101246600A (en) * 2008-03-03 2008-08-20 北京航空航天大学 Method for real-time generating reinforced reality surroundings by spherical surface panoramic camera
CN101882323A (en) * 2010-05-19 2010-11-10 北京航空航天大学 Microstructure surface global illumination real-time rendering method based on height map
CN103426199A (en) * 2013-08-09 2013-12-04 中国科学院自动化研究所 Low-noise real-time global illumination drawing method for three-dimensional geometric scene
CN107240065A (en) * 2017-04-19 2017-10-10 中科院微电子研究所昆山分所 A kind of 3D full view image generating systems and method
CN113538704A (en) * 2021-07-13 2021-10-22 海信视像科技股份有限公司 Method and equipment for drawing virtual object shadow based on light source position
CN116228986A (en) * 2023-03-22 2023-06-06 南京大学 Indoor scene illumination estimation method based on local-global completion strategy
CN116485984A (en) * 2023-06-25 2023-07-25 深圳元戎启行科技有限公司 Global illumination simulation method, device, equipment and medium for panoramic image vehicle model

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Hua; Wang Xuyang; Yang Huamin; Han Cheng. Illumination direction estimation algorithm based on halo analysis in high-dynamic-range images. Journal of Computer Applications, 2016, No. 5 (full text). *
Gao Xingye; Liu Jianjun; Ren Xin; Mou Lingli; Li Chunlai. Building an immersive virtual lunar-surface environment system using Chang'e-3 panoramic camera data. Journal of Computer-Aided Design & Computer Graphics, 2016, No. 3 (full text). *

Also Published As

Publication number Publication date
CN116664752A (en) 2023-08-29

Similar Documents

Publication Publication Date Title
US7212207B2 (en) Method and apparatus for real-time global illumination incorporating stream processor based hybrid ray tracing
JP4276178B2 (en) Method for digital rendering of skin or similar
JP2508513B2 (en) Image generator
CN108257204B (en) Vertex color drawing baking method and system applied to Unity engine
US9508191B2 (en) Optimal point density using camera proximity for point-based global illumination
US20050041023A1 (en) Method and apparatus for self shadowing and self interreflection light capture
JPH09231404A (en) Picture processing method for displaying object, and device therefor
CN110246146A (en) Full parallax light field content generating method and device based on multiple deep image rendering
CN116664752B (en) Method, system and storage medium for realizing panoramic display based on patterned illumination
CN112184873B (en) Fractal graph creation method, fractal graph creation device, electronic equipment and storage medium
WO2022063260A1 (en) Rendering method and apparatus, and device
CN112734892A (en) Real-time global illumination rendering method for virtual cable tunnel scene model
JPH10111953A (en) Image processing method, device therefor and recording medium
US5793372A (en) Methods and apparatus for rapidly rendering photo-realistic surfaces on 3-dimensional wire frames automatically using user defined points
JP2008033522A (en) Program, information storage medium and image generation system
KR100489572B1 (en) Image processing method
JP6679966B2 (en) Three-dimensional virtual space presentation system, three-dimensional virtual space presentation method and program
US20230206567A1 (en) Geometry-aware augmented reality effects with real-time depth map
Soh et al. Texture mapping of 3D human face for virtual reality environments
JPH1027268A (en) Image processing method and image processor
JP7190780B1 (en) Image processing program, image processing apparatus, and image processing method
WO2024093282A1 (en) Image processing method, related device, and structured light system
JP2952585B1 (en) Image generation method
Gledhill et al. A Novel Methodology for the Optimization of Photogrammetry Data of Physical Objects for Use in Metaverse Virtual Environments
CN117974856A (en) Rendering method, computing device and computer-readable storage medium

Legal Events

Code Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant