CN114071104B - Method for realizing multi-projector projection gradual change fusion based on shader - Google Patents

Method for realizing multi-projector projection gradual change fusion based on shader

Info

Publication number
CN114071104B
CN114071104B (application CN202210051321.9A)
Authority
CN
China
Prior art keywords
band
shader
projection
color
overlap
Prior art date
Legal status
Active
Application number
CN202210051321.9A
Other languages
Chinese (zh)
Other versions
CN114071104A (en)
Inventor
李腾
王涛
林雨
张伟顺
王伟康
姚舜天
赵磊
Current Assignee
Shandong Jerei Digital Technology Co Ltd
Original Assignee
Shandong Jerei Digital Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shandong Jerei Digital Technology Co Ltd
Priority to CN202210051321.9A
Publication of CN114071104A
Application granted
Publication of CN114071104B

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 9/00 Details of colour television systems
    • H04N 9/12 Picture reproducers
    • H04N 9/31 Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
    • H04N 9/3141 Constructional details thereof
    • H04N 9/3147 Multi-projection systems
    • H04N 9/3179 Video signal processing therefor
    • H04N 9/3182 Colour adjustment, e.g. white balance, shading or gamut

Abstract

The invention discloses a method for realizing gradual change fusion of projection of multiple projectors based on a shader, which belongs to the technical field of image processing and comprises the following steps: establishing a scene image in a three-dimensional engine according to a site screen and a projection requirement; establishing a material ball file and a real-time rendering program, wherein the material ball file is used as a calling object of the real-time rendering program; designing a shader according to a scene image to be loaded; assigning the field circular screen width, the field overlapping band width and the overlapping band color adjusting variable to a material ball file; acquiring a scene image to be loaded from video stream data of a projector in real time; performing projection gradient fusion processing on the scene image by adopting a shader in the material ball file; and calling a real-time rendering program, and performing real-time rendering and projection on the scene image subjected to the projection gradient fusion processing. The invention changes the stretching coefficient of the picture area in real time in a view visualization mode, thereby solving the problem of distortion in the picture fusion process.

Description

Method for realizing multi-projector projection gradual change fusion based on shader
Technical Field
The invention relates to a method for realizing multi-projector projection gradual change fusion based on a shader, belonging to the technical field of image processing.
Background
The projector is a common device in daily life and is often used for projection teaching in meeting rooms or lecture halls, where the required projection area is small. In venues that require a large projection area, however, the images projected by two or more projectors must be edge-blended using projection fusion technology so that the combined picture looks as if it were projected by a single projector.
With the continuing development of 3D technology, 3D visual effects can be presented to users on circular screens, and in order to keep the picture clear and undistorted, multi-projection fusion splicing is widely applied in scenarios such as circular screens and giant screens. In a multi-channel projection display system each projector is independent, gaps exist between the pictures projected by adjacent projectors, and the multi-channel large screen is not planar but a cylindrical ring screen or a spherical screen, so the projection is deformed on the curved surface. However, current projection fusion handles the color difference of the overlap band poorly: parameter adjustment is complicated and inconvenient, and non-professional personnel cannot directly adjust the color-difference effect of the overlap band. The adjustment process of the existing multi-projection fusion splicing technology is therefore time-consuming and labor-intensive, and the color transition of the projected picture cannot be adjusted by modifying a single variable within a limited time.
Therefore, it is necessary to provide an efficient adjustment method for the color difference of the overlapped bands.
Disclosure of Invention
In order to solve the problems, the invention provides a method for realizing multi-projector projection gradual change fusion based on a shader, and particularly provides a method for realizing multi-camera picture stretching gradual change fusion based on a shader.
The technical scheme adopted for solving the technical problems is as follows:
the embodiment of the invention provides a method for realizing multi-projector projection gradual change fusion based on a shader, which comprises the following steps:
establishing a scene image in a three-dimensional engine according to a site screen and a projection requirement;
establishing a material ball file and a real-time rendering program, wherein the material ball file is used as a calling object of the real-time rendering program;
designing a shader according to a scene image to be loaded;
assigning the field circular screen width, the field overlapping band width and the overlapping band color adjusting variable to a material ball file;
acquiring a scene image to be loaded from video stream data of a projector in real time;
performing projection gradient fusion processing on the scene image by adopting a shader in the material ball file;
and calling a real-time rendering program, and performing real-time rendering and projection on the scene image subjected to the projection gradient fusion processing.
As a possible implementation manner of this embodiment, the method further includes the following steps:
and arranging a plurality of projectors according to the on-site screen and the projection requirements.
As one possible implementation of this embodiment, the onsite screen includes an annular projection screen.
As a possible implementation manner of this embodiment, the real-time rendering program is mounted on a camera installed with a three-dimensional engine.
As a possible implementation manner of this embodiment, an input interface of the material ball file is opened.
As a possible implementation manner of this embodiment, the shader performs fusion calculation on the pixel colors of the scene image to be loaded, which exist in the overlapping band.
As a possible implementation manner of this embodiment, the performing projection gradient fusion processing on the scene image by using a shader in the material ball file includes:
acquiring the actual width ScreenWidth of the field loop screen;
acquiring the width overlapWidth of an overlapped band caused by the projection of multiple projectors;
acquiring the distance SLtoOCdistance between the leftmost side of a field screen and the center of the overlapping band;
calculating the abscissa of the center of the overlap band on the screen, M0 = SLtoOCdistance/ScreenWidth, and the width of the overlap band on the screen, OWidth = OverlapWidth/ScreenWidth;
the scene width and the scene height are respectively converted into U and V coordinates, and then the coordinates of the image sampling point are (x0, y0);
according to the image acquisition point and the abscissa M0 of the central point of the overlap band, constructing a curve model of gradually changing color difference, and obtaining a linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band;
reading the UV coordinates of the image acquisition point pixels in the vertex function of the shader: o.uv = v.uv;
acquiring a preliminary image pixel output by using the UV coordinates of the image acquisition point pixels;
calculating the gradient degree of the chromatic aberration of the overlapped band by using a linear relation mathematical model;
and performing mixed calculation on the primary image pixel output and the calculated gradient degree of the overlapped band chromatism to obtain a scene image after projection gradient fusion processing.
As a possible implementation manner of the embodiment, the constructing, according to the image acquisition point and the abscissa M0 of the central point of the overlap band, a curve model of gradually changing color difference, and obtaining a linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band, comprises:
(1) When M0 <= x0 <= M0 + OWidth*0.5:
based on the principle that two points determine a straight line, the line passing through the two points (M0 + OWidth/2, 1) and (M0, overlap band color adjustment variable) is used to construct the curve model of the region of gradually changing color difference:
overlap band color adjustment variable = K1*M0 + B1 (1)
1 = K1*(M0 + OWidth*0.5) + B1 (2)
where K1 is a coefficient and B1 is a constant;
from formula (1) and formula (2):
K1 = 2*(1 - overlap band color adjustment variable)/OWidth;
B1 = overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0;
according to K1 and B1, the linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band is:
gradient degree of the overlap band color difference = 2*(1 - overlap band color adjustment variable)/OWidth*x0 + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0;
(2) When M0 - OWidth/2 <= x0 <= M0:
based on the principle that two points determine a straight line, the line passing through the two points (M0 - OWidth/2, 1) and (M0, overlap band color adjustment variable) is used to construct the curve model of the region of gradually changing color difference:
overlap band color adjustment variable = K2*M0 + B2 (3)
1 = K2*(M0 - OWidth*0.5) + B2 (4)
where K2 is a coefficient and B2 is a constant;
from formula (3) and formula (4):
K2 = 2*(overlap band color adjustment variable - 1)/OWidth;
B2 = overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0;
according to K2 and B2, the linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band is:
gradient degree of the overlap band color difference = 2*(overlap band color adjustment variable - 1)/OWidth*x0 + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0.
As a possible implementation manner of this embodiment, the preliminary image pixel output is:
col=tex2D(_MainTex,float2(u,v))*_Main_Color
wherein the tex2D() function is the function used to sample a texture map in the CG program; _MainTex is a user-defined CG variable referring to the main texture to be sampled; float2() is a CG built-in type representing a two-component float vector; and _Main_Color is a user-defined four-component RGBA variable of the Color type, used as a brightness and color adjustment parameter.
As a possible implementation manner of this embodiment, the calculating the gradient degree of the color difference of the overlap band by using a linear relation mathematical model includes:
when M0 <= x0 <= M0 + OWidth*0.5:
gradient degree of the overlap band color difference R = 2*(1 - overlap band color adjustment variable)/OWidth*o.u + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0, wherein o.u is the abscissa of the o.uv coordinates;
when M0 - OWidth/2 <= x0 <= M0:
gradient degree of the overlap band color difference R = 2*(overlap band color adjustment variable - 1)/OWidth*o.u + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0.
As a possible implementation manner of this embodiment, the scene image after the projection gradient fusion processing is:
FinalCol=col.rgb*R
wherein col.rgb is the RGB component of the preliminary image pixel output col, and R is the gradient degree of the overlap band color difference.
The technical scheme of the embodiment of the invention has the following beneficial effects:
the invention provides a method for realizing multi-camera picture stretching gradual change fusion based on a shader, which changes the stretching coefficient of a picture area in real time in a view visualization mode so as to solve the problem of distortion in the picture fusion process.
The invention solves the color-difference problem of the fusion zone by directionally changing the local color and brightness of the projection area, thereby effectively reducing the color difference of the projection fusion zone, ensuring a smooth color gradient across the picture, reducing cost and improving efficiency.
The method changes the stretching coefficient of the picture area in real time and corrects distortion by writing a shader based on the Unity engine; it is flexible and convenient, highly feasible, stable in performance, and suitable for rendering various three-dimensional scenes.
The invention is convenient to use: the script is mounted on the camera, and the shader effect is changed by adjusting the coefficients.
Drawings
FIG. 1 is a flow diagram illustrating a method for shader-based implementation of multi-projector projection gradient blending in accordance with an exemplary embodiment;
FIG. 2 illustrates an original image before projection gradient blending processing, according to an example embodiment;
FIG. 3 is a schematic diagram of an image during a projection gradient fusion process performed on FIG. 2 using the method of the present invention;
fig. 4 is a final image after the projection gradient blending processing and rendering are performed on fig. 2 by using the method of the present invention.
Detailed Description
The invention is further illustrated by the following examples in conjunction with the accompanying drawings:
in order to clearly explain the technical features of the present invention, the following detailed description of the present invention is provided with reference to the accompanying drawings. The following disclosure provides many different embodiments, or examples, for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. It should be noted that the components illustrated in the figures are not necessarily drawn to scale. Descriptions of well-known components and processing techniques and procedures are omitted so as to not unnecessarily limit the invention.
As shown in fig. 1 to 4, a method for implementing multi-projector projection gradient fusion based on a shader according to an embodiment of the present invention includes the following steps:
establishing a scene image in a three-dimensional engine according to a site screen and a projection requirement;
establishing a material ball file and a real-time rendering program, wherein the material ball file is used as a calling object of the real-time rendering program;
designing a shader according to a scene image to be loaded;
assigning the field circular screen width, the field overlapping band width and the overlapping band color adjusting variable to a material ball file;
acquiring a scene image to be loaded from video stream data of a projector in real time, wherein fig. 2 is an original image before projection gradient fusion processing, and the middle part (between two vertical lines) of fig. 2 is an overlapped band image needing gradient fusion processing;
performing projection gradient fusion processing on the scene image by using a shader in a material ball file, and fig. 3 is an image in the process of performing projection gradient fusion processing on the scene image in fig. 2 by using the method of the invention;
and calling the real-time rendering program to render and project, in real time, the scene image after the projection gradient fusion processing; fig. 4 shows the final projected image after fig. 2 has been subjected to the projection gradient fusion processing and rendering, so that the overlap band color problem caused by multiple projectors on site can be adjusted visually.
As a possible implementation manner of this embodiment, the method further includes the following steps:
and arranging a plurality of projectors according to the on-site screen and the projection requirements.
As one possible implementation of this embodiment, the onsite screen includes an annular projection screen.
As a possible implementation manner of this embodiment, the real-time rendering program is mounted on a camera installed with a three-dimensional engine.
As a possible implementation manner of this embodiment, an input interface of the material ball file is opened, and the screen width, the overlap band width and the overlap band color adjustment variables are exposed in the established real-time rendering program, so that an on-site user can conveniently unify the visual effect by adjusting these visual parameters in real time according to the actual overlap effect on the circular screen.
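As an illustration only, such an exposed interface could take the form of a Unity ShaderLab Properties block plus matching CG uniform declarations; the property names _ScreenWidth, _OverlapWidth, _SLtoOCDistance and _OverlapColorAdjust below are assumptions for this sketch and are not specified by the patent:
Properties
{
    _MainTex ("Texture", 2D) = "white" {}                          // scene image to be loaded
    _Main_Color ("Main Color", Color) = (1,1,1,1)                   // brightness and color adjustment parameter
    _ScreenWidth ("Screen Width", Float) = 20                       // ScreenWidth: actual width of the circular screen
    _OverlapWidth ("Overlap Width", Float) = 2                      // OverlapWidth: physical width of the overlap band
    _SLtoOCDistance ("Left Edge To Overlap Center", Float) = 10     // SLtoOCdistance
    _OverlapColorAdjust ("Overlap Color Adjust", Range(0,1)) = 0.5  // overlap band color adjustment variable
}
The real-time rendering program can then assign the on-site values to the material ball file through the standard material interface, for example material.SetFloat("_OverlapColorAdjust", value) in a Unity script.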
As a possible implementation manner of this embodiment, the shader performs fusion calculation on the pixel colors of the scene image to be loaded, which exist in the overlapping band. The shader may be written in a computer graphics language.
As a possible implementation manner of this embodiment, the performing projection gradient fusion processing on the scene image by using a shader in the material ball file includes:
acquiring the actual width ScreenWidth of the field loop screen;
acquiring the width overlapWidth of an overlapped band caused by the projection of multiple projectors;
acquiring the distance SLtoOCdistance between the leftmost side of a field screen and the center of the overlapping band;
calculating the abscissa of the center of the overlap band on the screen, M0 = SLtoOCdistance/ScreenWidth, and the width of the overlap band on the screen, OWidth = OverlapWidth/ScreenWidth;
the scene width and the scene height are respectively converted into U and V coordinates, and then the coordinates of the image sampling point are (x0, y0);
according to the image acquisition point and the abscissa M0 of the central point of the overlap band, constructing a curve model of gradually changing color difference, and obtaining a linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band;
reading the UV coordinates of the image acquisition point pixels in the vertex function of the shader: o.uv = v.uv;
acquiring a preliminary image pixel output by using the UV coordinates of the image acquisition point pixels;
calculating the gradient degree of the chromatic aberration of the overlapped band by using a linear relation mathematical model;
and performing mixed calculation on the primary image pixel output and the calculated gradient degree of the overlapped band chromatism to obtain a scene image after projection gradient fusion processing.
Since the scene image is used as the sampling image of the shader, the scene width and the scene height are converted into U and V respectively during sampling, and U and V range from 0 to 1 in the shader; therefore the abscissa of the center of the overlap band on the screen is M0 = SLtoOCdistance/ScreenWidth, and the width of the overlap band on the screen is OWidth = OverlapWidth/ScreenWidth.
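A minimal CG sketch of this normalization step, assuming the hypothetical uniform names introduced above, with the two statements placed inside the fragment function:
float _ScreenWidth;       // actual width of the on-site circular screen
float _OverlapWidth;      // physical width of the overlap band
float _SLtoOCDistance;    // distance from the leftmost screen edge to the overlap band center

// inside the fragment function: normalize both quantities into UV space (0..1)
float M0     = _SLtoOCDistance / _ScreenWidth;   // abscissa of the overlap band center on screen
float OWidth = _OverlapWidth   / _ScreenWidth;   // width of the overlap band on screen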
As a possible implementation manner of the embodiment, according to the characteristics of the overlap band caused by multiple projectors, the gradient degree of the overlap band color difference has a linear variation relationship with the distance between the abscissa of the image acquisition point and the abscissa M0 of the center point of the overlap band, and 0 < gradient degree of the overlap band color difference <= 1. The constructing, according to the image acquisition point and the abscissa M0 of the central point of the overlap band, a curve model of gradually changing color difference, and obtaining a linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band, comprises:
(1) When M0 <= x0 <= M0 + OWidth*0.5:
based on the principle that two points determine a straight line, the line passing through the two points (M0 + OWidth/2, 1) and (M0, overlap band color adjustment variable) is used to construct the curve model of the region of gradually changing color difference:
overlap band color adjustment variable = K1*M0 + B1 (1)
1 = K1*(M0 + OWidth*0.5) + B1 (2)
where K1 is a coefficient and B1 is a constant;
from formula (1) and formula (2):
K1 = 2*(1 - overlap band color adjustment variable)/OWidth;
B1 = overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0;
according to K1 and B1, the linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band is:
gradient degree of the overlap band color difference = 2*(1 - overlap band color adjustment variable)/OWidth*x0 + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0;
(2) When M0 - OWidth/2 <= x0 <= M0:
based on the principle that two points determine a straight line, the line passing through the two points (M0 - OWidth/2, 1) and (M0, overlap band color adjustment variable) is used to construct the curve model of the region of gradually changing color difference:
overlap band color adjustment variable = K2*M0 + B2 (3)
1 = K2*(M0 - OWidth*0.5) + B2 (4)
where K2 is a coefficient and B2 is a constant;
from formula (3) and formula (4):
K2 = 2*(overlap band color adjustment variable - 1)/OWidth;
B2 = overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0;
according to K2 and B2, the linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band is:
gradient degree of the overlap band color difference = 2*(overlap band color adjustment variable - 1)/OWidth*x0 + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0.
Reading the UV coordinates of the image capture point pixels in the vertex function of the shader by: o.uv = v.uv.
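For context, a minimal Unity CG vertex function of this kind (a sketch assuming the common appdata/v2f struct layout and UnityCG.cginc, not text taken from the patent) simply transforms the vertex and passes the sampling UV through unchanged:
struct appdata { float4 vertex : POSITION; float2 uv : TEXCOORD0; };
struct v2f     { float4 pos : SV_POSITION; float2 uv : TEXCOORD0; };

v2f vert (appdata v)
{
    v2f o;
    o.pos = UnityObjectToClipPos(v.vertex);  // object space to clip space
    o.uv  = v.uv;                            // read the UV coordinates of the image acquisition point
    return o;
}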
As a possible implementation manner of this embodiment, the preliminary image pixel output is:
col=tex2D(_MainTex,float2(u,v))*_Main_Color
wherein the tex2D() function is the function used to sample a texture map in the CG program; _MainTex is a user-defined CG variable referring to the main texture to be sampled; float2() is a CG built-in type representing a two-component float vector; and _Main_Color is a user-defined four-component RGBA variable of the Color type, used as a brightness and color adjustment parameter.
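For this sampling line to compile, the shader would also declare the two variables it references; a sketch of the standard Unity CG declarations (an assumption, not quoted from the patent):
sampler2D _MainTex;     // main texture: the scene image sampled by tex2D()
float4    _Main_Color;  // RGBA brightness and color adjustment parameter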
As a possible implementation manner of this embodiment, the calculating the gradient degree of the color difference of the overlap band by using a linear relation mathematical model includes:
when M0 <= x0 <= M0 + OWidth*0.5:
gradient degree of the overlap band color difference R = 2*(1 - overlap band color adjustment variable)/OWidth*o.u + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0, wherein o.u is the abscissa of the o.uv coordinates;
when M0 - OWidth/2 <= x0 <= M0:
gradient degree of the overlap band color difference R = 2*(overlap band color adjustment variable - 1)/OWidth*o.u + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0.
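One possible CG rendering of this piecewise computation, as a sketch under the naming assumptions above (_OverlapColorAdjust stands for the overlap band color adjustment variable, o.uv.x corresponds to the patent's o.u, and R is assumed to stay 1 outside the overlap band so that the picture there is unchanged):
float u = o.uv.x;   // abscissa of the current image acquisition point (o.u)
float R = 1.0;      // outside the overlap band the pixel color is left unchanged
if (u >= M0 && u <= M0 + OWidth * 0.5)
{
    // right half of the overlap band: R rises from _OverlapColorAdjust at M0 to 1 at the band edge
    R = 2.0 * (1.0 - _OverlapColorAdjust) / OWidth * u
        + _OverlapColorAdjust - 2.0 * (1.0 - _OverlapColorAdjust) / OWidth * M0;
}
else if (u >= M0 - OWidth * 0.5 && u <= M0)
{
    // left half of the overlap band: R falls from 1 at the band edge to _OverlapColorAdjust at M0
    R = 2.0 * (_OverlapColorAdjust - 1.0) / OWidth * u
        + _OverlapColorAdjust - 2.0 * (_OverlapColorAdjust - 1.0) / OWidth * M0;
}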
As a possible implementation manner of this embodiment, the scene image after the projection gradient fusion processing is:
FinalCol=col.rgb*R
wherein col.rgb is the RGB component of the preliminary image pixel output col, and R is the gradient degree of the overlap band color difference.
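In the fragment function this final blend is a per-channel multiplication; a minimal sketch under the same assumptions:
float3 FinalCol = col.rgb * R;    // scale only the RGB channels by the gradient degree
return float4(FinalCol, col.a);   // keep the original alpha of the preliminary output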
The invention solves the color-difference problem of the fusion zone by directionally changing the local color and brightness of the projection area, thereby effectively reducing the color difference of the projection fusion zone, ensuring a smooth color gradient across the picture, reducing cost and improving efficiency.
The method changes the stretching coefficient of the picture area in real time and corrects distortion by writing a shader based on the Unity engine; it is flexible and convenient, highly feasible, stable in performance, and suitable for rendering various three-dimensional scenes.
Finally, it should be noted that: the above embodiments are only for illustrating the technical solutions of the present invention and not for limiting the same, and although the present invention is described in detail with reference to the above embodiments, those of ordinary skill in the art should understand that: modifications and equivalents may be made to the embodiments of the invention without departing from the spirit and scope of the invention, which is to be covered by the claims.

Claims (8)

1. A method for realizing multi-projector projection gradual change fusion based on a shader is characterized by comprising the following steps:
establishing a scene image in a three-dimensional engine according to a site screen and a projection requirement;
establishing a material ball file and a real-time rendering program, wherein the material ball file is used as a calling object of the real-time rendering program;
designing a shader according to a scene image to be loaded;
assigning the field circular screen width, the field overlapping band width and the overlapping band color adjusting variable to a material ball file;
acquiring a scene image to be loaded from video stream data of a projector in real time;
performing projection gradient fusion processing on the scene image by adopting a shader in the material ball file;
calling a real-time rendering program, and performing real-time rendering and projection on the scene image subjected to projection gradient fusion processing;
the adoption of the shader in the material ball file to perform projection gradual change fusion processing on the scene image comprises the following steps:
acquiring the actual width ScreenWidth of the field loop screen;
acquiring the width overlapWidth of an overlapped band caused by the projection of multiple projectors;
acquiring the distance SLtoOCdistance between the leftmost side of a field screen and the center of the overlapping band;
calculating the abscissa of the center of the overlap band on the screen, M0 = SLtoOCdistance/ScreenWidth, and the width of the overlap band on the screen, OWidth = OverlapWidth/ScreenWidth;
the scene width and the scene height are respectively converted into U and V coordinates, and then the coordinates of the image sampling point are (x0, y0);
according to the image acquisition point and the abscissa M0 of the central point of the overlap band, constructing a curve model of gradually changing color difference, and obtaining a linear relation mathematical model between the gradient degree of the overlap band color difference and the abscissa x0 of the image acquisition point in the overlap band;
reading the UV coordinates of the image acquisition point pixels in the vertex function of the shader: o.uv = v.uv;
acquiring a preliminary image pixel output by using the UV coordinates of the image acquisition point pixels;
calculating the gradient degree of the chromatic aberration of the overlapped band by using a linear relation mathematical model;
performing mixed calculation on the primary image pixel output and the calculated gradient degree of the overlapped band chromatism to obtain a scene image after projection gradient fusion processing;
when M0 <= x0 <= M0 + OWidth*0.5, the curve model of the color difference gradient is:
overlap band color adjustment variable = K1*M0 + B1 (1)
1 = K1*(M0 + OWidth*0.5) + B1 (2)
and the linear relation mathematical model is:
gradient degree of the overlap band color difference = 2*(1 - overlap band color adjustment variable)/OWidth*x0 + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0;
when M0 - OWidth/2 <= x0 <= M0, the curve model of the color difference gradient is:
overlap band color adjustment variable = K2*M0 + B2 (3)
1 = K2*(M0 - OWidth*0.5) + B2 (4)
and the linear relation mathematical model is:
gradient degree of the overlap band color difference = 2*(overlap band color adjustment variable - 1)/OWidth*x0 + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0;
in the formulas, K1 and K2 are coefficients, and B1 and B2 are constants.
2. The shader-based method for implementing multi-projector projection gradient blending according to claim 1, further comprising the steps of:
and arranging a plurality of projectors according to the on-site screen and the projection requirements.
3. The shader-based method for implementing multi-projector projection gradient blending of claim 1, wherein the real-time rendering program is mounted to a camera with a three-dimensional engine installed thereon.
4. The shader-based method for implementing multi-projector projection gradient blending of claim 1, wherein an input interface to a texture ball file is opened.
5. The shader-based method for implementing multi-projector projection gradient blending of claim 1, wherein the shader performs blending calculations on the colors of pixels of the scene image to be loaded that exist in the overlap region.
6. The shader-based method for implementing multi-projector projection gradient blending according to any one of claims 1 to 5, wherein the preliminary image pixel outputs are:
col=tex2D(_MainTex,float2(u,v))*_Main_Color
wherein the tex2D() function is the function used to sample a texture map in the CG program; _MainTex is a user-defined CG variable referring to the main texture to be sampled; float2() is a CG built-in type representing a two-component float vector; and _Main_Color is a user-defined four-component RGBA variable of the Color type, used as a brightness and color adjustment parameter.
7. The shader-based method for implementing multi-projector projection gradient blending of claim 6, wherein the calculating the overlap band color gradient using a linear relationship mathematical model includes:
when M0 <= x0 <= M0 + OWidth*0.5:
gradient degree of the overlap band color difference R = 2*(1 - overlap band color adjustment variable)/OWidth*o.u + overlap band color adjustment variable - 2*(1 - overlap band color adjustment variable)/OWidth*M0, wherein o.u is the abscissa of the o.uv coordinates;
when M0 - OWidth/2 <= x0 <= M0:
gradient degree of the overlap band color difference R = 2*(overlap band color adjustment variable - 1)/OWidth*o.u + overlap band color adjustment variable - 2*(overlap band color adjustment variable - 1)/OWidth*M0.
8. The shader-based method for implementing multi-projector projection gradient blending of claim 7, wherein the scene image after projection gradient blending processing is:
FinalCol=col.rgb*R
wherein col.rgb is the RGB component of the preliminary image pixel output col, and R is the gradient degree of the overlap band color difference.
CN202210051321.9A 2022-01-18 2022-01-18 Method for realizing multi-projector projection gradual change fusion based on shader Active CN114071104B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210051321.9A CN114071104B (en) 2022-01-18 2022-01-18 Method for realizing multi-projector projection gradual change fusion based on shader

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210051321.9A CN114071104B (en) 2022-01-18 2022-01-18 Method for realizing multi-projector projection gradual change fusion based on shader

Publications (2)

Publication Number Publication Date
CN114071104A CN114071104A (en) 2022-02-18
CN114071104B true CN114071104B (en) 2022-04-19

Family

ID=80231235

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210051321.9A Active CN114071104B (en) 2022-01-18 2022-01-18 Method for realizing multi-projector projection gradual change fusion based on shader

Country Status (1)

Country Link
CN (1) CN114071104B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114463475B (en) * 2022-04-08 2022-07-19 山东捷瑞数字科技股份有限公司 Edge correction-based multi-camera rendering image fusion method
CN115115644B (en) * 2022-08-31 2022-11-15 启东市德立神起重运输机械有限公司 Vehicle welding defect detection method based on artificial intelligence

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9369683B2 (en) * 2010-11-15 2016-06-14 Scalable Display Technologies, Inc. System and method for calibrating a display system using manual and semi-manual techniques
CN104954715B (en) * 2015-07-06 2017-12-05 山东大学 Method based on the GPU special-shaped screen multi-projection system fusion video playbacks accelerated
JP2017085446A (en) * 2015-10-30 2017-05-18 キヤノン株式会社 Projection device, projection method and projection system
US10136055B2 (en) * 2016-07-29 2018-11-20 Multimedia Image Solution Limited Method for stitching together images taken through fisheye lens in order to produce 360-degree spherical panorama
US10417810B2 (en) * 2017-05-31 2019-09-17 Verizon Patent And Licensing Inc. Methods and systems for rendering virtual reality content based on two-dimensional (“2D”) captured imagery of a three-dimensional (“3D”) scene
US11049218B2 (en) * 2017-08-11 2021-06-29 Samsung Electronics Company, Ltd. Seamless image stitching
CN113506214B (en) * 2021-05-24 2023-07-21 南京莱斯信息技术股份有限公司 Multi-path video image stitching method
CN113256777B (en) * 2021-06-28 2021-09-28 山东捷瑞数字科技股份有限公司 Method for playing and adjusting dome screen based on computer graphics

Also Published As

Publication number Publication date
CN114071104A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
CN114071104B (en) Method for realizing multi-projector projection gradual change fusion based on shader
US6793350B1 (en) Projecting warped images onto curved surfaces
JP6037375B2 (en) Image projection apparatus and image processing method
Majumder et al. Perceptual photometric seamlessness in projection-based tiled displays
Raskar et al. Quadric transfer for immersive curved screen displays
CN111192552B (en) Multi-channel LED spherical screen geometric correction method
CN104954769A (en) Immersion type ultra-high-definition video processing system and method
WO2018196472A1 (en) Projection method, apparatus and system, and storage medium
JP5236219B2 (en) Distortion correction and integration method using divided imaging, mapping function generation method therefor, distortion correction and integration device using divided imaging, and mapping function generation apparatus therefor
JP2015097350A (en) Image processing apparatus and multi-projection system
CN101727880A (en) Projection fusion method of true seamless rear projection large screen display image
CN103037189A (en) Method to achieve integrate output of large-size screen video images through much projection
CN101621701A (en) Correcting method of multiple projector display wall colors of arbitrary smooth curve screens independent of geometric correction
CN107635120A (en) A kind of method of multiple channel ball curtain Geometry rectification and Fusion Edges
CN112118435B (en) Multi-projection fusion method and system for special-shaped metal screen
CN111770326B (en) Indoor three-dimensional monitoring method for panoramic video projection
CN201846426U (en) Multi-image automatic geometry and edge blending system based on photography
JP4751084B2 (en) Mapping function generation method and apparatus, and composite video generation method and apparatus
CN107424206A (en) A kind of interactive approach that the performance of virtual scene shadow is influenceed using actual environment
JP2005039849A (en) Video display system
JP6466354B2 (en) Apparatus, video projection apparatus, video projection system, video display apparatus, video generation apparatus, method thereof, and program
Zoido et al. Optimized methods for multi-projector display correction
JP6430420B2 (en) Information presentation system and information presentation method
CN114401388A (en) Projection method, projection device, storage medium and projection equipment
CN115883805B (en) Multi-projector picture splicing color fusion method, fusion device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant