CN117314737A - Space projection reuse acceleration method for panoramic stereo picture rendering - Google Patents


Info

Publication number
CN117314737A
CN117314737A (application CN202311312163.9A)
Authority
CN
China
Prior art keywords
point
panoramic image
cylindrical projection
pixel
equidistant cylindrical
Prior art date
Legal status
Pending
Application number
CN202311312163.9A
Other languages
Chinese (zh)
Inventor
陈纯毅
于海洋
胡勇
李延风
胡小娟
范宇
Current Assignee
Beihang University
Changchun University of Science and Technology
Original Assignee
Beihang University
Changchun University of Science and Technology
Priority date
Filing date
Publication date
Application filed by Beihang University and Changchun University of Science and Technology
Publication of CN117314737A


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 15/00: 3D [Three Dimensional] image rendering
    • G06T 15/005: General purpose rendering architectures
    • G06T 15/04: Texture mapping
    • G06T 15/10: Geometric effects
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a spatial-projection reuse acceleration method for panoramic stereoscopic picture rendering. When rendering a panoramic stereoscopic picture of a three-dimensional scene, the method projects the pixel colors of the equidistant cylindrical projection panoramic image corresponding to the left eye into the equidistant cylindrical projection panoramic image corresponding to the right eye for reuse, thereby saving the rendering computation cost of the right-eye image. During spatial-projection reuse, the method guarantees a correct panoramic stereoscopic effect and correct occlusion, and the finally generated right-eye equidistant cylindrical projection panoramic image is free of projection holes, occlusion errors, and similar problems.

Description

Space projection reuse acceleration method for panoramic stereo picture rendering
Technical Field
The invention relates to a spatial-projection reuse acceleration method for panoramic stereoscopic picture rendering, and belongs to the technical field of three-dimensional graphics rendering.
Background
Chinese patent application No. 202010841798.8 discloses a method for generating and interactively displaying spherical panoramic stereoscopic images of a virtual 3D scene, and that application uses ray tracing. When rendering a three-dimensional scene with ray tracing, one must compute the position where a ray emitted from the viewpoint first intersects a three-dimensional scene object; this intersection is called a visible scene point, i.e., a scene point directly visible from the viewpoint position. A ray is traced for each pixel of the three-dimensional scene image, so each pixel corresponds to one visible scene point, and the color of the pixel depends on the color of that point. A panoramic stereoscopic picture comprises two images, the equidistant cylindrical projection panoramic image corresponding to the left eye and the one corresponding to the right eye, and the two are strongly similar. Therefore, if the left-eye equidistant cylindrical projection panoramic image is generated first, some of its results can be reused via spatial projection when generating the right-eye image, saving rendering computation and increasing rendering speed.
At present, a horizontal 360 degree by vertical 180 degree panoramic spherical image signal is usually stored as a two-dimensional rectangular image in the equidistant cylindrical projection format. Chinese patent application 202010841798.8 provides a spherical panoramic stereo camera model, shown in FIG. 1, comprising a virtual imaging sphere and a viewing circle. The radius of the virtual imaging sphere is R and the radius of the viewing circle is r; the center of the virtual imaging sphere is at the origin of the u-v-w coordinate system, the viewing circle lies in the u-o-v plane of that system, and the center of the viewing circle is also at the origin. First, the coordinates of each pixel sampling point on the virtual imaging sphere are computed by sampling the azimuth and polar angles at equal angular intervals. Then, for each pixel sampling point on the sphere, the corresponding left-eye and right-eye viewpoint positions are computed from the parameters of the viewing circle. Using ray tracing, a ray is emitted from the left-eye viewpoint through the pixel sampling point and its propagation through the three-dimensional scene is traced, yielding the color of the sampling point as seen by the left eye; likewise, a ray is emitted from the right-eye viewpoint through the sampling point and traced to obtain its color as seen by the right eye. Equidistant cylindrical projection panoramic images are used to store the color of each pixel sampling point on the virtual imaging sphere.
Each pixel sampling point on the virtual imaging sphere corresponds to a specific left-eye viewpoint c_L and right-eye viewpoint c_R, which move along the viewing circle as the pixel sampling point on the virtual imaging sphere changes. As shown in FIG. 1, the center of the virtual imaging sphere is at the origin of the u-v-w coordinate system, a coordinate system whose origin is the center of the viewing circle of the spherical panoramic stereo camera. Points a_1 and a_2 are two pixel sampling points on the virtual imaging sphere; the ray from the left-eye viewpoint c_L through point a_1 intersects the three-dimensional scene at point p, and the ray from the right-eye viewpoint c_R through point a_2 also intersects the scene at point p. Assume here that the line segment from c_L to p intersects the three-dimensional scene only at its endpoints, and likewise for the segment from c_R to p. Then scene point p is imaged at point a_1 in the spherical panorama corresponding to the left eye and at point a_2 in the spherical panorama corresponding to the right eye.
As shown in FIG. 1, the coordinates of point a_1 on the virtual imaging sphere may be represented either in rectangular coordinates or in spherical coordinates. The spherical coordinates can be written in the form (r, θ, φ), where r is the distance from point a_1 to the coordinate origin o, θ is the angle between the directed line segment o a_1 and the positive w-axis (the polar angle), and φ is the angle turned counter-clockwise from the u-axis, viewed from the positive w-axis, until coinciding with line segment om (the azimuth), where point m is the projection of point a_1 onto the u-o-v plane; the u-o-v plane is the plane through the origin of the u-v-w coordinate system parallel to the u-axis and the v-axis.
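The spherical-coordinate convention described above (polar angle θ from the positive w-axis, azimuth φ counter-clockwise from the u-axis) can be sketched in Python. This is an illustrative helper of ours, not code from the patent; the normalization of φ to [0, 2π) is our assumption:

```python
import math

def to_spherical(u, v, w):
    """Convert rectangular (u, v, w) to spherical (r, theta, phi).

    theta: polar angle, measured from the positive w-axis.
    phi:   azimuth, measured counter-clockwise from the positive u-axis
           as seen from the positive w-axis, normalized to [0, 2*pi).
    """
    r = math.sqrt(u * u + v * v + w * w)
    theta = math.acos(w / r)                # polar angle in [0, pi]
    phi = math.atan2(v, u) % (2 * math.pi)  # azimuth in [0, 2*pi)
    return r, theta, phi
```

For example, a point on the positive u-axis has θ = π/2 and φ = 0, and a point on the positive w-axis has θ = 0.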
Disclosure of Invention
The invention aims to provide a spatial-projection reuse acceleration method for panoramic stereoscopic picture rendering. When rendering a panoramic stereoscopic picture of a three-dimensional scene, the method projects, by spatial projection, the pixel colors of the equidistant cylindrical projection panoramic image corresponding to the left eye into the appropriate pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye, thereby saving the rendering computation cost of the right-eye image. The technical scheme of the method is realized as follows:
A data structure DSTLTOR is provided for storing color values and distance data; it comprises two member variables, a color value PIXCOLOR and a distance VDIS. The specific implementation steps are as follows.
1) Using ray tracing, generate the equidistant cylindrical projection panoramic image corresponding to the left eye from the spherical panoramic stereo camera and the three-dimensional scene model, with the following specific steps:
Step101: calculating the virtual imaging sphere according to the equiangular interval sampling mode of azimuth angle and polar angleCoordinates of each pixel sampling point, and storing colors of the pixel sampling points on the virtual imaging sphere by using the equidistant cylindrical projection panoramic image; as shown in fig. 2, the equidistant cylindrical projection panoramic image totally comprises N rows and 2N columns of pixels, N being a positive integer; rectangular coordinates of pixel sampling points on a virtual imaging sphere corresponding to the ith row and the jth column of pixels of the equidistant cylindrical projection panoramic image in a u-v-w coordinate system are (u) ij , v ij , w ij ),u ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N),v ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N),w ij = R×cos(i×π/N+0.5×π/N);i = 0,1,…,N-1;j = 0,1, …,2N-1;
Step102: creating a two-dimensional array IML containing N rows and 2N columns of elements in a memory of a computer, wherein each element of the two-dimensional array IML stores a color value of one pixel of the equidistant cylindrical projection panoramic image corresponding to the left eye, and the elements of the two-dimensional array IML are in one-to-one correspondence with the pixels of the equidistant cylindrical projection panoramic image corresponding to the left eye; creating a two-dimensional array VPOINTS L containing N rows and 2N columns of elements in a memory of a computer, wherein each element of the two-dimensional array VPOINTS L stores a visual field spot coordinate corresponding to one pixel of the equidistant cylindrical projection panoramic image corresponding to the left eye, and the elements of the two-dimensional array VPOINTS L correspond to the pixels of the equidistant cylindrical projection panoramic image corresponding to the left eye one by one; initializing the value of each element of the two-dimensional array VPOINTS L to a coordinate (M X , M Y , M Z ),M X 、M Y And M Z Equal to 2 128
Step103: for each pixel A001 of the equidistant cylindrical projection panoramic image corresponding to the left eye, the following operations are respectively performed:
step103-1: calculating a row number i and a column number j of the pixel A001 in the equidistant cylindrical projection panoramic image corresponding to the left eye;
Step103-2: Compute the rectangular coordinates (u_ij, v_ij, w_ij), in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the equidistant cylindrical projection panoramic image corresponding to the left eye; compute the polar angle θ and azimuth φ of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij);
Step103-3: As shown in FIG. 3, compute the tangent points c_L and c_R of the two tangents to the viewing circle of the spherical panoramic stereo camera passing through point Q, where the u-coordinate of Q is q_u = R×cos(φ), the v-coordinate is q_v = R×sin(φ), and the w-coordinate is q_w = 0. Tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere for pixel A001, and tangent point c_R is the corresponding right-eye viewpoint. In the u-v-w coordinate system, c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖, b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖, l is the distance from point Q to the origin of the u-v-w coordinate system, V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q, the 3×3 rotation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1], the rotation angle β = arctan[r/(l²−r²)^(1/2)], and ‖x‖ denotes the length of vector x;
Step103-4: Using a coordinate transformation, compute the rectangular coordinates of tangent point c_L in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; likewise, from the rectangular coordinates (u_ij, v_ij, w_ij) in the u-v-w coordinate system, compute the world coordinates of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the left-eye image. In the x-y-z world coordinate system, emit a ray LRay from tangent point c_L through that pixel sampling point, trace the propagation of LRay in the three-dimensional scene using ray tracing, record the position coordinate SECCOORL of the first intersection of LRay with a three-dimensional scene object, compute the brightness value LRAD arriving at tangent point c_L along the direction opposite to LRay, convert LRAD into the panoramic-image pixel color value LColor, and store LColor in the element in row i, column j of the two-dimensional array IML; store SECCOORL in the element in row i, column j of the two-dimensional array VPOINTSL;
step103-5: the operation for the pixel a001 ends.
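The tangent-point construction of Step103-3 can be sketched as follows. This is an illustrative Python sketch of the stated formulas, not the patent's code; the function name is ours, and it assumes R > r so the square root is real:

```python
import math

def eye_viewpoints(phi, R, r):
    """Left/right-eye viewpoints for the sample direction with azimuth phi,
    per the formulas of Step103-3 (sketch; assumes R > r).

    Q = (R*cos(phi), R*sin(phi), 0) lies in the u-o-v plane; c_L and c_R are
    the tangent points, on the viewing circle of radius r, of the two
    tangents through Q: c_L = Q - b_L, c_R = Q - b_R."""
    Q = (R * math.cos(phi), R * math.sin(phi), 0.0)
    l = R                         # distance from Q to the origin
    d = math.sqrt(l * l - r * r)  # tangent length (l^2 - r^2)^(1/2)
    beta = math.atan2(r, d)       # rotation angle arctan[r / d]

    def rot(b, vec):
        # Apply M(b) = [cos b, sin b, 0; -sin b, cos b, 0; 0, 0, 1] to vec.
        return (math.cos(b) * vec[0] + math.sin(b) * vec[1],
                -math.sin(b) * vec[0] + math.cos(b) * vec[1],
                vec[2])

    def scaled_unit(vec, s):
        n = math.sqrt(sum(c * c for c in vec))
        return tuple(s * c / n for c in vec)

    bL = scaled_unit(rot(beta, Q), d)   # b_L = d * M(beta)*V / ||M(beta)*V||
    bR = scaled_unit(rot(-beta, Q), d)  # b_R = d * M(-beta)*V / ||M(-beta)*V||
    cL = tuple(q - b for q, b in zip(Q, bL))
    cR = tuple(q - b for q, b in zip(Q, bR))
    return cL, cR
```

Both returned points lie on the viewing circle (their distance to the origin equals r), one on each side of Q's direction, which serves as a quick sanity check on the formulas.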
2) Project the rendering results of the equidistant cylindrical projection panoramic image corresponding to the left eye onto the pixels of the image corresponding to the right eye by spatial projection, with the following specific steps:
Step201: Create in computer memory a two-dimensional array VPRLIST containing N rows and 2N columns of elements, whose elements are in one-to-one correspondence with the pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye. Each element of VPRLIST stores a spatial projection result list PLIST1 for one pixel of the right-eye image, and each element of PLIST1 stores a variable of data structure type DSTLTOR. Initialize the list PLIST1 stored in every element of VPRLIST to empty;
Step202: For each pixel A001 of the equidistant cylindrical projection panoramic image corresponding to the left eye, perform the following operations:
Step202-1: Compute the row number i and column number j of pixel A001 in the equidistant cylindrical projection panoramic image corresponding to the left eye. Let VPLLC denote the visible-scene-point coordinates stored in the element in row i, column j of the two-dimensional array VPOINTSL, and let V_px, V_py and V_pz denote the x-, y- and z-components of VPLLC. Compute the polar angle θ and azimuth φ of the spherical coordinates corresponding to the rectangular coordinates (V_px, V_py, V_pz);
Step202-2: As shown in FIG. 3, compute the tangent points c_L and c_R of the two tangents to the viewing circle of the spherical panoramic stereo camera passing through point Q, where the u-coordinate of Q is q_u = R×cos(φ), the v-coordinate is q_v = R×sin(φ), and the w-coordinate is q_w = 0. In the u-v-w coordinate system, c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖, b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖, l is the distance from point Q to the origin of the u-v-w coordinate system, V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q, the 3×3 rotation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1], and the rotation angle β = arctan[r/(l²−r²)^(1/2)];
Step202-3: Using a coordinate transformation, compute the rectangular coordinates of tangent point c_R in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system. Let v_RayB0 denote the vector from tangent point c_R to the point corresponding to VPLLC; let P_i′j′ denote the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i′, column j′ of the equidistant cylindrical projection panoramic image corresponding to the right eye; let c_Ri′j′ denote the right-eye viewpoint corresponding to P_i′j′; and let v_i′j′ denote the vector from c_Ri′j′ to P_i′j′. Let i_m and j_m be the values of the subscripts i′ and j′ that maximize v_RayB0·v_i′j′/(‖v_RayB0‖·‖v_i′j′‖), where i′ ranges over the integer set {0, 1, …, N−1} and j′ over the integer set {0, 1, …, 2N−1};
Step202-4: If the list stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST is empty, go to Step202-4-1; otherwise go to Step202-4-2;
Step202-4-1: Create in computer memory a variable vDSTLTOR of data structure type DSTLTOR; set its color value member PIXCOLOR equal to the value stored in the element in row i, column j of the two-dimensional array IML, and set its distance member VDIS equal to the distance from tangent point c_R to the point corresponding to VPLLC. Add vDSTLTOR to the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST; go to Step202-5;
Step202-4-2: Let TMP denote the variable of data structure type DSTLTOR stored in the first element of the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST. If the value of TMP's distance member VDIS is greater than the distance from tangent point c_R to the point corresponding to VPLLC, clear the list PLIST1 stored in the element in row i_m, column j_m of VPRLIST and go to Step202-4-1; otherwise go to Step202-5;
step202-5: the operation for the pixel a001 ends.
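The occlusion-aware bookkeeping of Step202-4 (each right-eye pixel keeps only the projection closest to the right-eye viewpoint) can be sketched as follows; the class and function names are ours, for illustration only:

```python
class DSTLTOR:
    """Per the description: a color value PIXCOLOR and a distance VDIS."""
    def __init__(self, pixcolor, vdis):
        self.PIXCOLOR = pixcolor
        self.VDIS = vdis

def project_left_to_right(vprlist, i_m, j_m, color, dist):
    """Sketch of Step202-4: record a left-eye color projected onto the
    right-eye pixel (i_m, j_m), keeping only the closest projection.
    vprlist: N x 2N grid of lists (the VPRLIST array);
    color:   left-eye pixel color (from IML);
    dist:    distance from c_R to the left-eye visible scene point."""
    plist = vprlist[i_m][j_m]
    if not plist:
        plist.append(DSTLTOR(color, dist))      # Step202-4-1: empty list
    elif plist[0].VDIS > dist:
        # Step202-4-2: the new projection is closer, so it occludes the
        # stored one; replace it.
        plist.clear()
        plist.append(DSTLTOR(color, dist))
    # Otherwise the stored projection is closer; discard the new one.
```

Keeping only the nearest projection per target pixel is what prevents wrong occlusion when several left-eye scene points project onto the same right-eye pixel.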
3) In the process of rendering the equidistant cylindrical projection panoramic image corresponding to the right eye, reuse the left-eye rendering results via spatial projection to complete the right-eye rendering, with the following specific steps:
Step301: Compute the coordinates of each pixel sampling point on the virtual imaging sphere by sampling the azimuth and polar angles at equal angular intervals, and use the equidistant cylindrical projection panoramic image to store the colors of the pixel sampling points on the virtual imaging sphere. As shown in FIG. 2, the equidistant cylindrical projection panoramic image contains N rows and 2N columns of pixels, N being a positive integer. The rectangular coordinates, in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j are (u_ij, v_ij, w_ij), where u_ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N), v_ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N), w_ij = R×cos(i×π/N+0.5×π/N); i = 0,1,…,N−1; j = 0,1,…,2N−1;
Step302: creating a two-dimensional array IMR containing N rows and 2N columns of elements in a memory of a computer, wherein each element of the two-dimensional array IMR stores a color value of one pixel of the equidistant cylindrical projection panoramic image corresponding to the right eye, and the elements of the two-dimensional array IMR are in one-to-one correspondence with the pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye;
step303: for each pixel B001 of the equidistant cylindrical projection panoramic image corresponding to the right eye, the following operations are respectively performed:
step303-1: calculating a row number i and a column number j of the pixel B001 in the equidistant cylindrical projection panoramic image corresponding to the right eye;
Step303-2: Compute the rectangular coordinates (u_ij, v_ij, w_ij), in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the equidistant cylindrical projection panoramic image corresponding to the right eye; compute the polar angle θ and azimuth φ of the spherical coordinates corresponding to (u_ij, v_ij, w_ij);
Step303-3: As shown in FIG. 3, compute the tangent points c_L and c_R of the two tangents to the viewing circle of the spherical panoramic stereo camera passing through point Q, where the u-coordinate of Q is q_u = R×cos(φ), the v-coordinate is q_v = R×sin(φ), and the w-coordinate is q_w = 0. Tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere for pixel B001, and tangent point c_R is the corresponding right-eye viewpoint. In the u-v-w coordinate system, c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖, b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖, l is the distance from point Q to the origin of the u-v-w coordinate system, V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q, the 3×3 rotation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1], and the rotation angle β = arctan[r/(l²−r²)^(1/2)];
Step303-4: Using a coordinate transformation, compute the rectangular coordinates of tangent point c_R in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; likewise, from the rectangular coordinates (u_ij, v_ij, w_ij) in the u-v-w coordinate system, compute the world coordinates of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the right-eye image. In the x-y-z world coordinate system, emit a ray RRay from tangent point c_R through that pixel sampling point and compute the position coordinate SECCOORR of the first intersection of RRay with a three-dimensional scene object. If the list PLIST1 stored in the element in row i, column j of the two-dimensional array VPRLIST is not empty, and the absolute difference between the distance from SECCOORR to the coordinates of c_R and the value of the distance member VDIS of the DSTLTOR variable stored in the first element of that list is smaller than the distance threshold TDis, go to Step303-5; otherwise, trace the propagation of RRay in the three-dimensional scene using ray tracing, compute the brightness value RRAD arriving at tangent point c_R along the direction opposite to RRay, convert RRAD into the panoramic-image pixel color value RColor, store RColor in the element in row i, column j of the two-dimensional array IMR, and go to Step303-6;
Step303-5: Assign to the element in row i, column j of the two-dimensional array IMR the value of the color value member PIXCOLOR of the variable of data structure type DSTLTOR stored in the first element of the list PLIST1 stored in the element in row i, column j of the two-dimensional array VPRLIST;
Step303-6: the operation for the pixel B001 ends.
4) Store the equidistant cylindrical projection panoramic images corresponding to the left eye and the right eye in computer disk files, with the following specific step:
Store the pixel color values held in the two-dimensional array IML into the computer disk file PIML as the equidistant cylindrical projection panoramic image data corresponding to the left eye, and store the pixel color values held in the two-dimensional array IMR into the computer disk file PIMR as the equidistant cylindrical projection panoramic image data corresponding to the right eye.
The positive effects of the invention are as follows: during rendering of a panoramic stereoscopic picture of a three-dimensional scene, the method projects the pixel colors of the equidistant cylindrical projection panoramic image corresponding to the left eye into the image corresponding to the right eye for reuse, saving the rendering computation cost of the right-eye image. During spatial-projection reuse, the method guarantees a correct panoramic stereoscopic effect and correct occlusion, and the finally generated right-eye equidistant cylindrical projection panoramic image is free of projection holes, occlusion errors, and similar problems.
Drawings
Fig. 1 is a schematic view of a spherical panoramic stereo camera.
Fig. 2 is a schematic diagram of an equidistant cylindrical projection panoramic image.
Fig. 3 is a schematic diagram of left and right eye views on a viewing circle corresponding to pixel sampling points on a virtual imaging sphere.
Detailed Description
In order to make the features and advantages of the method clearer, the method is further described below with reference to a specific example. This embodiment generates a spherical panoramic stereoscopic picture of a three-dimensional room scene model containing four walls, a ceiling, a floor, a table, a chair, and so on; all geometric objects in the three-dimensional room model are non-transparent diffuse reflectors.
The technical scheme of the method is realized as follows:
A data structure DSTLTOR is provided for storing color values and distance data; it comprises two member variables, a color value PIXCOLOR and a distance VDIS. The specific implementation steps are as follows.
1) Using ray tracing, generate the equidistant cylindrical projection panoramic image corresponding to the left eye from the spherical panoramic stereo camera and the three-dimensional scene model, with the following specific steps:
Step101: Compute the coordinates of each pixel sampling point on the virtual imaging sphere by sampling the azimuth and polar angles at equal angular intervals, and use the equidistant cylindrical projection panoramic image to store the colors of the pixel sampling points on the virtual imaging sphere. As shown in FIG. 2, the equidistant cylindrical projection panoramic image contains N rows and 2N columns of pixels, N being a positive integer. The rectangular coordinates, in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j are (u_ij, v_ij, w_ij), where u_ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N), v_ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N), w_ij = R×cos(i×π/N+0.5×π/N); i = 0,1,…,N−1; j = 0,1,…,2N−1;
Step102: create in computer memory a two-dimensional array IML containing N rows and 2N columns of elements; each element of IML stores the color value of one pixel of the left-eye equidistant cylindrical projection panoramic image, and the elements of IML correspond one-to-one with the pixels of that image. Create in computer memory a two-dimensional array VPOINTSL containing N rows and 2N columns of elements; each element of VPOINTSL stores the visible scene point coordinates corresponding to one pixel of the left-eye equidistant cylindrical projection panoramic image, and the elements of VPOINTSL correspond one-to-one with the pixels of that image. Initialize the value of each element of VPOINTSL to the coordinates (M_X, M_Y, M_Z), with M_X, M_Y, and M_Z all equal to 2^128;
Step103: for each pixel A001 of the equidistant cylindrical projection panoramic image corresponding to the left eye, the following operations are respectively performed:
step103-1: calculating a row number i and a column number j of the pixel A001 in the equidistant cylindrical projection panoramic image corresponding to the left eye;
step103-2: calculating rectangular coordinates (u) of pixel sampling points on a virtual imaging sphere corresponding to the ith row and the jth column of pixels of the equidistant cylindrical projection panoramic image corresponding to the left eye in a u-v-w coordinate system ij , v ij , w ij ) The method comprises the steps of carrying out a first treatment on the surface of the Calculating rectangular coordinates (u) ij , v ij , w ij ) Polar angle of corresponding spherical coordinatesθAnd azimuth angleφTwo coordinate component values;
Step103-3: as shown in fig. 3, compute the tangent points c_L and c_R of the two lines through point Q that are tangent to the viewing circle of the spherical panoramic stereo camera, where the u-coordinate component of point Q is q_u = R×cos(φ), the v-coordinate component is q_v = R×sin(φ), and the w-coordinate component is q_w = 0. Tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere for pixel A001, and tangent point c_R is the right-eye viewpoint corresponding to that pixel sampling point. The coordinates of c_L and c_R in the u-v-w coordinate system are computed as c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖ and b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖; l is the distance from point Q to the origin of the u-v-w coordinate system; V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q; the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1]; the rotation angle β = arctan[r/(l²−r²)^(1/2)]; ‖x‖ denotes the length of vector x;
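The tangent-point formulas of Step103-3 can be sketched as follows (a simplification that works directly in the u-v-w frame; the helper names are illustrative):

```python
import math

def tangent_points(phi, R, r):
    """Tangent points c_L, c_R of the two lines through
    Q = (R*cos(phi), R*sin(phi), 0) that touch the viewing circle of
    radius r, following c = Q - b with b = sqrt(l^2 - r^2) times the
    normalized vector M(+/-beta)*V, as in Step103-3."""
    Q = (R * math.cos(phi), R * math.sin(phi), 0.0)
    l = math.sqrt(Q[0] ** 2 + Q[1] ** 2 + Q[2] ** 2)  # distance origin -> Q
    k = math.sqrt(l * l - r * r)                      # tangent-line length
    beta = math.atan2(r, k)                           # arctan(r / sqrt(l^2 - r^2))

    def tangent(b):
        # M(b) @ Q with M(b) = [cos b, sin b, 0; -sin b, cos b, 0; 0, 0, 1]
        c, s = math.cos(b), math.sin(b)
        m = (c * Q[0] + s * Q[1], -s * Q[0] + c * Q[1], Q[2])
        n = math.sqrt(m[0] ** 2 + m[1] ** 2 + m[2] ** 2)
        return tuple(q - k * x / n for q, x in zip(Q, m))

    return tangent(beta), tangent(-beta)              # (c_L, c_R)
```

Both returned points lie on the viewing circle (their distance to the origin is exactly r), and the segment from each tangent point to Q is perpendicular to the corresponding radius, which is the defining property of a tangent line.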
Step103-4: using a coordinate transformation, compute the rectangular coordinates of tangent point c_L in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; likewise, using a coordinate transformation, compute the rectangular coordinates in the x-y-z world coordinate system of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the left-eye equidistant cylindrical projection panoramic image from its rectangular coordinates (u_ij, v_ij, w_ij) in the u-v-w coordinate system. In the x-y-z world coordinate system, emit a ray LRay with tangent point c_L as its origin, passing through that pixel sampling point; trace the propagation of LRay in the three-dimensional scene using ray tracing; record the position coordinates SECCOORL of the first intersection of LRay with a three-dimensional scene object; compute the radiance value LRAD arriving at tangent point c_L along the direction opposite to LRay, convert it into the panoramic image pixel color value LColor, and store LColor in the element in row i, column j of the two-dimensional array IML; store the intersection position coordinates SECCOORL in the element in row i, column j of the two-dimensional array VPOINTSL;
Step103-5: the operation for pixel A001 ends.
2) Project the visible scene points recorded for the left-eye equidistant cylindrical projection panoramic image onto the right-eye equidistant cylindrical projection panoramic image, with the following specific steps:
Step201: create in computer memory a two-dimensional array VPRLIST containing N rows and 2N columns of elements; the elements of VPRLIST correspond one-to-one with the pixels of the right-eye equidistant cylindrical projection panoramic image. Each element of VPRLIST stores a spatial projection result list PLIST1 corresponding to one pixel of the right-eye equidistant cylindrical projection panoramic image, and each element of a list PLIST1 stores a variable of data structure type DSTLTOR. Initialize the spatial projection result list PLIST1 stored in each element of VPRLIST to be empty;
Step202: for each pixel A001 of the left-eye equidistant cylindrical projection panoramic image, perform the following operations:
Step202-1: compute the row number i and column number j of pixel A001 in the left-eye equidistant cylindrical projection panoramic image. Let VPLLC denote the visible scene point coordinates stored in the element in row i, column j of the two-dimensional array VPOINTSL; let V_px denote the x-component of VPLLC, V_py the y-component, and V_pz the z-component. Compute the two coordinate component values, the polar angle θ and the azimuth angle φ, of the spherical coordinates corresponding to the rectangular coordinates (V_px, V_py, V_pz);
Step202-2: as shown in fig. 3, compute the tangent points c_L and c_R of the two lines through point Q that are tangent to the viewing circle of the spherical panoramic stereo camera, where the u-coordinate component of point Q is q_u = R×cos(φ), the v-coordinate component is q_v = R×sin(φ), and the w-coordinate component is q_w = 0. The coordinates of c_L and c_R in the u-v-w coordinate system are computed as c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖ and b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖; l is the distance from point Q to the origin of the u-v-w coordinate system; V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q; the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1]; the rotation angle β = arctan[r/(l²−r²)^(1/2)];
Step202-3: using a coordinate transformation, compute the rectangular coordinates of tangent point c_R in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system. Let v_RayB0 denote the vector from tangent point c_R to the point corresponding to VPLLC; let P_i′j′ denote the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i′, column j′ of the right-eye equidistant cylindrical projection panoramic image; let c_Ri′j′ denote the right-eye viewpoint corresponding to P_i′j′; and let v_i′,j′ denote the vector from c_Ri′j′ to P_i′j′. Let i_m and j_m denote the subscripts i′ and j′ that maximize v_RayB0·v_i′,j′/(‖v_RayB0‖·‖v_i′,j′‖), where i′ is an element of the integer set {0, 1, …, N−1} and j′ is an element of the integer set {0, 1, …, 2N−1};
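The argmax of Step202-3 can be sketched by brute force. This is a simplified illustration: all geometry stays in the u-v-w frame (the text additionally passes through world coordinates), the scan over all N×2N pixels mirrors the definition rather than a practical analytic inversion, and the helper names are illustrative:

```python
import math

def right_eye_viewpoint(phi, R, r):
    """c_R for azimuth phi: tangent point using M(-beta), as in Step202-2."""
    Q = (R * math.cos(phi), R * math.sin(phi), 0.0)
    k = math.sqrt(R * R - r * r)          # Q lies on the equator, so l = R
    beta = math.atan2(r, k)
    c, s = math.cos(-beta), math.sin(-beta)
    m = (c * Q[0] + s * Q[1], -s * Q[0] + c * Q[1], 0.0)   # M(-beta) @ Q
    n = math.hypot(m[0], m[1])
    return tuple(q - k * x / n for q, x in zip(Q, m))

def best_right_pixel(vpllc, N, R, r):
    """Find (i_m, j_m): the right-eye pixel whose view ray
    c_R(i',j') -> P_i'j' best aligns (maximum cosine) with the ray from
    c_R (computed from the azimuth of scene point VPLLC) to VPLLC."""
    phi0 = math.atan2(vpllc[1], vpllc[0])
    cR0 = right_eye_viewpoint(phi0, R, r)
    v_ray = tuple(a - b for a, b in zip(vpllc, cR0))       # v_RayB0

    def norm(v):
        return math.sqrt(sum(x * x for x in v))

    best, im, jm = -2.0, 0, 0
    for i in range(N):
        for j in range(2 * N):
            th, ph = (i + 0.5) * math.pi / N, (j + 0.5) * math.pi / N
            P = (R * math.sin(th) * math.cos(ph),
                 R * math.sin(th) * math.sin(ph),
                 R * math.cos(th))
            cR = right_eye_viewpoint(ph, R, r)
            v = tuple(a - b for a, b in zip(P, cR))        # v_i',j'
            cos_a = sum(a * b for a, b in zip(v_ray, v)) / (norm(v_ray) * norm(v))
            if cos_a > best:
                best, im, jm = cos_a, i, j
    return im, jm
```

A practical implementation would invert the projection analytically instead of scanning every pixel; the scan above only makes the maximization criterion explicit.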
Step202-4: if the list stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST is empty, go to Step202-4-1; otherwise go to Step202-4-2;
Step202-4-1: create in computer memory a variable vDSTLTOR of data structure type DSTLTOR; set the color value PIXCOLOR member variable of vDSTLTOR equal to the value stored in the element in row i, column j of the two-dimensional array IML; set the distance VDIS member variable of vDSTLTOR equal to the distance from tangent point c_R to the point corresponding to VPLLC. Add the variable vDSTLTOR to the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST; go to Step202-5;
Step202-4-2: let TMP denote the variable of data structure type DSTLTOR stored in the first element of the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST. If the value of the distance VDIS member variable of TMP is greater than the distance from tangent point c_R to the point corresponding to VPLLC, clear the list PLIST1 stored in the element in row i_m, column j_m of VPRLIST and go to Step202-4-1; otherwise go to Step202-5;
Step202-5: the operation for pixel A001 ends.
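The bookkeeping of Step202-4 (keep, per right-eye pixel, only the projected left-eye record nearest to c_R) can be sketched as follows; `DSTLTOR` mirrors the data structure named in the text, and `update_cell` is an illustrative helper name:

```python
class DSTLTOR:
    """Data structure from the text: a pixel color value and a distance."""
    def __init__(self, pixcolor, vdis):
        self.PIXCOLOR = pixcolor
        self.VDIS = vdis

def update_cell(plist, color, dist):
    """Update one VPRLIST cell with a projected left-eye pixel.
    plist: the list PLIST1 currently stored in the cell (possibly empty);
    color: the left-eye color from IML; dist: distance from c_R to VPLLC.
    Returns the updated list."""
    if not plist:                       # Step202-4-1: empty cell, record it
        return [DSTLTOR(color, dist)]
    if plist[0].VDIS > dist:            # Step202-4-2: a nearer point occludes
        return [DSTLTOR(color, dist)]   # clear the old record, keep the new
    return plist                        # existing record is nearer; discard new
```

This amounts to a depth test: when two left-eye scene points project to the same right-eye pixel, only the one closer to the right-eye viewpoint survives, which resolves occlusion.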
3) During rendering of the right-eye equidistant cylindrical projection panoramic image, reuse the left-eye rendering result through spatial projection to complete the right-eye image, with the following specific steps:
Step301: compute the coordinates of each pixel sampling point on the virtual imaging sphere by sampling the azimuth angle and the polar angle at equal angular intervals, and use an equidistant cylindrical projection panoramic image to store the colors of these pixel sampling points. As shown in fig. 2, the equidistant cylindrical projection panoramic image contains N rows and 2N columns of pixels in total, N being a positive integer. The rectangular coordinates, in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the equidistant cylindrical projection panoramic image are (u_ij, v_ij, w_ij), with u_ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N), v_ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N), w_ij = R×cos(i×π/N+0.5×π/N); i = 0, 1, …, N−1; j = 0, 1, …, 2N−1;
Step302: create in computer memory a two-dimensional array IMR containing N rows and 2N columns of elements; each element of IMR stores the color value of one pixel of the right-eye equidistant cylindrical projection panoramic image, and the elements of IMR correspond one-to-one with the pixels of that image;
Step303: for each pixel B001 of the right-eye equidistant cylindrical projection panoramic image, perform the following operations:
Step303-1: compute the row number i and column number j of pixel B001 in the right-eye equidistant cylindrical projection panoramic image;
Step303-2: compute the rectangular coordinates (u_ij, v_ij, w_ij), in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the right-eye equidistant cylindrical projection panoramic image; compute the two coordinate component values, the polar angle θ and the azimuth angle φ, of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij);
Step303-3: as shown in fig. 3, compute the tangent points c_L and c_R of the two lines through point Q that are tangent to the viewing circle of the spherical panoramic stereo camera, where the u-coordinate component of point Q is q_u = R×cos(φ), the v-coordinate component is q_v = R×sin(φ), and the w-coordinate component is q_w = 0. Tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere for pixel B001, and tangent point c_R is the right-eye viewpoint corresponding to that pixel sampling point. The coordinates of c_L and c_R in the u-v-w coordinate system are computed as c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖ and b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖; l is the distance from point Q to the origin of the u-v-w coordinate system; V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q; the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1]; the rotation angle β = arctan[r/(l²−r²)^(1/2)];
Step303-4: using a coordinate transformation, compute the rectangular coordinates of tangent point c_R in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; likewise, using a coordinate transformation, compute the rectangular coordinates in the x-y-z world coordinate system of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the right-eye equidistant cylindrical projection panoramic image from its rectangular coordinates (u_ij, v_ij, w_ij) in the u-v-w coordinate system. In the x-y-z world coordinate system, emit a ray RRay with tangent point c_R as its origin, passing through that pixel sampling point, and compute the position coordinates SECCOORR of the first intersection of RRay with a three-dimensional scene object. If the list PLIST1 stored in the element in row i, column j of the two-dimensional array VPRLIST is not empty, and the absolute value of the difference between the distance from the intersection position coordinates SECCOORR to the coordinates of tangent point c_R and the value of the distance VDIS member variable of the variable of data structure type DSTLTOR stored in the first element of that list PLIST1 is smaller than the distance threshold TDis, go to Step303-5; otherwise, trace the propagation of ray RRay in the three-dimensional scene using ray tracing, compute the radiance value RRAD arriving at tangent point c_R along the direction opposite to RRay, convert it into the panoramic image pixel color value RColor, store RColor in the element in row i, column j of the two-dimensional array IMR, and go to Step303-6;
Step303-5: assign to the element in row i, column j of the two-dimensional array IMR the value of the color value PIXCOLOR member variable of the variable of data structure type DSTLTOR stored in the first element of the list PLIST1 stored in the element in row i, column j of the two-dimensional array VPRLIST;
Step303-6: the operation for pixel B001 ends.
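The reuse decision of Step303-4/Step303-5 reduces to a small predicate. A sketch follows, with `trace` standing in for the full ray-traced shading computation (an illustrative stand-in, not part of the text):

```python
from collections import namedtuple

# Mirrors the DSTLTOR data structure named in the text.
DSTLTOR = namedtuple("DSTLTOR", ["PIXCOLOR", "VDIS"])

def right_eye_color(plist, dist_hit, tdis, trace):
    """plist: list PLIST1 for this right-eye pixel (possibly empty);
    dist_hit: distance from the first hit SECCOORR to viewpoint c_R;
    tdis: distance threshold TDis; trace: fallback shading callable.
    Reuses the projected left-eye color when depths agree within TDis."""
    if plist and abs(dist_hit - plist[0].VDIS) < tdis:
        return plist[0].PIXCOLOR        # Step303-5: reuse, no tracing needed
    return trace()                      # otherwise: full ray-traced shading
```

The depth check guards against false reuse: a right-eye pixel whose nearest surface differs from the projected left-eye point (for example, due to occlusion) fails the threshold test and falls back to full ray tracing.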
4) Store the left-eye and right-eye equidistant cylindrical projection panoramic images in computer disk files, with the following specific steps:
Store the pixel color values held in the two-dimensional array IML into a computer disk file PIML as the left-eye equidistant cylindrical projection panoramic image data, and store the pixel color values held in the two-dimensional array IMR into a computer disk file PIMR as the right-eye equidistant cylindrical projection panoramic image data.
In this embodiment, N = 5000, R = 100, r = 3.25. The distance threshold TDis takes a value of 1/2000 of the height of the room walls.
In Step303-4, if the list PLIST1 stored in the element in row i, column j of the two-dimensional array VPRLIST is not empty and the absolute value of the difference between the distance from the intersection position coordinates SECCOORR to the coordinates of tangent point c_R and the value of the distance VDIS member variable of the variable of data structure type DSTLTOR stored in the first element of that list PLIST1 is smaller than the distance threshold TDis, Step303-5 is executed; under this condition no further tracing computation is performed for ray RRay, which saves computational overhead.
In Step103-3, θ and φ are respectively the polar angle and azimuth angle coordinate component values of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij) in Step103-2.
In Step202-2, θ and φ are respectively the polar angle and azimuth angle coordinate component values of the spherical coordinates corresponding to the rectangular coordinates (V_px, V_py, V_pz) in Step202-1.
In Step303-3, θ and φ are respectively the polar angle and azimuth angle coordinate component values of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij) in Step303-2.
In Step202-3, the right-eye viewpoint corresponding to P_i′j′ is computed using the methods described in Step303-2 and Step303-3, with i replaced by i′ and j replaced by j′. The 3×3 rotation transformation matrix M(β) is written in the two-dimensional matrix notation of Matlab. The multiplication operator · denotes a vector dot product (i.e., the inner product of two vectors) when both of its operands are vectors, a matrix-vector product when its left operand is a matrix and its right operand is a vector, and scalar multiplication when either operand is a scalar number.

Claims (1)

1. A spatial projection reuse acceleration method for panoramic stereo picture rendering, characterized in that: a data structure DSTLTOR is provided for storing color values and distance data, the data structure DSTLTOR comprising two member variables, a color value PIXCOLOR and a distance VDIS; the specific implementation steps are as follows:
1) Using ray tracing, generate the equidistant cylindrical projection panoramic image corresponding to the left eye from the spherical panoramic stereo camera and the three-dimensional scene model, with the following specific steps:
Step101: compute the coordinates of each pixel sampling point on the virtual imaging sphere by sampling the azimuth angle and the polar angle at equal angular intervals, and use an equidistant cylindrical projection panoramic image to store the colors of these pixel sampling points; the equidistant cylindrical projection panoramic image contains N rows and 2N columns of pixels in total, N being a positive integer; the rectangular coordinates, in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the equidistant cylindrical projection panoramic image are (u_ij, v_ij, w_ij), with u_ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N), v_ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N), w_ij = R×cos(i×π/N+0.5×π/N); i = 0, 1, …, N−1; j = 0, 1, …, 2N−1;
Step102: create in computer memory a two-dimensional array IML containing N rows and 2N columns of elements; each element of IML stores the color value of one pixel of the left-eye equidistant cylindrical projection panoramic image, and the elements of IML correspond one-to-one with the pixels of that image; create in computer memory a two-dimensional array VPOINTSL containing N rows and 2N columns of elements; each element of VPOINTSL stores the visible scene point coordinates corresponding to one pixel of the left-eye equidistant cylindrical projection panoramic image, and the elements of VPOINTSL correspond one-to-one with the pixels of that image; initialize the value of each element of VPOINTSL to the coordinates (M_X, M_Y, M_Z), with M_X, M_Y, and M_Z all equal to 2^128;
Step103: for each pixel A001 of the left-eye equidistant cylindrical projection panoramic image, perform the following operations:
Step103-1: compute the row number i and column number j of pixel A001 in the left-eye equidistant cylindrical projection panoramic image;
Step103-2: compute the rectangular coordinates (u_ij, v_ij, w_ij), in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the left-eye equidistant cylindrical projection panoramic image; compute the two coordinate component values, the polar angle θ and the azimuth angle φ, of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij);
Step103-3: compute the tangent points c_L and c_R of the two lines through point Q that are tangent to the viewing circle of the spherical panoramic stereo camera, where the u-coordinate component of point Q is q_u = R×cos(φ), the v-coordinate component is q_v = R×sin(φ), and the w-coordinate component is q_w = 0; tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere for pixel A001, and tangent point c_R is the right-eye viewpoint corresponding to that pixel sampling point; the coordinates of c_L and c_R in the u-v-w coordinate system are computed as c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖ and b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖; l is the distance from point Q to the origin of the u-v-w coordinate system; V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q; the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1]; the rotation angle β = arctan[r/(l²−r²)^(1/2)]; ‖x‖ denotes the length of vector x;
Step103-4: using a coordinate transformation, compute the rectangular coordinates of tangent point c_L in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; likewise, using a coordinate transformation, compute the rectangular coordinates in the x-y-z world coordinate system of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the left-eye equidistant cylindrical projection panoramic image from its rectangular coordinates (u_ij, v_ij, w_ij) in the u-v-w coordinate system; in the x-y-z world coordinate system, emit a ray LRay with tangent point c_L as its origin, passing through that pixel sampling point; trace the propagation of LRay in the three-dimensional scene using ray tracing; record the position coordinates SECCOORL of the first intersection of LRay with a three-dimensional scene object; compute the radiance value LRAD arriving at tangent point c_L along the direction opposite to LRay, convert it into the panoramic image pixel color value LColor, and store LColor in the element in row i, column j of the two-dimensional array IML; store the intersection position coordinates SECCOORL in the element in row i, column j of the two-dimensional array VPOINTSL;
Step103-5: the operation for pixel A001 ends;
2) Project the visible scene points recorded for the left-eye equidistant cylindrical projection panoramic image onto the right-eye equidistant cylindrical projection panoramic image, with the following specific steps:
Step201: create in computer memory a two-dimensional array VPRLIST containing N rows and 2N columns of elements; the elements of VPRLIST correspond one-to-one with the pixels of the right-eye equidistant cylindrical projection panoramic image; each element of VPRLIST stores a spatial projection result list PLIST1 corresponding to one pixel of the right-eye equidistant cylindrical projection panoramic image, and each element of a list PLIST1 stores a variable of data structure type DSTLTOR; initialize the spatial projection result list PLIST1 stored in each element of VPRLIST to be empty;
Step202: for each pixel A001 of the left-eye equidistant cylindrical projection panoramic image, perform the following operations:
Step202-1: compute the row number i and column number j of pixel A001 in the left-eye equidistant cylindrical projection panoramic image; let VPLLC denote the visible scene point coordinates stored in the element in row i, column j of the two-dimensional array VPOINTSL; let V_px denote the x-component of VPLLC, V_py the y-component, and V_pz the z-component; compute the two coordinate component values, the polar angle θ and the azimuth angle φ, of the spherical coordinates corresponding to the rectangular coordinates (V_px, V_py, V_pz);
Step202-2: compute the tangent points c_L and c_R of the two lines through point Q that are tangent to the viewing circle of the spherical panoramic stereo camera, where the u-coordinate component of point Q is q_u = R×cos(φ), the v-coordinate component is q_v = R×sin(φ), and the w-coordinate component is q_w = 0; the coordinates of c_L and c_R in the u-v-w coordinate system are computed as c_L = Q − b_L and c_R = Q − b_R, where b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖ and b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖; l is the distance from point Q to the origin of the u-v-w coordinate system; V is the vector of length l pointing from the origin of the u-v-w coordinate system to point Q; the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1]; the rotation angle β = arctan[r/(l²−r²)^(1/2)];
Step202-3: using a coordinate transformation, compute the rectangular coordinates of tangent point c_R in the x-y-z world coordinate system from its rectangular coordinates in the u-v-w coordinate system; let v_RayB0 denote the vector from tangent point c_R to the point corresponding to VPLLC; let P_i′j′ denote the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i′, column j′ of the right-eye equidistant cylindrical projection panoramic image; let c_Ri′j′ denote the right-eye viewpoint corresponding to P_i′j′; let v_i′,j′ denote the vector from c_Ri′j′ to P_i′j′; let i_m and j_m denote the subscripts i′ and j′ that maximize v_RayB0·v_i′,j′/(‖v_RayB0‖·‖v_i′,j′‖), where i′ is an element of the integer set {0, 1, …, N−1} and j′ is an element of the integer set {0, 1, …, 2N−1};
Step202-4: if the list stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST is empty, go to Step202-4-1; otherwise go to Step202-4-2;
Step202-4-1: create in computer memory a variable vDSTLTOR of data structure type DSTLTOR; set the color value PIXCOLOR member variable of vDSTLTOR equal to the value stored in the element in row i, column j of the two-dimensional array IML; set the distance VDIS member variable of vDSTLTOR equal to the distance from tangent point c_R to the point corresponding to VPLLC; add the variable vDSTLTOR to the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST; go to Step202-5;
Step202-4-2: let TMP denote the variable of data structure type DSTLTOR stored in the first element of the list PLIST1 stored in the element in row i_m, column j_m of the two-dimensional array VPRLIST; if the value of the distance VDIS member variable of TMP is greater than the distance from tangent point c_R to the point corresponding to VPLLC, clear the list PLIST1 stored in the element in row i_m, column j_m of VPRLIST and go to Step202-4-1; otherwise go to Step202-5;
Step202-5: the operation for pixel A001 ends;
3) During rendering of the right-eye equidistant cylindrical projection panoramic image, reuse the left-eye rendering result through spatial projection to complete the right-eye image, with the following specific steps:
Step301: compute the coordinates of each pixel sampling point on the virtual imaging sphere by sampling the azimuth angle and the polar angle at equal angular intervals, and use an equidistant cylindrical projection panoramic image to store the colors of these pixel sampling points; the equidistant cylindrical projection panoramic image contains N rows and 2N columns of pixels in total, N being a positive integer; the rectangular coordinates, in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the pixel in row i, column j of the equidistant cylindrical projection panoramic image are (u_ij, v_ij, w_ij), with u_ij = R×sin(i×π/N+0.5×π/N)×cos(j×π/N+0.5×π/N), v_ij = R×sin(i×π/N+0.5×π/N)×sin(j×π/N+0.5×π/N), w_ij = R×cos(i×π/N+0.5×π/N); i = 0, 1, …, N−1; j = 0, 1, …, 2N−1;
Step302: creating a two-dimensional array IMR containing N rows and 2N columns of elements in a memory of a computer, wherein each element of the two-dimensional array IMR stores a color value of one pixel of the equidistant cylindrical projection panoramic image corresponding to the right eye, and the elements of the two-dimensional array IMR are in one-to-one correspondence with the pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye;
Step303: for each pixel B001 of the equidistant cylindrical projection panoramic image corresponding to the right eye, the following operations are respectively performed:
Step303-1: calculating the row number i and the column number j of the pixel B001 in the equidistant cylindrical projection panoramic image corresponding to the right eye;
Step303-2: calculating the rectangular coordinates (u_ij, v_ij, w_ij), in the u-v-w coordinate system, of the pixel sampling point on the virtual imaging sphere corresponding to the i-th row, j-th column pixel of the equidistant cylindrical projection panoramic image corresponding to the right eye; calculating the polar angle θ and azimuth angle φ coordinate component values of the spherical coordinates corresponding to the rectangular coordinates (u_ij, v_ij, w_ij);
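The rectangular-to-spherical conversion used in Step303-2 can be sketched as follows (the function name is illustrative; the clamp guards against floating-point drift in w/R):

```python
import math

def to_spherical(u, v, w, R):
    """Polar angle theta and azimuth phi of a point (u, v, w) on the
    virtual imaging sphere of radius R (Step303-2)."""
    theta = math.acos(max(-1.0, min(1.0, w / R)))  # theta in [0, pi]
    phi = math.atan2(v, u) % (2.0 * math.pi)       # phi in [0, 2*pi)
    return theta, phi
```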
Step303-3: calculating the tangent points c_L and c_R corresponding to the two tangent lines, passing through the point Q, of the viewing circle of the spherical panoramic stereo camera, wherein the u-coordinate component of the point Q is q_u = R×cos(φ), the v-coordinate component of the point Q is q_v = R×sin(φ), and the w-coordinate component of the point Q is q_w = 0; the tangent point c_L is the left-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere corresponding to the pixel B001, and the tangent point c_R is the right-eye viewpoint corresponding to the pixel sampling point on the virtual imaging sphere corresponding to the pixel B001; the coordinates of the tangent points c_L and c_R in the u-v-w coordinate system are calculated as c_L = Q − b_L and c_R = Q − b_R; the vector b_L is calculated as b_L = (l²−r²)^(1/2)·M(β)·V/‖M(β)·V‖, and the vector b_R is calculated as b_R = (l²−r²)^(1/2)·M(−β)·V/‖M(−β)·V‖, where l represents the distance from the point Q to the origin of the u-v-w coordinate system, V is the vector of length l pointing from the origin of the u-v-w coordinate system to the point Q, the 3×3 rotation transformation matrix M(β) = [cos(β), sin(β), 0; −sin(β), cos(β), 0; 0, 0, 1], and the rotation angle β = arctan[r/(l²−r²)^(1/2)];
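The tangent-point construction of Step303-3 reduces to rotating the direction of Q by ±β about the w-axis and scaling by (l²−r²)^(1/2). A sketch under the assumptions that Q lies on the imaging sphere (so l = R) and that r denotes the viewing-circle radius; the function name is illustrative:

```python
import math

def tangent_points(phi, R, r):
    """Left/right-eye viewpoints c_L, c_R for the sample at azimuth phi
    (Step303-3); Q is on the sphere's equator, r is the viewing-circle radius."""
    q = (R * math.cos(phi), R * math.sin(phi), 0.0)
    l = R                                  # |Q| = R since Q is on the sphere
    d = math.sqrt(l * l - r * r)           # tangent length (l^2 - r^2)^(1/2)
    beta = math.atan2(r, d)                # rotation angle arctan[r/(l^2-r^2)^(1/2)]

    def rotate(v, b):                      # M(b)·v, rotation about the w-axis
        c, s = math.cos(b), math.sin(b)
        return (c * v[0] + s * v[1], -s * v[0] + c * v[1], v[2])

    def unit(v):
        n = math.sqrt(sum(x * x for x in v))
        return tuple(x / n for x in v)

    b_L = tuple(d * x for x in unit(rotate(q, beta)))    # d·M(β)·V/||M(β)·V||
    b_R = tuple(d * x for x in unit(rotate(q, -beta)))   # d·M(−β)·V/||M(−β)·V||
    c_L = tuple(qi - bi for qi, bi in zip(q, b_L))
    c_R = tuple(qi - bi for qi, bi in zip(q, b_R))
    return c_L, c_R
```

Because cos β = (l²−r²)^(1/2)/l, both returned points satisfy ‖c_L‖ = ‖c_R‖ = r, i.e. they lie on the viewing circle, as a tangent point must.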
Step303-4: according to the tangent point c by using a coordinate transformation method R Rectangular coordinates in u-v-w coordinate system to calculate tangent point c R Rectangular coordinates in an x-y-z world coordinate system; according to rectangular coordinates (u) of pixel sampling points on a virtual imaging sphere corresponding to the ith row and jth column pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye in a u-v-w coordinate system by utilizing a coordinate transformation method ij , v ij , w ij ) Calculating virtual imaging sphere corresponding to ith row and jth column pixels of equidistant cylindrical projection panoramic image corresponding to right eyeRectangular coordinates of pixel sampling points on the surface in an x-y-z world coordinate system; in the x-y-z world coordinate system, at tangent point c R Transmitting a ray RRay passing through a pixel sampling point on a virtual imaging sphere corresponding to the ith row and the jth column of pixels of the equidistant cylindrical projection panoramic image corresponding to the right eye as a starting point, calculating an intersection point position coordinate SECCOORR of the ray RRay intersecting a three-dimensional scene object for the first time, and if a list PLIST1 stored by the ith row and the jth column of elements of the two-dimensional array VPRLIST is not empty and from the intersection point position coordinate SECCOORR to a tangent point c R If the absolute value of the difference between the distance of the coordinates of (2) and the value of the member variable of the distance VDIS of the variable of the data structure DSTLTOR type stored in the first element of the list PLIST1 stored in the ith row, jth column of the two-dimensional array VPRLIST is smaller than the distance threshold TDis, step303-5 is shifted, otherwise, the propagation of ray RRay in the three-dimensional scene is tracked by using ray tracing technology, and the 
arrival of the tangential point c along the opposite direction of ray RRay is calculated R The brightness value RRAD is converted into a color value RColor of the pixel of the panoramic image, the color value RColor is stored in the ith row and the jth column elements of the two-dimensional array IMR, and the Step303-6 is performed;
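The reuse test of Step303-4 together with the assignment of Step303-5 can be sketched as one routine (names follow the patent where they exist; trace_ray is a hypothetical stand-in for the ray-tracing fallback that computes the pixel color):

```python
import math

def shade_right_eye_pixel(i, j, c_R, seccoorr, VPRLIST, IMR, TDis, trace_ray):
    """Fill IMR[i][j], reusing the left-eye result when hit distances agree."""
    plist1 = VPRLIST[i][j]                    # reuse records for this pixel
    if plist1:
        head = plist1[0]                      # first stored DSTLTOR record
        # Reuse only if the new hit lies within TDis of the recorded distance.
        if abs(math.dist(seccoorr, c_R) - head.VDIS) < TDis:
            IMR[i][j] = head.PIXCOLOR         # Step303-5: reuse left-eye color
            return
    IMR[i][j] = trace_ray(i, j)               # otherwise: full ray tracing
```

The distance comparison is what makes the reuse safe: it rejects records whose left-eye hit point is occluded (or simply different) from the right-eye viewpoint.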
Step303-5: assigning to the i-th row, j-th column element of the two-dimensional array IMR the value of the color value PIXCOLOR member variable of the variable of the data structure DSTLTOR type stored in the first element of the list PLIST1 stored in the i-th row, j-th column element of the two-dimensional array VPRLIST;
Step303-6: the operation for the pixel B001 ends;
4) The equidistant cylindrical projection panoramic image corresponding to the left eye and the equidistant cylindrical projection panoramic image corresponding to the right eye are stored in computer disk files; the specific steps are as follows:
The pixel color values stored in the two-dimensional array IML are stored into the computer disk file PIML as the equidistant cylindrical projection panoramic image data corresponding to the left eye, and the pixel color values stored in the two-dimensional array IMR are stored into the computer disk file PIMR as the equidistant cylindrical projection panoramic image data corresponding to the right eye.
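The patent does not fix a file format for PIML and PIMR; as one illustrative possibility (an assumption, not the patent's method), an array of (r, g, b) byte triples could be written as a binary PPM file:

```python
def save_ppm(path, pixels):
    """Write a rows x cols array of (r, g, b) byte triples as a binary PPM.
    PPM is used here only to illustrate step 4); any image format would do."""
    rows, cols = len(pixels), len(pixels[0])
    with open(path, "wb") as f:
        f.write(b"P6\n%d %d\n255\n" % (cols, rows))  # PPM header
        for row in pixels:
            for r, g, b in row:
                f.write(bytes((r, g, b)))
```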
CN202311312163.9A 2023-10-08 2023-10-11 Space projection reuse acceleration method for panoramic stereo picture rendering Pending CN117314737A (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202311293336 2023-10-08
CN2023112933367 2023-10-08

Publications (1)

Publication Number Publication Date
CN117314737A true CN117314737A (en) 2023-12-29

Family

ID=89255023

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311312163.9A Pending CN117314737A (en) 2023-10-08 2023-10-11 Space projection reuse acceleration method for panoramic stereo picture rendering

Country Status (1)

Country Link
CN (1) CN117314737A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination