CN100334500C - Interactive 3D scene lighting method and system - Google Patents



Publication number
CN100334500C
Authority
CN
China
Prior art date
Legal status
Expired - Lifetime
Application number
CNB2003101243820A
Other languages
Chinese (zh)
Other versions
CN1635401A (en
Inventor
翁明昉
苏俊明
陈俊呈
李俊毅
Current Assignee
Institute for Information Industry
Original Assignee
Institute for Information Industry
Priority date
Filing date
Publication date
Application filed by Institute for Information Industry filed Critical Institute for Information Industry
Priority to CNB2003101243820A priority Critical patent/CN100334500C/en
Publication of CN1635401A publication Critical patent/CN1635401A/en
Application granted granted Critical
Publication of CN100334500C publication Critical patent/CN100334500C/en


Abstract

The present invention relates to an interactive 3D scene lighting method and system. A 3D scene is first converted into a plurality of triangular meshes, and each mesh is assigned a unique exclusive color so that it can be identified. A light source is then modeled as an image-capture device, the 3D scene is rendered from that device's viewpoint, and an image is obtained. The color of each pixel in the image identifies the triangular mesh struck by the corresponding light ray, so the pixel colors can be analyzed to locate the matching mesh; the more pixels an exclusive color occupies, the more light the corresponding mesh receives. Finally, the size of the display region, the area of the triangular mesh, the angle between the mesh's normal vector and the light source's viewing direction, and the distance between the mesh and the light source are analyzed to determine the light-source energy received by each mesh.

Description

Interactive 3D scene lighting method and system thereof
Technical field
The invention relates to a scene lighting method and system, and more particularly to an interactive 3D scene lighting method and system applicable to stereoscopic 3D scenes for producing light and shadow.
Background technology
With the continuing evolution of 3D rendering hardware and the growing popularity of 3D games, users demand ever-higher quality from the global illumination of a 3D scene, for example whether the shading an object presents when irradiated by a light source looks realistic.
The known shadow Z-buffer technique generates shadowed stereo images in computer graphics. It comprises two stages. In the first stage the light source is assumed to be a camera sampling the scene at image-space precision, and the depth value of each pixel is computed; in the second stage the camera is restored to the original viewpoint and a conventional rendering pass produces the visible image. The Z-buffer produced while rendering from the light's viewpoint is stored as a shadow map, defining the so-called light space. One can then judge whether a rendered pixel falls inside a region covered by the shadow map, to produce the appropriate shadow effect. For example, suppose a point P in 3D space projects to a pixel of the visible image. Its screen-space coordinates (x, y, z) are transformed to the light-space coordinates (x', y', z'); x' and y' index the shadow map to fetch a depth value that is compared with z'. If z' is greater than that depth, another object lies between P and the light source, closer to the light than P; that is, P is occluded and falls in the shadow region, so the brightness of P's pixel must be attenuated to present the shadow effect. Conversely, if z' equals the stored depth, P is unoccluded and receives full illumination.
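The depth comparison just described can be sketched in a few lines of Python. This is a minimal illustration of the classic test, not the patent's implementation; the shadow map, coordinates, and bias value are all hypothetical.

```python
def in_shadow(shadow_map, light_space_point, bias=1e-3):
    """Shadow Z-buffer test: compare a point's light-space depth z'
    against the depth stored in the shadow map at indices (x', y')."""
    x, y, z = light_space_point
    stored_depth = shadow_map[y][x]
    # z' greater than the stored depth means another object lies
    # between P and the light source, so P falls in the shadow region.
    return z > stored_depth + bias

# Hypothetical 2x2 shadow map rendered from the light's viewpoint.
smap = [[0.5, 0.9],
        [0.7, 1.0]]
print(in_shadow(smap, (0, 0, 0.8)))  # True: an occluder sits at depth 0.5
print(in_shadow(smap, (1, 1, 1.0)))  # False: P is the nearest surface
```

In practice the brightness of a pixel found in shadow is then attenuated, which is exactly the approximation the patent criticizes.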
However, because the shadow Z-buffer technique must compute the shadow effect pixel by pixel in image space, it consumes considerable resources and imposes a sizable computational load on the system. In practice, an animator who wants to preview the design of a 3D scene must usually wait several minutes of computation and rendering before obtaining the globally illuminated image, and an artist who then wants to readjust the scene lighting must wait through another long round of computation and rendering. A known variant reduces the system load by computing the shadow effect only for visible pixels, but it can present the shadow only for that single viewpoint, not the whole scene, and is therefore unsatisfactory. Moreover, the shadow Z-buffer practice of attenuating pixel brightness to produce the shadow effect is imprecise, because each shadowed region actually receives a different amount of light, and a single rough attenuation value cannot accurately dim the corresponding regions. Known 3D scene lighting methods for producing shadow thus still have many shortcomings and need improvement.
Summary of the invention
The primary object of the present invention is to provide an interactive 3D scene lighting method and system that simulates and computes the illumination and shadow of a scene according to the physics of actual light transport, so that the light-source energy each triangular mesh actually receives can be computed, thereby improving the quality of the lit image.
Another object of the present invention is to provide an interactive 3D scene lighting method and system that uses the computing power of a 3D rendering chip to replace the conventional procedure of testing whether each light beam emitted by the light source illuminates the scene, thereby significantly improving computational efficiency.
According to a feature of the present invention, an interactive 3D scene lighting method is proposed, comprising the following steps:
(A) capturing a 3D scene;
(B) converting the 3D scene into a plurality of triangular meshes, each formed by connecting three vertices;
(C) assigning each triangular mesh a unique identifier, each unique identifier corresponding to an exclusive color code, to define the exclusive color of each mesh;
(D) modeling a light source as an image-capture device and using flat shading to render, at the coordinates corresponding to the light source, the 3D scene defined with the exclusive colors, to obtain a group of images through the image-capture device;
(E) analyzing the distribution of each exclusive color in the images to compute the amount of light received by each triangular mesh;
(F) computing the average amount of light received by each vertex; and
(G) modeling the image-capture device as a current viewpoint and using smooth shading to integrate the amounts of light received by the meshes and vertices, to present, from the current viewpoint, the shadow effect and rendered result of the 3D scene under illumination by the light source.
Wherein each triangular mesh comprises:
data whose vertex indices are shared with other triangular meshes, wherein each vertex defines one vertex index; and
data whose vertex indices are not shared with other triangular meshes, wherein each vertex defines a different vertex index in each different mesh.
Wherein the exclusive color code uses the unsigned 32-bit RGBA format (in the embodiment, (255, 255, 255, 255) is the background color and cannot be used as an exclusive color).
Wherein the exclusive color code uses the RGB format.
Wherein step (D) uses a cube of unit edge length to simulate the result of placing the light source inside the cube; each face of the cube is cut into an n × n grid, so that each cell has area 1/n², and the image-capture device obtains six images corresponding respectively to the six faces of the cube.
Wherein step (E) comprises the following steps:
(E1) analyzing the plural exclusive colors present in the images;
(E2) converting the exclusive colors back to the corresponding unique identifiers, to obtain the corresponding triangular meshes;
(E3) computing the projected area, on the cube, of the lit region of each triangular mesh;
(E4) converting the projected area into the real area of the mesh's lit region according to the distance between the mesh and the cube, the projection ratio, and the solid angle formed between them; and
(E5) computing the amount of light received per unit area of the mesh.
Wherein in step (E3), if a total of k pixels match the exclusive color of the triangular mesh, the projected area of the mesh's lit region is computed as:

A'_lighting = k × (1/n²)
Wherein in step (E4), if the solid angle between the triangular mesh and the cube is (θ, ψ), the real area of the mesh's lit region is given by a conversion formula (reproduced only as an image in the source) in A'_lighting, dist, and (θ, ψ), where dist is the distance from the centroid of the mesh to the light source.

Wherein in step (E5), the amount of light received per unit area of the triangular mesh is:

ΔE_polygon = (A_lighting / A_total) × dot(N, L) × I

where A_lighting is the real area of the mesh's lit region, A_total is the total area of the mesh, N is the normal vector of the mesh, L is the vector of the light-source illumination at the mesh's centroid, and I is the total light output.
According to another feature of the present invention, the proposed interactive 3D scene lighting system is equipped with a 3D rendering chip and comprises:
means for capturing a 3D scene;
means for converting the 3D scene into a plurality of triangular meshes, wherein each mesh is formed by connecting three vertices;
means for defining the exclusive color of each triangular mesh, which assigns each mesh a unique identifier, each identifier corresponding to an exclusive color code, thereby defining the exclusive color of each mesh;
flat-shading rendering means, which models a light source as an image-capture device and uses flat shading to render, at the coordinates corresponding to the light source, the 3D scene defined with the exclusive colors through the image-capture device, to obtain a group of images;
means for analyzing the amount of light received by each triangular mesh, which analyzes the distribution of each exclusive color in the images to compute the amount of light received by each mesh;
means for computing the average amount of light received by each vertex; and
rendering means, which models the image-capture device as a current viewpoint and uses smooth shading to integrate the amounts of light received by the meshes and vertices, to present, from the current viewpoint, the shadow effect and rendered result of the 3D scene under illumination by the light source.
Description of drawings
Fig. 1 is a schematic diagram of the implementation environment of a preferred embodiment of the present invention.
Fig. 2 is a flow chart of a preferred embodiment of the present invention.
Fig. 3 is a schematic diagram of a 3D scene described with triangular meshes in a preferred embodiment of the present invention.
Fig. 4 is a flow chart of the computation of the amount of light received by a triangular mesh in a preferred embodiment of the present invention.
Embodiment
To better explain the technical content of the present invention, a preferred embodiment is described below.
Referring first to the implementation-environment diagram of Fig. 1, in order to compute the shadow 31 presented on object 3 in the 3D scene 1 of Fig. 1 after irradiation by light source 2, refer to the flow chart of Fig. 2. The present embodiment first captures the information of the 3D scene 1 to be computed (step S201) and from it builds a scene described with triangular meshes (polygons) (step S202). As shown in Fig. 3, the scene is divided into four triangular meshes A, B, C, D, each formed by connecting three vertices, and the description of any mesh in the scene must include vertex-index information in two different formats: indices shared with other meshes, and indices not shared with other meshes. Taking the 3D scene of Fig. 3 as an example, the mesh description data must take the following two forms:
Form 1 (vertex-index information shared with other triangular meshes):
Vertex array and index array: [table reproduced only as an image in the original]
Form 2 (vertex-index information not shared with other triangular meshes):
Vertex array and index array: [table reproduced only as an image in the original]
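The two index forms can be illustrated with a short Python sketch. The concrete index values are made up for illustration, since the original tables survive only as images:

```python
# Form 1: indices shared with other triangles -- all four triangles
# of Fig. 3 reference one common vertex array.  (Index values here
# are illustrative, not the patent's actual tables.)
shared_form = {
    "A": (0, 1, 4), "B": (1, 4, 5), "C": (4, 5, 6), "D": (1, 2, 5),
}

def to_unshared(shared):
    """Form 2: duplicate vertices so every triangle owns three private
    indices.  Private indices let a per-triangle exclusive color be
    attached to each vertex without bleeding into neighboring triangles."""
    vertex_array, unshared = [], {}
    for name, tri in shared.items():
        base = len(vertex_array)
        vertex_array.extend(tri)          # copy the three shared vertices
        unshared[name] = (base, base + 1, base + 2)
    return vertex_array, unshared

vertex_array, unshared_form = to_unshared(shared_form)
assert len(vertex_array) == 12  # 4 triangles x 3 private vertices
```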
Next, an exclusive color code is set for each triangular mesh (step S203). Each mesh is first given a unique identifier (ID number), which is later converted to an exclusive color code through a one-to-one, invertible mapping. In the present embodiment the exclusive color code uses the unsigned 32-bit RGBA (red, green, blue, alpha) format; the present embodiment designates (255, 255, 255, 255) as the background color, so it is never assigned to a mesh. Depending on the application, the exclusive color code may of course also use the RGB format, or any other format capable of identifying a mesh. The exclusive color assigned to each mesh is then written into each vertex of each mesh in Form 2 above; after the color setting is finished, the following extended form results:
Vertex array and index array: [table reproduced only as an image in the original]
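One possible one-to-one mapping between identifiers and 32-bit RGBA codes is a simple bit split. This encoding is an assumption for illustration; the patent only requires that the mapping be invertible and avoid the reserved background color (255, 255, 255, 255):

```python
BACKGROUND = (255, 255, 255, 255)  # reserved; never assigned to a mesh

def id_to_color(tri_id):
    """Split a 32-bit triangle ID into an exclusive RGBA color."""
    color = ((tri_id >> 24) & 0xFF, (tri_id >> 16) & 0xFF,
             (tri_id >> 8) & 0xFF, tri_id & 0xFF)
    if color == BACKGROUND:
        raise ValueError("ID collides with the reserved background color")
    return color

def color_to_id(color):
    """Inverse mapping: recover the triangle ID from a rendered pixel."""
    r, g, b, a = color
    return (r << 24) | (g << 16) | (b << 8) | a

# Round trip: the mapping is one-to-one, as step S203 requires.
assert color_to_id(id_to_color(7)) == 7
print(id_to_color(256))  # (0, 0, 1, 0)
```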
Next, the light source is modeled as an image-capture device, for example a camera, and flat shading is used to render, at the coordinates corresponding to the light source, the 3D scene defined with the exclusive colors, thereby obtaining a group of images through the image-capture device (step S204). Because the present embodiment assumes the light source is an omnidirectional emitter, its energy spreads uniformly outward from the center point; a cube of unit edge length can therefore be designed to simulate its illumination. Each of the cube's six faces is first cut into an n × n grid, so that each cell has area 1/n². The imaginary image-capture device is then placed at the center of the cube, so that each cell of each face acts as a pixel on the device's view plane, and the present embodiment uses the following six groups of image-capture parameters:
Face            Front       Left        Back        Right       Up          Down
View direction  (0, 0, 1)   (-1, 0, 0)  (0, 0, -1)  (1, 0, 0)   (0, 1, 0)   (0, -1, 0)
Up vector       (0, 1, 0)   (0, 1, 0)   (0, 1, 0)   (0, 1, 0)   (0, 0, 1)   (0, 0, -1)
Right axis      (1, 0, 0)   (0, 0, -1)  (-1, 0, 0)  (0, 0, -1)  (1, 0, 0)   (1, 0, 0)
Field of view   90 degrees
Depth scaling   1.0
Near plane      0.5
Viewport size   (n, n)
Accordingly, the image-capture device of the present embodiment is erected at the light-source coordinates and, using the parameter values set above, renders the Form-2 triangular-mesh data with flat shading and records the rendered result.
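A six-face camera setup for the unit cube can be written down directly. The axis values in the table above appear partially garbled in the source, so the directions below follow standard cube-map conventions rather than a verbatim transcription; the 90-degree field of view, 0.5 near plane, and n × n viewport match the table.

```python
import math

# One camera per cube face, centered at the light source.
CUBE_FACES = {
    "front":  {"view": (0, 0, 1),  "up": (0, 1, 0)},
    "back":   {"view": (0, 0, -1), "up": (0, 1, 0)},
    "left":   {"view": (-1, 0, 0), "up": (0, 1, 0)},
    "right":  {"view": (1, 0, 0),  "up": (0, 1, 0)},
    "top":    {"view": (0, 1, 0),  "up": (0, 0, -1)},
    "bottom": {"view": (0, -1, 0), "up": (0, 0, 1)},
}
FOV_DEGREES = 90   # each face subtends 90 degrees from the cube center
NEAR_PLANE = 0.5   # distance from the center to each unit-cube face

def cell_area(n):
    """Each face is cut into an n x n grid; every cell has area 1/n^2."""
    return 1.0 / (n * n)

assert len(CUBE_FACES) == 6
assert math.isclose(cell_area(4), 1 / 16)
```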
Then, by analyzing the distribution of each exclusive color in the images, the amount of light received per unit area of each triangular mesh can be computed (step S205). Referring to the flow chart of Fig. 4, which details the process of step S205: the present embodiment first tallies the color distribution of all pixels in the images rendered with the six parameter groups (step S401); each color found is then converted, via a look-up table, to the unique identifier of the corresponding triangular mesh (step S402), so as to learn the size of the area that the actually lit region of each mesh projects onto the cube (step S403). For example, suppose the tally counts k pixels of color RGBA (0, 0, 0, 255); the look-up table shows that RGBA (0, 0, 0, 255) represents triangular mesh A of Fig. 3, and the projected area, on the cube, of the region of mesh A irradiated by the light source can accordingly be computed as:

A'_lighting = k × (1/n²)
Next, via the distance between the two and the projection-ratio conversion, the present embodiment computes the real area of the region of mesh A irradiated by the light source (step S404). The conversion formula survives only as an image in the source; in it, dist is the distance from the centroid of mesh A to the light source, and (θ, ψ) is the solid angle between mesh A and the cube.
Therefore, if (θ, ψ) are both 0 degrees, the real area of the lit region of mesh A becomes:

A_lighting = 4 × k × (1/n²) × dist²
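The projected-area count and the zero-angle conversion translate directly into code. Only the (θ, ψ) = (0, 0) special case is implemented here, since the general solid-angle formula survives only as an image in the source; the factor 4 is consistent with projecting from the 0.5 near plane out to distance dist, i.e. (dist / 0.5)².

```python
def projected_area(k, n):
    """A'_lighting = k * (1/n^2): k pixels of the triangle's exclusive
    color, each grid cell covering 1/n^2 of a unit cube face."""
    return k * (1.0 / (n * n))

def real_lit_area(k, n, dist):
    """Zero-angle case of step S404:
    A_lighting = 4 * k * (1/n^2) * dist^2,
    where dist is the distance from the triangle's centroid to the light."""
    return 4.0 * projected_area(k, n) * dist * dist

# A triangle seen as 1 pixel on an n=2 grid, centroid at the near plane:
print(real_lit_area(1, 2, 0.5))  # 0.25, i.e. the projected cell itself
```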
Finally, the amount of light received per unit area of mesh A can be computed with the following formula (step S405):

ΔE_polygon = (A_lighting / A_total) × dot(N, L) × I

where A_total is the total area of the mesh, N is the normal vector of the mesh, L is the vector of the light-source illumination at the mesh's centroid, and I is the total light output. Note that the present embodiment exploits the property that the centroid of a triangle always lies inside it, using the centroid to represent the light-source energy actually received by any point in the mesh's region.
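The per-unit-area energy formula can be checked with a small sketch. Vectors are assumed to be unit length, and the sample numbers are illustrative:

```python
def dot(u, v):
    """Dot product of two 3-vectors."""
    return sum(a * b for a, b in zip(u, v))

def light_energy(a_lighting, a_total, normal, light_vec, intensity):
    """dE_polygon = (A_lighting / A_total) * dot(N, L) * I,
    evaluated at the triangle's centroid as in step S405."""
    return (a_lighting / a_total) * dot(normal, light_vec) * intensity

# Half the triangle is lit, the light strikes head-on, total output 10:
print(light_energy(0.5, 1.0, (0, 0, 1), (0, 0, 1), 10.0))  # 5.0
```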
To make the final image smooth and uniform, the average light energy absorbed at each vertex is computed to produce the final vertex color (step S206). In data structures that describe an object or scene with triangular meshes, many vertices are used jointly by different meshes; for example, vertex V5 of Fig. 3 is shared by meshes B, C, and D. Because such a vertex absorbs a different amount of light energy on each mesh, the present embodiment averages the energy the vertex absorbs across its different meshes, and then combines this average with the vertex's material to produce the final color and brightness, yielding smoother rendering and a better shadow effect.
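The per-vertex averaging of step S206 can be sketched as follows; the triangle connectivity and energy values are illustrative:

```python
from collections import defaultdict

def average_vertex_energy(triangles):
    """For every vertex, average the energy absorbed on each triangle
    that shares it (step S206).  `triangles` is a list of
    (vertex_ids, energy) pairs using the shared-index form."""
    sums = defaultdict(float)
    counts = defaultdict(int)
    for vertex_ids, energy in triangles:
        for v in vertex_ids:
            sums[v] += energy
            counts[v] += 1
    return {v: sums[v] / counts[v] for v in sums}

# Like vertex V5 of Fig. 3, shared by meshes B, C and D with
# different absorbed energies: it receives their average.
tris = [((1, 4, 5), 2.0), ((4, 5, 6), 4.0), ((5, 6, 7), 6.0)]
print(average_vertex_energy(tris)[5])  # 4.0
```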
Finally, the image-capture device is restored to the current viewpoint from which the 3D scene is actually observed, and smooth shading is used to integrate the light amounts computed above for the meshes and vertices, presenting from the current viewpoint the shadow effect and rendered result of the 3D scene under illumination by the light source (step S207).
From the above description, the present invention can simulate and compute the illumination and shadow of a 3D scene according to the physics of actual light transport, compute the light-source energy each triangular mesh actually receives, and improve the quality of light-and-shadow rendering in computer graphics. In addition, aided by the massive computing power of a 3D rendering chip, the invention replaces the conventional procedure of testing whether each beam emitted by the light source illuminates the scene, significantly improving computational efficiency.
The foregoing embodiment is merely an example for convenience of description; the scope claimed by the present invention is defined by the claims, and is not limited to the foregoing embodiment.

Claims (10)

1. An interactive 3D scene lighting method, comprising the following steps:
(A) capturing a 3D scene;
(B) converting the 3D scene into a plurality of triangular meshes, each formed by connecting three vertices;
(C) assigning each triangular mesh a unique identifier, each unique identifier corresponding to an exclusive color code, to define the exclusive color of each mesh;
(D) modeling a light source as an image-capture device and using flat shading to render, at the coordinates corresponding to the light source, the 3D scene defined with the exclusive colors, to obtain a group of images through the image-capture device;
(E) analyzing the distribution of each exclusive color in the images to compute the amount of light received by each triangular mesh;
(F) computing the average amount of light received by each vertex; and
(G) modeling the image-capture device as a current viewpoint and using smooth shading to integrate the amounts of light received by the meshes and vertices, to present, from the current viewpoint, the shadow effect and rendered result of the 3D scene under illumination by the light source.
2. The method of claim 1, wherein each triangular mesh comprises:
data whose vertex indices are shared with other triangular meshes, wherein each vertex defines one vertex index; and
data whose vertex indices are not shared with other triangular meshes, wherein each vertex defines a different vertex index in each different mesh.
3. The method of claim 1, wherein the exclusive color code uses the unsigned 32-bit RGBA format.
4. The method of claim 1, wherein the exclusive color code uses the RGB format.
5. The method of claim 1, wherein step (D) uses a cube of unit edge length to simulate the result of placing the light source inside the cube, each face of the cube is cut into an n × n grid so that each cell has area 1/n², and the image-capture device obtains six images corresponding respectively to the six faces of the cube.
6. The method of claim 5, wherein step (E) comprises the following steps:
(E1) analyzing the plural exclusive colors present in the images;
(E2) converting the exclusive colors back to the corresponding unique identifiers, to obtain the corresponding triangular meshes;
(E3) computing the projected area, on the cube, of the lit region of each triangular mesh;
(E4) converting the projected area into the real area of the mesh's lit region according to the distance between the mesh and the cube, the projection ratio, and the solid angle formed between them; and
(E5) computing the amount of light received per unit area of the mesh.
7. The method of claim 6, wherein in step (E3), if a total of k pixels match the exclusive color of the triangular mesh, the projected area of the mesh's lit region is computed as:

A'_lighting = k × (1/n²).
8. The method of claim 6, wherein in step (E4), if the solid angle between the triangular mesh and the cube is (θ, ψ), the real area of the mesh's lit region is given by a conversion formula (reproduced only as an image in the source) in A'_lighting, dist, and (θ, ψ), wherein dist is the distance from the centroid of the mesh to the light source.
9. The method of claim 6, wherein in step (E5), the amount of light received per unit area of the triangular mesh is:

ΔE_polygon = (A_lighting / A_total) × dot(N, L) × I

wherein A_lighting is the real area of the mesh's lit region, A_total is the total area of the mesh, N is the normal vector of the mesh, L is the vector of the light-source illumination at the mesh's centroid, and I is the total light output.
10. An interactive 3D scene lighting system equipped with a 3D rendering chip, the system comprising:
a device for capturing a 3D scene;
a device for converting the 3D scene into a plurality of triangular meshes, wherein each mesh is formed by connecting three vertices;
a device for defining the exclusive color of each triangular mesh, which assigns each mesh a unique identifier, each identifier corresponding to an exclusive color code, thereby defining the exclusive color of each mesh;
a flat-shading rendering device, which models a light source as an image-capture device and uses flat shading to render, at the coordinates corresponding to the light source, the 3D scene defined with the exclusive colors through the image-capture device, to obtain a group of images;
a device for analyzing the amount of light received by each triangular mesh, which analyzes the distribution of each exclusive color in the images to compute the amount of light received by each mesh;
a device for computing the average amount of light received by each vertex; and
a rendering device, which models the image-capture device as a current viewpoint and uses smooth shading to integrate the amounts of light received by the meshes and vertices, to present, from the current viewpoint, the shadow effect and rendered result of the 3D scene under illumination by the light source.
CNB2003101243820A 2003-12-30 2003-12-30 Interactive 3D scene lighting method and system Expired - Lifetime CN100334500C (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CNB2003101243820A CN100334500C (en) 2003-12-30 2003-12-30 Interactive 3D scene lighting method and system


Publications (2)

Publication Number Publication Date
CN1635401A CN1635401A (en) 2005-07-06
CN100334500C true CN100334500C (en) 2007-08-29

Family

ID=34844998

Family Applications (1)

Application Number Title Priority Date Filing Date
CNB2003101243820A Expired - Lifetime CN100334500C (en) 2003-12-30 2003-12-30 Interactive 3D scene lighting method and system

Country Status (1)

Country Link
CN (1) CN100334500C (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1129872A (en) * 1994-10-18 1996-08-28 菲利浦电子有限公司 Transmission system comprising a control circuit
US20010024201A1 (en) * 2000-02-17 2001-09-27 Akihiro Hino Image drawing method, image drawing apparatus, recording medium, and program
US6356264B1 (en) * 1997-12-15 2002-03-12 Sega Enterprises, Ltd. Image processing device and image processing method
US20030112237A1 (en) * 2001-12-13 2003-06-19 Marco Corbetta Method, computer program product and system for rendering soft shadows in a frame representing a 3D-scene


Also Published As

Publication number Publication date
CN1635401A (en) 2005-07-06


Legal Events

C06 / PB01: Publication
C10 / SE01: Entry into substantive examination
C14 / GR01: Patent grant
CX01: Expiry of patent term

Granted publication date: 2007-08-29