CN111275809A - Space display method and system for ceramic tiles - Google Patents


Info

Publication number
CN111275809A
CN111275809A (application CN202010036911.5A)
Authority
CN
China
Prior art keywords
image
space
dimensional space
display
matting
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010036911.5A
Other languages
Chinese (zh)
Other versions
CN111275809B (en)
Inventor
陈来波 (Chen Laibo)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xinjiang Inpo Information Technology Co ltd
Original Assignee
Xinjiang Inpo Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xinjiang Inpo Information Technology Co ltd
Priority to CN202010036911.5A
Publication of CN111275809A
Application granted
Publication of CN111275809B
Legal status: Active (current)
Anticipated expiration

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 — Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T15/00 — 3D [Three Dimensional] image rendering
    • G06T15/50 — Lighting effects
    • G06T15/506 — Illumination models
    • G06T15/60 — Shadow generation
    • G06T19/00 — Manipulating 3D models or images for computer graphics
    • G06T19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Graphics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Software Systems (AREA)
  • Geometry (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a space display method and system for ceramic tiles. The method includes obtaining a pre-established three-dimensional space model and matting the tile-replacement area of the model to obtain a first matte image; rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image; code-fusing the reflection map of the space effect image with all of the illumination, displaying the resulting fused map with transparency, and generating a shadow reflection map; and combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model to display the superimposed effect in real time. The scheme reduces the economic and labor cost a merchant incurs when showcasing products in rented showroom mock-ups, allows the spatial application style of the tiles to be changed in real time according to customer demands, guarantees the speed and efficiency of updating product displays, and delivers a good user experience.

Description

Space display method and system for ceramic tiles
Technical Field
The invention relates to the technical field of space display, in particular to a space display method and system for a ceramic tile.
Background
When selling ceramic tile products to consumers, tile merchants cannot show the final effect of the product laid in the intended space of the consumer's home before a purchasing decision is made, because tiles are non-standard products. Merchants are therefore limited to presenting products by building mock-up rooms in an exhibition hall.
However, such mock-up rooms occupy costly floor space, products are slow to replace and display, labor costs are high, the experience is poor, the spatial application style of the tiles cannot be changed according to customer demands, and professional tile sales service cannot be provided.
Disclosure of Invention
To solve these problems, the invention provides a space display method and system for tiles that breaks through the limitation of non-standard product attributes when tile manufacturers sell tile products to consumers: the effect of tiles laid in a specified space can be shown before the consumer makes a purchasing decision, greatly improving both the product display and the user experience.
To achieve this purpose, the invention adopts the following technical scheme:
a method of spatial display of tiles, the method comprising:
acquiring a pre-established three-dimensional space model;
carrying out image matting processing on a brick replacing area of the three-dimensional space model to obtain a first image matting image;
rendering the first image matting image into a space effect image consistent with the preset target image illumination attribute;
code fusion is carried out on the reflection image of the space effect image and all illumination, transparency display is carried out on the obtained fusion image, and a shadow reflection image is generated;
and the shadow reflection map, the space effect image and the three-dimensional space model are combined and overlapped to display the overlapping effect in real time.
Preferably, pre-establishing the three-dimensional space model comprises:
sending a three-dimensional space display request to a server, the request including a space scene type;
the server generating a three-dimensional space model according to the space scene type and sending the corresponding three-dimensional space model data;
and receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
Preferably, obtaining the first matte image comprises:
setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and selecting a color key to matte the tile-replacement area of the three-dimensional space model, obtaining the first matte image.
Preferably, code-fusing the reflection map of the space effect image with all of the illumination, displaying the resulting fused map with transparency, and generating the shadow reflection map comprises:
acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
determining the color information of the edge region of the reflection map of the space effect image;
setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value, and fusing the pixels to form a fused image;
and overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
Preferably, combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model comprises:
placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
Preferably, rendering the first matte image into a space effect image consistent with the illumination attributes of the preset target image comprises:
the illumination attribute being a transparency parameter;
converting the first matte image from the RGB color space to the luminance-chrominance YUV color space;
and converting the transparency parameter of the first matte image to be consistent with that of the preset target image.
A space display system for tiles, the system comprising:
an acquisition module for acquiring a pre-established three-dimensional space model;
a first processing module for matting the tile-replacement area of the three-dimensional space model to obtain a first matte image;
a second processing module for rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image;
a third processing module for code-fusing the reflection map of the space effect image with all of the illumination and displaying the resulting fused map with transparency to generate a shadow reflection map;
and a combined display module for combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model and displaying the superimposed effect in real time.
Further, the acquisition module comprises:
a request unit for sending a three-dimensional space display request to a server, the request including a space scene type;
a response unit through which the server generates a three-dimensional space model according to the space scene type and sends the corresponding three-dimensional space model data;
and a receiving unit for receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
Further, the first processing module comprises:
a setting unit for setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and a matting unit for selecting a color key to matte the tile-replacement area of the three-dimensional space model and obtain the first matte image.
The second processing module comprises:
a first conversion unit for converting the first matte image from the RGB color space to the luminance-chrominance YUV color space;
and a second conversion unit for converting the transparency parameter of the first matte image to be consistent with that of the preset target image.
The third processing module comprises:
a processing unit for acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
a determination unit for determining the color information of the edge region of the reflection map of the space effect image;
a fusion unit for setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value and fusing the pixels to form a fused image;
and an acquisition unit for overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
Further, the combined display module comprises:
a first combination unit for placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and a second combination unit for placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
The beneficial effects of the invention are as follows:
The space display method and system for ceramic tiles design scenes of various space styles and space types through modeling and generate a high-definition panoramic three-dimensional model. A tile product can be matched, according to its style, to the space style a consumer likes, presenting the panoramic effect of the product in its application space. This greatly improves the quality of service a tile merchant provides when selling tiles, promotes sales, gives the consumer an important reference for the purchasing decision through the final panoramic effect picture, and provides convenience to the user.
In addition, the method and system can simulate the effect of a designed product matched with its intended carrier; if the displayed effect does not match the intended scene, tile maps of different styles can be swapped at any time until the result is satisfactory, avoiding the labor cost and wasted resources caused by reworking already-laid tiles or replacing them again.
Drawings
To illustrate the detailed description of the invention, or the technical solutions in the prior art, more clearly, the drawings needed for that description are briefly introduced below. Throughout the drawings, like elements or portions are generally identified by like reference numerals, and elements or portions are not necessarily drawn to scale.
Fig. 1 is a flowchart of a space display method for tiles provided in embodiment 1 of the present invention.
Detailed Description
The following describes embodiments of the present invention in further detail with reference to the accompanying drawings.
To give a concrete understanding of the technical solutions provided by the present invention, they are described and illustrated in detail in the following examples. The embodiments provided are clearly not limited to specific details familiar to those skilled in the art, and the detailed description of the preferred embodiments below admits further embodiments of the invention beyond those described here.
The space display method and system for ceramic tiles design scenes of various space styles and space types through 3D modeling and generate a high-definition panorama. A tile product can be matched, according to its style, to the space style a consumer likes, presenting the panoramic effect of the product in its application space. This greatly improves the quality of service a tile merchant provides when selling tiles, promotes sales, and gives the consumer an important reference for the purchasing decision through the designed panoramic effect picture.
As shown in fig. 1, the present invention provides a space display method for tiles, the method comprising:
S1, acquiring a pre-established three-dimensional space model;
S2, matting the tile-replacement area of the three-dimensional space model to obtain a first matte image;
S3, rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image;
S4, code-fusing the reflection map of the space effect image with all of the illumination and displaying the resulting fused map with transparency to generate a shadow reflection map;
and S5, combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model to display the superimposed effect in real time.
In step S1, pre-establishing the three-dimensional space model comprises:
sending a three-dimensional space display request to a server, the request including a space scene type;
the server generating a three-dimensional space model according to the space scene type and sending the corresponding three-dimensional space model data;
and receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
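The patent does not specify a transport or data format for this exchange. As a minimal illustrative sketch only, assuming a hypothetical HTTP endpoint (`/api/space-model`) and hypothetical field names (`scene_type`, `pixels`, `tile_maps`), the client side of step S1 might look like this:

```python
import requests  # assumed transport; the patent does not name a protocol

# Hypothetical request body: the method only requires the space scene type.
display_request = {"scene_type": "living_room"}

# Hypothetical endpoint returning the model data described above: the pixel
# point information of the three-dimensional space plus the replaceable tile
# maps matched to the model. Endpoint and field names are illustrative.
response = requests.post("https://example.com/api/space-model", json=display_request)
response.raise_for_status()

model_data = response.json()
pixel_info = model_data["pixels"]      # pixel point information of the 3D space
tile_maps = model_data["tile_maps"]    # replaceable tile maps matched to the model
```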
In step S2, obtaining the first matte image comprises:
setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and selecting a color key to matte the tile-replacement area of the three-dimensional space model, obtaining the first matte image.
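The patent does not fix a particular matting algorithm beyond selecting a color key. The sketch below is a minimal chroma-key implementation with NumPy and OpenCV; the green key color and the distance tolerance are assumptions, not values taken from the patent. Consistent with Example 2 later in the description, the keyed (tile-replacement) area is made transparent so that a tile map on a lower layer can show through.

```python
import cv2
import numpy as np

def matte_brick_area(render_bgr, key_bgr=(0, 255, 0), tolerance=40.0):
    """Chroma-key matting sketch: pixels close to the key color are treated as
    the tile-replacement area, and that area is made transparent in the first
    matte image (BGRA). Key color and tolerance are illustrative assumptions."""
    diff = render_bgr.astype(np.float32) - np.array(key_bgr, dtype=np.float32)
    dist = np.linalg.norm(diff, axis=2)            # per-pixel distance to the key
    region = dist < tolerance                      # tile-replacement (keyed) region

    matte = cv2.cvtColor(render_bgr, cv2.COLOR_BGR2BGRA)
    matte[:, :, 3] = np.where(region, 0, 255).astype(np.uint8)  # keyed area transparent
    return matte, region.astype(np.uint8) * 255

# Usage sketch: render = cv2.imread("space_render.png")
#               first_matte, region_mask = matte_brick_area(render)
```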
In step S4, code-fusing the reflection map of the space effect image with all of the illumination, displaying the resulting fused map with transparency, and generating the shadow reflection map comprises:
acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
determining the color information of the edge region of the reflection map of the space effect image;
setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value, and fusing the pixels to form a fused image;
and overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
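Concrete parameters for the black mask, the color-similarity range, the preset transparency value, and the smoothing filter are not given in the patent. The sketch below fills them with assumed values (the mean border color as the edge color, a Euclidean similarity threshold, a mid-range alpha, and a small Gaussian blur on the alpha channel) purely for illustration:

```python
import cv2
import numpy as np

def build_shadow_reflection_map(reflection_bgr, alpha_preset=128,
                                similarity_threshold=30.0, blur_ksize=5):
    """Sketch of the shadow-reflection-map step under assumed parameters."""
    # Use the reflection map as the frame of the rendering space and start
    # from a black, fully transparent mask.
    fused = cv2.cvtColor(reflection_bgr, cv2.COLOR_BGR2BGRA)
    fused[:, :, 3] = 0

    # Color information of the edge region (here: mean of the 1-pixel border).
    border = np.concatenate([reflection_bgr[0, :], reflection_bgr[-1, :],
                             reflection_bgr[:, 0], reflection_bgr[:, -1]])
    edge_color = border.reshape(-1, 3).astype(np.float32).mean(axis=0)

    # Regions whose color is similar to the edge color get the preset
    # transparency value; the pixels are fused into one image.
    dist = np.linalg.norm(reflection_bgr.astype(np.float32) - edge_color, axis=2)
    fused[dist < similarity_threshold, 3] = alpha_preset

    # Overlay the reflection map on the fused image and smooth the stitching
    # boundary (here by blurring the alpha channel) to obtain the shadow
    # reflection map.
    shadow = fused.copy()
    shadow[:, :, :3] = reflection_bgr
    shadow[:, :, 3] = cv2.GaussianBlur(shadow[:, :, 3], (blur_ksize, blur_ksize), 0)
    return shadow
```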
In step S3, rendering the first matte image into a space effect image consistent with the illumination attributes of the preset target image comprises:
the illumination attribute being a transparency parameter. Transparency means that light can pass through an object. When rendering transparent objects, it is not sufficient to render only the primitives with the smallest depth values, because primitives lying behind them may remain visible through them. The color of a pixel in the rendered image may therefore be a blend of the color of a transparent primitive with the colors of one or more other primitives. Typically, the rendered image is built up by blending multiple layers of transparent objects, starting with the primitive having the largest depth value and ending with the primitive having the smallest depth value. Not all rendering systems can sort transparent objects themselves, so a software application often has to submit primitives pre-sorted in back-to-front order. In one example of transparency processing, transparent primitives are processed in the ISP 102 (for example, to determine whether they are hidden behind existing non-transparent objects at the sample locations), and the tag buffer is flushed after each transparent primitive so that the primitive can be textured and shaded and then blended, in the pixel buffer, with previously textured and shaded primitives. If the application submits further non-transparent primitives after a transparent primitive, the blended result may be hidden.
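The back-to-front blending described above is the standard "over" compositing operator. The following sketch, using NumPy and assuming the layers are supplied as RGBA arrays already sorted from farthest (largest depth value) to nearest (smallest depth value), shows the blending order the paragraph refers to:

```python
import numpy as np

def blend_back_to_front(layers_rgba):
    """Back-to-front 'over' compositing sketch: layers come pre-sorted from
    the largest depth value (farthest) to the smallest (nearest); each layer
    is blended over the accumulated result using its alpha channel."""
    height, width = layers_rgba[0].shape[:2]
    out = np.zeros((height, width, 3), dtype=np.float32)
    for layer in layers_rgba:                            # farthest first
        rgb = layer[:, :, :3].astype(np.float32)
        alpha = layer[:, :, 3:4].astype(np.float32) / 255.0
        out = rgb * alpha + out * (1.0 - alpha)          # standard "over" operator
    return out.astype(np.uint8)
```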
When the illumination attribute is defined as a transparency parameter, the first matte image is first converted from the RGB color space to the luminance-chrominance YUV color space, and its transparency parameter is then converted to be consistent with that of the preset target image.
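As a minimal sketch of this step, assuming the target transparency is available as a single 0-255 value taken from the preset target image (the patent does not say how the parameter is represented), the conversion could be written as follows:

```python
import cv2
import numpy as np

def match_target_transparency(first_matte_bgra, target_alpha):
    """Sketch of step S3: convert the first matte image from RGB to the
    luminance-chrominance YUV color space, then set its transparency
    parameter to that of the preset target image. target_alpha (0-255) is an
    assumed representation of the target image's transparency parameter."""
    bgr = first_matte_bgra[:, :, :3]
    yuv = cv2.cvtColor(bgr, cv2.COLOR_BGR2YUV)   # luminance/chrominance space
    # (Any illumination adjustment on the Y channel would happen here.)
    adjusted = cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR)

    out = cv2.cvtColor(adjusted, cv2.COLOR_BGR2BGRA)
    # Only pixels that were opaque in the matte receive the target transparency.
    out[:, :, 3] = np.where(first_matte_bgra[:, :, 3] > 0, target_alpha, 0)
    return out
```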
In step S5, combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model comprises:
placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
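Read together with Example 2 below, this gives a three-layer screen: the model at the bottom (carrying the replaceable tile map), the shadow reflection map in the middle, and the space effect image on top. A compositing sketch under that assumption, reusing the "over" operator from the earlier blending example:

```python
import numpy as np

def compose_screen(space_effect_rgba, shadow_reflection_rgba, model_rgb):
    """Layer-order sketch for step S5: three-dimensional space model on the
    bottom layer, shadow reflection map above it, space effect image on the
    first (top) layer, blended bottom-up with the 'over' operator."""
    def over(top_rgba, bottom_rgb):
        alpha = top_rgba[:, :, 3:4].astype(np.float32) / 255.0
        blended = (top_rgba[:, :, :3].astype(np.float32) * alpha +
                   bottom_rgb.astype(np.float32) * (1.0 - alpha))
        return blended.astype(np.uint8)

    with_shadow = over(shadow_reflection_rgba, model_rgb)   # middle over bottom
    return over(space_effect_rgba, with_shadow)             # top over the rest
```

Because the tile-replacement area of the space effect image is transparent (the first matte), swapping the tile map rendered on the bottom layer changes what shows through that area, which is how the superimposed effect can be updated in real time.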
Based on the same inventive concept, this embodiment further provides a tile space display system, comprising:
an acquisition module for acquiring a pre-established three-dimensional space model;
a first processing module for matting the tile-replacement area of the three-dimensional space model to obtain a first matte image;
a second processing module for rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image;
a third processing module for code-fusing the reflection map of the space effect image with all of the illumination and displaying the resulting fused map with transparency to generate a shadow reflection map;
and a combined display module for combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model and displaying the superimposed effect in real time.
The acquisition module comprises:
a request unit for sending a three-dimensional space display request to a server, the request including a space scene type;
a response unit through which the server generates a three-dimensional space model according to the space scene type and sends the corresponding three-dimensional space model data;
and a receiving unit for receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
The first processing module comprises:
a setting unit for setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and a matting unit for selecting a color key to matte the tile-replacement area of the three-dimensional space model and obtain the first matte image.
The second processing module comprises:
a first conversion unit for converting the first matte image from the RGB color space to the luminance-chrominance YUV color space;
and a second conversion unit for converting the transparency parameter of the first matte image to be consistent with that of the preset target image.
The third processing module comprises:
a processing unit for acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
a determination unit for determining the color information of the edge region of the reflection map of the space effect image;
a fusion unit for setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value and fusing the pixels to form a fused image;
and an acquisition unit for overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
The combined display module comprises:
a first combination unit for placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and a second combination unit for placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
Example 2: this embodiment mainly addresses the problem that, because tiles are non-standard products, an existing tile merchant selling tile products to consumers cannot show the final effect of the product laid in the intended space of the consumer's home before the consumer makes a purchasing decision. The method specifically comprises the following steps:
First, a 3DMAX space is designed by conventional means in the field, yielding:
a 3DMAX space effect image obtained after rendering the space;
the reflection map corresponding to the rendered 3DMAX space effect image;
and all of the lighting maps corresponding to the rendered 3DMAX space effect image.
Secondly, 3DMAX is used to build the current space model:
a. the matte of the part where the tile is to be replaced is made transparent and placed on the first layer of the screen for display;
b. the reflection map is given a black transparency mask, fused with the illumination, and displayed on the second layer of the screen;
c. the space model is placed on the third layer of the screen and is used, in this patent, to swap tile maps;
d. steps a, b, and c are displayed together, so that when the tile map on the space model is changed, the spatial light-and-shadow effect of the tile is displayed in real time.
Finally, it should be noted that the above embodiments are only intended to illustrate the technical solution of the present invention, not to limit it. Although the invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some or all of their technical features may be replaced by equivalents; such modifications and substitutions do not depart from the spirit and scope of the present invention and should be construed as falling within the scope of the claims and the description.

Claims (10)

1. A space display method for tiles, characterized in that the method comprises:
acquiring a pre-established three-dimensional space model;
matting the tile-replacement area of the three-dimensional space model to obtain a first matte image;
rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image;
code-fusing the reflection map of the space effect image with all of the illumination, displaying the resulting fused map with transparency, and generating a shadow reflection map;
and combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model to display the superimposed effect in real time.
2. The method of claim 1, wherein pre-establishing the three-dimensional space model comprises:
sending a three-dimensional space display request to a server, the request including a space scene type;
the server generating a three-dimensional space model according to the space scene type and sending the corresponding three-dimensional space model data;
and receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
3. The method of claim 1, wherein obtaining the first matte image comprises:
setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and selecting a color key to matte the tile-replacement area of the three-dimensional space model, obtaining the first matte image.
4. The method of claim 1, wherein code-fusing the reflection map of the space effect image with all of the illumination, displaying the resulting fused map with transparency, and generating the shadow reflection map comprises:
acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
determining the color information of the edge region of the reflection map of the space effect image;
setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value, and fusing the pixels to form a fused image;
and overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
5. The method of claim 1, wherein combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model comprises:
placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
6. The method of claim 1, wherein rendering the first matte image into a space effect image consistent with the illumination attributes of the preset target image comprises:
the illumination attribute being a transparency parameter;
converting the first matte image from the RGB color space to the luminance-chrominance YUV color space;
and converting the transparency parameter of the first matte image to be consistent with that of the preset target image.
7. A space display system for tiles, characterized in that the system comprises:
an acquisition module for acquiring a pre-established three-dimensional space model;
a first processing module for matting the tile-replacement area of the three-dimensional space model to obtain a first matte image;
a second processing module for rendering the first matte image into a space effect image whose illumination attributes are consistent with those of a preset target image;
a third processing module for code-fusing the reflection map of the space effect image with all of the illumination and displaying the resulting fused map with transparency to generate a shadow reflection map;
and a combined display module for combining and superimposing the shadow reflection map, the space effect image, and the three-dimensional space model and displaying the superimposed effect in real time.
8. The system of claim 7, wherein the acquisition module comprises:
a request unit for sending a three-dimensional space display request to a server, the request including a space scene type;
a response unit through which the server generates a three-dimensional space model according to the space scene type and sends the corresponding three-dimensional space model data;
and a receiving unit for receiving the three-dimensional space model data sent by the server, the data comprising the pixel point information contained in the three-dimensional space and the replaceable tile maps matched to the three-dimensional space model.
9. The system of claim 7, wherein the first processing module comprises:
a setting unit for setting at least one editable area on the three-dimensional space model as a tile-replacement area;
and a matting unit for selecting a color key to matte the tile-replacement area of the three-dimensional space model and obtain the first matte image;
the second processing module comprises:
a first conversion unit for converting the first matte image from the RGB color space to the luminance-chrominance YUV color space;
and a second conversion unit for converting the transparency parameter of the first matte image to be consistent with that of the preset target image;
and the third processing module comprises:
a processing unit for acquiring the space effect image, taking its reflection map as the frame of the rendering space, and adding a black transparency mask;
a determination unit for determining the color information of the edge region of the reflection map of the space effect image;
a fusion unit for setting the transparency of the regions of the reflection map whose color similarity to that edge color falls within a preset range to a preset value and fusing the pixels to form a fused image;
and an acquisition unit for overlaying the reflection map of the space effect image on the fused image and smoothing the stitching boundary of the overlaid images to obtain the shadow reflection map.
10. The system of claim 7, wherein the combined display module comprises:
a first combination unit for placing the space effect image above the shadow reflection map, on the first (top) layer of the screen, for display;
and a second combination unit for placing the three-dimensional space model below the shadow reflection map, on the bottom layer of the screen, for display.
CN202010036911.5A 2020-01-14 2020-01-14 Space display method and system for ceramic tiles Active CN111275809B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010036911.5A CN111275809B (en) 2020-01-14 2020-01-14 Space display method and system for ceramic tiles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010036911.5A CN111275809B (en) 2020-01-14 2020-01-14 Space display method and system for ceramic tiles

Publications (2)

Publication Number Publication Date
CN111275809A true CN111275809A (en) 2020-06-12
CN111275809B CN111275809B (en) 2023-09-12

Family

ID=71002995

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010036911.5A Active CN111275809B (en) 2020-01-14 2020-01-14 Space display method and system for ceramic tiles

Country Status (1)

Country Link
CN (1) CN111275809B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132961A (en) * 2020-09-28 2020-12-25 建信金融科技有限责任公司 Panoramic image template-based digital virtual exhibition hall generation method and system

Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6437782B1 (en) * 1999-01-06 2002-08-20 Microsoft Corporation Method for rendering shadows with blended transparency without producing visual artifacts in real time applications
JP2003175274A (en) * 2003-01-08 2003-06-24 Namco Ltd Game system and information memory medium
JP2004164571A (en) * 2002-06-27 2004-06-10 Mitsubishi Electric Research Laboratories Inc Method for modeling three-dimensional object
CN102393970A (en) * 2011-12-13 2012-03-28 北京航空航天大学 Object three-dimensional modeling and rendering system as well as generation and rendering methods of three-dimensional model
CN102722904A (en) * 2012-05-30 2012-10-10 北京尔宜居科技有限责任公司 Local rendering method
US20120307005A1 (en) * 2011-06-03 2012-12-06 Guzman Suarez Angela Generating a simulated three dimensional scene by producing reflections in a two dimensional scene
CN103400412A (en) * 2013-07-17 2013-11-20 天脉聚源(北京)传媒科技有限公司 Resource displaying method, device and terminal
KR20140011675A (en) * 2012-07-18 2014-01-29 한국과학기술원 A physically-based approach to reflection separation
CN103761760A (en) * 2014-01-07 2014-04-30 珠海宜高科技有限公司 Method for manufacturing multi-view indoor design effect picture
CN103886631A (en) * 2014-02-21 2014-06-25 浙江大学 Three-dimensional virtual indoor display system based on mobile equipment
CN107316336A (en) * 2017-06-22 2017-11-03 四川数字工匠科技有限公司 VR real-time rendering systems based on interaction technique
CN107548502A (en) * 2015-02-25 2018-01-05 脸谱公司 Object in characteristic identification volume elements based on the light by object reflection
CN107610211A (en) * 2017-08-09 2018-01-19 中建局集团装饰工程有限公司 A kind of panoramic design sketch renders displaying preparation method, system
CN108509887A (en) * 2018-03-26 2018-09-07 深圳超多维科技有限公司 A kind of acquisition ambient lighting information approach, device and electronic equipment
CN109214898A (en) * 2018-11-08 2019-01-15 新疆初点信息科技有限公司 A kind of ceramics trading service platform and method
CN109960872A (en) * 2019-03-22 2019-07-02 南京可居网络科技有限公司 The virtual soft dress collocation management system of AR and its working method

Patent Citations (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6437782B1 (en) * 1999-01-06 2002-08-20 Microsoft Corporation Method for rendering shadows with blended transparency without producing visual artifacts in real time applications
JP2004164571A (en) * 2002-06-27 2004-06-10 Mitsubishi Electric Research Laboratories Inc Method for modeling three-dimensional object
JP2003175274A (en) * 2003-01-08 2003-06-24 Namco Ltd Game system and information memory medium
US20120307005A1 (en) * 2011-06-03 2012-12-06 Guzman Suarez Angela Generating a simulated three dimensional scene by producing reflections in a two dimensional scene
CN102393970A (en) * 2011-12-13 2012-03-28 北京航空航天大学 Object three-dimensional modeling and rendering system as well as generation and rendering methods of three-dimensional model
CN102722904A (en) * 2012-05-30 2012-10-10 北京尔宜居科技有限责任公司 Local rendering method
KR20140011675A (en) * 2012-07-18 2014-01-29 한국과학기술원 A physically-based approach to reflection separation
CN103400412A (en) * 2013-07-17 2013-11-20 天脉聚源(北京)传媒科技有限公司 Resource displaying method, device and terminal
CN103761760A (en) * 2014-01-07 2014-04-30 珠海宜高科技有限公司 Method for manufacturing multi-view indoor design effect picture
CN103886631A (en) * 2014-02-21 2014-06-25 浙江大学 Three-dimensional virtual indoor display system based on mobile equipment
CN107548502A (en) * 2015-02-25 2018-01-05 脸谱公司 Object in characteristic identification volume elements based on the light by object reflection
CN107316336A (en) * 2017-06-22 2017-11-03 四川数字工匠科技有限公司 VR real-time rendering systems based on interaction technique
CN107610211A (en) * 2017-08-09 2018-01-19 中建局集团装饰工程有限公司 A kind of panoramic design sketch renders displaying preparation method, system
CN108509887A (en) * 2018-03-26 2018-09-07 深圳超多维科技有限公司 A kind of acquisition ambient lighting information approach, device and electronic equipment
CN109214898A (en) * 2018-11-08 2019-01-15 新疆初点信息科技有限公司 A kind of ceramics trading service platform and method
CN109960872A (en) * 2019-03-22 2019-07-02 南京可居网络科技有限公司 The virtual soft dress collocation management system of AR and its working method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
刘宝彬: "Design of a three-dimensional virtual home system with realistic real-time rendering", China Master's Theses Full-text Database (Information Science and Technology) *
刘小晶: "Applied research on effect-image production based on graphics and image software", Journal of Zhangzhou Institute of Technology, no. 03
王乐乐; 吴子朝: "Design and implementation of a three-dimensional model display and annotation system", 电脑迷 (Computer Fan), no. 08

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112132961A (en) * 2020-09-28 2020-12-25 建信金融科技有限责任公司 Panoramic image template-based digital virtual exhibition hall generation method and system
CN112132961B (en) * 2020-09-28 2023-06-06 建信金融科技有限责任公司 Panoramic template-based digital virtual exhibition hall generation method and system

Also Published As

Publication number Publication date
CN111275809B (en) 2023-09-12

Similar Documents

Publication Publication Date Title
CN109448089B (en) Rendering method and device
US6281904B1 (en) Multi-source texture reconstruction and fusion
US6417850B1 (en) Depth painting for 3-D rendering applications
EP3462383B1 (en) A data processing system and method for processing an image of an object
US5058042A (en) Method for employing a hierarchical display list in global rendering
US8633939B2 (en) System and method for painting 3D models with 2D painting tools
JPH11513508A (en) Method and system for manipulating images of floor coverings or other textiles
JP2002183761A (en) Image generation method and device
US20110102424A1 (en) Storyboard generation method and system
US10353073B1 (en) Point cloud colorization system with real-time 3D visualization
US11790610B2 (en) Systems and methods for selective image compositing
CN108377374A (en) Method and system for generating depth information related to an image
US20100262405A1 (en) Methods and apparatus for creating customisable cad image files
CN111275809B (en) Space display method and system for ceramic tiles
CN116843816B (en) Three-dimensional graphic rendering display method and device for product display
CN108198011A (en) For generating the method for order, system and terminal device
WO2019012314A1 (en) Method of displaying a wide-format augmented reality object
CN113313814B (en) Indoor design system and method based on reverse modeling and AR technology
Reche-Martinez et al. View-dependent layered projective texture maps
JP2005284403A (en) Three-dimensional information output system, server, and client
Previtali et al. An automated and accurate procedure for texture mapping from images
Pollard et al. View synthesis by edge transfer with application to the generation of immersive video objects
CN114693895B (en) Map switching method and device, electronic equipment and storage medium
KR101011476B1 (en) System and method for a service coordinating a product image based on on-line providing a background-image
CN114840484A (en) Method and device for sharing products

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant