CN117379782A - Building indoor rendering method, device, computer equipment and storage medium

Building indoor rendering method, device, computer equipment and storage medium

Info

Publication number
CN117379782A
Authority
CN
China
Prior art keywords
virtual
building
instance
room
indoor
Prior art date
Legal status
Pending
Application number
CN202311348311.2A
Other languages
Chinese (zh)
Inventor
陈尚文
尚鸿
陈星翰
孙钟前
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202311348311.2A
Publication of CN117379782A

Classifications

    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00: Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50: Controlling the output signals based on the game progress
    • A63F13/52: Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00: 3D [Three Dimensional] image rendering
    • G06T15/02: Non-photorealistic rendering
    • A: HUMAN NECESSITIES
    • A63: SPORTS; GAMES; AMUSEMENTS
    • A63F: CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00: Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60: Methods for processing data by generating or executing the game program
    • A63F2300/66: Methods for processing data by generating or executing the game program for rendering three dimensional images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The application relates to a building indoor rendering method, an apparatus, a computer device, a storage medium and a computer program product. The method comprises the following steps: obtaining a virtual building to be rendered, wherein the virtual building is provided with a virtual room and a material instance is arranged on the outer surface of the virtual room; the material instance is used for presenting the scene inside the virtual room; acquiring size information of the material instance, and determining, according to the size information, the scaling of the material instance compared with the original material instance; acquiring position information of the material instance, and determining an indoor area according to the position information and the scaling, wherein the indoor area is the part of the room interior that is visible through the material instance when looking from outside the virtual room into it; and acquiring a preset indoor building material, generating an indoor rendering image according to the indoor building material and the indoor area, and placing the indoor rendering image at the material instance. By adopting the method, the rendering efficiency can be improved.

Description

Building indoor rendering method, device, computer equipment and storage medium
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to a method and apparatus for indoor rendering of a building, a computer device, and a storage medium.
Background
With the development of computer technology, more and more kinds of games can be played on computer devices. In a game scene, virtual buildings are often presented. A virtual building may include a plurality of virtual rooms in which indoor effects may be presented.
At present, indoor rendering images are manually placed behind the material instances of each virtual room, and the indoor effect is conveyed through the placed indoor rendering images. For example, a material instance may be a window of a virtual room, and an indoor rendering image may be pasted behind the window to obtain an indoor rendering effect. However, obtaining the indoor rendering effect manually is inefficient.
Disclosure of Invention
In view of the foregoing, it is desirable to provide a method, an apparatus, a computer device, a computer-readable storage medium, and a computer program product for indoor rendering of a building that can improve indoor rendering efficiency.
In a first aspect, the present application provides a method for indoor rendering of a building, the method comprising:
obtaining a virtual building to be rendered, wherein the virtual building is provided with a virtual room, and the outer surface of the virtual room is provided with a material instance; the material instance is used for presenting a scene inside the virtual room;
acquiring the size information of the material instance, and determining the scaling of the material instance compared with the original material instance according to the size information of the material instance;
acquiring the position information of the material instance, and determining an indoor area according to the position information of the material instance and the scaling, wherein the indoor area is the area of the room interior that is visible through the material instance when viewed from outside the virtual room towards its inside;
and acquiring a preset indoor building material, generating an indoor rendering image according to the indoor building material and the indoor area, and placing the indoor rendering image at the material instance.
In a second aspect, the present application also provides an indoor rendering device for a building, the device comprising:
the virtual building acquisition module is used for acquiring a virtual building to be rendered, wherein the virtual building is provided with a virtual room, and the outer surface of the virtual room is provided with a material instance; the material instance is used for presenting a scene inside the virtual room;
the indoor area determining module is used for acquiring the size information of the material instance and determining the scaling of the material instance compared with the original material instance according to the size information of the material instance; and for acquiring the position information of the material instance and determining an indoor area according to the position information of the material instance and the scaling, wherein the indoor area is the area of the room interior that is visible through the material instance when viewed from outside the virtual room towards its inside;
The rendering module is used for acquiring preset indoor building materials, generating an indoor rendering image according to the indoor building materials and the indoor area, and placing the indoor rendering image at the material instance.
In one embodiment, the obtained position information of the material instance is a relative position between the material instance and the virtual room; the indoor area determining module is also used for acquiring a material two-dimensional image corresponding to the material instance; the material two-dimensional image is an image which is obtained by carrying out projection processing on the material instance and is positioned in a preset material two-dimensional space; performing scaling processing on the material two-dimensional image according to the scaling ratio to obtain the position of the scaled material two-dimensional image in the material two-dimensional space; according to the relative position between the material instance and the virtual room, translating the scaled material two-dimensional image to obtain the position of the scaled and translated material two-dimensional image in the material two-dimensional space; determining a room two-dimensional space corresponding to the virtual room, and determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space; and determining an indoor area according to the position of the scaled and translated material two-dimensional image in the room two-dimensional space.
In one embodiment, the indoor area determining module is further configured to obtain size information of the original material instance, and determine, according to the size information of the original material instance, a size represented by a unit coordinate in the two-dimensional space of the material; acquiring the size information of the virtual room, and determining the size represented by the unit coordinates in the two-dimensional space of the room according to the size information of the virtual room; and converting the scaled and translated material two-dimensional image from the material two-dimensional space to the room two-dimensional space according to the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, so as to obtain the position coordinates of the scaled and translated material two-dimensional image in the room two-dimensional space.
In one embodiment, the size information and the position information of the material instance are stored in instance parameters of the material instance; the instance parameters further comprise at least one of indoor layout information and a home decoration style; the rendering module is further used for outputting, through the indoor building material and based on the indoor layout information, the home decoration style and the indoor area, an indoor rendering image that matches the indoor layout information and includes the home decorations.
In one embodiment, the building indoor rendering device further includes an instance parameter generation module, configured to perform room division processing on the virtual building according to a plurality of material instances in the virtual building, to obtain a plurality of virtual rooms in the virtual building; for each of the plurality of virtual rooms, determine the material instances disposed on the exterior surface of the targeted virtual room; and determine the size and position, relative to the targeted virtual room, of the material instances disposed on the exterior surface of the targeted virtual room.
In one embodiment, the instance parameter generation module is further configured to determine outer contour lines of the virtual building; determine a virtual building surface of the virtual building according to the position relation among the outer contour lines; divide the virtual building into floors based on the virtual building surface to obtain a plurality of virtual floors; and divide each virtual floor into rooms respectively to obtain a plurality of virtual rooms.
In one embodiment, the virtual building is located in a preset building coordinate system; the instance parameter generation module is further configured to, for each outer contour line of the virtual building, screen out, from the outer contour lines of the virtual building, parallel lines parallel to the targeted outer contour line; screen out intersecting lines intersecting the targeted outer contour line; and determine a virtual building surface of the virtual building according to first position information of the targeted outer contour line in the preset building coordinate system, second position information of the parallel lines in the preset building coordinate system, and third position information of the intersecting lines in the preset building coordinate system.
In one embodiment, the instance parameter generation module is further configured to determine a floor height for a current round; obtain the floor division result of the current round by carrying out floor division on the virtual building based on the floor height of the current round and the virtual building surface; determine the floor height of the next round, enter the floor division process of the next round, take the floor height of the next round as the floor height of a new current round, and return to the step of dividing the virtual building into floors using the floor height of the current round until a preset stopping condition is met; and determine a plurality of virtual floors according to the floor division results of the rounds.
In one embodiment, the instance parameter generation module is further configured to construct a parallel line group on the virtual building surface during the floor division of the current round, wherein the distance between every two adjacent horizontal lines in the parallel line group is the floor height of the current round; determine the building surface area between every two adjacent horizontal lines in the parallel line group; and determine the floor division result of the current round according to the building surface areas.
In one embodiment, the instance parameter generation module is further configured to determine, for each round and for the initial virtual floors obtained by the division in that round, whether a material instance in the virtual building is located in at least two initial virtual floors; if so, take the targeted round as a candidate round; for each candidate round, determine the adjacent horizontal lines corresponding to each material instance in the parallel line group constructed in that candidate round, and determine a distance calculation result corresponding to the candidate round according to the distance between each material instance and its corresponding adjacent horizontal lines; screen a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds; and take the initial virtual floors obtained by the division in the target round as the final virtual floors.
In one embodiment, each adjacent horizontal line corresponding to each material instance comprises an upper horizontal line and a lower horizontal line; each upper horizontal line is a horizontal line which is positioned above the corresponding material instance and is nearest to the corresponding material instance; each lower horizontal line is a horizontal line which is positioned below the corresponding material instance and is closest to the corresponding material instance;
The instance parameter generation module is further configured to determine, for each material instance in the virtual building, a first distance between the targeted material instance and the corresponding upper horizontal line, and a second distance between the targeted material instance and the corresponding lower horizontal line; sum the first distance and the second distance to obtain a distance sum corresponding to the targeted material instance; sum the distance sums corresponding to all material instances in the virtual building to obtain the distance calculation result corresponding to the targeted candidate round; and take the candidate round with the smallest distance calculation result as the target round.
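As a rough illustration of how a target round could be selected, the sketch below reduces each material instance to the vertical interval of its bounding box on the building surface and each round to its floor height; these simplifications, and all the names, are assumptions made for the example rather than details taken from the application.

```python
# Minimal sketch: pick the candidate round (floor height) whose horizontal line
# group is, in total, closest to the material instances above and below them.

def horizontal_lines(building_height: float, floor_height: float) -> list[float]:
    """Parallel line group: horizontal lines spaced by the round's floor height."""
    lines, y = [], 0.0
    while y <= building_height:
        lines.append(y)
        y += floor_height
    return lines

def distance_result(instances: list[tuple[float, float]],
                    lines: list[float]) -> float:
    """Sum, over all instances (bottom, top), of the distances to the nearest
    horizontal line above the instance and the nearest line below it."""
    total = 0.0
    for bottom, top in instances:
        above = min((y for y in lines if y >= top), default=None)
        below = max((y for y in lines if y <= bottom), default=None)
        if above is None or below is None:
            return float("inf")      # instance sticks out of the line group
        total += (above - top) + (bottom - below)
    return total

def pick_target_round(candidate_floor_heights: list[float],
                      building_height: float,
                      instances: list[tuple[float, float]]) -> float:
    """Candidate round with the smallest distance calculation result."""
    return min(candidate_floor_heights,
               key=lambda h: distance_result(
                   instances, horizontal_lines(building_height, h)))
```

Under these assumptions, minimising the summed distances simply favours the floor height whose line group hugs the material instances most tightly.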
In one embodiment, the instance parameter generating module is further configured to obtain a preset room division requirement; for each virtual floor in the plurality of virtual floors, performing room division on the virtual floor according to the room division requirement to obtain a plurality of virtual rooms in the virtual floor; the room dividing requirement comprises at least one requirement that the height-width ratio of the divided virtual room is an integer and each material instance is completely divided into one virtual room.
In one embodiment, the virtual building is located in a preset building coordinate system; the instance parameter generation module is further used for determining a first position of the targeted virtual room in the preset building coordinate system; determining a second position of the material instance arranged on the outer surface of the aimed virtual room in the preset building coordinate system; and determining size information and position information of the material instance arranged on the outer surface of the targeted virtual room relative to the targeted virtual room according to the first position and the second position.
In one embodiment, the instance parameter generation module is further configured to obtain the material used to generate a material instance; determine a target surface in the material, and project the material based on the target surface to obtain a projection result projected into a preset material two-dimensional space; and scale the projection result horizontally and vertically to obtain a material two-dimensional image which corresponds to the material and fills the material two-dimensional space.
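A minimal sketch of producing such a material two-dimensional image footprint, assuming the target surface is axis-aligned so that projection simply drops the depth axis; the function and variable names are illustrative.

```python
# Project the target surface of the material onto a 2D plane, then scale it
# horizontally and vertically so the projection spans the unit material UV space.

def project_and_fill_uv(face_vertices_3d: list[tuple[float, float, float]]):
    # Orthographic projection of the target face: keep (x, y), drop depth z.
    pts = [(x, y) for x, y, _z in face_vertices_3d]
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    min_x, max_x, min_y, max_y = min(xs), max(xs), min(ys), max(ys)
    width, height = max_x - min_x, max_y - min_y
    # Scale so the projection fills [0, 1] x [0, 1] (the material UV space).
    return [((x - min_x) / width, (y - min_y) / height) for x, y in pts]
```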
In a third aspect, the present application further provides a computer device, where the computer device includes a memory and a processor, where the memory stores a computer program, and where the processor implements steps in any of the building indoor rendering methods provided in the embodiments of the present application when the computer program is executed.
In a fourth aspect, the present application also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements steps in any of the building indoor rendering methods provided by embodiments of the present application.
In a fifth aspect, the present application also provides a computer program product comprising a computer program which, when executed by a processor, implements steps in any of the building indoor rendering methods provided by the embodiments of the present application.
According to the building indoor rendering method and apparatus, the computer device, the storage medium and the computer program product, the material instance in the virtual building can be determined by acquiring the virtual building to be rendered, and the scaling of the material instance compared with the original material instance is determined according to the size information of the material instance. By acquiring the position information of the material instance, the indoor area visible through the material instance can be determined based on the position information and the size information, so that the indoor area can be rendered and the rendering effect of the indoor area obtained. Compared with the traditional approach of rendering manually, the method and apparatus can automatically render the virtual building to obtain the indoor effect, thereby greatly improving rendering efficiency. In addition, because the indoor area is determined through the scaling, an accurate indoor rendering effect can be obtained no matter how the material instance is scaled; compared with the traditional approach, in which the size of a material instance must remain fixed, this improves the applicability of indoor rendering.
Drawings
FIG. 1 is an application environment diagram of a building indoor rendering method in one embodiment;
FIG. 2 is a flow diagram of a method of rendering an interior of a building in one embodiment;
FIG. 3 is a schematic diagram of a virtual building in one embodiment;
FIG. 4 is a schematic diagram of a two-dimensional image of material in one embodiment;
FIG. 5 is a schematic diagram of rendering effects in one embodiment;
FIG. 6 is a schematic diagram of an overall flow of coordinate transformation in one embodiment;
FIG. 7 is a schematic illustration of a virtual building surface in one embodiment;
FIG. 8 is a schematic diagram of a set of parallel lines in one embodiment;
FIG. 9 is a schematic diagram of adjacent horizontal lines in one embodiment;
FIG. 10 is a schematic view of a virtual room in one embodiment;
FIG. 11 is a schematic diagram of an overall flow of generation of room semantic information in one embodiment;
FIG. 12 is a schematic illustration of the size and location of material instances in a virtual room in one embodiment;
FIG. 13 is a flow chart of a method of rendering an interior of a building in one embodiment;
FIG. 14 is a schematic overall frame diagram of a method of rendering an interior of a building in one embodiment;
FIG. 15 is a block diagram of a building indoor rendering device in one embodiment;
FIG. 16 is an internal structural view of a computer device in another embodiment;
FIG. 17 is an internal structural view of a computer device in one embodiment.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The building indoor rendering method provided by the embodiment of the application can be applied to an application environment shown in fig. 1. Wherein the terminal 102 communicates with the server 104 via a network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104 or may be located on the cloud or other servers. Both the terminal 102 and the server 104 may be used separately to perform the building indoor rendering method provided in the embodiments of the present application. The terminal 102 and the server 104 may also cooperate to perform the building indoor rendering methods provided in embodiments of the present application. Taking the example that the terminal 102 and the server 104 may cooperate to perform the building indoor rendering method provided in the embodiment of the present application, a target application may be run in the terminal 102, for example, a game application may be run. A virtual building may be displayed by the target application. When it is required to render the virtual building, the terminal 102 may generate a rendering request and transmit the rendering request to the server 104, so that the server 104 may generate an indoor rendering image in response to the rendering request and transmit the indoor rendering image to the terminal 102. When the terminal 102 receives the indoor rendering image, the terminal 102 may render a virtual room in the virtual building based on the indoor rendering image. The terminal 102 may be, but not limited to, various desktop computers, notebook computers, smart phones, tablet computers, internet of things devices, aircrafts, and portable wearable devices, and the internet of things devices may be smart speakers, smart televisions, smart air conditioners, smart vehicle devices, and the like. The portable wearable device may be a smart watch, smart bracelet, headset, or the like. The server 104 may be implemented as a stand-alone server or as a server cluster of multiple servers.
It should be noted that the terms "first," "second," and the like as used herein do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. The singular forms "a," "an," or "the" and similar terms do not denote a limitation of quantity, but rather denote the presence of at least one, unless the context clearly dictates otherwise. The numbers of "plurality" or "multiple parts" and the like referred to in the embodiments of the present application each refer to the number of "at least two", for example, "multiple" means "at least two", and "multiple parts" means "at least two parts".
The present application relates to artificial intelligence techniques; for example, the present application may generate images through computer vision techniques within artificial intelligence. Computer Vision (CV) is a science that studies how to make a machine "see"; more specifically, it replaces the human eye with a camera and a computer to perform machine vision tasks such as recognition and measurement on a target, and further performs graphic processing so that the computer produces an image more suitable for human observation or for transmission to an instrument for detection. As a scientific discipline, computer vision studies related theory and technology in an attempt to build artificial intelligence systems that can acquire information from images or multidimensional data. Computer vision technologies typically include image processing, image recognition, image semantic understanding, image retrieval, OCR, video processing, video semantic understanding, video content/behavior recognition, three-dimensional object reconstruction, 3D technology, virtual reality, augmented reality, simultaneous localization and mapping, autonomous driving, intelligent transportation, etc., as well as common biometric technologies such as face recognition and fingerprint recognition.
In one embodiment, as shown in fig. 2, a building indoor rendering method is provided, and the method is described as applied to a computer device. The computer device may be the terminal or the server in fig. 1. The building indoor rendering method comprises the following steps:
step 202, obtaining a virtual building to be rendered, wherein the virtual building is provided with a virtual room, and the outer surface of the virtual room is provided with a material instance; the material instance is used to present a view of the interior of the virtual room.
The virtual building is a three-dimensional building model, and for example, the virtual building can be a building model in a game. In one embodiment, referring to fig. 3, the virtual building may be a model obtained by three-dimensional reconstruction of a building by a three-dimensional reconstruction algorithm. A virtual building may have a plurality of virtual rooms disposed therein, and one or more instances of material may be disposed on an exterior surface of the virtual rooms. Fig. 3 shows a schematic diagram of a virtual building in one embodiment.
A material instance refers to a virtual object through which the interior of the virtual room can be viewed, and is used for presenting the scene inside the virtual room. A material instance may be, for example, a door, a window or a balcony of a virtual room, through which the interior of the virtual room can be seen, for example its furnishings and indoor layout. A material instance may be an instance object obtained by instantiating a material: only one copy of the material needs to be kept in the computer device, and one material instance is created at each location where the material is used. For example, only one copy of a material may be retained in the computer device, while multiple instances of that material are created in a virtual building.
Specifically, when a virtual building rendering task needs to be performed, the computer device may obtain the virtual building to be rendered. For example, when a game application is running on the computer device, a virtual character and a virtual building can be displayed through the game application; when the player controls the virtual character to approach the virtual building, the computer device can generate a virtual building rendering task and execute it to render the virtual building, so that the player sees the rendered virtual building and obtains an immersive effect.
Step 204, obtaining the size information of the material instance, and determining the scaling of the material instance compared with the original material instance according to the size information of the material instance.
Specifically, the material instances arranged on the virtual room have instance parameters in which the size information and the position information of the material instances are recorded. For example, a window arranged on the virtual room is preset with instance parameters including the relative size between the window and the virtual room and the relative position between the window and the virtual room.
When obtaining the instance parameters of the material instance, the computer device may extract the size information of the material instance from the instance parameters, and determine the scaling of the material instance compared to the original material instance according to the size information. In one embodiment, the original material instance is the instance object before the material instance is scaled. When modeling the virtual building, a batch of original material instances can be generated in the virtual building in advance, and the original material instances can then be resized to obtain material instances that better match the virtual building. Alternatively, after the modeled virtual building is divided into rooms to obtain a virtual room, the original material instances located in the virtual room may be resized so that the resized material instances better match the virtual room. For example, when the size of the virtual room is 3 m × 3 m and the size of a window is 2.9 m × 2.9 m, the window would be too large for the virtual room when placed on its outer surface, so the window can be reduced to fit the virtual room.
In one embodiment, the size information in the instance parameters is the relative size between the material instance and the virtual room. The computer device may obtain the actual size of the virtual room and determine the actual size of the material instance based on the actual size of the virtual room and the relative size between the material instance and the virtual room. The computer device may obtain the actual size of the original material instance, and determine the scaling of the material instance compared to the original material instance based on the actual size of the original material instance and the actual size of the material instance. For example, the instance parameters may record that the size of the material instance is half the size of the virtual room; when the height and width of the virtual room are 3 m × 3 m, the actual size of the material instance is determined to be 1.5 m × 1.5 m, and when the actual size of the original material instance is 3 m × 3 m, the material instance is determined to be reduced by half compared to the original material instance.
In one embodiment, the size information in the instance parameters is the actual size of the material instance. At this time, the computer device may acquire the actual size of the original material instance, and determine the scaling of the material instance compared to the original material instance based on the actual size of the material instance and the actual size of the original material instance. For example, the actual size of the material instance is divided by the actual size of the original material instance to obtain the scaling of the material instance compared to the original material instance.
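The scaling computation in both of the embodiments above can be sketched as follows; the helper names and the two-component (width, height) size representation are illustrative assumptions rather than part of the method.

```python
# Minimal sketch of the scaling step: recover the instance's actual size (when
# the instance parameters store a size relative to the room) and divide by the
# original instance's actual size, component-wise.

def actual_size_from_relative(room_size, relative_size):
    """Instance parameters store the size relative to the virtual room."""
    return (room_size[0] * relative_size[0], room_size[1] * relative_size[1])

def scaling_ratio(instance_size, original_size):
    """Component-wise ratio of the material instance to the original instance."""
    return (instance_size[0] / original_size[0],
            instance_size[1] / original_size[1])

# Example from the text: a 3 m x 3 m room, a window half the room's size,
# and an original window of 3 m x 3 m give a scaling of (0.5, 0.5).
window = actual_size_from_relative((3.0, 3.0), (0.5, 0.5))   # (1.5, 1.5)
scale = scaling_ratio(window, (3.0, 3.0))                     # (0.5, 0.5)
```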
Step 206, obtaining the position information of the material instance, and determining an indoor area according to the position information and the scaling of the material instance, wherein the indoor area is the area of the room interior that is visible through the material instance when viewed from outside the virtual room towards its inside.
Specifically, the instance parameters of the material instance further include location information of the material instance, and when the location information and the scaling of the material instance are determined, the indoor area which can be seen when the material instance is seen from the outside of the virtual room to the inside of the virtual room can be determined based on the location information and the scaling.
In one embodiment, the obtained position information of the material instance is the relative position between the material instance and the virtual room; determining the indoor area according to the position and the scaling of the material example comprises the following steps: acquiring a material two-dimensional image corresponding to a material instance; the material two-dimensional image is an image obtained by projecting a material corresponding to a material instance into a preset material two-dimensional space; performing scaling treatment on the material two-dimensional image according to the scaling ratio to obtain the position of the scaled material two-dimensional image in a material two-dimensional space; according to the relative position between the material instance and the virtual room, translating the scaled material two-dimensional image to obtain the position of the scaled material two-dimensional image in the material two-dimensional space; determining a room two-dimensional space corresponding to the virtual room, and determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space; and determining the indoor area according to the position of the scaled and translated material two-dimensional image in the two-dimensional space of the room.
Specifically, when the indoor area needs to be determined, the computer device may acquire a two-dimensional image of the material corresponding to the material instance. The material two-dimensional image is an image obtained by projecting the material corresponding to the material instance into a preset material two-dimensional space. For example, referring to fig. 4, when the material is a window that has not been placed in a virtual building, the front 401 of the window may be projected into a two-dimensional space of the material (also referred to as a material UV space), and the projected image may be scaled so that the projected image can span the entire two-dimensional space of the material, thereby obtaining a two-dimensional image 402 of the material (also referred to as a material UV image). The two-dimensional space of the material can be a 1*1 size space. Fig. 4 shows a schematic representation of a two-dimensional image of material in one embodiment. Wherein the dashed lines in fig. 4 constitute a material two-dimensional space.
Further, when the material two-dimensional image is obtained, the computer device can scale the material two-dimensional image according to the scaling ratio to obtain a scaled material two-dimensional image. For example, when the scaling of the material instance compared to the original material instance is one half, the computer device may reduce the material two-dimensional image by one half. The computer device can then translate the scaled material two-dimensional image according to the relative position between the material instance and the virtual room recorded in the instance parameters, so as to obtain the scaled and translated material two-dimensional image and its position in the material two-dimensional space, where that position matches the position of the material instance in the virtual room. For example, when the instance parameters record that the center of the material instance coincides with the center of the virtual room, the computer device may translate the scaled material two-dimensional image so that its center coincides with the center of the material two-dimensional space.
Further, the computer device determines a room two-dimensional space (also called a room UV space) corresponding to the virtual room, and converts the material two-dimensional image from the material two-dimensional space into the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space, so as to obtain the position of the scaled and translated material two-dimensional image in the room two-dimensional space. This position may comprise the position coordinates of each vertex of the scaled and translated material two-dimensional image in the room two-dimensional space; from these vertex coordinates, both the size and the specific position of the scaled and translated material two-dimensional image in the room two-dimensional space can be determined, so that the indoor area visible through the material instance can be determined based on that size and position. For example, the room two-dimensional space can be regarded as a virtual room with a depth of 0, and the scaled and translated material two-dimensional image can be regarded as a material instance (with a depth of 0) arranged on the outer surface of the virtual room; once the size and specific position of the scaled and translated material two-dimensional image in the room two-dimensional space are obtained, the area it occupies in the room two-dimensional space can be regarded as a planar indoor area. When depth information is added to this planar indoor area, a three-dimensional indoor area is obtained.
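As a rough illustration of the scale-then-translate step, the sketch below represents the material two-dimensional image by its corner coordinates in the unit UV square; scaling about the image centre and interpreting the relative position as a target centre are assumptions made for this example, not details fixed by the text.

```python
# Minimal sketch: scale the material UV image by the instance's scaling ratio,
# then translate it to the instance's position relative to the room.

def scale_about_center(corners, scale):
    cx = sum(u for u, _ in corners) / len(corners)
    cy = sum(v for _, v in corners) / len(corners)
    sx, sy = scale
    return [(cx + (u - cx) * sx, cy + (v - cy) * sy) for u, v in corners]

def translate_to(corners, target_center):
    cx = sum(u for u, _ in corners) / len(corners)
    cy = sum(v for _, v in corners) / len(corners)
    tx, ty = target_center
    return [(u + tx - cx, v + ty - cy) for u, v in corners]

unit_square = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
scaled = scale_about_center(unit_square, (0.5, 0.5))
# e.g. an instance whose centre coincides with the room centre:
placed = translate_to(scaled, (0.5, 0.5))
```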
In the above embodiment, through the above UV coordinate transformation, the indoor area visible through the material instance can be accurately determined no matter how the material instance is scaled, so that the indoor rendering effect obtained subsequently is more accurate. Moreover, because an accurate indoor rendering effect can be obtained no matter how the material instance is scaled, this embodiment also improves the applicability of indoor rendering compared with the traditional approach in which the size of a material instance can only remain fixed.
In one embodiment, determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space includes: acquiring the size information of an original material instance, and determining the size represented by a unit coordinate in a material two-dimensional space according to the size information of the original material instance; acquiring size information of a virtual room, and determining the size represented by unit coordinates in a two-dimensional space of the room according to the size information of the virtual room; and adjusting the position coordinates of the scaled and translated material two-dimensional image in the material two-dimensional space according to the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, so as to obtain the position coordinates of the scaled and translated material two-dimensional image in the room two-dimensional space.
Specifically, the obtained size information of the original material instance may be an actual size of the original material instance. The obtained size information of the virtual room may be an actual size of the virtual room. When the actual size of the original material instance is determined, the size represented by the unit coordinates in the material two-dimensional space can be determined based on the actual size of the original material instance. For example, the width in the actual size of the original material instance may be divided by the width of the material two-dimensional space, and the height in the actual size of the original material instance may be divided by the height of the material two-dimensional space. For example, when the actual size of the front surface of the original material example is determined to be 2.2 m×1.6 m, since the spatial size of the material two-dimensional space is 1*1, it can be determined that the size represented by the unit coordinates of the U (horizontal axis) axis is 2.2 m and the size represented by the unit coordinates of the V (vertical axis) axis is 1.6 m in the material two-dimensional space. Accordingly, the computer device may divide the width in the actual size of the virtual room by the width of the two-dimensional space of the room, and divide the height in the actual size of the room by the height of the two-dimensional space of the room to obtain the size represented by the unit coordinates in the two-dimensional space of the room. For example, when the space size of the two-dimensional space of the room is 1*1 and the actual size of the virtual room is 3 m×3 m, it can be determined that the size represented by the unit coordinates of the U (horizontal axis) axis is 3 m and the size represented by the unit coordinates of the V (vertical axis) axis is 3 m in the two-dimensional space of the room.
Further, when determining the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, the computer device may determine the scaling of the room two-dimensional space relative to the material two-dimensional space based on the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, so as to convert the scaled material two-dimensional image from the material two-dimensional space to the room two-dimensional space based on the scaling of the room two-dimensional space relative to the material two-dimensional space, so as to obtain the position coordinates of the scaled material two-dimensional image in the room two-dimensional space. For example, when the two-dimensional space of the room is reduced by half to obtain the two-dimensional space of the material, the size of the scaled and translated two-dimensional image of the material can be doubled to obtain the position coordinates of the scaled and translated two-dimensional image of the material in the two-dimensional space of the room.
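A minimal sketch of this conversion, assuming both UV spaces are unit squares and that the same per-axis factor derived from the unit-coordinate sizes is applied to the image's corner coordinates; the function names are illustrative.

```python
# Convert from material UV space to room UV space via the physical size that
# one unit coordinate represents in each space.

def unit_size(actual_size, uv_extent=(1.0, 1.0)):
    """Physical size (metres) represented by one unit coordinate along U and V."""
    return (actual_size[0] / uv_extent[0], actual_size[1] / uv_extent[1])

def material_uv_to_room_uv(corners, original_instance_size, room_size):
    mu = unit_size(original_instance_size)   # e.g. (2.2, 1.6) m per unit
    ru = unit_size(room_size)                # e.g. (3.0, 3.0) m per unit
    fx, fy = mu[0] / ru[0], mu[1] / ru[1]    # room-UV units per material-UV unit
    return [(u * fx, v * fy) for u, v in corners]
```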
In the above embodiment, by determining the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, the scale of the room two-dimensional space compared with the material two-dimensional space can be accurately determined based on the determined sizes, so that the scaled and translated material two-dimensional image can be accurately converted into the room two-dimensional space based on the scale.
Step 208, obtaining a preset indoor building material, generating an indoor rendering image according to the indoor building material and the indoor area, and placing the indoor rendering image at the material instance.
A material is the response function of a three-dimensional model surface to illumination, and is used for describing the interaction between the object surface and the illumination in the scene, that is: when light strikes the surface of an object, the material determines the color of the object surface that ultimately appears on the screen. The indoor building material refers to a material describing what is arranged indoors.
Specifically, the computer device may obtain a preset indoor building material, obtain an indoor effect corresponding to an indoor area in the indoor building material, generate an indoor rendering image by the obtained indoor effect, render the indoor area by the generated indoor rendering image, and, for example, place the generated indoor rendering image at the material instance. In one embodiment, referring to fig. 5, when the material instance is a window, the final rendering may result in an indoor scene 501. FIG. 5 illustrates a rendering effect schematic in one embodiment.
In one embodiment, referring to FIG. 6, FIG. 6 illustrates an overall flow diagram of coordinate transformation in one embodiment. When the material example is a window, and the actual size of the original material example corresponding to the material example is 2.2 meters by 1.6 meters, the two-dimensional image of the material corresponding to the original material example is 601. The computer device may determine a compression ratio based on the size information in the instance parameters, scale the material two-dimensional image based on the compression ratio, resulting in a scaled material two-dimensional image 602. The computer device translates the scaled two-dimensional image of material based on the location information in the instance parameters, resulting in a scaled translated two-dimensional image 603 of material. Further, the computer device converts the scaled and translated material two-dimensional image into a room two-dimensional space to obtain an indoor rendering image based on the material two-dimensional image converted into the room two-dimensional space.
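For concreteness, the numbers in this example can be pushed through a compact sketch of the whole coordinate flow; the scale-about-centre convention and the alignment of the material UV origin with the room UV origin are assumptions made only for illustration, not details given in the application.

```python
# End-to-end numeric sketch of the FIG. 6 style flow: an original window front
# of 2.2 m x 1.6 m, an instance reduced to half size, and a 3 m x 3 m room.

orig_size = (2.2, 1.6)   # original material instance (metres)
room_size = (3.0, 3.0)   # virtual room (metres)
scale = (0.5, 0.5)       # from the instance parameters

# 1. Scale the material UV image (initially the unit square) about its centre.
lo = (0.5 - 0.5 * scale[0], 0.5 - 0.5 * scale[1])   # lower-left corner
hi = (0.5 + 0.5 * scale[0], 0.5 + 0.5 * scale[1])   # upper-right corner

# 2. Translate by the instance's position relative to the room (no offset here).

# 3. Convert to room UV space: material UV units measure multiples of the
#    original instance size, room UV units measure multiples of the room size.
def to_room_uv(p):
    return (p[0] * orig_size[0] / room_size[0], p[1] * orig_size[1] / room_size[1])

indoor_area = (to_room_uv(lo), to_room_uv(hi))
print(indoor_area)   # planar footprint of the indoor area, in room UV coordinates
```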
In the building indoor rendering method, the material instance in the virtual building can be determined by acquiring the virtual building to be rendered, and the scaling of the material instance compared with the original material instance is determined according to the size information of the material instance. By acquiring the position information of the material instance, the indoor area visible through the material instance can be determined based on the position information and the size information, so that the indoor area can be rendered and the rendering effect of the indoor area obtained. Compared with the traditional approach of rendering manually, the method can automatically render the virtual building to obtain the indoor effect, thereby greatly improving rendering efficiency. In addition, because the indoor area is determined through the scaling, an accurate indoor rendering effect can be obtained no matter how the material instance is scaled; compared with the traditional approach, in which the size of a material instance must remain fixed, this improves the applicability of indoor rendering.
In one embodiment, the size and position of the material instance are stored in instance parameters of the material instance; the instance parameters further comprise at least one of indoor layout information and a home decoration style; generating an indoor rendering image according to the indoor building material and the indoor area comprises: outputting, through the indoor building material, an indoor rendering image based on the indoor layout information, the home decoration style and the indoor area.
Specifically, when setting the instance parameters, indoor layout information and a home decoration style may also be encoded into the instance parameters. The indoor layout information may be used to indicate the indoor layout of the virtual room, for example, whether the virtual room has a conference room layout, a print room layout, a bedroom layout, or the like. The home decoration style may indicate, for example, whether a curtain is displayed horizontally or vertically, is fully open or half open, and may also indicate whether there are water stains on the home decoration, and so on. When the indoor layout information, the home decoration style and the indoor area are obtained, the indoor building material can generate an indoor rendering image that matches the indoor layout information, the home decoration style and the indoor area.
In this embodiment, the variety of the generated indoor rendering images can be enriched by adding the indoor layout information and the home decoration style to the instance parameters.
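One possible (assumed) shape for such instance parameters is sketched below; the field names and the example layout and decoration values are purely illustrative, not defined by the application.

```python
# Minimal sketch of instance parameters carrying relative size/position plus
# optional indoor layout information and a home decoration style.
from __future__ import annotations
from dataclasses import dataclass
from typing import Optional, Tuple

@dataclass
class MaterialInstanceParams:
    relative_size: Tuple[float, float]      # e.g. (0.5, 0.5): half the room's width/height
    relative_position: Tuple[float, float]  # e.g. (0.5, 0.5): centred on the room face
    indoor_layout: Optional[str] = None     # e.g. "bedroom", "meeting_room"
    decoration_style: Optional[str] = None  # e.g. "curtain_half_open"

window_params = MaterialInstanceParams(
    relative_size=(0.5, 0.5),
    relative_position=(0.5, 0.5),
    indoor_layout="bedroom",
    decoration_style="curtain_half_open",
)
```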
In one embodiment, a virtual building is provided with a plurality of material instances; the size and the position of each material instance are stored in the instance parameters of that material instance; the step of generating the instance parameters comprises: according to the plurality of material instances in the virtual building, performing room division processing on the virtual building to obtain a plurality of virtual rooms in the virtual building; for each of the plurality of virtual rooms, determining the material instances disposed on the exterior surface of the targeted virtual room; determining the size and the position of the material instances disposed on the exterior surface of the targeted virtual room relative to the targeted virtual room; and generating the respective instance parameters of each material instance according to the size and the position of each material instance in the building model relative to its virtual room.
In particular, before using the instance parameters, the instance parameters need to be set, and the steps of setting the instance parameters are explained in detail below. It is readily understood that a digital scene may be provided with a plurality of virtual buildings, for each of which instance parameters may be set in the following manner.
The virtual building obtained by modeling in the digital scene can be provided with a plurality of material instances, but the virtual building model at the moment does not have room semantics, so that the virtual building model needs to be divided into virtual rooms based on the plurality of material instances to obtain the room semantics, and thus, the respective corresponding instance parameters of each material instance on the virtual building are generated based on the room semantics. For example, the virtual building may be a building, and when modeling the building, a plurality of windows, doors, and balconies may be disposed in the building, so that when instance parameters need to be generated, the computer device may divide the building into rooms according to the windows, doors, and balconies in the building to obtain a plurality of rooms in the building, so as to generate respective instance parameters corresponding to the windows, doors, and balconies based on the sizes and positions of the windows, doors, and balconies in the corresponding virtual rooms.
In one embodiment, when a plurality of virtual buildings are included in the digital scene, or a virtual building and a virtual environment in which the virtual buildings are located are included, the virtual building of the instance parameters to be generated may be determined by building identification of the virtual building.
In this embodiment, by generating the room semantics, the size and the position of the material instance with respect to the virtual room may be determined based on the generated room semantics, so that the size and the position of the material instance with respect to the virtual room may be encoded as instance parameters, and thus accurate instance parameters may be obtained.
In one embodiment, performing room division processing on the virtual building according to the plurality of material instances in the virtual building to obtain a plurality of virtual rooms in the virtual building comprises: determining outer contour lines of the virtual building; determining a virtual building surface of the virtual building according to the position relation among the outer contour lines; dividing the virtual building into floors based on the virtual building surface to obtain a plurality of virtual floors; and, for each virtual floor of the plurality of virtual floors, dividing the targeted virtual floor according to the material instances in it to obtain a plurality of virtual rooms in the targeted virtual floor.
Specifically, when a virtual building for which instance parameters are to be generated is obtained, the outer contour lines of the virtual building may be determined first. An outer contour line is a line used to form the contour of the virtual building; for example, referring to fig. 3, the outer contour lines may be 301, 302, and 303 in fig. 3. It will be readily appreciated that fig. 3 only identifies part of the outer contour lines of the virtual building. Further, the computer device may determine a building surface of the virtual building from the outer contour lines; for example, a surface formed by outer contour lines may be used as a building surface of the virtual building. For convenience of description, the building surface of the virtual building is referred to as a virtual building surface hereinafter. For example, referring to fig. 7, the plane formed by the broken lines in fig. 7 is a virtual building surface. When the virtual building surface is obtained, floor division can be performed based on it to obtain a plurality of floors. For convenience of description, the divided floors are referred to as virtual floors hereinafter. FIG. 7 illustrates a schematic diagram of a virtual building surface in one embodiment.
Further, for each of the plurality of virtual floors, the computer device may divide the room for each virtual floor to obtain a plurality of rooms in each virtual floor. For convenience of description, the divided rooms will be referred to as virtual rooms hereinafter. For example, when virtual floors 1 to 10 are divided, division of rooms may be performed for virtual floor 1, resulting in a plurality of virtual rooms in virtual floor 1; the division of rooms may also be performed for virtual floor 2, resulting in multiple virtual rooms in virtual floor 2, which are processed sequentially until multiple virtual rooms in virtual floor 10 are obtained.
It will be readily appreciated that the virtual building may comprise a plurality of virtual building surfaces. For the remaining virtual building surfaces, excluding the upper and lower surfaces, the computer device may divide virtual rooms in the manner described above to obtain the respective virtual rooms on each of those surfaces. Alternatively, for the remaining virtual building surfaces except the upper and lower surfaces, rooms may be divided only for a part of those surfaces, to obtain virtual rooms on that part of the virtual building surfaces.
In the above embodiment, by determining the outer contour line, the virtual building surface can be determined based on the outer contour line, and by determining the virtual building surface, the division of floors can be performed based on the virtual building surface, resulting in a plurality of virtual floors. By determining a plurality of virtual floors, the division of rooms can be performed for each virtual floor, so that a plurality of virtual rooms on each virtual floor can be obtained.
In one embodiment, in a digital scenario, each line in a virtual building has a corresponding class identification, and the outer contour of the virtual building may be determined based on the class identification.
In one embodiment, the virtual building may be contour identified to obtain an outer contour line for constructing the virtual building contour.
In one embodiment, the virtual building is located in a preset building coordinate system, and determining a virtual building surface of the virtual building from the outer contour lines includes: determining the depth position coordinate of each outer contour line of the virtual building in the preset building coordinate system; determining outer contour line groups according to the depth position coordinates, where the difference between the depth position coordinates of the outer contour lines in an outer contour line group is smaller than or equal to a preset difference threshold; and taking the surface of the virtual building formed by an outer contour line group as a virtual building surface.
Specifically, the virtual building is arranged in a preset building coordinate system, and when the outer contour lines are determined, the depth position coordinate of each outer contour line in the preset building coordinate system can be determined. Referring to fig. 7, the preset building coordinate system may be in an axis-aligned relationship with the virtual building; that is, the virtual building may be approximately a cuboid whose height is parallel to the y-axis, whose width is parallel to the x-axis, and whose depth is parallel to the z-axis of the preset building coordinate system. The depth position coordinate may be the position coordinate on the z-axis. The computer device may cluster the outer contour lines according to their depth information to obtain outer contour line groups, where the outer contour lines in a group have the same or similar depth information, that is, the same or similar z-axis coordinates. The computer device uses the surface formed by the outer contour lines in a group as a virtual building surface.
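For illustration only, the depth-based grouping described above can be sketched in Python as follows; the line representation (each outer contour line as a pair of (x, y, z) endpoints) and the threshold value are assumptions of this sketch, not limitations of the method.

```python
def group_contours_by_depth(contour_lines, depth_threshold=0.1):
    """Group outer contour lines whose depth (z) coordinates are within
    depth_threshold of each other; each group forms one virtual building surface."""
    groups = []  # each group: {"z": representative depth, "lines": [...]}
    for line in contour_lines:
        z = (line[0][2] + line[1][2]) / 2.0  # depth position coordinate of the line
        for group in groups:
            if abs(group["z"] - z) <= depth_threshold:
                group["lines"].append(line)
                break
        else:
            groups.append({"z": z, "lines": [line]})
    return [g["lines"] for g in groups]

# Example: two lines at z≈0 form the front surface, one line at z=5 the back surface.
front_a = ((0, 0, 0.0), (10, 0, 0.0))
front_b = ((0, 3, 0.05), (10, 3, 0.05))
back_a = ((0, 0, 5.0), (10, 0, 5.0))
surfaces = group_contours_by_depth([front_a, front_b, back_a])
print(len(surfaces))  # -> 2 virtual building surfaces
```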
In one embodiment, the virtual building is located in a preset building coordinate system, and determining a virtual building surface of the virtual building from the outer contour lines includes: for each outer contour line of the virtual building, screening, from the outer contour lines of the virtual building, parallel lines parallel to the targeted outer contour line; screening out intersecting lines intersecting the targeted outer contour line; and determining a virtual building surface of the virtual building according to first position information of the targeted outer contour line in the preset building coordinate system, second position information of the parallel lines in the preset building coordinate system, and third position information of the intersecting lines in the preset building coordinate system.
Specifically, the virtual building may be approximated as a cuboid and is arranged in a preset building coordinate system that may be in an axis-aligned relationship with the virtual building. For each outer contour line of the virtual building, lines parallel to the targeted outer contour line, called parallel lines, may be screened out of the outer contour lines of the virtual building. For example, referring to fig. 3, when the targeted outer contour line is 301, a parallel line parallel to the outer contour line 301 may be the outer contour line 302. Further, the computer device may screen out, from the outer contour lines of the virtual building, lines intersecting the targeted outer contour line, referred to as intersecting lines. For example, referring to fig. 3, a line intersecting the outer contour line 301 may be the outer contour line 303. The computer device may determine the position information of the targeted outer contour line in the preset building coordinate system, referred to as first position information, the position information of the parallel lines in the preset building coordinate system, referred to as second position information, and the position information of the intersecting lines in the preset building coordinate system, referred to as third position information. The lines with the same or similar depth information among the targeted outer contour line, the parallel lines and the intersecting lines are then determined, that is, the lines with the same or similar z-axis coordinates, and the plane formed by these lines is taken as a virtual building surface. Likewise, the lines with the same or similar x-axis coordinates among the targeted outer contour line, the parallel lines and the intersecting lines may be determined, and the plane formed by them taken as a virtual building surface.
In the above embodiment, by determining the positional relationship between the outer contour lines, the virtual building surfaces on the virtual building can be accurately determined based on that positional relationship.
In one embodiment, dividing the virtual building into floors based on the virtual building surface to obtain a plurality of virtual floors includes: determining the floor height of the current round; obtaining the floor division result of the current round by dividing the virtual building into floors based on the floor height of the current round and the virtual building surface; determining the floor height of the next round, entering the next round of the floor division process, taking the floor height of the next round as the floor height of the new current round, and returning to the step of dividing the virtual building into floors by the floor height of the current round until a preset stop condition is met; and determining the final plurality of virtual floors according to the floor division results of the rounds.
Specifically, when the virtual building surface is determined, virtual floors may be divided through the virtual building surface. For a virtual building surface, layering can be performed from top to bottom; when layering, a parallel line group can be constructed with the floor height H, the parallel line group must not cut any material instance in the virtual building, and the floor height H is optimized by minimizing the sum of distances from each material instance to its upper and lower horizontal lines.
More specifically, the computer device may optimize the floor height H over multiple rounds. In the current round, the computer device can obtain the floor height of the current round and divide the virtual building into floors by that floor height, obtaining a plurality of initial virtual floors, which form the floor division result of the current round. The computer device then enters the floor division process of the next round and obtains the floor height of the next round. The floor height of the next round may be adjusted from the floor height of the previous round, or may be a randomly generated floor height. The computer device divides the virtual building into floors based on the floor height of the next round, obtaining a plurality of initial virtual floors that form the floor division result of that round, and iterates in this way until a preset stop condition is met. The preset stop condition may be set freely according to requirements; for example, the preset stop condition may be determined to be met when a preset number of rounds have been iterated. In theory, when the number of iterations is sufficient, a suitable floor height can be obtained.
Further, when the iteration is stopped, the floor division result of each executed round can be obtained, and the computer equipment can determine the target round according to the floor division result of each round, and the initial virtual floor obtained by dividing the target round is used as the virtual floor obtained by final division.
In one embodiment, the floor height of the current round may be determined based on the floor division result of the previous round. For example, when the floor division result of the previous round indicates that the parallel line group cuts a material instance, the distance between adjacent horizontal lines in the parallel line group can be considered too short, that is, the floor height of the previous round can be considered too small; the floor height of the previous round may then be increased to obtain the floor height of the current round. Conversely, when the floor division result of the previous round indicates that the sum of the distances between the material instances and their upper and lower horizontal lines is greater than a preset distance threshold, the distance between adjacent horizontal lines can be considered too large, that is, the floor height of the previous round can be considered too large; the floor height of the previous round may then be reduced to obtain the floor height of the current round.
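A minimal sketch of this adjustment heuristic, assuming a fixed step size and distance threshold purely for illustration:

```python
def adjust_floor_height(prev_height, cut_material, distance_sum,
                        distance_threshold=2.0, step=0.2):
    """Heuristic adjustment of the floor height between rounds (illustrative only).

    cut_material: True if the previous round's parallel line group cut a material
                  instance (floor height too small -> increase it).
    distance_sum: sum of distances from material instances to their adjacent
                  horizontal lines in the previous round (too large -> decrease it).
    """
    if cut_material:
        return prev_height + step
    if distance_sum > distance_threshold:
        return max(prev_height - step, step)
    return prev_height  # previous floor height already acceptable

print(adjust_floor_height(3.0, cut_material=True, distance_sum=0.5))   # -> 3.2
print(adjust_floor_height(3.0, cut_material=False, distance_sum=4.0))  # -> 2.8
```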
In the above embodiment, by performing the iteration of the rounds, the floor height may be optimized stepwise to obtain the floor height suitable for the virtual building, and further the indoor rendering image suitable for the virtual building may be generated based on the floor height suitable for the virtual building.
In one embodiment, the floor division result of the current round includes the virtual floors obtained by dividing in the current round, and obtaining the floor division result of the current round by dividing the virtual building into floors based on the floor height of the current round and the virtual building surface includes: in the floor division process of the current round, constructing a parallel line group on the virtual building surface, where the distance between every two adjacent horizontal lines in the parallel line group is the floor height of the current round; determining, on the virtual building surface, the building surface area between every two adjacent horizontal lines in the parallel line group; and determining, according to the building surface areas, the virtual floors obtained by dividing in the current round in the virtual building.
Specifically, during the floor division of the current turn, the computer device may construct a set of parallel lines on the virtual building surface based on the floor height of the current turn. For example, referring to fig. 8, a set of parallel lines may be constructed on a virtual building surface 801, where the set of parallel lines includes horizontal lines 802, 803, 804, and so on. It will be readily appreciated that fig. 8 does not label all horizontal lines in the set of parallel lines. Wherein, for every two adjacent horizontal lines in the parallel line group, the distance between the adjacent horizontal lines is the floor height of the current turn. For example, the adjacent horizontal lines may be the horizontal line 802 and the horizontal line 803 in fig. 8, and the distance between the horizontal line 802 and the horizontal line 803 is the floor height of the current turn.
Further, the computer device determines the area between every two adjacent horizontal lines in the parallel line group on the virtual building surface, referred to as a building surface area, and takes each building surface area as a virtual floor. Note that the virtual floor at this point is two-dimensional; if depth information is added to it, a three-dimensional virtual floor can be obtained. For example, extending a horizontal line along the z-axis yields a horizontal plane, and the area between adjacent horizontal planes may be used as a virtual floor. Figure 8 shows a schematic diagram of a parallel line group in one embodiment.
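The construction of the parallel line group and of the building surface areas between adjacent horizontal lines can be sketched as follows; representing each horizontal line by its y-coordinate on the virtual building surface is an assumption of this sketch.

```python
def build_parallel_lines(surface_top, surface_bottom, floor_height):
    """Construct the y-coordinates of the parallel line group on a virtual building
    surface, from top to bottom, spaced by the current round's floor height."""
    lines = []
    y = surface_top
    while y >= surface_bottom:
        lines.append(y)
        y -= floor_height
    return lines

def building_surface_areas(parallel_lines):
    """Each pair of adjacent horizontal lines bounds one (two-dimensional) virtual floor."""
    return [(upper, lower) for upper, lower in zip(parallel_lines, parallel_lines[1:])]

lines = build_parallel_lines(surface_top=30.0, surface_bottom=0.0, floor_height=3.0)
floors = building_surface_areas(lines)
print(len(floors))  # -> 10 two-dimensional virtual floors
```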
In this embodiment, by setting the parallel line groups with the floor height of the current turn, the division of floors of the virtual building can be quickly performed based on the parallel line groups, so that the determination efficiency of the virtual floors is greatly improved.
In one embodiment, determining the final plurality of virtual floors based on the floor division results of the rounds includes: determining, according to the floor division result obtained in each round, whether the parallel line group constructed in the targeted round cuts a material instance in the virtual building; if no material instance in the virtual building is cut, taking the targeted round as a candidate round; for each candidate round, determining the adjacent horizontal lines corresponding to each material instance in the parallel line group constructed in the candidate round, and determining the distance calculation result corresponding to the candidate round according to the distance between each material instance and its corresponding adjacent horizontal lines; and screening a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds, and determining the final plurality of virtual floors according to the floor division result of the target round.
Specifically, for each round, the computer device performs the following steps. When the floor division result of each round is obtained, the computer device determines, for the targeted round, whether the parallel line group constructed in that round cuts a material instance in the virtual building. If a material instance is cut, one material instance is split across two initial virtual floors, and the floor division is wrong. When the parallel line group constructed in the targeted round does not cut any material instance, the floor division may be correct, and the targeted round is taken as a candidate round.
When the candidate rounds are determined, the computer device performs the following steps for each candidate round: determining the adjacent horizontal lines corresponding to each material instance in the parallel line group constructed in the candidate round. The adjacent horizontal lines comprise an upper horizontal line and a lower horizontal line; each upper horizontal line is the horizontal line located above and closest to the corresponding material instance, and each lower horizontal line is the horizontal line located below and closest to the corresponding material instance. For example, referring to fig. 9, when the targeted candidate round is the third round, for the material instance 901 the adjacent horizontal lines are the horizontal line 902 and the horizontal line 903, where the horizontal line 902 is the upper horizontal line, the horizontal line 903 is the lower horizontal line, and both are lines in the parallel line group constructed in the third round. It is easy to understand that when a plurality of material instances are provided in the virtual building, the adjacent horizontal lines corresponding to each material instance can be determined individually. Fig. 9 shows a schematic diagram of adjacent horizontal lines in one embodiment.
Further, when the adjacent horizontal lines corresponding to each material instance are determined, the distance calculation result corresponding to the targeted candidate round may be determined according to the adjacent horizontal lines corresponding to each material instance. For example, in the above example, when the targeted candidate round is the third round, the distance calculation result corresponding to the third round may be determined based on the adjacent horizontal lines corresponding to each material instance.
When the distance calculation result corresponding to each candidate round is determined, the computer equipment screens out the target round from the candidate rounds according to the distance calculation results, and determines a plurality of final virtual floors according to the floor division results of the target round. For example, a plurality of initial virtual floors divided in a target round are used as a plurality of virtual floors for dividing rooms.
In one embodiment, determining the distance calculation result corresponding to the targeted candidate round according to the distance between each material instance and its corresponding adjacent horizontal lines includes: for each material instance in the virtual building, determining a first distance between the targeted material instance and its corresponding upper horizontal line, and a second distance between the targeted material instance and its corresponding lower horizontal line; superimposing the first distance and the second distance to obtain the distance sum corresponding to the targeted material instance; and superimposing the distance sums corresponding to the material instances in the virtual building to obtain the distance calculation result corresponding to the targeted candidate round. Screening the target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds includes: taking the candidate round with the smallest distance calculation result as the target round.
Specifically, for each material instance in the virtual building, the computer device processes as follows. For each material instance, the computer device determines an upper horizontal line corresponding to the targeted material instance and determines a distance between the targeted material instance and the upper horizontal line, referred to as a first distance. The computer device determines a lower horizontal line corresponding to the targeted material instance and determines a distance between the targeted material instance and the lower horizontal line, referred to as a second distance. And the computer equipment superimposes the first distance and the second distance to obtain a distance sum corresponding to the aimed material instance. For example, referring to fig. 9, for the third round, the computer apparatus refers to the distance between the material instance 901 and the horizontal line 902 as a first distance, the distance between the material instance 901 and the horizontal line 903 as a second distance, and superimposes the first distance and the second distance to obtain a distance sum corresponding to the material instance 901.
Further, when the distance sum of each material instance is obtained, the computer device may superimpose these distance sums to obtain the distance calculation result corresponding to the targeted candidate round. For example, for the third round, when the distance sum corresponding to each material instance is obtained, the computer device may superimpose the distance sums to obtain the distance calculation result corresponding to the third round.
When the distance calculation result corresponding to each candidate round is obtained, the computer device may use the candidate round with the smallest distance calculation result as the target round, and use the virtual floor obtained by dividing the target round as the virtual floor obtained by final division.
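A compact sketch of the candidate-round screening and of selecting the target round by the smallest distance calculation result; representing each material instance only by its top and bottom y-coordinates, and the helper names, are illustrative assumptions.

```python
def score_round(material_instances, parallel_lines):
    """Return None if this round's parallel line group cuts a material instance;
    otherwise return the sum, over all instances, of the distances to the nearest
    upper and lower horizontal lines (the round's distance calculation result)."""
    total = 0.0
    for top, bottom in material_instances:  # each instance as (top_y, bottom_y)
        if any(bottom < y < top for y in parallel_lines):
            return None  # a line strictly inside the instance cuts it
        upper = min((y for y in parallel_lines if y >= top), default=None)
        lower = max((y for y in parallel_lines if y <= bottom), default=None)
        if upper is None or lower is None:
            return None
        total += (upper - top) + (bottom - lower)  # first distance + second distance
    return total

def pick_target_round(rounds):
    """rounds: list of (round_id, material_instances, parallel_lines)."""
    candidates = []
    for round_id, instances, lines in rounds:
        result = score_round(instances, lines)
        if result is not None:              # candidate round: no instance is cut
            candidates.append((result, round_id))
    return min(candidates)[1] if candidates else None

rounds = [
    (1, [(7.0, 5.0)], [9.0, 6.0, 3.0, 0.0]),  # the line at 6.0 cuts the instance
    (2, [(7.0, 5.0)], [8.0, 4.0, 0.0]),       # valid; distance sum = 1.0 + 1.0
]
print(pick_target_round(rounds))  # -> 2
```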
In this embodiment, by taking the candidate round with the smallest distance calculation result as the target round, the final division yields as many virtual floors as possible while ensuring that no material instance is located on two virtual floors, so that the divided virtual floors are closer to the floors in a real scene and provide an immersive feeling.
In one embodiment, dividing the targeted virtual floor into a plurality of virtual rooms according to the material instances in the targeted virtual floor includes: obtaining preset room dividing requirements, and dividing the targeted virtual floor according to the room dividing requirements to obtain a plurality of virtual rooms in the targeted virtual floor, where the room dividing requirements include at least one of the following: the aspect ratio of each divided virtual room is an integer, and each material instance is completely divided into one virtual room.
Specifically, after the virtual floors are determined, the computer device may divide rooms for each virtual floor. For each virtual floor of the plurality of virtual floors, the computer device obtains the preset room dividing requirements and divides the rooms of the targeted virtual floor according to them. The room dividing requirements at least comprise: each material instance is completely divided into one virtual room, that is, one material instance cannot be located in two virtual rooms; and the aspect ratio of each divided virtual room is kept as close to an integer ratio as possible, for example 1:1, 2:1 or 3:1. Referring to fig. 10, fig. 10 illustrates a schematic diagram of virtual rooms in one embodiment; in fig. 10, rectangular blocks of the same color belong to one virtual room. In one embodiment, the aspect ratio of a virtual room may be determined based on the proportion of the pre-constructed indoor building material; for example, when the proportion of the pre-constructed indoor building material is 2:1, room division may be attempted at a ratio of 2:1.
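A minimal sketch of checking the two room dividing requirements on one candidate room, assuming rooms and material instances are axis-aligned rectangles on the virtual building surface and allowing a small tolerance on the integer aspect ratio:

```python
from math import isclose

def satisfies_room_requirements(room, material_instances, max_ratio=3):
    """Check the two room dividing requirements on a candidate virtual room.

    room:               (x0, y0, x1, y1) rectangle on the virtual building surface.
    material_instances: list of (x0, y0, x1, y1) rectangles.
    Requirement 1: the room's aspect ratio is (close to) an integer such as 1:1, 2:1, 3:1.
    Requirement 2: every material instance is either completely inside or completely
                   outside the room (no instance is split between two rooms).
    """
    x0, y0, x1, y1 = room
    width, height = x1 - x0, y1 - y0
    ratio = max(width, height) / min(width, height)
    integer_ratio = any(isclose(ratio, k, rel_tol=0.05) for k in range(1, max_ratio + 1))

    def overlaps(a, b):
        return not (a[2] <= b[0] or b[2] <= a[0] or a[3] <= b[1] or b[3] <= a[1])

    def contains(outer, inner):
        return (outer[0] <= inner[0] and outer[1] <= inner[1]
                and inner[2] <= outer[2] and inner[3] <= outer[3])

    no_split = all(contains(room, inst) or not overlaps(room, inst)
                   for inst in material_instances)
    return integer_ratio and no_split

# A 6 x 3 room (2:1) fully containing one window-sized material instance.
print(satisfies_room_requirements((0, 0, 6, 3), [(1, 1, 2, 2)]))  # -> True
```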
In one embodiment, when a virtual room is obtained by division, the computer device may resize the material instance within the virtual room based on the size of the virtual room. For example, when the material instance is a window and the window is nearly as large as the virtual room, the window may be reduced in size so that the reduced window fits the virtual room better. The material instance before resizing is the original material instance mentioned above. That is, the computer device may divide the virtual floors and virtual rooms based on the original material instances, and when the virtual rooms are obtained by division, the size of an original material instance may be scaled to obtain the material instance. The instance parameters corresponding to the material instance are then determined according to the position and size of the material instance in the virtual room, and when indoor rendering of the building is required, the indoor rendering image is generated based on these instance parameters.
In the above embodiment, by setting the room dividing requirements, the virtual floors can be divided based on these requirements, so that the divided virtual rooms more closely resemble rooms in a real environment; the more closely the divided virtual rooms resemble real rooms, the more accurate the division result is.
In one embodiment, referring to FIG. 11, FIG. 11 illustrates an overall flow diagram for generating room semantic information. In the generation of room semantic information, the input is a digital scene, and the computer device may extract virtual buildings from the input digital scene, that is, the computer device separates the buildings. The computer device determines the building surfaces of the extracted virtual buildings to obtain the virtual building surfaces, that is, the computer device performs building surface division. Further, the computer device performs layering on the virtual building to obtain the virtual floors, that is, the computer device performs building surface layering. When the virtual floors are obtained, the computer device can divide rooms on the virtual floors to obtain virtual rooms, that is, the computer device divides the building floors into rooms, obtaining a plurality of virtual rooms; the room information of these virtual rooms is the room semantic information.
In one embodiment, the virtual building is located in a preset building coordinate system, and determining the size and position, relative to the targeted virtual room, of a material instance disposed on the outer surface of the targeted virtual room includes: determining a first position of the targeted virtual room in the preset building coordinate system; determining a second position, in the preset building coordinate system, of the material instance disposed on the outer surface of the targeted virtual room; and determining, based on the first position and the second position, the size information and position information of the material instance relative to the targeted virtual room.
Specifically, the virtual building is located in a preset building coordinate system, and for each virtual room in the virtual building the computer device may proceed as follows. For the targeted virtual room, the computer device may determine its position coordinates in the preset building coordinate system, referred to as the first position. The computer device may determine the material instance set on the targeted virtual room and determine the position coordinates of that material instance in the preset building coordinate system, referred to as the second position. The computer device then determines the size and position of the material instance relative to the targeted virtual room based on the first position and the second position. For example, when the targeted virtual room is a virtual room A on which a material instance B is set, the computer device may determine the actual size of the material instance B from its position coordinates in the preset building coordinate system, determine the actual size of the virtual room A from its position coordinates in the preset building coordinate system, and determine the size of the material instance B relative to the virtual room A from these two actual sizes, thereby obtaining the size information. The computer device may further determine the position of the material instance B relative to the virtual room A from the position coordinates of the material instance B and the position coordinates of the virtual room A in the preset building coordinate system, thereby obtaining the position information.
In one embodiment, referring to fig. 12, the size and position of a material instance in the targeted virtual room may be denoted by (W, H) and (Px, Py), respectively. FIG. 12 illustrates a schematic diagram of the size and position of a material instance in a virtual room in one embodiment.
It will be readily appreciated that, when the size information and the position information are determined, the computer device may use them as the instance parameters of the material instance. For example, in the above example, after determining the size and position of the material instance B in the virtual room A, the computer device may take that size and position as the instance parameters of the material instance B.
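One way to compute the (W, H) and (Px, Py) instance parameters of fig. 12 is sketched below; representing the room and the material instance as axis-aligned rectangles and normalizing by the room size are assumptions of this sketch.

```python
def instance_params_in_room(room, instance):
    """Compute the instance parameters (W, H) and (Px, Py) of a material instance
    relative to its virtual room, both given as (x0, y0, x1, y1) rectangles in the
    preset building coordinate system (normalized to the room's width and height)."""
    rx0, ry0, rx1, ry1 = room
    ix0, iy0, ix1, iy1 = instance
    room_w, room_h = rx1 - rx0, ry1 - ry0
    W = (ix1 - ix0) / room_w          # size of the instance relative to the room
    H = (iy1 - iy0) / room_h
    Px = (ix0 - rx0) / room_w         # position of the instance relative to the room
    Py = (iy0 - ry0) / room_h
    return (W, H), (Px, Py)

# A 2 x 1.5 window placed 1 unit from the left wall and 1 unit above the floor
# of a 6 x 3 room.
print(instance_params_in_room((0, 0, 6, 3), (1, 1, 3, 2.5)))
# -> ((0.333..., 0.5), (0.166..., 0.333...))
```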
In one embodiment, the computer device may also randomly generate indoor layout information and a home decoration style according to the building type of the virtual building, for example, office building, residential building, and the like. The home decoration style includes suitable curtains (horizontal, vertical, fully open, half open, etc.), water stains, and the like; the indoor layout information includes meeting room, printing room, bedroom, and the like. The indoor layout information and the home decoration style are also encoded as instance parameters of the material instance.
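A hypothetical sketch of bundling the randomly generated indoor layout information and home decoration style together with the size and position into instance parameters; the enumerated values and the dictionary encoding are illustrative assumptions, not the encoding actually used by the material system.

```python
import random

# Illustrative enumerations; the concrete values are assumptions of this sketch.
LAYOUTS = {
    "office": ["meeting room", "printing room"],
    "residential": ["bedroom", "living room"],
}
CURTAINS = ["horizontal", "vertical", "fully open", "half open"]

def generate_instance_params(building_type, size, position, seed=None):
    """Randomly generate indoor layout information and a home decoration style for a
    material instance according to the building type, and bundle them with the
    instance's size and position as its instance parameters."""
    rng = random.Random(seed)
    return {
        "size": size,                       # (W, H) relative to the virtual room
        "position": position,               # (Px, Py) relative to the virtual room
        "layout": rng.choice(LAYOUTS.get(building_type, ["bedroom"])),
        "curtain": rng.choice(CURTAINS),
        "water_stain": rng.random() < 0.3,  # whether a water stain effect is applied
    }

print(generate_instance_params("office", (0.33, 0.5), (0.17, 0.33), seed=7))
```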
In this embodiment, by setting the instance parameters, subsequent indoor rendering of the building based on the instance parameters is convenient, rapid and accurate, which improves both the accuracy and the efficiency of rendering.
In one embodiment, the method further comprises: obtaining the material used to generate a material instance; determining a target surface in the material, and projecting the material based on the target surface to obtain a projection result projected into a preset material two-dimensional space; and scaling the projection result horizontally and vertically to obtain a material two-dimensional image that corresponds to the material and fills the material two-dimensional space.
Specifically, in order to obtain an indoor rendering map that fills the entire material instance at rendering time, the material two-dimensional image of the material instance also needs to be preprocessed. The computer device may obtain the material used to generate the material instance and determine a target surface in the material, where the target surface may be a surface specified by a user, for example the front surface of the material. The front surface is the outdoor-facing surface; for example, when the material is a window, referring to fig. 4, the target surface is the geometric surface 401 comprising the glass material portion and the outer contour portion. The computer device projects the target surface into the preset material two-dimensional space by orthogonal projection to obtain an orthogonal projection result. Since the preset material two-dimensional space is a 1*1 space, the actual size of the material is normalized during projection; referring to fig. 4, the orthogonal projection result is 403. The computer device may then scale the orthogonal projection result horizontally and vertically so that it spans the entire material two-dimensional space, resulting in the material two-dimensional image 402 shown in fig. 4.
In one embodiment, the material is a three-dimensional model, and the depth value of the material is set to 0, so that an orthogonal projection result of projecting the target surface into a preset material two-dimensional space is obtained.
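The orthogonal projection of the target surface and the horizontal and vertical scaling that makes the result fill the 1*1 material two-dimensional space can be sketched as follows; representing the target surface as a list of (x, y, z) vertices is an assumption of this sketch.

```python
def project_and_fill(target_surface_vertices):
    """Orthogonally project the target surface of a material into the material
    two-dimensional space by dropping the depth (z), then scale horizontally and
    vertically so the result fills the 1 x 1 UV space."""
    projected = [(x, y) for x, y, _ in target_surface_vertices]  # depth set to 0
    xs = [u for u, _ in projected]
    ys = [v for _, v in projected]
    min_u, max_u = min(xs), max(xs)
    min_v, max_v = min(ys), max(ys)
    return [((u - min_u) / (max_u - min_u), (v - min_v) / (max_v - min_v))
            for u, v in projected]

# A 2 x 1 window front surface becomes the four corners of the full UV space.
window_front = [(0, 0, 0.2), (2, 0, 0.2), (2, 1, 0.2), (0, 1, 0.2)]
print(project_and_fill(window_front))
# -> [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```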
In the above embodiment, the projection result is scaled to obtain the material two-dimensional image which fills the whole material two-dimensional space, so that the indoor rendering map generated based on the material two-dimensional image can fill the area formed by the outer contour of the material instance, and the rendering effect is better.
Referring to fig. 13, fig. 13 illustrates an overall flow diagram of building indoor rendering in one embodiment. The indoor rendering method of the building comprises the following steps:
step 1302, obtaining a virtual building and determining an outer contour line of the virtual building; and determining a virtual building surface of the virtual building according to the position relation among the outer contour lines.
Step 1304, determining the floor height of the current round; obtaining the floor division result of the current round by dividing the virtual building into floors based on the floor height of the current round and the virtual building surface; determining the floor height of the next round and entering the next round of the floor division process, taking the floor height of the next round as the floor height of the new current round, and returning to the step of dividing the virtual building into floors by the floor height of the current round until the preset stop condition is met.
Step 1306, for each round, determining, for the initial virtual floors obtained by the division of the targeted round, whether any material instance in the virtual building is located in at least two initial virtual floors; if no material instance is located in at least two initial virtual floors, taking the targeted round as a candidate round.
Step 1308, for each candidate round, determining the adjacent horizontal line corresponding to each material instance in the parallel line group constructed in the candidate round, and determining the distance calculation result corresponding to the candidate round according to the distance between each material instance and the corresponding adjacent horizontal line.
Step 1310, screening out a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds; the initial virtual floor obtained by dividing the target round is taken as the final virtual floor.
Step 1312, obtaining preset room dividing requirements; for each virtual floor of the plurality of virtual floors, dividing the rooms of the targeted virtual floor according to the room dividing requirements to obtain a plurality of virtual rooms in the targeted virtual floor; the room dividing requirements include at least one of the following: the aspect ratio of each divided virtual room is an integer, and each material instance is completely divided into one virtual room.
Step 1314, for each of the plurality of virtual rooms, determining the material instance disposed on the outer surface of the targeted virtual room; determining the size and position of that material instance relative to the targeted virtual room, and generating the instance parameters of the material instance accordingly.
Step 1316, obtaining the material used to generate a material instance; determining a target surface in the material, and projecting the material based on the target surface to obtain a projection result projected into the preset material two-dimensional space; and scaling the projection result horizontally and vertically to obtain a material two-dimensional image that corresponds to the material and fills the material two-dimensional space.
In step 1318, when the virtual building needs to be rendered, the size information of the material instance on the virtual building is obtained, and the scaling of the material instance compared with the original material instance is determined according to the size information of the material instance.
Step 1320, acquiring a material two-dimensional image corresponding to a material instance; the material two-dimensional image is an image which is obtained by carrying out projection processing on a material instance and is positioned in a preset material two-dimensional space; and performing scaling processing on the material two-dimensional image according to the scaling ratio to obtain the position of the scaled material two-dimensional image in the material two-dimensional space.
Step 1322, translating the scaled two-dimensional image of the material according to the relative position between the material instance and the virtual room, so as to obtain the position of the scaled two-dimensional image of the material in the two-dimensional space of the material; determining a room two-dimensional space corresponding to the virtual room, and determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space; and determining the indoor area according to the position of the scaled and translated material two-dimensional image in the two-dimensional space of the room.
Step 1324, acquiring a preset indoor building material, generating an indoor rendering image according to the indoor building material and the indoor area, and placing the indoor rendering image at the material instance.
It should be understood that, although the steps in the flowcharts related to the embodiments described above are shown sequentially as indicated by arrows, these steps are not necessarily performed in the order indicated by the arrows. Unless explicitly stated herein, the steps are not strictly limited to that order and may be performed in other orders. Moreover, at least some of the steps in the flowcharts described in the above embodiments may include a plurality of steps or stages, which are not necessarily performed at the same time but may be performed at different times; the order of these steps or stages is not necessarily sequential, and they may be performed in turn or alternately with at least some of the other steps or stages.
The building indoor rendering method provided in the present application may be applied to the following application scenario.
Specifically, the application of the building indoor rendering method in the application scene is as follows:
referring to fig. 14, fig. 14 shows an overall frame schematic of a method of rendering an interior of a building in one embodiment. The application inputs the digital scene and the indoor material of the building, and outputs the digital scene with the indoor effect of the building.
1. UV processing of materials
This module takes a digital scene as input and outputs the digital scene after the material UV has been preprocessed. The module establishes a relation between the material UV (U is the horizontal coordinate of the material two-dimensional image in the material two-dimensional space, V is the vertical coordinate) and its physical transformation, so that after a material instance is scaled, the correct UV can be calculated through the physical transformation and the correct indoor effect can be obtained. Observing from the front of the material, the geometric surfaces of the glass material portion and the outer contour portion are selected, UV projection is performed on the selected geometric surfaces by orthogonal projection, and the projection result is scaled horizontally and vertically so that it fills the UV space (also referred to as the material two-dimensional space).
2. Generating building room semantics
This module takes the digital scene as input and outputs a digital scene containing room semantic information. After the steps of building separation, building surface division, building surface layering, building floor room division and the like, the module generates room semantics for the buildings in the scene.
2.1 Building separation. The virtual building part is separated from the digital scene according to the identification of the virtual building in the digital scene.
2.2 Building surface division. The virtual building is divided into different building surfaces according to the edges of the virtual building contour, where each edge of the building contour corresponds to one virtual building surface. Material instances on the same virtual building surface have the same normal and depth offset.
2.3 Building surface layering. The virtual building is constructed by floors, and layering is performed from top to bottom on a virtual building surface. When layering, the floor height H needs to be optimized: a parallel line group is constructed with H as the spacing, the parallel lines must not cut any material instance, and the sum of the distances from each material instance to its nearest upper and lower horizontal lines is minimized.
2.4 Building floor room division. Each material instance can only belong to one virtual room, but one virtual room may contain multiple material instances. Each virtual floor contains several virtual rooms, whose aspect ratios are kept as close to integer ratios as possible, such as 1:1, 2:1, 3:1, etc.
3. Calculating the size and position of material in its room
The input of this module is a digital scene with the building rooms divided, and the output is a city scene containing the size and position of each material instance. Each material instance within a virtual room is traversed and its size and position relative to the virtual room are calculated; the size and position of the material instance within the virtual room determine which portion of the content inside the virtual room should be seen.
4. Setting instance parameters
The input of this module is the digital scene and the indoor building material, and the output is a city scene with the building interiors set. All material instances are traversed in turn, and the following operations are performed: first, the glass material of the material instance is replaced with the indoor building material; second, the size and position of the material instance in its virtual room are encoded into the material instance parameters; then, according to the type of the virtual building (such as office building, residential building, etc.), suitable curtains (horizontal, vertical, fully open, half open, etc.), water stains and an indoor layout (meeting room, printing room, bedroom, etc.) are randomly generated and also encoded as parameters of the material instance; finally, all encoded parameters are written into the material instance and used to transform the UV of the model during rendering. At rendering time, the material two-dimensional image is transformed by parsing the material instance parameters in the indoor building material, and the correct indoor effect is then obtained from the indoor building material. Compared with a manual scheme for placing indoor effects in scene buildings, the problems of high cost and low efficiency are solved; and because a material instance can still produce a correct indoor rendering effect after being scaled, the applicability of the present application is greatly improved.
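One plausible reading of the render-time UV transformation is that the (W, H) and (Px, Py) instance parameters map a fragment's UV on the material instance into the room two-dimensional space from which the indoor building material is sampled; the function below is a sketch under that assumption, not the shader actually used.

```python
def indoor_uv_for_fragment(frag_uv, scale, position):
    """Map a fragment's UV on the (scaled) material instance to the UV inside the
    room two-dimensional space used to sample the indoor building material.

    frag_uv:  (u, v) on the material instance, in [0, 1] x [0, 1].
    scale:    (sx, sy), size of the material instance relative to the virtual room
              (the W, H stored in the instance parameters).
    position: (px, py), position of the instance relative to the room (Px, Py).
    """
    u, v = frag_uv
    sx, sy = scale
    px, py = position
    # scale the material UV to the instance's footprint in room space, then translate
    return (px + u * sx, py + v * sy)

# The centre of a window covering the upper-right quarter of its room maps to the
# centre of that quarter of the indoor rendering image.
print(indoor_uv_for_fragment((0.5, 0.5), scale=(0.5, 0.5), position=(0.5, 0.5)))
# -> (0.75, 0.75)
```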
The method and the device are suitable for the placement of indoor effects for buildings in digital scenes. Given the digital scene and the indoor building material, room semantics are generated for the buildings in the scene by a programmatic placement method, and parameters controlling effects such as curtains, lights switched on or off, and water stains are randomly generated for each room, greatly improving the realism and diversity of the scene buildings. In summary, the present application can realize reasonable and diverse indoor effects for buildings in digital scenes at low cost and with high efficiency.
The building indoor rendering method provided in the present application may also be applied to the following application scenario. Specifically, the application of the building indoor rendering method in this application scenario is as follows:
the terminal can be provided with a game application, and when a user controls a virtual object in a game to approach to a virtual building, the terminal can render the virtual building in the mode, so that the user can see the rendered virtual building. The virtual building may specifically be a virtual building, or the like.
The above application scenario is only schematically illustrated, and it is to be understood that the application of the building indoor rendering method provided in the embodiments of the present application is not limited to the above scenario. For example, a modeling application can be run in the terminal, the virtual building obtained by modeling can be displayed through the modeling application, and when a user desires to render the virtual building, the terminal can render the virtual building through the mode so as to obtain the rendered virtual building.
Based on the same inventive concept, the embodiment of the application also provides a building indoor rendering device for realizing the building indoor rendering method. The implementation of the solution provided by the device is similar to the implementation described in the above method, so the specific limitations in the embodiments of one or more building indoor rendering devices provided below may be referred to the limitations of the building indoor rendering method hereinabove, and will not be repeated here.
In one embodiment, as shown in fig. 15, there is provided a building indoor rendering apparatus 1500, including: a virtual building acquisition module 1502, an indoor area determination module 1504, and a rendering module 1506, wherein:
a virtual building acquisition module 1502, configured to acquire a virtual building to be rendered, where the virtual building is provided with a virtual room, and an outer surface of the virtual room is provided with a material instance; the material instance is used for presenting a scene inside the virtual room;
the indoor area determining module 1504 is configured to obtain size information of a material instance, and determine a scaling of the material instance compared with an original material instance according to the size information of the material instance; acquiring the position information of a material instance, and determining an indoor area according to the position information and the scaling of the material instance, wherein the indoor area is seen when the material instance is seen from the outside of the virtual room to the inside of the virtual room;
The rendering module 1506 is configured to obtain a preset indoor building material, generate an indoor rendering image according to the indoor building material and the indoor area, and place the indoor rendering image at the material instance.
In one embodiment, the obtained position information of the material instance is the relative position between the material instance and the virtual room; the indoor area determining module 1504 is further configured to obtain a two-dimensional image of a material corresponding to the material instance; the material two-dimensional image is an image which is obtained by carrying out projection processing on a material instance and is positioned in a preset material two-dimensional space; performing scaling treatment on the material two-dimensional image according to the scaling ratio to obtain the position of the scaled material two-dimensional image in a material two-dimensional space; according to the relative position between the material instance and the virtual room, translating the scaled material two-dimensional image to obtain the position of the scaled material two-dimensional image in the material two-dimensional space; determining a room two-dimensional space corresponding to the virtual room, and determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space; and determining the indoor area according to the position of the scaled and translated material two-dimensional image in the two-dimensional space of the room.
In one embodiment, the indoor area determining module 1504 is further configured to obtain size information of an original material instance, and determine, according to the size information of the original material instance, a size represented by a unit coordinate in a two-dimensional space of the material; acquiring size information of a virtual room, and determining the size represented by unit coordinates in a two-dimensional space of the room according to the size information of the virtual room; and converting the scaled and translated material two-dimensional image from the material two-dimensional space to the room two-dimensional space according to the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, so as to obtain the position coordinates of the scaled and translated material two-dimensional image in the room two-dimensional space.
In one embodiment, the size information and the position information of the material instance are stored in the instance parameters of the material instance; the example parameters also comprise at least one of indoor layout information and household ornament patterns; the rendering module 1506 is further configured to output an indoor rendering image including a home decoration that matches the indoor layout information based on the indoor layout information, the home decoration style, and the indoor area, through the indoor building material.
In one embodiment, the building indoor rendering device 1500 further includes an instance parameter generating module, configured to perform room division processing on the virtual building according to a plurality of material instances in the virtual building, so as to obtain a plurality of virtual rooms in the virtual building; for each of a plurality of virtual rooms, determining instances of material disposed on exterior surfaces of the virtual room in question; an instance of material disposed on an exterior surface of the virtual room being targeted is determined relative to a size and location of the virtual room being targeted.
In one embodiment, the instance parameter generation module is further configured to determine the outer contour lines of the virtual building; determine the virtual building surfaces of the virtual building according to the positional relationship among the outer contour lines; divide the virtual building into floors based on the virtual building surfaces to obtain a plurality of virtual floors; and divide the rooms of each virtual floor to obtain a plurality of virtual rooms.
In one embodiment, the virtual building is located in a preset building coordinate system, and the instance parameter generation module is further configured to: for each outer contour line of the virtual building, screen, from the outer contour lines of the virtual building, parallel lines parallel to the targeted outer contour line; screen out intersecting lines intersecting the targeted outer contour line; and determine a virtual building surface of the virtual building according to first position information of the targeted outer contour line in the preset building coordinate system, second position information of the parallel lines in the preset building coordinate system, and third position information of the intersecting lines in the preset building coordinate system.
In one embodiment, the instance parameter generation module is further configured to determine the floor height of the current round; obtain the floor division result of the current round by dividing the virtual building into floors based on the floor height of the current round and the virtual building surface; determine the floor height of the next round, enter the next round of the floor division process, take the floor height of the next round as the floor height of the new current round, and return to the step of dividing the virtual building into floors by the floor height of the current round until the preset stop condition is met; and determine a plurality of virtual floors according to the floor division results of the rounds.
In one embodiment, the instance parameter generation module is further configured to construct a parallel line group on the virtual building surface during the floor division process of the current round, where the distance between every two adjacent horizontal lines in the parallel line group is the floor height of the current round; determine the building surface area between every two adjacent horizontal lines in the parallel line group; and determine the floor division result of the current round according to the building surface areas.
In one embodiment, the instance parameter generation module is further configured to determine, for each round and for the initial virtual floors obtained by the division of the targeted round, whether any material instance in the virtual building is located in at least two initial virtual floors; if no material instance is located in at least two initial virtual floors, take the targeted round as a candidate round; for each candidate round, determine the adjacent horizontal lines corresponding to each material instance in the parallel line group constructed in the candidate round, and determine the distance calculation result corresponding to the candidate round according to the distance between each material instance and its corresponding adjacent horizontal lines; screen a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds; and take the initial virtual floors obtained by dividing in the target round as the final virtual floors.
In one embodiment, each material instance includes an upper horizontal line and a lower horizontal line respectively corresponding to adjacent horizontal lines; each upper horizontal line is a horizontal line which is positioned above the corresponding material instance and is nearest to the corresponding material instance; each lower horizontal line is a horizontal line which is positioned below the corresponding material instance and is closest to the corresponding material instance;
the instance parameter generation module is further used for determining, for each material instance in the virtual building, a first distance between the targeted material instance and a corresponding upper horizontal line, and a second distance between the targeted material instance and a corresponding lower horizontal line; overlapping the first distance and the second distance to obtain a distance sum corresponding to the aimed material instance; overlapping the distance sum corresponding to each material instance in the virtual building to obtain a distance calculation result corresponding to the targeted candidate round; and taking the candidate round with the minimum distance calculation result as a target round.
In one embodiment, the instance parameter generation module is further configured to obtain preset room dividing requirements; for each virtual floor of the plurality of virtual floors, divide the rooms of the targeted virtual floor according to the room dividing requirements to obtain a plurality of virtual rooms in the targeted virtual floor; the room dividing requirements include at least one of the following: the aspect ratio of each divided virtual room is an integer, and each material instance is completely divided into one virtual room.
In one embodiment, the virtual building is located in a preset building coordinate system; the instance parameter generation module is also used for determining a first position of the aimed virtual room in a preset building coordinate system; determining a second position of the material instance arranged on the outer surface of the aimed virtual room in a preset building coordinate system; size information and position information of material instances disposed on the exterior surface of the targeted virtual room relative to the targeted virtual room are determined based on the first location and the second location.
In one embodiment, the instance parameter generation module is further configured to obtain the material used to generate a material instance; determine a target surface in the material, and project the material based on the target surface to obtain a projection result projected into the preset material two-dimensional space; and scale the projection result horizontally and vertically to obtain a material two-dimensional image that corresponds to the material and fills the material two-dimensional space.
The respective modules in the above-described building indoor rendering apparatus may be implemented in whole or in part by software, hardware, and a combination thereof. The above modules may be embedded in hardware or may be independent of a processor in the computer device, or may be stored in software in a memory in the computer device, so that the processor may call and execute operations corresponding to the above modules.
In one embodiment, a computer device is provided, which may be a server, and the internal structure of which may be as shown in fig. 16. The computer device includes a processor, a memory, an Input/Output interface (I/O) and a communication interface. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface is connected to the system bus through the input/output interface. Wherein the processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system, computer programs, and a database. The internal memory provides an environment for the operation of the operating system and computer programs in the non-volatile storage media. The database of the computer device is for storing building indoor rendering data. The input/output interface of the computer device is used to exchange information between the processor and the external device. The communication interface of the computer device is used for communicating with an external terminal through a network connection. The computer program is executed by a processor to implement a method of rendering a room in a building.
In one embodiment, a computer device is provided, which may be a terminal, and whose internal structure may be as shown in fig. 17. The computer device includes a processor, a memory, an input/output interface, a communication interface, a display unit, and an input device. The processor, the memory and the input/output interface are connected through a system bus, and the communication interface, the display unit and the input device are connected to the system bus through the input/output interface. The processor of the computer device is configured to provide computing and control capabilities. The memory of the computer device includes a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The internal memory provides an environment for the operation of the operating system and the computer program in the non-volatile storage medium. The input/output interface of the computer device is used to exchange information between the processor and external devices. The communication interface of the computer device is used for wired or wireless communication with an external terminal; the wireless mode can be realized through Wi-Fi, a mobile cellular network, NFC (near field communication) or other technologies. The computer program is executed by the processor to implement a building indoor rendering method. The display unit of the computer device is used for forming a visual picture and may be a display screen, a projection device or a virtual reality imaging device, where the display screen may be a liquid crystal display screen or an electronic ink display screen. The input device of the computer device may be a touch layer covering the display screen, a key, a trackball or a touch pad arranged on the housing of the computer device, or an external keyboard, touch pad or mouse.
It will be appreciated by those skilled in the art that the structures shown in fig. 16-17 are block diagrams of only portions of structures related to the present application and are not intended to limit the computer device on which the present application may be implemented, and that a particular computer device may include more or fewer components than shown, or may combine certain components, or have a different arrangement of components.
In an embodiment, there is also provided a computer device comprising a memory and a processor, the memory having stored therein a computer program, the processor implementing the steps of the method embodiments described above when the computer program is executed.
In one embodiment, a computer-readable storage medium is provided, storing a computer program which, when executed by a processor, implements the steps of the method embodiments described above.
In one embodiment, a computer program product or computer program is provided that includes computer instructions stored in a computer readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the steps in the above-described method embodiments.
It should be noted that, the user information (including, but not limited to, user equipment information, user personal information, etc.) and the data (including, but not limited to, data for analysis, stored data, presented data, etc.) referred to in the present application are information and data authorized by the user or sufficiently authorized by each party, and the collection, use and processing of the related data are required to comply with the related laws and regulations and standards of the related countries and regions.
Those skilled in the art will appreciate that all or part of the methods described above may be implemented by way of a computer program stored on a non-transitory computer-readable storage medium which, when executed, may include the flows of the embodiments of the methods described above. Any reference to memory, database, or other medium used in the embodiments provided herein may include at least one of non-volatile and volatile memory. The non-volatile memory may include Read-Only Memory (ROM), magnetic tape, floppy disk, flash memory, optical memory, high-density embedded non-volatile memory, Resistive Random Access Memory (ReRAM), Magnetoresistive Random Access Memory (MRAM), Ferroelectric Random Access Memory (FRAM), Phase Change Memory (PCM), graphene memory, and the like. Volatile memory may include Random Access Memory (RAM), external cache memory, and the like. By way of illustration and not limitation, RAM can take many forms, such as Static Random Access Memory (SRAM) or Dynamic Random Access Memory (DRAM). The databases referred to in the embodiments provided herein may include at least one of relational and non-relational databases. Non-relational databases may include, but are not limited to, blockchain-based distributed databases and the like. The processors referred to in the embodiments provided herein may be, but are not limited to, general-purpose processors, central processing units, graphics processors, digital signal processors, programmable logic units, or quantum-computing-based data processing logic units.
The technical features of the above embodiments may be combined arbitrarily. For brevity of description, not all possible combinations of the technical features in the above embodiments are described; however, as long as there is no contradiction in a combination of these technical features, the combination should be considered to fall within the scope of this specification.
The above embodiments represent only a few implementations of the present application, and although they are described in relative detail, they are not to be construed as limiting the scope of the present application. It should be noted that various modifications and improvements can be made by those skilled in the art without departing from the spirit of the present application, and all of these fall within the protection scope of the present application. Accordingly, the protection scope of the present application shall be subject to the appended claims.

Claims (18)

1. A method of indoor rendering of a building, the method comprising:
obtaining a virtual building to be rendered, wherein the virtual building is provided with a virtual room, and the outer surface of the virtual room is provided with a material instance; the material instance is used for presenting a scene inside the virtual room;
acquiring the size information of the material instance, and determining the scaling of the material instance compared with the original material instance according to the size information of the material instance;
acquiring the position information of the material instance, and determining an indoor area according to the position information of the material instance and the scaling, wherein the indoor area is the indoor area that is seen through the material instance when looking from the outside of the virtual room toward the inside of the virtual room;
and acquiring a preset indoor building material, generating an indoor rendering image according to the indoor building material and the indoor area, and placing the indoor rendering image at the material instance.
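For illustration only (this is not part of the claim), the scaling step of claim 1 amounts to comparing the instance's size against the original material's size; a hypothetical Python sketch, with (width, height) pairs assumed as the size representation:

def instance_scale(instance_size, original_size):
    # The scaling of the material instance compared with the original material
    # instance is taken here as the per-axis size ratio.
    return (instance_size[0] / original_size[0],
            instance_size[1] / original_size[1])

# e.g. a 2 m x 3 m window instance built from a 1 m x 1 m original material
# would yield a scaling of (2.0, 3.0).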
2. The method according to claim 1, wherein the obtained position information of the material instance is a relative position between the material instance and the virtual room; the determining the indoor area according to the position information of the material instance and the scaling comprises the following steps:
acquiring a material two-dimensional image corresponding to the material instance; the material two-dimensional image is an image which is obtained by carrying out projection processing on the material instance and is positioned in a preset material two-dimensional space;
performing scaling processing on the material two-dimensional image according to the scaling ratio to obtain the position of the scaled material two-dimensional image in the material two-dimensional space;
according to the relative position between the material instance and the virtual room, translating the scaled material two-dimensional image to obtain the position of the scaled and translated material two-dimensional image in the material two-dimensional space;
determining a room two-dimensional space corresponding to the virtual room, and determining the position of the scaled and translated material two-dimensional image in the room two-dimensional space according to the position of the scaled and translated material two-dimensional image in the material two-dimensional space;
and determining an indoor area according to the position of the scaled and translated material two-dimensional image in the room two-dimensional space.
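A minimal, hypothetical Python sketch of the scaling-then-translation described in claim 2, assuming the image is tracked as an axis-aligned rectangle in the material two-dimensional space (a representation chosen only for this sketch):

def place_in_material_space(image_rect, scale, relative_position):
    # image_rect: (u0, v0, u1, v1) of the material two-dimensional image;
    # scale: (sx, sy) determined as in claim 1; relative_position: (du, dv) of
    # the material instance with respect to the virtual room, in the same space.
    u0, v0, u1, v1 = image_rect
    sx, sy = scale
    du, dv = relative_position
    u0, v0, u1, v1 = u0 * sx, v0 * sy, u1 * sx, v1 * sy   # scaling step
    return (u0 + du, v0 + dv, u1 + du, v1 + dv)           # translation step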
3. The method of claim 2, wherein the determining the location of the scaled translated material two-dimensional image in the room two-dimensional space based on the location of the scaled translated material two-dimensional image in the material two-dimensional space comprises:
acquiring the size information of the original material instance, and determining the size represented by the unit coordinates in the material two-dimensional space according to the size information of the original material instance;
acquiring the size information of the virtual room, and determining the size represented by the unit coordinates in the two-dimensional space of the room according to the size information of the virtual room;
and converting the scaled and translated material two-dimensional image from the material two-dimensional space to the room two-dimensional space according to the size represented by the unit coordinates in the material two-dimensional space and the size represented by the unit coordinates in the room two-dimensional space, so as to obtain the position coordinates of the scaled and translated material two-dimensional image in the room two-dimensional space.
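The conversion in claim 3 is essentially a change of units between the two spaces. A hedged Python sketch, assuming each space uses one scalar size per axis for its unit coordinate (an assumption of the sketch):

def to_room_space(point, unit_size_material, unit_size_room):
    # point: (u, v) in the material two-dimensional space;
    # unit_size_material / unit_size_room: physical size represented by one
    # coordinate unit along each axis in the respective space, e.g. metres per unit.
    u, v = point
    return (u * unit_size_material[0] / unit_size_room[0],
            v * unit_size_material[1] / unit_size_room[1])

Applying such a conversion to the corners of the scaled and translated image would yield its position coordinates in the room two-dimensional space.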
4. The method of claim 1, wherein the size information and the position information of the material instance are stored in instance parameters of the material instance; the instance parameters further comprise at least one of indoor layout information and a household ornament style; and the generating an indoor rendering image according to the indoor building material and the indoor area comprises:
outputting, through the indoor building material and based on the indoor layout information, the household ornament style and the indoor area, an indoor rendering image which is matched with the indoor layout information and which comprises household ornaments.
5. The method of claim 1, wherein the step of determining the size information and the position information of the material instance comprises:
performing room division processing on the virtual building according to a plurality of material instances in the virtual building to obtain a plurality of virtual rooms in the virtual building;
for each of the plurality of virtual rooms, determining the material instance disposed on the outer surface of the targeted virtual room;
and determining the size and position of the material instance disposed on the outer surface of the targeted virtual room relative to the targeted virtual room.
6. The method of claim 5, wherein the performing room division processing on the virtual building according to the plurality of material instances in the virtual building to obtain a plurality of virtual rooms in the virtual building comprises:
determining an outer contour of the virtual building;
determining a virtual building surface of the virtual building according to the position relation among the outer contour lines;
dividing the virtual building into floors based on the virtual building surface of the virtual building to obtain a plurality of virtual floors;
and respectively dividing rooms of each virtual floor to obtain a plurality of virtual rooms.
7. The method of claim 6, wherein the virtual building is located in a preset building coordinate system; the determining the virtual building surface of the virtual building according to the position relation between the outer contour lines comprises the following steps:
for each outer contour line in the virtual building, screening out parallel lines parallel to the targeted outer contour line of the virtual building;
screening out intersecting lines intersecting the targeted outer contour line of the virtual building;
and determining the virtual building surface of the virtual building according to first position information of the targeted outer contour line in the preset building coordinate system, second position information of the parallel lines in the preset building coordinate system, and third position information of the intersecting lines in the preset building coordinate system.
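As an illustrative aside only, screening lines into parallel and intersecting sets relative to one outer contour line could be sketched as below (Python); treating contour lines as infinite 2D lines on a single plane is an assumption of this sketch, not the claimed procedure:

def split_parallel_and_intersecting(target_line, other_lines, eps=1e-6):
    # Lines are given as pairs of (x, y) endpoints in the preset building
    # coordinate system, projected to a plane for this sketch.
    def direction(line):
        (x0, y0), (x1, y1) = line
        return (x1 - x0, y1 - y0)
    tx, ty = direction(target_line)
    parallel, intersecting = [], []
    for line in other_lines:
        ox, oy = direction(line)
        cross = tx * oy - ty * ox          # zero cross product means parallel directions
        (parallel if abs(cross) < eps else intersecting).append(line)
    return parallel, intersecting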
8. The method of claim 6, wherein the dividing the virtual building into floors based on the virtual building surface of the virtual building to obtain a plurality of virtual floors comprises:
determining the floor height of the current round;
performing floor division on the virtual building through the floor height of the current round and based on the virtual building surface, to obtain a floor division result of the current round;
determining the floor height of the next round, entering the floor division process of the next round, taking the floor height of the next round as the floor height of the new current round, and returning to the step of performing floor division on the virtual building through the floor height of the current round until a preset stopping condition is met;
and determining a plurality of virtual floors according to the floor division results of each round.
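A hedged Python sketch of the round structure in claim 8; the particular way of choosing the next floor height and the stopping condition (a fixed positive increment and an upper bound) are assumptions of the sketch only:

def run_division_rounds(initial_height, step, max_height, divide_once):
    # divide_once(floor_height) performs one round of floor division and
    # returns that round's floor division result; step is assumed positive.
    results = []
    floor_height = initial_height               # floor height of the current round
    while floor_height <= max_height:           # assumed preset stopping condition
        results.append(divide_once(floor_height))
        floor_height += step                    # floor height of the next round
    return results                              # one division result per round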
9. The method according to claim 8, wherein the performing floor division on the virtual building through the floor height of the current round and based on the virtual building surface to obtain the floor division result of the current round comprises:
in the floor division process of the current round, constructing a parallel line group on the virtual building surface, wherein the distance between every two adjacent horizontal lines in the parallel line group is the floor height of the current round;
determining the building surface area between every two adjacent horizontal lines in the parallel line group;
and determining the floor division result of the current round according to the building surface area.
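For a rectangular building surface, the per-round division of claim 9 could be sketched as below (Python); the rectangular-facade assumption and the return format are choices made for this sketch, and a function of this shape could serve as the divide_once step sketched after claim 8:

def divide_once_on_facade(facade_width, facade_height, floor_height):
    # Construct the parallel line group: horizontal lines spaced by the floor
    # height of the current round, then compute the building surface area
    # between every two adjacent horizontal lines.
    lines = []
    y = 0.0
    while y <= facade_height:
        lines.append(y)
        y += floor_height
    bands = []
    for y0, y1 in zip(lines, lines[1:]):
        bands.append({"band": (y0, y1), "area": facade_width * (y1 - y0)})
    return bands   # floor division result of the current round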
10. The method of claim 8, wherein the floor division result of each round comprises the initial virtual floors obtained by the division in that round; and the determining a plurality of virtual floors according to the floor division results of each round comprises the following steps:
for each round, determining, for the initial virtual floors obtained by the division in the targeted round, whether a material instance in the virtual building is located in at least two of the initial virtual floors;
if a material instance is located in at least two of the initial virtual floors, taking the targeted round as a candidate round;
for each candidate round, determining the adjacent horizontal lines corresponding to each material instance in the parallel line group constructed in the targeted candidate round, and determining a distance calculation result corresponding to the targeted candidate round according to the distance between each material instance and the corresponding adjacent horizontal line;
screening a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds;
and taking the initial virtual floors obtained by the division in the target round as the final virtual floors.
11. The method of claim 10, wherein the adjacent horizontal lines corresponding to each material instance include an upper horizontal line and a lower horizontal line; each upper horizontal line is the horizontal line which is located above the corresponding material instance and is nearest to that material instance; and each lower horizontal line is the horizontal line which is located below the corresponding material instance and is nearest to that material instance;
the determining a distance calculation result corresponding to the targeted candidate round according to the distance between each material instance and the corresponding adjacent horizontal line comprises:
for each material instance in the virtual building, determining a first distance between the targeted material instance and the corresponding upper horizontal line, and determining a second distance between the targeted material instance and the corresponding lower horizontal line;
adding the first distance and the second distance to obtain a distance sum corresponding to the targeted material instance;
adding up the distance sums corresponding to the material instances in the virtual building to obtain a distance calculation result corresponding to the targeted candidate round;
and the screening a target round from the candidate rounds according to the distance calculation results corresponding to the candidate rounds comprises:
and taking the candidate round with the minimum distance calculation result as a target round.
12. The method of claim 6, wherein the dividing each virtual floor into a plurality of virtual rooms comprises:
acquiring a preset room division requirement;
for each virtual floor in the plurality of virtual floors, performing room division on the targeted virtual floor according to the room division requirement to obtain a plurality of virtual rooms in the targeted virtual floor;
wherein the room division requirement comprises at least one of the following requirements: the height-width ratio of each divided virtual room is an integer, and each material instance is completely divided into one virtual room.
13. The method of claim 5, wherein the virtual building is located in a preset building coordinate system; and the determining the size and position of the material instance disposed on the outer surface of the targeted virtual room relative to the targeted virtual room comprises:
determining a first position of the targeted virtual room in the preset building coordinate system;
determining a second position, in the preset building coordinate system, of the material instance arranged on the outer surface of the targeted virtual room;
and determining size information and position information of the material instance arranged on the outer surface of the targeted virtual room relative to the targeted virtual room according to the first position and the second position.
14. The method according to claim 4, wherein the method further comprises:
acquiring a material used for generating the material instance;
determining a target surface in the material, and projecting the material based on the target surface to obtain a projection result projected to a preset material two-dimensional space;
and scaling the projection result horizontally and vertically to obtain a material two-dimensional image which corresponds to the material and fills the material two-dimensional space.
15. An indoor rendering device for a building, the device comprising:
the virtual building acquisition module is used for acquiring a virtual building to be rendered, wherein the virtual building is provided with a virtual room, and the outer surface of the virtual room is provided with a material instance; the material instance is used for presenting a scene inside the virtual room;
the indoor area determining module is used for acquiring the size information of the material instance and determining the scaling of the material instance compared with the original material instance according to the size information of the material instance; acquiring the position information of the material instance, and determining an indoor area according to the position information of the material instance and the scaling, wherein the indoor area is the indoor area that is seen through the material instance when looking from the outside of the virtual room toward the inside of the virtual room;
the rendering module is used for acquiring preset indoor building materials, generating an indoor rendering image according to the indoor building materials and the indoor area, and placing the indoor rendering image at the material instance.
16. A computer device comprising a memory and a processor, the memory storing a computer program, characterized in that the processor implements the steps of the method of any one of claims 1 to 14 when the computer program is executed.
17. A computer readable storage medium, on which a computer program is stored, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any of claims 1 to 14.
18. A computer program product comprising a computer program, characterized in that the computer program, when being executed by a processor, implements the steps of the method of any one of claims 1 to 14.
CN202311348311.2A 2023-10-17 2023-10-17 Building indoor rendering method, device, computer equipment and storage medium Pending CN117379782A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202311348311.2A CN117379782A (en) 2023-10-17 2023-10-17 Building indoor rendering method, device, computer equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202311348311.2A CN117379782A (en) 2023-10-17 2023-10-17 Building indoor rendering method, device, computer equipment and storage medium

Publications (1)

Publication Number Publication Date
CN117379782A true CN117379782A (en) 2024-01-12

Family

ID=89436708

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202311348311.2A Pending CN117379782A (en) 2023-10-17 2023-10-17 Building indoor rendering method, device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN117379782A (en)

Similar Documents

Publication Publication Date Title
US11238644B2 (en) Image processing method and apparatus, storage medium, and computer device
US11989895B2 (en) Capturing environmental features using 2D and 3D scans
CN107004297B (en) Three-dimensional automatic stereo modeling method and program based on two-dimensional plane diagram
Chang et al. Matterport3d: Learning from rgb-d data in indoor environments
Bostanci et al. Augmented reality applications for cultural heritage using Kinect
CN109191369A (en) 2D pictures turn method, storage medium and the device of 3D model
CN109978984A (en) Face three-dimensional rebuilding method and terminal device
CN109360262A (en) The indoor locating system and method for threedimensional model are generated based on CAD diagram
CN108895981A (en) A kind of method for three-dimensional measurement, device, server and storage medium
CN106816077A (en) Interactive sandbox methods of exhibiting based on Quick Response Code and augmented reality
JP2002042169A (en) Three-dimensional image providing system, its method, morphing image providing system, and its method
CN103489216A (en) 3d object scanning using video camera and tv monitor
CN112530005B (en) Three-dimensional model linear structure recognition and automatic restoration method
CN114067041B (en) Material generation method and device of three-dimensional model, computer equipment and storage medium
CN113052951B (en) Object rendering method and device, computer equipment and storage medium
Earl et al. Formal and informal analysis of rendered space: The Basilica Portuense
CN111199573A (en) Virtual-real mutual reflection method, device, medium and equipment based on augmented reality
Boom et al. Interactive light source position estimation for augmented reality with an RGB‐D camera
KR102276451B1 (en) Apparatus and method for modeling using gis
CN110379003A (en) Three-dimensional head method for reconstructing based on single image
CN116310188B (en) Virtual city generation method and storage medium based on instance segmentation and building reconstruction
JP7476511B2 (en) Image processing system, image processing method and program
Peethambaran et al. Enhancing Urban Façades via LiDAR‐Based Sculpting
Divya Udayan et al. Animage-based approach to the reconstruction of ancient architectures by extracting and arranging 3D spatial components
CN117379782A (en) Building indoor rendering method, device, computer equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication