CN113313809A - Rendering method and device - Google Patents


Info

Publication number
CN113313809A
Authority
CN
China
Prior art keywords
room
room number
component
camera
determining
Prior art date
Legal status
Pending
Application number
CN202110621472.9A
Other languages
Chinese (zh)
Inventor
陈钊
Current Assignee
China Construction Bank Corp
Original Assignee
China Construction Bank Corp
Priority date
Filing date
Publication date
Application filed by China Construction Bank Corp filed Critical China Construction Bank Corp
Priority to CN202110621472.9A
Publication of CN113313809A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/10 Geometric effects
    • G06T15/20 Perspective computation
    • G06T15/205 Image-based rendering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00 Computer-aided design [CAD]
    • G06F30/10 Geometric CAD
    • G06F30/13 Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Civil Engineering (AREA)
  • Architecture (AREA)
  • Computer Graphics (AREA)
  • Computing Systems (AREA)
  • Structural Engineering (AREA)
  • Computational Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Generation (AREA)

Abstract

The application discloses a rendering method and device in the field of computer technology, and in particular in the field of artificial intelligence. The method comprises: obtaining a model to be rendered and determining the room numbers and component numbers in it; determining the center point of the room corresponding to each room number, and screening a component number list for that room number according to the center point and auxiliary lines passing through it; generating a room spatial attribute information pair for each room number from its component number list; generating an index file from the room numbers, their corresponding component number lists, and their corresponding spatial attribute information pairs; and determining the room number in which the camera is located in the current frame of the model to be rendered, then using the index file to determine the component number list for that room number and rendering the corresponding components. In this way, a large number of geometric meshes outside the viewing range are culled, greatly improving the rendering efficiency of the BIM model.

Description

Rendering method and device
Technical Field
The application relates to the field of computer technology, in particular to the field of artificial intelligence, and specifically to a rendering method and device.
Background
At present, Building Information Models (BIM) play a very important role in the whole-process management of building engineering, including emerging industry applications such as smart buildings, smart parks, and smart cities that rely on BIM models. However, the sheer size of BIM models has become an industry-wide problem, and many methods for preprocessing them have appeared. A typical preprocessing step is component decorrelation: after the BIM model is made lightweight in this way, much of the building information inside it is completely lost, only partial data remains, and the original effect of the BIM model can no longer be achieved.
In the process of implementing the present application, the inventor found at least the following problem in the prior art:
after component decorrelation makes the BIM model lightweight, much of the building information inside it is completely lost, only partial data remains, and the original effect of the BIM model can no longer be achieved.
Disclosure of Invention
In view of this, the embodiments of the present application provide a rendering method and apparatus that address the following problem: after existing component decorrelation makes the BIM model lightweight, much of the building information inside it is completely lost, only partial data remains, and the original function of the BIM model can no longer be achieved.
To achieve the above object, according to an aspect of an embodiment of the present application, there is provided a rendering method including:
obtaining a model to be rendered, and determining a room number and a component number in the model to be rendered;
determining a central point of a room in the model to be rendered corresponding to the room number, and screening a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point;
generating room space attribute information pairs corresponding to the room numbers according to the component number list;
generating an index file according to each room number, the component number list corresponding to each room number and the space attribute information pair corresponding to each room number;
and determining the room number of the camera in the current frame of the model to be rendered, and further determining a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component.
Optionally, the screening the component number list corresponding to the room number according to the center point and the auxiliary line passing through the center point includes:
and for the room corresponding to each room number in the model to be rendered, emitting rays from the center point to each direction, determining the component number of the component which is first intersected with each ray, and further generating a component number list corresponding to the room number.
Optionally, generating a room space attribute information pair corresponding to the room number includes:
and calculating the maximum value and the minimum value of the spatial coordinates of all the components in the component number list corresponding to each room number in the model to be rendered, and determining the calculated maximum value and the calculated minimum value of the spatial coordinates of the components corresponding to each room number as a room spatial attribute information pair corresponding to the room number.
Optionally, determining a room number in which a camera in a current frame of the model to be rendered is located includes:
determining the current space coordinate of a camera in the current frame of the model to be rendered;
acquiring the space coordinate of a camera of a previous frame of a current frame of a model to be rendered and determining the space coordinate as the space coordinate of the previous frame;
and determining the room number of the camera in the current frame according to the current space coordinate and the space coordinate of the previous frame.
Optionally, determining, according to the current spatial coordinate and the spatial coordinate of the previous frame, a room number in which the camera in the current frame is located, includes:
and in response to determining that the current spatial coordinates are the same as the previous-frame spatial coordinates, determining that the camera in the current frame is in the same room as the camera in the previous frame, and taking the room number in which the camera was located in the previous frame as the room number in which the camera is located in the current frame.
Optionally, determining a component number list corresponding to the room number where the camera is located includes:
and acquiring a component number list corresponding to the room number where the camera in the previous frame is located, and determining the component number list as the component number list corresponding to the room number where the camera in the current frame is located.
Optionally, determining a component number list corresponding to the room number where the camera is located includes:
and calling the index file to determine a component number list corresponding to the room number of the camera in the current frame according to the room number of the camera in the previous frame.
Optionally, determining, according to the current spatial coordinate and the spatial coordinate of the previous frame, a room number in which the camera in the current frame is located, includes:
and in response to determining that the current spatial coordinates differ from the previous-frame spatial coordinates, calling the index file to compare the current spatial coordinates with each room space attribute information pair in the index file, determining the matching space attribute information pair, and thereby determining the corresponding room number.
In addition, the present application also provides a rendering apparatus, including:
the acquisition unit is configured to acquire a model to be rendered and determine a room number and a component number in the model to be rendered;
the screening unit is configured to determine a central point of a room in the model to be rendered corresponding to the room number, and further screen a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point;
a room space attribute information pair generation unit configured to generate a room space attribute information pair corresponding to the room number from the component number list;
an index file generation unit configured to generate an index file from each room number, the component number list corresponding to each room number, and the space attribute information pair corresponding to each room number;
and the rendering unit is configured to determine the room number of the camera in the current frame of the model to be rendered, and further determine a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component.
Optionally, the screening unit is further configured to:
and for the room corresponding to each room number in the model to be rendered, emitting rays from the center point to each direction, determining the component number of the component which is first intersected with each ray, and further generating a component number list corresponding to the room number.
Optionally, the room space property information pair generating unit is further configured to:
and calculating the maximum value and the minimum value of the spatial coordinates of all the components in the component number list corresponding to each room number in the model to be rendered, and determining the calculated maximum value and the calculated minimum value of the spatial coordinates of the components corresponding to each room number as a room spatial attribute information pair corresponding to the room number.
Optionally, the rendering unit is further configured to:
determining the current space coordinate of a camera in the current frame of the model to be rendered;
acquiring the space coordinate of a camera of a previous frame of a current frame of a model to be rendered and determining the space coordinate as the space coordinate of the previous frame;
and determining the room number of the camera in the current frame according to the current space coordinate and the space coordinate of the previous frame.
Optionally, the rendering unit is further configured to:
and in response to determining that the current spatial coordinates are the same as the previous-frame spatial coordinates, determining that the camera in the current frame is in the same room as the camera in the previous frame, and taking the room number in which the camera was located in the previous frame as the room number in which the camera is located in the current frame.
Optionally, the rendering unit is further configured to:
and acquiring a component number list corresponding to the room number where the camera in the previous frame is located, and determining the component number list as the component number list corresponding to the room number where the camera in the current frame is located.
Optionally, the rendering unit is further configured to:
and calling the index file to determine a component number list corresponding to the room number of the camera in the current frame according to the room number of the camera in the previous frame.
Optionally, the rendering unit is further configured to:
and in response to determining that the current spatial coordinates differ from the previous-frame spatial coordinates, calling the index file to compare the current spatial coordinates with each room space attribute information pair in the index file, determining the matching space attribute information pair, and thereby determining the corresponding room number.
In addition, the present application also provides a rendering electronic device, including: one or more processors; a storage device for storing one or more programs which, when executed by one or more processors, cause the one or more processors to implement the rendering method as described above.
In addition, the present application also provides a computer readable medium, on which a computer program is stored, which when executed by a processor implements the rendering method as described above.
One embodiment of the above invention has the following advantages or benefits: the method obtains a model to be rendered and determines the room numbers and component numbers in it; determines the center point of the room corresponding to each room number, and screens a component number list for that room number according to the center point and auxiliary lines passing through it; generates a room spatial attribute information pair for each room number from its component number list; generates an index file from the room numbers, their corresponding component number lists, and their corresponding spatial attribute information pairs; and determines the room number in which the camera is located in the current frame of the model to be rendered, then uses the index file to determine the component number list for that room number and renders the corresponding components. In this way, a large number of geometric meshes outside the viewing range are culled, greatly improving the rendering efficiency of the BIM model.
Further effects of the above optional implementations will be described below in connection with the embodiments.
Drawings
The drawings are included to provide a further understanding of the application and are not to be construed as limiting the application. In the drawings:
fig. 1 is a schematic diagram of a main flow of a rendering method according to a first embodiment of the present application;
fig. 2 is a schematic diagram of a main flow of a rendering method according to a second embodiment of the present application;
fig. 3 is a schematic view of an application scenario of a rendering method according to a third embodiment of the present application;
FIG. 4 is a schematic diagram of the main modules of a rendering apparatus according to an embodiment of the present application;
FIG. 5 is an exemplary system architecture diagram to which embodiments of the present application may be applied;
fig. 6 is a schematic structural diagram of a computer system suitable for implementing the terminal device or the server according to the embodiment of the present application.
Detailed Description
The following description of exemplary embodiments of the present application, taken in conjunction with the accompanying drawings, includes various details of the embodiments to aid understanding, and these details are to be considered exemplary only. Accordingly, those of ordinary skill in the art will recognize that various changes and modifications can be made to the embodiments described herein without departing from the scope and spirit of the application. Likewise, descriptions of well-known functions and constructions are omitted below for clarity and conciseness.
Fig. 1 is a schematic diagram of a main flow of a rendering method according to a first embodiment of the present application, and as shown in fig. 1, the rendering method includes:
and S101, acquiring a model to be rendered, and determining a room number and a component number in the model to be rendered.
In this embodiment, the execution subject of the rendering method (for example, a server) may obtain, over a wired or wireless connection, a model to be rendered for fast indoor-browsing rendering. The model to be rendered may be a Building Information Model (BIM). Specifically, the execution subject may preprocess the BIM model to obtain each room number and component number: it reads the model configuration information of the BIM model (specifically, the serial numbers 1 to n, the component number corresponding to each serial number, and the component data information set corresponding to each component number, stored in the BIM model file by component number) to build a BIM model database. The room numbers of the rooms in the BIM model are unique and do not repeat. All rooms of the BIM model are numbered, where "room" means not only a room in the traditional sense: any enclosed space in the building, such as a hall or a corridor, is also defined as a room. When determining room numbers, it is ensured that all numbers are unique, i.e. unique identification numbers. If a single-floor model is extracted from the BIM building model, the room numbers may be, for example, 001, 002, 003, 004, 005.
All components are then numbered, for example 1, 2, 3, …, 25, 26. Specifically, all components of the BIM model are numbered without further subdivision; components are the smallest building blocks that make up the BIM model, including walls, panels, beams, columns, windows, doors, stairs, and so on. The numbering of the components follows a rule from small to large, and all numbers are unique.
By way of example, the components may include walls, floors, and ceilings. The component numbers may be wall-1, wall-2, wall-3, wall-4, floor-5, ceiling-6, wall-7, wall-8, wall-9, floor-10, ceiling-11, wall-12, wall-13, wall-14, floor-15, ceiling-16, wall-17, wall-18, wall-19, floor-20, ceiling-21, wall-22, wall-23, wall-24, floor-25, ceiling-26. The component numbers corresponding to each room also do not repeat. For example, the component numbers corresponding to room 001 may include: wall-1, wall-2, wall-3, wall-4, floor-5, ceiling-6. The component numbers corresponding to room 002 may include: wall-4, wall-7, wall-8, wall-9, floor-10, ceiling-11. The component numbers corresponding to room 003 may include: wall-8, wall-12, wall-13, wall-14, floor-15, ceiling-16. The component numbers corresponding to room 004 may include: wall-17, wall-3, wall-9, wall-14, wall-19, wall-18, floor-20, ceiling-21. The component numbers corresponding to room 005 may include: wall-22, wall-18, wall-24, wall-23, floor-25, ceiling-26.
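The room-to-component mapping enumerated above can be written down as a small data-structure sketch. Python is used purely for illustration; the application does not prescribe a language, and the `room_components` name and string numbers are hypothetical, following the patent's example values:

```python
# Hypothetical sketch of the room -> component-number mapping from the example.
# Components shared between rooms (e.g. wall-4) appear in more than one list,
# but numbers within a single room's list do not repeat.
room_components = {
    "001": ["wall-1", "wall-2", "wall-3", "wall-4", "floor-5", "ceiling-6"],
    "002": ["wall-4", "wall-7", "wall-8", "wall-9", "floor-10", "ceiling-11"],
    "003": ["wall-8", "wall-12", "wall-13", "wall-14", "floor-15", "ceiling-16"],
    "004": ["wall-17", "wall-3", "wall-9", "wall-14", "wall-19", "wall-18",
            "floor-20", "ceiling-21"],
    "005": ["wall-22", "wall-18", "wall-24", "wall-23", "floor-25", "ceiling-26"],
}

# wall-4 is the partition shared by rooms 001 and 002.
shared = set(room_components["001"]) & set(room_components["002"])
```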
Step S102, determining a central point of a room in the model to be rendered corresponding to the room number, and further screening a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point.
Specifically, the execution subject may generate a corresponding component number list for every room of the BIM model in turn, in a loop, as follows: determine the center point of the room corresponding to each room number; then, taking each room's center point as a sphere center, cast rays through the center in all directions toward the outside of the room, long enough to extend beyond the room so that every ray is guaranteed to intersect it — these rays are the auxiliary lines. For example, the execution subject may emit rays spherically from a room's center point in all directions over 360°, determine the first component of the BIM model each ray intersects, and add that component's number to a container T(i), where i is the room number. After the rays from the room's center point in all outward directions have been processed, the component numbers in container T(i) are deduplicated, and the remaining component numbers are taken as the component number list corresponding to room number i, defined as L(i).
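The screening step above can be sketched as follows. This is a minimal illustration, not the patented implementation: `cast_ray` is a hypothetical stand-in for the engine's ray/mesh intersection query, and the ray directions are sampled on a fixed spherical grid rather than the continuous 360° sweep described:

```python
import math

def screen_room_components(center, cast_ray, num_theta=36, num_phi=18):
    """Emit rays from the room's center in sampled directions and collect the
    number of the first component each ray hits; deduplicating the hit
    container T(i) yields the component number list L(i).
    `cast_ray(origin, direction)` is an assumed callback returning the
    first-hit component number, or None if nothing is hit."""
    hits = []  # the container T(i) from the description
    for i in range(num_theta):
        theta = 2.0 * math.pi * i / num_theta      # azimuth angle
        for j in range(1, num_phi):
            phi = math.pi * j / num_phi            # inclination angle
            direction = (math.sin(phi) * math.cos(theta),
                         math.sin(phi) * math.sin(theta),
                         math.cos(phi))
            hit = cast_ray(center, direction)
            if hit is not None:
                hits.append(hit)
    # Deduplicate T(i) to obtain the component number list L(i).
    return sorted(set(hits))
```

A finer angular grid (larger `num_theta`/`num_phi`) trades preprocessing time for a lower chance of missing small components.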
In step S103, a room space attribute information pair corresponding to the room number is generated from the component number list.
Specifically, after the room component list screening is complete, the execution subject may compute, for all components in the component number list corresponding to each room in turn, the maximum and minimum spatial coordinates, obtaining a maximum coordinate point Pmax and a minimum coordinate point Pmin for each room: Pmax(x, y, z) = (MAX(L(i).x), MAX(L(i).y), MAX(L(i).z)) and Pmin(x, y, z) = (MIN(L(i).x), MIN(L(i).y), MIN(L(i).z)). Pmax(i) and Pmin(i) are the spatial attribute information of room R(i), where i is the room number, and the pair formed by the minimum and maximum coordinate points corresponding to each room number is the room space attribute information pair for that room number, for example <Pmin(001), Pmax(001)>, where 001 is the room number.
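The <Pmin, Pmax> pair is simply an axis-aligned bounding box over the room's components. A minimal sketch, assuming the component geometry is available as (x, y, z) vertex tuples (the function name and input shape are illustrative, not from the application):

```python
def room_aabb(component_vertices):
    """Compute the room space attribute information pair <Pmin, Pmax>:
    the per-axis minimum and maximum over the vertices of all components
    in the room's component number list L(i)."""
    xs, ys, zs = zip(*component_vertices)
    pmin = (min(xs), min(ys), min(zs))  # Pmin(x, y, z)
    pmax = (max(xs), max(ys), max(zs))  # Pmax(x, y, z)
    return pmin, pmax
```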
In step S104, an index file is generated from each room number, the component number list corresponding to each room number, and the pair of spatial attribute information corresponding to each room number.
The execution subject may store each room number of the BIM model, the component number list corresponding to each room number, and the space attribute information pair corresponding to each room number in one-to-one correspondence in list form, thereby generating the room component information index file of the BIM model, namely the index file F.
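The index file F can be sketched as a per-room record of the component number list and the <Pmin, Pmax> pair. JSON is an assumed serialization here; the application only requires that the three items be stored in list form in one-to-one correspondence:

```python
import json

def build_index_file(component_lists, aabbs):
    """Assemble the index file F: for each room number, store its component
    number list L(i) and its space attribute information pair <Pmin, Pmax>.
    `component_lists` maps room number -> L(i); `aabbs` maps room number ->
    (Pmin, Pmax). Both inputs are illustrative shapes."""
    index = {
        room: {"components": component_lists[room],
               "pmin": list(aabbs[room][0]),
               "pmax": list(aabbs[room][1])}
        for room in component_lists
    }
    return json.dumps(index)
```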
Step S105, determining the room number of the camera in the current frame of the model to be rendered, and further determining a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component.
Specifically, BIM model rendering belongs to the category of graphics rendering, and is a cyclic process.
In this embodiment, determining the room number where the camera in the current frame of the model to be rendered is located includes:
determining the current space coordinate of a camera in the current frame of the model to be rendered; acquiring the space coordinate of a camera of a previous frame of a current frame of a model to be rendered and determining the space coordinate as the space coordinate of the previous frame; and determining the room number of the camera in the current frame according to the current space coordinate and the space coordinate of the previous frame.
For example, the execution subject may acquire the current spatial coordinates Pc(x, y, z) of the current-frame camera in the three-dimensional scene and judge whether they have changed from the previous frame. If they have changed, Pc(x, y, z) is compared with each room space attribute information pair in the index file F to find the pair <Pmin(i), Pmax(i)> such that Pmin(x, y, z) < Pc(x, y, z) < Pmax(x, y, z); that pair corresponds to a room number R(i), which is the room number in which the camera is located in the current frame. Assuming the camera's spatial coordinates satisfy Pmin(004) < Pc(x, y, z) < Pmax(004), the room number in which the camera is currently located is R(i) = 004. If the coordinates have not changed, the GPU renders components through the rendering pipeline according to the component number list corresponding to the previous frame's room number.
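The per-frame lookup described above can be sketched as a containment test against the index. This is an illustrative simplification: it uses an inclusive comparison at the box boundary, takes the index as an in-memory dictionary in the shape sketched earlier, and the function name is hypothetical:

```python
def locate_camera_room(pc, index, prev_room=None, prev_pc=None):
    """Determine the room number in which the camera is located in the
    current frame: if the camera position Pc is unchanged from the previous
    frame, reuse the previous frame's room number; otherwise find the room
    whose <Pmin, Pmax> pair contains Pc."""
    if prev_pc is not None and pc == prev_pc:
        return prev_room  # camera did not move between frames
    for room, entry in index.items():
        pmin, pmax = entry["pmin"], entry["pmax"]
        if all(pmin[k] <= pc[k] <= pmax[k] for k in range(3)):
            return room
    return None  # camera is outside every indexed room
```

Because adjacent rooms' bounding boxes may overlap near shared walls, a real implementation would need a tie-breaking rule; the application does not specify one.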
Specifically, in some optional implementations of this embodiment, determining, according to the current spatial coordinate and the spatial coordinate of the previous frame, a room number in which the camera in the current frame is located includes:
and in response to determining that the current spatial coordinates are the same as the previous frame spatial coordinates, determining that the room number in which the camera in the current frame is located is the same as the room number in which the camera in the previous frame is located, acquiring the room number in which the camera in the previous frame is located and determining the room number as the room number in which the camera in the current frame is located.
In this implementation manner, determining a component number list corresponding to a room number where the camera is located includes:
and acquiring a component number list corresponding to the room number where the camera in the previous frame is located, and determining the component number list as the component number list corresponding to the room number where the camera in the current frame is located.
In this implementation manner, determining a component number list corresponding to a room number where the camera is located includes:
and calling the index file to determine the corresponding component number list according to the room number in which the camera was located in the previous frame, thereby determining the component number list corresponding to the room number in which the camera is located in the current frame. For example, the execution subject may call the index file to index the component list according to the room number R(i) obtained in the previous frame, obtaining the component list L(i); when R(i) = 004, the index list gives the corresponding component list L(i) = <17, 3, 9, 14, 19, 18, 20, 21>, and L(i) is the rendering component list.
In some optional implementation manners of this embodiment, determining, according to the current spatial coordinate and the spatial coordinate of the previous frame, a room number in which the camera in the current frame is located, further includes:
and in response to determining that the current spatial coordinates differ from the previous-frame spatial coordinates, calling the index file to compare the current spatial coordinates with each room space attribute information pair in the index file, determining which space attribute information pair the current spatial coordinates fall within, and taking the room number corresponding to that pair as the room number in which the camera is located in the current frame.
Specifically, rendering of the corresponding components proceeds as follows. BIM model rendering executes rendering commands cyclically according to these steps: read the rendering component list value, namely the component number list; sequentially read all component data information sets I from the BIM database according to the component numbers in the rendering list — for example, sequentially read all information attributes of the components numbered <17, 3, 9, 14, 19, 18, 20, 21>, including their vertex attributes, material attributes, map attributes, and so on; transmit all the read component attributes into video memory; and have the GPU render the corresponding components through the rendering pipeline.
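The per-frame render loop above can be sketched schematically. `fetch_component` (the BIM-database read) and `submit` (the video-memory upload and draw call) are assumed stand-ins, not APIs named in the application:

```python
def render_frame(room, index, fetch_component, submit):
    """One pass of the cyclic render step: read the component number list
    for the camera's room from the index file, pull each component's
    attribute set (vertex, material, map attributes) from the BIM database,
    and hand it to the GPU pipeline via `submit`."""
    drawn = []
    for number in index[room]["components"]:
        attributes = fetch_component(number)  # read data information set I
        submit(attributes)                    # upload to video memory, draw
        drawn.append(number)
    return drawn
```

Only the components in the current room's list are ever submitted, which is the culling behavior the application claims.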
In the embodiment, the room number of the camera in the current frame of the model to be rendered is accurately determined based on the comparison between the current space coordinate of the camera in the current frame and the space coordinate of the camera in the previous frame, and when the room numbers corresponding to the rooms of the camera in the previous frame and the current frame are the same, component rendering is directly performed, so that the rendering process is simplified, and the BIM rendering efficiency and accuracy are improved. And when the room numbers corresponding to the rooms in which the cameras are located in the previous frame and the current frame are different, the room numbers corresponding to the current space coordinates of the cameras in the current frame are quickly determined by calling the index file, and then the corresponding component number list is determined based on the index file to perform component rendering, so that the rendering efficiency of the BIM model is greatly improved.
Fig. 2 is a schematic main flow diagram of a rendering method according to a second embodiment of the present application, and as shown in fig. 2, the rendering method includes:
step S201, obtaining a model to be rendered, and determining a room number and a component number in the model to be rendered.
Step S202, determining the center point of the room in the model to be rendered corresponding to the room number, and further screening a component number list corresponding to the room number according to the center point and an auxiliary line passing through the center point.
The principle of step S201 to step S202 is similar to that of step S101 to step S102, and is not described here again.
Specifically, step S202 can also be realized by step S2021:
step S2021, for the room corresponding to each room number in the model to be rendered, rays are emitted from the central point to each direction, the component number of the component which is first intersected with each ray is determined, and then a component number list corresponding to the room number is generated.
Specifically, for the room corresponding to each room number in the model to be rendered, the execution body may determine the distance from the edge of the room farthest from its center point to that center point, and then construct a sphere taking the center point of the room as the sphere center and that distance as the sphere radius. Rays are emitted from the center point of each room in all directions; each such ray can be regarded as a radius of the sphere in the corresponding direction. Specifically, the execution body may determine the component number list corresponding to the room number of each room by determining the component number of the component at the point where each sphere radius first intersects the room.
In the embodiment, the central point of each room is taken as the sphere center, the distance from the edge farthest from the central point to the central point in each room is taken as the sphere radius to construct the sphere, and the component number list corresponding to each room is determined by the component number of the component corresponding to the sphere radius in each direction and the first intersected point of each room, so that a large number of geometric grids outside the visual angle range can be eliminated, and the rendering efficiency of the BIM model is greatly improved.
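The ray-casting screening described above can be sketched as follows. This is a minimal illustration, not the patented implementation: component geometry is reduced to axis-aligned bounding boxes (the classic slab test), rays are sampled only in the horizontal plane, and all names (`ray_aabb_hit`, `components_for_room`, the "W-001" component numbers) are hypothetical.

```python
import math

def ray_aabb_hit(origin, direction, box_min, box_max):
    """Slab test: distance along the ray to an axis-aligned box, or None on a miss."""
    t_near, t_far = 0.0, math.inf
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        if abs(d) < 1e-12:               # ray parallel to this slab
            if o < lo or o > hi:
                return None
        else:
            t1, t2 = (lo - o) / d, (hi - o) / d
            t_near = max(t_near, min(t1, t2))
            t_far = min(t_far, max(t1, t2))
    return t_near if t_near <= t_far else None

def components_for_room(center, components, n_rays=64):
    """Cast rays from the room center; keep only the first component each ray hits."""
    hit_ids = set()
    for i in range(n_rays):              # evenly spaced directions in the horizontal plane
        theta = 2 * math.pi * i / n_rays
        d = (math.cos(theta), math.sin(theta), 0.0)
        best = None
        for comp_id, (bmin, bmax) in components.items():
            t = ray_aabb_hit(center, d, bmin, bmax)
            if t is not None and (best is None or t < best[0]):
                best = (t, comp_id)      # nearest intersection wins
        if best:
            hit_ids.add(best[1])
    return sorted(hit_ids)

# Toy room: two walls east and west of the center point; only they are ever hit first.
components = {
    "W-001": ((4.0, -5.0, 0.0), (4.5, 5.0, 3.0)),   # east wall
    "W-002": ((-4.5, -5.0, 0.0), (-4.0, 5.0, 3.0)), # west wall
}
print(components_for_room((0.0, 0.0, 1.0), components))  # → ['W-001', 'W-002']
```

Components occluded behind the first hit are never collected, which is exactly how this step culls geometry that cannot be seen from inside the room.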
In step S203, a room space attribute information pair corresponding to the room number is generated from the component number list.
The principle of step S203 is similar to that of step S103, and is not described here.
Specifically, step S203 may also be implemented by step S2031:
step S2031, calculating the maximum and minimum values of the spatial coordinates of all components in the component number list corresponding to each room number in the model to be rendered, and determining the calculated maximum and minimum values of the spatial coordinates of the components corresponding to each room number as the room space attribute information pair corresponding to that room number. For example, the room space attribute information pair may be <Pmin(001), Pmax(001)>, where 001 is the room number.
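The min/max computation of step S2031 amounts to folding the bounding boxes of all components in a room's list into one axis-aligned <Pmin, Pmax> pair. A minimal sketch, assuming each component box is given as a pair of (min-corner, max-corner) tuples:

```python
def room_aabb(component_boxes):
    """Fold all component bounding boxes into one <Pmin, Pmax> pair for the room."""
    pmin = tuple(min(box[0][i] for box in component_boxes) for i in range(3))
    pmax = tuple(max(box[1][i] for box in component_boxes) for i in range(3))
    return pmin, pmax

boxes = [
    ((0.0, 0.0, 0.0), (4.0, 0.2, 3.0)),   # one wall of room 001
    ((0.0, 0.0, 0.0), (0.2, 5.0, 3.0)),   # another wall of room 001
]
print(room_aabb(boxes))  # → ((0.0, 0.0, 0.0), (4.0, 5.0, 3.0))
```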
In step S204, an index file is generated from each room number, the component number list corresponding to each room number, and the pair of spatial attribute information corresponding to each room number.
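The index file of step S204 maps each room number to its component number list and its space attribute information pair. The method does not fix an on-disk format, so the JSON layout below is purely an assumption for illustration:

```python
import json

def build_index(rooms):
    """rooms: {room_no: (component_number_list, (pmin, pmax))} -> index-file text.

    The JSON layout is assumed here for illustration only; the method itself
    does not prescribe a serialization format.
    """
    return json.dumps(
        {no: {"components": comps,
              "aabb": [list(aabb[0]), list(aabb[1])]}
         for no, (comps, aabb) in rooms.items()},
        indent=2)

index_text = build_index({
    "001": (["W-001", "W-002"], ((0.0, 0.0, 0.0), (4.0, 5.0, 3.0))),
})
loaded = json.loads(index_text)
print(loaded["001"]["components"])  # → ['W-001', 'W-002']
```

Saving this text as an auxiliary file next to the original model file matches the role of the index file described below for the rendering stage.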
Step S205, determining a room number where the camera is located in the current frame of the model to be rendered, and further determining a component number list corresponding to the room number where the camera is located based on the index file, so as to render the corresponding component.
The principle of step S204 to step S205 is similar to that of step S104 to step S105, and is not described here again.
Fig. 3 is a schematic view of an application scenario of a rendering method according to a third embodiment of the present application. The rendering method can be applied to indoor browsing and rapid rendering of scenes of the building information model. As shown in fig. 3, the rendering method in this embodiment involves three steps of BIM model preprocessing, generating an index file, and BIM model rendering.
Specifically, BIM model preprocessing: after the components of the Building Information Model (BIM) are completed, the rooms inside the model are not yet fully numbered, so the execution body must first assign a unique identification number to every room, and then assign a unique identification number to every component in the BIM model (such as walls, slabs, beams, columns, doors, and windows), thereby determining all room numbers and component numbers in the BIM model. The execution body may then extract data from the BIM model, build a BIM model database, and store the BIM model in that database; all room numbers and component numbers in the BIM model are likewise stored in the database of the corresponding BIM model.
Generating an index file, that is, generating the BIM model room component index file, specifically includes: screening the components of the room corresponding to each room number and generating the component number list of that room, i.e., obtaining the component numbers of all components that make up each room in the BIM model; generating the space attribute information pair of the corresponding room from the 3D coordinate information of all those components; and storing the component number list and the space attribute information pair of each room in an independent file (named the BIM model room component information index file, hereinafter the index file), which is saved as an auxiliary file of the original BIM model file.
BIM model rendering: the BIM rendering system reads the BIM model configuration information and the index file into memory, and then transfers the mesh data of the BIM components from memory into video memory for the Graphics Processing Unit (GPU) to call during rendering. When the GPU renders each frame of the BIM model, it reads the current spatial coordinates of the camera and compares them with the spatial coordinates of the camera in the previous frame to determine whether they have changed. When they have not changed, the corresponding components are rendered directly according to the component number list determined in the previous frame. When they have changed, the room number where the camera is located in the current frame is recalculated, the component number list to be rendered by the GPU in the current frame is obtained from the index file and that room number, the attribute values of the components in the list are fetched from the database, and the corresponding components are rendered. In addition, after the component rendering of the current frame is finished, other rendering commands can be executed, the current frame is updated, and the current spatial coordinates of the camera in the updated frame are determined, until all components corresponding to all rooms in the BIM model have been rendered. In this way, the mesh data of all irrelevant components outside the room of the current frame are culled, the number of components that need to be rendered is greatly reduced, a large amount of GPU processing resources are saved, and the rendering efficiency of the BIM model is improved.
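The per-frame logic just described, reusing the previous frame's list when the camera has not moved and otherwise locating the room via its space attribute information pair, can be sketched as follows. All names (`point_in_aabb`, `RoomCuller`) and the in-memory index layout are hypothetical; the point-in-box containment test stands in for comparing the camera coordinates against each room space attribute information pair.

```python
def point_in_aabb(p, pmin, pmax):
    """True if point p lies inside the axis-aligned box <pmin, pmax>."""
    return all(lo <= x <= hi for x, lo, hi in zip(p, pmin, pmax))

class RoomCuller:
    """Per-frame lookup: reuse the previous frame's list when the camera is
    unchanged; otherwise find the camera's room in the index and return its list."""
    def __init__(self, index):
        self.index = index        # {room_no: {"components": [...], "aabb": (pmin, pmax)}}
        self.prev_pos = None
        self.prev_room = None

    def visible_components(self, cam_pos):
        if cam_pos == self.prev_pos and self.prev_room is not None:
            return self.index[self.prev_room]["components"]   # camera unchanged: reuse
        self.prev_pos = cam_pos
        for room_no, entry in self.index.items():             # camera moved: re-locate
            if point_in_aabb(cam_pos, *entry["aabb"]):
                self.prev_room = room_no
                return entry["components"]
        self.prev_room = None
        return []                                             # camera outside every room

culler = RoomCuller({
    "001": {"components": ["W-001", "W-002"], "aabb": ((0, 0, 0), (4, 5, 3))},
    "002": {"components": ["W-003"], "aabb": ((4, 0, 0), (8, 5, 3))},
})
print(culler.visible_components((1.0, 1.0, 1.5)))  # → ['W-001', 'W-002']
print(culler.visible_components((5.0, 1.0, 1.5)))  # → ['W-003']
```

Everything outside the camera's current room never reaches the GPU, which is the source of the rendering savings claimed above.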
The method and the device avoid the problem of resource consumption of the BIM real-time rendering, and through preprocessing of the BIM space grid model in the rendering scene, based on the position of a camera in the 3D scene, specific space scene grid configuration information is generated, a large number of geometric grids outside the visual angle range are removed, and the rendering efficiency of the BIM model is greatly improved.
The embodiments of the present application start from the data source of the BIM model, solve the data problem of large-volume models, and greatly improve rendering efficiency, so that the application platform of the BIM model can be migrated from high-performance personal computers (PCs) to lower-performance smartphones and tablet computers.
Fig. 4 is a schematic diagram of main blocks of a rendering apparatus according to an embodiment of the present application. As shown in fig. 4, the rendering apparatus includes an acquisition unit 401, a filtering unit 402, a room space attribute information pair generation unit 403, an index file generation unit 404, and a rendering unit 405.
An obtaining unit 401 configured to obtain a model to be rendered, and determine a room number and a component number in the model to be rendered.
And the screening unit 402 is configured to determine a center point of a room in the model to be rendered corresponding to the room number, and further screen a component number list corresponding to the room number according to the center point and an auxiliary line passing through the center point.
The room space attribute information pair generation unit 403 is configured to generate a room space attribute information pair corresponding to a room number from the component number list.
The index file generation unit 404 is configured to generate an index file from each room number, a component number list corresponding to each room number, and a pair of spatial attribute information corresponding to each room number.
The rendering unit 405 is configured to determine a room number where the camera is located in the current frame of the model to be rendered, and further determine a component number list corresponding to the room number where the camera is located based on the index file, so as to perform rendering of the corresponding component.
In some embodiments, the screening unit 402 is further configured to: and for the room corresponding to each room number in the model to be rendered, emitting rays from the center point to each direction, determining the component number of the component which is first intersected with each ray, and further generating a component number list corresponding to the room number.
In some embodiments, the room space attribute information pair generation unit 403 is further configured to: and calculating the maximum value and the minimum value of the spatial coordinates of all the components in the component number list corresponding to each room number in the model to be rendered, and determining the calculated maximum value and the calculated minimum value of the spatial coordinates of the components corresponding to each room number as a room spatial attribute information pair corresponding to the room number.
In some embodiments, the rendering unit 405 is further configured to: determining the current space coordinate of a camera in the current frame of the model to be rendered; acquiring the space coordinate of a camera of a previous frame of a current frame of a model to be rendered and determining the space coordinate as the space coordinate of the previous frame; and determining the room number of the camera in the current frame according to the current space coordinate and the space coordinate of the previous frame.
In some embodiments, the rendering unit 405 is further configured to: and in response to determining that the current spatial coordinates are the same as the previous frame spatial coordinates, determining that the room number in which the camera in the current frame is located is the same as the room number in which the camera in the previous frame is located, acquiring the room number in which the camera in the previous frame is located and determining the room number as the room number in which the camera in the current frame is located.
In some embodiments, the rendering unit 405 is further configured to: and acquiring a component number list corresponding to the room number where the camera in the previous frame is located, and determining the component number list as the component number list corresponding to the room number where the camera in the current frame is located.
In some embodiments, the rendering unit 405 is further configured to: and calling the index file to determine a component number list corresponding to the room number of the camera in the current frame according to the room number of the camera in the previous frame.
In some embodiments, the rendering unit 405 is further configured to: and in response to the fact that the current space coordinate is different from the space coordinate of the previous frame, calling the index file to compare the current space coordinate with each room space attribute information pair in the index file to determine a corresponding space attribute information pair, and further determine a corresponding room number.
It should be noted that the rendering method and the rendering apparatus of the present application correspond to each other in their specific implementation details, so the repeated content is not described again.
Fig. 5 shows an exemplary system architecture 500 to which the rendering method or the rendering apparatus of the embodiments of the present application may be applied.
As shown in fig. 5, the system architecture 500 may include terminal devices 501, 502, 503, a network 504, and a server 505. The network 504 serves to provide a medium for communication links between the terminal devices 501, 502, 503 and the server 505. Network 504 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The user may use the terminal devices 501, 502, 503 to interact with a server 505 over a network 504 to receive or send messages or the like. The terminal devices 501, 502, 503 may have installed thereon various communication client applications, such as shopping-like applications, web browser applications, search-like applications, instant messaging tools, mailbox clients, social platform software, etc. (by way of example only).
The terminal devices 501, 502, 503 may be various electronic devices that have a display screen capable of rendering and that support web browsing, including but not limited to smartphones, tablet computers, laptop portable computers, desktop computers, and the like.
The server 505 may be a server providing various services, such as a background management server (for example only) providing support for rendering requests submitted by users using the terminal devices 501, 502, 503. The background management server can obtain the model to be rendered and determine the room number and the component number in the model to be rendered; determining a central point of a room in the model to be rendered corresponding to the room number, and screening a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point; generating room space attribute information pairs corresponding to the room numbers according to the component number list; generating an index file according to each room number, the member number list corresponding to each room number and the space attribute information pair corresponding to each room number; and determining the room number of the camera in the current frame of the model to be rendered, and further determining a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component. Therefore, geometric grids outside the visual angle range are removed in a large amount, and the rendering efficiency of the BIM model is greatly improved.
It should be noted that the rendering method provided by the embodiment of the present application is generally executed by the server 505, and accordingly, the rendering apparatus is generally disposed in the server 505.
It should be understood that the number of terminal devices, networks, and servers in fig. 5 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
Referring now to FIG. 6, shown is a block diagram of a computer system 600 suitable for use in implementing a terminal device of an embodiment of the present application. The terminal device shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present application.
As shown in fig. 6, the computer system 600 includes a Central Processing Unit (CPU) 601, which can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage section 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the computer system 600 are also stored. The CPU 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to the bus 604.
The following components are connected to the I/O interface 605: an input section 606 including a keyboard, a mouse, and the like; an output section 607 including a display such as a Cathode Ray Tube (CRT) or a Liquid Crystal Display (LCD), a speaker, and the like; a storage section 608 including a hard disk and the like; and a communication section 609 including a network interface card such as a LAN card, a modem, or the like. The communication section 609 performs communication processing via a network such as the Internet. A drive 610 is also connected to the I/O interface 605 as needed. A removable medium 611, such as a magnetic disk, an optical disk, a magneto-optical disk, or a semiconductor memory, is mounted on the drive 610 as necessary, so that a computer program read out therefrom is installed in the storage section 608 as needed.
In particular, according to embodiments disclosed herein, the processes described above with reference to the flow diagrams may be implemented as computer software programs. For example, embodiments disclosed herein include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated by the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication section 609, and/or installed from the removable medium 611. The above-described functions defined in the system of the present application are executed when the computer program is executed by the Central Processing Unit (CPU) 601.
It should be noted that the computer readable medium shown in the present application may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the present application, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In this application, however, a computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams or flowchart illustration, and combinations of blocks in the block diagrams or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present application may be implemented by software or hardware. The described units may also be provided in a processor, which may be described as: a processor including an acquisition unit, a screening unit, a room space attribute information pair generation unit, an index file generation unit, and a rendering unit. The names of these units do not, in some cases, constitute a limitation on the units themselves.
As another aspect, the present application also provides a computer-readable medium, which may be contained in the apparatus described in the above embodiments, or may exist separately without being assembled into the apparatus. The computer-readable medium carries one or more programs which, when executed by the apparatus, cause the apparatus to: obtain a model to be rendered, and determine the room numbers and component numbers in the model to be rendered; determine the center point of the room corresponding to each room number in the model to be rendered, and screen the component number list corresponding to the room number according to the center point and auxiliary lines passing through the center point; generate the room space attribute information pair corresponding to the room number from the component number list; generate an index file from each room number, the component number list corresponding to each room number, and the space attribute information pair corresponding to each room number; and determine the room number where the camera is located in the current frame of the model to be rendered, then determine the component number list corresponding to that room number based on the index file, so as to render the corresponding components.
According to the technical scheme of the embodiment of the application, a large number of geometric grids outside the visual angle range can be eliminated, and the rendering efficiency of the BIM model is greatly improved.
The above-described embodiments should not be construed as limiting the scope of the present application. Those skilled in the art will appreciate that various modifications, combinations, sub-combinations, and substitutions can occur, depending on design requirements and other factors. Any modification, equivalent replacement, and improvement made within the spirit and principle of the present application shall be included in the protection scope of the present application.

Claims (12)

1. A rendering method, comprising:
obtaining a model to be rendered, and determining a room number and a component number in the model to be rendered;
determining a central point of a room in the model to be rendered corresponding to the room number, and further screening a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point;
generating room space attribute information pairs corresponding to the room numbers according to the component number list;
generating an index file according to each room number, a component number list corresponding to each room number and a space attribute information pair corresponding to each room number;
and determining the room number of the camera in the current frame of the model to be rendered, and further determining a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component.
2. The method of claim 1, wherein the screening the list of component numbers corresponding to the room numbers according to the center point and an auxiliary line passing through the center point comprises:
and for the room corresponding to each room number in the model to be rendered, emitting rays from the central point to all directions, determining the component number of the component which is first intersected with each ray, and further generating a component number list corresponding to the room number.
3. The method of claim 1, wherein generating the pair of room space attribute information corresponding to the room number comprises:
and calculating the maximum value and the minimum value of the spatial coordinates of all the components in the component number list corresponding to each room number in the model to be rendered, and determining the calculated maximum value and the calculated minimum value of the spatial coordinates of the components corresponding to each room number as a room spatial attribute information pair corresponding to the room number.
4. The method of claim 1, wherein determining the room number in which the camera in the current frame of the model to be rendered is located comprises:
determining the current space coordinate of a camera in the current frame of the model to be rendered;
acquiring the space coordinate of a camera of a previous frame of a current frame of a model to be rendered and determining the space coordinate as the space coordinate of the previous frame;
and determining the room number of the camera in the current frame according to the current space coordinate and the space coordinate of the previous frame.
5. The method according to claim 4, wherein determining the room number where the camera in the current frame is located according to the current spatial coordinate and the spatial coordinate of the previous frame comprises:
and in response to determining that the current spatial coordinates are the same as the previous frame spatial coordinates, determining that the room number in which the camera in the current frame is located is the same as the room number in which the camera in the previous frame is located, acquiring the room number in which the camera in the previous frame is located and determining the room number as the room number in which the camera in the current frame is located.
6. The method of claim 5, wherein determining the component number list corresponding to the room number in which the camera is located comprises:
and acquiring a component number list corresponding to the room number where the camera in the previous frame is located, and determining the component number list as the component number list corresponding to the room number where the camera in the current frame is located.
7. The method of claim 5, wherein determining the component number list corresponding to the room number in which the camera is located comprises:
and calling the index file to determine a component number list corresponding to the room number of the camera in the current frame according to the room number of the camera in the previous frame.
8. The method according to claim 4, wherein determining the room number where the camera in the current frame is located according to the current spatial coordinate and the spatial coordinate of the previous frame comprises:
and in response to the fact that the current space coordinate is different from the space coordinate of the previous frame, calling the index file to compare the current space coordinate with each room space attribute information pair in the index file to determine a corresponding space attribute information pair, and further determine a corresponding room number.
9. A rendering apparatus, characterized by comprising:
the system comprises an acquisition unit, a display unit and a control unit, wherein the acquisition unit is configured to acquire a model to be rendered and determine a room number and a component number in the model to be rendered;
the screening unit is configured to determine a central point of a room in the model to be rendered corresponding to the room number, and then screen a component number list corresponding to the room number according to the central point and an auxiliary line passing through the central point;
a room space attribute information pair generation unit configured to generate a room space attribute information pair corresponding to the room number from the component number list;
an index file generation unit configured to generate an index file from each room number, a component number list corresponding to each room number, and a pair of spatial attribute information corresponding to each room number;
and the rendering unit is configured to determine the room number of the camera in the current frame of the model to be rendered, and further determine a component number list corresponding to the room number of the camera based on the index file so as to render the corresponding component.
10. The apparatus of claim 9, wherein the screening unit is further configured to:
and for the room corresponding to each room number in the model to be rendered, emitting rays from the central point to all directions, determining the component number of the component which is first intersected with each ray, and further generating a component number list corresponding to the room number.
11. A rendering electronic device, comprising:
one or more processors;
a storage device for storing one or more programs,
when executed by the one or more processors, cause the one or more processors to implement the method of any one of claims 1-8.
12. A computer-readable medium, on which a computer program is stored, which, when being executed by a processor, carries out the method according to any one of claims 1-8.
CN202110621472.9A 2021-06-03 2021-06-03 Rendering method and device Pending CN113313809A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110621472.9A CN113313809A (en) 2021-06-03 2021-06-03 Rendering method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110621472.9A CN113313809A (en) 2021-06-03 2021-06-03 Rendering method and device

Publications (1)

Publication Number Publication Date
CN113313809A true CN113313809A (en) 2021-08-27

Family

ID=77377378

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110621472.9A Pending CN113313809A (en) 2021-06-03 2021-06-03 Rendering method and device

Country Status (1)

Country Link
CN (1) CN113313809A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114241161A (en) * 2021-12-22 2022-03-25 中设数字技术股份有限公司 BIM (building information modeling) model rendering method and system based on double GPUs (graphic processing units)
CN114241161B (en) * 2021-12-22 2023-09-15 中设数字技术股份有限公司 BIM model rendering method and system based on double GPUs

CN113778905B (en) UI design acceptance method, device, equipment and storage medium
Wang et al. Research on 3D Terminal Rendering Technology Based on Power Equipment Business Features
Li et al. A new cloud-edge-terminal resources collaborative scheduling framework for multi-level visualization tasks of large-scale spatio-temporal data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination