CN113538706B - Digital sand table-based house scene display method, device, equipment and storage medium


Info

Publication number
CN113538706B
Authority
CN
China
Prior art keywords
model
house type
house
building
point location
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202111084046.2A
Other languages
Chinese (zh)
Other versions
CN113538706A
Inventor
Inventor not disclosed
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Xumi Yuntu Space Technology Co Ltd
Original Assignee
Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Xumi Yuntu Space Technology Co Ltd filed Critical Shenzhen Xumi Yuntu Space Technology Co Ltd
Priority to CN202111084046.2A
Publication of CN113538706A
Application granted
Publication of CN113538706B
Legal status: Active
Anticipated expiration

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/003Navigation within 3D models or images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Systems or methods specially adapted for specific business sectors, e.g. utilities or tourism
    • G06Q50/08Construction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05Geographic models
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B25/00Models for purposes not provided for in G09B23/00, e.g. full-sized devices for demonstration purposes
    • G09B25/04Models for purposes not provided for in G09B23/00, e.g. full-sized devices for demonstration purposes of buildings

Abstract

The disclosure provides a house scene display method, device, equipment and storage medium based on a digital sand table. The method comprises the following steps: processing the model file of a first building information model to obtain a second building information model, and converting the second building information model into a model intermediate file in a predetermined format; traversing the house type data of each house type in the second building information model, generating a camera point location for each house type based on window attribute information, and combining the house type data and the camera point locations into an element information object; loading the model intermediate file with the engine of a digital sand table to render a building model in the digital sand table, and parsing the element information object to obtain a house type block for each house type; and establishing an interaction relationship between the camera point location and the house type block, so that when a selection operation on a house type block is triggered, the view angle of the digital sand table is adjusted to the view angle of the corresponding camera point location. The method and the device can automatically generate camera point locations, improve the efficiency of house scene configuration, and reduce the difficulty and error rate of the configuration.

Description

Digital sand table-based house scene display method, device, equipment and storage medium
Technical Field
The present disclosure relates to the field of digital sand tables, and in particular, to a house view display method, apparatus, device, and storage medium based on a digital sand table.
Background
A digital sand table uses a virtual three-dimensional scene to present building-related information more intuitively and comprehensively, and can support various three-dimensional visual effects and sand table functions. Models can be presented in a digital sand table by means of digital imagery, for example a building model shown to the audience as a virtualized three-dimensional model. Simulating and displaying the landscape of each house type of a building from a specific view angle is one of the main functions of a digital sand table; showing the exterior scenery corresponding to each house type through the digital sand table helps house buyers make purchase decisions and improves the user experience and display effect of the sand table simulation.
In the prior art, the configuration is mainly built manually in batches: an operator configures the digital sand table by hand according to the house type layout, window positions, orientations and so on of a building, adding parameters and labels such as camera positions, house types and house numbers to each house type one by one. However, manually adding camera positions in the digital sand table and generating landscape views in this way reduces the efficiency of configuring house scenes, and as the number of house models in the sand table grows, the difficulty and error rate of the configuration increase accordingly.
Disclosure of Invention
In view of this, embodiments of the present disclosure provide a house view display method, apparatus, device and storage medium based on a digital sand table, so as to solve the prior-art problems of low house view configuration efficiency and high configuration difficulty and error rate.
In a first aspect of the embodiments of the present disclosure, a house view display method based on a digital sand table is provided, including: acquiring a first building information model, processing a model file corresponding to the first building information model to obtain a second building information model, and converting the second building information model into a model intermediate file with a preset format; sequentially traversing each house type in the second building information model to obtain house type data corresponding to each house type, generating a camera point location corresponding to each house type based on the window attribute information, and combining the house type data and the camera point location into an element information object; loading the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and performing analysis operation on the element information object to obtain a room type block corresponding to each room type; and establishing an interactive relation between the camera point location of each house type and the house type block, and adjusting the visual angle of the digital sand table to the visual angle of the camera point location when the selection operation of the house type block in the building model for the digital sand table is triggered so as to display the house scene corresponding to the visual angle of the camera point location.
In a second aspect of the embodiments of the present disclosure, there is provided a house view display device based on a digital sand table, including: the acquisition module is configured to acquire a first building information model, process a model file corresponding to the first building information model to obtain a second building information model, and convert the second building information model into a model intermediate file in a preset format; the generating module is configured to sequentially traverse each house type in the second building information model to obtain house type data corresponding to each house type, generate camera point positions corresponding to the house types based on the window attribute information, and combine the house type data and the camera point positions into element information objects; the loading module is configured to load the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and perform analysis operation on the element information object to obtain a house type block corresponding to each house type; and the interaction module is configured to establish an interaction relation between the camera point location of each house type and the house type block, and when the selection operation of the house type block in the building model for the digital sand table is triggered, the visual angle of the digital sand table is adjusted to be the visual angle of the camera point location so as to display the house view corresponding to the visual angle of the camera point location.
In a third aspect of the embodiments of the present disclosure, an electronic device is provided, which includes a memory, a processor and a computer program stored in the memory and executable on the processor, and the processor implements the steps of the method when executing the program.
In a fourth aspect of the embodiments of the present disclosure, a computer-readable storage medium is provided, which stores a computer program, which when executed by a processor, implements the steps of the above-mentioned method.
The embodiment of the present disclosure adopts at least one technical scheme that can achieve the following beneficial effects:
processing a model file corresponding to the first building information model to obtain a second building information model by acquiring the first building information model, and converting the second building information model into a model intermediate file with a preset format; sequentially traversing each house type in the second building information model to obtain house type data corresponding to each house type, generating a camera point location corresponding to each house type based on the window attribute information, and combining the house type data and the camera point location into an element information object; loading the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and performing analysis operation on the element information object to obtain a room type block corresponding to each room type; and establishing an interactive relation between the camera point location of each house type and the house type block, and adjusting the visual angle of the digital sand table to the visual angle of the camera point location when the selection operation of the house type block in the building model for the digital sand table is triggered so as to display the house scene corresponding to the visual angle of the camera point location. The method and the device can automatically generate corresponding camera point positions for each house type, reduce the problems of low configuration efficiency and increased error rate caused by manual configuration, reduce the difficulty of house scene configuration and improve the house scene display effect.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the drawings needed for the embodiments or the prior art descriptions will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present disclosure, and other drawings can be obtained by those skilled in the art without inventive efforts.
FIG. 1 is a schematic view of an overall processing flow of a technical solution according to an embodiment of the present disclosure;
fig. 2 is a schematic flow chart of a house view display method based on a digital sand table according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram of determining camera point locations in a Revit model provided by an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a digital sand table-based house view display device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present disclosure.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the disclosed embodiments. However, it will be apparent to one skilled in the art that the present disclosure may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present disclosure with unnecessary detail.
The digital sand table is characterized in that a building model is displayed in front of a user in a three-dimensional visual mode through a three-dimensional model technology and a digital image technology, and compared with an entity sand table, the digital sand table is used for displaying relevant information of the building more intuitively and comprehensively through a virtual three-dimensional scene, and various three-dimensional visual special effects and sand table function applications can be supported. The digital sand table is also called a three-dimensional digital sand table, the three-dimensional digital sand table can display the model by displaying a digital image on a screen, and the model can be displayed on the screen by utilizing a three-dimensional GIS technology, a three-dimensional modeling technology and an internet technology.
In the application of the digital sand table technology, the house scene display is an important component function in the digital sand table, and the function supports users to select buildings and house types in the sand table model more intuitively through some simple interactions and screening, and simulates the landscape visual angles corresponding to the buildings, the floor heights and the house types, so that the use experience of house buyers is improved.
In the related art, house scene configuration in a digital sand table is mainly generated by manual batch building, for example: a configurator manually sets the camera position of each house type in the digital sand table according to the house type layout, window positions, orientations and so on of the building, i.e., camera positions are added to each house type one by one by hand, together with parameters and labels such as house type and house number. In a real scene, however, a single project may contain dozens of buildings and thousands of house types, so with manual operation the process of configuring house scenes becomes more and more cumbersome as the number of house types grows, the time and labor cost rise, the probability of human error increases, and the efficiency of configuring house scenes in the digital sand table drops greatly.
In view of the above problems in the prior art, it is desirable to provide a room view display scheme that can automatically generate a corresponding camera point location for each room type in a digital sand table, improve the efficiency of room view configuration, reduce the difficulty of room view configuration, and improve the effect of room view display. The overall implementation process of the technical solution of the embodiment of the present disclosure is described below with reference to the accompanying drawings, and fig. 1 is a schematic overall processing flow diagram of the technical solution corresponding to the embodiment of the present disclosure. As shown in fig. 1, the overall process of the technical solution may specifically include:
On the one hand, a complete building information model (such as a Revit model) of a building or group of buildings is obtained; on the other hand, the building-specialty model is screened out of the building information model, and the building information model containing the building specialty is exported as a model intermediate file in the datasmith format; in addition, the screened building information model is parsed to extract the house type information of each house type, the windows of each house type are selected from the model, the normal vector of each window is calculated, camera point locations are generated based on the window normal vectors, and the house type information and the camera point locations are exported together as an element information object in Json format.
Further, after the model intermediate file in the datasmith format and the element information object in the Json format are exported, the two parts of information are loaded by the UE4 engine on which the digital sand table depends: after the model intermediate file is loaded, it is rendered directly as a three-dimensional model in the digital sand table; after the element information object is loaded, it is parsed to generate a house type block for each house type, a camera object is added inside the house type block at the coordinates of the camera point location recorded in the element information object, and an interaction mode between the camera object and the house type block is established, for example: clicking a house type block moves the view angle of the sand table model to the view angle of the corresponding camera object. This completes the configuration of the whole house scene; when an interaction request for a certain house type in the digital sand table is received, the view angle of the digital sand table is adjusted to the view angle of the camera object corresponding to that house type, achieving the effect of one exterior view per house type, i.e., a one-house-one-scene display.
Fig. 2 is a schematic flow chart of a house view display method based on a digital sand table according to an embodiment of the present disclosure. The digital sand table-based house view display method in fig. 2 may be executed by a server, and as shown in fig. 2, the digital sand table-based house view display method may specifically include:
s201, acquiring a first building information model, processing a model file corresponding to the first building information model to obtain a second building information model, and converting the second building information model into a model intermediate file with a preset format;
s202, sequentially traversing each house type in the second building information model to obtain house type data corresponding to each house type, generating a camera point location corresponding to each house type based on window attribute information, and combining the house type data and the camera point location into an element information object;
s203, loading the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and performing analysis operation on the element information object to obtain a house type block corresponding to each house type;
and S204, establishing an interactive relation between the camera point location of each house type and the house type block, and adjusting the visual angle of the digital sand table to the visual angle of the camera point location when the selection operation of the house type block in the building model for the digital sand table is triggered so as to display the house scene corresponding to the visual angle of the camera point location.
Specifically, the digital sand table in the embodiment of the present disclosure may be considered as a three-dimensional digital sand table constructed by a three-dimensional building model and an external scene in a certain range around the building, where the number of the three-dimensional building model is not limited, and the three-dimensional building model may be a single building or a building composed of multiple buildings; the exterior scene of the building is a virtual exterior scene environment which is constructed in advance according to the actual exterior scene within several kilometers of the periphery of the building, and the exterior scene of the building comprises but is not limited to the following scenes: trees, other buildings, lakes, rivers, etc.
Further, both the first building information model and the second building information model may be models produced by BIM forward design. BIM (Building Information Modeling) refers to the process of creating and managing building information throughout the whole life cycle of a construction project, covering the planning, design, construction, operation and maintenance stages; the three-dimensional, real-time, dynamic model used in this process covers geometric information, spatial information, geographic information, the properties of the various building components, and work and material information. Revit is a mainstream BIM visualization and modeling tool, so the building information model in the embodiment of the present disclosure may be a Revit model.
Further, the model intermediate file in the predetermined format refers to a model file that can be recognized and loaded by the UE4 engine; in practical applications, it may be a model file with the datasmith suffix. The UE4 engine is a game engine, and the underlying engine of the digital sand table in the embodiment of the present disclosure is the UE4 engine, although other 3D engines could equally serve as the underlying engine. The house scene in the embodiment of the present disclosure is shown in the digital sand table of the UE4 engine, and the interactive functions of the house scene are also implemented there.
According to the technical scheme provided by the embodiment of the disclosure, the embodiment of the disclosure comprises two parts, namely, a first part, converting a Revit model into a model file in a datasmith format, analyzing the Revit model to obtain attribute data (namely house type data) of each house type, calculating a camera point location corresponding to each house type, and generating an element information object in a Json format according to the attribute data of the house type and the camera point location; and a second part, loading a model file in a datasmith format on a digital sand table based on a UE4 engine, analyzing element information objects in a Json format, generating a house type block of each house type, and establishing an interactive relation between a camera point position and the house type block, so that the house type blocks have a certain interactive function, and when the house type blocks are clicked, the view angle of the whole digital sand table can be moved to the view angle of the camera point position corresponding to the house type blocks. The method and the device can automatically generate corresponding camera point positions for each house type, reduce the problems of low configuration efficiency and increased error rate caused by manual configuration, reduce the difficulty of house scene configuration and improve the house scene display effect.
In some embodiments, obtaining a first building information model, and processing a model file corresponding to the first building information model to obtain a second building information model includes: acquiring a model file and a file name corresponding to the model file, sequentially matching the file names of the model files according to a key character string corresponding to a preset target model file, and loading the successfully matched model file to obtain a second building information model; the target model file is a model file related to the building structure, and the first building information model and the second building information model are both Revit models.
Specifically, the first building information model can be regarded as an initial model containing multiple specialties. In practical applications, a complete Revit model may include several specialty models, such as a building model, an electromechanical model and a structural model, while generally only the building-specialty model is needed in the digital sand table; therefore, the models in the initial building information model (i.e., the first building information model) are screened first, and only the building-specialty model is loaded.
It should be noted that models of different specialties in the building information model can be regarded as models under different layers, that is, one speciality corresponds to one layer file, and for a complete Revit model, all speciality models are displayed. The following describes a process of loading a second building information model with reference to a specific embodiment, which may specifically include the following:
Generally, after building information models are produced by BIM forward design, different types of model structures have independent, regular naming conventions; for example, the file names of the architecture-specialty model and the electromechanical-specialty model differ, so the different types of models can be distinguished by the names of their model files, and it can then be determined which model files are the target files that finally need to be loaded. For example, if the naming rule of the architecture-specialty model is "XX-project name-a-f01.rvt", then when the Revit model files are loaded, all file names can be traversed through the native API interface of Revit and matched against a key character string generated from the naming rule of the architecture model; the key string "-a-" can be generated from this rule to traverse and match the model files, so that the model files conforming to the naming rule are screened out, the matched model files are loaded, and the unmatched ones are not loaded, as sketched below.
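The following Python snippet is only an illustration of the key-string screening idea described above; the file names, the "-a-" key and the helper function are assumptions for the example and do not represent the Revit API, which in practice would be used to enumerate and load the files.

```python
def filter_architecture_files(file_names, key="-a-"):
    """Return only the model files whose names contain the key string.

    The key string (here "-a-") is assumed to come from the project's naming
    rule for architecture-specialty models; other specialties (structure,
    electromechanical, ...) use different markers and are skipped.
    """
    return [name for name in file_names
            if key in name and name.lower().endswith(".rvt")]


# Illustrative file names following a naming rule like "XX-project-a-f01.rvt"
files = [
    "XX-Lakeside-a-f01.rvt",   # architecture model     -> loaded
    "XX-Lakeside-s-f01.rvt",   # structural model       -> skipped
    "XX-Lakeside-m-f01.rvt",   # electromechanical model -> skipped
]
print(filter_architecture_files(files))  # ['XX-Lakeside-a-f01.rvt']
```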
In some embodiments, converting the second building information model into a model intermediate file of a predetermined format includes: and converting the Revit model corresponding to the second building information model into a model intermediate file in a format with datasmith as a suffix.
Specifically, the second building information model can be considered to contain only the building model. The datasmith format is an intermediate format that the Revit model supports loading into the UE4 engine; a dedicated plug-in and API export interface are provided in the UE4 engine, and exporting a Revit model file as a model intermediate file in the datasmith format is a built-in capability of Revit, so the specific format-conversion process is not described in the embodiment of the present disclosure.
In some embodiments, sequentially traversing each house type in the second building information model to obtain house type data corresponding to each house type includes: determining all floors in the second building information model, and traversing the house type in each floor in sequence along the floors by taking the floors as units so as to extract a floor identifier, a house type identifier, an outer contour line and an elevation corresponding to each house type; the method comprises the steps of calculating the house type area corresponding to each house type according to the area parameter corresponding to each room in each house type, generating structured data based on floor identification, house type identification, outer contour lines, elevation and the house type area, and taking the structured data as house type data.
Specifically, in the Revit modeling process, a designer can first build the model of a standard floor of a building, or build half of the standard floor and then mirror it about the center line to obtain the complete standard floor; the middle floors of the building are then completed by mirroring or copying, and finally the bottom and top floors of each building are filled in. When modeling the standard floor, the designer completes the wall facades of all rooms on the same floor and marks the labels, area outlines and types of the rooms. Therefore, when the second building information model is traversed to generate house type data, the standard floor can be used as the basic traversal unit to traverse the attribute data of all rooms. The traversal of house type data is described in detail below with a specific embodiment, and may specifically include the following contents:
the data files corresponding to each floor in the second Reivt model file (namely, the second building information model) are obtained, the data files corresponding to each room in each floor data file are sequentially traversed, each room is classified according to the house type, an array structure of one house type dimension can be obtained through traversing data, and the array structure of each house type dimension comprises an array structure of one or more room dimensions. The array structure of each house type dimension at least comprises information such as a floor mark, a house type mark, an outer contour line, an elevation and the like, area parameters corresponding to rooms forming each house type are added, and then the whole area of each house type is obtained.
Further, according to the calculated whole area of each house type, combined with the floor identifier, house type identifier, outer contour line and elevation obtained by traversal, a data structure describing the house type is generated from these parameters, and this structured data is used as the house type data, as sketched below. It should be noted that the standard floor can be regarded as the middle floors that remain after removing the top and bottom floors of the whole building; that is, in practical applications, the floor data files in the Revit model file that represent middle floors can be regarded as standard-floor files, and in the embodiment of the present disclosure the standard floor may be any one of the middle floors.
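As a purely illustrative sketch of the traversal described above, the per-house-type structured data could be assembled as follows; the input layout and field names are assumptions for the example, not the structure of a real Revit export.

```python
from collections import defaultdict


def build_house_type_data(rooms):
    """Group rooms by (floor, house type) and sum their areas.

    `rooms` is assumed to be a list of dicts already extracted from the
    building model, for example:
      {"floor": "F06", "house_type": "A1", "room": "bedroom",
       "area": 14.2, "outline": [(0.0, 0.0), ...], "elevation": 17.4}
    """
    grouped = defaultdict(list)
    for room in rooms:
        grouped[(room["floor"], room["house_type"])].append(room)

    house_types = []
    for (floor_id, type_id), members in grouped.items():
        house_types.append({
            "floor_id": floor_id,
            "house_type_id": type_id,
            # in the real model the outer contour and elevation come from the
            # house-type record itself; here the first room's values stand in
            "outer_contour": members[0]["outline"],
            "elevation": members[0]["elevation"],
            "area": round(sum(m["area"] for m in members), 2),
        })
    return house_types
```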
In some embodiments, generating a camera site location corresponding to the house type based on the window attribute information comprises: acquiring window attribute information corresponding to each house type, determining a wall facade corresponding to a window according to the window attribute information, calculating a plane vector of a center line corresponding to the wall facade, and determining a normal vector of the window according to the plane vector; and respectively moving the window along the normal vector of the window to positive and negative directions for a preset distance to obtain two point location coordinates, judging whether the point location is in the outer contour line according to the point location coordinates, and taking the point location coordinates belonging to the outer contour line as the camera point location.
Specifically, before obtaining the window attribute information of each house type from the second Revit model file, it must first be determined which windows of the house type will have their attribute information processed to generate a camera point location. Since one house type may have several windows, when the UE4 engine shows the external landscape of the house type it usually only needs to show the view at the balcony and master-bedroom windows; therefore, when acquiring window attribute information from the Revit model file, only the window attribute information of the balcony and master-bedroom windows may be acquired. The process of selecting and extracting the window attribute information from the Revit model file is described below with a specific embodiment, and may specifically include the following contents:
in the Revit model file, windows are represented by points, a room-shaped contour line is a closed surface, the intersection relation between the points and the surface is used for judging which points (windows) are in which contour lines (rooms), and the windows belonging to balconies and living rooms can be obtained according to room label information mapped on contour lines. The point location of the window is intersected with the outer contour line according to the outer contour line and the point location corresponding to the window in the second building information model, and the room where the window is located is judged according to the intersection relationship between the point location of the window and the outer contour line.
Further, after determining window attribute information corresponding to a window to be extracted, a camera point location is generated according to the window attribute information, and a detailed description is given below of a generation manner of the camera point location with reference to the accompanying drawings and specific embodiments. Fig. 3 is a schematic diagram of determining a camera point location in a Revit model according to an embodiment of the present disclosure. As shown in fig. 3, the generation process of the camera point location may specifically include:
The center point of the balcony or master-bedroom window is obtained and the normal direction of the window is calculated. In Revit modeling, a window is usually hosted on a wall, so the wall facade corresponding to the window can be found through the window attributes, the XY plane vector of the facade's center line is computed, and from it the normal vector perpendicular to the wall facade is obtained; the window normal vector is a three-dimensional vector perpendicular to the wall facade. The camera point location is then computed as shown in fig. 3: taking the window center point as the origin, the point is moved 1 m in the positive and negative directions of the normal vector to obtain two candidate point locations; for each candidate it is judged whether it lies inside the room contour line, and the candidate that does is taken as the correct camera point location and its coordinates are stored.
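A minimal geometric sketch of this camera-point computation follows; it assumes the wall facade center line and the room contour have already been extracted as plain 2D coordinates, and uses a ray-casting point-in-polygon test as one possible way to decide which candidate lies inside the contour.

```python
import math


def point_in_polygon(pt, polygon):
    """Ray-casting test: is the 2D point inside the closed contour?"""
    x, y = pt
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x_cross > x:
                inside = not inside
    return inside


def camera_point(window_center, wall_line, room_contour, offset=1.0):
    """Pick the camera point for one window.

    window_center : (x, y) center of the balcony / master-bedroom window
    wall_line     : ((x1, y1), (x2, y2)) XY center line of the wall facade
    room_contour  : closed outer contour of the house type, [(x, y), ...]
    offset        : distance moved along the window normal (1 m in the text)
    """
    (x1, y1), (x2, y2) = wall_line
    dx, dy = x2 - x1, y2 - y1
    length = math.hypot(dx, dy)
    # normal of the wall facade: the XY direction of its center line rotated 90 degrees
    nx, ny = -dy / length, dx / length

    wx, wy = window_center
    candidates = [(wx + nx * offset, wy + ny * offset),
                  (wx - nx * offset, wy - ny * offset)]
    # keep the candidate that falls inside the room contour (i.e. indoors)
    for cand in candidates:
        if point_in_polygon(cand, room_contour):
            return cand
    return None
```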
In some embodiments, combining the house type data and the camera point locations into an element information object comprises: generating, for each house type, a data structure representing the house type structure and the camera position from the house type data and the camera point location of that house type, and exporting the data structure as an element information object in Json format.
Specifically, after the house type data is parsed from the Revit model file and the camera point locations are computed, a key element information object in Json format is generated for each house type from the house type data and camera point location; it contains the floor identifier, house type identifier, outer contour line, elevation, whole area, window facing direction, window center point location, camera point location and similar information for that house type. The finally generated Json element information object therefore includes the coordinates corresponding to the camera point location.
It should be noted that each model intermediate file corresponds to a model file identifier, and a correspondence between the model file identifier and the identifier of the element information object is established. In practical applications, information such as house type outer contour lines, house type identifiers (such as house numbers), room contours (two-dimensional arrays of contour line coordinates) and elevations can be further extracted, and a unique room identification ID is generated and associated, as illustrated below.
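For illustration only, an element information object of this kind might be laid out as follows; the field names and values are assumptions based on the information listed above, not a format mandated by the disclosure.

```python
import json

# Illustrative element information object for one house type; all identifiers
# and coordinates are made up for the example.
element_info = {
    "model_file_id": "XX-Lakeside-a-f06",
    "house_types": [
        {
            "floor_id": "F06",
            "house_type_id": "A1",
            "room_id": "F06-A1-0601",
            "outer_contour": [[0.0, 0.0], [9.6, 0.0], [9.6, 11.2], [0.0, 11.2]],
            "elevation": 17.4,
            "area": 89.5,
            "window_center": [4.8, 11.2],
            "window_normal": [0.0, 1.0, 0.0],
            "camera_point": [4.8, 10.2, 19.0],
        }
    ],
}
print(json.dumps(element_info, indent=2))
```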
According to the technical scheme provided by the embodiment of the disclosure, the Revit model is converted into the model intermediate file in the datasmith format, house type data is obtained through extraction and analysis according to the Revit model file, corresponding camera point positions are generated based on the house type data, and the house type data and the camera point positions are combined into element information in the Json format, so that the electronic sand table based on the UE4 engine can automatically load the model intermediate file and analyze the element information. The corresponding camera point location is automatically generated for the window of each house type, the camera point location does not need to be manually added, the workload of house scene configuration is reduced, and the accuracy of the camera point location during house scene configuration is improved.
In some embodiments, the engine corresponding to the preset digital sand table employs a UE4 engine, and loads the model intermediate file by using the engine corresponding to the preset digital sand table, so as to render the building model corresponding to the model intermediate file in the digital sand table, including: and loading the model intermediate file in the datasmith format into a UE4 engine by using a preset plug-in, rendering the building model corresponding to the model intermediate file in the UE4 engine, wherein the rendering is used for displaying the effect corresponding to the building model in a pre-configured digital sand table containing the external scene of the building.
Specifically, the UE4 engine provides a datasmith plug-in for importing and loading the datasmith data converted by the Revit model into the UE4 engine, that is, the model intermediate file in the datasmith format is automatically loaded into the UE4 engine by calling the datasmith plug-in through an API preset in the UE4 engine.
Here, the datasmith-format model file is used to express the model geometry and presentation effect in the digital sand table; it has no interaction or parameter-association capability of its own. The Json element information file loaded and parsed in the UE4 engine is used to generate block models (house type blocks) for interaction, so what the user actually clicks during interaction is not the datasmith model but a block generated from the Json file specifically for interaction.
In some embodiments, performing a parsing operation on the element information object to obtain a house type block corresponding to each house type includes: and serializing the element information objects in the Json format, traversing house type data corresponding to the house types in the serialized element information objects, and constructing house type blocks corresponding to the house types in a three-dimensional coordinate system of the UE4 engine according to outer contour lines and elevations corresponding to the house types.
Specifically, the serialization process of the element information object may be regarded as a process of converting the object into a data format (e.g., into a byte string of a specific format) that can be transmitted through a network or stored to a local disk. And traversing the element information after the serialization processing, traversing all floors and house types, generating a house type block according to an outer contour line and an elevation of the house type, and associating the house type identifier with the house type block. The following detailed description of the generation process of the atrial block with reference to the specific embodiment may specifically include the following:
the three-dimensional space in the UE4 engine is a three-axis coordinate system, which is the X-axis, Y-axis, and Z-axis, respectively. The house-type contour is a closed polygonal surface, numerical values are only arranged on an X axis and a Y axis, and a polygonal space geometric body is calculated through the numerical values of the X axis and the Y axis of the house-type contour and the numerical value (namely elevation) of the Z axis, and the polygonal space geometric body can be regarded as a house-type block. That is, the principle of creating a house block is that a polygonal block with a height of Z-axis value is formed by stretching a polygonal surface (i.e. house profile) in Z-axis direction, and the house elevation is the height of the polygonal block, i.e. a building block can be created by stretching according to the elevation.
In some embodiments, establishing an interaction relationship between the camera point locations of each house type and the house type blocks comprises: acquiring point location coordinates corresponding to the camera point location of the house type and the identification of the house type block, and establishing an interactive relation between the point location coordinates of the camera point location and the identification of the house type block, wherein the interactive relation is used for indicating that the camera point location rotates or moves in a building model of a UE4 engine according to a preset interactive mode when the house type block is clicked.
Specifically, the house type blocks generated in the UE4 engine are blocks to which interactive operations can be added. In practical applications, the camera point location and the house type block of each house type are bound together by establishing an interaction relationship between them in the UE4 engine, and an interactive operation is added to each house type block; the interaction may be that, after a house type block is clicked, the view angle of the digital sand table moves rapidly to the view angle of the camera point location associated with that block, and the orientation of the camera after the move can, for example, be obtained by inverting the normal of the window.
In some embodiments, the method further comprises: and generating a camera object for the window of each house type in the building model of the UE4 engine according to the camera point position corresponding to each house type and the normal vector of the window, and moving the current view angle corresponding to the digital sand table to the view angle corresponding to the camera object when triggering the selection operation of the house type blocks in the building model for the digital sand table.
Specifically, a camera object for showing the house scene is bound to each house type in the UE4 engine according to its camera point location and window normal direction. The camera object is a kind of virtual camera in the UE4 engine; each virtual camera is placed inside a house type block, and its position within the block can be a relative coordinate 1.6 m above the floor slab of the storey and 1 m from the window, as sketched below.
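The following sketch illustrates, under assumed coordinates, how the camera position and orientation could be derived from the window center, the window normal and the offsets mentioned above (1.6 m above the floor, 1 m back from the window); the function names and the block identifier are illustrative and do not represent the UE4 API.

```python
def place_camera(window_center_xy, window_normal_xy, floor_z,
                 back_off=1.0, eye_height=1.6):
    """Compute a camera pose for one house type.

    The camera stands `back_off` metres inside the room (opposite to the
    outward window normal), `eye_height` metres above the floor slab, and
    looks outward along the window normal.
    """
    nx, ny = window_normal_xy
    wx, wy = window_center_xy
    position = (wx - nx * back_off, wy - ny * back_off, floor_z + eye_height)
    look_direction = (nx, ny, 0.0)   # facing out of the window
    return position, look_direction


# Interaction table: clicking a house type block moves the sand-table view to
# the camera bound to that block (illustrative mapping, not engine code).
camera_by_block = {"F06-A1": place_camera((4.8, 11.2), (0.0, 1.0), 17.4)}
```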
According to the technical scheme provided by the embodiment of the disclosure, house type data are extracted from a Revit model produced by BIM forward design, a camera point location is generated for each house type from the window attribute data in the Revit model, and the house type data and camera point locations are combined into element information; the digital sand table based on the UE4 engine parses the Json-format element information, computes the house type block of each house type, and automatically creates a camera object in the house type block at the camera point location. Specifically, the camera point locations used for one-house-one-scene display are generated automatically from the window normal vectors and screened based on the relation between the window and the wall surface, and floor, house type, room and other information is extracted to automatically generate the house type blocks used for one-house-one-scene interaction. An interaction relationship between the house type blocks and the camera point locations is established in the UE4 engine, so that after the user selects a house type block, the digital sand table automatically moves the view angle to the position of the camera object corresponding to that block. The method and the device can automatically generate camera point locations, alleviate the low configuration efficiency and increased error rate caused by manual configuration, reduce the difficulty of house scene configuration and improve the house scene display effect.
The following are embodiments of the disclosed apparatus that may be used to perform embodiments of the disclosed methods. For details not disclosed in the embodiments of the apparatus of the present disclosure, refer to the embodiments of the method of the present disclosure.
Fig. 4 is a schematic structural diagram of a digital sand table-based house view display device according to an embodiment of the present disclosure. As shown in fig. 4, the digital sand table-based house view display device includes:
the obtaining module 401 is configured to obtain a first building information model, process a model file corresponding to the first building information model to obtain a second building information model, and convert the second building information model into a model intermediate file in a predetermined format;
a generating module 402, configured to sequentially traverse each house type in the second building information model to obtain house type data corresponding to each house type, generate a camera point location corresponding to the house type based on the window attribute information, and combine the house type data and the camera point location into an element information object;
a loading module 403, configured to load the model intermediate file by using an engine corresponding to a preset digital sand table, so as to render a building model corresponding to the model intermediate file in the digital sand table, and perform an analysis operation on the element information object, so as to obtain a house-type block corresponding to each house type;
an interaction module 404 configured to establish an interaction relationship between the camera point location and the house type block of each house type, and when a selection operation for the house type block in the building model of the digital sand table is triggered, adjust the view angle of the digital sand table to the view angle of the camera point location so as to display the house view corresponding to the view angle of the camera point location.
In some embodiments, the obtaining module 401 in fig. 4 obtains the model file and the file name corresponding to the model file, sequentially matches the file names of the model files according to the key character string corresponding to the preset target model file, and loads the successfully matched model file to obtain the second building information model; the target model file is a model file related to the building structure, and the first building information model and the second building information model are both Revit models.
In some embodiments, the obtaining module 401 of fig. 4 converts the Revit model corresponding to the second building information model into a model intermediate file with datasmith as a suffix format.
In some embodiments, the generation module 402 in fig. 4 determines all floors in the second building information model, and sequentially traverses the house type in each floor along the floor in units of floors, so as to extract a floor identifier, a house type identifier, an outer contour line, and an elevation corresponding to each house type; the method comprises the steps of calculating the house type area corresponding to each house type according to the area parameter corresponding to each room in each house type, generating structured data based on floor identification, house type identification, outer contour lines, elevation and the house type area, and taking the structured data as house type data.
In some embodiments, the generating module 402 of fig. 4 obtains the window attribute information corresponding to each house type, determines a wall facade corresponding to the window according to the window attribute information, calculates a plane vector of a center line corresponding to the wall facade, and determines a normal vector of the window according to the plane vector; and respectively moving the window along the normal vector of the window to positive and negative directions for a preset distance to obtain two point location coordinates, judging whether the point location is in the outer contour line according to the point location coordinates, and taking the point location coordinates belonging to the outer contour line as the camera point location.
In some embodiments, the generation module 402 of fig. 4 generates a data structure representing the house type structure and the camera position for each house type from the house type data and the camera point location corresponding to each house type, and exports the data structure as an element information object in Json format.
In some embodiments, the engine corresponding to the preset digital sand table is a UE4 engine, and the loading module 403 in fig. 4 loads the model intermediate file in the datasmith format into the UE4 engine by using a preset plug-in, and renders the building model corresponding to the model intermediate file in the UE4 engine, where the rendering is used to show the effect corresponding to the building model in the preconfigured digital sand table containing the external scene of the building.
In some embodiments, the loading module 403 in fig. 4 performs serialization processing on the element information object in the Json format, traverses room type data corresponding to the room type in the element information object after the serialization processing, and constructs a room type block corresponding to each room type in the three-dimensional coordinate system of the UE4 engine according to an outer contour line and an elevation corresponding to each room type.
In some embodiments, the interaction module 404 in fig. 4 obtains point location coordinates corresponding to the camera point location of the house type and an identifier of the house type block, and establishes an interaction relationship between the point location coordinates of the camera point location and the identifier of the house type block, where the interaction relationship is used to indicate that the camera point location rotates or moves in the building model of the UE4 engine according to a preset interaction manner when the house type block is clicked.
In some embodiments, the interaction module 404 of fig. 4 generates a camera object for a window of each house type in the building model of the UE4 engine according to the camera point location corresponding to each house type and the normal vector of the window, and moves the current view angle corresponding to the digital sand table to the view angle corresponding to the camera object when the selection operation for the house type block in the building model of the digital sand table is triggered.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation on the implementation process of the embodiments of the present disclosure.
Fig. 5 is a schematic structural diagram of an electronic device 5 provided in the embodiment of the present disclosure. As shown in fig. 5, the electronic apparatus 5 of this embodiment includes: a processor 501, a memory 502 and a computer program 503 stored in the memory 502 and operable on the processor 501. The steps in the various method embodiments described above are implemented when the processor 501 executes the computer program 503. Alternatively, the processor 501 implements the functions of the respective modules/units in the above-described respective apparatus embodiments when executing the computer program 503.
Illustratively, the computer program 503 may be partitioned into one or more modules/units, which are stored in the memory 502 and executed by the processor 501 to accomplish the present disclosure. One or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution of the computer program 503 in the electronic device 5.
The electronic device 5 may be a desktop computer, a notebook, a palm computer, a cloud server, or other electronic devices. The electronic device 5 may include, but is not limited to, a processor 501 and a memory 502. Those skilled in the art will appreciate that fig. 5 is merely an example of the electronic device 5, and does not constitute a limitation of the electronic device 5, and may include more or less components than those shown, or combine certain components, or be different components, e.g., the electronic device may also include input-output devices, network access devices, buses, etc.
The Processor 501 may be a Central Processing Unit (CPU), other general purpose Processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other Programmable logic device, discrete Gate or transistor logic device, discrete hardware component, or the like. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 502 may be an internal storage unit of the electronic device 5, for example, a hard disk or a memory of the electronic device 5. The memory 502 may also be an external storage device of the electronic device 5, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device 5. Further, the memory 502 may also include both an internal storage unit and an external storage device of the electronic device 5. The memory 502 is used for storing the computer program and other programs and data required by the electronic device. The memory 502 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above division of functional units and modules is given as an example. In practical applications, the above functions may be allocated to different functional units and modules as needed; that is, the internal structure of the apparatus may be divided into different functional units or modules to perform all or part of the functions described above. The functional units and modules in the embodiments may be integrated into one processing unit, each unit may exist alone physically, or two or more units may be integrated into one unit, and the integrated unit may be implemented in the form of hardware or in the form of a software functional unit. In addition, the specific names of the functional units and modules are only for the convenience of distinguishing them from each other and are not intended to limit the protection scope of the present disclosure. For the specific working processes of the units and modules in the above system, reference may be made to the corresponding processes in the foregoing method embodiments, which are not repeated here.
In the above embodiments, the description of each embodiment has its own emphasis; for parts that are not described or detailed in one embodiment, reference may be made to the related descriptions of other embodiments.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.
In the embodiments provided in the present disclosure, it should be understood that the disclosed apparatus/computer device and method may be implemented in other ways. For example, the above-described apparatus/computer device embodiments are merely illustrative: the division of modules or units is only a division by logical function, and there may be other divisions in actual implementation; multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
Units described as separate parts may or may not be physically separate, and parts shown as units may or may not be physical units; they may be located in one place or distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, functional units in the embodiments of the present disclosure may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer readable storage medium. Based on such understanding, the present disclosure may implement all or part of the flow of the methods in the above embodiments by instructing related hardware through a computer program; the computer program may be stored in a computer readable storage medium, and when it is executed by a processor, the steps of the above method embodiments may be implemented. The computer program may comprise computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash drive, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer readable medium may be appropriately increased or decreased according to the requirements of legislation and patent practice in a jurisdiction; for example, in some jurisdictions, the computer readable medium does not include electrical carrier signals or telecommunications signals.
The above embodiments are only intended to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features may be equivalently replaced; such modifications and substitutions do not make the corresponding technical solutions depart from the spirit and scope of the embodiments of the present disclosure, and are intended to be included within the protection scope of the present disclosure.

Claims (13)

1. A house scene display method based on a digital sand table is characterized by comprising the following steps:
acquiring a first building information model, processing a model file corresponding to the first building information model to obtain a second building information model, and converting the second building information model into a model intermediate file with a preset format;
sequentially traversing each house type in the second building information model to obtain house type data corresponding to each house type, generating a camera point location corresponding to each house type based on window attribute information, and combining the house type data and the camera point location into an element information object;
loading the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and performing an analysis operation on the element information object to obtain a house type block corresponding to each house type;
establishing an interactive relation between the camera point location of each house type and the house type block, and when a selection operation for the house type block in the building model of the digital sand table is triggered, adjusting the visual angle of the digital sand table to the visual angle of the camera point location so as to display the house view corresponding to the visual angle of the camera point location;
wherein the generating of the camera point location corresponding to the house type based on the window attribute information comprises: acquiring window attribute information corresponding to each house type, determining a wall facade corresponding to a window according to the window attribute information, calculating a plane vector of a center line corresponding to the wall facade, determining a normal vector of the window according to the plane vector, and generating a camera point position based on the normal vector of the window.
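As a hedged illustration of the last step of claim 1 (not part of the claim itself), the sketch below computes a window normal from the plane vector of the wall facade centre line in plan view: rotating the centre-line direction by 90 degrees yields a vector perpendicular to the wall, whose inward/outward sign is resolved later by the contour test of claim 5. The function name and the 2D simplification are assumptions for this example.

```python
def window_normal_from_wall_centerline(p_start, p_end):
    """Compute a horizontal unit normal of a window from the direction
    (plane vector) of the centre line of the wall facade hosting it.
    p_start, p_end: (x, y) endpoints of the wall centre line in plan view."""
    dx, dy = p_end[0] - p_start[0], p_end[1] - p_start[1]
    length = (dx * dx + dy * dy) ** 0.5 or 1.0
    # A 90-degree rotation of the plane vector is perpendicular to the wall facade;
    # whether it points into or out of the house type is decided afterwards.
    return (-dy / length, dx / length)
```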
2. The method according to claim 1, wherein the acquiring a first building information model and processing a model file corresponding to the first building information model to obtain a second building information model comprises:
acquiring model files and the file names corresponding to the model files, sequentially matching the file names of the model files against a key character string corresponding to a preset target model file, and loading the successfully matched model files to obtain the second building information model;
the target model file is a model file related to a building structure, and the first building information model and the second building information model are both Revit models.
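A minimal sketch of the filename matching described in claim 2 follows; the key strings shown are placeholders, since the actual key character strings of the target model files are project-specific.

```python
def filter_structural_model_files(model_files, key_strings=("structure", "wall", "window", "floor")):
    """Keep only model files whose names contain a preset key character
    string associated with the building structure; the retained files are
    then loaded to form the second building information model."""
    return [name for name in model_files
            if any(key in name.lower() for key in key_strings)]

# Example
kept = filter_structural_model_files(["tower_structure.rvt", "landscape.rvt"])
```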
3. The method of claim 2, wherein the converting the second building information model into a model intermediate file with a preset format comprises:
converting the Revit model corresponding to the second building information model into a model intermediate file with a datasmith suffix format.
4. The method of claim 1, wherein traversing each house type in the second building information model in sequence to obtain house type data corresponding to each house type comprises:
determining all floors in the second building information model, and traversing the house types in each floor in sequence, floor by floor, so as to extract a floor identifier, a house type identifier, an outer contour line and an elevation corresponding to each house type;
calculating a house type area corresponding to each house type according to an area parameter corresponding to each room in the house type, generating structured data based on the floor identifier, the house type identifier, the outer contour line, the elevation and the house type area, and using the structured data as the house type data.
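The structured data of claim 4 can be pictured with a small data holder such as the one below; this is an illustrative sketch, and the field names are assumptions rather than the patent's own data format.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class HouseTypeData:
    floor_id: str
    house_type_id: str
    outer_contour: List[Tuple[float, float]]  # plan-view outline of the house type
    elevation: float                          # elevation of the floor the house type sits on
    area: float                               # house type area

def build_house_type_data(floor_id, house_type_id, outer_contour, elevation, room_areas):
    """Sum the per-room area parameters into the house type area and pack
    the traversal results into structured house type data."""
    return HouseTypeData(floor_id, house_type_id, outer_contour,
                         elevation, sum(room_areas))
```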
5. The method of claim 4, wherein the generating the camera point location corresponding to the house type based on the window attribute information further comprises:
moving the window along its normal vector by a preset distance in the positive and negative directions respectively to obtain two point location coordinates, judging, for each point location coordinate, whether the point location is inside the outer contour line, and taking the point location coordinates located inside the outer contour line as the camera point location.
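An illustrative Python sketch of claim 5 follows: the window centre is offset by a preset distance along the positive and negative normal directions, and a standard ray-casting test keeps the candidate that lies inside the outer contour line. The function names and the default distance are assumptions.

```python
def point_in_contour(point, contour):
    """Ray-casting test: is the plan-view point inside the closed outer contour line?"""
    x, y = point
    inside = False
    for i in range(len(contour)):
        x1, y1 = contour[i]
        x2, y2 = contour[(i + 1) % len(contour)]
        if (y1 > y) != (y2 > y):                       # edge straddles the horizontal ray
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def camera_point_from_window(window_center, normal, contour, distance=1.0):
    """Move the window centre a preset distance along +normal and -normal,
    and keep the candidate point that falls inside the outer contour line."""
    (cx, cy), (nx, ny) = window_center, normal
    for candidate in ((cx + nx * distance, cy + ny * distance),
                      (cx - nx * distance, cy - ny * distance)):
        if point_in_contour(candidate, contour):
            return candidate
    return None  # neither candidate lies inside the house type
```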
6. The method of claim 1, wherein the combining the house type data and the camera point location into an element information object comprises:
generating, for each house type, a data structure representing the house type structure and the camera position according to the house type data and the camera point location corresponding to the house type, and exporting the data structure as an element information object in Json format.
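For claim 6, a minimal sketch of the Json export is shown below; the key names are illustrative assumptions rather than the element information object's actual schema.

```python
import json

def export_element_info(house_types):
    """Serialize the combined house type data and camera point locations
    into an element information object in Json format.
    house_types: list of dicts, e.g. {"floor_id": "F12", "house_type_id": "A1",
    "outer_contour": [[0, 0], [9, 0], [9, 7], [0, 7]], "elevation": 36.0,
    "area": 63.0, "camera_points": [[4.5, 1.0, 1.5]]}."""
    return json.dumps({"house_types": house_types}, ensure_ascii=False, indent=2)
```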
7. The method according to claim 1, wherein the engine corresponding to the preset digital sand table is a UE4 engine, and the loading the model intermediate file by using the engine corresponding to the preset digital sand table so as to render the building model corresponding to the model intermediate file in the digital sand table comprises:
loading the model intermediate file in the datasmith format into the UE4 engine by using a preset plug-in, and rendering the building model corresponding to the model intermediate file in the UE4 engine, wherein the rendering is used for showing the effect of the building model in a pre-configured digital sand table containing the external scene of the building.
8. The method of claim 6, wherein the performing a parsing operation on the element information object to obtain a house type block corresponding to each house type comprises:
deserializing the element information object in the Json format, traversing the house type data corresponding to each house type in the deserialized element information object, and constructing the house type block corresponding to each house type in a three-dimensional coordinate system of the UE4 engine according to the outer contour line and the elevation corresponding to the house type.
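Below is a sketch of the block construction of claim 8, under the assumption that each house type block can be represented as a simple prism: the outer contour line is placed at the house type's elevation and extruded upward by an assumed storey height. The names and the prism simplification are illustrative.

```python
import json

def build_house_type_blocks(element_info_json, storey_height=3.0):
    """Deserialize the Json element information object and extrude each
    house type's outer contour line from its elevation upward, producing
    the vertex list of a house type block in a 3D coordinate system."""
    blocks = []
    for house_type in json.loads(element_info_json)["house_types"]:
        z0 = house_type["elevation"]
        contour = house_type["outer_contour"]
        bottom = [(x, y, z0) for x, y in contour]
        top = [(x, y, z0 + storey_height) for x, y in contour]
        blocks.append({"id": house_type["house_type_id"], "vertices": bottom + top})
    return blocks
```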
9. The method of claim 1, wherein the establishing an interaction relationship between the camera point location of each house type and the house type block comprises:
acquiring the point location coordinates corresponding to the camera point location of the house type and the identifier of the house type block, and establishing an interactive relationship between the point location coordinates of the camera point location and the identifier of the house type block, wherein the interactive relationship is used for indicating that, when the house type block is clicked, the camera point location rotates or moves in the building model of the UE4 engine according to a preset interactive mode.
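The interactive relationship of claim 9 can be pictured as a lookup table from block identifier to camera point coordinates; the sketch below is an illustrative assumption, not the patent's data structure.

```python
def build_interaction_map(blocks, camera_points_by_id):
    """Map the identifier of each house type block to the point location
    coordinates of its camera points, so a click on a block can look up
    the camera point to rotate or move the view to."""
    return {block["id"]: camera_points_by_id.get(block["id"], [])
            for block in blocks}

# Example: interaction = build_interaction_map(blocks, {"A1": [(4.5, 1.0, 1.5)]})
# interaction["A1"] gives the camera points to move the view to when block "A1" is clicked.
```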
10. The method of claim 9, further comprising:
generating, in the building model of the UE4 engine, a camera object for the window of each house type according to the camera point location corresponding to the house type and the normal vector of the window, and moving the current view angle of the digital sand table to the view angle corresponding to the camera object when the selection operation for the house type block in the building model of the digital sand table is triggered.
11. A house scene display device based on a digital sand table, characterized by comprising:
an acquisition module configured to acquire a first building information model, process a model file corresponding to the first building information model to obtain a second building information model, and convert the second building information model into a model intermediate file with a preset format;
a generating module configured to sequentially traverse each house type in the second building information model to obtain house type data corresponding to each house type, generate a camera point location corresponding to each house type based on window attribute information, and combine the house type data and the camera point location into an element information object;
a loading module configured to load the model intermediate file by using an engine corresponding to a preset digital sand table so as to render a building model corresponding to the model intermediate file in the digital sand table, and perform an analysis operation on the element information object to obtain a house type block corresponding to each house type;
an interaction module configured to establish an interaction relationship between the camera point location of each house type and the house type block, and, when a selection operation for the house type block in the building model of the digital sand table is triggered, adjust the view angle of the digital sand table to the view angle of the camera point location so as to display a house view corresponding to the view angle of the camera point location;
wherein the generating module is further configured to obtain the window attribute information corresponding to each house type, determine a wall facade corresponding to a window according to the window attribute information, calculate a plane vector of a center line corresponding to the wall facade, determine a normal vector of the window according to the plane vector, and generate the camera point location based on the normal vector of the window.
12. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the method of any one of claims 1 to 10 when executing the program.
13. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the method according to any one of claims 1 to 10.
CN202111084046.2A 2021-09-16 2021-09-16 Digital sand table-based house scene display method, device, equipment and storage medium Active CN113538706B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111084046.2A CN113538706B (en) 2021-09-16 2021-09-16 Digital sand table-based house scene display method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN113538706A CN113538706A (en) 2021-10-22
CN113538706B (en) 2021-12-31

Family

ID=78092695

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111084046.2A Active CN113538706B (en) 2021-09-16 2021-09-16 Digital sand table-based house scene display method, device, equipment and storage medium

Country Status (1)

Country Link
CN (1) CN113538706B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114067055B (en) * 2021-11-18 2022-11-11 北京优锘科技有限公司 Method, device, medium and equipment for rapidly generating building floor 3D model in real time
CN114237398B (en) * 2021-12-16 2024-04-16 深圳须弥云图空间科技有限公司 Method and device for generating room small map based on illusion engine and storage medium
CN114419248B (en) * 2021-12-23 2022-09-16 深圳健路网络科技有限责任公司 Three-dimensional building model dynamic loading method and system and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102930083A (en) * 2012-10-16 2013-02-13 苏州麦子软件科技有限公司 Houses real-time demonstrating system and method
CN106780756A (en) * 2016-11-30 2017-05-31 安徽金曦网络科技股份有限公司 Virtual reality house viewing system
CN108062797A (en) * 2017-12-15 2018-05-22 威仔软件科技(苏州)有限公司 A kind of selecting room device and select room method based on virtual reality technology
CN108089703A (en) * 2017-12-15 2018-05-29 威仔软件科技(苏州)有限公司 Room device and method is selected based on virtual reality technology in real time
CN110084293A (en) * 2019-04-18 2019-08-02 贝壳技术有限公司 A kind of determination method and apparatus in complete bright pattern house
CN112596713A (en) * 2020-12-30 2021-04-02 深圳须弥云图空间科技有限公司 Processing method and device based on illusion engine, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102609584A (en) * 2012-02-09 2012-07-25 孙华良 Device and method for outputting indoor soft decoration 3D (three-dimensional) effect drawing designs
CN112528353A (en) * 2020-12-18 2021-03-19 深圳须弥云图空间科技有限公司 Method and device for reconstructing three-dimensional scene based on CAD drawing

Also Published As

Publication number Publication date
CN113538706A (en) 2021-10-22

Similar Documents

Publication Publication Date Title
CN113538706B (en) Digital sand table-based house scene display method, device, equipment and storage medium
CN110807835B (en) Building BIM model and live-action three-dimensional model fusion method
CN111008422B (en) Building live-action map making method and system
Liu Three-dimensional visualized urban landscape planning and design based on virtual reality technology
CN112182700A (en) BIM three-dimensional building model display method based on Web end
CN107464286B (en) Method, device, equipment and readable medium for repairing holes in three-dimensional city model
CN107273543B (en) DGN data format conversion method
US20140244219A1 (en) Method of creating a pipe route line from a point cloud in three-dimensional modeling software
CN107153744B (en) Underground three-dimensional pipeline decision making system
CN115205476A (en) Three-dimensional geological modeling method, device, electronic equipment and storage medium
CN108664860A (en) The recognition methods of room floor plan and device
CN112053440A (en) Method for determining individualized model and communication device
CN102930083B (en) Houses real-time demonstrating system and method
CN111651826A (en) Building information model technology-based building industrialization system
CN109684656B (en) Assembly constraint inheritance method based on SolidWorks
CN112070901A (en) AR scene construction method and device for garden, storage medium and terminal
Altabtabai et al. A user interface for parametric architectural design reviews
CN114820968A (en) Three-dimensional visualization method and device, robot, electronic device and storage medium
CN108597025A (en) Accelerated model construction method and device based on artificial intelligence Virtual reality
CN115205484A (en) Three-dimensional space display method, device, equipment and medium for historical culture block
CN114821055A (en) House model construction method and device, readable storage medium and electronic equipment
CN112052489B (en) Method and system for generating house type graph
Ragia et al. Precise photorealistic visualization for restoration of historic buildings based on tacheometry data
CN112991520A (en) Design and implementation method based on batch rapid three-dimensional modeling of non-fine building
Zhang et al. Design and implementation of GIS+ BIM-based digital campus system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant