CN111008422B - Building live-action map making method and system - Google Patents


Info

Publication number
CN111008422B
CN111008422B (application CN201911205533.2A)
Authority
CN
China
Prior art keywords
data
file format
model
classification
building
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911205533.2A
Other languages
Chinese (zh)
Other versions
CN111008422A (en)
Inventor
刘建华
李聪聪
罗竟妍
李司宇
冯国强
温丹祺
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Civil Engineering and Architecture
Original Assignee
Beijing University of Civil Engineering and Architecture
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing University of Civil Engineering and Architecture filed Critical Beijing University of Civil Engineering and Architecture
Priority to CN201911205533.2A priority Critical patent/CN111008422B/en
Publication of CN111008422A publication Critical patent/CN111008422A/en
Application granted granted Critical
Publication of CN111008422B publication Critical patent/CN111008422B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B: EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B29/00: Maps; Plans; Charts; Diagrams, e.g. route diagram
    • G09B29/003: Maps
    • G09B29/004: Map manufacture or repair; Tear or ink or water resistant maps; Long-life maps
    • G09B29/005: Map projections or methods associated specifically therewith

Abstract

An embodiment of the invention provides a method and a system for making a live-action map of a building. The method comprises: constructing a BIM model of the target live-action scene and classifying the model structure of the BIM model by geometric structure to obtain a first classification geometry and a second classification geometry; storing the model data of the first classification geometry as first file format data according to its semantic type, and storing the model data of the second classification geometry as second file format data according to its semantic type; and acquiring texture data of the target scene and coordinate-matching the texture data against the first and second file format data respectively to obtain the live-action map corresponding to the target scene. The embodiment improves the precision of indoor and outdoor live-action maps of buildings, so that they can be applied more effectively to positioning and navigation in three-dimensional scenes.

Description

Building live-action map making method and system
Technical Field
The invention relates to the technical field of map making, in particular to a method and a system for making a real-scene map of a building.
Background
With the rapid development of fields such as geographic information systems (GIS), smartphones, close-range photogrammetry, augmented reality and deep learning, public demand for Location-Based Services (LBS) that integrate these technologies can no longer be satisfied by the positioning and navigation service mode of existing two-dimensional planar maps, and applied research on immersive, "as-if-present" live-action indoor positioning and navigation has grown rapidly. However, there is currently no effective and complete solution to the scientific problems of efficiently constructing indoor and outdoor three-dimensional live-action maps of buildings and of indoor positioning and navigation on mobile phones; rapidly constructing three-dimensional live-action buildings as geographic entities with holographic map elements, and serving intelligent mass-market mobile positioning and navigation within such scenes, is therefore an urgent need.
The live-action three-dimensional model is an important means of expressing urban spatial information and plays an important role in urban planning, construction, management, emergency response, indoor positioning and navigation, and so on. Current construction methods mainly comprise traditional manual modeling, automatic modeling from oblique photography, and three-dimensional laser-scanning modeling; each has its own advantages and limitations, and the resulting models cannot be effectively applied to refined indoor positioning and navigation in three-dimensional scenes.
Therefore, a method and a system for real-scene mapping of buildings are needed to solve the above problems.
Disclosure of Invention
Aiming at the problems in the prior art, the embodiment of the invention provides a method and a system for making a real scene map of a building.
In a first aspect, an embodiment of the present invention provides a method for making a real-scene map of a building, including:
building a BIM model of a target live-action, and classifying a model structure of the BIM model according to a geometric structure to obtain a first classification geometric structure and a second classification geometric structure;
storing the model data of the first classification geometrical structure as first file format data according to the semantic type of the first classification geometrical structure; storing the model data of the second classification geometrical structure as second file format data according to the semantic type of the second classification geometrical structure;
and acquiring texture data of the target live-action, and performing coordinate matching on the texture data and the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action.
Further, the first classification geometry includes at least walls, columns, doors, windows and stairs; the second classification geometry includes at least display boards, doorplates, safety-exit signs, bookshelves and fire-hydrant signboards.
Further, the texture data includes: building outdoor texture data and building indoor texture data.
Further, the storing the model data of the first classification geometry as first file format data according to the semantic type of the first classification geometry includes:
storing the model data of the first classification geometrical structure into model data of an FBX file format according to the semantic type of the first classification geometrical structure;
and converting the model data in the FBX file format into model data in a JSON file format to obtain first file format data.
Further, the storing the model data of the second classification geometry as second file format data according to the semantic type of the second classification geometry includes:
storing the model data of the second classification geometry into an image file format according to the semantic type of the second classification geometry;
and converting the model data in the image file format into model data in an OBJ file format to obtain second file format data.
Further, after the building the BIM model of the target real scene, the method further includes:
and carrying out lightweight processing on the BIM to obtain the BIM after lightweight processing so as to be used for manufacturing the live-action map.
Furthermore, the building outdoor texture data is obtained through an unmanned aerial vehicle, and the building indoor texture data is obtained through an image acquisition mobile terminal.
In a second aspect, an embodiment of the present invention provides a building live-action mapping system, including:
the model processing module is used for constructing a BIM (building information modeling) model of a target real scene and classifying the model structure of the BIM model according to the geometric structure to obtain a first classification geometric structure and a second classification geometric structure;
the model format conversion module is used for storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure; storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure;
and the texture matching module is used for acquiring texture data of the target live-action and performing coordinate matching on the texture data and the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action.
In a third aspect, an embodiment of the present invention provides an electronic device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the steps of the method as provided in the first aspect are implemented.
In a fourth aspect, embodiments of the present invention provide a non-transitory computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the steps of the method as provided in the first aspect.
According to the building live-action map making method and system provided by the embodiment of the invention, the different semantic structures of a building are classified, processed and stored, and texture-coordinate mapping matching is performed between the texture data and the building model. This improves the precision of the indoor and outdoor live-action maps of the building, so that the constructed live-action map can be applied more effectively to positioning and navigation in three-dimensional scenes.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and those skilled in the art can also obtain other drawings according to the drawings without creative efforts.
Fig. 1 is a schematic flow chart of a method for making a real map of a building according to an embodiment of the present invention;
fig. 2 is a schematic overall flow chart of a method for making a real-scene map of a building according to an embodiment of the present invention;
FIG. 3 is a schematic structural diagram of a real-scene mapping system for buildings according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without inventive step based on the embodiments of the present invention, are within the scope of protection of the present invention.
In recent years, with the wide application of building information models (BIM) in building technology, the combination of BIM with the geographic information and surveying and mapping industry has become a new field of research. BIM technology effectively improves design efficiency and the quality of building design schemes, introduces a full-life-cycle view spanning design, construction, operation and maintenance, and improves the economic benefit of digital engineering design and delivery. BIM data comprise the geometric structure, professional attributes and state information of individual, semantically segmented building components; these refined model components greatly improve the degree of information integration in building engineering and can provide basic refined component data for building map elements.
However, a BIM geometric model lacks the live-action information of the building interior and exterior; how to effectively obtain building textures with different semantic attributes and realize live-action texture mapping from material to structure is therefore a key problem in constructing an indoor-outdoor integrated live-action map of a building. In recent years, rapid modeling by unmanned-aerial-vehicle oblique photogrammetry has developed quickly and offers a fast, effective way to acquire outdoor live-action texture data of buildings. Multi-angle images of a scene are rapidly acquired with a multi-view sensor, and a UAV oblique-photogrammetry solution can automatically generate a triangulated mesh model of a large-scale scene with high precision and rich texture information.
The organization and management of building map elements is the foundation and framework for constructing a live-action map; expressing and organizing indoor element data effectively helps improve the performance and practicability of the building live-action map. Three-dimensional building map models can be divided into geometry-based, symbol-based and semantics-based space models. However, with the rapid growth of large, complex buildings in urban construction and the continued expansion of application fields, a single model can no longer meet the diversified requirements of complex indoor environments and application scenes, and hybrid space models are gradually being applied to the expression of complex indoor scenes. Existing live-action map models carry rich geometric and semantic information, but they lack the concept of a floor, offer weak support for path construction within a building and for semantic-association analysis among elements, and lack a comprehensive geometric-structure description, so they cannot be used on their own. The main problems are: 1. the lack of a general building map model that covers the geometric information, semantic attributes and topological relations of all building elements; 2. building map models each express only a single specific application and cannot adapt to the diversifying trend of application scenes.
Fig. 1 is a schematic flow chart of a method for making a real map of a building according to an embodiment of the present invention, and as shown in fig. 1, the embodiment of the present invention provides a method for making a real map of a building, including:
step 101, constructing a BIM model of a target live-action, and classifying a model structure of the BIM model according to a geometric structure to obtain a first classification geometric structure and a second classification geometric structure.
Specifically, in an embodiment of the invention, the first classification geometry comprises at least walls, columns, doors, windows and stairs; the second classification geometry comprises at least display boards, doorplates, safety-exit signs, bookshelves and fire-hydrant signboards.
In the embodiment of the invention, the BIM model of the target scene is constructed by sequentially establishing grids and floor lines, importing CAD documents, establishing layers, and so on. If a BIM model of the target scene already exists as finished result data, it can also be used directly. The BIM model of the target scene is integrated model data covering the whole interior and exterior. First, based on the models within each floor, the model data of geometric structures that are large in area, numerous and structurally uniform, such as walls, columns, doors, windows and stairs, are extracted layer by layer to obtain the first classification geometry; the surface textures of these elements are simple or carry little information. Then, the building auxiliary elements that are small in area, few in number and structurally diverse, such as display boards, doorplates, safety-exit signs, bookshelves and fire-hydrant signs, in each floor are taken as the second classification geometry.
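A minimal sketch of this classification step, assuming a simplified element record: the `semantic_type` and `floor` fields and the two category lists are illustrative stand-ins, not the patent's actual BIM data model.

```python
# Large, numerous, structurally uniform elements -> first classification
FIRST_CLASS = {"wall", "column", "door", "window", "stair"}
# Small, sparse, structurally diverse auxiliary elements -> second classification
SECOND_CLASS = {"display_board", "doorplate", "exit_sign", "bookshelf", "hydrant_sign"}

def classify_elements(elements):
    """Split BIM elements into the two geometry classes, grouped by floor."""
    first, second = {}, {}
    for elem in elements:
        target = first if elem["semantic_type"] in FIRST_CLASS else second
        target.setdefault(elem["floor"], []).append(elem)
    return first, second

elements = [
    {"semantic_type": "wall", "floor": 1},
    {"semantic_type": "door", "floor": 1},
    {"semantic_type": "exit_sign", "floor": 1},
    {"semantic_type": "column", "floor": 2},
]
first, second = classify_elements(elements)
print(len(first[1]), len(second[1]))  # 2 1
```

Grouping by floor at this stage is what later allows the layered (per-floor) loading described below.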
Step 102, storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure, specifically comprising: storing the model data of the first classification geometric structure as model data of an FBX file format according to the semantic type of the first classification geometric structure; and converting the model data in the FBX file format into model data in a JSON file format to obtain first file format data.
In the embodiment of the invention, after the first classification geometry is obtained, it is first stored in the FBX file format according to the different semantic classification requirements. The FBX format stores all information of a model as a scene graph/tree (which can also be regarded as a multi-way tree); the scene-tree root node contains nodes such as geometric meshes, cameras, light sources and skeletons, whose attribute information is read by traversal, so the basic information of the model is well preserved during model conversion. The FBX file is then imported into Blender and converted into a JSON (JavaScript Object Notation) data file for storage, yielding the first file format data. JSON is a lightweight data-interchange format that can be used for data exchange across many application programming interfaces (APIs). By storing the structurally uniform, highly normalized walls, columns, doors, windows and stairs as JSON data sets, the embodiment of the invention effectively increases the loading speed of the model data and improves the production efficiency of the live-action map.
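The JSON storage described above might be sketched as follows. The record layout is an assumption loosely modeled on three.js-style JSON geometry (flat vertex arrays plus face indices), keyed by semantic type and floor for layered loading; it is not the patent's actual schema.

```python
import json

def to_json_dataset(meshes):
    """Store first-classification geometry as a compact JSON data set."""
    dataset = {}
    for mesh in meshes:
        key = f'{mesh["semantic_type"]}/floor{mesh["floor"]}'
        dataset.setdefault(key, []).append({
            "vertices": mesh["vertices"],   # flat [x0, y0, z0, x1, ...]
            "faces": mesh["faces"],         # index triples into vertices
        })
    # compact separators keep the payload lightweight for Web loading
    return json.dumps(dataset, separators=(",", ":"))

wall = {"semantic_type": "wall", "floor": 1,
        "vertices": [0, 0, 0, 4, 0, 0, 4, 3, 0], "faces": [0, 1, 2]}
payload = to_json_dataset([wall])
print("wall/floor1" in payload)  # True
```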
Further, on the basis of the above embodiment, storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure specifically includes: storing the model data of the second classification geometrical structure into an image file format according to the semantic type of the second classification geometrical structure; and converting the model data in the image file format into model data in an OBJ file format to obtain second file format data.
In the embodiment of the present invention, the second classification geometry is first stored temporarily in an image file format so that, when the subsequent texture mapping is matched, model data in the OBJ file format, i.e. the second file format data, can be output. OBJ is a standard 3D model file format; it mainly supports polygon models together with normals and mapping (texture) coordinates, so the normals and texture coordinates of the model are preserved, providing an effective data-organization basis for constructing the indoor and outdoor live-action model of a building. In the embodiment of the invention, elements including but not limited to display boards, doorplates, safety-exit signs, bookshelves and fire-hydrant signboards serve as auxiliary basic data of the building map and are provided as OBJ models for storage and loading in the later live-action map making process.
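A sketch of emitting a second-classification element as an OBJ file that keeps normals (`vn`) and texture coordinates (`vt`) and references an MTL material; the file name and material name are illustrative assumptions.

```python
def write_obj(name, verts, uvs, normals, faces, mtl="textures.mtl"):
    """Serialize one element to OBJ text with v/vt/vn and a material reference."""
    lines = [f"mtllib {mtl}", f"o {name}"]
    lines += [f"v {x} {y} {z}" for x, y, z in verts]
    lines += [f"vt {u} {v}" for u, v in uvs]
    lines += [f"vn {x} {y} {z}" for x, y, z in normals]
    lines.append(f"usemtl {name}_mat")
    # OBJ face indices are 1-based, one v/vt/vn triple per corner
    lines += ["f " + " ".join(f"{i}/{i}/{i}" for i in face) for face in faces]
    return "\n".join(lines) + "\n"

obj_text = write_obj(
    "doorplate",
    verts=[(0, 0, 0), (1, 0, 0), (1, 1, 0)],
    uvs=[(0, 0), (1, 0), (1, 1)],
    normals=[(0, 0, 1)] * 3,
    faces=[(1, 2, 3)],
)
print(obj_text.splitlines()[0])  # mtllib textures.mtl
```

Keeping the `vt` coordinates in the file is what makes the later texture-to-geometry coordinate matching possible.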
Step 103, obtaining texture data of the target live-action, and performing coordinate matching on the texture data and the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action. Specifically, the texture data includes: building outdoor texture data and building indoor texture data.
Further, on the basis of the above embodiment, the building outdoor texture data is obtained by an unmanned aerial vehicle, and the building indoor texture data is obtained by an image acquisition mobile terminal.
Fig. 2 is a schematic overall flow chart of a method for making a live-action map of a building according to an embodiment of the present invention. As shown in fig. 2, before coordinate matching of the texture data, textures inside and outside the building must first be collected by category, and the texture data are obtained through processing steps of affine transformation, corner detection, texture extraction, semantic segmentation, front-view imaging, and so on. For a tall building, the texture information of each facade and of the roof cannot be captured directly, so multi-angle outdoor texture acquisition is carried out with a small, light unmanned aerial vehicle (UAV) to obtain the outdoor texture data; when acquiring the indoor texture data, the internal structure of the building is classified and the corresponding texture data are collected for the different geometric structures. Because the extracted indoor and outdoor texture data exhibit geometric distortion caused by camera tilt, lighting, the atmospheric environment and other factors, the acquired texture images must be corrected. In the embodiment of the invention, the vertex information of a texture map is first obtained through corner detection; the deformed texture is then rectified into an orthographic (front-view) texture image through a perspective transformation, yielding the corrected texture image data.
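The corner-detection plus perspective-transformation correction can be illustrated with a self-contained homography solve: four detected texture corners are mapped to an axis-aligned rectangle. In practice a library routine (e.g. OpenCV's getPerspectiveTransform) would be used; the corner coordinates below are invented for illustration only.

```python
def solve_homography(src, dst):
    """Four (x, y) corner pairs -> row-major 3x3 homography (h33 fixed to 1)."""
    A, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    n = 8
    M = [row + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):  # Gauss-Jordan elimination with partial pivoting
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(n):
            if r != col and M[r][col]:
                f = M[r][col] / M[col][col]
                M[r] = [a - f * c for a, c in zip(M[r], M[col])]
    return [M[i][n] / M[i][i] for i in range(n)] + [1.0]

def warp(h, x, y):
    """Apply the homography to one point (projective division by w)."""
    w = h[6] * x + h[7] * y + h[8]
    return ((h[0] * x + h[1] * y + h[2]) / w,
            (h[3] * x + h[4] * y + h[5]) / w)

# Skewed quadrilateral (as photographed) -> upright 100x50 rectangle
src = [(12, 8), (118, 22), (110, 70), (5, 60)]
dst = [(0, 0), (100, 0), (100, 50), (0, 50)]
H = solve_homography(src, dst)
print([round(c) for c in warp(H, 118, 22)])  # [100, 0]
```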
Further, on the basis of the above embodiment, after the texture data are obtained they are stored in the MTL (Material Library File) format, which records texture coordinates, the texture-map path, shadow-control effects, lighting conditions and so on, so that the map files in the texture data match the corresponding geometric-structure models. Preferably, as shown in fig. 2, on the basis of the foregoing embodiment, after building the BIM model of the target real scene, the method further includes:
and carrying out lightweight processing on the BIM to obtain the BIM after lightweight processing so as to be used for manufacturing the live-action map.
In the embodiment of the invention, after the high-precision BIM model is constructed, the model still contains many unnecessary line structures, such as stray strokes left over from the modeling process, which inflate the number of surface structures, slow down model loading and complicate texture rendering. To enhance the visualization of the BIM model and effectively improve its loading speed in practical GIS applications, the embodiment performs lightweight processing on the BIM model to obtain reconstructed lightweight data better suited to GIS application scenes.
Specifically, during light-weighting, the obtained BIM model data are first exported in DWG format from Revit, with the export color adjusted to true color (RGB values), and saved in the ACIS solid format. The exported true-color DWG model data are then imported into 3ds Max, converted into editable meshes, and lightweight-processed by deleting redundant line and surface structures and removing non-geometric data structures, yielding the lightweight BIM model data and a white (untextured) model in OBJ format.
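A minimal, self-contained sketch of the idea behind this clean-up; the mesh representation and the degeneracy test are assumptions for illustration, not the Revit/3ds Max pipeline itself, which does considerably more.

```python
def cross_norm_sq(a, b, c):
    """Squared norm of (b-a) x (c-a); zero means the triangle has no area."""
    ux, uy, uz = (b[i] - a[i] for i in range(3))
    vx, vy, vz = (c[i] - a[i] for i in range(3))
    cx, cy, cz = uy * vz - uz * vy, uz * vx - ux * vz, ux * vy - uy * vx
    return cx * cx + cy * cy + cz * cz

def lighten(vertices, faces, eps=1e-12):
    """Drop stray line work and zero-area slivers from a face list."""
    kept = []
    for face in faces:
        if len(set(face)) < 3:          # stray edge / point, not a surface
            continue
        a, b, c = (vertices[i] for i in face[:3])
        if cross_norm_sq(a, b, c) < eps:  # collinear -> zero-area sliver
            continue
        kept.append(face)
    return kept

verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (2, 0, 0)]
faces = [(0, 1, 2), (0, 1, 1), (0, 1, 3)]  # valid, stray, collinear
print(lighten(verts, faces))  # [(0, 1, 2)]
```

Removing such faces reduces the surface count, which is what speeds up loading and simplifies texture rendering.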
Further, in the embodiment of the present invention, after the indoor and outdoor geometry (i.e. the first and second classification geometries) and the texture data of the building are obtained, visualization proceeds as shown in fig. 2. WebGL is a technique for drawing, displaying and interacting with three-dimensional computer graphics in a browser and is one of the key technologies for visualizing indoor three-dimensional models. Because of the complexity of the raw WebGL API, the embodiment of the invention realizes visualization of the indoor map model with three.js, a framework library built on WebGL. Since three.js does not support BIM-format data directly, the BIM data must be converted into formats usable for Web visualization. In the embodiment of the invention, three.js supports loading many 3D model file formats: the JSON-format data, thanks to their lightweight organization, speed up loading of the three-dimensional model, while the OBJ-format data support refined texture mapping, making the constructed building scene more realistic. Fusing the two data files to build the indoor and outdoor scene model of the building therefore preserves scene realism while improving loading efficiency. At the Web end, the OBJ-format model data of the geometric structures are coordinate-matched with the MTL-format texture data, the JSON-format model data are loaded layer by layer, and the visualization of the indoor-outdoor integrated live-action map of the building in the browser is achieved through three.js.
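The OBJ-to-MTL matching that precedes Web loading can be sketched independently of the WebGL stack: every `usemtl` reference in the geometry must resolve to a `newmtl` entry whose `map_Kd` line points at a corrected texture image. The file contents and material names below are illustrative assumptions.

```python
def parse_mtl(text):
    """Parse newmtl / map_Kd lines from an MTL file into a dict."""
    mats, cur = {}, None
    for line in text.splitlines():
        parts = line.split()
        if not parts:
            continue
        if parts[0] == "newmtl":
            cur = parts[1]
            mats[cur] = {}
        elif parts[0] == "map_Kd" and cur:
            mats[cur]["map"] = parts[1]
    return mats

def unresolved_materials(obj_text, mats):
    """Names referenced by usemtl in the OBJ but missing from the MTL."""
    used = {p[1] for p in (l.split() for l in obj_text.splitlines())
            if p and p[0] == "usemtl"}
    return sorted(used - set(mats))

mtl = "newmtl doorplate_mat\nmap_Kd doorplate.jpg\n"
obj = "usemtl doorplate_mat\nf 1/1/1 2/2/2 3/3/3\n"
mats = parse_mtl(mtl)
print(unresolved_materials(obj, mats))  # []
```

An empty result means every geometric structure has a texture to be mapped onto it; any leftover names would surface as untextured surfaces in the browser.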
According to the building live-action map manufacturing method provided by the embodiment of the invention, different semantic structures of the building are classified, processed and stored, and texture coordinate mapping matching is carried out on texture data and a building model, so that the precision of the indoor and outdoor live-action map of the building is improved, and the constructed live-action map is more effectively applied to positioning and navigation of a three-dimensional scene.
In an embodiment of the present invention, indoor and outdoor BIM data of a particular building were collected to illustrate the making of its live-action map. Specifically, the multi-span building has six floors, five above ground and one below, and the building elements with high repetition probability and uniform texture information, including walls, columns, stairs, doors and windows, are displayed at the Web end in JSON format. The embodiment uses Blender to export the JSON-format model files, providing the connection between Blender and three.js.
Further, for elements in the building scene with low repetition probability or diverse texture information, such as display boards, electric boxes and fire hydrants, the embodiment of the invention loads an OBJ file combined with its MTL file, completing the texture matching between the texture data and the corresponding geometric structures, and performs layered loading of the JSON-format geometric structures, so as to produce indoor and outdoor live-action maps for the different floors of the building.
FIG. 3 is a schematic structural diagram of a live-action mapping system for buildings according to an embodiment of the present invention. As shown in fig. 3, the embodiment provides a building live-action mapping system comprising a model processing module 301, a model format conversion module 302 and a texture matching module 303. The model processing module 301 is configured to construct a BIM model of a target live-action scene and to classify the model structure of the BIM model by geometric structure to obtain a first classification geometry and a second classification geometry; the model format conversion module 302 is configured to store the model data of the first classification geometry as first file format data according to its semantic type, and to store the model data of the second classification geometry as second file format data according to its semantic type; the texture matching module 303 is configured to obtain texture data of the target scene and to coordinate-match the texture data against the first and second file format data respectively, obtaining the live-action map corresponding to the target scene.
According to the building live-action map manufacturing system provided by the embodiment of the invention, different semantic structures of a building are classified, processed and stored, and texture coordinate mapping matching is carried out on texture data and a building model, so that the precision of indoor and outdoor live-action maps of the building is improved, and the constructed live-action map is more effectively applied to positioning and navigation of a three-dimensional scene.
On the basis of the above embodiment, the system further includes: and the preprocessing module is used for carrying out lightweight processing on the BIM to obtain the BIM after the lightweight processing so as to manufacture a live-action map.
The system provided by the embodiment of the present invention is used for executing the above method embodiments, and for details of the process and the details, reference is made to the above embodiments, which are not described herein again.
Fig. 4 is a schematic structural diagram of an electronic device according to an embodiment of the present invention, and referring to fig. 4, the electronic device may include: a processor (processor) 401, a communication Interface (communication Interface) 402, a memory (memory) 403 and a communication bus 404, wherein the processor 401, the communication Interface 402 and the memory 403 complete communication with each other through the communication bus 404. Processor 401 may call logic instructions in memory 403 to perform the following method: building a BIM model of a target live-action, and classifying a model structure of the BIM model according to a geometric structure to obtain a first classification geometric structure and a second classification geometric structure; storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure; storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure; and acquiring texture data of the target live-action, and performing coordinate matching on the texture data and the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action.
In addition, the logic instructions in the memory 403 may be implemented in the form of software functional units and, when sold or used as independent products, stored in a computer-readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
In another aspect, an embodiment of the present invention further provides a non-transitory computer-readable storage medium on which a computer program is stored, the computer program, when executed by a processor, performing the building live-action map making method provided in the foregoing embodiments, the method including, for example: building a BIM model of a target live-action, and classifying the model structure of the BIM model according to geometric structure to obtain a first classification geometric structure and a second classification geometric structure; storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure; storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure; and acquiring texture data of the target live-action, and performing coordinate matching on the texture data with the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action.
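The two-branch storage scheme recited above can be sketched as follows. The two element class lists come from the claims; the route labels, function name, and the element names used are illustrative assumptions (real FBX-to-JSON and image-to-OBJ conversion would rely on external modeling tools, which are stubbed out here):

```python
# Sketch of the classification-and-format-routing step of the method:
# structural elements (first classification) go to the FBX -> JSON branch,
# signage-like elements (second classification) to the image -> OBJ branch.

FIRST_CLASS = {"wall", "stud", "door", "window", "stair"}          # FBX -> JSON
SECOND_CLASS = {"display board", "doorplate", "exit sign",
                "bookshelf", "fire hydrant sign"}                  # image -> OBJ

def route_element(semantic_type: str) -> str:
    """Return the target file-format chain for one BIM element."""
    if semantic_type in FIRST_CLASS:
        return "fbx->json"
    if semantic_type in SECOND_CLASS:
        return "image->obj"
    raise ValueError(f"unclassified semantic type: {semantic_type}")

model = ["wall", "door", "doorplate", "exit sign", "stair"]
routes = {element: route_element(element) for element in model}
print(routes)
```

Routing by semantic type keeps the heavy structural geometry in a mesh-oriented chain while flat, texture-dominated signage is reconstructed from images, which is consistent with the lightweight-processing goal of the embodiments.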
The above-described apparatus embodiments are merely illustrative; the units described as separate parts may or may not be physically separate, and the parts shown as units may or may not be physical units, that is, they may be located in one place or distributed over a plurality of network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. Those of ordinary skill in the art can understand and implement the embodiments without inventive effort.
Through the above description of the embodiments, those skilled in the art will clearly understand that each embodiment can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware. With this understanding in mind, the above-described technical solutions may be embodied in the form of a software product, which can be stored in a computer-readable storage medium such as ROM/RAM, magnetic disk, optical disk, etc., and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device, etc.) to execute the methods described in the embodiments or some parts of the embodiments.
Finally, it should be noted that: the above examples are only intended to illustrate the technical solution of the present invention, but not to limit it; although the present invention has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions of the embodiments of the present invention.

Claims (7)

1. A building live-action map making method, characterized by comprising the following steps:
building a BIM model of a target live-action, and classifying a model structure of the BIM model according to a geometric structure to obtain a first classification geometric structure and a second classification geometric structure;
storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure; storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure;
acquiring texture data of the target live-action, and performing coordinate matching on the texture data and the first file format data and the second file format data respectively to obtain a live-action map corresponding to the target live-action;
the first sort geometry comprises at least walls, studs, doors, windows and stairs; the second classification geometrical structure at least comprises a display board, a doorplate, a safety exit indication sign, a bookshelf and a fire hydrant signboard;
storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure comprises:
storing the model data of the first classification geometric structure as model data of an FBX file format according to the semantic type of the first classification geometric structure;
converting the model data in the FBX file format into model data in a JSON file format to obtain first file format data;
storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure comprises:
storing the model data of the second classification geometric structure in an image file format according to the semantic type of the second classification geometric structure;
and converting the model data in the image file format into model data in an OBJ file format to obtain the second file format data.
2. The method of claim 1, wherein the texture data comprises: building outdoor texture data and building indoor texture data.
3. The building live-action map making method of claim 1, wherein after the building of the BIM model of the target live-action, the method further comprises:
and carrying out lightweight processing on the BIM to obtain the BIM after lightweight processing so as to be used for manufacturing a live-action map.
4. The building live-action map making method as claimed in claim 2, wherein the building outdoor texture data is acquired by an unmanned aerial vehicle, and the building indoor texture data is acquired by an image acquisition mobile terminal.
5. A building live-action map making system, characterized by comprising:
the model processing module is used for constructing a BIM (building information modeling) model of a target real scene and classifying the model structure of the BIM model according to the geometric structure to obtain a first classification geometric structure and a second classification geometric structure;
the model format conversion module is used for storing the model data of the first classification geometric structure as first file format data according to the semantic type of the first classification geometric structure; storing the model data of the second classification geometric structure as second file format data according to the semantic type of the second classification geometric structure;
the texture matching module is used for acquiring texture data of the target real scene and performing coordinate matching on the texture data with the first file format data and the second file format data respectively to obtain a real scene map corresponding to the target real scene;
the first sort geometry includes at least walls, studs, doors, windows, and stairs; the second classification geometrical structure at least comprises a display board, a doorplate, a safety exit indication sign, a bookshelf and a fire hydrant signboard;
the model format conversion module is specifically configured to:
storing the model data of the first classification geometric structure as model data of an FBX file format according to the semantic type of the first classification geometric structure;
converting the model data in the FBX file format into model data in a JSON file format to obtain first file format data;
storing the model data of the second classification geometric structure in an image file format according to the semantic type of the second classification geometric structure;
and converting the model data in the image file format into model data in an OBJ file format to obtain second file format data.
6. An electronic device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor, when executing the program, performs the steps of the building live-action map making method according to any one of claims 1 to 4.
7. A non-transitory computer-readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the steps of the building live-action map making method according to any one of claims 1 to 4.
CN201911205533.2A 2019-11-29 2019-11-29 Building live-action map making method and system Active CN111008422B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911205533.2A CN111008422B (en) 2019-11-29 2019-11-29 Building live-action map making method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911205533.2A CN111008422B (en) 2019-11-29 2019-11-29 Building live-action map making method and system

Publications (2)

Publication Number Publication Date
CN111008422A CN111008422A (en) 2020-04-14
CN111008422B true CN111008422B (en) 2022-11-22

Family

ID=70112340

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911205533.2A Active CN111008422B (en) 2019-11-29 2019-11-29 Building live-action map making method and system

Country Status (1)

Country Link
CN (1) CN111008422B (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111930694B (en) * 2020-07-17 2023-07-28 深圳市万翼数字技术有限公司 Electronic file processing method, electronic device and processing server
CN112001012A (en) * 2020-08-17 2020-11-27 宇构科技(上海)有限公司 Building system based on BIM technology preset model
CN112288873B (en) * 2020-11-19 2024-04-09 网易(杭州)网络有限公司 Rendering method and device, computer readable storage medium and electronic equipment
CN113421292A (en) * 2021-06-25 2021-09-21 北京华捷艾米科技有限公司 Three-dimensional modeling detail enhancement method and device
CN113688459A (en) * 2021-08-31 2021-11-23 中国建筑第七工程局有限公司 BIM and AI-based construction site safety management system
CN113790730B (en) * 2021-08-31 2022-09-23 北京航空航天大学 Mobile robot navigation map conversion method and system based on DXF format
CN113835703B (en) * 2021-09-27 2024-03-15 北京斯年智驾科技有限公司 Method for drawing automatic driving monitoring visual map at WEB front end
CN114071392B (en) * 2021-12-28 2023-07-25 智小途(上海)数字科技有限公司 UWB indoor high-precision three-dimensional live-action data construction method and system
CN114037808B (en) * 2022-01-10 2022-03-29 武大吉奥信息技术有限公司 High-simulation white modulus data production method
CN114419121B (en) * 2022-01-20 2022-10-04 盈嘉互联(北京)科技有限公司 BIM texture generation method based on image
CN117197382B (en) * 2023-11-02 2024-01-12 广东省测绘产品质量监督检验中心 Live-action three-dimensional data construction method and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108520544A (en) * 2018-04-12 2018-09-11 郑州信大先进技术研究院 A kind of indoor map building method and device towards wisdom fire-fighting
CN108710739A (en) * 2018-05-11 2018-10-26 北京建筑大学 A kind of Building Information Model lightweight and three-dimensional scenic visualization method and system
CN109934914A (en) * 2019-03-28 2019-06-25 东南大学 A kind of embedded urban design scene simulation method and system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140267273A1 (en) * 2013-03-15 2014-09-18 Janne Kontkanen System and method for overlaying two-dimensional map elements over terrain geometry
CN109029488A (en) * 2018-06-29 2018-12-18 百度在线网络技术(北京)有限公司 Navigating electronic map generating method, equipment and storage medium


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Research on multi-source information fusion visualization methods for holographic position maps; Zhu Huachen et al.; Information Technology and Standardization; 2017-10-10 (No. 10); pp. 25-29 *
Visualization of building 3D Tiles data based on Industry Foundation Classes; Xu Zhao et al.; Journal of Zhejiang University (Engineering Science); 2019-04-12 (No. 06); pp. 1047-1055 *

Also Published As

Publication number Publication date
CN111008422A (en) 2020-04-14

Similar Documents

Publication Publication Date Title
CN111008422B (en) Building live-action map making method and system
CN109145366B (en) Web 3D-based lightweight visualization method for building information model
US11222465B2 (en) Embedded urban design scene emulation method and system
CN108919944B (en) Virtual roaming method for realizing data lossless interaction at display terminal based on digital city model
Heo et al. Productive high-complexity 3D city modeling with point clouds collected from terrestrial LiDAR
CN110704928B (en) Method for converting BIM model into GIS model
KR101465487B1 (en) Bim data processing system for extracting surface object of building
CN112069582A (en) Engineering scene establishing method
US20230074265A1 (en) Virtual scenario generation method and apparatus, computer device and storage medium
CN110660125B (en) Three-dimensional modeling device for power distribution network system
CN114049462B (en) Three-dimensional model monomer method and device
CN110070616A (en) Memory, statistical data rendering method, device and equipment based on GIS platform
CN112053440A (en) Method for determining individualized model and communication device
CN113112603A (en) Method and device for optimizing three-dimensional model
CN115937461B (en) Multi-source fusion model construction and texture generation method, device, medium and equipment
She et al. 3D building model simplification method considering both model mesh and building structure
CN114202622A (en) Virtual building generation method, device, equipment and computer readable storage medium
CN114758337A (en) Semantic instance reconstruction method, device, equipment and medium
Zhang et al. A geometry and texture coupled flexible generalization of urban building models
CN114820975A (en) Three-dimensional scene simulation reconstruction system and method based on all-element parameter symbolization
CN113051654B (en) Indoor stair three-dimensional geographic entity model construction method based on two-dimensional GIS data
Zhao et al. A 3D modeling method for buildings based on LiDAR point cloud and DLG
CN114972665A (en) Three-dimensional visual virtual scene modeling method in unmanned aerial vehicle virtual simulation
CN115186347A (en) Building CityGML modeling method combining house type plan and inclined model
CN113505185A (en) Three-dimensional scene rendering and displaying method for urban information model

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant