CN115545426A - Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning - Google Patents

Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning

Info

Publication number
CN115545426A
CN115545426A (application CN202211138810.4A)
Authority
CN
China
Prior art keywords
production
bim
point cloud
points
information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211138810.4A
Other languages
Chinese (zh)
Inventor
包胜
秦现德
章竑骎
卜航栋
徐洁
贺帅
徐健青
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Zhejiang Communications Construction Group Co Ltd
Original Assignee
Zhejiang University ZJU
Zhejiang Communications Construction Group Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU and Zhejiang Communications Construction Group Co Ltd
Priority to CN202211138810.4A
Publication of CN115545426A
Legal status: Pending


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q10/00Administration; Management
    • G06Q10/06Resources, workflows, human or project management; Enterprise or organisation planning; Enterprise or organisation modelling
    • G06Q10/063Operations research, analysis or management
    • G06Q10/0631Resource planning, allocation, distributing or scheduling for enterprises or organisations
    • G06Q10/06311Scheduling, planning or task assignment for a person or group
    • G06Q10/063114Status monitoring or status determination for a person or group
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06QINFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q50/00Information and communication technology [ICT] specially adapted for implementation of business processes of specific business sectors, e.g. utilities or tourism
    • G06Q50/08Construction
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/64Three-dimensional objects

Landscapes

  • Engineering & Computer Science (AREA)
  • Business, Economics & Management (AREA)
  • Theoretical Computer Science (AREA)
  • Human Resources & Organizations (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • General Health & Medical Sciences (AREA)
  • Economics (AREA)
  • Strategic Management (AREA)
  • Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Computing Systems (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Artificial Intelligence (AREA)
  • General Business, Economics & Management (AREA)
  • Marketing (AREA)
  • Medical Informatics (AREA)
  • Entrepreneurship & Innovation (AREA)
  • Tourism & Hospitality (AREA)
  • Software Systems (AREA)
  • Educational Administration (AREA)
  • Operations Research (AREA)
  • Quality & Reliability (AREA)
  • Game Theory and Decision Science (AREA)
  • Development Economics (AREA)
  • Primary Health Care (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The invention discloses a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning, which comprises an acquisition module, an identification module and a progress analysis module. The acquisition module acquires on-site manually filled-in information and video monitoring information, and acquires on-site three-dimensional point cloud information using three-dimensional laser scanning. The identification module locates the production pedestal, identifies the three-dimensional point cloud with a convolutional neural network algorithm, outputs an identification label, and judges the production stage of the prefabricated part. The progress analysis module associates the production stage obtained by the identification module with the BIM model and corrects the information in the BIM model. Based on BIM technology, three-dimensional laser scanning and a convolutional neural network algorithm, the three-dimensional point cloud and the BIM model are divided into a number of cubes, and the progress is judged from the spatial saturation and coordinate occupancy of each cube; the method has high accuracy and realizes the function of analyzing the production progress of prefabricated parts.

Description

Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning
Technical Field
The embodiment of the invention relates to the field of prefabricated part production of an assembly type building, in particular to a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning.
Background
With the large-scale domestic application of assembly-type buildings, precise production and management of their prefabricated components have become very important. However, prefabricated part factories usually analyze production progress manually and plan subsequent production and management by experience, which often causes great waste and yields inaccurate analysis results; production process management also consumes a large amount of manpower and material resources, and the degree of informatization is low.
In view of these problems, the inventors consider that BIM technology and three-dimensional laser scanning have advantages here: production information of the prefabricated part factory can be acquired quickly, identified, and then analyzed and compared against the original schedule. Combining production site information with the BIM model changes the production and management mode of the traditional prefabricated part factory and realizes precise production and management.
Disclosure of Invention
The invention aims to provide a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning, aiming at the defects of the prior art.
The purpose of the invention is realized by the following technical scheme: a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning comprises an acquisition module, an identification module and a progress analysis module;
the acquisition module is used for acquiring on-site manual filling information and video monitoring information and acquiring on-site three-dimensional point cloud information by utilizing a three-dimensional laser scanning technology.
The identification module is used for positioning the position of the production pedestal, identifying the three-dimensional point cloud based on a convolutional neural network algorithm, outputting an identification tag and judging the production stage of the prefabricated part.
The progress analysis module is used for correlating the production stage of the prefabricated part obtained by the identification module with the BIM model and correcting the information in the BIM model; the method comprises the following specific steps:
registering the three-dimensional point cloud and the BIM model and unifying their coordinates; selecting an area requiring progress analysis in the three-dimensional point cloud, where the average distance between points in the point cloud is a; setting a size o with 2a ≤ o ≤ 3a, and dividing the area into unit cubes with side length o, giving n cubes in total;
calculating the distances between all points in any cube i and constructing a Delaunay triangulation in cube i to form a triangulation network model; calculating the volume of each tetrahedron in the triangulation network model and summing them to obtain its total volume V_t, and then calculating the spatial saturation w_i of cube i:
[Formula: spatial saturation w_i of cube i (published only as an image)]
For cube i, the following equation applies:
[Formula: definition of the indicator p_i for cube i (published only as an image)]
If p_i = 1, the points in cube i are effective points; all points in cube i are retained, and their coordinate set is R_i. If p_i = 0, all points in cube i are deleted. This continues until all n cubes have been traversed, and the number of cubes containing effective points is counted as S_1.
In the BIM model, the same selected area is divided into n cubes with the same size and coordinates as for the three-dimensional point cloud. For each cube in the BIM model, the corresponding cube from the three-dimensional point cloud segmentation is found, each coordinate in its coordinate set is traversed, and whether that coordinate is occupied in the BIM model is judged; the number of cubes in which unoccupied coordinates account for more than 70% of the coordinate set is counted as S_2, and the points in these cubes are newly added points.
setting different judgment values m for different production processes; for the selected area, if any
[Formula: progress criterion relating S_1, S_2 and the judgment value m (published only as an image)]
The production of the prefabricated parts in the selected area is judged to be finished, otherwise, the production is judged not to be finished.
Further, the acquisition module includes:
and the manual filling unit is used for acquiring the production information of the manual filling site.
And the video monitoring unit is used for shooting the field production information.
And the three-dimensional point cloud unit acquires the on-site three-dimensional point cloud information by using a three-dimensional scanning technology.
Further, the identification module includes:
the point cloud identification unit adopts a PointNet + + network structure, and each sub-network consists of 6 edge convolution layers, 1 MLP layer and 1 maximum pooling layer; after the point cloud is input, the original features of each point in the point cloud are mapped to a high-dimensional feature space. And searching each sampling point by adopting a k nearest neighbor algorithm to obtain k field groups, extracting deep semantic geometric features of each group to obtain new point cloud, repeating the operation once again, recursing the whole point cloud, obtaining a classification score through a full connection layer and a SoftMax function, and outputting an identification label.
A positioning unit, which locates the prefabricated part from the position of the camera: the identification result of the point cloud identification unit is matched to the position of the corresponding production pedestal.
Further, the specific process of constructing the Delaunay triangulation is as follows: the two closest points are selected as the initial edge of the Delaunay triangle; starting from the two end points of the initial edge, the remaining points are traversed, and the point forming the smallest cosine of the included angle with the initial edge is taken as the third vertex to form the first Delaunay triangle; the edge of the first Delaunay triangle that forms the smallest cosine of the included angle with the initial edge is taken as a new initial edge, and further Delaunay triangles are constructed in the same way; triangle formation stops when the two end points of the first initial edge are again the end points.
Further, before the three-dimensional point cloud and the BIM are registered, noise reduction processing needs to be carried out on the three-dimensional point cloud, and miscellaneous points in the three-dimensional point cloud are removed.
Further, after the progress analysis module obtains the progress analysis result, the information obtained at the production site is integrated and added to the BIM model. The production site information obtained by the acquisition module, the identification module and the progress analysis module is collated: for a given production area, the production personnel, time, materials and quality inspection information are obtained by the acquisition module; the placement position of each prefabricated component is obtained by the identification module; and the production stage and production progress of each prefabricated component are obtained by the progress analysis module. A BIM model is built according to the effective points, the BIM model is updated after the information is collated, and the information is imported into the BIM model.
The invention has the following beneficial effects. The system is based on BIM technology, three-dimensional laser scanning and a convolutional neural network algorithm, and realizes the production progress analysis function for prefabricated parts, which is more reliable than manual progress analysis that depends on experience. The three-dimensional point cloud and the BIM model are divided into a number of cubes, and the progress is judged from the spatial saturation and coordinate occupancy, which is more accurate. By using BIM technology, the production situation inside a prefabricated part factory can be grasped, making the production process more scientific than traditional prefabricated part production.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
Fig. 1 is a structural diagram of a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning according to an embodiment of the present invention;
fig. 2 is a schematic flow diagram of an implementation process of a prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning according to an embodiment of the present invention.
Detailed Description
The invention is described in further detail below with reference to the drawings and the detailed description. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 shows a specific embodiment of the prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning according to an embodiment of the present invention. This embodiment can arrange subsequent production processes according to the current production stage of the prefabricated parts. As shown in fig. 1, the BIM and digital-twin-based method for analyzing the production progress of prefabricated parts of an assembly-type building may specifically comprise an acquisition module, an identification module and a progress analysis module.
The acquisition module is used for acquiring on-site manual filling information, video monitoring information and three-dimensional point cloud information.
The identification module is used for positioning the position of the production pedestal, identifying the three-dimensional point cloud based on a convolutional neural network algorithm, automatically outputting an identification tag and judging the production stage of the prefabricated part.
And the progress analysis module is used for associating the field progress with the original BIM model and correcting the information in the original BIM model.
Further, the acquisition module includes:
and the manual filling unit is used for acquiring the production information of the manual filling site.
And the video monitoring unit is used for shooting the field production information.
And the three-dimensional point cloud unit acquires field three-dimensional point cloud information by using a three-dimensional scanning technology.
Further, the identification module includes:
and the point cloud identification unit is used for identifying the three-dimensional point cloud based on a convolutional neural network algorithm, automatically outputting an identification label and judging the production stage of the prefabricated part. A PointNet + + network structure is adopted, and each sub-network is composed of 6 edge convolution layers, 1 MLP layer and 1 maximum pooling layer. After the point cloud is input, the original features of each point in the point cloud are mapped to a high-dimensional feature space. And searching each sampling point by adopting a k nearest neighbor algorithm to obtain k field groups, extracting deep semantic geometric features of each group to obtain new point clouds, repeating the operation once again, recursing the whole point clouds, obtaining classification scores through a full connection layer and a SoftMax function, and outputting identification labels. The point cloud identification unit can identify a steel reinforcement cage, concrete, a template, a pedestal and the like, so as to judge the production stage of the prefabricated part.
The positioning unit is used for locating the production pedestal: the position of each camera is uploaded, and the pedestal captured by that camera is determined. The positioning unit therefore knows the position of the pedestal on which a prefabricated part identified by the point cloud identification unit is placed, and the prefabricated part is thereby located.
Further, the progress analysis module includes:
and the model unit is used for carrying out noise reduction treatment on the three-dimensional point cloud and removing the miscellaneous points in the three-dimensional point cloud. First a suitable distance threshold is set and if the distance is greater than the given distance threshold, the points are removed. And removing noise points through multiple times of noise reduction processing. And then removing miscellaneous points which interfere with the progress analysis, such as site garbage, waste materials, personnel and the like according to the identification result of the point cloud identification unit, and finally reserving the points as prefabricated parts, pedestals and production equipment.
The progress unit is used for comparing the three-dimensional point cloud with the original BIM model to realize the progress analysis. First, two or more feature points are set, the three-dimensional point cloud and the original BIM model are registered, and their coordinates are unified.
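The registration step is described only as matching two or more feature points and unifying the coordinates; no specific algorithm is named. The sketch below assumes a standard Kabsch (rigid Procrustes) fit over matched feature points. Note that with only two pairs the rotation about their connecting axis is under-determined, so three or more non-collinear pairs are preferable.

```python
# Assumed registration approach: rigid Kabsch fit from matched feature points.
import numpy as np

def rigid_transform(src, dst):
    """Return rotation R and translation t so that dst is approximately src @ R.T + t."""
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    cs, cd = src.mean(axis=0), dst.mean(axis=0)
    h = (src - cs).T @ (dst - cd)                     # cross-covariance of centered points
    u, _, vt = np.linalg.svd(h)
    d = np.sign(np.linalg.det(vt.T @ u.T))            # guard against reflections
    r = vt.T @ np.diag([1.0, 1.0, d]) @ u.T
    t = cd - r @ cs
    return r, t

# Feature points picked in the scan and their (illustrative) BIM coordinates
scan_pts = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.], [0., 0., 1.]])
rot = np.array([[0., -1., 0.], [1., 0., 0.], [0., 0., 1.]])   # 90 degrees about z
bim_pts = scan_pts @ rot.T + np.array([5.0, 2.0, 0.0])
R, t = rigid_transform(scan_pts, bim_pts)
aligned = scan_pts @ R.T + t                          # scan expressed in BIM coordinates
print(np.allclose(aligned, bim_pts))
```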
An area requiring progress analysis is then selected in the three-dimensional point cloud, where the average distance between points in the point cloud is a. A size o is set with 2a ≤ o ≤ 3a, and the area is divided into unit cubes with side length o, giving n cubes in total.
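The division into unit cubes can be sketched as follows. Estimating the average spacing a from nearest-neighbor distances is an assumption, since the text does not state how a is computed.

```python
# Sketch of the cube (voxel) segmentation with side length o chosen between 2a and 3a.
import numpy as np
from scipy.spatial import cKDTree

def average_spacing(points):
    """Mean distance from each point to its nearest neighbor (our estimate of a)."""
    dists, _ = cKDTree(points).query(points, k=2)     # k=2: the point itself plus one neighbor
    return dists[:, 1].mean()

def voxelize(points, o):
    """Group points into unit cubes of side o; returns {cube index: array of points}."""
    origin = points.min(axis=0)
    keys = np.floor((points - origin) / o).astype(int)
    cubes = {}
    for key, p in zip(map(tuple, keys), points):
        cubes.setdefault(key, []).append(p)
    return {k: np.asarray(v) for k, v in cubes.items()}

points = np.random.rand(5000, 3)                      # stand-in for the selected area
a = average_spacing(points)
o = 2.5 * a                                           # any value with 2a <= o <= 3a
print("a =", round(a, 4), "| o =", round(o, 4), "| cubes containing points:", len(voxelize(points, o)))
```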
For any cube i, the distance between each pair of points in the cube is calculated according to the following formula:
d = √((x1 - x2)² + (y1 - y2)² + (z1 - z2)²), where (x1, y1, z1) and (x2, y2, z2) are the coordinates of the two points.
The two points with the smallest distance are then selected as the initial edge of the Delaunay triangle. Starting from the two end points of the initial edge, the remaining points are traversed, and the point forming the smallest cosine of the included angle with the initial edge is taken as the third vertex of the first Delaunay triangle. The edge of the first Delaunay triangle that forms the smallest cosine of the included angle with the initial edge is then taken as the new initial edge, further Delaunay triangles are constructed in the same way, and triangle formation stops when the two end points of the first initial edge are again the end points.
The volume of each tetrahedron in the triangulation network model is calculated, the volumes are summed to obtain the total volume V_t of the model, and the spatial saturation w_i of cube i is calculated:
[Formulas: tetrahedron volume, total volume V_t of the triangulation network model, and spatial saturation w_i (published only as images)]
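A compact sketch of the per-cube saturation computation follows. Two assumptions are made: SciPy's standard Delaunay tetrahedralization stands in for the incremental triangle construction described above, and the saturation is taken as w_i = V_t / o^3 (mesh volume over cube volume), since the published formulas are available only as images.

```python
# Sketch: Delaunay tetrahedralization of the points in one cube, total volume V_t,
# and an assumed saturation w_i = V_t / o**3.
import numpy as np
from scipy.spatial import Delaunay, QhullError

def spatial_saturation(cube_points, o):
    if len(cube_points) < 5:                          # too few points to tetrahedralize
        return 0.0
    try:
        tri = Delaunay(cube_points)
    except QhullError:                                # degenerate sets (e.g. coplanar points)
        return 0.0
    total = 0.0
    for simplex in tri.simplices:                     # each simplex is a tetrahedron in 3-D
        a, b, c, d = cube_points[simplex]
        total += abs(np.linalg.det(np.stack([b - a, c - a, d - a]))) / 6.0
    return total / o**3                               # assumed normalization by cube volume

o = 0.1
pts = np.random.rand(200, 3) * o                      # stand-in for the points inside cube i
print("w_i =", round(spatial_saturation(pts, o), 3))
```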
For cube i, the following equation applies:
[Formula: definition of the indicator p_i from the spatial saturation w_i (published only as an image)]
If p_i = 1, all points within cube i are kept, with coordinate set R_i; otherwise all points within cube i are deleted. This continues until all cubes have been traversed, with:
S_1 = Σ p_i (summed over i = 1, 2, …, n)
through the steps, whether effective points exist in each cube can be judged, and the number of cubes with points in the cubes can be counted.
In the BIM model, the same selected region is divided into n cubes with the same size and coordinates, following the unit-cube segmentation of the three-dimensional point cloud. For a given cube i, the coordinate set R_i is traversed and each coordinate is checked for occupancy in the BIM model; if the unoccupied coordinates account for more than 70% of the set R_i, then q_i = 1, otherwise q_i = 0. All cubes are traversed, with:
S_2 = Σ q_i (summed over i = 1, 2, …, n)
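The counting of S_1 and S_2 can be sketched as below. How a single coordinate is tested for occupancy inside the BIM model is not specified in the published text; the sketch approximates it by voxelizing points sampled from the BIM geometry at the same resolution o, which is an assumption.

```python
# Sketch of the S_1 / S_2 counts: S_1 = cubes holding retained scan points,
# S_2 = cubes whose coordinate set is more than 70 % unoccupied in the BIM model.
import numpy as np

def occupied_keys(points, origin, o):
    """Cube indices (voxel keys) that contain at least one point."""
    return set(map(tuple, np.floor((points - origin) / o).astype(int)))

def count_s1_s2(scan_cubes, bim_points, origin, o, new_fraction=0.70):
    """scan_cubes maps cube index -> retained scan points (the coordinate sets R_i)."""
    bim_occ = occupied_keys(bim_points, origin, o)
    s1, s2 = len(scan_cubes), 0
    for pts in scan_cubes.values():
        keys = np.floor((pts - origin) / o).astype(int)
        unoccupied = sum(tuple(k) not in bim_occ for k in keys)
        if unoccupied / len(keys) > new_fraction:     # more than 70 % not present in the BIM
            s2 += 1                                   # points in this cube are newly added
    return s1, s2

# Toy example: the scan covers a unit box, the BIM only occupies its lower half.
origin, o = np.zeros(3), 0.1
scan = np.random.rand(4000, 3)
scan_cubes = {}
for key, p in zip(map(tuple, np.floor((scan - origin) / o).astype(int)), scan):
    scan_cubes.setdefault(key, []).append(p)
scan_cubes = {k: np.asarray(v) for k, v in scan_cubes.items()}
bim = np.random.rand(4000, 3) * np.array([1.0, 1.0, 0.5])
print(count_s1_s2(scan_cubes, bim, origin, o))
```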
by the steps, each cube with points inside can be judged, and compared with the original BIM model, whether the point cloud in the cube is newly added or not can be judged.
Different judgment values m are set for different production processes. For the selected area, if
[Formula: progress criterion relating S_1, S_2 and the judgment value m (published only as an image)]
The production of the prefabricated parts in the selected area is judged to be finished; otherwise, the production is judged not to be finished. The judgment completion conditions for each process are shown in the following table, in which 0.9 is a point cloud loss coefficient used to reduce errors caused by points deleted during point cloud processing.
[Table: judgment completion conditions and values m for each production process (published only as an image)]
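Because the completion criterion and the per-process table are published only as images, the exact formula cannot be reproduced here. The sketch below assumes one plausible reading, in which the share of cubes holding newly produced points, corrected by the 0.9 point cloud loss coefficient, must reach the process-specific judgment value m; treat it as an illustration of how the decision could be coded, not as the patent's actual equation.

```python
# Hypothetical completion test; the formula below is an assumption, not the published criterion.
def production_finished(s2: int, n: int, m: float, loss_coeff: float = 0.9) -> bool:
    """Return True if the (assumed) corrected share of newly added cubes reaches m."""
    if n == 0:
        return False
    return loss_coeff * s2 / n >= m

# Example: 1 000 cubes in the selected area, 850 of them hold newly added points,
# and the current process requires m = 0.7.
print(production_finished(s2=850, n=1000, m=0.7))
```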
The information unit is used for integrating the information obtained at the production site and adding it to the BIM model. The production site information obtained by the manual filling unit, the positioning unit and the progress unit is collated: for a given production area, the manual filling unit provides the production personnel, duration, materials and quality inspection information; the positioning unit provides the placement position of each prefabricated part; the progress unit provides the production stage and production progress of each prefabricated part; and a BIM model can be established from the retained points. After the information is collated, a new BIM model is established, the information is imported into it, and the production site information is provided to the 4D unit for establishing the 4D BIM model.
The 4D unit is used for establishing a 4D BIM model, simulating the process of installing the prefabricated parts on the construction site in a virtual environment, and finding in advance possible problems in the subsequent on-site installation. A 4D BIM model is established from the information collated by the information unit to form a simulation animation, which is used to check whether the component dimensions meet the requirements, whether the positions of the embedded parts are correct, and so on.
Fig. 2 is a schematic flow chart of a method for analyzing the production progress of prefabricated parts of an assembly type building based on BIM and digital twins according to an embodiment of the present invention.
At the prefabricated part production site, the acquisition module collects site information, including constructor, duration, material and quality inspection information, production site video, and the production site three-dimensional point cloud. The three-dimensional point cloud is then imported into the identification module, where the point cloud is identified, the production stage of the prefabricated part is judged, and the position of the prefabricated part is located. Finally, the three-dimensional point cloud is imported into the progress analysis module, where it is denoised and miscellaneous points are removed, the progress is compared and analyzed against the original BIM model, the information collected at the production site is integrated, the information in the BIM model is updated and corrected, a 4D BIM model is established, the on-site installation process is simulated, and possible problems are found in advance.
The embodiments described in this specification are merely illustrative of implementation forms of the inventive concept, and the scope of the present invention should not be considered limited to the specific forms set forth in the embodiments, but also includes equivalent technical means that can be conceived by those skilled in the art based on the inventive concept.

Claims (6)

1. A prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning is characterized by comprising an acquisition module, a recognition module and a progress analysis module;
the acquisition module is used for acquiring field manual filling information and video monitoring information and acquiring field three-dimensional point cloud information by using a three-dimensional laser scanning technology.
The identification module is used for positioning the position of the production pedestal, identifying the three-dimensional point cloud based on a convolutional neural network algorithm, outputting an identification tag and judging the production stage of the prefabricated part.
The progress analysis module is used for correlating the production stage of the prefabricated part obtained by the identification module with the BIM model and correcting the information in the BIM model; the method comprises the following specific steps:
registering the three-dimensional point cloud and the BIM model and unifying their coordinates; selecting an area requiring progress analysis in the three-dimensional point cloud, where the average distance between points in the point cloud is a; setting a size o with 2a ≤ o ≤ 3a, and dividing the area into unit cubes with side length o, giving n cubes in total;
calculating the distances between all points in any cube i and constructing a Delaunay triangulation in cube i to form a triangulation network model; calculating the volume of each tetrahedron in the triangulation network model and summing them to obtain its total volume V_t, and then calculating the spatial saturation w_i of cube i:
[Formula: spatial saturation w_i of cube i (published only as an image)]
For cube i, the following equation applies:
[Formula: definition of the indicator p_i for cube i (published only as an image)]
if p_i = 1, the points in cube i are valid points; all points in cube i are retained, and their coordinate set is R_i; if p_i = 0, all points within cube i are deleted; this continues until all n cubes have been traversed, and the number of cubes containing valid points is counted as S_1;
In the BIM model, for the same selected region, dividing the region into n cubes with the same size and coordinates as for the three-dimensional point cloud; for each cube in the BIM model, finding the corresponding cube from the three-dimensional point cloud segmentation, traversing each coordinate in its coordinate set, judging whether the coordinate is occupied in the BIM model, and counting the number S_2 of cubes in which unoccupied coordinates account for more than 70% of the coordinate set, the points in these cubes being newly added points;
setting different judgment values m for different production procedures; for the selected areas, if
[Formula: progress criterion relating S_1, S_2 and the judgment value m (published only as an image)]
The production of the prefabricated parts in the selected area is judged to be finished, otherwise, the production is judged not to be finished.
2. The BIM and three-dimensional laser scanning-based prefabricated part production progress analysis system as claimed in claim 1, wherein the acquisition module comprises:
and the manual filling unit is used for acquiring the production information of the manual filling site.
And the video monitoring unit is used for shooting the field production information.
And the three-dimensional point cloud unit acquires the on-site three-dimensional point cloud information by using a three-dimensional scanning technology.
3. The BIM and three-dimensional laser scanning-based prefabricated part production progress analysis system according to claim 1, wherein the recognition module comprises:
the point cloud identification unit adopts a PointNet + + network structure, and each sub-network consists of 6 edge convolution layers, 1 MLP layer and 1 maximum pooling layer; after the point cloud is input, the original features of each point in the point cloud are mapped to a high-dimensional feature space. And searching each sampling point by adopting a k nearest neighbor algorithm to obtain k field groups, extracting deep semantic geometric features of each group to obtain new point clouds, repeating the operation once again, recursing the whole point clouds, obtaining classification scores through a full connection layer and a SoftMax function, and outputting identification labels.
And a positioning unit, used for locating the prefabricated part from the position of the camera, wherein the identification result of the point cloud identification unit is matched to the position of the corresponding production pedestal.
4. The prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning as claimed in claim 1, wherein the specific process of constructing the Delaunay triangulation is as follows: the two closest points are selected as the initial edge of the Delaunay triangle; starting from the two end points of the initial edge, the remaining points are traversed, and the point forming the smallest cosine of the included angle with the initial edge is taken as the third vertex to form the first Delaunay triangle; the edge of the first Delaunay triangle that forms the smallest cosine of the included angle with the initial edge is taken as a new initial edge, and further Delaunay triangles are constructed in the same way; triangle formation stops when the two end points of the first initial edge are again the end points.
5. The BIM and three-dimensional laser scanning-based precast element production progress analysis system as claimed in claim 1, wherein the three-dimensional point cloud is denoised and the miscellaneous points in the three-dimensional point cloud are removed before the registration of the three-dimensional point cloud and the BIM model.
6. The prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning as claimed in claim 1, wherein after the progress analysis module obtains the progress analysis result, the information obtained at the production site is integrated and added to the BIM model; the production site information obtained by the acquisition module, the identification module and the progress analysis module is collated; for a given production area, the production personnel, duration, materials and quality inspection information are obtained by the acquisition module, the placement position of each prefabricated part is obtained by the identification module, and the production stage and production progress of each prefabricated part are obtained by the progress analysis module; a BIM model is established according to the effective points, the BIM model is updated after the information is collated, and the information is imported into the BIM model.
CN202211138810.4A 2022-09-19 2022-09-19 Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning Pending CN115545426A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211138810.4A CN115545426A (en) 2022-09-19 2022-09-19 Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211138810.4A CN115545426A (en) 2022-09-19 2022-09-19 Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning

Publications (1)

Publication Number Publication Date
CN115545426A true CN115545426A (en) 2022-12-30

Family

ID=84727797

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211138810.4A Pending CN115545426A (en) 2022-09-19 2022-09-19 Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning

Country Status (1)

Country Link
CN (1) CN115545426A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117634006A (en) * 2024-01-26 2024-03-01 新疆三联工程建设有限责任公司 BIM technology-based sleeve embedded engineering management system and method
CN117634006B (en) * 2024-01-26 2024-04-26 新疆三联工程建设有限责任公司 BIM technology-based sleeve embedded engineering management system and method

Similar Documents

Publication Publication Date Title
CN109785337B (en) In-column mammal counting method based on example segmentation algorithm
CN110490415B (en) Building progress assessment method for visual coordination of multiple unmanned aerial vehicles
CN106680798B (en) A kind of identification of airborne LIDAR air strips overlay region redundancy and removing method
CN107154040A (en) A kind of tunnel-liner surface image crack detection method
CN116993928B (en) Urban engineering mapping method and system based on unmanned aerial vehicle remote sensing technology
CN107341508B (en) Fast food picture identification method and system
CN111652835A (en) Method for detecting insulator loss of power transmission line based on deep learning and clustering
CN111612846A (en) Concrete crack width measuring method based on U-net CNN image recognition and pixel calibration
CN111369526B (en) Multi-type old bridge crack identification method based on semi-supervised deep learning
CN116305436A (en) Existing bridge monitoring method based on combination of three-dimensional laser scanning and BIM
CN115545426A (en) Prefabricated part production progress analysis system based on BIM and three-dimensional laser scanning
CN111709775A (en) House property price evaluation method and device, electronic equipment and storage medium
CN115861816A (en) Three-dimensional low vortex identification method and device, storage medium and terminal
CN113870326B (en) Structural damage mapping, quantifying and visualizing method based on image and three-dimensional point cloud registration
CN114758127A (en) Urban scene garbage detection system based on big data
WO2024125434A1 (en) Regional-consistency-based building principal angle correction method
CN112557506B (en) Method, system, terminal and storage medium for supervising road surface characteristics by adopting unmanned aerial vehicle
CN106683131A (en) City component automation measurement method
CN117496073A (en) Method and system for constructing multi-time-phase live-action three-dimensional model
CN117541594A (en) Double-non-maximum-suppression transverse wind ridging small target detection method and system
CN116863134A (en) Method and system for detecting and dividing length and width of tunnel lining crack
CN116028660A (en) Weight value-based image data screening method, system and medium
CN110068279B (en) Prefabricated part plane circular hole extraction method based on point cloud data
CN108197134B (en) Automatic point group target synthesis algorithm supported by big data
CN111310853A (en) Single recognition algorithm based on neural network and elevation fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination