CN116612091A - Construction progress automatic estimation method based on multi-view matching - Google Patents


Info

Publication number
CN116612091A
CN116612091A (application CN202310589437.2A)
Authority
CN
China
Prior art keywords
construction
dimensional model
model
view
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310589437.2A
Other languages
Chinese (zh)
Inventor
姚青松
杨语嫣
郭小龙
刘宝宝
虎雷
常逢灏
梁永强
马建峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Xi'an Cloud Vein Intelligent Technology Co ltd
Xidian University
Original Assignee
Xi'an Cloud Vein Intelligent Technology Co ltd
Xidian University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Xi'an Cloud Vein Intelligent Technology Co ltd, Xidian University filed Critical Xi'an Cloud Vein Intelligent Technology Co ltd
Priority to CN202310589437.2A
Publication of CN116612091A
Legal status: Pending


Classifications

    • G06T7/001 Industrial image inspection using an image reference approach
    • G06Q10/0633 Workflow analysis
    • G06Q50/08 Construction
    • G06T17/20 Finite element generation, e.g. wire-frame surface description, tessellation
    • G06T19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T5/70 Denoising
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration, using feature-based methods
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06V10/7715 Feature extraction, e.g. by transforming the feature space
    • G06T2219/2008 Assembling, disassembling

Abstract

The application discloses a method for automatically estimating construction progress based on multi-view matching, comprising the following steps: determining a three-dimensional model corresponding to a construction site; registering a pre-designed building information model (BIM) with the three-dimensional model so that the two are rotationally aligned; determining, for each construction device in the BIM, its corresponding position on the three-dimensional model, and splitting the three-dimensional model into individual devices; denoising the reconstructed surface of each split device with a three-dimensional mesh denoising method; determining view pictures of the three-dimensional model from several angles, and determining a similarity matrix for the view pictures; and determining the construction progress of the different devices from the similarity matrix. With the scheme provided by the embodiments of the application, the construction progress of each area of the construction site can be obtained automatically, reducing subjective human interference.

Description

Construction progress automatic estimation method based on multi-view matching
Technical Field
The application relates to the technical field of machine vision, in particular to a method for automatically estimating construction progress based on multi-view matching.
Background
All engineering construction projects require construction progress monitoring, and a scientific monitoring method is a precondition for the smooth implementation of the project. Traditional progress monitoring relies mainly on manual on-site inspection against the construction schedule; this approach is inefficient, labor-intensive, leaves large blind areas, and cannot meet fast-paced industry demands. With the development of unmanned aerial vehicle (UAV) technology, methods that use UAVs to monitor construction progress have also appeared, but they mainly rely on manually matching the relevant construction conditions, which remains error-prone and labor-consuming.
In recent years, with the development of three-dimensional reconstruction technology, it has become possible to reconstruct a three-dimensional mesh or point cloud model of the actual scene from UAV aerial photos, and to obtain the basic construction situation by comparing the relevant design model with the real-scene model. Current methods for estimating construction progress with reconstructed models fall broadly into the following categories: acquiring a construction-site point cloud through a UAV and three-dimensional reconstruction, splitting the construction area into individual buildings, obtaining each building's elevation from the construction drawings, and comparing elevations to obtain the corresponding progress; reconstructing the site point cloud with UAV oblique photography, voxelizing it, and monitoring progress by comparing site models from different periods; and acquiring scene pictures with a camera, deriving point clouds from them, registering the site building point cloud with the planned BIM point cloud, and perceiving progress from the registration.
At present, construction progress monitoring based on point cloud registration has drawbacks: when the construction scale is large and the site data are complex, accurate registration between the site point cloud and the BIM design-model point cloud is hard to achieve, the basic construction situation obtained from point clouds is prone to deviation, and construction progress is difficult to measure.
Disclosure of Invention
To make up for the defects of the prior art, the application provides a method for automatically estimating construction progress based on multi-view matching, which can automatically obtain the construction progress of each area of a construction site and reduce subjective human interference. The specific scheme is as follows:
the embodiment of the application provides a method for automatically estimating construction progress based on multi-view matching, which comprises the following steps:
determining a three-dimensional model corresponding to a construction site;
registering a pre-designed building information model (BIM) with the three-dimensional model so that the two are rotationally aligned;
determining, for each construction device in the BIM, its corresponding position on the three-dimensional model, and splitting the three-dimensional model into individual devices;
denoising the reconstructed surface of each device with a three-dimensional mesh denoising method after the split;
determining view pictures of the three-dimensional model from several angles, and determining a similarity matrix for the view pictures;
and determining the construction progress of the different devices from the similarity matrix.
Optionally, determining the three-dimensional model corresponding to the construction site includes:
planning a UAV flight route, laying out corresponding image control points, acquiring the aerial images of the construction site shot by the UAV, and performing three-dimensional reconstruction of the site with the image control points and the images of the relevant positions, to generate a three-dimensional model of the construction site, the model containing the positions of the different devices on site and the corresponding external construction information.
Optionally, registering the pre-designed building information model BIM with the three-dimensional model so that the two are rotationally aligned includes:
obtaining the three principal directions of the pre-designed BIM and of the three-dimensional model with principal component analysis (PCA);
projecting each three-dimensional point of the models onto the three principal directions to obtain the point projections;
obtaining the 8 corner points of each model's OBB (oriented bounding box) from the extrema of the point projections;
rotationally aligning the BIM with the three-dimensional model based on the 8 corner points;
and registering the length and width of the BIM and three-dimensional model bounding boxes through scaling.
Optionally, splitting the three-dimensional model into individual devices includes:
labeling the electrical and civil engineering devices in the BIM one by one, so that the different devices in the BIM are presented individually;
traversing each device in the BIM, determining the OBB bounding box of the corresponding electrical or civil engineering device, and obtaining its 8 coordinate points;
storing, for every device that needs to be split out, the coordinates of the 4 bottom-face points and the name of the three-dimensional model into a txt file, to obtain the polygon points to be segmented;
and calling a corresponding 3dmax script, and splitting the three-dimensional model into individual devices based on the polygon points to be segmented.
Optionally, calling the corresponding 3dmax script and splitting the three-dimensional model based on the polygon points to be segmented includes:
reading the three-dimensional model of the construction site to be split and the txt file of the split-module point sets;
traversing all the modules to be segmented in the file, obtaining the segmentation point set of each module, and generating from it the polygon face to be segmented;
and extruding the polygon to be segmented upward into a three-dimensional solid, obtaining the segmented part by Boolean computation, and selecting the vertices and triangular mesh faces to be split off, thereby realizing automatic splitting.
Optionally, determining the view pictures of the three-dimensional model includes:
setting the Blender background color and image size, setting the render engine and an orthographic camera, and setting the camera's shooting range and height;
traversing the different devices of the designed BIM model and the different devices of the construction site, and importing them into the scene;
and shooting a top view, a front view, a left view, and two oblique 45-degree views of each device.
Optionally, determining the similarity matrix of the view pictures includes:
inputting the original image I and the template image T as views of a unified color;
sliding the template T over the image I, where sliding means moving the template picture one pixel at a time and performing the metric computation at each position, so as to judge the similarity between the template at the current pixel and the corresponding region of the original image, obtaining the similarity values for the different pixel positions;
for each position of the template T over I, saving the computed metric value into a result matrix R, so that R contains the matching metric value for every position;
searching for the extremum in the result matrix R, where, since a standard correlation matching criterion is used, the maximum matching value can be taken as the similarity value of the image match;
and traversing the different views of every device whose construction progress is to be measured, to obtain the similarity matrix s[i][j], where i is the model number and j is the view number.
Optionally, determining the construction progress of the different devices according to the similarity matrix includes:
dividing the construction progress estimate into 3 stages: construction completed, under construction, and not yet started;
when the similarities of both the left view and the front view are greater than 60% and the overall similarity mean is greater than 70%, determining that the device construction is completed;
when the matching similarities of the left view and the front view are below 30%, determining that the device construction has not yet started;
and when the matching similarities of the left view and the front view are above 30% but the completion criterion is not met, identifying whether the model is a soil pile with a classifier trained on soil-pile pictures: if it is a soil pile, the device is classified as not yet started; otherwise it is classified as under construction.
According to the method for automatically estimating construction progress based on multi-view matching, the construction progress of the different devices on a construction site is obtained through multi-view comparative analysis between the three-dimensional model of the site reconstructed by oblique photography and the BIM design model. The application improves the matching precision by processing the two models in several stages, offers a high degree of visualization and automation, and remedies the heavy workload and high error rate of existing construction progress estimation methods. In addition, the method is applicable to lower-precision UAVs, reduces working cost, and has good application prospects in construction scenes.
Drawings
FIG. 1 is a general flow chart of the present application;
FIG. 2 is a flow chart of a model registration module;
FIG. 3 is a flow chart of an auto-split module;
FIG. 4 is a flowchart of a surface smoothing process;
FIG. 5 is a flow chart of construction progress prediction;
FIG. 6 is a schematic diagram of the model after smoothing;
FIG. 7 is a schematic diagram of a multi-view of the apparatus;
FIG. 8 is a schematic view of similarity values for the construction completion section;
FIG. 9 is a schematic view of similarity values for the construction-unfinished section.
Detailed Description
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present application. The present application may be embodied in many other forms than those herein described, and those skilled in the art will readily appreciate that the present application may be similarly embodied without departing from the spirit or essential characteristics thereof, and therefore the present application is not limited to the specific embodiments disclosed below.
Example 1
A method for automatically estimating construction progress based on multi-view matching comprises the following steps:
step 1, planning the UAV flight route, laying out corresponding image control points, acquiring the aerial images of the construction site, and performing three-dimensional reconstruction of the site with the image control points and the images of the relevant positions, to generate a three-dimensional mesh model containing the positions of the different devices on site and the corresponding external construction information;
step 2, registering the planned BIM model with the on-site reconstruction model so that the two are rotationally aligned;
step 3, determining the corresponding position of each construction device in the planned BIM model, and splitting the on-site reconstruction model into individual devices with a 3dmax script;
step 4, denoising the reconstructed surface of each split device with an L0-norm three-dimensional mesh denoising method;
step 5, automatically acquiring 5 main view pictures of each model with a Blender script;
step 6, obtaining a similarity matrix over the five views of each device with the modified image template matching algorithm;
step 7, determining the construction progress of the different devices from the correlation similarity thresholds.
Further, the specific steps of the registration method in step 2 are as follows:
step 2.1, obtaining the three principal directions of the two models with principal component analysis (PCA);
step 2.2, projecting the model points onto the three principal directions;
step 2.3, obtaining the 8 corner points of each model's OBB bounding box from the extrema of the point projections;
step 2.4, rotationally aligning the planned BIM model with the real-scene reconstruction model;
step 2.5, registering the length and width of the two model bounding boxes through scaling.
Further, splitting the model into individual devices in step 3 mainly requires calling a 3dmax splitting script, with the following specific steps:
step 3.1, labeling the electrical and civil engineering devices in the planned BIM model one by one, so that the different devices in the design model are presented individually;
step 3.2, traversing each device in the BIM design model, determining the OBB bounding box of the corresponding electrical or civil engineering device, and obtaining its 8 coordinate points;
step 3.3, storing the coordinates of the 4 bottom-face points and the model name of every device to be split into a txt file, to obtain the polygon points to be segmented;
step 3.4, calling the corresponding 3dmax script to split the model.
Further, the splitting script in 3dmax is divided into the following sub-steps:
step 3.4.1, reading the construction-site model to be split and the txt file of the split-module point sets;
step 3.4.2, traversing all the modules to be segmented in the file, obtaining the segmentation point set of each module, and generating from it the polygon face to be segmented;
step 3.4.3, extruding the polygon to be segmented upward into a three-dimensional solid, obtaining the segmented part by Boolean computation, and selecting the vertices and triangular mesh faces to be split off, thereby realizing automatic splitting.
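Selecting which vertices fall inside the split polygon (step 3.4.3) reduces, on the ground plane, to a 2D point-in-polygon test against the 4 stored bottom-face points. A minimal ray-casting sketch of that test (our own illustrative helper in Python, not the patent's 3dmax script):

```python
def point_in_polygon(pt, poly):
    """Even-odd ray casting: is 2D point pt inside polygon poly?"""
    x, y = pt
    inside = False
    n = len(poly)
    for i in range(n):
        (x1, y1), (x2, y2) = poly[i], poly[(i + 1) % n]
        # Count crossings of the horizontal ray running right from pt.
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# The 4 bottom-face OBB points stored in the txt file for one device.
quad = [(0.0, 0.0), (4.0, 0.0), (4.0, 2.0), (0.0, 2.0)]
print(point_in_polygon((1.0, 1.0), quad), point_in_polygon((5.0, 1.0), quad))
```

Vertices whose ground-plane projection passes this test belong to the device being split off; the Boolean computation then separates the corresponding triangular faces.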
Further, the specific steps of the reconstruction-model smoothing in step 4 are as follows:
step 4.1, obtaining the vertex coordinates p of the model to be smoothed;
step 4.2, setting up the optimized differential operator D(e) applicable to the mesh edges;
step 4.3, introducing the regularization term
R(p) = (p₁ − p₂ + p₃ − p₄)²;
step 4.4, so that the optimization objective becomes
min_{p,δ} ‖p − p*‖² + α‖R(p)‖² + β‖D(p) − δ‖² + λ‖δ‖₀;
step 4.5, fixing p and optimizing δ, i.e.
min_δ β‖D(p) − δ‖² + λ‖δ‖₀,
whose closed-form solution is δᵢ = 0 when ‖Dᵢ(p)‖² < λ/β, and δᵢ = Dᵢ(p) otherwise;
step 4.6, fixing δ and optimizing p, i.e.
min_p ‖p − p*‖² + α‖R(p)‖² + β‖D(p) − δ‖²;
step 4.7, iterating the two sub-problems in a loop until β ≥ 10³.
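The alternating scheme of steps 4.5-4.7 can be illustrated on a 1D signal, taking D as the discrete second difference and dropping the regularization term R (a simplified sketch of the L0-minimization idea in NumPy, not the mesh implementation; parameter values are our own assumptions):

```python
import numpy as np

def l0_smooth_1d(p_star, lam=0.01, beta0=1e-2, kappa=2.0, beta_max=1e3):
    """Alternate the delta-update and p-update until beta >= beta_max (step 4.7)."""
    p_star = np.asarray(p_star, float)
    n = len(p_star)
    # D: discrete second-difference operator, standing in for the edge operator D(e).
    D = (np.diag(np.full(n, -2.0)) + np.diag(np.ones(n - 1), 1)
         + np.diag(np.ones(n - 1), -1))[1:-1]
    I = np.eye(n)
    p, beta = p_star.copy(), beta0
    while beta < beta_max:
        d = D @ p
        d[d ** 2 < lam / beta] = 0.0              # closed-form delta update (step 4.5)
        # p update (step 4.6): minimize |p - p*|^2 + beta|D p - d|^2 in closed form.
        p = np.linalg.solve(I + beta * D.T @ D, p_star + beta * D.T @ d)
        beta *= kappa                             # anneal beta upward
    return p

# Noisy step signal: the scheme pushes it back toward a piecewise-constant shape.
rng = np.random.default_rng(0)
clean = np.concatenate([np.zeros(20), np.ones(20)])
noisy = clean + 0.05 * rng.standard_normal(40)
print(round(float(np.abs(l0_smooth_1d(noisy) - clean).mean()), 4))
```

Signals on which D vanishes (constants, linear ramps) are exact fixed points of the iteration, which is the 1D analogue of the flat mesh regions the L0 term preserves.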
Further, the specific steps of automatically acquiring the views in step 5 are as follows:
step 5.1, setting the Blender background color and image size, setting the render engine and an orthographic camera, and setting the camera's shooting range and height;
step 5.2, traversing the different devices of the designed BIM model and the different devices of the construction site, and importing them into the scene;
step 5.3, obtaining a top view, a front view, a left view, and two oblique 45-degree views of each device.
Further, the specific steps of the view-matching algorithm that obtains the view similarity values in step 6 are as follows:
step 6.1, inputting the original image I and the template image T as views of a unified color;
step 6.2, sliding the template T over the image I, where sliding means moving the template picture one pixel at a time and performing the metric computation at each position, so as to judge the similarity between the template at the current pixel and the corresponding region of the original image. The metric here uses the correlation-coefficient matching criterion (TM_CCOEFF):
R(x, y) = Σ_{x′,y′} T′(x′, y′) × I′(x + x′, y + y′),
where T′ and I′ are the mean-subtracted template and image patch; from this, the similarity values for the different pixel positions are obtained;
step 6.3, for each position of the template T over I, saving the computed metric value into a result matrix R, so that R contains the matching metric value for every position;
step 6.4, searching for the extremum in the result matrix R, where, since a standard correlation matching criterion is used, the maximum matching value can be taken as the similarity value of the image match;
step 6.5, traversing the different views of every device whose construction progress is to be measured with the above algorithm, to obtain the similarity matrix s[i][j], where i is the model number and j is the view number.
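Step 6 closely mirrors OpenCV's cv2.matchTemplate with a correlation-coefficient criterion. A dependency-free NumPy sketch of the sliding metric (normalized so a perfect match scores 1, as in TM_CCOEFF_NORMED; the toy image and template below are our own, not from the patent):

```python
import numpy as np

def match_template_ccoeff(image, templ):
    """Slide templ over image; return the correlation-coefficient map R (step 6.3)."""
    th, tw = templ.shape
    t = templ - templ.mean()                      # T': mean-subtracted template
    rows = image.shape[0] - th + 1
    cols = image.shape[1] - tw + 1
    R = np.zeros((rows, cols))
    for y in range(rows):                         # slide one pixel at a time (step 6.2)
        for x in range(cols):
            patch = image[y:y + th, x:x + tw]
            p = patch - patch.mean()              # I': mean-subtracted patch
            denom = np.sqrt((t ** 2).sum() * (p ** 2).sum())
            R[y, x] = (t * p).sum() / denom if denom else 0.0
    return R

# Toy scene: a 'plus' template pasted into an empty image at row 2, col 3.
templ = np.array([[0., 1., 0.], [1., 1., 1.], [0., 1., 0.]])
image = np.zeros((8, 8))
image[2:5, 3:6] = templ
R = match_template_ccoeff(image, templ)
best = np.unravel_index(R.argmax(), R.shape)      # maximum = similarity value (step 6.4)
print(best, round(float(R.max()), 3))
```

The maximum of R over all positions plays the role of the per-view similarity value s[i][j] in step 6.5.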
Further, the specific steps of construction progress estimation in step 7 are as follows:
The construction progress is estimated in 3 stages according to the construction condition: construction completed, under construction, and not yet started.
Whether the device construction is completed is judged first, on the following basis:
device construction is determined to be completed only when the similarities of both the left view and the front view are greater than 60% and the overall similarity mean is greater than 70%.
The basis for distinguishing under construction from not yet started is as follows:
when the matching similarities of the left view and the front view are below 30%, the device construction has not yet started.
For the cases where the similarity values are above 30% but the completion criterion is not met, whether the model is a soil pile is identified with a classifier trained on soil-pile pictures: if it is a soil pile, the device is classified as not yet started; otherwise it is classified as under construction.
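As a compact restatement, the threshold rules above can be written as a small Python classifier (a sketch under our reading that the 60%/30% thresholds apply to the left and front views each; the soil-pile recognizer, which the patent trains on soil-pile pictures, is stubbed as a boolean flag):

```python
def estimate_progress(similarities, is_soil_pile=False):
    """Classify one device from its 5-view similarity values (fractions in [0, 1])."""
    left, front = similarities['left'], similarities['front']
    mean = sum(similarities.values()) / len(similarities)
    # Completed: left and front views above 60% and overall mean above 70%.
    if left > 0.60 and front > 0.60 and mean > 0.70:
        return 'completed'
    # Not started: left and front matching similarity below 30%.
    if left < 0.30 and front < 0.30:
        return 'not started'
    # Above 30% but not complete: a soil pile indicates earthworks only.
    return 'not started' if is_soil_pile else 'under construction'

views = {'top': 0.80, 'front': 0.75, 'left': 0.70, 'oblique1': 0.65, 'oblique2': 0.72}
print(estimate_progress(views))  # this device meets the completion thresholds
```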
Example two
The present application will be described in more detail below by way of more specific embodiments.
The following description of the embodiments of the present application will be made clearly and completely with reference to the accompanying drawings, in which it is apparent that the embodiments described are only some embodiments of the present application, but not all embodiments. All other embodiments, which can be made by those skilled in the art based on the embodiments of the application without making any inventive effort, are intended to be within the scope of the application.
The application provides a construction progress automatic estimation method based on multi-view matching, which uses three-dimensional reconstruction, model registration, automatic splitting, surface smoothing, view matching and the like to estimate construction progress through comparison of the construction-site reconstruction model with the planned BIM model. Referring to fig. 1, the method comprises the following steps:
step 1, planning the UAV flight route, laying out corresponding image control points, acquiring the aerial images of the construction site, and performing three-dimensional reconstruction of the site with the image control points and the images of the relevant positions, to generate a three-dimensional mesh model containing the positions of the different devices on site and the corresponding external construction information;
step 2, registering the planned BIM model with the on-site reconstruction model so that the two are rotationally aligned;
step 3, determining the corresponding position of each construction device in the planned BIM model, and splitting the on-site reconstruction model into individual devices with a 3dmax script;
step 4, denoising the reconstructed surface of each split device with an L0-norm three-dimensional mesh denoising method;
step 5, automatically acquiring 5 main view pictures of each model with a Blender script;
step 6, obtaining a similarity matrix over the five views of each device with the modified image template matching algorithm;
and step 7, determining the construction progress of the different devices from the correlation similarity thresholds.
In the above embodiment, fig. 1 shows the overall framework of the method for automatically estimating construction progress based on multi-view matching. The model registration module, the automatic splitting module and the smoothing module provide model data support for construction progress estimation and improve the final estimation precision. The model registration module is realized mainly with the OBB bounding box; through position matching it enables accurate progress estimation for the device at the corresponding position, so estimation confusion is unlikely to arise. The automatic splitting module is responsible for automatically splitting off, by position, the devices whose progress needs to be estimated on the construction site, which facilitates the subsequent per-device comparison for obtaining progress. The smoothing module performs surface smoothing of the reconstructed model, which is needed because models reconstructed from UAV oblique photography suffer from surface roughness and similar problems that affect progress estimation. The progress estimation module computes similarity matching values for the different views of each model with a template matching algorithm, and obtains the construction state of each device by evaluating the similarity values.
The model registration module is illustrated in fig. 2. The flow of step 2 is as follows:
First, the two models to be registered are input, and their three principal directions, i.e. the x, y and z directions, are obtained with principal component analysis (PCA); both models are triangular mesh models. The vertices of the two three-dimensional models are projected onto the principal axes, the boundary values are obtained, and the 8 corner points of each OBB bounding box are determined. The two bounding boxes are then rotationally aligned; since the height of the construction-site reconstruction model differs from that of the planned BIM model, registration is performed by ground alignment, i.e. length-width registration, and finally the registered models are output for subsequent processing.
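The PCA and OBB corner computation at the core of this module can be sketched in a few lines of NumPy (an illustrative sketch only; function and variable names are our own, and a real mesh would supply its vertex array here):

```python
import numpy as np

def obb_corners(points):
    """8 corners of an oriented bounding box from PCA of a vertex cloud."""
    mean = points.mean(axis=0)
    centered = points - mean
    # The three principal directions are the right-singular vectors (step 2.1).
    _, _, axes = np.linalg.svd(centered, full_matrices=False)
    proj = centered @ axes.T                      # project vertices onto axes (step 2.2)
    lo, hi = proj.min(axis=0), proj.max(axis=0)   # extrema of the projections
    # Enumerate the 8 combinations of per-axis extrema (step 2.3).
    corners = np.array([(x, y, z) for x in (lo[0], hi[0])
                                  for y in (lo[1], hi[1])
                                  for z in (lo[2], hi[2])])
    return corners @ axes + mean                  # back to world coordinates

# Toy vertex cloud: an axis-aligned 4 x 2 x 1 box recovers its own 8 corners.
box = np.array([(x, y, z) for x in (0., 4.) for y in (0., 2.) for z in (0., 1.)])
print(sorted(map(tuple, np.round(obb_corners(box), 6))))
```

Computing the two OBBs this way gives the corner sets used for the rotational alignment and the subsequent length-width scaling.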
FIG. 3 is a flow chart illustrating a model splitting module. The automatic splitting module of the model is mainly divided into two parts, wherein the first part is to acquire a device point set needing to be split in the planning BIM design model, and the second part is to realize automatic splitting. The implementation steps for the set of device points to be split in the first partial acquisition plan BIM design model are approximately as follows:
(1) Label the electrical and civil engineering equipment in the planned BIM one by one, so that each device is presented individually in the design model;
(2) Traverse each device in the BIM design model, determine the OBB bounding box of the corresponding electrical or civil engineering equipment, and obtain its 8 corner coordinates;
(3) For every device that needs to be split out individually, store the coordinates of the 4 bottom-face corner points together with the model name in a txt file, yielding the polygon points to be segmented.
The second part calls a splitting script in 3dmax to perform the automatic split, with the following main steps:
(1) Read the construction-site model to be split and the txt file containing the point sets of the modules to be split;
(2) Traverse all modules to be segmented in the file, obtain the segmentation point set of each module, and generate the polygon face to be split from the point set;
(3) Extrude the polygon to be split upward into a three-dimensional solid, compute the part to be split off by a Boolean operation, and select the vertices and triangular mesh faces to be split out, completing the automatic splitting function.
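As a sketch of the selection step, the bottom-face polygon read back from the txt file can be used to test which mesh vertices fall inside a device's footprint. The ray-casting helper below is illustrative only; it is not the actual 3dmax script, which performs the equivalent selection via Boolean operations.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is the 2D point (x, y) inside the footprint polygon?

    polygon: list of (x, y) vertices in order, e.g. the 4 bottom-face
    corners of a device's OBB as stored in the txt file.
    """
    inside = False
    n = len(polygon)
    j = n - 1
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Count crossings of a horizontal ray cast to the right of (x, y).
        if (yi > y) != (yj > y):
            x_cross = xi + (y - yi) * (xj - xi) / (yj - yi)
            if x < x_cross:
                inside = not inside
        j = i
    return inside
```

Vertices whose (x, y) footprint passes this test (and the triangles they form) belong to the device being split out.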
The flow of the reconstructed-model smoothing process is shown in fig. 4. This part adopts a triangular mesh denoising algorithm based on L0-norm minimization (L0 Minimization) to smooth the model. The main flow of the algorithm is:
(1) Obtain the vertex coordinates p* of the input model to be smoothed;
(2) Define an optimized differential operator D(e) applicable to the mesh edges;
(3) Introduce the regularization term

R(p) = (p₁ − p₂ + p₃ − p₄)²,

evaluated over the four vertices p₁, p₂, p₃, p₄ associated with each mesh edge;
(4) The optimization objective becomes

min_{p,δ} |p − p*|² + α|R(p)|² + β|D(p) − δ|² + λ|δ|₀;
(5) Fix p and optimize δ, i.e. solve

min_δ β|D(p) − δ|² + λ|δ|₀,

which has the closed-form solution δᵢ = 0 when (Dᵢ(p))² < λ/β, and δᵢ = Dᵢ(p) otherwise;
(6) Fix δ and optimize p, i.e. solve

min_p |p − p*|² + α|R(p)|² + β|D(p) − δ|²;
(7) Increase β and repeat steps (5) and (6) until β ≥ 10³;
(8) Finish the surface smoothing and output the smoothed triangular mesh model p.
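A minimal one-dimensional analogue of steps (4)-(7) can illustrate the alternating scheme. Here D is the forward difference, the term R(p) is dropped (α = 0) for brevity, and the β schedule (doubling until β ≥ 10³) follows step (7). All names and parameter values are illustrative assumptions, not the patent's actual settings.

```python
import numpy as np

def l0_smooth_1d(p_star, lam=0.01, beta=1e-3, kappa=2.0, beta_max=1e3):
    """Alternating minimization of |p - p*|^2 + beta|D(p) - delta|^2 + lam|delta|_0.

    1-D analogue of the mesh case: D is the forward difference, and the
    regularization term R(p) is omitted (alpha = 0).
    """
    p = np.asarray(p_star, dtype=float).copy()
    n = len(p)
    D = np.diff(np.eye(n), axis=0)          # forward-difference matrix
    while beta < beta_max:
        # delta-subproblem (closed form): delta_i = 0 if D_i(p)^2 < lam/beta,
        # otherwise delta_i = D_i(p).
        d = np.diff(p)
        delta = np.where(d ** 2 < lam / beta, 0.0, d)
        # p-subproblem: quadratic, solved from the normal equations
        # (I + beta * D^T D) p = p* + beta * D^T delta.
        A = np.eye(n) + beta * D.T @ D
        p = np.linalg.solve(A, p_star + beta * D.T @ delta)
        beta *= kappa                        # anneal beta until beta >= 10^3
    return p
```

On a noisy step signal this flattens each segment while preserving the jump, which is the behavior the mesh version exploits to smooth surfaces without rounding off sharp edges.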
The basic flow chart of the construction progress estimation module is shown in fig. 5. The module is divided into 3 parts: automatic acquisition of multi-view pictures, view similarity matching, and construction progress estimation. The first part, automatic acquisition of multi-view pictures, proceeds as follows:
(1) Set the background color and size of the blender scene, the rendering engine and orthographic camera, and the shooting range and height of the camera;
(2) Traverse the devices of the design BIM model and the devices on the construction site, importing each into the blender scene set up in step (1);
(3) Call functions in the blender library to automatically capture a top view, a front view, a left view and two oblique 45-degree views of each device, then save and output them.
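The geometry behind the five views can be sketched independently of blender: each view is a camera direction on a sphere around the device. The axis conventions below (z up, "front" on the −y side, "left" on the +x side) are assumptions, as are the azimuths chosen for the two oblique views.

```python
import math

def view_dir(azimuth_deg, elevation_deg):
    """Unit direction from a camera toward an object at the origin,
    for a camera at the given azimuth (around the vertical z axis)
    and elevation (above the horizontal plane)."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    # Camera position on the unit sphere, negated so it points at the origin.
    return (-math.cos(el) * math.cos(az),
            -math.cos(el) * math.sin(az),
            -math.sin(el))

# The five captured views: top, front, left, and two oblique 45-degree views.
VIEWS = {
    "top":       view_dir(0, 90),
    "front":     view_dir(-90, 0),
    "left":      view_dir(180, 0),
    "oblique_a": view_dir(-45, 45),
    "oblique_b": view_dir(-135, 45),
}
```

In the actual pipeline these directions would drive an orthographic camera placed at a fixed distance along the negated direction for each render.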
The second part obtains the view similarity values with an image template matching algorithm, whose specific flow is:
(1) Input the original image I and the template image T with a unified color scheme;
(2) Slide the template T over the image I, moving the template picture one pixel at a time and computing the metric at each position, which judges the similarity between the template and the region of the original image it currently covers. The metric used here is the correlation coefficient matching criterion (TM_CCOEFF),

R(x,y) = Σ_{x',y'} ( T'(x',y') × I'(x+x', y+y') ),

so the similarity values at the different pixel positions are obtained;
(3) For each position of the template T on I, save the computed metric into the result image matrix R, so that R contains a matching metric value for every position;
(4) Search for the extremum of the result matrix R; since a standard correlation matching criterion is used as the metric, the maximum matching value can be taken as the image-matching similarity value;
(5) Apply this algorithm to the different views of every device whose construction progress is to be measured, obtaining a corresponding similarity matrix s[i][j], where i is the model number and j the view number.
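The sliding metric of step (2) can be sketched in plain NumPy. This sketch uses the normalized correlation-coefficient form (OpenCV's TM_CCOEFF_NORMED), so scores land in [−1, 1] and behave like the percentage similarities used later; treating the patent's criterion as the normalized variant is an assumption.

```python
import numpy as np

def match_template_ccoeff(image, template):
    """Correlation-coefficient template matching (TM_CCOEFF_NORMED style).

    Slides `template` over `image` one pixel at a time and returns the
    result matrix R, where R[y, x] is the normalized correlation of the
    mean-subtracted template T' with the mean-subtracted window I'.
    """
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t ** 2).sum())
    R = np.zeros((ih - th + 1, iw - tw + 1))
    for y in range(R.shape[0]):
        for x in range(R.shape[1]):
            w = image[y:y + th, x:x + tw]
            wc = w - w.mean()
            denom = t_norm * np.sqrt((wc ** 2).sum())
            R[y, x] = (t * wc).sum() / denom if denom > 0 else 0.0
    return R

# Step (4): the best-match similarity is the maximum of R:
#   score = R.max()
#   loc = np.unravel_index(R.argmax(), R.shape)
```

A production pipeline would more likely call cv2.matchTemplate, which computes the same quantity far faster; the loop above just makes the metric explicit.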
The third part performs the final construction progress estimation, mapping the similarities to a progress state. According to the construction condition, progress is divided into 3 stages: construction completed, under construction, and not yet started. The specific flow is:
(1) Input similarity matching matrix s
(2) First judge whether a device's construction is complete. The criterion is: the device is judged complete only when the similarities of the left view and the front view both exceed 60% and the mean similarity over all views exceeds 70%.
(3) The criterion distinguishing devices under construction from those not yet started is: when the matching similarities of the left view and the front view are below 30%, the device has not yet been started.
(4) When the similarity value is above 30%, identify whether the model is a soil pile by means of a classifier trained on corresponding soil pile pictures; if it is a soil pile, the device is classified as not yet started, and otherwise as under construction.
(5) For each of the above cases, the state of the device is displayed visually and a corresponding construction progress monitoring report is generated.
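Steps (2)-(4) amount to a small decision rule, sketched below. Applying the 60% threshold to both key views and representing the trained soil-pile classifier by a boolean argument are interpretive assumptions about the text.

```python
def construction_stage(left_sim, front_sim, all_sims, is_soil_pile=False):
    """Three-stage decision rule over view similarities (in percent).

    left_sim / front_sim: left- and front-view similarities;
    all_sims: similarities of all captured views;
    is_soil_pile: stand-in for the trained soil-pile image classifier.
    """
    mean_sim = sum(all_sims) / len(all_sims)
    # Completed: both key views above 60% and overall mean above 70%.
    if left_sim > 60 and front_sim > 60 and mean_sim > 70:
        return "completed"
    # Not started: both key views below 30%.
    if left_sim < 30 and front_sim < 30:
        return "not started"
    # In between: a soil pile means work has not actually begun.
    return "not started" if is_soil_pile else "in progress"
```

The returned label is what the reporting step (5) would then render into the progress monitoring report.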
The implementation effect of the application in this embodiment is as follows:
(1) The model surface smoothing is shown in fig. 6: a rough three-dimensional model is input as in fig. 6 (a), and after processing by the surface smoothing module the smooth three-dimensional model of fig. 6 (b) is obtained, showing good smoothing performance;
(2) The result of automatic view acquisition from the three-dimensional model is shown in fig. 7: the model is centered, images from its five viewing angles are captured, and the generated images match the original model well, demonstrating good automation;
(3) Figs. 8 and 9 show the basic effect of obtaining image similarity values by view matching: for completed construction the matching similarity is about 80%, while for unfinished construction it is generally much lower, so construction progress can be monitored by setting a threshold.
While the application has been described in terms of preferred embodiments, it is not intended to be limiting, but rather, it will be apparent to those skilled in the art that various changes and modifications can be made herein without departing from the spirit and scope of the application as defined by the appended claims.

Claims (8)

1. A method for automatically estimating construction progress based on multi-view matching, the method comprising:
determining a three-dimensional model corresponding to a construction site;
registering a pre-designed building information model BIM with the three-dimensional model to enable the two to be rotationally aligned;
determining, for each construction device on the BIM, its corresponding position information on the three-dimensional model, and carrying out monomerization splitting of the three-dimensional model;
denoising the surface of the reconstructed model by applying a triangular mesh denoising method to each device of the split three-dimensional model;
determining each view-angle picture of the three-dimensional model, and determining a similarity degree matrix of the view-angle pictures;
and determining the construction progress of the different devices according to the similarity degree matrix.
2. The method of claim 1, wherein the determining a three-dimensional model corresponding to a construction site comprises:
planning an unmanned aerial vehicle flight route, laying corresponding image control points, acquiring aerial images corresponding to construction sites shot by the unmanned aerial vehicle, and realizing three-dimensional reconstruction of the construction sites by utilizing the corresponding image control points and images of relevant positions to generate a three-dimensional model corresponding to the construction sites, wherein the three-dimensional model comprises position information corresponding to different equipment of the construction sites and corresponding external construction information.
3. The method of claim 1, wherein registering the pre-designed building information model BIM with the three-dimensional model to rotationally align the two comprises:
the principal component analysis method PCA is utilized to obtain three principal directions of a pre-designed building information model BIM and the three-dimensional model;
projecting each three-dimensional point of the three-dimensional model in the three main directions to obtain each point projection;
obtaining 8 corner points of the OBB bounding box of the three-dimensional model according to the extreme value of the point projection;
aligning the BIM with the three-dimensional model by rotation based on the 8 corner points;
and realizing the length and width registration of the BIM and the three-dimensional model bounding box through scaling.
4. The method of claim 1, wherein said singulating said three-dimensional model comprises:
marking the electrical and civil engineering equipment in the BIM one by one, and performing monomer presentation on different equipment in the BIM;
traversing each device in the BIM, determining an OBB bounding box of corresponding electrical and civil engineering equipment, and obtaining 8 coordinate point values;
storing coordinate values of 4 points in the bottom surface direction of all equipment needing to realize the monomerization splitting and names of the three-dimensional model into a txt file to obtain polygonal points needing to be segmented;
and calling a corresponding 3dmax script, and carrying out monomerization splitting on the three-dimensional model based on the polygonal points needing to be segmented.
5. The method of claim 4, wherein invoking the corresponding 3dmax script, and performing a monomerizing split on the three-dimensional model based on the polygon points for which segmentation is to be completed, comprises:
reading a txt file of the three-dimensional model and the split module point set corresponding to the construction site needing to be split;
traversing all the modules to be segmented in the file, obtaining a segmentation point set in the modules to be segmented, and generating a polygon surface to be segmented by the point set;
extruding the polygon to be split upward into a three-dimensional solid, obtaining the part to be segmented through Boolean calculation, and selecting the vertices and triangular mesh faces to be split out to realize the automatic splitting function.
6. The method of claim 1, wherein the determining the respective perspective pictures of the three-dimensional model comprises:
setting the background color and size of the blender scene, the rendering engine and orthographic camera, and the shooting range and height of the corresponding camera;
traversing the devices of the design BIM model and the devices on the construction site, and importing them into the scene;
capturing a top view, a front view, a left view, and two oblique 45-degree views of the different devices.
7. The method of claim 1, wherein the determining the similarity degree matrix of the view-angle pictures comprises:
a unified color view of the original image I and the template image T is input;
sliding the template T in the image I, wherein the sliding refers to moving the template picture one pixel at a time, and performing measurement calculation at different positions so as to judge the similarity value of a certain area corresponding to the original image in the current pixel and obtain the similarity values corresponding to different pixel positions;
for different positions of the template T covered on the I, saving the calculated measurement values into a result image matrix R, wherein the R contains a matching measurement value corresponding to each position;
searching for the extremum of the result matrix R, wherein, since a standard correlation matching criterion is used as the metric, the maximum matching value can be taken as the similarity value of the image matching;
and traversing different view angles of all equipment needing to measure the construction progress to obtain a corresponding similarity numerical matrix s [ i ] [ j ], wherein i represents a model number and j represents a view number.
8. The method of claim 1, wherein said determining the construction progress of different devices from said similarity degree matrix comprises:
the construction progress estimate is divided into 3 stages: construction completed, under construction, and not yet started;
when the similarity between the left view and the front view is greater than 60% and the overall similarity mean is greater than 70%, determining that the equipment construction is completed;
when the matching similarity of the left view and the front view is below 30%, determining that the equipment is not started temporarily;
when the matching similarity of the left view and the front view is above 30%, identifying whether the model is a soil pile by means of a classifier trained on corresponding soil pile pictures; if it is a soil pile, the device is classified as not yet started, and otherwise as under construction.
CN202310589437.2A 2023-05-24 2023-05-24 Construction progress automatic estimation method based on multi-view matching Pending CN116612091A (en)

Publications (1)

Publication Number Publication Date
CN116612091A true CN116612091A (en) 2023-08-18


Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117808324A (en) * 2024-02-27 2024-04-02 西安麦莎科技有限公司 Building progress assessment method for unmanned aerial vehicle vision coordination



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination