CN114676763A - Construction progress information processing method


Info

Publication number
CN114676763A
Authority
CN
China
Prior art keywords
construction site
key feature
matrix
point
construction
Prior art date
Legal status
Pending
Application number
CN202210248902.1A
Other languages
Chinese (zh)
Inventor
杨海平
陈涛
陈梦月
唐思琪
苟先太
钱照国
Current Assignee
Sichuan Jiaoda Prestressed Engineering Testing Technology Co ltd
Cscec Southwest Consulting Co ltd
Southwest Jiaotong University
Original Assignee
Sichuan Jiaoda Prestressed Engineering Testing Technology Co ltd
Cscec Southwest Consulting Co ltd
Southwest Jiaotong University
Priority date: 2022-03-14
Filing date: 2022-03-14
Publication date: 2022-06-28
Application filed by Sichuan Jiaoda Prestressed Engineering Testing Technology Co ltd, Cscec Southwest Consulting Co ltd, and Southwest Jiaotong University
Priority to CN202210248902.1A
Publication of CN114676763A
Legal status: Pending

Classifications

    • G06F 18/22 - Pattern recognition; analysing; matching criteria, e.g. proximity measures
    • G06F 30/13 - Geometric CAD; architectural design, e.g. computer-aided architectural design [CAAD] of buildings, bridges, landscapes, production plants or roads
    • G06N 3/045 - Neural networks; architectures; combinations of networks
    • G06Q 10/103 - Office automation; workflow collaboration or project management
    • G06Q 50/08 - ICT specially adapted for the construction business sector
    • G06T 17/00 - Three-dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 7/60 - Image analysis; analysis of geometric attributes
    • G06T 2207/10028 - Image acquisition modality: range image; depth image; 3D point clouds

Abstract

The invention discloses a construction progress information processing method comprising the following steps: S1: collecting a plurality of construction site pictures and extracting key feature points from each picture; S2: matching the same key feature points in every two construction site pictures; S3: establishing a virtual three-dimensional image from the same key feature points; S4: comparing the virtual three-dimensional image with a preset model and determining the current construction progress from the comparison information. The method overcomes the drawback that the current construction progress cannot otherwise be checked in real time.

Description

Construction progress information processing method
Technical Field
The invention relates to the technical field of building construction, in particular to a construction progress information processing method.
Background
In modern building engineering, construction progress is an important factor in the overall quality of a project. Traditionally, progress on a construction site is checked by assigning professionals to inspect the site at regular intervals; the inspection work is complex, and the real-time construction progress cannot be tracked. Because the real-time progress is unavailable, site workers may complete follow-on physical work before a supervision instruction has been executed, exposing that work to the risk of demolition and rework and putting the construction schedule out of control.
Disclosure of Invention
The invention aims to provide a construction progress information processing method that remedies the inability to check construction progress in real time.
The technical scheme for solving the technical problems is as follows:
the invention provides a construction progress information processing method, which comprises the following steps:
S1: collecting a plurality of construction site pictures, and extracting key feature points from each of the construction site pictures;
S2: matching the same key feature points in every two construction site pictures;
S3: establishing a virtual three-dimensional image from the same key feature points;
S4: comparing the virtual three-dimensional image with a preset model, and determining the current construction progress from the comparison information.
Optionally, step S1 includes the following sub-steps:
S11: placing the plurality of construction site pictures in a coordinate system, and extracting a coordinate point for each pixel of each construction site picture in the coordinate system;
S12: generating a new output from the coordinate points and a Gaussian blur model;
S13: constructing a difference scale space from the coordinate points and the new output;
S14: determining the size of the key feature points by curve fitting according to the difference scale space;
S15: determining the direction of the key feature points from their size and a neighborhood radius.
Optionally, in step S12, the new output is generated from the coordinate points and the Gaussian blur model by the following formula:

G(x,y,σ) = 1/(2πσ²) · e^(−((x − m/2)² + (y − n/2)²)/(2σ²))

where G(x,y,σ) denotes the Gaussian blur model function, x is the abscissa of a coordinate point, y is the ordinate of the coordinate point, σ is the standard deviation of the normal distribution, m is the length of the Gaussian blur model, and n is the width of the Gaussian blur model.
Optionally, in S13, constructing the difference scale space includes the following sub-steps:
S121: performing a convolution according to the Gaussian blur model and the coordinate points corresponding to the construction site pictures to construct a scale space;
S122: downsampling the construction site picture repeatedly to obtain n layers of images of different sizes;
S123: establishing n scales for the n layers of images of different sizes;
S124: changing the value of σ and returning to S121 to perform the convolution again, until a scale space with the specified number of levels is obtained;
S125: subtracting adjacent scale spaces at different scales to obtain the difference scale space.
Optionally, in S121, the scale space is represented by the following formula:
L(x,y,σ)=G(x,y,σ)*I(x,y)
where G(x,y,σ) denotes the Gaussian blur model function, I(x,y) is the coordinate point corresponding to a construction site picture, and L(x,y,σ), the convolution of the Gaussian blur model with the coordinate points, is the scale space;
in S125, the difference scale space is represented by the following formula:
D(x,y,σ)=(G(x,y,kσ)-G(x,y,σ))*I(x,y)
where D(x,y,σ) denotes the difference scale space, G(x,y,σ) denotes the Gaussian blur model function, I(x,y) is the coordinate point corresponding to a construction site picture, and k is a constant.
Optionally, in S14, the size of the key feature point is determined by the following fit:

D(X) = D + (∂Dᵀ/∂X)·X + (1/2)·Xᵀ·(∂²D/∂X²)·X

where D(X) denotes the fitting function, D = D(x,y,σ) is the difference scale space, ᵀ denotes the transpose, ∂/∂X denotes differentiation with respect to X, and X = (x, y, σ)ᵀ is the variable of the fitting function D(X).
In S15, the direction of the key feature point is determined by a gradient histogram method and calculated by the following formulas:

m(x,y) = √[(L(x+1,y) − L(x−1,y))² + (L(x,y+1) − L(x,y−1))²]
θ(x,y) = arctan[(L(x,y+1) − L(x,y−1)) / (L(x+1,y) − L(x−1,y))]

where m(x,y) is the gradient magnitude accumulated into the histogram for the key feature point, θ(x,y) is the direction of the key feature point, and L is the scale-space image in which the key feature point lies.
Optionally, step S2 includes:
S21: calculating the Euclidean distance between the same key feature points in two adjacent construction site pictures by the following formula:

ρ = √[(x₂ − x₁)² + (y₂ − y₁)²]

where ρ is the Euclidean distance between the same key feature points in the two adjacent construction site pictures, (x₁, y₁) are the coordinates in the previous construction site picture, and (x₂, y₂) are the coordinates in the following construction site picture;
S22: comparing the Euclidean distances and taking the coordinate point with the minimum Euclidean distance as the same key feature point.
Optionally, step S3 includes the following sub-steps:
S31: calculating a fundamental matrix from the same key feature points;
S32: obtaining a projection matrix between two adjacent construction site pictures from the fundamental matrix;
S33: generating three-dimensional point cloud data from the projection matrix and the matched feature points;
S34: establishing the virtual three-dimensional image from the three-dimensional point cloud data.
Optionally, in step S31, the fundamental matrix is calculated by:

x′ᵀFx = 0

where x and x′ are a pair of matched feature points, x = (x₁, y₁, 1) and x′ = (x₂, y₂, 1), with (x₁, y₁) the coordinates in the previous construction site picture and (x₂, y₂) the coordinates in the following construction site picture, and F is the fundamental matrix.
The fundamental matrix F contains the rotation and translation information between the two construction site pictures together with the internal parameters of the camera, and the same key feature points of the two construction site pictures satisfy the following constraint:

x₂x₁f₁₁ + x₂y₁f₁₂ + x₂f₁₃ + y₂x₁f₂₁ + y₂y₁f₂₂ + y₂f₂₃ + x₁f₃₁ + y₁f₃₂ + f₃₃ = 0

where f₍ᵢⱼ₎ (i = 1, 2, 3; j = 1, 2, 3) are the elements of the fundamental matrix F.
In step S32, the projection matrices are calculated by the following formulas:

P₁ = C[I|0]
P₂ = C[R|T]
F = C⁻ᵀEC⁻¹
E = [T]ₓR

where P₁ and P₂ are the projection matrices of the previous and the following picture respectively; I is the identity matrix; E is the essential matrix, containing the rotation and translation information between the two images; R and T are the rotation matrix and the translation matrix, obtained by singular value decomposition of E, after which the projection matrices of the two pictures are calculated; C is the camera intrinsic parameter matrix, [T]ₓ is the skew-symmetric cross-product matrix of T, and [I|0] denotes the 3×4 matrix formed by concatenating I with a zero column.
Optionally, step S4 includes:
mapping the coordinate values of the virtual three-dimensional image onto the height of the preset model by the following formula:

h_map = (w / w′) · h′

where h′ is the height of the virtual three-dimensional image, taken from its lowest point to its highest point; w′ is the width of the virtual three-dimensional image, taken along one side of the base; w is the width of the corresponding side in the complete preset model; and h is the height of the complete preset model, against which the mapped height h_map is compared.
The invention has the following beneficial effects:
With the above scheme, the construction progress information processing method provided by the embodiments of the invention allows the construction progress to be checked in real time, so that project managers can control the construction workflow and effectively avoid work being finished ahead of schedule or falling behind; it also saves the manpower required for site patrols and thus reduces construction cost.
Drawings
Fig. 1 is a flowchart of a construction progress information processing method according to an embodiment of the present invention;
fig. 2 is a flowchart of the sub-steps of step S1 of the construction progress information processing method according to an embodiment of the present invention;
fig. 3 is a flowchart of the sub-steps of constructing the difference scale space in the construction progress information processing method according to an embodiment of the present invention;
fig. 4 is a flowchart of the sub-steps of step S3 of the construction progress information processing method according to an embodiment of the present invention.
Detailed Description
The principles and features of this invention are described below in conjunction with the following drawings, which are set forth by way of illustration only and are not intended to limit the scope of the invention.
In the present invention, unless stated otherwise, the terms "first", "second", and the like are used only to distinguish one element from another and imply no order or importance. The terms "upper" and "lower" are based on the orientations shown in fig. 1; the terms "inner" and "outer" refer to the inner and outer contours of an object. In the following description, the same reference numbers in different figures denote the same or similar elements unless otherwise explained. The foregoing definitions are provided only to illustrate and describe the present disclosure and should not be construed to limit it.
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, but not all, embodiments of the present invention. All other embodiments, which can be obtained by a person skilled in the art without making any creative effort based on the embodiments in the present invention, belong to the protection scope of the present invention.
Examples
The technical scheme for solving the technical problems is as follows:
referring to fig. 1, the present invention provides a construction progress information processing method including the steps of:
S1: collecting a plurality of construction site pictures, and extracting key feature points from each of the construction site pictures;
S2: matching the same key feature points in every two construction site pictures;
S3: establishing a virtual three-dimensional image from the same key feature points;
S4: comparing the virtual three-dimensional image with a preset model, and determining the current construction progress from the comparison information.
The invention has the following beneficial effects:
With the above scheme, the construction progress information processing method provided by the embodiments of the invention allows the construction progress to be checked in real time, so that project managers can control the construction workflow and effectively avoid work being finished ahead of schedule or falling behind; it also saves the manpower required for site patrols and thus reduces construction cost.
It should be noted that the construction site pictures may be acquired by any means, for example a smartphone, a smart camera, a smart patrol car, a smart helmet, or smart glasses, with the pictures transmitted to a background server. The invention uses a smart helmet with an information transmission function, mainly over 4G, so that an inspector only needs to wear the helmet on the construction site to photograph the site conditions and transmit them to the background server.
In addition, the construction site pictures must satisfy at least the following three requirements:
1. each picture contains the whole building;
2. at least 8 pictures are taken around the building;
3. the shooting distance to the building is kept roughly constant.
These requirements ensure the accuracy of the subsequent picture processing and the completeness of the collected information. After the construction site pictures are obtained, key feature points are extracted from every picture; referring to fig. 2, the extraction steps are as follows:
S11: placing the plurality of construction site pictures in a coordinate system, and extracting a coordinate point for each pixel of each construction site picture in the coordinate system;
S12: generating a new output from the coordinate points and a Gaussian blur model;
S13: constructing a difference scale space from the coordinate points and the new output;
S14: determining the size of the key feature points by curve fitting according to the difference scale space;
S15: determining the direction of the key feature points from their size and a neighborhood radius.
Here the coordinate system takes the lower-left corner of the construction site picture as the origin; once each construction site picture is placed in the coordinate system, every pixel of the picture yields a coordinate point.
Specifically, the 8 pictures are numbered: a first picture, a second picture, a third picture, …, an eighth picture (the ellipsis omits the fourth to seventh pictures, and likewise below). Each picture comprises a number of pixels; taking the first picture as an example, it contains a first pixel, a second pixel, a third pixel, …, an n-th pixel, and each pixel yields a coordinate in the coordinate system: the first pixel (x₁, y₁), the second pixel (x₂, y₂), …, the n-th pixel (xₙ, yₙ).
In addition, a Gaussian blur template region (m, n) is set, and a new output is generated from the Gaussian blur model and the coordinate points using the following formula:

G(x,y,σ) = 1/(2πσ²) · e^(−((x − m/2)² + (y − n/2)²)/(2σ²))

where G(x,y,σ) denotes the Gaussian blur model function; x and y are the coordinates of the construction site picture mapped onto the coordinate system, x being the abscissa and y the ordinate of a coordinate point; σ is the standard deviation of the normal distribution; m is the length of the Gaussian blur template and n is its width.
The larger σ is, the more blurred the image; in the present invention σ = 1. Since pixels farther than 3σ from the center contribute negligibly, the length and width m and n are generally set near 6σ + 1, i.e. a template of roughly (6σ + 1)² pixels. Note that after the Gaussian blur template is generated it should be normalized so that its elements lie in [0, 1] (normalization is a dimensionless processing step that converts absolute values into relative ones).
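For illustration, a minimal numpy sketch of the template construction just described, assuming an m × n template centered at (m/2, n/2) and normalized to sum to 1 (the function name and the 7 × 7 size are illustrative, not from the patent):

```python
import numpy as np

def gaussian_template(m, n, sigma=1.0):
    # G(x, y, sigma) evaluated on an m x n grid centered at (m/2, n/2)
    x, y = np.meshgrid(np.arange(m), np.arange(n), indexing="ij")
    g = np.exp(-((x - m / 2) ** 2 + (y - n / 2) ** 2) / (2 * sigma ** 2))
    g /= 2 * np.pi * sigma ** 2
    # normalize so the template elements lie in [0, 1] and sum to 1
    return g / g.sum()

# sigma = 1 as in the text; side length near 6*sigma + 1 = 7
kernel = gaussian_template(7, 7, sigma=1.0)
```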
Optionally, in S13, referring to fig. 3, constructing the difference scale space includes the following sub-steps:
S121: performing a convolution according to the Gaussian blur model and the coordinate points corresponding to the construction site pictures to construct a scale space;
S122: downsampling the construction site picture repeatedly to obtain n layers of images of different sizes;
S123: establishing n scales for the n layers of images of different sizes;
S124: changing the value of σ and returning to S121 to perform the convolution again, until a scale space with the specified number of levels is obtained;
S125: subtracting adjacent scale spaces at different scales to obtain the difference scale space.
If the Gaussian blur template were convolved directly with the construction site picture at a large σ, pixel loss would be severe. Since the Gaussian template matrix can be separated into the product of a row vector and a column vector, the actual image is instead convolved with the separated row vector and column vector in the horizontal and vertical directions respectively, which effectively avoids the pixel loss.
In addition, to detect extreme points at different scales, the original construction site picture must be downsampled repeatedly to obtain n layers of pictures of different sizes arranged as a pyramid, where n is computed as:

n = log₂{min(M, N)} − t

where t is the logarithm of the size of the smallest pyramid image, t ∈ [0, log₂{min(M, N)}], and M and N are the length and width of the construction site picture.
After n is obtained, n scales are established, each corresponding to a scale space with a predetermined number of levels; in the present invention the predetermined number of levels is 5. Further, the value of σ is varied dynamically and set to:

σ(o, s) = k · σ₀ · 2^(o + s/5),  o = 0, 1, …, n − 1,  s = 0, 1, …, 4

where k is a constant (k = 1 in the present invention) and σ₀ is the base value of σ, so that σ forms an n × 5 array. Substituting each σ into the scale space yields the corresponding Gaussian-blurred pictures at the different scales.
Optionally, the scale space is represented by the following formula:

L(x,y,σ) = G(x,y,σ) * I(x,y)

where G(x,y,σ) denotes the Gaussian blur model function, I(x,y) is the coordinate point corresponding to a construction site picture, and L(x,y,σ), the convolution of the Gaussian blur model with the coordinate points, is the scale space. Finally, the difference scale space is obtained by subtracting the scale spaces at adjacent scales.
Here, the difference scale space is expressed by the following formula:

D(x,y,σ) = (G(x,y,kσ) − G(x,y,σ)) * I(x,y)

where D(x,y,σ) denotes the difference scale space, G(x,y,σ) denotes the Gaussian blur model function, I(x,y) is the coordinate point corresponding to a construction site picture, and k is a constant.
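For illustration, a sketch of S121 to S125 under the stated settings (5 levels per octave). Note the text fixes k = 1, which would make adjacent levels coincide, so the conventional k = 2^(1/5) is assumed here; OpenCV's GaussianBlur and resize stand in for the separable convolution and the downsampling:

```python
import cv2
import numpy as np

def dog_pyramid(img, levels=5, sigma0=1.0, k=2 ** (1 / 5), t=0):
    # n = log2{min(M, N)} - t octaves of progressively halved images
    img = img.astype(np.float32)
    n_octaves = int(np.log2(min(img.shape[:2]))) - t
    dog = []
    for o in range(n_octaves):
        # S121/S124: convolve with Gaussians of increasing sigma (scale space)
        blurred = [cv2.GaussianBlur(img, (0, 0), sigma0 * k ** s)
                   for s in range(levels)]
        # S125: subtract adjacent scale spaces -> difference scale space
        dog.append([b2 - b1 for b1, b2 in zip(blurred, blurred[1:])])
        if o < n_octaves - 1:  # S122: downsample for the next octave
            img = cv2.resize(img, (img.shape[1] // 2, img.shape[0] // 2))
    return dog
```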
After the difference scale space is obtained, each pixel of each layer is compared, in both the image domain and the scale domain, with its 8 neighbors in the same layer and with the corresponding pixels in the layers above and below, which locates the maximum and minimum points. However, because the difference scale space is discrete, the extrema found in this way are not the true extreme points, so curve fitting is required to determine the true extreme points.
Optionally, the size of the key feature point is determined by the following fit:

D(X) = D + (∂Dᵀ/∂X)·X + (1/2)·Xᵀ·(∂²D/∂X²)·X

where D(X) denotes the fitting function, D = D(x,y,σ) is the difference scale space, ᵀ denotes the transpose, ∂/∂X denotes differentiation with respect to X, and X = (x, y, σ)ᵀ is the variable of the fitting function D(X).
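For illustration, a sketch of this curve-fitting step, assuming the difference scale space D is stored as a 3-D numpy array indexed [x, y, scale]; finite differences stand in for the ∂D/∂X terms above, and the function name is illustrative:

```python
import numpy as np

def refine_extremum(D, x, y, s):
    # gradient dD/dX at (x, y, s) by central differences
    g = 0.5 * np.array([D[x + 1, y, s] - D[x - 1, y, s],
                        D[x, y + 1, s] - D[x, y - 1, s],
                        D[x, y, s + 1] - D[x, y, s - 1]])
    # Hessian d2D/dX2 by finite differences
    Dxx = D[x + 1, y, s] - 2 * D[x, y, s] + D[x - 1, y, s]
    Dyy = D[x, y + 1, s] - 2 * D[x, y, s] + D[x, y - 1, s]
    Dss = D[x, y, s + 1] - 2 * D[x, y, s] + D[x, y, s - 1]
    Dxy = 0.25 * (D[x + 1, y + 1, s] - D[x + 1, y - 1, s]
                  - D[x - 1, y + 1, s] + D[x - 1, y - 1, s])
    Dxs = 0.25 * (D[x + 1, y, s + 1] - D[x + 1, y, s - 1]
                  - D[x - 1, y, s + 1] + D[x - 1, y, s - 1])
    Dys = 0.25 * (D[x, y + 1, s + 1] - D[x, y + 1, s - 1]
                  - D[x, y - 1, s + 1] + D[x, y - 1, s - 1])
    H = np.array([[Dxx, Dxy, Dxs], [Dxy, Dyy, Dys], [Dxs, Dys, Dss]])
    offset = -np.linalg.solve(H, g)        # extremum of the quadratic fit
    value = D[x, y, s] + 0.5 * g @ offset  # fitted value at the extremum
    return offset, value
```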
After the size of the key feature point is obtained, its direction must be determined, which requires the other pixels (feature points) in the neighborhood of the key feature point. The neighborhood may have any radius; in the embodiment provided by the invention the neighborhood radius is 3σ. The direction of the key feature point is determined by a gradient histogram method and calculated by the following formulas:
m(x,y) = √[(L(x+1,y) − L(x−1,y))² + (L(x,y+1) − L(x,y−1))²]
θ(x,y) = arctan[(L(x,y+1) − L(x,y−1)) / (L(x+1,y) − L(x−1,y))]

where m(x,y) is the gradient magnitude accumulated into the histogram for the key feature point, θ(x,y) is the direction of the key feature point, and L is the scale-space image in which the key feature point lies.
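And a sketch of the orientation assignment, assuming L is a 2-D scale-space image and the conventional 36-bin histogram (the bin count is not stated in the patent; the caller is assumed to keep the neighborhood inside the image):

```python
import numpy as np

def keypoint_direction(L, kx, ky, radius, n_bins=36):
    hist = np.zeros(n_bins)
    for x in range(kx - radius, kx + radius + 1):
        for y in range(ky - radius, ky + radius + 1):
            dx = L[x + 1, y] - L[x - 1, y]
            dy = L[x, y + 1] - L[x, y - 1]
            m = np.hypot(dx, dy)              # m(x, y)
            theta = np.arctan2(dy, dx)        # theta(x, y)
            b = int((theta + np.pi) / (2 * np.pi) * n_bins) % n_bins
            hist[b] += m                      # magnitude-weighted histogram
    # the peak bin gives the key feature point's direction
    return (np.argmax(hist) + 0.5) * 2 * np.pi / n_bins - np.pi
```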
After the feature points of the actual pictures are obtained, the same feature points in every two construction site pictures must be matched:
S21: calculating the Euclidean distance between the same key feature points in two adjacent construction site pictures by the following formula:

ρ = √[(x₂ − x₁)² + (y₂ − y₁)²]

where ρ is the Euclidean distance between the same key feature points in the two adjacent construction site pictures, (x₁, y₁) are the coordinates in the previous construction site picture, and (x₂, y₂) are the coordinates in the following construction site picture. Here "previous" and "following" are determined by the order in which the helmet uploads the pictures to the server; since the smart helmet collects the pictures in sequence around the building, consecutive pictures are related.
S22: comparing the Euclidean distances and taking the coordinate point with the minimum Euclidean distance as the feature matching point. For example, if the Euclidean distance between (x₁, y₁) and (x₂, y₂) is the smallest, then (x₂, y₂) is the same key feature point as (x₁, y₁).
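For illustration, a minimal sketch of S21 and S22, matching on keypoint coordinates exactly as the formula above specifies (practical SIFT matchers compare 128-dimensional descriptors instead); scipy is assumed available:

```python
import numpy as np
from scipy.spatial.distance import cdist

def match_same_points(pts_prev, pts_next):
    # rho: all pairwise Euclidean distances between the two pictures
    rho = cdist(pts_prev, pts_next)
    # S22: the minimum-distance partner is taken as the same key feature point
    return [(i, int(j)) for i, j in enumerate(rho.argmin(axis=1))]

matches = match_same_points(np.array([[10.0, 20.0], [45.0, 80.0]]),
                            np.array([[12.0, 19.0], [44.0, 83.0]]))
```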
After the same key feature points are obtained, the virtual three-dimensional image must be established. Referring to fig. 4, the invention builds the virtual three-dimensional image by the following sub-steps:
S31: calculating a fundamental matrix from the same key feature points;
S32: obtaining a projection matrix between two adjacent construction site pictures from the fundamental matrix;
S33: generating three-dimensional point cloud data from the projection matrix and the matched feature points;
S34: establishing the virtual three-dimensional image from the three-dimensional point cloud data.
After the projection matrices between the images are obtained, the corresponding space points can be computed from the projection matrices and the matched points, finally yielding a three-dimensional point cloud; the point cloud is then stored as a txt file, and the reconstruction of the virtual three-dimensional image is completed by calling python's vispy library.
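For illustration, a sketch of that storage-and-display step, assuming the cloud was saved with np.savetxt and vispy is installed (the file name is illustrative):

```python
import numpy as np
from vispy import app, scene

points = np.loadtxt("point_cloud.txt")   # N x 3 array saved via np.savetxt

canvas = scene.SceneCanvas(keys="interactive", show=True)
view = canvas.central_widget.add_view()
view.camera = "turntable"                 # orbitable 3-D camera

markers = scene.visuals.Markers()
markers.set_data(points, size=2)          # draw the cloud as points
view.add(markers)

app.run()
```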
Optionally, in step S31, the fundamental matrix is calculated by:

x′ᵀFx = 0

where x and x′ are a pair of matched feature points, x = (x₁, y₁, 1) and x′ = (x₂, y₂, 1), with (x₁, y₁) the coordinates in the previous construction site picture and (x₂, y₂) the coordinates in the following construction site picture, and F is the fundamental matrix.
The fundamental matrix F contains the rotation and translation information between the two construction site pictures together with the internal parameters of the camera, and the same key feature points of the two construction site pictures satisfy the following constraint:

x₂x₁f₁₁ + x₂y₁f₁₂ + x₂f₁₃ + y₂x₁f₂₁ + y₂y₁f₂₂ + y₂f₂₃ + x₁f₃₁ + y₁f₃₂ + f₃₃ = 0

where f₍ᵢⱼ₎ (i = 1, 2, 3; j = 1, 2, 3) are the elements of the fundamental matrix F.
in step S32, the projection matrix is calculated by the following formula:
P1=C[I|0]
P2=C[R|T]
F=C-TEC-1
E=[T]R
wherein, P1,P2Projection matrixes of a previous picture and a next picture respectively; i is an identity matrix; e is a matrix containing rotation and translation information between the two images; r and T are respectively a rotation matrix and a translation matrix; performing singular value decomposition on the matrix E to obtain matrices R and T, and calculating projection matrices of the two pictures; c is a matrix symbol, e.g. C [ I |0 ] ]The following I and 0 are shown to form a matrix, and C in F is the carry-in P1 and P2.
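For illustration, a sketch of S31 to S33, assuming the camera intrinsic matrix C is known; OpenCV's helpers stand in for the eight-point solution and the singular value decomposition of E described above:

```python
import cv2
import numpy as np

def reconstruct_pair(pts1, pts2, C):
    # fundamental matrix from matched points (eight-point algorithm)
    F, _ = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
    E = C.T @ F @ C                  # from F = C^-T E C^-1
    # decompose E into rotation R and translation T
    _, R, T, _ = cv2.recoverPose(E, pts1, pts2, C)
    P1 = C @ np.hstack([np.eye(3), np.zeros((3, 1))])   # P1 = C[I|0]
    P2 = C @ np.hstack([R, T])                          # P2 = C[R|T]
    # triangulate the matched points into 3-D space
    X = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (X[:3] / X[3]).T          # homogeneous -> Euclidean point cloud
```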
Optionally, step S4 includes:
mapping the coordinate values of the virtual three-dimensional image onto the height of the preset model by the following formula:

h_map = (w / w′) · h′

where h′ is the height of the virtual three-dimensional image, taken from its lowest point to its highest point; w′ is the width of the virtual three-dimensional image, taken along one side of the base; w is the width of the corresponding side in the complete preset model; and h is the height of the complete preset model, against which the mapped height h_map is compared.
The preset model may be any model; the present invention uses a BIM (Building Information Modeling) model, built up level by level. When the virtual three-dimensional image is mapped onto the height of the BIM model, model components below the mapped height are colored red, indicating that they are built; components above the mapped height are displayed transparent, indicating that they are not yet built; and components at the mapped height are shown yellow, indicating that they are being built. The colors do not limit the invention; those skilled in the art may change the colors used to display the current construction progress.
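For illustration, a sketch of the mapping and the coloring rule, assuming the width-ratio reading of the formula above; the tolerance band for "being built" and the numbers in the usage lines are illustrative:

```python
def mapped_height(h_virtual, w_virtual, w_model):
    # scale the reconstructed height by the model/image width ratio
    return h_virtual * w_model / w_virtual

def component_color(component_top, h_map, tol=0.5):
    if component_top < h_map - tol:
        return "red"          # below the mapped height: already built
    if component_top > h_map + tol:
        return "transparent"  # above the mapped height: not yet built
    return "yellow"           # at the mapped height: being built

h_map = mapped_height(h_virtual=6.2, w_virtual=4.0, w_model=20.0)
progress = min(h_map / 93.0, 1.0)   # fraction of the full model height h
```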
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (10)

1. A construction progress information processing method is characterized by comprising the following steps:
s1: collecting a plurality of construction site pictures, and extracting key feature points of each picture;
S2: matching the same key feature points in every two construction site pictures;
s3: establishing a virtual three-dimensional image according to the same key feature points;
s4: and comparing the virtual three-dimensional image with a preset model, and determining the current construction progress according to comparison information.
2. The construction progress information processing method according to claim 1, wherein the step S1 includes the sub-steps of:
S11: placing the plurality of construction site pictures in a coordinate system, and extracting a coordinate point for each pixel of each construction site picture in the coordinate system;
S12: generating a new output from the coordinate points and a Gaussian blur model;
S13: constructing a difference scale space from the coordinate points and the new output;
S14: determining the size of the key feature point among the pixels by curve fitting according to the difference scale space;
S15: determining the direction of the key feature point from its size and a neighborhood radius.
3. The construction progress information processing method according to claim 2, wherein in step S12, the new output is generated from the coordinate points and the Gaussian blur model by the following formula:

G(x,y,σ) = 1/(2πσ²) · e^(−((x − m/2)² + (y − n/2)²)/(2σ²))

wherein G(x,y,σ) denotes the Gaussian blur model function, x is the abscissa of a coordinate point, y is the ordinate of the coordinate point, σ is the standard deviation of the normal distribution, m is the length of the Gaussian blur model, and n is the width of the Gaussian blur model.
4. The construction progress information processing method according to claim 3, wherein in step S13, constructing the difference scale space comprises the sub-steps of:
S121: performing a convolution according to the new output and the coordinate points corresponding to the construction site pictures to construct a scale space;
S122: downsampling the construction site picture repeatedly to obtain n layers of images of different sizes;
S123: establishing n scales for the n layers of images of different sizes;
S124: changing the value of σ and returning to S121 to perform the convolution again, until a scale space with the specified number of levels is obtained;
S125: subtracting adjacent scale spaces at different scales to obtain the difference scale space.
5. The construction progress information processing method according to claim 4, wherein in S121 the scale space is expressed by the following formula:

L(x,y,σ) = G(x,y,σ) * I(x,y)

wherein G(x,y,σ) denotes the Gaussian blur model function, I(x,y) is the coordinate point corresponding to a construction site picture, and L(x,y,σ), the convolution of the Gaussian blur model with the coordinate points, is the scale space;
in S125, the difference scale space is expressed by the following formula:

D(x,y,σ) = (G(x,y,kσ) − G(x,y,σ)) * I(x,y)

wherein D(x,y,σ) denotes the difference scale space and k is a constant.
6. The construction progress information processing method according to claim 2, wherein in S14 the size of the key feature point is determined by the following fit:

D(X) = D + (∂Dᵀ/∂X)·X + (1/2)·Xᵀ·(∂²D/∂X²)·X

wherein D(X) denotes the fitting function, D = D(x,y,σ) is the difference scale space, ᵀ denotes the transpose, ∂/∂X denotes differentiation with respect to X, and X = (x, y, σ)ᵀ is the variable of the fitting function D(X);
in S15, the direction of the key feature point is determined by a gradient histogram method and calculated by the following formulas:

m(x,y) = √[(L(x+1,y) − L(x−1,y))² + (L(x,y+1) − L(x,y−1))²]
θ(x,y) = arctan[(L(x,y+1) − L(x,y−1)) / (L(x+1,y) − L(x−1,y))]

wherein m(x,y) is the gradient magnitude accumulated into the histogram for the key feature point, θ(x,y) is the direction of the key feature point, and L is the scale-space image in which the key feature point lies.
7. The construction progress information processing method according to claim 1, wherein the step S2 includes:
S21: calculating the Euclidean distance between the same key feature points in two adjacent construction site pictures by the following formula:

ρ = √[(x₂ − x₁)² + (y₂ − y₁)²]

wherein ρ is the Euclidean distance between the same key feature points in the two adjacent construction site pictures, (x₁, y₁) are the coordinates in the previous construction site picture, and (x₂, y₂) are the coordinates in the following construction site picture;
S22: comparing the Euclidean distances and taking the coordinate point with the minimum Euclidean distance as the same key feature point.
8. The construction progress information processing method according to claim 7, wherein step S3 comprises the sub-steps of:
S31: calculating a fundamental matrix from the same key feature points;
S32: obtaining a projection matrix between two adjacent construction site pictures from the fundamental matrix;
S33: generating three-dimensional point cloud data from the projection matrix and the matched feature points;
S34: establishing the virtual three-dimensional image from the three-dimensional point cloud data.
9. The construction progress information processing method according to claim 8, wherein in step S31 the fundamental matrix is calculated by:

x′ᵀFx = 0

wherein x and x′ are a pair of matched feature points, x = (x₁, y₁, 1) and x′ = (x₂, y₂, 1), with (x₁, y₁) the coordinates in the previous construction site picture and (x₂, y₂) the coordinates in the following construction site picture, and F is the fundamental matrix;
the fundamental matrix F contains the rotation and translation information between the two construction site pictures together with the internal parameters of the camera, and the same key feature points of the two construction site pictures satisfy the constraint:

x₂x₁f₁₁ + x₂y₁f₁₂ + x₂f₁₃ + y₂x₁f₂₁ + y₂y₁f₂₂ + y₂f₂₃ + x₁f₃₁ + y₁f₃₂ + f₃₃ = 0

wherein f₍ᵢⱼ₎ (i = 1, 2, 3; j = 1, 2, 3) are the elements of the fundamental matrix F;
in step S32, the projection matrices are calculated by the following formulas:

P₁ = C[I|0]
P₂ = C[R|T]
F = C⁻ᵀEC⁻¹
E = [T]ₓR

wherein P₁ and P₂ are the projection matrices of the previous and the following picture respectively; I is the identity matrix; E is the essential matrix containing the rotation and translation information between the two images; R and T are the rotation matrix and the translation matrix, obtained by singular value decomposition of E, after which the projection matrices of the two pictures are calculated; C is the camera intrinsic parameter matrix, [T]ₓ is the skew-symmetric cross-product matrix of T, and [I|0] denotes the 3×4 matrix formed by concatenating I with a zero column.
10. The construction progress information processing method according to any one of claims 1 to 9, wherein the step S4 includes:
mapping the coordinate values of the virtual three-dimensional image onto the height of the preset model by the following formula:

h_map = (w / w′) · h′

wherein h′ is the height of the virtual three-dimensional image, taken from its lowest point to its highest point; w′ is the width of the virtual three-dimensional image, taken along one side of the base; w is the width of the corresponding side in the complete preset model; and h is the height of the complete preset model, against which the mapped height h_map is compared.
Application CN202210248902.1A, filed 2022-03-14 (priority 2022-03-14), title: Construction progress information processing method, status: Pending, published as CN114676763A


Publications (1)

Publication Number: CN114676763A, Publication Date: 2022-06-28

Family

ID=82074688



Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272587A (en) * 2022-09-26 2022-11-01 深圳市纵维立方科技有限公司 Model file generation method, medium and electronic device for 3D printing
CN116523471A (en) * 2023-06-25 2023-08-01 中建西南咨询顾问有限公司 Information generation method, apparatus, electronic device and computer readable medium
CN116523471B (en) * 2023-06-25 2023-09-22 中建西南咨询顾问有限公司 Information generation method, apparatus, electronic device and computer readable medium
CN116630095A (en) * 2023-07-24 2023-08-22 中建西南咨询顾问有限公司 Overall planning and planning method and system for automatic engineering construction total tasks
CN116630095B (en) * 2023-07-24 2023-09-19 中建西南咨询顾问有限公司 Overall planning and planning method and system for automatic engineering construction total tasks

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination