CN110929607A - Remote sensing identification method and system for urban building construction progress


Info

Publication number: CN110929607A (application CN201911097311.3A); granted as CN110929607B
Authority: CN (China)
Prior art keywords: stage, building, construction, image, stages
Other languages: Chinese (zh)
Inventors: 李百寿, 刘聪娜, 张越
Assignee (original and current): Guilin University of Technology
Legal status: Granted; active

Classifications

    • G Physics
    • G06 Computing; Calculating or Counting
    • G06F Electric Digital Data Processing
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/25 Fusion techniques
    • G06F18/253 Fusion techniques of extracted features
    • G06V Image or Video Recognition or Understanding
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/176 Urban or other man-made structures
    • G06V20/35 Categorising the entire scene, e.g. birthday party or wedding scene
    • G06V20/38 Outdoor scenes
    • G06V20/39 Urban scenes
    • Y02 Technologies or Applications for Mitigation or Adaptation against Climate Change
    • Y02A Technologies for Adaptation to Climate Change
    • Y02A30/00 Adapting or protecting infrastructure or their operation
    • Y02A30/60 Planning or developing urban green infrastructure

Abstract

The invention discloses a remote sensing identification method for urban building construction progress. The method comprises the following steps: acquiring a high-resolution remote sensing image and preprocessing it; dividing the imagery into four construction-stage image sets, namely the foundation stage, the middle construction stage, the capping stage and the construction stage, and extracting image features for each construction stage; creating sample sets and establishing a sample database; designing a network model and its parameters; classifying and identifying images of the different construction stages; and extracting the construction position and boundary outline of each building. The invention also provides a system that uses cloud computing to improve the efficiency of remote sensing identification of urban building construction progress, comprising: a spatial database; a data processing module; a GIS cloud service platform; a GIS convolutional network cloud computing module; a statistical analysis cloud computing module; and a building construction stage type graph output module. The method and system can improve the accuracy and efficiency of identifying buildings at different construction stages and meet the building construction progress identification needs of the urban remote sensing big data era.

Description

Remote sensing identification method and system for urban building construction progress
Technical Field
The invention relates to the technical field of remote sensing identification of buildings in different construction stages, in particular to a remote sensing identification method and system for urban building construction progress based on a Deep Convolutional Neural Network (DCNN) and cloud computing.
Background
With the acceleration of urbanization in China and rapid urban expansion, the number of buildings under construction in cities keeps growing, among them many government-subsidized (social security) housing projects and illegally constructed buildings. City management departments urgently need timely information on the spatial positions and areas of buildings at the different stages of construction progress (foundation, middle construction, capping and construction). To learn the construction progress and state of buildings in time, remote sensing dynamic monitoring of urban building construction progress has become a key element of urban management and a foundation of smart city construction.
At present, the precision and efficiency of remote sensing extraction of buildings and their construction progress information based on building morphological indices and multi-temporal change detection cannot meet the needs of rapid urban development, illegal building monitoring and subsidized housing management. With the growing volume of urban image data acquired by domestic satellites such as Gaofen-2, a high-precision and high-efficiency building construction progress identification method is urgently needed.
The invention meets the need to identify the different construction schedules of buildings accurately and rapidly by constructing a remote sensing identification method and system for building construction schedules based on a Deep Convolutional Neural Network (DCNN) and cloud computing. Deep convolutional neural computation is performed on high-resolution satellite and airborne remote sensing images, and a spatio-temporal cloud storage database of urban building construction progress images at the different stages is established, facilitating the query, management and sharing of building information at each stage and thereby solving the problems of untimely updates of building spatial position and area information and low identification precision of construction progress information.
Disclosure of Invention
The invention aims to provide a remote sensing identification method for urban building construction progress based on DCNN and cloud computing, which can quickly and accurately identify buildings at different construction progresses (the foundation, middle construction, capping and construction stages).
The method comprises the following specific steps:
(1) Obtaining urban high-resolution remote sensing images: urban buildings are observed by high-resolution remote sensing satellites and aircraft to obtain high-spatial-resolution remote sensing images from which building construction progress is to be judged.
(2) Image preprocessing: atmospheric correction and radiometric correction preprocessing are applied to the acquired high-resolution images.
(3) Dividing construction stages and extracting image features: construction is divided into four stages, namely the foundation, middle construction, capping and construction stages. According to the different identification characteristics of each stage, foundation-stage buildings are extracted with image spectral transformation features; middle-construction- and capping-stage buildings with texture features; and construction-stage buildings with corner features fused with spectral features.
(4) Sample set creation and partitioning: the boundaries of buildings at different construction stages are drawn by visual interpretation with a region-of-interest tool, and image sample sets and label data are created for each construction stage. For each stage, an OpenCV tool divides the image sample set and label data into a training set, a validation set and a test set.
(5) Building deep learning sample database establishment: a deep learning sample database is established to store the feature images of the foundation, middle construction, capping and construction stages together with the deep learning training, validation and test sets.
(6) Designing the network model and parameters: deep network models and parameters for identifying and extracting buildings at the different construction stages are designed according to the principal image characteristics of each stage.
(7) Training the network and obtaining the optimal network model: the network is trained with the sample sets and the network parameters are adjusted to find the optimal model for each construction stage.
(8) Calculating the precision and saving the trained network: the feature images of each construction stage are classified with the corresponding optimal network model and the precision is calculated. If the precision reaches the user-set threshold, the trained network for that construction stage is saved; otherwise the method returns to step (7).
(9) Extracting building construction positions and boundary contours: the urban image is classified and identified by the convolutional networks trained with the building samples of the different construction stages, and the spatial positions and boundary outlines of buildings at each stage are extracted.
(10) Counting the number and area under construction: the number and total area of urban buildings at the different construction stages (foundation, middle construction, capping and construction) are counted.
Further, in step (3), for the foundation stage, the image spectral transformation adopts gray-scale linear transformation and principal component analysis to obtain the foundation-stage feature image. For buildings at the middle construction and capping stages, image entropy and contrast texture feature images are extracted on the basis of gray-scale linear transformation and gray level co-occurrence matrix calculation.
For buildings at the construction stage, the original image is enhanced by gray-scale linear transformation, corner features are then extracted with the Canny operator and the Harris algorithm, and the corner features are fused with the original spectral image to obtain the construction-stage building feature image.
Further, in one embodiment of step (6), the foundation-stage convolution kernel is 5 × 5 in size, while the middle construction, capping and construction stages use 3 × 3 kernels. Preferably, the max-pooling kernel size is 2 × 2 with a sliding step of 1 pixel.
The deep convolutional network has 35-40 layers. For the foundation and construction stages there are 20 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer.
To identify texture features more accurately, the networks for the middle construction and capping stages preferably use 25 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer each.
The invention also provides a system for improving the efficiency of remote sensing identification of urban building construction progress with a cloud computing platform, which comprises:
the system comprises a spatial database, a data processing module, a GIS cloud service platform, a GIS convolutional network cloud computing module, a GIS statistical analysis cloud computing module and a building construction progress type graph output module. The space database is connected with the data processing module in a bidirectional mode, and the data processing module is connected with the cloud service platform through a wired network. The GIS convolutional network cloud computing module, the GIS statistical analysis cloud computing module and the building construction progress type graph output module are respectively connected with the GIS cloud service platform in a bidirectional mode.
The spatial database comprises a data input module and a data storage module. The data input module imports image data and deep learning sample data obtained by high-resolution remote sensing satellites and aerial photogrammetry. The data storage module stores, queries, adds and deletes high-resolution image information and stores the building deep learning samples of the foundation-, middle-construction-, capping- and construction-stage image sets.
The data processing module comprises an image feature extraction module and a sample data processing module. The image feature extraction module is used for extracting image features of the building at different construction stages, and the sample data processing module is used for creating and dividing a sample set.
The GIS cloud service platform is based on the data and processing service interfaces provided by the platform-as-a-service (PaaS) layer of a cloud computing platform. Together with the GIS convolutional network cloud computing module, the GIS statistical analysis cloud computing module and the building construction stage type graph output module, it forms the cloud function system.
And the GIS convolutional network cloud computing module is used for convolutional network and parameter design, network training and optimal network model acquisition.
And the GIS statistical analysis cloud computing module is used for rapidly counting the number and the area of buildings at different construction stages.
And the building construction progress type graph output module is used for outputting the distribution, area and quantity of the building in four construction stages to paper and electronic media.
The embodiment of the invention has the beneficial effects that:
by adopting the DCNN- and cloud-computing-based remote sensing identification method and system for building construction stages, and by combining remote sensing digital image processing software and cloud computing technology to build the building monitoring and identification deep convolutional neural networks, the accuracy and intelligence of building construction stage and progress identification can be increased; the cloud computing platform improves the efficiency of construction progress monitoring; and urban building monitoring and management costs are saved. On the basis of the spectral image, the method adds feature images for the foundation, middle construction and capping stages, which can improve the accuracy of detecting building construction progress from high-resolution remote sensing data.
Drawings
FIG. 1 is a technical flow chart of a remote sensing identification method for building construction progress based on DCNN and cloud computing
FIG. 2 shows deep learning sample set creation for buildings at different construction stages
FIG. 3 is a diagram of the foundation-stage deep convolutional network model architecture in one embodiment
FIG. 4 is a diagram of the middle-construction- and capping-stage deep convolutional network model structure in one embodiment
FIG. 5 is a diagram of the construction-stage deep convolutional network model structure in one embodiment
FIG. 6 is a functional block diagram of a remote sensing identification system for building construction progress based on DCNN and cloud computing
Detailed Description
To illustrate the technical solutions of the invention more clearly, the invention is described in detail below with reference to a specific example, which will help those skilled in the art better understand it. Other examples developed by those skilled in the art without departing from the spirit and scope of the invention are likewise intended to fall within the scope of the invention.
Example:
fig. 1 is a technical flowchart of a remote sensing identification method for building construction progress based on DCNN and cloud computing, which includes:
(1) Obtaining urban high-resolution remote sensing images: urban buildings are observed by high-resolution remote sensing satellites and aerial photogrammetry to obtain remote sensing images with a resolution better than 2 meters; this embodiment uses domestic Gaofen-2 satellite images with 0.8 meter resolution and 4 bands.
(2) Image preprocessing: during remote sensing imaging, the image suffers to some degree from geometric distortion, atmospheric extinction, radiometric distortion and similar effects. To eliminate irrelevant information and recover useful information on surface building construction, atmospheric correction and radiometric correction preprocessing are applied to the acquired high-resolution images with remote sensing image processing software such as ENVI or ERDAS, yielding clearer high-resolution imagery.
(3) Dividing construction stages and extracting image features: construction is divided into four stages, namely the foundation, middle construction, capping and construction stages. The features adopted for extracting buildings at the different construction stages are shown in FIG. 2:
a) For the foundation stage, the feature image is obtained through gray-scale linear transformation and principal component analysis. Gray-scale linear transformation is applied to each band of the high-resolution image with the transformation formula:
I1(x, y) = k · I(x, y) + b    (1)
where I is the image space, I(x, y) is the gray value of each pixel, k and b are linear stretching coefficients, and I1(x, y) is the gray value of each pixel after linear transformation.
To enhance the clarity of the multiband image, this embodiment takes k = 2 and b = 20. Each band of the image is computed separately to obtain the linearly transformed single-band images. The first principal component image is then derived from the linear transformation result with the ENVI principal component analysis menu and used as the new feature image input to the convolutional network for recognition.
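The embodiment performs the stretch of formula (1) and the principal component extraction in ENVI; purely as an illustration, the same two operations can be sketched in NumPy (the function names and the random test image below are assumptions of this sketch, not part of the patent):

```python
import numpy as np

def gray_linear_transform(band, k=2.0, b=20.0):
    """Formula (1): I1(x, y) = k * I(x, y) + b, clipped to the 8-bit range."""
    return np.clip(k * band.astype(np.float64) + b, 0, 255)

def first_principal_component(bands):
    """First principal component image of an (n_bands, H, W) band stack."""
    n, h, w = bands.shape
    X = bands.reshape(n, -1).astype(np.float64)   # one row per band
    X -= X.mean(axis=1, keepdims=True)            # center each band
    eigvals, eigvecs = np.linalg.eigh(np.cov(X))  # band-covariance eigenpairs
    pc1 = eigvecs[:, -1]                          # direction of largest variance
    return (pc1 @ X).reshape(h, w)                # project every pixel onto PC1

# Stretch each band of a random 4-band image, then take the first component.
rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(4, 64, 64))
stretched = np.stack([gray_linear_transform(band) for band in image])
feature = first_principal_component(stretched)
```

The resulting single-band `feature` image plays the role of the new convolutional network input described above.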
b) For buildings at the middle construction and capping stages, image texture features are extracted. The image is first enhanced with formula (1), then the gray level co-occurrence matrix is calculated with formula (2), and image entropy and contrast features are extracted with formulas (3) and (4) respectively, giving entropy and contrast texture feature images.
P(i, j) = #{ (x1, y1), (x2, y2) ∈ M × N | f(x1, y1) = i, f(x2, y2) = j }    (2)
where P is the gray level co-occurrence matrix, P(i, j) is the frequency with which a pixel of gray level i and a pixel of gray level j occur together, #(X) denotes the number of elements in the set X, M × N is the image size, (x1, y1) and (x2, y2) are any two pixels in the image, f(x1, y1) and f(x2, y2) are the gray values at those pixel positions, and i, j are gray levels.
ENT = -Σi Σj P(i, j) · ln P(i, j)    (3)
CON = Σi Σj (i - j)² · P(i, j)    (4)
where ENT denotes entropy, CON denotes contrast, the sums run over the L gray levels i, j = 0, …, L - 1 (L = 256 here, because each band of the input image contains 256 gray levels), and P(i, j) is the element in row i and column j of the gray level co-occurrence matrix. The texture feature images of all bands are obtained in turn in this way.
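For concreteness, the co-occurrence counting of formula (2) and the entropy and contrast of formulas (3) and (4) can be sketched in NumPy for horizontally adjacent pixel pairs (the single-offset choice and the toy patches are assumptions of this sketch, not the embodiment's exact GLCM configuration):

```python
import numpy as np

def glcm(img, levels=256):
    """Formula (2): co-occurrence frequencies of horizontally adjacent pixels."""
    P = np.zeros((levels, levels), dtype=np.float64)
    np.add.at(P, (img[:, :-1].ravel(), img[:, 1:].ravel()), 1)  # count pairs
    return P / P.sum()                                          # to frequencies

def entropy(P):
    """Formula (3): ENT = -sum P(i, j) * ln P(i, j), over nonzero entries."""
    nz = P[P > 0]
    return float(-(nz * np.log(nz)).sum())

def contrast(P):
    """Formula (4): CON = sum (i - j)^2 * P(i, j)."""
    i, j = np.meshgrid(np.arange(P.shape[0]), np.arange(P.shape[1]), indexing="ij")
    return float(((i - j) ** 2 * P).sum())

flat = np.zeros((4, 4), dtype=int)                    # textureless patch
checker = (np.indices((4, 4)).sum(axis=0) % 2) * 255  # maximally contrasting patch
```

A textureless patch gives zero entropy and contrast, while the checkerboard gives the maximum contrast of 255², which is why these two statistics separate smooth roofs from cluttered construction surfaces.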
c) For buildings at the construction stage, geometric corner features are extracted. The original image is enhanced with formula (1), image edges are extracted with the Canny operator, and corner features are then extracted from the binary edge image with the Harris algorithm.
The present embodiment adopts the Harris corner detection algorithm shown in formula (5):
E(u, v) = Σ(x, y) w(x, y) · [I(x + u, y + v) - I(x, y)]²    (5)
where x is the pixel column value, y is the pixel row value, u is the column offset, v is the row offset, E(u, v) is the gradient of gray-value variation, w(x, y) is the window function, I(x, y) is the original image gray value, I(x + u, y + v) is the translated image gray value, (u, v) is the offset of the window, and (x, y) runs over the pixel positions within the window. With Ix and Iy the first-order gray gradients obtained by convolving the image, a first-order Taylor expansion gives:
E(u, v) ≈ Σ(x, y) w(x, y) · (u · Ix + v · Iy)²
E(u, v) ≈ (u, v) · M · (u, v)ᵀ
M is the partial derivative matrix:
M = Σ(x, y) w(x, y) · [ Ix²  Ix·Iy ; Ix·Iy  Iy² ]
the window function is shown in equation (6) using a binary gaussian function:
Figure BDA0002268726500000063
where G is a binary Gaussian function and σ is the Gaussian filter width, this example takes σ to be 1.
The corner response of each pixel is computed with formula (7):
R = A·B - C² - k·(A + B)²    (7)
where R is the corner response function; A, B and C are the window-summed elements of M, i.e. A = Σ w·Ix², B = Σ w·Iy² and C = Σ w·Ix·Iy; and k is a constant, taken as 0.05 in this example.
A pixel is taken as a corner when the value of its corner response function R exceeds a threshold; the threshold is set according to the specific image and is set to 120 in this example. The geometric corner features are fused with the original image through the ENVI image sharpening/GS (Gram-Schmidt) menu to obtain the construction-stage building feature image.
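As a rough, self-contained illustration of the Harris response of formula (7), the NumPy sketch below substitutes a uniform 3 × 3 window for the Gaussian w(x, y) of formula (6) and uses central-difference gradients; it is not the ENVI workflow of the embodiment:

```python
import numpy as np

def harris_response(img, k=0.05, win=3):
    """Per-pixel corner response R = A*B - C^2 - k*(A + B)^2.

    A, B, C are window sums of Ix^2, Iy^2 and Ix*Iy (the entries of M);
    a uniform window stands in here for the Gaussian w(x, y) of formula (6).
    """
    Iy, Ix = np.gradient(img.astype(np.float64))   # central-difference gradients
    A, B, C = Ix * Ix, Iy * Iy, Ix * Iy
    pad = win // 2

    def window_sum(F):
        Fp = np.pad(F, pad)
        out = np.zeros_like(F)
        for dy in range(win):
            for dx in range(win):
                out += Fp[dy:dy + F.shape[0], dx:dx + F.shape[1]]
        return out

    A, B, C = window_sum(A), window_sum(B), window_sum(C)
    return A * B - C * C - k * (A + B) ** 2

# Bright square on a dark background: R peaks at the square's corners,
# is negative along its straight edges, and is zero in flat regions.
img = np.zeros((32, 32))
img[8:24, 8:24] = 255.0
R = harris_response(img)
```

Thresholding R, as the embodiment does with its threshold of 120, then keeps only the corner pixels.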
(4) Sample set creation and partitioning: the feature-extracted images are divided into the four construction-progress sets (foundation, middle construction, capping and construction stages), the boundaries of buildings at each stage are drawn by visual interpretation with the ENVI software ROI tool, and feature image sample sets with label data are created for each construction stage, as shown in FIG. 2. For each construction stage, an OpenCV tool randomly clips the sample set and label data into N sample patches of size 256 × 256, with N set to 10000 here. Preferably, this embodiment randomly divides each sample set into a training set, a validation set and a test set at a ratio of 3:1:1.
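The random clipping and 3:1:1 partition can be sketched as follows (a small n replaces the embodiment's N = 10000, and the blank scene and mask are placeholders of this sketch):

```python
import numpy as np

def make_patches(image, label, n=10, size=256, seed=0):
    """Randomly clip n (size x size) sample/label patch pairs from one scene."""
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    pairs = []
    for _ in range(n):
        y = int(rng.integers(0, h - size + 1))
        x = int(rng.integers(0, w - size + 1))
        pairs.append((image[y:y + size, x:x + size],
                      label[y:y + size, x:x + size]))
    return pairs

def split_3_1_1(samples, seed=0):
    """Shuffle and divide samples into train/validation/test sets at 3:1:1."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(samples))
    n_train, n_val = 3 * len(samples) // 5, len(samples) // 5
    train = [samples[i] for i in idx[:n_train]]
    val = [samples[i] for i in idx[n_train:n_train + n_val]]
    test = [samples[i] for i in idx[n_train + n_val:]]
    return train, val, test

scene = np.zeros((1024, 1024), dtype=np.uint8)   # placeholder feature image
mask = np.zeros((1024, 1024), dtype=np.uint8)    # placeholder label image
samples = make_patches(scene, mask, n=10)
train, val, test = split_3_1_1(samples)
```

The same partition is repeated independently for each of the four construction-stage sample sets.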
(5) Building deep learning sample database establishment: an image database is established with a raster data structure, a spatial database engine (ArcSDE) and a database management system (SQL Server); it stores the feature images of the foundation, middle construction, capping and construction stages and the deep learning training, validation and test sets.
(6) Designing the network model and parameters: the embodiment adopts a SegNet network model; the learning framework is TensorFlow; the input layer takes 256 × 256 samples and the output layer uses a Soft-max classifier. The network model and parameters are designed according to the image characteristics of the urban building foundation, middle construction, capping and construction stages.
Fig. 3 is a diagram of the foundation-stage deep convolutional network model, fig. 4 of the middle-construction- and capping-stage model, and fig. 5 of the construction-stage model.
In figs. 3-5, Conv a-b (e.g., Conv 5-64) denotes a convolutional layer with kernels of size a × a and b kernels, and (× n) indicates that the network has n such layers at that point; Maxpool2 denotes a pooling layer with a 2 × 2 pooling kernel; FC-c denotes a fully connected layer with c output nodes, which in this example are 4096, 2048 and 6 respectively.
As shown in figs. 3-5, the foundation-stage convolution kernel preferably takes size 5 × 5, while the middle construction, capping and construction stages adopt 3 × 3 kernels because they need finer local receptive fields and higher computational efficiency. Preferably, the max-pooling kernel size is 2 × 2 with a sliding step of 1 pixel.
As shown in figs. 3-5, the deep convolutional network preferably has 35-40 layers: the foundation and construction stages use 20 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer; to identify texture features more accurately, the middle-construction- and capping-stage networks preferably use 25 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer each.
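The spatial sizes implied by these kernel choices can be checked with the standard output-size formula; the sketch below assumes 'same' padding for the convolutions and no padding for the stride-1 pooling, which the patent does not state explicitly:

```python
def out_size(n, kernel, stride=1, pad=0):
    """Spatial output size of a convolution/pooling layer:
    floor((n + 2*pad - kernel) / stride) + 1."""
    return (n + 2 * pad - kernel) // stride + 1

# A 3 x 3 convolution with pad 1 ('same') keeps a 256 x 256 input unchanged;
# a 2 x 2 max pool with sliding step 1 and no padding shrinks it by one pixel.
after_conv3 = out_size(256, kernel=3, stride=1, pad=1)
after_pool = out_size(after_conv3, kernel=2, stride=1, pad=0)
# The 5 x 5 foundation-stage kernel needs pad 2 to preserve the input size.
after_conv5 = out_size(256, kernel=5, stride=1, pad=2)
```

Tracing every layer this way confirms that the 256 × 256 input samples remain compatible with the fully connected layers at the end of each network.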
The activation function is the ReLU function:
f(x) = max(0, x)    (8)
y = f(x)    (9)
where x represents the network input signal, y represents the activation output signal, and f(x) represents the activation function applied to the network input signal x. The loss function is calculated with the cross-entropy function:
L = -(1/n) · Σi [ yi · ln(ai) + (1 - yi) · ln(1 - ai) ]    (10)
where L is the loss function, n is the number of training samples, yi is the true value of the i-th pixel and ai is the predicted value.
The feature images are normalized to (0, 1) to avoid non-convergence of the training process caused by differences among training samples. Training uses stochastic gradient descent (SGD) to continuously adjust the network weight parameters, minimizing the loss function L through back propagation.
The initial learning rate, batch size and training period are then set; in this embodiment, the initial learning rate is 0.001, the batch size of a single training pass is 64, and the training period is 8 epochs.
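To make the (0, 1) normalization, cross-entropy loss and SGD weight updates concrete without a deep learning framework, the toy sketch below trains a logistic classifier in NumPy; the data, learning rate and epoch count are illustrative and deliberately differ from the embodiment's SegNet settings:

```python
import numpy as np

def minmax_normalize(x):
    """Scale feature values into (0, 1), as the text prescribes for stability."""
    x = x.astype(np.float64)
    return (x - x.min()) / (x.max() - x.min() + 1e-12)

rng = np.random.default_rng(0)
X = minmax_normalize(rng.integers(0, 256, size=(200, 2)))
y = (X[:, 0] + X[:, 1] > 1.0).astype(np.float64)       # toy binary labels

def cross_entropy(w, b):
    """Binary cross-entropy averaged over all samples."""
    a = 1.0 / (1.0 + np.exp(-(X @ w + b)))             # predictions a_i
    return float(-np.mean(y * np.log(a + 1e-12) + (1 - y) * np.log(1 - a + 1e-12)))

w, b = np.zeros(2), 0.0
lr, batch_size, epochs = 0.5, 64, 200                  # illustrative values only
loss_before = cross_entropy(w, b)
for _ in range(epochs):
    order = rng.permutation(len(X))                    # SGD: shuffle each epoch
    for start in range(0, len(X), batch_size):
        B = order[start:start + batch_size]
        a = 1.0 / (1.0 + np.exp(-(X[B] @ w + b)))
        grad = a - y[B]                                # dL/dz for cross entropy
        w -= lr * X[B].T @ grad / len(B)               # weight update
        b -= lr * grad.mean()                          # bias update
loss_after = cross_entropy(w, b)
```

The loss falls from its initial value as the mini-batch updates proceed, which is the same minimization the SegNet training performs at far larger scale through back propagation.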
(7) Training the network and obtaining the optimal network model: the network is trained with the sample sets and the network parameters are adjusted to find the optimal model for each construction stage. Training the convolutional network: sample data of the foundation, middle construction, capping and construction stages are input separately; convolution and max pooling are performed in the encoder, followed by upsampling in the decoder; and the convolution kernel size and the numbers of convolutional, pooling and fully connected layers are adjusted. Pixels are then sent to the Soft-max classifier for type identification, an error test curve is output, and the fitted model with the minimum error is saved for each construction stage.
(8) Calculating the precision and saving the trained network: the feature images extracted for the foundation, middle construction, capping and construction stages are classified in turn with the optimal network model for each stage combined with the Soft-max classifier, and the classification precision is then evaluated. Among the many possible precision indices, the method calculates three for the foundation-stage classification result: accuracy, recall and F value.
The precision evaluation of the foundation-stage feature image classification is described as the reference; the calculation methods for the middle construction, capping and construction stages are the same.
The foundation-stage feature image pixels are divided into two classes: foundation-stage building pixels and non-foundation-stage pixels. The identification accuracy of foundation-stage buildings is calculated as:
P = Nt / N1    (11)
where P is the accuracy, Nt is the number of foundation-stage building pixels correctly classified in the foundation-stage feature image, and N1 is the total number of pixels classified as foundation-stage buildings.
Recall ratio calculation formula:
R = Nt / N    (12)
where R is the recall, Nt is the number of correctly classified foundation-stage building pixels, and N is the total number of actual foundation-stage building pixels in the image. N is determined by visually interpreting the boundary outlines of foundation-stage buildings and counting the pixels inside those outlines with ENVI statistics.
F value: the harmonic mean of accuracy and recall, calculated as:
F = 2·P·R / (P + R)    (13)
where P is the accuracy and R is the recall.
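The accuracy, recall and F value defined above can be computed directly on binary pixel masks; the small masks below are invented purely for illustration:

```python
import numpy as np

def precision_recall_f(pred, truth):
    """Accuracy P = Nt/N1, recall R = Nt/N, F = 2PR/(P + R) on boolean masks."""
    n_t = int(np.logical_and(pred, truth).sum())  # correctly classified pixels
    n_1 = int(pred.sum())                         # pixels classified as building
    n = int(truth.sum())                          # actual building pixels
    p, r = n_t / n_1, n_t / n
    return p, r, 2 * p * r / (p + r)

truth = np.zeros((10, 10), dtype=bool)
truth[2:6, 2:6] = True            # 16 actual foundation-stage pixels
pred = np.zeros((10, 10), dtype=bool)
pred[2:6, 2:8] = True             # 24 predicted pixels, 16 of them correct
p, r, f = precision_recall_f(pred, truth)
```

Here all true pixels are found (recall 1.0) but a third of the predictions are false alarms, so the F value sits between the two at 0.8.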
In judging whether the precision reaches the threshold set by the user, it is preferable that the F value exceed 80% for the foundation stage, 85% for the middle construction and capping stages, and 90% for the construction stage; when it does, the trained network for that construction stage is saved, otherwise the method returns to step (7) to retrain the network.
(9) Extracting the construction position and the boundary contour of the building: and carrying out convolutional network classification and identification on the urban image by using building samples in different construction stages and the trained networks in different construction stages, and extracting the spatial positions and boundary outlines of the urban buildings in different construction stages.
(10) Counting the number and area under construction: with the GIS statistical analysis tools of ArcGIS, the number and total area of urban buildings at the different construction stages (foundation, middle construction, capping and construction) are counted.
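The embodiment performs these statistics in ArcGIS; as a self-contained stand-in, the sketch below counts 4-connected building regions in a binary classification mask and totals their area at the embodiment's 0.8 m pixel size (the BFS labeling and the toy mask are assumptions of this sketch, not the GIS workflow):

```python
import numpy as np
from collections import deque

def count_and_area(mask, pixel_size_m=0.8):
    """Count 4-connected regions in a boolean mask; return (count, area in m^2)."""
    h, w = mask.shape
    seen = np.zeros((h, w), dtype=bool)
    count = 0
    for sy in range(h):
        for sx in range(w):
            if mask[sy, sx] and not seen[sy, sx]:
                count += 1                       # new building region found
                seen[sy, sx] = True
                q = deque([(sy, sx)])
                while q:                         # BFS flood fill of the region
                    cy, cx = q.popleft()
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if 0 <= ny < h and 0 <= nx < w and mask[ny, nx] and not seen[ny, nx]:
                            seen[ny, nx] = True
                            q.append((ny, nx))
    total_area = float(mask.sum()) * pixel_size_m ** 2
    return count, total_area

mask = np.zeros((20, 20), dtype=bool)
mask[1:4, 1:4] = True        # building 1: 9 pixels
mask[10:14, 10:15] = True    # building 2: 20 pixels
n_buildings, area_m2 = count_and_area(mask)
```

Run once per construction-stage classification result, this yields the per-stage building counts and total areas the statistics step reports.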
An embodiment of the invention also provides a remote sensing identification system for building construction progress based on DCNN and cloud computing, shown in Figure 6.
The DCNN and cloud computing based remote sensing identification system for building construction progress comprises: a spatial database S1, a data processing module S2, a GIS cloud service platform S3, a GIS convolutional network cloud computing module S4, a GIS statistical analysis cloud computing module S5 and a construction progress map output module S6. The spatial database is bidirectionally connected with the data processing module, and the data processing module is connected with the cloud service platform through a wired network. The GIS convolutional network cloud computing module, the GIS statistical analysis cloud computing module and the construction progress map output module are each bidirectionally connected with the GIS cloud service platform.
The spatial database S1 comprises a data input module S11 and a data storage module S12. The data input module imports image data obtained by high-resolution remote sensing satellites and aerial photogrammetry together with the deep learning sample data. The data storage module stores, queries, adds and deletes high-resolution image information, and stores the deep learning building samples of the image sets of the foundation, mid-build, capping and completed stages. S1 provides the input data to the data processing module S2.
The data processing module S2 processes the input data provided by S1 and comprises an image feature extraction module S21 and a sample data processing module S22. The image feature extraction module S21 extracts image features of buildings at the different construction stages, and the sample data processing module S22 creates and partitions the sample sets. Through the GIS cloud service platform S3, S2 provides the images to be identified and the sample data to the GIS convolutional network cloud computing module S4.
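The entropy and contrast texture features that S21 computes for the mid-build and capping stages can be sketched from a grey-level co-occurrence matrix; this numpy version, with a single (0, 1) pixel offset and a small assumed number of grey levels, is illustrative rather than the patent's actual implementation:

```python
import numpy as np

def glcm_entropy_contrast(img, levels=8):
    """Entropy and contrast from a grey-level co-occurrence matrix (GLCM)
    with horizontal offset (0, 1), for a single-band image quantised to
    `levels` grey levels. A minimal sketch of the texture features used
    for the mid-build and capping stages; offset and level count are
    assumptions, not patent parameters."""
    # Quantise the image into `levels` grey-level bins.
    q = np.floor(img.astype(float) / (img.max() + 1) * levels).astype(int)
    glcm = np.zeros((levels, levels))
    # Accumulate counts of horizontal neighbour grey-level pairs.
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()                       # joint probabilities
    nz = p[p > 0]
    entropy = -(nz * np.log2(nz)).sum()         # texture randomness
    i, j = np.indices(p.shape)
    contrast = (p * (i - j) ** 2).sum()         # local grey-level variation
    return entropy, contrast
```

A homogeneous roof patch yields low entropy and contrast, while scaffolding and exposed floor slabs at the mid-build stage yield high values, which is what makes these features discriminative.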
The GIS cloud service platform S3, the GIS convolutional network cloud computing module S4, the GIS statistical analysis cloud computing module S5 and the construction progress map output module S6 together form the cloud system.
The GIS cloud service platform S3 is based on the data and processing service interfaces provided by the Platform-as-a-Service (PaaS) layer of the cloud computing platform; an ArcGIS 10.7 intelligent decision and data science cloud platform is adopted. The data processing module S2 uploads the extracted images to be identified and the sample data to the cloud platform through these interfaces.
The GIS convolutional network cloud computing module S4 obtains the images to be identified and the sample data through the cloud platform's shared data service interface. It performs cloud network model design, network training and optimal network model acquisition. It accesses the ArcGIS Notebook Server data science environment of ArcGIS 10.7 through the cloud service platform S3 and calls the TensorFlow machine learning library through the deep learning shared service interface to perform the convolutional network computations.
The GIS statistical analysis cloud computing module S5 obtains the extracted spatial positions and boundary contours of buildings at the different construction stages through the cloud platform's shared data service interface. It rapidly counts the number and area of buildings at each construction stage. It accesses the Insights for ArcGIS web application of ArcGIS 10.7 for spatial data mining and visualization through the cloud service platform S3, calls the statistical analysis library through the statistical computation shared service interface, and, combined with the vector map API, counts the number and area of buildings at each construction stage.
The construction progress map output module S6 obtains the extracted spatial positions, boundary contours and number and area information of buildings at the different construction stages through the cloud platform's shared data service interface. It outputs the distribution, number and area of buildings at the four construction stages to paper and electronic media. It accesses the ArcGIS Pro online mapping module of ArcGIS 10.7 through the cloud service platform S3, calls the mapping function library through the online mapping service interface, and, combined with the vector map API, outputs the construction progress map to paper and electronic media to display the construction progress information.
Paper output media include color printers and plotters; electronic media display the construction progress information on a computer screen as vector thematic maps and raster image maps, and also export it to image files such as JPG and TIFF.
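The corner-feature step used for completed-stage buildings (grey-level enhancement followed by Canny and Harris, per the method) can be sketched as follows; the patent relies on standard tooling, so this pure-numpy Harris corner response, with invented parameter defaults, is only a sketch of the idea:

```python
import numpy as np

def harris_response(img, k=0.05, win=3):
    """Minimal Harris corner response for a single-band image: large positive
    values mark building corners, near-zero values flat areas, negative values
    edges. `k` and `win` are assumed defaults, not patent parameters."""
    img = img.astype(float)
    iy, ix = np.gradient(img)                  # image gradients (rows, cols)
    ixx, iyy, ixy = ix * ix, iy * iy, ix * iy  # structure-tensor products

    def box(a):
        """Sum `a` over a win x win neighbourhood (edge-padded box filter)."""
        pad = win // 2
        ap = np.pad(a, pad, mode="edge")
        out = np.zeros_like(a)
        for dy in range(win):
            for dx in range(win):
                out += ap[dy:dy + a.shape[0], dx:dx + a.shape[1]]
        return out

    sxx, syy, sxy = box(ixx), box(iyy), box(ixy)
    det = sxx * syy - sxy * sxy
    trace = sxx + syy
    return det - k * trace * trace             # Harris corner measure
```

Thresholding this response and fusing the resulting corner map with the original spectral bands gives the kind of combined feature image described for the completed stage.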

Claims (9)

1. A remote sensing identification method for urban building construction progress is characterized by comprising the following steps:
(1) obtaining urban high-resolution remote sensing images: urban buildings are observed by high-resolution remote sensing satellites and aircraft to obtain high-spatial-resolution remote sensing images from which the building construction progress is to be judged;
(2) image preprocessing: atmospheric correction and radiometric correction preprocessing are applied to the acquired high-resolution images;
(3) dividing the construction process into stages and extracting image features: the construction process is divided into four stages, namely the foundation stage, the mid-build stage, the capping stage and the completed stage; according to the distinguishing characteristics of buildings at each stage, foundation-stage buildings are extracted using image spectral-transformation features; mid-build and capping-stage buildings are extracted using texture features; and completed-stage buildings are extracted by fusing combined corner features with spectral features;
(4) sample set creation and partitioning: the boundaries of buildings at the different construction stages are delineated through visual interpretation with a region-of-interest tool, and image sample sets and label data are created for each construction stage; for each stage, the image samples and label data are divided into a training set, a validation set and a test set using an OpenCV tool;
(5) building a deep learning sample database: a deep learning sample database is established to store the feature images and the deep learning training, validation and test sets of the foundation, mid-build, capping and completed stages;
(6) designing a network model and parameters: designing depth network models and parameters for identifying and extracting buildings at different construction stages according to main image characteristics of the buildings at different construction stages;
(7) training a network and obtaining an optimal network model: training a network by using the sample set and adjusting network parameters to find out the optimal network models in different construction stages;
(8) calculating the precision and storing the trained network: classifying the feature images in different construction stages by using the optimal network models in different construction stages, and calculating the precision; judging whether the precision reaches a threshold value set by a user, if so, saving the trained network corresponding to the construction stage, otherwise, returning to the step (7);
(9) extracting the construction position and the boundary contour of the building: carrying out convolutional network classification and identification on the urban image by using building samples at different construction stages and networks trained at different construction stages, and extracting the spatial positions and boundary contours of the buildings at different construction stages;
(10) counting the number and area under construction: counting the number and total area of buildings at each construction stage of the city.
2. The remote sensing identification method of urban building construction progress according to claim 1, wherein in step (3) the image features of the different construction stages are extracted as follows: for the foundation stage, the image spectral transformation uses gray-level linear transformation and principal component analysis to obtain the foundation-stage feature image; for buildings at the mid-build and capping stages, image entropy and contrast texture feature images are extracted on the basis of gray-level linear transformation and gray-level co-occurrence matrix calculation; for buildings at the completed stage, the original image is enhanced by gray-level linear transformation, corner features are then extracted with a Canny operator and the Harris algorithm, and the corner features are fused with the original spectral image to obtain the completed-stage building feature image.
3. The remote sensing identification method of urban building construction progress according to claim 1, wherein the building deep learning sample database of step (5) uses a raster data structure; an image database is built with a spatial database engine and a database management system, and the feature images and the deep learning training, validation and test sets of the foundation, mid-build, capping and completed stages are stored in the deep learning sample database.
4. The remote sensing identification method of urban building construction progress according to claim 1, wherein the deep network models and parameters for identifying and extracting buildings at the different construction stages in step (6) are as follows: the foundation-stage convolution kernels are 5 × 5; the mid-build, capping and completed-stage convolution kernels are 3 × 3; the max-pooling kernels are 2 × 2 with a sliding stride of 1 pixel; the deep convolutional networks have 35 to 40 layers, where the foundation-stage and completed-stage networks use 20 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer each, and the mid-build and capping-stage networks use 25 convolutional layers, 10 pooling layers, 3 fully connected layers, and 1 input and 1 output layer each.
5. The remote sensing identification method of urban building construction progress according to claim 1, wherein the accuracy is judged by the F value when deciding whether it reaches the user-defined threshold; the F-value threshold is 80% for the foundation stage, 85% for the mid-build and capping stages, and 90% for the completed stage; if the extraction accuracy is higher than the F-value threshold of the corresponding stage, the trained network of that construction stage is saved, otherwise the network is retrained in step (7).
6. A remote sensing identification system for urban building construction progress, characterized in that the system comprises: a spatial database, a data processing module, a GIS cloud service platform, a GIS convolutional network cloud computing module, a GIS statistical analysis cloud computing module and a construction progress map output module; the spatial database is bidirectionally connected with the data processing module, and the data processing module is connected with the cloud service platform through a wired network; the GIS convolutional network cloud computing module, the GIS statistical analysis cloud computing module and the construction progress map output module are each bidirectionally connected with the GIS cloud service platform.
7. The remote sensing identification system of urban building construction progress according to claim 6, wherein the data input module of the spatial database imports deep learning sample data, and the data storage module stores the deep learning building samples of the image sets of the foundation, mid-build, capping and completed stages; the data processing module comprises an image feature extraction module and a sample data processing module; the image feature extraction module extracts image features of buildings at the different construction stages, and the sample data processing module creates and partitions the sample sets.
8. The remote sensing identification system of urban building construction progress according to claim 6, wherein the GIS cloud service platform, the GIS convolutional network cloud computing module, the GIS statistical analysis cloud computing module and the construction progress map output module together form the cloud system; the GIS cloud service platform is based on the data and processing service interfaces provided by the Platform-as-a-Service (PaaS) layer of the cloud computing platform.
9. The remote sensing identification system of urban building construction progress according to claim 6, wherein the GIS convolutional network cloud computing module is used for convolutional network and parameter design, network training and optimal network model acquisition; the GIS statistical analysis cloud computing module rapidly counts the number and area of buildings at the different construction stages; and the construction progress map output module outputs the distribution, area and number of buildings at the four construction stages to paper and electronic media.
CN201911097311.3A 2019-11-11 2019-11-11 Remote sensing identification method and system for urban building construction progress Active CN110929607B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911097311.3A CN110929607B (en) 2019-11-11 2019-11-11 Remote sensing identification method and system for urban building construction progress

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911097311.3A CN110929607B (en) 2019-11-11 2019-11-11 Remote sensing identification method and system for urban building construction progress

Publications (2)

Publication Number Publication Date
CN110929607A true CN110929607A (en) 2020-03-27
CN110929607B CN110929607B (en) 2022-05-31

Family

ID=69852671

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911097311.3A Active CN110929607B (en) 2019-11-11 2019-11-11 Remote sensing identification method and system for urban building construction progress

Country Status (1)

Country Link
CN (1) CN110929607B (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150006117A1 (en) * 2013-07-01 2015-01-01 Here Global B.V. Learning Synthetic Models for Roof Style Classification Using Point Clouds
US20180373932A1 (en) * 2016-12-30 2018-12-27 International Business Machines Corporation Method and system for crop recognition and boundary delineation
CN109598202A (en) * 2018-11-05 2019-04-09 中国科学院遥感与数字地球研究所 A kind of object-based satellite image multi objective built-up areas extraction method
WO2019109262A1 (en) * 2017-12-06 2019-06-13 Beijing Didi Infinity Technology And Development Co., Ltd. Systems and methods for determining new roads on a map
CN110243354A (en) * 2019-07-04 2019-09-17 桂林理工大学 A kind of city illegal building object dynamic monitoring method and system


Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
BAISHOU LI et al.: "Clustering Accuracy Analysis of Building Area in High Spatial Resolution Remote", 2017 2ND INTERNATIONAL CONFERENCE ON FRONTIERS OF SENSORS TECHNOLOGIES *
SUN NING: "Research on building object recognition methods for high-spatial-resolution remote sensing imagery", China Master's Theses Full-Text Database, Engineering Science and Technology II *
BAISHOU LI et al.: "A right-angle building extraction and plotting method for high-resolution optical imagery under MBR constraints", Bulletin of Surveying and Mapping *
LIN YUAN: "Research on urban building recognition technology based on aerial stereo multi-imagery", China Master's Theses Full-Text Database, Information Science and Technology *
WANG LIZHONG et al.: "A deep-learning-based method for recognizing township buildings in high-resolution remote sensing imagery", E-science Technology & Application *

Cited By (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111507255A (en) * 2020-04-16 2020-08-07 中铁二院工程集团有限责任公司 Surface ground object information acquisition method and system based on cloud technology
CN111507255B (en) * 2020-04-16 2023-07-28 中铁二院工程集团有限责任公司 Cloud technology-based surface feature information acquisition method and system
CN111625664B (en) * 2020-05-12 2022-08-16 贵州国卫信安科技有限公司 Network practice teaching operation progress checking method based on image contrast
CN111625664A (en) * 2020-05-12 2020-09-04 贵州国卫信安科技有限公司 Network practice teaching operation progress checking method based on image contrast
CN111985355A (en) * 2020-08-01 2020-11-24 桂林理工大学 Remote sensing building earthquake damage assessment method and system based on deep learning and cloud computing
CN112036336A (en) * 2020-09-03 2020-12-04 金钱猫科技股份有限公司 Engineering monitoring method and system based on AI intelligent image analysis
CN112036336B (en) * 2020-09-03 2024-05-03 金钱猫科技股份有限公司 Engineering monitoring method and system based on AI intelligent image analysis
CN112800895A (en) * 2021-01-18 2021-05-14 内蒙古电力(集团)有限责任公司乌兰察布电业局 Method for identifying building based on deep learning algorithm
CN112800895B (en) * 2021-01-18 2024-02-27 内蒙古电力(集团)有限责任公司乌兰察布电业局 Method for identifying building based on deep learning algorithm
CN112767469A (en) * 2021-01-25 2021-05-07 东南大学 Highly intelligent acquisition method for urban mass buildings
CN112767469B (en) * 2021-01-25 2021-09-21 东南大学 Highly intelligent acquisition method for urban mass buildings
CN113537142A (en) * 2021-08-03 2021-10-22 广东电网有限责任公司 Monitoring method, device and system for construction progress of capital construction project and storage medium
CN113610013A (en) * 2021-08-10 2021-11-05 四川易方智慧科技有限公司 Method for extracting building outline based on RGB (Red Green blue) wave bands of high-definition remote sensing image
CN113963268A (en) * 2021-10-19 2022-01-21 成都四象纵横遥感科技有限公司 Construction state change monitoring method for large-scale engineering construction
CN113963268B (en) * 2021-10-19 2022-06-28 成都四象纵横遥感科技有限公司 Construction state change monitoring method for large-scale engineering construction
CN115100536A (en) * 2022-06-01 2022-09-23 中科星睿科技(北京)有限公司 Building identification method, building identification device, electronic equipment and computer readable medium
WO2024006471A3 (en) * 2022-06-30 2024-03-14 Zonda Data, Inc. Construction stage detection using satellite or aerial imagery
WO2024006474A1 (en) * 2022-06-30 2024-01-04 Zonda Data, Inc. New construction detection using satellite or aerial imagery
CN115761362B (en) * 2022-11-25 2024-02-06 杭州新中大科技股份有限公司 Construction stage intelligent recognition model, method and device based on feature fusion
CN115761362A (en) * 2022-11-25 2023-03-07 杭州新中大科技股份有限公司 Intelligent construction stage identification model, method and device based on feature fusion
CN115775116A (en) * 2023-02-13 2023-03-10 华设设计集团浙江工程设计有限公司 BIM-based road and bridge engineering management method and system

Also Published As

Publication number Publication date
CN110929607B (en) 2022-05-31

Similar Documents

Publication Publication Date Title
CN110929607B (en) Remote sensing identification method and system for urban building construction progress
CN110136170B (en) Remote sensing image building change detection method based on convolutional neural network
CN109446992B (en) Remote sensing image building extraction method and system based on deep learning, storage medium and electronic equipment
CN108564109B (en) Remote sensing image target detection method based on deep learning
CN107871119B (en) Target detection method based on target space knowledge and two-stage prediction learning
CN111666313B (en) Correlation construction and multi-user data matching method based on multi-source heterogeneous remote sensing data
CN109493320B (en) Remote sensing image road extraction method and system based on deep learning, storage medium and electronic equipment
CN108596055B (en) Airport target detection method of high-resolution remote sensing image under complex background
CN111222474A (en) Method for detecting small target of high-resolution image with any scale
CN110163213B (en) Remote sensing image segmentation method based on disparity map and multi-scale depth network model
CN107358260B (en) Multispectral image classification method based on surface wave CNN
CN108596108B (en) Aerial remote sensing image change detection method based on triple semantic relation learning
CN112183414A (en) Weak supervision remote sensing target detection method based on mixed hole convolution
Sisodia et al. A comparative analysis of remote sensing image classification techniques
CN109614942B (en) Forest disturbance long-time sequence monitoring method based on cloud computing platform
CN110826428A (en) Ship detection method in high-speed SAR image
CN106408030A (en) SAR image classification method based on middle lamella semantic attribute and convolution neural network
Zhang et al. Learning-based spatial–temporal superresolution mapping of forest cover with MODIS images
Peeters et al. Automated recognition of urban objects for morphological urban analysis
CN110992366A (en) Image semantic segmentation method and device and storage medium
CN113610070A (en) Landslide disaster identification method based on multi-source data fusion
CN112395958A (en) Remote sensing image small target detection method based on four-scale depth and shallow layer feature fusion
CN113936214A (en) Karst wetland vegetation community classification method based on fusion of aerospace remote sensing images
CN112084941A (en) Target detection and identification method based on remote sensing image
CN114898089B (en) Functional area extraction and classification method fusing high-resolution images and POI data

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
EE01 Entry into force of recordation of patent licensing contract

Application publication date: 20200327

Assignee: GUILIN JINYIXING TECHNOLOGY DEVELOPMENT Co.,Ltd.

Assignor: GUILIN University OF TECHNOLOGY

Contract record no.: X2023980044509

Denomination of invention: A Remote Sensing Identification Method and System for Urban Building Construction Progress

Granted publication date: 20220531

License type: Common License

Record date: 20231030
