CN111798567A - Vegetation model auxiliary generation method and system based on aerial image and CIM


Info

Publication number
CN111798567A
CN111798567A (application CN202010668637.3A)
Authority
CN
China
Prior art keywords
vegetation
model
information
urban
aerial image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202010668637.3A
Other languages
Chinese (zh)
Inventor
陶润
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhengzhou Angda Information Technology Co ltd
Original Assignee
Zhengzhou Angda Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhengzhou Angda Information Technology Co ltd filed Critical Zhengzhou Angda Information Technology Co ltd
Priority to CN202010668637.3A priority Critical patent/CN111798567A/en
Publication of CN111798567A publication Critical patent/CN111798567A/en
Withdrawn legal-status Critical Current


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05: Geographic models
    • G06T7/00: Image analysis
    • G06T7/0002: Inspection of images, e.g. flaw detection
    • G06T2200/00: Indexing scheme for image data processing or generation, in general
    • G06T2200/08: Indexing scheme involving all processing steps from image acquisition to 3D model generation
    • G06T2207/00: Indexing scheme for image analysis or image enhancement
    • G06T2207/10: Image acquisition modality
    • G06T2207/10032: Satellite or aerial image; Remote sensing
    • G06T2207/10048: Infrared image
    • G06T2207/20: Special algorithmic details
    • G06T2207/20081: Training; Learning
    • G06T2207/20084: Artificial neural networks [ANN]
    • G06T2207/30: Subject of image; Context of image processing
    • G06T2207/30181: Earth observation
    • G06T2207/30188: Vegetation; Agriculture


Abstract

The invention discloses a vegetation model auxiliary generation method and system based on aerial images and CIM. The method comprises the following steps: constructing a city information model of a three-dimensional digital space; obtaining an original Bayer array and a Bayer array containing near-infrared information; interpolating the original Bayer array; calculating the near-infrared band reflectance NIR of the aerial image; calculating the normalized vegetation index NDVI of the aerial image; combining the NDVI with the RGB data to form four-channel input data and obtaining a vegetation trunk vertex heat map with a vegetation extraction encoder and a vegetation extraction decoder; post-processing the heat map to obtain the vegetation trunk vertex coordinates; matching the vegetation model to the trunk vertex positions in the ground coordinate system of the city information model; and visualizing the city information model with WebGIS technology. The method and system enable fast, effective generation of regional vegetation models and improve the efficiency and precision of urban vegetation model generation.

Description

Vegetation model auxiliary generation method and system based on aerial image and CIM
Technical Field
The invention relates to the technical field of artificial intelligence, smart cities and CIM, in particular to a vegetation model auxiliary generation method and system based on aerial images and CIM.
Background
Global information technology is developing at an accelerating pace, its position in the national economy is increasingly prominent, and smart-city construction has important strategic significance for comprehensively improving national competitiveness.
The smart city grows out of a new generation of information technology, represented by the Internet of Things, cloud computing, the mobile internet and artificial intelligence, together with the open urban innovation ecology gradually bred in a knowledge-based social environment. Its essence is to integrate new-generation information technology with various communication terminals to realize intelligent management and operation of the city.
Oblique photography, the currently popular three-dimensional modeling method, measures with visible light: it is demanding on weather, requires long shooting times, yields huge data volumes, and the computation needed to extract image point clouds and generate the three-dimensional model is extremely large. Existing regional vegetation model generation, whether by manual modeling or oblique-photography modeling, consumes considerable time and labor, so rapid three-dimensional modeling methods need further improvement.
Disclosure of Invention
In view of the defects of the prior art, the invention aims to provide a vegetation model auxiliary generation method and system based on aerial images and CIM (city information model), realizing rapid and effective generation of regional vegetation models and improving the efficiency and precision of urban vegetation model generation.
A vegetation model auxiliary generation method based on aerial images and CIM comprises the following steps:
step 1, on the basis of three-dimensional urban space geographic information, overlapping BIM information of urban buildings and underground facilities and urban Internet of things information to construct an urban information model of a three-dimensional digital space;
step 2, using an unmanned aerial vehicle to capture full-coverage imagery of the urban ground, and obtaining, through an infrared filter switcher, an original Bayer array captured with an infrared cut-off filter and a Bayer array captured without the filter that contains near-infrared information;
step 3, carrying out interpolation processing on the original Bayer array to obtain the RGB value of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing the near-infrared information to obtain an aerial image near-infrared waveband reflection value NIR;
step 5, calculating the normalized vegetation index NDVI of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the normalized vegetation index NDVI of the aerial image and RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network together, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, extracting features of the input data by adopting a vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex heat map, wherein hot spots in the heat map represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex heat map to obtain the vegetation trunk vertex coordinates;
step 10, projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex of the vegetation trunks in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling the information exchange module to update the Web-side three-dimensional space model in real time so as to visualize the city information model.
The data labels are made as follows: mark the pixels of the vegetation trunk vertices in the aerial image; then convolve the marked trunk vertices with a Gaussian kernel to generate heat map data of the vegetation points as the data labels.
In step 6, the aerial image RGB data is normalized and then combined with the aerial image normalized vegetation index NDVI as the input data.
The loss function adopted by the training network in step 6 is as follows:
\[
L = -\frac{1}{N} \sum_{i,j}
\begin{cases}
(1 - P_{ij})^{\alpha} \log(P_{ij}), & y_{ij} = 1 \\
(1 - y_{ij})^{\beta} \, P_{ij}^{\alpha} \log(1 - P_{ij}), & \text{otherwise}
\end{cases}
\]
where P_{ij} denotes the score of a vegetation trunk vertex at position (i, j), y_{ij} is the corresponding pixel value in the ground-truth map, N denotes the number of key points in the ground-truth map, and α and β are hyper-parameters.
An aerial image and CIM-based vegetation model auxiliary generation system can execute the following steps:
step 1, on the basis of three-dimensional urban space geographic information, overlapping BIM information of urban buildings and underground facilities and urban Internet of things information to construct an urban information model of a three-dimensional digital space;
step 2, using an unmanned aerial vehicle to capture full-coverage imagery of the urban ground, and obtaining, through an infrared filter switcher, an original Bayer array captured with an infrared cut-off filter and a Bayer array captured without the filter that contains near-infrared information;
step 3, carrying out interpolation processing on the original Bayer array to obtain the RGB value of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing the near-infrared information to obtain an aerial image near-infrared waveband reflection value NIR;
step 5, calculating the normalized vegetation index NDVI of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the normalized vegetation index NDVI of the aerial image and RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network together, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, extracting features of the input data by adopting a vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex heat map, wherein hot spots in the heat map represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex heat map to obtain the vegetation trunk vertex coordinates;
step 10, projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex of the vegetation trunks in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling the information exchange module to update the Web-side three-dimensional space model in real time so as to visualize the city information model.
Compared with the prior art, the invention has the following beneficial effects:
1. The method combines aerial images with a vegetation trunk vertex extraction neural network; trained on a large number of samples, the network is robust and improves the precision of vegetation point extraction. Its structure is simple, vegetation point information can be obtained quickly, vegetation points are supplied to the regional vegetation model, and the generation efficiency of the regional vegetation model is improved.
2. The method adopts the normalized vegetation index and the RGB data to jointly train the vegetation trunk vertex extraction network, the normalized vegetation index can provide richer information for the vegetation trunk vertex extraction, and the neural network training efficiency and the accuracy of the vegetation trunk vertex extraction are improved.
3. According to the method, a city information model technology is combined, vegetation point information and a vegetation model obtained through three-dimensional modeling are combined, the vegetation model is matched into the city information model, and convenience is provided for smart city information integration; and the city information model is visualized by combining a WebGIS technology, so that the inquiry and supervision of a supervisor are facilitated.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
The invention provides a vegetation model auxiliary generation method and system based on aerial images and CIM. The method first obtains near-infrared and RGB information through aerial photography and image processing; then, by analogy with remote sensing, it generates the normalized difference vegetation index (NDVI), trains a convolutional neural network on the RGB and NDVI data, obtains the vegetation points of the aerial image and projects them into the CIM (city information model) to provide vegetation position information, and matches the vegetation model to the corresponding position in the city information model for visualization. FIG. 1 is a flow chart of the method of the present invention. The following description proceeds by way of specific examples.
Example 1:
the implementation provides a vegetation model auxiliary generation method based on aerial images and CIM.
The city construction should grasp the convenience brought by the technology, and by means of informationized technologies such as BIM and the like, a city exclusive CIM model is constructed, resources are continuously integrated, positioning is accurately found, a new path for smart city construction is explored, and the city service level and the service quality are improved.
The internal relationship between CIM and information exchange module is first constructed.
CIM is an organic complex which establishes a three-dimensional city space model and building information based on city information data and mainly comprises GIS data and BIM data of urban roads, buildings and infrastructure.
The information exchange module is a CIM-based data exchange platform and mainly realizes information exchange between the urban information model and an external interface, and the information exchange module comprises a three-dimensional urban space model, urban information, geographic information and camera perception information and can update the information content of the urban information model in real time along with the continuous progress of the construction progress of the smart city.
The CIM takes three-dimensional urban space geographic information as its base and overlays BIM information of urban buildings and underground facilities and urban Internet of Things information to construct a city information model of a three-dimensional digital space. Combined with WebGIS technology, the CIM displays the city scene on the Web, and the system can call the information exchange module to display the latest three-dimensional city space model and city information.
Vegetation point extraction is realized through aerial image processing and a convolutional neural network, and the vegetation points are rapidly projected onto the CIM ground; this provides the position information of the vegetation, and the vegetation model is placed according to the vegetation points, which is convenient to apply and low in cost.
The invention mainly aims at positioning vegetation, thereby providing the position information of urban vegetation.
The core of the auxiliary generation is that the method supplies the position-point information of the vegetation; the models can then be placed according to those vegetation point positions, which is fast and convenient.
Firstly, shooting the ground above a city by using an unmanned aerial vehicle. Since the ir-cut filter is an essential component of the image sensor, the ir-cut filter switch can be used to control the presence of the ir-cut filter, so that the original bayer array S1 and the bayer array S2 containing the near-infrared information can be obtained by shooting.
Here, there is an infrared cut filter, and the array obtained by imaging is S1; the array obtained by imaging without an infrared cut filter was S2.
The original bayer array S1 is a single channel, each pixel includes only a portion of the spectrum, and the RGB values for each pixel must be obtained by interpolation. Bayer array interpolation methods are well known and will not be described in detail here.
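As a concrete illustration, bilinear demosaicing, one standard Bayer interpolation method, can be sketched in NumPy. The RGGB layout and the mask-normalized averaging below are assumptions of this sketch, not details fixed by the patent.

```python
import numpy as np

def conv3(img, k):
    """3x3 'same' correlation with edge replication, pure NumPy."""
    p = np.pad(img, 1, mode="edge")
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):
        for dx in range(3):
            out += k[dy, dx] * p[dy:dy + h, dx:dx + w]
    return out

def demosaic_bilinear(bayer):
    """Bilinear demosaicing of an RGGB Bayer mosaic into an H x W x 3 RGB image.
    Each channel is the mask-weighted average of the nearest same-colour samples."""
    h, w = bayer.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask
    # Kernels that average the nearest same-colour neighbours.
    k_rb = np.array([[1., 2., 1.], [2., 4., 2.], [1., 2., 1.]])
    k_g = np.array([[0., 1., 0.], [1., 4., 1.], [0., 1., 0.]])
    def interp(mask, k):
        # Normalizing by the convolved mask keeps borders correct as well.
        return conv3(bayer * mask, k) / conv3(mask, k)
    return np.stack([interp(r_mask, k_rb),
                     interp(g_mask, k_g),
                     interp(b_mask, k_rb)], axis=-1)
```

A production system would more likely use an optimized library demosaicer; the sketch only shows the interpolation principle.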
The infrared cut filter filters out a range (the range is customizable) of infrared light, so near infrared information NIR can be obtained from S2-S1, namely the value of the S2 array minus the value of the S1 array.
The normalized vegetation index can effectively reflect the vegetation coverage, and the calculation formula is as follows:
NDVI=(NIR-R)/(NIR+R)
That is, NDVI is the ratio of the difference to the sum of the near-infrared band reflectance and the red band reflectance, where NIR is the near-infrared band reflectance and R is the red band reflectance.
By analogy with remote sensing, the normalized vegetation index of the aerial image is obtained from the extracted red (R) and near-infrared (NIR) information via the above formula.
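The two computations above (NIR = S2 - S1, then NDVI) can be written down directly; the non-negative clipping and the eps guard against division by zero are defensive choices of this sketch, not part of the patent's formulas.

```python
import numpy as np

def near_infrared(s1, s2):
    """NIR reflectance as the difference between the capture without the
    IR-cut filter (s2) and the capture with it (s1), clipped at zero."""
    return np.clip(s2.astype(np.float64) - s1.astype(np.float64), 0.0, None)

def ndvi(nir, red, eps=1e-8):
    """Normalized difference vegetation index: (NIR - R) / (NIR + R)."""
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / (nir + red + eps)
```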
Next, the data labels are made: the pixels of the vegetation trunk vertices (i.e., the trunk top or the canopy center vertex) are annotated in the aerial image; where the trees are dense, the vertices are annotated according to human experience. The annotated trunk vertices are then convolved with a Gaussian kernel to generate heat map data of the vegetation points. Specific details, such as how the Gaussian kernel size is selected, can be adjusted to the implementation scenario.
After the data is prepared, the aerial images and the label data are fed into the vegetation trunk vertex extraction network for training.
Details of training the vegetation trunk vertex extraction network are as follows: the images collected by the aerial camera are normalized, turning the picture matrix values into floating-point numbers in [0, 1] so the model converges better; when feeding the model, it is suggested to concatenate the NDVI channel to form four-channel input data, which yields a better extraction effect. The labels are normalized likewise.
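The label-making step, a Gaussian response centered at each annotated trunk vertex, can be sketched as below. Rendering the kernel analytically (rather than by explicit convolution) and taking the per-pixel maximum where kernels overlap are choices of this sketch, and sigma is a tunable assumption.

```python
import numpy as np

def gaussian_heatmap(shape, points, sigma=2.0):
    """Render annotated trunk-vertex points (row, col) as a heat map label by
    placing a unit-peak Gaussian at each point."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    heat = np.zeros((h, w))
    for (py, px) in points:
        g = np.exp(-((ys - py) ** 2 + (xs - px) ** 2) / (2.0 * sigma ** 2))
        heat = np.maximum(heat, g)  # keep the per-pixel max where kernels overlap
    return heat
```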
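A minimal sketch of the normalization and channel concatenation just described, assuming 8-bit RGB input (the function name is illustrative):

```python
import numpy as np

def make_network_input(rgb_u8, ndvi_map):
    """Scale 8-bit RGB into [0, 1] floats and append NDVI as a fourth channel."""
    rgb = rgb_u8.astype(np.float64) / 255.0
    return np.concatenate([rgb, ndvi_map[..., None]], axis=-1)
```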
The training data is fed into the network, and the vegetation extraction Encoder and Decoder are trained end to end. The Encoder extracts image features: its input is the normalized RGB + NDVI data and its output is a feature map. The Decoder up-samples the feature map and extracts features: its input is the feature map generated by the Encoder and its output is the vegetation trunk vertex heat map. Hot spots in the heat map represent the confidence of vegetation trunk vertices. The loss function is a heat map loss, given by:
\[
L = -\frac{1}{N} \sum_{i,j}
\begin{cases}
(1 - P_{ij})^{\alpha} \log(P_{ij}), & y_{ij} = 1 \\
(1 - y_{ij})^{\beta} \, P_{ij}^{\alpha} \log(1 - P_{ij}), & \text{otherwise}
\end{cases}
\]
where P_{ij} denotes the score of a vegetation trunk vertex at location (i, j): the higher the score, the more likely the pixel is a vegetation trunk vertex. y_{ij} denotes the pixel value at position (i, j) in the ground-truth heat map. N denotes the number of key points in the ground truth. α and β are hyper-parameters that must be set manually; implementers are advised to use hyper-parameter search techniques to find suitable values for α and β.
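Reading the loss as the CornerNet-style pixel-wise focal loss that matches these variable definitions (an assumption, since the original equation figure is not legible here), a NumPy sketch is:

```python
import numpy as np

def heatmap_loss(pred, truth, alpha=2.0, beta=4.0, eps=1e-8):
    """Pixel-wise focal loss over a predicted heat map.
    pred: predicted scores P in (0, 1); truth: Gaussian-rendered ground truth y,
    with y == 1 exactly at keypoint pixels; alpha, beta: the hyper-parameters."""
    pos = truth >= 1.0                    # exact keypoint locations
    n = max(int(pos.sum()), 1)            # number of keypoints in the truth map
    pos_loss = ((1.0 - pred) ** alpha) * np.log(pred + eps)
    neg_loss = ((1.0 - truth) ** beta) * (pred ** alpha) * np.log(1.0 - pred + eps)
    return -np.where(pos, pos_loss, neg_loss).sum() / n
```

In a training framework this would be expressed with that framework's tensors so gradients flow; the NumPy version just shows the arithmetic.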
Note that the output of the network is a heat map; post-processing is required to obtain the vegetation trunk vertex coordinates. Heat map post-processing methods include soft-argmax and similar key-point regression techniques.
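Soft-argmax, one of the post-processing options named above, takes a softmax over the flattened heat map and returns the expected (row, col) coordinate; the temperature parameter is an assumption of this sketch (higher values sharpen the softmax toward the hard argmax).

```python
import numpy as np

def soft_argmax(heatmap, temperature=100.0):
    """Differentiable peak localisation: softmax-weighted expected coordinate."""
    h, w = heatmap.shape
    logits = heatmap.ravel() * temperature
    p = np.exp(logits - logits.max())     # numerically stable softmax
    p /= p.sum()
    ys, xs = np.mgrid[0:h, 0:w]
    return float((p * ys.ravel()).sum()), float((p * xs.ravel()).sum())
```

For multiple trunks, one would typically run local-maximum detection first and apply soft-argmax per peak window.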
Vegetation point extraction is a key-point detection task in computer vision, so the convolutional neural network adopts an encoder-decoder structure. Popular network models are plentiful, such as Hourglass and HRNet, and implementers can apply these common networks to extract the vegetation points.
Therefore, the extraction of the vegetation points of the aerial images can be completed.
And projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex coordinates of the vegetation trunks in the ground coordinate system of the urban information model. The vegetation point data is projected onto the CIM, that is, projection transformation is required, and there are many common methods, such as an analytic transformation method and a numerical transformation method, and the solving method thereof is well known and is not described herein again.
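As a sketch of one analytic-transformation option, trunk-vertex pixel coordinates can be mapped into the CIM ground coordinate system with a 3x3 homography H; how H is obtained (camera calibration and georeferencing) is outside this sketch and H is assumed given.

```python
import numpy as np

def project_to_ground(points_px, H):
    """Apply a 3x3 homography H to N pixel points (x, y), returning ground
    coordinates. Points are lifted to homogeneous form and de-homogenised."""
    pts = np.asarray(points_px, dtype=float)
    ones = np.ones((pts.shape[0], 1))
    homog = np.hstack([pts, ones]) @ H.T
    return homog[:, :2] / homog[:, 2:3]
```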
Modeling can then be performed for the vegetation points projected onto the CIM, and the model is matched to those points. Various three-dimensional modeling software is available; for example, 3ds Max can complete the vegetation modeling. To achieve vegetation diversity, the implementer can build vegetation models in several different styles.
Finally, in order to visually present the three-dimensional scene of the vegetation model, the method combines the WebGIS technology, and updates the three-dimensional space model of the Web end in real time by calling the information exchange module so as to visualize the data and the model.
Example 2:
an aerial image and CIM-based vegetation model auxiliary generation system comprises a memory, a processor and a computer program which is stored on the memory and can run on the processor, and can realize an aerial image and CIM-based vegetation model auxiliary generation method, step 1, based on three-dimensional urban space geographic information, overlapping BIM information of urban buildings, underground facilities on the ground and urban Internet of things information, and constructing a three-dimensional digital space urban information model; step 2, covering and shooting the ground above a city by using an unmanned aerial vehicle, and obtaining an original Bayer array with an infrared cut-off filter and a Bayer array without the infrared cut-off filter and containing near-infrared information through an infrared filter switcher; step 3, carrying out interpolation processing on the original Bayer array to obtain the RGB value of each pixel; step 4, subtracting the original Bayer array from the Bayer array containing the near-infrared information to obtain an aerial image near-infrared waveband reflection value NIR; step 5, calculating the normalized vegetation index NDVI of the aerial image: NDVI ═ (NIR-R)/(NIR + R); step 6, combining the normalized vegetation index NDVI of the aerial image and RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network together, and training a vegetation extraction encoder and a vegetation extraction decoder; step 7, extracting features of the input data by adopting a vegetation extraction encoder to obtain a feature map; step 8, performing up-sampling and feature extraction on the feature map by using a vegetation extraction decoder to obtain a vegetation trunk vertex thermodynamic map, wherein hot spots in the thermodynamic map represent confidence coefficients of the vegetation trunk vertexes; step 9, carrying out post-processing on the 
thermodynamic diagram of the top points of the vegetation trunks to obtain the top point coordinates of the vegetation trunks; step 10, projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex of the vegetation trunks in the ground coordinate system of the urban information model; and step 11, combining the WebGIS technology, calling an information exchange module to update the three-dimensional space model of the Web end in real time so as to visualize the city information model.
The above embodiments are merely preferred embodiments of the present invention, which should not be construed as limiting the present invention, and any modifications, equivalents and improvements made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (5)

1. A vegetation model auxiliary generation method based on aerial images and CIM is characterized by comprising the following steps:
step 1, on the basis of three-dimensional urban space geographic information, overlapping BIM information of urban buildings and underground facilities and urban Internet of things information to construct an urban information model of a three-dimensional digital space;
step 2, using an unmanned aerial vehicle to capture full-coverage imagery of the urban ground, and obtaining, through an infrared filter switcher, an original Bayer array captured with an infrared cut-off filter and a Bayer array captured without the filter that contains near-infrared information;
step 3, carrying out interpolation processing on the original Bayer array to obtain the RGB value of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing the near-infrared information to obtain an aerial image near-infrared waveband reflection value NIR;
step 5, calculating the normalized vegetation index NDVI of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the normalized vegetation index NDVI of the aerial image and RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network together, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, extracting features of the input data by adopting a vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex heat map, wherein hot spots in the heat map represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex heat map to obtain the vegetation trunk vertex coordinates;
step 10, projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex of the vegetation trunks in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling the information exchange module to update the Web-side three-dimensional space model in real time so as to visualize the city information model.
2. The method of claim 1, wherein the data labels are made as follows: marking the pixels of the vegetation trunk vertices in the aerial image; and convolving the labeled trunk vertices with a Gaussian kernel to generate heat map data of the vegetation points as the data labels.
3. The method of claim 1, wherein the aerial image RGB data is normalized in step 6 and combined with the aerial image normalized vegetation index NDVI as input data.
4. The method of claim 1, wherein the loss function employed to train the network in step 6 is:
\[
L = -\frac{1}{N} \sum_{i,j}
\begin{cases}
(1 - P_{ij})^{\alpha} \log(P_{ij}), & y_{ij} = 1 \\
(1 - y_{ij})^{\beta} \, P_{ij}^{\alpha} \log(1 - P_{ij}), & \text{otherwise}
\end{cases}
\]
where P_{ij} denotes the score of a vegetation trunk vertex at position (i, j), y_{ij} is the corresponding pixel value in the ground-truth map, N denotes the number of key points in the ground-truth map, and α and β are hyper-parameters.
5. An aerial image and CIM-based vegetation model auxiliary generation system, characterized in that it executes the following steps:
step 1, on the basis of three-dimensional urban space geographic information, overlapping BIM information of urban buildings and underground facilities and urban Internet of things information to construct an urban information model of a three-dimensional digital space;
step 2, using an unmanned aerial vehicle to capture full-coverage imagery of the urban ground, and obtaining, through an infrared filter switcher, an original Bayer array captured with an infrared cut-off filter and a Bayer array captured without the filter that contains near-infrared information;
step 3, carrying out interpolation processing on the original Bayer array to obtain the RGB value of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing the near-infrared information to obtain an aerial image near-infrared waveband reflection value NIR;
step 5, calculating the normalized vegetation index NDVI of the aerial image: NDVI ═ (NIR-R)/(NIR + R);
step 6, combining the normalized vegetation index NDVI of the aerial image and RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network together, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, extracting features of the input data by adopting a vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map by using a vegetation extraction decoder to obtain a vegetation trunk vertex thermodynamic map, wherein hot spots in the thermodynamic map represent confidence coefficients of the vegetation trunk vertexes;
step 9, carrying out post-processing on the thermodynamic diagram of the top points of the vegetation trunks to obtain the top point coordinates of the vegetation trunks;
step 10, projecting the vertex coordinates of the vegetation trunks to a ground coordinate system of the urban information model, obtaining the vegetation model by using three-dimensional modeling software, and matching the vegetation model to the position of the vertex of the vegetation trunks in the ground coordinate system of the urban information model;
and step 11, combining the WebGIS technology, calling an information exchange module to update the three-dimensional space model of the Web end in real time so as to visualize the city information model.
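Step 9's post-processing is not specified beyond "obtain the vertex coordinates". A common choice for heat map decoding, sketched here as an assumption, is non-maximum suppression over a 3×3 neighbourhood followed by a confidence threshold:

```python
import numpy as np

def extract_vertices(heat, threshold=0.5):
    """Keep local maxima of the trunk-vertex heat map above a confidence
    threshold and return their (x, y) pixel coordinates. The 3x3
    neighbourhood NMS rule and the threshold value are assumptions."""
    h, w = heat.shape
    padded = np.pad(heat, 1, mode="constant", constant_values=-np.inf)
    # Element-wise maximum over each pixel's 3x3 neighbourhood (incl. self).
    neigh = np.max(
        [padded[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)],
        axis=0,
    )
    peaks = (heat == neigh) & (heat >= threshold)
    ys, xs = np.nonzero(peaks)
    return list(zip(xs.tolist(), ys.tolist()))

# One confident peak and one weak response next to it.
demo = np.zeros((5, 5), dtype=np.float32)
demo[2, 3] = 0.9
demo[2, 2] = 0.4   # suppressed: below threshold and not a local maximum
vertices = extract_vertices(demo)
```

Each surviving peak then feeds the step-10 projection into the CIM ground coordinate system.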
CN202010668637.3A 2020-07-13 2020-07-13 Vegetation model auxiliary generation method and system based on aerial image and CIM Withdrawn CN111798567A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010668637.3A CN111798567A (en) 2020-07-13 2020-07-13 Vegetation model auxiliary generation method and system based on aerial image and CIM


Publications (1)

Publication Number Publication Date
CN111798567A true CN111798567A (en) 2020-10-20

Family

ID=72808404

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010668637.3A Withdrawn CN111798567A (en) 2020-07-13 2020-07-13 Vegetation model auxiliary generation method and system based on aerial image and CIM

Country Status (1)

Country Link
CN (1) CN111798567A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication

Application publication date: 20201020