CN111612894B - Vegetation model auxiliary generation method and system based on aerial image and CIM - Google Patents

Vegetation model auxiliary generation method and system based on aerial image and CIM

Info

Publication number
CN111612894B
CN111612894B (application CN202010449593.5A)
Authority
CN
China
Prior art keywords
vegetation
information
trunk
model
urban
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010449593.5A
Other languages
Chinese (zh)
Other versions
CN111612894A (en
Inventor
陈祥
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
School Of Science And Arts Jiangsu Normal University
Original Assignee
School Of Science And Arts Jiangsu Normal University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by School Of Science And Arts Jiangsu Normal University filed Critical School Of Science And Arts Jiangsu Normal University
Priority to CN202010449593.5A priority Critical patent/CN111612894B/en
Publication of CN111612894A publication Critical patent/CN111612894A/en
Application granted granted Critical
Publication of CN111612894B publication Critical patent/CN111612894B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00Three dimensional [3D] modelling, e.g. data description of 3D objects
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29Geographical information databases
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/10Geometric CAD
    • G06F30/13Architectural design, e.g. computer-aided architectural design [CAAD] related to design of buildings, bridges, landscapes, production plants or roads
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F30/00Computer-aided design [CAD]
    • G06F30/20Design optimisation, verification or simulation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02ATECHNOLOGIES FOR ADAPTATION TO CLIMATE CHANGE
    • Y02A30/00Adapting or protecting infrastructure or their operation
    • Y02A30/60Planning or developing urban green infrastructure

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • Civil Engineering (AREA)
  • Mathematical Optimization (AREA)
  • Pure & Applied Mathematics (AREA)
  • Mathematical Analysis (AREA)
  • Structural Engineering (AREA)
  • Computational Mathematics (AREA)
  • Architecture (AREA)
  • Remote Sensing (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Graphics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a vegetation model auxiliary generation method and system based on aerial images and CIM, comprising: constructing an urban information model of a three-dimensional digital space; obtaining an original Bayer array and a Bayer array containing near-infrared information; performing interpolation on the original Bayer array; calculating the near-infrared band reflectance NIR of the aerial image; calculating the normalized difference vegetation index NDVI of the aerial image; combining the NDVI of the aerial image with the RGB data to form four-channel input data, and obtaining a vegetation trunk vertex thermodynamic diagram with a vegetation extraction encoder and a vegetation extraction decoder; post-processing to obtain vegetation trunk vertex coordinates; matching vegetation models to the vegetation trunk vertex positions in the ground coordinate system of the urban information model; and visualizing the urban information model in combination with WebGIS technology. The method achieves rapid and effective generation of regional vegetation models and improves the efficiency and precision of urban vegetation model generation.

Description

Vegetation model auxiliary generation method and system based on aerial image and CIM
Technical Field
The invention relates to the technical fields of artificial intelligence, smart cities and CIM, in particular to a vegetation model auxiliary generation method and system based on aerial images and CIM.
Background
Global information technology is currently developing at an accelerating pace, and its position in the national economy is increasingly prominent; the construction of smart cities is of great strategic significance for the overall improvement of a country's comprehensive competitiveness.
The smart city arises from the new generation of information technologies represented by the Internet of Things, cloud computing, the mobile Internet and artificial intelligence, and from the open urban innovation ecology gradually incubated in the knowledge society. The smart city emphasizes the integration of new-generation information technology with various communication terminals to realize intelligent management and operation of the city.
The currently popular three-dimensional modeling method, oblique photogrammetry, measures with visible light; it has demanding weather requirements, long shooting times and a huge data volume, and the computation involved in extracting image point clouds and generating a three-dimensional model is particularly heavy. Existing regional vegetation model generation methods, whether manual modeling or oblique photogrammetry, are very time-consuming and labor-intensive, so rapid three-dimensional modeling methods need to be further improved.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a vegetation model auxiliary generation method and system based on aerial images and CIM, which achieves rapid and effective generation of regional vegetation models and improves the efficiency and precision of urban vegetation model generation.
A vegetation model auxiliary generation method based on aerial images and CIM comprises the following steps:
step 1, constructing an urban information model of the three-dimensional digital space based on three-dimensional urban spatial geographic information, by superposing BIM information of urban buildings and underground facilities and urban Internet of Things information;
step 2, performing coverage shooting of the ground from above the city with an unmanned aerial vehicle, and obtaining, by means of an infrared filter switcher, an original Bayer array captured with the infrared cut-off filter in place and a Bayer array containing near-infrared information captured without the infrared cut-off filter;
step 3, performing interpolation on the original Bayer array to obtain the RGB values of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing near-infrared information to obtain the near-infrared band reflectance NIR of the aerial image;
step 5, calculating the normalized difference vegetation index of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the NDVI of the aerial image with the RGB data to form four-channel input data, inputting the input data and the data labels into a vegetation trunk vertex extraction network, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, performing feature extraction on the input data with the vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex thermodynamic diagram, in which hot spots represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex thermodynamic diagram to obtain vegetation trunk vertex coordinates;
step 10, projecting the vegetation trunk vertex coordinates into the ground coordinate system of the urban information model, obtaining a vegetation model with three-dimensional modeling software, and matching the vegetation model to the vegetation trunk vertex positions in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling an information exchange module to update the three-dimensional space model at the Web end in real time so as to visualize the urban information model.
The data labels are made as follows: pixels at vegetation trunk vertices in the aerial image are labeled, and Gaussian kernel convolution is applied to the labeled vegetation trunk vertices to generate vegetation-point thermodynamic diagram data as the data labels.
In step 6, the RGB data of the aerial image are normalized and combined with the NDVI of the aerial image as input data.
The loss function used to train the network in step 6 is:
L = -(1/N) · Σ_{i,j} f(i, j), where
f(i, j) = (1 - P_ij)^α · log(P_ij)                   if y_ij = 1
f(i, j) = (1 - y_ij)^β · (P_ij)^α · log(1 - P_ij)    otherwise
wherein P_ij represents the score for a vegetation trunk vertex at position (i, j), y_ij is the pixel value at position (i, j) in the ground-truth thermodynamic diagram, N represents the number of key points in the ground-truth diagram, and α and β are hyperparameters.
A vegetation model auxiliary generation system based on aerial images and CIM is capable of executing the following steps:
step 1, constructing an urban information model of the three-dimensional digital space based on three-dimensional urban spatial geographic information, by superposing BIM information of urban buildings and underground facilities and urban Internet of Things information;
step 2, performing coverage shooting of the ground from above the city with an unmanned aerial vehicle, and obtaining, by means of an infrared filter switcher, an original Bayer array captured with the infrared cut-off filter in place and a Bayer array containing near-infrared information captured without the infrared cut-off filter;
step 3, performing interpolation on the original Bayer array to obtain the RGB values of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing near-infrared information to obtain the near-infrared band reflectance NIR of the aerial image;
step 5, calculating the normalized difference vegetation index of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the NDVI of the aerial image with the RGB data to form four-channel input data, inputting the input data and the data labels into a vegetation trunk vertex extraction network, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, performing feature extraction on the input data with the vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex thermodynamic diagram, in which hot spots represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex thermodynamic diagram to obtain vegetation trunk vertex coordinates;
step 10, projecting the vegetation trunk vertex coordinates into the ground coordinate system of the urban information model, obtaining a vegetation model with three-dimensional modeling software, and matching the vegetation model to the vegetation trunk vertex positions in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling an information exchange module to update the three-dimensional space model at the Web end in real time so as to visualize the urban information model.
Compared with the prior art, the invention has the following beneficial effects:
1. The invention extracts vegetation trunk vertices from aerial images with a neural network trained on a large number of samples, which gives good robustness and improves the extraction precision of vegetation points; the network structure is simple, vegetation point information can be obtained quickly, vegetation points are provided for the regional vegetation model, and the generation efficiency of the regional vegetation model is improved.
2. The invention trains the vegetation trunk vertex extraction network on the normalized difference vegetation index combined with the RGB data; the NDVI provides richer information for vegetation trunk vertex extraction, improving the training efficiency of the neural network and the accuracy of vegetation trunk vertex extraction.
3. The invention combines urban information model technology, matching vegetation point information with vegetation models obtained by three-dimensional modeling and placing the vegetation models into the urban information model, which facilitates smart-city information integration; in combination with WebGIS technology, the urban information model is visualized, making it convenient for supervisory personnel to query and supervise.
Drawings
FIG. 1 is a flow chart of the method of the present invention.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
The invention provides a vegetation model auxiliary generation method and system based on aerial images and CIM. First, near-infrared and RGB information is obtained through aerial photography and image processing; then, by analogy with remote sensing, the normalized difference vegetation index (NDVI) is computed; a convolutional neural network is trained on the RGB and NDVI data to obtain vegetation points in the aerial image, which are projected onto the CIM to provide vegetation position information; finally, vegetation models are matched to the corresponding positions in the urban information model for visualization. FIG. 1 is a flow chart of the method of the present invention. Specific examples are described below.
Example 1:
the implementation provides an auxiliary vegetation model generation method based on aerial images and CIM.
To facilitate a grasp of urban construction, a dedicated CIM model of the city is built with the aid of information technologies such as BIM; resources are continuously integrated, positioning is accurately established, a new path for smart-city construction is explored, and the level and quality of urban services are improved.
First, the internal relation between the CIM (City Information Model) and the information exchange module is established.
The CIM is an organic complex of a three-dimensional urban space model and urban information built on urban information data, mainly comprising GIS data and BIM data of urban roads, buildings and infrastructure.
The information exchange module is a CIM-based data exchange platform that mainly realizes information exchange between the urban information model and external interfaces, covering the three-dimensional urban space model, urban information, geographic information and camera perception information; its information content can be updated in real time as smart-city construction progresses.
The CIM takes three-dimensional urban spatial geographic information as its basis and superposes BIM information of urban buildings and underground facilities and urban Internet of Things information to construct the urban information model of the three-dimensional digital space. Combined with WebGIS technology, the CIM displays the urban scene on the Web, and the system can call the information exchange module to display the latest three-dimensional urban space model and urban information.
Vegetation points are extracted through aerial image processing and a convolutional neural network and rapidly projected onto the CIM ground to provide vegetation position information; vegetation models are then placed according to this position information, which is convenient to apply and low in cost.
The invention is mainly aimed at vegetation positioning, thereby providing the position information of urban vegetation.
The method is an "auxiliary" generation method in that it provides vegetation position point information, after which models can be placed according to that information, which is rapid and convenient.
First, the ground is photographed from above the city with an unmanned aerial vehicle. Since the infrared cut-off filter is normally an essential element in front of the image sensor, its presence or absence can be controlled by an infrared filter switcher, so that an original Bayer array S1 and a Bayer array S2 containing near-infrared information can both be obtained by shooting.
Here, S1 is the array obtained with the infrared cut-off filter in place, and S2 is the array obtained without the infrared cut-off filter.
The original Bayer array S1 is single-channel, each pixel capturing only part of the spectrum, so the RGB values of each pixel must be obtained by interpolation. Bayer interpolation (demosaicing) methods are well known and are not detailed here.
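As an illustration only (the patent does not fix a particular interpolation scheme), a minimal demosaicing sketch for an assumed RGGB Bayer layout is shown below; each 2x2 cell is collapsed to one RGB value shared by its four pixels. Real pipelines use bilinear or edge-aware interpolation instead.

```python
import numpy as np

def demosaic_rggb(bayer):
    """Crude demosaic of an RGGB Bayer mosaic: each 2x2 cell
    (R G / G B) is collapsed to one RGB triple that is then shared
    by all four pixels of the cell. Illustrates the principle only."""
    h, w = bayer.shape
    assert h % 2 == 0 and w % 2 == 0, "expects even dimensions"
    r  = bayer[0::2, 0::2]            # red sites
    g1 = bayer[0::2, 1::2]            # green sites, even rows
    g2 = bayer[1::2, 0::2]            # green sites, odd rows
    b  = bayer[1::2, 1::2]            # blue sites
    rgb_half = np.stack([r, (g1 + g2) / 2.0, b], axis=-1)
    # back to full resolution by pixel repetition
    return rgb_half.repeat(2, axis=0).repeat(2, axis=1)

bayer = np.array([[10., 20.],
                  [20., 30.]])        # one hypothetical RGGB cell
rgb = demosaic_rggb(bayer)
```

Here `rgb` has shape (2, 2, 3) with every pixel holding R=10, G=20, B=30.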
The infrared cut-off filter blocks infrared light over a certain range (which can be customized), so NIR is obtained as S2 - S1, i.e. the values of the S1 array subtracted from the values of the S2 array.
The normalized vegetation index can effectively reflect vegetation coverage, and the calculation formula is as follows:
NDVI=(NIR-R)/(NIR+R)
i.e. the difference between the near-infrared band reflectance and the red band reflectance divided by their sum, where NIR is the reflectance in the near-infrared band and R is the reflectance in the red band.
By analogy with remote sensing, the red-light information (R) and near-infrared information (NIR) obtained above are substituted into this formula to obtain the normalized difference vegetation index of the aerial image.
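The S2 - S1 subtraction and the NDVI formula above can be sketched as follows; this is a minimal illustration assuming pixel-aligned S1/S2 captures, and the array values are hypothetical.

```python
import numpy as np

def compute_ndvi(red, nir, eps=1e-6):
    """NDVI = (NIR - R) / (NIR + R), computed per pixel.
    eps guards against division by zero on dark pixels."""
    red = np.asarray(red, dtype=float)
    nir = np.asarray(nir, dtype=float)
    return (nir - red) / (nir + red + eps)

# Hypothetical red-channel reflectance from the two captures:
# S1 = with the IR-cut filter (visible only), S2 = without it (visible + NIR)
s1_red = np.array([[0.10, 0.30]])
s2_red = np.array([[0.55, 0.40]])
nir = s2_red - s1_red          # NIR = S2 - S1, as described above
ndvi = compute_ndvi(s1_red, nir)
```

Vegetated pixels push NDVI toward +1 (strong NIR reflectance), while bare surfaces fall near or below 0.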
Data labels are then made: pixels at vegetation trunk vertices (i.e. trunk apexes or canopy-centre vertices) in the aerial image are labeled; where trees are dense, the trunk vertices are labeled according to human experience. Gaussian kernel convolution is then applied to the labeled vegetation trunk vertices to generate vegetation-point thermodynamic diagram data. Specific details, such as the choice of Gaussian kernel size, can be adjusted to the implementation scenario.
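A minimal sketch of this label-making step, assuming a fixed isotropic Gaussian kernel and merging overlapping Gaussians with an element-wise maximum (one common convention; the patent leaves these details to the implementer):

```python
import numpy as np

def gaussian_heatmap(shape, points, sigma=2.0):
    """Render a label heatmap: a 2-D Gaussian with peak value 1 is
    centred on every annotated trunk-vertex pixel; overlapping
    Gaussians are merged with an element-wise maximum."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    hm = np.zeros(shape, dtype=float)
    for (py, px) in points:
        g = np.exp(-((ys - py) ** 2 + (xs - px) ** 2) / (2 * sigma ** 2))
        hm = np.maximum(hm, g)
    return hm

# two hypothetical annotated trunk vertices on a 64x64 tile
label = gaussian_heatmap((64, 64), [(20, 20), (40, 45)], sigma=2.0)
```

Each annotated pixel carries value 1.0 in the label, decaying smoothly to 0 away from the vertex.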
After the data are produced, the aerial image and the tag data are sent to a vegetation trunk vertex extraction network for training.
Details of vegetation trunk vertex extraction network training are as follows: the image acquired by the aerial camera is normalized so that the image matrix takes floating-point values in [0, 1], which helps the model converge; concatenating the NDVI with the image at the model input is recommended, forming four-channel input data for a better extraction effect. The labels are normalized likewise.
The training data are input into the network, and the vegetation extraction Encoder and vegetation extraction Decoder are trained end to end. The Encoder extracts image features: its input is the normalized RGB+NDVI data and its output is a feature map. The Decoder performs up-sampling and feature extraction on the feature map: its input is the feature map generated by the Encoder and its output is the vegetation trunk vertex thermodynamic diagram (heatmap). Hot spots in the thermodynamic diagram represent the confidence of vegetation trunk vertices. The loss function is a heatmap loss, with the following mathematical form:
L = -(1/N) · Σ_{i,j} f(i, j), where
f(i, j) = (1 - P_ij)^α · log(P_ij)                   if y_ij = 1
f(i, j) = (1 - y_ij)^β · (P_ij)^α · log(1 - P_ij)    otherwise
wherein P_ij represents the score for a vegetation trunk vertex at position (i, j); the higher the score, the more likely the pixel is a trunk vertex. y_ij represents the pixel value at position (i, j) in the ground-truth thermodynamic diagram. N represents the number of key points in the ground truth. α and β are hyperparameters that must be set manually; implementers are also advised to find suitable values of α and β with a hyperparameter search technique.
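Assuming the heatmap loss takes the standard Gaussian focal form consistent with the definitions above (the patent's own formula is only reproduced as an image), a numpy sketch might be:

```python
import numpy as np

def heatmap_focal_loss(pred, gt, alpha=2.0, beta=4.0, eps=1e-12):
    """Gaussian focal loss over a predicted heatmap `pred` and a
    ground-truth heatmap `gt` (same shape, values in [0, 1]).
    N is the number of ground-truth peaks, i.e. pixels with gt == 1."""
    pos = gt == 1.0
    n = max(int(pos.sum()), 1)
    # positive pixels: penalize low scores at true vertices
    pos_loss = ((1 - pred[pos]) ** alpha * np.log(pred[pos] + eps)).sum()
    # negative pixels: penalize high scores elsewhere, down-weighted
    # near true peaks by the (1 - y_ij)^beta term
    neg = ~pos
    neg_loss = ((1 - gt[neg]) ** beta * pred[neg] ** alpha
                * np.log(1 - pred[neg] + eps)).sum()
    return -(pos_loss + neg_loss) / n

# tiny hypothetical example: one true vertex at (2, 2)
gt = np.zeros((5, 5)); gt[2, 2] = 1.0
pred_good = np.full((5, 5), 0.05); pred_good[2, 2] = 0.95
pred_bad = np.full((5, 5), 0.5)
```

A near-correct prediction yields a much smaller loss than a uniform, uncommitted one.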
It should be noted that the output of the network is a thermodynamic diagram, and post-processing is required to obtain vegetation trunk vertex coordinates. Post-processing methods for thermodynamic diagrams include softargmax and other key-point regression techniques.
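A minimal post-processing sketch using 3x3 non-maximum suppression with a confidence threshold (the softargmax refinement mentioned above is omitted); the threshold and window size are illustrative assumptions:

```python
import numpy as np

def heatmap_peaks(hm, thresh=0.5):
    """Return (row, col) coordinates of local maxima above a
    confidence threshold, via 3x3 non-maximum suppression."""
    h, w = hm.shape
    padded = np.pad(hm, 1, constant_values=-np.inf)
    # stack the 8 neighbour views of every pixel
    neigh = np.stack([padded[dy:dy + h, dx:dx + w]
                      for dy in range(3) for dx in range(3)
                      if not (dy == 1 and dx == 1)])
    # a peak is >= all 8 neighbours and above the threshold
    is_peak = (hm >= neigh.max(axis=0)) & (hm > thresh)
    return list(zip(*np.nonzero(is_peak)))

# hypothetical network output with two confident hot spots
hm = np.zeros((10, 10))
hm[3, 4] = 0.9
hm[7, 2] = 0.8
peaks = heatmap_peaks(hm)
```

For the toy heatmap above, `peaks` contains exactly the two hot-spot coordinates.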
Vegetation point extraction is a key-point detection task in computer vision, so the convolutional neural network adopts an encoder-decoder structure; popular network models such as Hourglass and HRNet are suitable, and an implementer can apply any common network of this kind for vegetation point extraction.
This completes the extraction of vegetation points from the aerial image.
The vegetation trunk vertex coordinates are projected into the ground coordinate system of the urban information model; a vegetation model is obtained with three-dimensional modeling software and matched to the vegetation trunk vertex positions in that coordinate system. Projecting vegetation point data onto the CIM requires a projection transformation; many common methods are known, such as analytical and numerical transformation methods, and their solutions are not described here.
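As a simplified illustration of this projection step, the sketch below assumes a nadir (straight-down) photograph with a known ground sampling distance, camera ground position and flight heading; a rigorous solution would use the photogrammetric collinearity equations, which the patent leaves to known methods. All numeric values are hypothetical.

```python
import numpy as np

def pixel_to_ground(px_coords, origin_xy, gsd, heading_rad=0.0):
    """Map pixel coordinates to CIM ground coordinates under a
    nadir-view assumption: rotate by the flight heading, scale by
    the ground sampling distance gsd (metres per pixel), and
    translate to the image origin's ground position origin_xy."""
    c, s = np.cos(heading_rad), np.sin(heading_rad)
    rot = np.array([[c, -s],
                    [s,  c]])           # 2-D rotation by the heading
    px = np.asarray(px_coords, dtype=float)
    return px @ rot.T * gsd + np.asarray(origin_xy, dtype=float)

# hypothetical trunk-vertex pixels, 5 cm/pixel imagery
trunk_px = [(100.0, 250.0), (400.0, 80.0)]
ground = pixel_to_ground(trunk_px, origin_xy=(5000.0, 3200.0), gsd=0.05)
```

With zero heading this reduces to a scale-and-translate geotransform, the same affine form used by common GIS raster tooling.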
Vegetation models can then be matched to the vegetation points projected onto the CIM. Various three-dimensional modeling software is available; for example, 3DMax can complete vegetation modeling. To diversify the vegetation, an implementer can build several vegetation models of different styles.
Finally, in order to present the three-dimensional scene of the vegetation model intuitively, the invention combines WebGIS technology and calls the information exchange module to update the three-dimensional space model at the Web end in real time, so as to visualize the data and the model.
Example 2:
a vegetation model auxiliary generation system based on aerial images and CIM comprises a memory, a processor and a computer program which is stored in the memory and can run on the processor, wherein the system can realize a vegetation model auxiliary generation method based on aerial images and CIM, and step 1, based on three-dimensional urban space geographic information, the BIM information of urban buildings, underground facilities and urban Internet of things information are overlapped to construct a three-dimensional digital space urban information model; step 2, performing coverage shooting on the ground above the city by using an unmanned aerial vehicle, and obtaining an original Bayer array under an infrared cut-off filter and a Bayer array without the infrared cut-off filter, wherein the Bayer array contains near infrared information, through an infrared filter switcher; step 3, carrying out interpolation processing on the original Bayer array to obtain RGB values of each pixel; step 4, subtracting the original Bayer array from the Bayer array containing near infrared information to obtain a near infrared band reflection value NIR of the aerial image; step 5, calculating an aerial image normalized vegetation index NDVI: ndvi= (NIR-R)/(nir+r); step 6, combining the normalized vegetation index NDVI of the aerial image with RGB data to form four-channel input data, inputting the input data and a data label into a vegetation trunk vertex extraction network, and training a vegetation extraction encoder and a vegetation extraction decoder; step 7, carrying out feature extraction on the input data by adopting a vegetation extraction encoder to obtain a feature map; step 8, up-sampling and feature extraction are carried out on the feature map by adopting a vegetation extraction decoder, so as to obtain a vegetation trunk peak thermodynamic diagram, and the confidence coefficient of the hot spot representing the vegetation trunk peak in the thermodynamic diagram; 
step 9, post-processing is carried out on the vegetation trunk peak thermodynamic diagram to obtain vegetation trunk peak coordinates; step 10, projecting the coordinates of the vegetation trunk vertexes to a ground coordinate system of the urban information model, obtaining a vegetation model by using three-dimensional modeling software, and matching the vegetation model to the positions of the vegetation trunk vertexes in the ground coordinate system of the urban information model; and 11, combining the WebGIS technology, calling an information exchange module to update the three-dimensional space model of the Web end in real time so as to visualize the city information model.
The above embodiments are merely preferred embodiments of the present invention and are not intended to limit the present invention, and any modifications, equivalent substitutions and improvements made within the spirit and principles of the present invention should be included in the scope of the present invention.

Claims (4)

1. An auxiliary vegetation model generation method based on aerial images and CIM, characterized by comprising the following steps:
step 1, constructing an urban information model of the three-dimensional digital space based on three-dimensional urban spatial geographic information, by superposing BIM information of urban buildings and underground facilities and urban Internet of Things information;
step 2, performing coverage shooting of the ground from above the city with an unmanned aerial vehicle, and obtaining, by means of an infrared filter switcher, an original Bayer array captured with the infrared cut-off filter in place and a Bayer array containing near-infrared information captured without the infrared cut-off filter;
step 3, performing interpolation on the original Bayer array to obtain the RGB values of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing near-infrared information to obtain the near-infrared band reflectance NIR of the aerial image;
step 5, calculating the normalized difference vegetation index of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the NDVI of the aerial image with the RGB data to form four-channel input data, inputting the input data and the data labels into a vegetation trunk vertex extraction network, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, performing feature extraction on the input data to be detected with the vegetation extraction encoder to obtain a feature map;
step 8, performing up-sampling and feature extraction on the feature map with the vegetation extraction decoder to obtain a vegetation trunk vertex thermodynamic diagram, in which hot spots represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex thermodynamic diagram to obtain vegetation trunk vertex coordinates;
step 10, projecting the vegetation trunk vertex coordinates into the ground coordinate system of the urban information model, obtaining a vegetation model with three-dimensional modeling software, and matching the vegetation model to the vegetation trunk vertex positions in the ground coordinate system of the urban information model;
and step 11, in combination with WebGIS technology, calling an information exchange module to update the three-dimensional space model at the Web end in real time so as to visualize the urban information model;
the data labels being made as follows: pixels at vegetation trunk vertices in the aerial image are labeled, and Gaussian kernel convolution is applied to the labeled vegetation trunk vertices to generate vegetation-point thermodynamic diagram data as the data labels.
2. The method of claim 1, wherein in step 6 the RGB data of the aerial image are normalized and combined with the NDVI of the aerial image as the input data.
3. The method of claim 1, wherein the loss function used to train the network in step 6 is:
L = -(1/N) · Σ_{i,j} f(i, j), where
f(i, j) = (1 - P_ij)^α · log(P_ij)                   if y_ij = 1
f(i, j) = (1 - y_ij)^β · (P_ij)^α · log(1 - P_ij)    otherwise
wherein P_ij represents the score for a vegetation trunk vertex at position (i, j), y_ij is the pixel value at position (i, j) in the ground-truth thermodynamic diagram, N represents the number of key points in the ground-truth diagram, and α and β are hyperparameters.
4. An auxiliary vegetation model generating system based on aerial images and CIM, which is characterized by comprising a memory, a processor and a computer program stored on the memory and capable of running on the processor, wherein the system specifically comprises the following steps:
step 1, based on three-dimensional urban space geographic information, building urban information models of three-dimensional digital spaces are constructed by superposing BIM information of urban buildings, underground facilities and urban Internet of things information;
step 2, performing coverage shooting on the ground above the city by using an unmanned aerial vehicle, and obtaining an original Bayer array under an infrared cut-off filter and a Bayer array without the infrared cut-off filter, wherein the Bayer array contains near infrared information, through an infrared filter switcher;
step 3, carrying out interpolation processing on the original Bayer array to obtain RGB values of each pixel;
step 4, subtracting the original Bayer array from the Bayer array containing near infrared information to obtain a near infrared band reflection value NIR of the aerial image;
step 5, calculating the normalized difference vegetation index NDVI of the aerial image: NDVI = (NIR - R)/(NIR + R);
step 6, combining the normalized difference vegetation index NDVI of the aerial image with the RGB data to form four-channel input data, inputting the input data together with the data labels into a vegetation trunk vertex extraction network, and training a vegetation extraction encoder and a vegetation extraction decoder;
step 7, carrying out feature extraction on input data to be detected by adopting a vegetation extraction encoder to obtain a feature map;
step 8, using the vegetation extraction decoder to perform up-sampling and feature extraction on the feature map, so as to obtain a vegetation trunk vertex heatmap, in which the hot spots represent the confidence of vegetation trunk vertices;
step 9, post-processing the vegetation trunk vertex heatmap to obtain the vegetation trunk vertex coordinates;
step 10, projecting the vegetation trunk vertex coordinates into the ground coordinate system of the urban information model, obtaining a vegetation model with three-dimensional modeling software, and matching the vegetation model to the vegetation trunk vertex positions in the ground coordinate system of the urban information model;
step 11, using WebGIS technology, calling an information exchange module to update the Web-side three-dimensional space model in real time, so as to visualize the urban information model;
the data labels are produced as follows: annotating the pixels of the vegetation trunk vertices in the aerial image; and convolving the annotated vegetation trunk vertices with a Gaussian kernel to generate heatmap data of the vegetation points as the data labels.
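Step 9's post-processing (turning the trunk-vertex heatmap into discrete coordinates) is not detailed in the claims. A common approach, sketched here as an assumption, keeps pixels that exceed a confidence threshold and are local maxima within a 3×3 neighbourhood; the threshold value is illustrative:

```python
import numpy as np

def heatmap_to_vertices(heatmap, threshold=0.5):
    """Extract trunk-vertex coordinates from a confidence heatmap.

    A pixel is kept if its score reaches `threshold` (illustrative value)
    and equals the maximum of its 3x3 neighbourhood (a simple non-maximum
    suppression).
    """
    h, w = heatmap.shape
    # Pad with -inf so border pixels compare only against real neighbours.
    padded = np.pad(heatmap, 1, mode="constant", constant_values=-np.inf)
    vertices = []
    for r in range(h):
        for c in range(w):
            window = padded[r:r + 3, c:c + 3]  # 3x3 window centred on (r, c)
            if heatmap[r, c] >= threshold and heatmap[r, c] == window.max():
                vertices.append((r, c))
    return vertices

hm = np.zeros((5, 5), dtype=np.float32)
hm[2, 2] = 0.9   # a clear vertex
hm[2, 3] = 0.6   # suppressed: its neighbour (2, 2) scores higher
peaks = heatmap_to_vertices(hm)
```

The resulting pixel coordinates are what steps 10-11 would then project into the ground coordinate system of the urban information model.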
CN202010449593.5A 2020-05-25 2020-05-25 Vegetation model auxiliary generation method and system based on aerial image and CIM Active CN111612894B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010449593.5A CN111612894B (en) 2020-05-25 2020-05-25 Vegetation model auxiliary generation method and system based on aerial image and CIM


Publications (2)

Publication Number Publication Date
CN111612894A CN111612894A (en) 2020-09-01
CN111612894B true CN111612894B (en) 2023-04-25

Family

ID=72203964

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010449593.5A Active CN111612894B (en) 2020-05-25 2020-05-25 Vegetation model auxiliary generation method and system based on aerial image and CIM

Country Status (1)

Country Link
CN (1) CN111612894B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112052811A (en) * 2020-09-11 2020-12-08 郑州大学 Pasture grassland desertification detection method based on artificial intelligence and aerial image

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN102708307B (en) * 2012-06-26 2015-07-01 上海大学 Vegetation index construction method applied to city
CN106650673A (en) * 2016-12-27 2017-05-10 中国科学院深圳先进技术研究院 Urban mapping method and device
CN109117811B (en) * 2018-08-24 2021-07-30 颜俊君 System and method for estimating urban vegetation coverage rate based on low-altitude remote sensing measurement technology

Also Published As

Publication number Publication date
CN111612894A (en) 2020-09-01

Similar Documents

Publication Publication Date Title
US10297074B2 (en) Three-dimensional modeling from optical capture
US20190026400A1 (en) Three-dimensional modeling from point cloud data migration
CN109520500B (en) Accurate positioning and street view library acquisition method based on terminal shooting image matching
CN108564647A (en) A method of establishing virtual three-dimensional map
CN112634370A (en) Unmanned aerial vehicle dotting method, device, equipment and storage medium
CN113256778A (en) Method, device, medium and server for generating vehicle appearance part identification sample
CN111028358A (en) Augmented reality display method and device for indoor environment and terminal equipment
CN115641401A (en) Construction method and related device of three-dimensional live-action model
CN111599007B (en) Smart city CIM road mapping method based on unmanned aerial vehicle aerial photography
CN115375868B (en) Map display method, remote sensing map display method, computing device and storage medium
CN115147554A (en) Three-dimensional scene construction method, device, equipment and storage medium
CN113066112A (en) Indoor and outdoor fusion method and device based on three-dimensional model data
CN114758337A (en) Semantic instance reconstruction method, device, equipment and medium
CN110189395B (en) Method for realizing dynamic analysis and quantitative design of landscape elevation based on human visual angle oblique photography
CN111612894B (en) Vegetation model auxiliary generation method and system based on aerial image and CIM
CN111627103A (en) Smart city CIM imaging method based on pedestrian activity and density perception
CN111626971B (en) Smart city CIM real-time imaging method with image semantic perception
WO2023116359A1 (en) Method, apparatus and system for classifying green, blue and gray infrastructures, and medium
CN116206068A (en) Three-dimensional driving scene generation and construction method and device based on real data set
CN115713603A (en) Multi-type block building group form intelligent generation method based on building space map
CN115937673A (en) Geographic element rapid change discovery method based on mobile terminal photo
Gu et al. Surveying and mapping of large-scale 3D digital topographic map based on oblique photography technology
CN115187736A (en) Target map generation method and device, and AR map generation method and device
CN111798567A (en) Vegetation model auxiliary generation method and system based on aerial image and CIM
CN111784822A (en) Smart city CIM real-time imaging method with image semantic perception

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant