EP1851686A1 - Method and apparatus for distinguishing foliage from buildings for topographical modeling - Google Patents

Method and apparatus for distinguishing foliage from buildings for topographical modeling

Info

Publication number
EP1851686A1
Authority
EP
European Patent Office
Prior art keywords
dem
foliage
building
threshold
obj
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP06734356A
Other languages
German (de)
French (fr)
Other versions
EP1851686A4 (en)
Inventor
Mark Rahmes
John Karp
Anthony Smith
Stephen Connetti, Jr.
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Harris Corp
Original Assignee
Harris Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Harris Corp
Publication of EP1851686A1
Publication of EP1851686A4
Legal status: Withdrawn (current)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 Geographic models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G06T7/62 Analysis of geometric attributes of area, perimeter, diameter or volume
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/10 Terrestrial scenes
    • G06V20/13 Satellite images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30181 Earth observation


Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Remote Sensing (AREA)
  • Multimedia (AREA)
  • Astronomy & Astrophysics (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Processing Or Creating Images (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Instructional Devices (AREA)
  • Image Processing (AREA)

Abstract

A computer implemented method is for processing a digital elevation model (DEM) including data for a plurality of objects. The method includes determining (Block 502) a perimeter versus area parameter for each object in the DEM, and comparing (Block 504) the perimeter versus area parameter for each object to a threshold to identify whether each object in the DEM is a building 54 or foliage 56.

Description

METHOD AND APPARATUS FOR DISTINGUISHING FOLIAGE FROM BUILDINGS FOR TOPOGRAPHICAL MODELING
Background of the Invention
Topographical models of geographical areas may be used for many applications, including flight simulators and flood plain analysis. Furthermore, topographical models of man-made structures (e.g., cities) may be extremely helpful in applications such as cellular antenna placement, urban planning, disaster preparedness and analysis, and mapping, for example.
Various types and methods for making topographical models are presently being used. One common topographical model is the digital elevation model (DEM). A DEM is a sampled matrix representation of a geographical area which may be generated in an automated fashion by a computer. In a DEM, coordinate points are made to correspond with a height value. DEMs are typically used for modeling terrain where the transitions between different elevations (e.g., valleys, mountains, etc.) are generally smooth from one to a next. That is, DEMs typically model terrain as a plurality of curved surfaces and any discontinuities therebetween are thus "smoothed" over. For this reason, DEMs generally are not well suited for modeling man-made structures, such as skyscrapers in a downtown area, with sufficient accuracy for many of the above applications.
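For readers less familiar with DEM data, the short sketch below shows the kind of matrix the rest of this description manipulates. The height values, the 1 meter post spacing and the origin coordinates are invented for illustration and are not taken from the patent.

    # Illustrative sketch only: a DEM held as a regular grid of height posts.
    # The heights, the 1 m post spacing and the origin are assumptions for this example.
    import numpy as np

    dem = np.array([
        [12.0, 12.1, 12.3, 12.2],
        [12.1, 25.4, 25.6, 12.4],   # the ~25 m posts could be a small building or tree
        [12.0, 25.5, 25.7, 12.3],
        [11.9, 12.0, 12.2, 12.1],
    ])                               # elevations in meters at fixed grid positions

    post_spacing_m = 1.0             # assumed grid resolution
    origin_e, origin_n = 500000.0, 3105000.0   # assumed UTM easting/northing of post (0, 0)

    row, col = 1, 2
    easting = origin_e + col * post_spacing_m
    northing = origin_n - row * post_spacing_m
    print(f"post ({row}, {col}) at E{easting:.0f}/N{northing:.0f} has elevation {dem[row, col]} m")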
U.S. Patent No. 6,654,690 to Rahmes et al. discloses a significant advance in topography. The '690 patent discloses an automated method for making a topographical model of an area including terrain and buildings thereon based upon randomly spaced data of elevation versus position. The '690 patent is assigned to the assignee of the present invention and is incorporated herein by reference in its entirety. The method includes processing the randomly spaced data to generate gridded data of elevation versus position conforming to a predetermined position grid, processing the gridded data to distinguish building data from terrain data, and performing polygon extraction for the building data to make the topographical model of the area including terrain and buildings thereon.
In particular, a terrain-only DEM is generated and a building-only DEM is generated. Once the buildings have been distinguished from the terrain, polygon extraction is performed for the building data. The '690 patent makes a topographical model of the area including terrain and buildings thereon in a relatively quick manner and with enhanced accuracy. Nonetheless, much of the foliage, and in particular the trees, may be treated as buildings. That is, polygon extraction is also performed on the data representing the trees. This results in a large number of polygons being used to model a tree as compared to the number of polygons used to model a building. When the topographical model is displayed on a viewer, the modeled foliage is not very realistic looking. Consequently, the modeled foliage is manually removed and replaced with a more realistic model. This may be relatively time consuming and labor intensive.
Summary of the Invention
In view of the foregoing background, it is therefore an object of the present invention to provide a computer implemented method for distinguishing foliage from buildings within a digital elevation model (DEM). This and other objects, features, and advantages in accordance with the present invention are provided by a computer implemented method for processing a DEM including data for a plurality of objects. The method may include determining a perimeter versus area parameter for each object in the DEM, and comparing the perimeter versus area parameter for each object to a threshold to identify whether each object in the DEM is a building or foliage.
The data for each object includes a height value, and the computer implemented method may further comprise comparing the height value for each object identified as foliage to a height threshold, and re-identifying each foliage as a building if the height value associated therewith is greater than the height threshold. In addition, the computer implemented method may further comprise determining a second perimeter versus area parameter for each object identified as a building, comparing each second perimeter versus area parameter to a second threshold, and re-identifying each building as foliage if the second perimeter versus area parameter is greater than the second threshold.
The objects identified as buildings may then be separated into a building DEM, and the objects identified as foliage may then be separated into a foliage DEM. Separate building and foliage DEMs advantageously allow more realistic topographical models to be generated with significantly less user intervention.
The computer implemented method may further comprise modeling each building in the building DEM as vectors, wherein each vector may comprise a plurality of polygons. The computer implemented method may further comprise modeling each foliage in the foliage DEM as 3D points.
Another aspect in accordance with the present invention is directed to a computer-readable medium having stored thereon a data structure for processing a digital elevation model (DEM) including data for a plurality of objects. The computer-readable medium may comprise a first data field containing data for determining a perimeter versus area parameter for each object in the DEM, and a second data field containing data for comparing the perimeter versus area parameter for each object to a threshold to identify whether each object in the DEM is a building or foliage.
Yet another aspect of the present invention is directed to a computer system for topographical modeling. The computer system may comprise a processor for processing the computer-readable medium as defined above. A display may be coupled to the processor for displaying a topographical model based upon the processing.
Brief Description of the Drawings
FIG. 1 is a schematic block diagram of collecting topographical data, and generating a topographical model from the collected topographical data in accordance with the present invention.
FIG. 2 is a flow diagram for generating a topographical model in accordance with the present invention.
FIGS. 3-5 are computer screen displays corresponding to generating an original DEM in accordance with the present invention.
FIGS. 6-7 are computer screen displays corresponding to a re-sampling of the original DEM in accordance with the present invention.
FIGS. 8-10 are computer screen displays corresponding to a null fill process performed on the re-sampled DEM in accordance with the present invention.
FIGS. 11-13 are computer screen displays corresponding to DEM subtractions for generating an objects-only DEM and a DEM without the objects in accordance with the present invention.
FIGS. 14-15 are computer screen displays corresponding to a null expand process performed on the DEM without the objects as provided in FIG. 13.
FIGS. 16-17 are computer screen displays corresponding to generation of a terrain-only DEM in accordance with the present invention.
FIGS. 18-19 are computer screen displays corresponding to DEM subtractions for generating an enhanced objects-only DEM and an enhanced DEM without the objects in accordance with the present invention.
FIGS. 20-22 are computer screen displays corresponding to the generation of a final terrain-only DEM and a further enhanced objects-only DEM in accordance with the present invention.
FIG. 23 is a computer screen display corresponding to the generation of a second further enhanced objects-only DEM based upon a null fill/null expansion in accordance with the present invention.
FIG. 24 is a computer screen display corresponding to the generation of a noise-only DEM in accordance with the present invention.
FIG. 25 is a computer screen display corresponding to the generation of a final objects-only DEM in accordance with the present invention.
FIG. 26 is a flow diagram for separating the final objects-only DEM as provided in FIG. 25 into a building DEM and a foliage DEM.
FIG. 27 is a computer screen display for setting the parameters associated with separating the final objects-only DEM into a building DEM and a foliage DEM in accordance with the present invention.
FIG. 28 is a pictorial representation of the steps for separating the final objects-only DEM into a building DEM and a foliage DEM in accordance with the present invention.
FIG. 29 is a computer screen display of a topographical model generated in accordance with the present invention.
Detailed Description of the Preferred Embodiments
The present invention will now be described more fully hereinafter with reference to the accompanying drawings, in which preferred embodiments of the invention are shown. This invention may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like numbers refer to like elements throughout.
A collector 50 for collecting topographical data and a system 60 for generating a digital elevation model (DEM) from the collected topographical data will now be explained with reference to FIG. 1. The DEM is of an area that includes terrain 52 and objects on the terrain, wherein the objects may be buildings 54 and foliage 56. The foliage 56 primarily includes trees, and consequently, the terms foliage and trees will be used interchangeably. Modeling of the terrain 52, buildings 54 and trees 56 is based upon randomly or arbitrarily spaced data of elevation versus position on the area.
The collector 50, such as a light detection and ranging (LIDAR) collector, may be used for collecting the randomly spaced data. The randomly spaced data may nominally be a set of non-uniformly spaced measurements of position and height. The LIDAR collector 50 may be carried by an airplane 70 over the area of interest, such as a city. The area may also include relatively small features, such as roads 58, for example, as compared to the buildings 54 and trees 56. Those of skill in the art will appreciate that a LIDAR source provides data including elevation versus position information from a single image. Multiple optical images of the area taken from different perspectives are generally required to provide elevation versus position data, whereas this same information may be obtained from a single LIDAR image. Of course, the present invention may use elevation versus position data from sources such as optical (e.g., photography), electro-optical, and infrared sources, for example, in addition to LIDAR collectors as will be appreciated by those of skill in the art. The position information provided by the LIDAR data may include latitude and longitude information, for example, though other suitable position indicators may also be used. Once the randomly spaced data is collected, the data may be stored on a storage medium 80, such as a magnetic or optical disk, for example, for transfer to a computer 62. Of course, other suitable methods for transferring data may also be used, as readily appreciated by those skilled in the art. The randomly spaced data is used by the computer 62 to generate a DEM for viewing.
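As a rough sketch of how such randomly spaced returns might be gridded into a DEM, each (x, y, z) measurement can be binned into the nearest grid cell and, for example, the highest return kept per cell. The function name, the 1 meter cell size, the use of NaN for empty (null) posts and the choice of the maximum return are assumptions made for illustration; the patent does not prescribe this particular implementation.

    # Hedged sketch: grid non-uniformly spaced (x, y, z) returns into a DEM.
    # Cell size, NaN-as-null and "keep the highest return" are assumptions.
    import numpy as np

    def points_to_dem(x, y, z, cell_size=1.0):
        x, y, z = (np.asarray(a, dtype=float) for a in (x, y, z))
        cols = ((x - x.min()) / cell_size).astype(int)
        rows = ((y.max() - y) / cell_size).astype(int)
        dem = np.full((rows.max() + 1, cols.max() + 1), np.nan)
        for r, c, h in zip(rows, cols, z):
            if np.isnan(dem[r, c]) or h > dem[r, c]:
                dem[r, c] = h                      # keep the highest return in each cell
        return dem

    # Three returns; the first two land in the same 1 m cell.
    dem = points_to_dem([0.2, 0.4, 1.6], [0.1, 0.3, 1.8], [10.0, 14.5, 9.8])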
A display 64 is connected to the computer 62 for viewing the DEM. Input devices such as a keyboard 66 and mouse 68 are also connected to the computer 62. In accordance with the present invention, the computer 62 includes a processor 69 for 1) enhancing the DEM by creating a terrain-only DEM and an objects-only DEM, and then removing noise from the objects-only DEM, and 2) separating the objects-only DEM into a building DEM and a foliage DEM. Generating and enhancing an original or initial DEM will now be discussed with reference to the flow diagram of FIG. 2, as well as to the computer display screens illustrated in FIGS. 3-25. In the flow diagram, steps (1)-(19) will initially be discussed, wherein steps (2)-(19) are considered to be part of a batch process as will be discussed in greater detail below. In the batch process, some of the blocks illustrated in the flow diagram will be discussed more than once since their respective functions are repeated based upon a looping process. The number associated with each step being discussed is provided in parentheses within the corresponding block to better illustrate the method for enhancing the original DEM.
From the start (Block 100), an initial step (1) is using the computer 62 in Block 102 to generate an original DEM from the randomly spaced data as provided via the storage medium 80. Referring to the initial computer screen 200 as illustrated in FIG. 3, the user selects the "Generate DEM From Points" in field 202. This causes a "Points To DEM Settings" computer screen 204 to be displayed, as illustrated in FIG. 4. In the "Points To DEM Settings" computer screen 204, the name of the file storing the collected data is entered in field 206. The format of the points is selected in field 208. In this case, the points are based upon a universal transverse mercator (UTM) grid. The unit of measure of the points is selected in field 210, which is in meters, for example. The UTM grid includes 60 north-south zones, with each zone being 6 degrees of longitude wide. The UTM zone of interest is selected in field 212. Zone 15 is selected in field 212, for example. The resolution of the data is selected in field 214, and the window filter size is selected in field 216, as readily appreciated by those skilled in the art. The generated original DEM is provided in the computer screen 300 as illustrated in FIG. 5.
Steps (2)-(19) for enhancing the original DEM are initiated by selecting the "Run Batch Process" in field 220 from the initial computer screen 200 as illustrated in FIG. 3. As the batch process is run, fields 222, 224 and 226 allow the user to set certain parameters associated with the batch process. These parameters will be discussed below. In Block 104, step (2) is the re-sampling of the original DEM. The settings associated with the re-sampling are provided in the computer screen 230 as illustrated in FIG. 6. The original DEM had a resolution of 1 meter, for example, and will now be re-sampled at a lower resolution. The resolution is set in field 232, which is 30 meters, for example. The window filter size is also selected in field 234. The result is provided in the computer screen 302 as illustrated in FIG. 7, which is a smoothing of the objects 54, 56 on the terrain 52.
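A re-sampling of this kind might be sketched as a block filter that collapses each 30 x 30 group of 1 meter posts into a single 30 meter post. Reducing each block with its minimum (to bias the coarse surface toward the ground rather than toward rooftops and canopies) is an assumption for illustration; the patent only states that a resolution and a window filter size are selected.

    # Hedged sketch: re-sample a 1 m DEM to 30 m posts with a block filter.
    # The block-minimum reduction is an assumption, not the patent's stated filter.
    import numpy as np

    def resample(dem, factor=30, reduce=np.nanmin):
        rows = (dem.shape[0] // factor) * factor
        cols = (dem.shape[1] // factor) * factor
        blocks = dem[:rows, :cols].reshape(rows // factor, factor, cols // factor, factor)
        return reduce(blocks, axis=(1, 3))          # one post per factor x factor block

    coarse = resample(np.random.default_rng(0).normal(100.0, 5.0, (300, 300)))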
In Block 106, a null fill is performed on the re-sampled DEM. The null fill is associated with the null manipulations provided in field 226 from the initial computer screen 200 as illustrated in FIG. 3. The null manipulations may be divided into a null expansion or a null fill as provided in computer screen 240 in FIG. 8. Field 242 corresponds to the null expansion and field 244 corresponds to the null filling. Since a null fill is being performed, computer screen 250 is displayed as illustrated in FIG. 9. The settings associated with the null fill include field 252 for the method of the fill, field 254 for the number of fill passes to be performed, and field 256 for the fill reach. The resulting re-sampled DEM after null filling is provided in computer screen 304 as illustrated in FIG. 10. In Block 108, a DEM subtraction is performed.
Computer screen 260 is associated with the DEM subtraction as illustrated in FIG. 11. The threshold used in the DEM subtraction is selected in field 262. The re-sampled DEM after null filling in step (3) is subtracted from the original DEM in step (1) to produce an objects-only DEM. The objects-only DEM is provided in the computer screen 306 as illustrated in FIG. 12.
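The null fill and the thresholded DEM subtraction could be pictured roughly as follows. Filling each null post with the mean of the valid posts within the selected reach, and keeping only posts that stand more than a fixed height above the smoothed surface, are assumptions for illustration; the actual fill method, number of passes, reach and subtraction threshold are whatever values are chosen in fields 252-256 and 262.

    # Hedged sketch: (a) fill null (NaN) posts from nearby valid posts, (b) subtract the
    # smoothed surface from the original DEM and keep only large positive differences.
    # The neighbourhood-mean fill and the 2 m threshold are assumptions for this example.
    import numpy as np
    from scipy import ndimage

    def null_fill(dem, reach=1, passes=1):
        filled = dem.copy()
        for _ in range(passes):
            nulls = np.isnan(filled)
            if not nulls.any():
                break
            window = 2 * reach + 1
            values = ndimage.uniform_filter(np.where(nulls, 0.0, filled), size=window)
            weights = ndimage.uniform_filter(np.where(nulls, 0.0, 1.0), size=window)
            usable = nulls & (weights > 0)
            filled[usable] = values[usable] / weights[usable]   # mean of valid neighbours
        return filled

    def objects_only(original, smoothed, threshold=2.0):
        # 'smoothed' is assumed to be on the same grid as 'original' (e.g. re-sampled back to 1 m).
        difference = original - smoothed
        return np.where(difference > threshold, original, np.nan)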
In Block 110, step (5) is another DEM subtraction. The objects-only DEM from step (4) is subtracted from the original DEM in step (1) to produce a DEM without the objects. This DEM is provided in the computer screen 308 as illustrated in FIG. 13.
A null expansion is performed in Block 112 on the DEM without the objects, which corresponds to step (6). The computer screen 270 is associated with the null expansion as illustrated in FIG. 14. The nulls are expanded corresponding to the value selected in field 272. The null expansion makes sure that all of the objects have been removed so that the result is a DEM without the objects at the 1 meter resolution, as provided in the computer screen 310 and as illustrated in FIG. 15.
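Null expansion can be pictured as a morphological dilation of the null regions, so that posts immediately bordering the removed objects are also set to null. The number of dilation iterations below is an assumption; it stands in for the value selected in field 272.

    # Hedged sketch: expand null regions by dilating the NaN mask of the DEM.
    import numpy as np
    from scipy import ndimage

    def null_expand(dem, iterations=1):
        expanded_nulls = ndimage.binary_dilation(np.isnan(dem), iterations=iterations)
        out = dem.copy()
        out[expanded_nulls] = np.nan     # posts next to removed objects become null too
        return out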
The run batch process now loops back to Block 104 for step (7). Block 104 performs a re-sampling on the DEM without the objects as provided in FIG. 15. The re-sampling is performed at a lower resolution, that is, from 1 meter to 30 meters. The result is provided in the computer screen 312 as illustrated in FIG. 16.
In Block 106, a second null fill is performed, which corresponds to step (8). The second null fill is performed on the re-sampled DEM without the objects as provided in Block 112. This process generates a terrain-only DEM as provided in the computer screen 314 in FIG. 17. In Block 108, a second DEM subtraction is performed, which corresponds to step (9). The terrain-only DEM from step (8) is now subtracted from the original DEM in step (1) to produce an enhanced objects-only DEM. The enhanced objects-only DEM is provided in the computer screen 316 as illustrated in FIG. 18. In Block 110, step (10) is another DEM subtraction step. The enhanced objects-only DEM from step (9) is subtracted from the original DEM in step (1) to produce an enhanced DEM without the objects. The enhanced DEM without the objects is provided in the computer screen 318 as illustrated in FIG. 19.
For step (11), the run batch process again loops back to Block 104. Block 104 performs another re-sampling on the enhanced DEM without the objects as provided by Block 110. As before, the re-sampling is also performed at a lower resolution, that is, from 1 meter to 30 meters. The result is provided in the computer screen 320 as illustrated in FIG. 20. In Block 106, another null fill is performed, which corresponds to step (12). This third null fill is performed on the re-sampled DEM without the objects as provided by Block 104 to generate an enhanced terrain-only DEM as provided in the computer screen 322 in FIG. 21. This DEM is also referred to as the final terrain-only DEM.
In Block 108, a third DEM subtraction is performed, which corresponds to step (13). The enhanced terrain-only DEM from step (12) is subtracted from the original DEM in step (1) to produce an even further enhanced objects-only DEM. The further enhanced objects-only DEM is provided in the computer screen 324 as illustrated in FIG. 22.
A null expansion is performed on the further enhanced objects-only DEM in Block 114, which corresponds to step (14). In Block 116, a null fill is performed, which corresponds to step (15). Steps (14) and (15) are performed to remove noise from around the objects to generate an even further enhanced objects-only DEM. The process loops back to Block 114 so that steps (16) and (17) are performed. That is, another null expansion and null fill are performed to generate a second further enhanced objects-only DEM as provided in the computer screen 326 in FIG. 23.
In Block 118, a DEM subtraction is performed. The second further enhanced objects-only DEM from step (17) is subtracted from the further enhanced objects-only DEM from step (15) to produce a noise-only DEM. The noise-only DEM is provided in the computer screen 328 as illustrated in FIG. 24.
In Block 120, another DEM subtraction (step 19) is performed. The noise-only DEM from step (18) is subtracted from the second further enhanced objects-only DEM from step (15) to produce a final objects-only DEM. The final objects-only DEM is provided in the computer screen 330 as illustrated in FIG. 25. As discussed above for steps (2)-(19), a final terrain-only DEM and a final objects-only DEM have been generated. Compared to prior art DEMs, these DEMs are enhanced as a result of the looping iterations performed in steps (2)-(19).
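Read end to end, steps (2)-(19) amount to an iterative refinement loop: estimate a smooth terrain surface, subtract it from the original DEM to isolate the objects, remove those objects, and re-estimate the terrain. The sketch below compresses that loop into a few lines; the minimum and uniform filters standing in for the re-sample and null-fill operations, the 2 meter threshold, and the two passes are all assumptions for illustration rather than the patent's actual batch settings.

    # Hedged, highly simplified sketch of the batch loop in steps (2)-(19).
    import numpy as np
    from scipy import ndimage

    def terrain_and_objects(original, window=31, threshold=2.0, passes=2):
        """Return (terrain_only, objects_only) DEMs from an original DEM (2-D float array)."""
        work = original.copy()
        terrain = objects = None
        for _ in range(passes):
            # Coarse terrain estimate: a minimum filter followed by smoothing stands in
            # for the re-sample / null-fill pair in the patent's batch process.
            terrain = ndimage.uniform_filter(ndimage.minimum_filter(work, size=window), size=window)
            objects = np.where(original - terrain > threshold, original, np.nan)
            # Replace detected object posts by the terrain estimate before the next pass.
            work = np.where(np.isnan(objects), original, terrain)
        return terrain, objects

    rng = np.random.default_rng(1)
    dem = rng.normal(100.0, 0.5, (200, 200))
    dem[50:70, 50:80] += 20.0                          # a synthetic building for the example
    terrain_only, objects_only = terrain_and_objects(dem)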
Yet another aspect in accordance with the present invention that will now be discussed is the separation of the final objects-only DEM into a building DEM and a foliage DEM. In other words, the final objects-only DEM is separated into two separate DEMs so that each DEM may be separately processed. Separating the final objects-only DEM into a building DEM and a foliage DEM will now be discussed with reference to the flow diagram illustrated in FIG. 26, as well as to FIGS. 27-28. Referring to the computer screen 200 initially illustrated in FIG. 3, the user selects "Separate Buildings and Trees" in field 227. This causes a "Separate Buildings and Trees" computer screen 410 to be displayed, as illustrated in FIG. 27.
Still referring to FIG. 27, the user has the option of selecting several threshold parameters. Since the separation is performed based upon calculating a perimeter per area for each object, as well as the height of each object, corresponding comparison threshold values are set via the computer screen 410. For instance, the perimeter per area threshold is set in field 412. The minimum size of each object to be considered is set in field 414. This field is labeled as the minimum posts. The chord residue is selected in field 416, and corresponds to a width or length of a side of the object being considered. A maximum tree height is selected in field 418. A second threshold to be associated with a second perimeter per area test is selected in field 420. This second threshold selected in field 420 may be different than the first threshold selected in field 412. To start separating (Block 500) the final objects-only DEM into a building DEM and a foliage DEM, a perimeter versus area parameter for each object in the final objects-only DEM is determined in Block 502. For purposes of explaining the present invention, reference will also be made to FIG. 28, which provides a pictorial representation of how the final objects-only DEM is separated into a building DEM and a foliage DEM. For instance, a simplified representation of the final objects-only DEM as initially illustrated in FIG. 25 is provided in frame 600 in FIG. 28.
As illustrated in frame 600, the objects include buildings 54 and trees 56 grouped together in the same DEM. The perimeter versus area parameter for each object is compared to the selected threshold in Block 504 to identify whether each object in the DEM is a building 54 or foliage 56. Based upon the comparison to the threshold, the objects are separated into a building DEM and a foliage DEM, as illustrated in frames 602 and 604.
The data for each object includes a height value, and the height value for each object identified as foliage 56 in frame 604 is compared to a height threshold in Block 506. In Block 508, each foliage 56 in frame 604 is re-identified as a building 54 if the height value associated therewith is greater than the height threshold. As illustrated in frames 606 and 608, the building 54 initially identified as foliage 56 in frame 604 has been re-identified as a building in frame 606. However, tall trees 56 in frame 604 have also been re-identified as buildings based upon the comparison to the height threshold, as shown in frame 606. To address this, a second perimeter versus area parameter is determined in Block 510 for each object identified as a building in frame 606. Each second perimeter versus area parameter is compared to a second threshold in Block 512. Each building 54 is re-identified as foliage 56 if the second perimeter versus area parameter is greater than the second threshold in Block 514.
The objects identified as buildings 54 are separated into a building DEM, and the objects identified as foliage are separated into a foliage DEM in Block 516. Separate building and foliage DEMs advantageously allow more realistic topographical models to be generated with significantly less user intervention. The building DEM is represented by frame 610, and the foliage DEM is represented by frame 612. The method for separating the final objects-only DEM into a building DEM and a foliage DEM ends at Block 518. The above steps for separating the final objects-only DEM into two separate DEMs correspond to step (20) in FIG. 2.
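A minimal sketch of this separation, assuming the objects-only DEM marks empty posts with NaN and that the perimeter versus area parameter is the ratio of boundary posts to total posts of each connected object, is given below. The threshold values, the minimum object size, the connectivity used for labeling, and the reuse of the same perimeter-to-area measure for the second test are assumptions for illustration; the patent leaves the exact computation to the values configured in FIG. 27.

    # Hedged sketch of Blocks 502-516: classify each object by perimeter versus area,
    # re-identify tall foliage as buildings, apply the second perimeter/area test, and
    # split the result into a building DEM and a foliage DEM. Thresholds are assumptions.
    import numpy as np
    from scipy import ndimage

    def separate_buildings_and_foliage(objects_dem,
                                       pa_threshold=0.6,      # first perimeter/area threshold (assumed)
                                       max_tree_height=20.0,  # height threshold in meters (assumed)
                                       pa_threshold_2=0.8,    # second perimeter/area threshold (assumed)
                                       min_posts=10):         # minimum object size (assumed)
        labels, count = ndimage.label(~np.isnan(objects_dem))       # group posts into discrete objects
        building_dem = np.full_like(objects_dem, np.nan)
        foliage_dem = np.full_like(objects_dem, np.nan)
        for index in range(1, count + 1):
            obj = labels == index
            area = int(obj.sum())
            if area < min_posts:
                continue                                            # too small to classify
            perimeter = int((obj & ~ndimage.binary_erosion(obj)).sum())   # boundary posts
            pa = perimeter / area
            is_building = pa <= pa_threshold                        # compact outline -> building
            if not is_building and np.nanmax(objects_dem[obj]) > max_tree_height:
                is_building = True                                  # tall "foliage" re-identified as building
            if is_building and pa > pa_threshold_2:
                is_building = False                                 # ragged but tall -> foliage after all
            target = building_dem if is_building else foliage_dem
            target[obj] = objects_dem[obj]
        return building_dem, foliage_dem

In this sketch, the posts classified as buildings end up in building_dem and the remainder in foliage_dem, loosely mirroring frames 610 and 612.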
The remaining steps (21)-(26) will now be discussed. The buildings 54 and the trees 56 will each be modeled differently. Block 124 corresponds to step (21) and is optional, but allows the user to manually clean up or edit the separation of the buildings 54 and trees 56 in case the automatic process failed to identify each object correctly.
In Block 126, which corresponds to step (22), each foliage in the foliage DEM 612 is modeled as 3D points. The user selects the "Generate Points From DEM" in field 228 as shown in FIG. 3 to convert the points into a list of x (latitude), y (longitude) and z (height).
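Converting the foliage DEM posts into such a point list might look like the sketch below. The mapping from grid indices to latitude and longitude is shown only schematically, with an assumed origin and post spacing, since the actual georeferencing would come from the DEM's own UTM metadata.

    # Hedged sketch: emit one (latitude, longitude, height) record per valid foliage post.
    # The origin and the per-post angular spacing are placeholder assumptions.
    import numpy as np

    def dem_to_points(foliage_dem, origin_lat=0.0, origin_lon=0.0, post_deg=1e-5):
        rows, cols = np.nonzero(~np.isnan(foliage_dem))
        return [(origin_lat - r * post_deg,           # x: latitude
                 origin_lon + c * post_deg,           # y: longitude
                 float(foliage_dem[r, c]))            # z: height
                for r, c in zip(rows, cols)]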
In Block 128, which corresponds to step (23), the buildings are modeled as vectors. Modeling buildings as vectors is disclosed in U.S. Patent No. 6,654,690 as discussed in the background section. Texture is placed on the polygons representing the buildings in Block 130, which corresponds to step (24). In other words, images are placed on the polygons to give the topographical model a realistic look. RealSite™ is one commercially available tool to perform this task. RealSite™ was developed by the Harris Corporation, which is the assignee of the present invention.
In Block 132, SceneBuilder™ is used to format all of the generated geometry and textures for display on the computer system 60. SceneBuilder™ is also a commercially available tool. Using InReality™ in Block 134, the final topographical model for display is provided, as illustrated in FIG. 29. InReality™ is another commercially available tool developed by the Harris Corporation, and allows the user to navigate virtual scenes and conduct various analyses.
InReality™ is designed to be a companion to the RealSite™ software. The process ends at Block 136.

Claims

1. A computer implemented method for processing a digital elevation model (DEM) including data for a plurality of objects, the method comprising: determining a perimeter versus area parameter for each object in the DEM; and comparing the perimeter versus area parameter for each object to a threshold to identify whether each object in the DEM is a building or foliage.
2. A computer implemented method according to Claim 1 wherein the data for each object includes a height value, and further comprising: comparing the height value for each object identified as foliage to a height threshold; and re-identifying each foliage as a building if the height value associated therewith is greater than the height threshold.
3. A computer implemented method according to Claim 2 further comprising: determining a second perimeter versus area parameter for each object identified as a building; comparing each second perimeter versus area parameter to a second threshold; and re-identifying each building as foliage if the second perimeter versus area parameter is greater than the second threshold.
4. A computer implemented method according to Claim 1 wherein the objects identified as buildings are separated into a building DEM; and wherein the objects identified as foliage are separated into a foliage DEM.
5. A computer implemented method according to Claim 4 further comprising: modeling the buildings in the building DEM; modeling the foliage in the foliage DEM; and displaying the modeled buildings and foliage on a display.
6. A computer system for topographical modeling comprising: a processor for processing a digital elevation model (DEM) including data for a plurality of objects, the processing comprising determining a perimeter versus area parameter for each object in the DEM, and comparing the perimeter versus area parameter for each object to a threshold to identify whether each object in the DEM is a building or foliage; and a display coupled to said processor for displaying a topographical model based upon the processing.
7. A computer system according to Claim 6 wherein the data for each obj ect includes a height value , and wherein said processor : compares the height value for each obj ect identified as foliage to a height threshold; and re-identifies each foliage as a building if the height value associated therewith is greater than the height threshold.
8. A computer system according to Claim 7 wherein said processor : determines a second perimeter versus area parameter for each object identified as a building; compares each second perimeter versus area parameter to a second threshold; and re-identifies each building as foliage if the second perimeter versus area parameter is greater than the second threshold .
9. A computer system according to Claim 6 wherein the obj ects identified by said processor as buildings are separated into a building DEM; and wherein the obj ects identified by said processor as foliage are separated into a foliage DEM.
10. A computer system according to Claim 9 wherein said processor models the buildings in the building DEM and models the foliage in the foliage DEM for defining the topographical model being displayed on said display.
EP06734356A 2005-02-08 2006-02-06 Method and apparatus for distinguishing foliage from buildings for topographical modeling Withdrawn EP1851686A4 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/053,219 US7191066B1 (en) 2005-02-08 2005-02-08 Method and apparatus for distinguishing foliage from buildings for topographical modeling
PCT/US2006/003978 WO2006086252A1 (en) 2005-02-08 2006-02-06 Method and apparatus for distinguishing foliage from buildings for topographical modeling

Publications (2)

Publication Number Publication Date
EP1851686A1 true EP1851686A1 (en) 2007-11-07
EP1851686A4 EP1851686A4 (en) 2012-02-08

Family

ID=36793363

Family Applications (1)

Application Number Title Priority Date Filing Date
EP06734356A Withdrawn EP1851686A4 (en) 2005-02-08 2006-02-06 Method and apparatus for distinguishing foliage from buildings for topographical modeling

Country Status (6)

Country Link
US (1) US7191066B1 (en)
EP (1) EP1851686A4 (en)
JP (1) JP2008530594A (en)
CA (1) CA2597056C (en)
TW (1) TWI309027B (en)
WO (1) WO2006086252A1 (en)

Families Citing this family (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7424335B2 (en) * 2005-07-13 2008-09-09 Swift Lawrence W Identification of terrestrial foliage location, type and height for scaled physical models
US7881913B2 (en) * 2007-02-12 2011-02-01 Harris Corporation Exemplar/PDE-based technique to fill null regions and corresponding accuracy assessment
JP4378571B2 (en) 2007-05-31 2009-12-09 Necシステムテクノロジー株式会社 MAP CHANGE DETECTION DEVICE, MAP CHANGE DETECTION METHOD, AND PROGRAM
US7650240B2 (en) * 2007-06-22 2010-01-19 Weyerhaeuser Nr Company Estimating an attribute value using spatial interpolation and masking zones
WO2009051258A1 (en) * 2007-10-19 2009-04-23 Pasco Corporation House change judgment method and house change judgment program
US8427505B2 (en) * 2008-11-11 2013-04-23 Harris Corporation Geospatial modeling system for images and related methods
US8275547B2 (en) * 2009-09-30 2012-09-25 Utility Risk Management Corporation, Llc Method and system for locating a stem of a target tree
IL202062A0 (en) 2009-11-11 2010-11-30 Dror Nadam Apparatus, system and method for self orientation
US8503761B2 (en) * 2009-11-12 2013-08-06 Harris Corporation Geospatial modeling system for classifying building and vegetation in a DSM and related methods
US20110144962A1 (en) * 2009-12-11 2011-06-16 Harris Corporation Geospatial modeling system providing enhanced foliage void region inpainting features and related methods
WO2011088473A2 (en) * 2010-01-18 2011-07-21 The Regents Of The University Of California System and method for identifying patterns in and/or predicting extreme climate events
US20140354626A1 (en) * 2010-05-12 2014-12-04 Google Inc. Block Based Level of Detail Representation
WO2012092554A1 (en) 2010-12-30 2012-07-05 Utility Risk Management Corporation, Llc Method for locating vegetation having a potential to impact a structure
US11035674B2 (en) * 2019-05-15 2021-06-15 Applied Research Associates, Inc. GPS-denied geolocation
CN110502979B (en) * 2019-07-11 2023-04-14 哈尔滨工业大学 Laser radar waveform signal classification method based on decision tree

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3451329B2 (en) * 1995-02-09 2003-09-29 松下電器産業株式会社 Cartographic equipment
FR2759803B1 (en) 1997-02-20 1999-03-26 Alsthom Cge Alcatel METHOD FOR AIDING THE DETECTION OF HUMAN STRUCTURES IN A DIGITAL TERRAIN MODEL (DTM)
WO1998044739A1 (en) 1997-03-31 1998-10-08 Sharp Kabushiki Kaisha Mosaic generation and sprite-based image coding with automatic foreground and background separation
US6338027B1 (en) 1999-05-27 2002-01-08 Arborcom Technologies Inc. Canopy modification using computer modelling
US6664529B2 (en) 2000-07-19 2003-12-16 Utah State University 3D multispectral lidar
JP2002074323A (en) * 2000-09-01 2002-03-15 Kokusai Kogyo Co Ltd Method and system for generating three-dimensional urban area space model
AUPR301401A0 (en) 2001-02-09 2001-03-08 Commonwealth Scientific And Industrial Research Organisation Lidar system and method
JP2003156330A (en) * 2001-11-22 2003-05-30 Nec Corp Airborne topography-measuring apparatus and method
JP4228745B2 (en) * 2003-03-28 2009-02-25 株式会社日立製作所 Multispectral image analysis device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0505077A2 (en) * 1991-03-20 1992-09-23 Hughes Aircraft Company Rectilinear object image matcher
WO1999022337A1 (en) * 1997-10-24 1999-05-06 Innovative Solutions Group, Inc. Method and apparatus for enhancing cartographic images and identifying and extracting the features therein
US20020147567A1 (en) * 2001-04-05 2002-10-10 Harris Corporation Automated method for making a topographical model and related system

Non-Patent Citations (8)

* Cited by examiner, † Cited by third party
Title
"Automated Building Extraction and Reconstruction from LIDAR Data", 20040717 , 17 July 2004 (2004-07-17), pages 1-27, XP008122489, Retrieved from the Internet: URL:http://www.icrest.missouri.edu/Project s/NASA/FeatureExtraction-Buildi ngs/Building%20Extraction.pdf *
ANSGAR BRUNN ET AL: "Hierarchical Bayesian nets for building extraction using dense digital surface models", ISPRS JOURNAL OF PHOTOGRAMMETRY AND REMOTE SENSING, vol. 53, no. 5, 1 October 1998 (1998-10-01), pages 296-307, XP55015727, ISSN: 0924-2716, DOI: 10.1016/S0924-2716(98)00012-4 *
G PRIESTNALL ET AL: "Extracting urban features from LiDAR digital surface models", COMPUTERS, ENVIRONMENT AND URBAN SYSTEMS, vol. 24, no. 2, 31 March 2000 (2000-03-31), pages 65-78, XP55015729, ISSN: 0198-9715, DOI: 10.1016/S0198-9715(99)00047-2 *
HAITHCOAT T L ET AL: "Development of Comprehensive Accuracy Assessment Indexes for Building Footprint Extraction", IEEE TRANSACTIONS ON GEOSCIENCE AND REMOTE SENSING, IEEE SERVICE CENTER, PISCATAWAY, NJ, US, vol. 43, no. 2, 1 February 2005 (2005-02-01), pages 402-404, XP011125841, ISSN: 0196-2892, DOI: 10.1109/TGRS.2004.838418 *
See also references of WO2006086252A1 *
W Cho ET AL: "Pseudo-grid based building extraction using airborne LIDAR data.", International Archives of Photogrammetry and Remote Sensing, 35 B3 (2004), 1 January 2004 (2004-01-01), pages 378-381, XP55015738, Retrieved from the Internet: URL:-- [retrieved on 2012-01-04] *
Zheng Wang ET AL: "BUILDING EXTRACTION AND RECONSTRUCTION FROM LIDAR DATA", International Archives of Photogrammetry and Remote Sensing. Vol. XXXIII, Part B3. Amsterdam 2000., 1 January 2000 (2000-01-01), pages 958-964, XP55015735, Amsterdam Retrieved from the Internet: URL:http://www.isprs.org/proceedings/XXXIII/congress/part3/958_XXXIII-part3.pdf [retrieved on 2012-01-04] *
Zheng Wang: "EXTRACTING BUILDING INFORMATION FROM LIDAR DATA", ISPRS Proceedings of Commission III Symposium on Object Recognition and Scene Classification from Multispectral and Multisensor Pixels, Columbus, Ohio, USA, Volume 32, Part 3/1, 7 July 1998 (1998-07-07), pages 279-284, XP55015736, Columbus, Ohio, USA. Retrieved from the Internet: URL:-- [retrieved on 2012-01-04] *

Also Published As

Publication number Publication date
WO2006086252A1 (en) 2006-08-17
TWI309027B (en) 2009-04-21
CA2597056A1 (en) 2006-08-17
EP1851686A4 (en) 2012-02-08
JP2008530594A (en) 2008-08-07
US7191066B1 (en) 2007-03-13
TW200632781A (en) 2006-09-16
CA2597056C (en) 2012-05-29

Similar Documents

Publication Publication Date Title
CA2597057C (en) Method and apparatus for enhancing a digital elevation model (dem) for topographical modeling
CA2597056C (en) Method and apparatus for distinguishing foliage from buildings for topographical modeling
US6654690B2 (en) Automated method for making a topographical model and related system
Chen et al. A methodology for automated segmentation and reconstruction of urban 3-D buildings from ALS point clouds
US8718393B2 (en) Method for reconstruction of urban scenes
EP2118854B1 (en) Exemplar/pde-based technique to fill null regions and corresponding accuracy assessment
US7983474B2 (en) Geospatial modeling system and related method using multiple sources of geographic information
Truong-Hong et al. Octree-based, automatic building facade generation from LiDAR data
TW200926060A (en) Geospatial modeling system providing user-selectable building shape options and related methods
Su et al. A new hierarchical moving curve-fitting algorithm for filtering lidar data for automatic DTM generation
BRPI0714264A2 (en) geospatial modeling system, computer readable geospatial modeling method and computer executable modules
JP2010525491A (en) Geospatial modeling system and associated method for providing data decimation of geospatial data
CN115344914A (en) Model generation method suitable for city planning design
CN109558801B (en) Road network extraction method, medium, computer equipment and system
BRPI0714260A2 (en) geospatial modeling system and geospatial modeling methods
BRPI0714263A2 (en) geospatial modeling system and method
Gulch Digital systems for automated cartographic feature extraction
CN114529689B (en) Ceramic cup defect sample amplification method and system based on antagonistic neural network
CN110706347A (en) Implementation method for creating 3D building model through wire frame diagram of building
Mahphood et al. Virtual first and last pulse method for building detection from dense LiDAR point clouds
Zeng et al. An improved extraction method of individual building wall points from mobile mapping system data
Valério Reconstrução de Imagem 3D Após Acidente de Trânsito
JPH10312466A (en) Image processor, image processing method and recording medium
Huang et al. Detecting, Modeling and Predicting Vertical Urban Growth: An exploratory review. GeoComputation 2019
Martínez et al. A Semiautomatic Large-Scale Detection of Simple Geometric Primitives for Detecting Structural Defects from Range-Based Information

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20070831

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): DE FR GB

RBV Designated contracting states (corrected)

Designated state(s): DE FR GB

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20120112

RIC1 Information provided on ipc code assigned before grant

Ipc: G06K 9/48 20060101AFI20120105BHEP

Ipc: G06T 7/60 20060101ALI20120105BHEP

Ipc: G06K 9/00 20060101ALI20120105BHEP

17Q First examination report despatched

Effective date: 20130222

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION HAS BEEN WITHDRAWN

18W Application withdrawn

Effective date: 20130319