US11495026B2 - Aerial line extraction system and method - Google Patents
Aerial line extraction system and method
- Publication number
- US11495026B2 (application number US17/047,416)
- Authority
- US
- United States
- Prior art keywords
- area
- interest
- aerial line
- point cloud
- dimensional point
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/12—Edge-based segmentation
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B21/00—Measuring arrangements or details thereof, where the measuring technique is not covered by the other groups of this subclass, unspecified or not relevant
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C15/00—Surveying instruments or accessories not provided for in groups G01C1/00 - G01C13/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/483—Details of pulse systems
- G01S7/486—Receivers
- G01S7/487—Extracting wanted echo signals, e.g. pulse detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/48—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
- G01S7/51—Display arrangements
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T11/00—2D [Two Dimensional] image generation
- G06T11/001—Texturing; Colouring; Generation of texture or colour
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/40—Image enhancement or restoration using histogram techniques
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/11—Region-based segmentation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H02—GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
- H02G—INSTALLATION OF ELECTRIC CABLES OR LINES, OR OF COMBINED OPTICAL AND ELECTRIC CABLES OR LINES
- H02G1/00—Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines
- H02G1/02—Methods or apparatus specially adapted for installing, maintaining, repairing or dismantling electric cables or lines for overhead lines or cables
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10028—Range image; Depth image; 3D point clouds
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20021—Dividing image into blocks, subimages or windows
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/25—Determination of region of interest [ROI] or a volume of interest [VOI]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/50—Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG]; by summing image-intensity values; Projection analysis
Definitions
- the present invention relates to a method of processing a three-dimensional point cloud data, and more particularly to a technique for extracting an aerial line from a three-dimensional point cloud data.
- A technique of acquiring three-dimensional map information by using a camera or a laser distance measuring device has been known (for example, Patent Document 1).
- FIG. 1 is a perspective view illustrating an example in which a three-dimensional point cloud data is acquired by a laser distance measuring device.
- a laser distance measuring device 102 mounted on a vehicle 101 acquires point cloud data while the vehicle travels, so that three-dimensional map data can be generated. Such a system is called a Mobile Mapping System (MMS).
- the three-dimensional point cloud data includes not only roads 103, utility poles 104, and architectural structures such as buildings and signs, but also data of electric lines and communication lines installed in the air (collectively referred to as aerial lines 105). Such arrangement information of the aerial lines 105 is useful when performing maintenance of electric lines, communication lines, and the like.
- FIG. 2 illustrates an example of the three-dimensional point cloud data acquired in FIG. 1 .
- the three-dimensional point cloud data includes a point cloud 203 of roads, a point cloud 204 of utility poles, and a point cloud 205 of aerial lines.
- the point cloud 205 of aerial lines has missing portions 206 and 207 of data. For this reason, in order to generate the three-dimensional map data from the acquired three-dimensional point cloud data, it is necessary to supplement the missing data of the three-dimensional point cloud data.
- the acquired three-dimensional point cloud data is displayed on a display, and a user supplements the data by designating the location where data should be supplemented.
- FIG. 3 conceptually illustrates a process of supplementing the missing portions of the three-dimensional point cloud data of FIG. 2 displayed on the display.
- the point cloud 205 of the aerial line has the missing portions 206 and 207 of data.
- the user designates, for example, the aerial lines at both ends of the missing portions and supplements the missing portions by interpolation or extrapolation.
- points 301 and 302 are designated, and the missing portion 206 between the points is supplemented.
- points 302 and 303 are designated, and the missing portion 207 between the points is supplemented.
- the missing portions 206 and 207 can be supplemented.
- an object of the present invention is to provide a technique that facilitates selecting and designating an arbitrary one of a plurality of aerial lines.
- an aerial line extraction system including: an area-of-interest cropping unit that crops a region where an aerial line is assumed to exist as an area of interest by setting a support of the aerial line as a reference from a three-dimensional point cloud data; an element segmenting unit that segments the area of interest into a plurality of subdivided areas, obtains a histogram by counting three-dimensional point clouds existing in each of the subdivided areas, and obtains a segmentation plane of the area of interest on the basis of the histogram; and an element display unit that segments the area of interest into a plurality of segmented areas by the segmentation plane and displays the three-dimensional point clouds included in each of the segmented areas in a distinguishable manner.
- This method includes: a first step of reading the three-dimensional point cloud data including an aerial line and a support of the aerial line from the storage device; a second step of cropping a region where the aerial line is likely to be included as an area of interest by setting the support of the read three-dimensional point cloud data as a reference; a third step of segmenting the area of interest into a plurality of subdivided areas having the same shape and the same volume; a fourth step of counting the number of three-dimensional point clouds included in each of the subdivided areas; a fifth step of extracting a plane in which a distribution of the three-dimensional point clouds becomes sparse with respect to surroundings as a segmentation plane from the result of a counting; and a sixth step of segmenting the area of interest by the segmentation plane,
- FIG. 1 is a perspective view illustrating an example in which a three-dimensional point cloud data is acquired by a laser distance measuring device.
- FIG. 2 is a conceptual diagram illustrating the three-dimensional point cloud data acquired by the laser distance measuring device.
- FIG. 3 is a conceptual diagram of a process of supplementing a missing portion of three-dimensional point cloud data.
- FIG. 4 is a flowchart illustrating a processing flow of an aerial line extraction system according to an embodiment.
- FIG. 5 is a two-view diagram illustrating the concept of the area-of-interest cropping processing.
- FIG. 6 is a graph showing the number of point cloud data counted for each slice.
- FIG. 7 is an image diagram of an element selection screen displayed on a display of a first embodiment.
- FIG. 8 is a block diagram illustrating a configuration of the aerial line extraction system according to the embodiment.
- FIG. 9 is a graph showing a concept of processing by an element segmenting unit according to a second embodiment.
- FIG. 10 is a conceptual diagram illustrating a display example of three-dimensional point cloud data including a lead-in line.
- FIG. 11 is a conceptual diagram in which a three-dimensional point cloud including a lead-in line is viewed from above.
- FIG. 12 is a graph showing a concept of processing by an element segmenting unit according to a third embodiment.
- FIG. 13 is an image diagram of an element selection screen displayed on a display according to the third embodiment.
- the description may be made with the same reference numerals attached with different subscripts. However, in a case where it is not necessary to distinguish a plurality of elements, the description may be made with the subscripts omitted.
- the notations such as “first”, “second”, and “third” in this specification and the like are given to identify components, and thus, the notations do not necessarily limit the number, order, or contents thereof.
- the numbers for identifying the components are used for each context, and thus the numbers used in one context do not always indicate the same configuration in other contexts. In addition, a component identified by a certain number may also serve as a component identified by another number.
- an area where an aerial line is assumed to exist is cropped as an area of interest from a three-dimensional point cloud data by setting a support (for example, a utility pole) of an aerial line as a reference.
- the area of interest is, for example, a rectangular parallelepiped existing between the two utility poles.
- the area of interest is, for example, a cylinder having one utility pole as the central axis.
- the area of interest is segmented into a plurality of subdivided areas (hereinafter, referred to as “slices”) having the same shape and the same volume with a plurality of planes (subdivision planes) approximately parallel to the longitudinal direction (extension direction) of the aerial line.
- the subdivision plane is, for example, a plane that is perpendicular to the ground and parallel to a line connecting the utility poles.
- the subdivision plane is, for example, a plane that is perpendicular to the ground and segments a cylinder at equal angles in the circumferential direction.
- a histogram is obtained by counting the three-dimensional point clouds existing in each slice. This gives the distribution of the three-dimensional point clouds, in which a portion where the three-dimensional point cloud of an aerial line exists becomes a mountain and a portion where no three-dimensional point cloud exists becomes a valley. Therefore, the area of interest is segmented by setting the valley portions as segmentation planes to obtain a plurality of segmented areas. The three-dimensional point clouds (hereinafter, sometimes referred to as "elements") existing in each segmented area then belong to the same aerial line. Since each aerial line can be identified by its segmented area in this manner, a user can easily designate a desired aerial line.
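The slice-count-and-split idea above can be sketched in a few lines of Python. This is an illustrative sketch, not the patented implementation: the function name, the NumPy array layout of the point cloud, and the rule "a slice with no points is a valley" are assumptions made for the example.

```python
import numpy as np

def segment_by_histogram(points, slice_thickness=0.05):
    """Segment an area-of-interest point cloud into elements (aerial lines)
    by counting points in thin slices along the local y axis and splitting
    at empty 'valley' slices. points is an (N, 3) array of (x, y, z) in
    local coordinates with y running across the parallel lines."""
    y = points[:, 1]
    edges = np.arange(y.min(), y.max() + slice_thickness, slice_thickness)
    counts, _ = np.histogram(y, bins=edges)

    # A slice containing no points is part of a valley between two lines.
    occupied = counts > 0
    # Label contiguous runs of occupied slices; each run is one element.
    run_id = np.cumsum(np.diff(np.concatenate([[0], occupied.astype(int)])) == 1)
    slice_label = np.where(occupied, run_id, 0)

    # Map each point to its slice, then to its element label (1, 2, 3, ...).
    idx = np.clip(np.searchsorted(edges, y, side="right") - 1, 0, len(counts) - 1)
    return slice_label[idx]
```

Because every point falls in an occupied slice, the returned labels start at 1 and points sharing a label belong to the same aerial line.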
- a three-dimensional point cloud including the utility poles and the electric lines is input, the input point cloud is segmented, on the basis of density, into a plurality of elements by planes that are parallel to the line connecting the utility poles and perpendicular to the ground, and each of the segmented elements is displayed on the display unit.
- FIG. 4 is a diagram illustrating a processing flow of an aerial line extraction system according to a first embodiment.
- the aerial line extraction system is realized by a general information processing apparatus executing software, as described later.
- a three-dimensional point cloud data acquired by a method described in FIG. 1 is input to the information processing apparatus via an input interface.
- the three-dimensional point cloud data is a data of a point cloud as illustrated in FIG. 2 .
- each point is represented by coordinates (x, y, z).
- the origin (0, 0, 0) of the coordinates and the x-axis, y-axis, and z-axis can be determined arbitrarily; these will be referred to as world coordinates.
- a spherical coordinate system or other coordinate systems may be used.
- the input three-dimensional point cloud data is stored in a storage device and used for subsequent processing.
- in step S402, a portion where the point cloud 205 of the aerial lines is likely to exist is cropped from the three-dimensional point cloud data as an area of interest.
- the cropping method is not particularly limited; for example, the space between two utility poles is defined as the area of interest.
- the user designates the coordinates (x1, y1) and (x2, y2) of two utility poles by selecting the two utility poles 104 with a means such as a mouse click while viewing the three-dimensional point cloud data displayed as illustrated in FIG. 3.
- if the coordinates of the utility poles 104 can be obtained from another database, those coordinates may be used.
- FIG. 5 is a diagram describing the concept of the area-of-interest cropping process.
- FIG. 5 illustrates a schematic diagram (a) of the three-dimensional point cloud data of FIG. 2 viewed from above and a schematic diagram (b) viewed from the side.
- when local coordinates are employed, the longitudinal direction (extension direction) of the aerial line 105 (the point cloud 205 of the aerial line) is set to x, the gravity direction (the direction perpendicular to the ground surface) is set to z, and the direction perpendicular to x and z is set to y.
- a predetermined width W is defined, in the xy plane, in the direction perpendicular to the line 502 with the line 502 as the center.
- a predetermined height H is defined in the z direction (height direction). The range of the width W and the height H may be determined in advance and stored in the system as the area definition file 807.
- the width W is, for example, 1 m on both sides of the line 502 as a center.
- the height H covers, for example, a range of 5 m to 10 m from the ground surface.
- these numerical values are examples and may be set arbitrarily. Since the point cloud in the area of interest excludes the road 103, the utility pole 104, and the like, it is highly likely to belong to the aerial line 105.
- the length L of the area of interest 501 in the longitudinal direction may be the distance between (x1, y1) and (x2, y2) (the distance between the utility poles 104).
- a gap 503 from the utility pole is defined in the area definition file, and the length L of the area of interest 501 in the longitudinal direction may be obtained from "distance between the utility poles − (gap from the utility pole × 2)".
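As a concrete illustration of the cropping rule above, the following Python sketch tests whether a point falls inside the rectangular-parallelepiped area of interest defined by two pole coordinates, the width W, the height range of H, and the gap from the poles. The function name and the default values (W = 2 m, height range 5–10 m, gap = 0.5 m) are assumptions for the example, not values fixed by the embodiment.

```python
import math

def in_area_of_interest(pt, p1, p2, width=2.0, z_min=5.0, z_max=10.0, gap=0.5):
    """True if the 3-D point pt = (x, y, z) lies in the area of interest
    between the utility poles at p1 = (x1, y1) and p2 = (x2, y2)."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    dist = math.hypot(dx, dy)
    ux, uy = dx / dist, dy / dist          # unit vector along the pole line 502
    rx, ry = pt[0] - p1[0], pt[1] - p1[1]
    along = rx * ux + ry * uy              # coordinate along the line
    across = -rx * uy + ry * ux            # signed distance from the line
    return (gap <= along <= dist - gap     # within L = distance - 2 * gap
            and abs(across) <= width / 2.0 # within the width W
            and z_min <= pt[2] <= z_max)   # within the height range H
```

Collecting all points for which this predicate is true yields the point cloud of the area of interest 501.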
- this conversion is not essential and may be omitted, but in order to simplify the following description, an example of conversion will be described in the embodiment.
- in step S404, the area of interest 501, cropped from the three-dimensional point cloud data as a region where an aerial line is likely to exist, is sliced with planes (subdivision planes) including the longitudinal direction of the aerial line 105 (the direction away from one of the utility poles 104) and the gravity direction.
- the area of interest 501 is segmented into a plurality of thin areas on the xz plane.
- the thickness of segmentation is defined in advance in the area definition file 807 and is, for example, 5 cm.
- the area of interest 501 is segmented in the direction (y direction) in which the aerial lines 105 are aligned, and thus, a plurality of subdivided areas (slices) are generated.
- the slices usually have the same shape and the same volume.
- the length of the slice in the longitudinal direction of the aerial line 105 (the size in the x direction in the local coordinate system of FIG. 5) is equal to the length L of the area of interest 501.
- the size of the slice in the gravity direction (the size in the z direction in the local coordinate system of FIG. 5) is equal to the height H of the area of interest 501. In contrast, the size in the remaining direction (the thickness T of the slice; the size in the y direction in the local coordinate system of FIG. 5) is remarkably small compared with the length L and the height H.
- for example, the length L is several meters to several tens of meters, the height H is several meters, and the thickness T of the slice is several centimeters to several tens of centimeters. Therefore, the distribution of the point cloud 205 of the aerial line in the width direction can be accurately identified.
- in step S405, the number of three-dimensional point cloud data included in each slice is counted.
- FIG. 6 is a graph showing the number of the counted three-dimensional point cloud data.
- a histogram as shown in FIG. 6 is completed.
- when three aerial lines 105 run in parallel, three peaks 602 are formed in the histogram 601 of the point cloud. Each of the peaks 602 indicates one aerial line.
- in step S406, the valley portions of the histogram 601 are set as segmentation planes 603, so that the three-dimensional point cloud data of the three aerial lines can be separated into the segmented areas 604a, 604b, and 604c.
- the histogram of FIG. 6 may be displayed on the display, and the segmentation plane 603 may be determined on the basis of information designated by the user in response to the display. Alternatively, a threshold value may be set, and a location where the number of point clouds is equal to or less than the threshold value may be automatically determined as a segmentation plane. The smaller the thickness T of the slice, the higher the resolution of the histogram; however, the thickness is determined in consideration of the trade-off with the processing time.
- when the slice direction is approximately parallel to the longitudinal direction of the aerial line 105, the peaks and valleys of the histogram can be clearly identified as shown in FIG. 6. If the slice direction is inclined by a predetermined angle or more with respect to the longitudinal direction (the x-axis direction in FIG. 5) of the aerial line 105, the peaks and valleys of the histogram may not be identified. In such a case, however, the peaks and valleys may be identified by adjusting the angle of the slice direction.
- in step S404, the area of interest 501 illustrated in FIG. 5 is sliced by setting the xz plane as the subdivision plane, where the x-axis is a line connecting the two utility poles 104.
- in step S407, the segmentation plane 603 is converted to the world coordinate system as necessary, and the planar parameters (a, b, c, d) are transmitted to the subsequent processing.
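Assuming the planar parameters (a, b, c, d) describe a plane a·x + b·y + c·z + d = 0 (the embodiment does not spell out the representation), a local segmentation plane at y = y0 could be converted to world coordinates as in the sketch below; the function name is illustrative.

```python
import math

def segmentation_plane_world(y0, p1, p2):
    """Express the local segmentation plane y_local = y0 as world-coordinate
    plane parameters (a, b, c, d), i.e. a*x + b*y + c*z + d = 0. p1 and p2
    are the (x, y) world positions of the two utility poles; the local x
    axis runs from p1 to p2 and z is shared between both systems."""
    theta = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    c_, s_ = math.cos(theta), math.sin(theta)
    # y_local = -sin(t)*(x - x1) + cos(t)*(y - y1); setting y_local = y0 gives:
    a = -s_
    b = c_
    c = 0.0                                # the plane is vertical, so no z term
    d = s_ * p1[0] - c_ * p1[1] - y0
    return a, b, c, d
```

A world point lies on the plane exactly when its local y coordinate equals y0, so the parameters can be handed directly to the element display processing.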
- in step S408, the obtained segmentation planes 603 are used to separate the three-dimensional point cloud data into the respective elements included in the segmented areas 604a, 604b, and 604c, and the displaying of the three-dimensional point cloud and the selection processing of the point cloud of aerial lines are performed.
- FIG. 7 is an image of the element selection screen displayed on the display of the system according to the embodiment.
- the three-dimensional point cloud data illustrated in FIG. 2 is displayed on the display screen 750 .
- segmentation planes 603a, 603b, 603c, and 603d are illustrated on the screen, superimposed on the three-dimensional point cloud data.
- the segmented areas 604a, 604b, and 604c are areas partitioned by two adjacent segmentation planes 603.
- a manipulation button region 751 is displayed on the screen.
- selection buttons "element 1", "element 2", and "element 3" are arranged to correspond to the segmented areas 604a, 604b, and 604c, and in conjunction with the designation of a button, the point cloud (element) included in the corresponding segmented area is displayed as a point cloud that is selectable with a mouse or the like.
- the point cloud of the selected element is displayed in a color different from those of the other point clouds.
- only the point cloud of the selected element is displayed, and the other point clouds are not displayed.
- only the point cloud of the selected element is made selectable with the mouse or the like, and the other point clouds are made unselectable.
- the missing portions 206 and 207 of the point cloud are supplemented.
- the point designating method and the supplementation processing are not particularly limited, and various known supplementation methods may be applied. For example, three points, namely the start point, the end point, and a waypoint of the aerial line, are selected, and a point cloud obtained by curve approximation of the three points (for example, suspension (catenary) curve approximation) is output to supplement the missing portion.
- the ends of the aerial lines at both ends of the missing portion may be designated, and the space between the ends may be supplemented by a straight line or a curved line.
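As one possible realization of the three-point supplementation described above, a parabola through the selected start point, waypoint, and end point approximates a suspension (catenary) curve well for small sag. The sketch below assumes the points are in local coordinates with x increasing along the line; `supplement_points` and its parameters are hypothetical names, not from the patent.

```python
import numpy as np

def supplement_points(p_start, p_way, p_end, n=20):
    """Generate supplement points for a missing portion of an aerial line
    from three user-selected (x, y, z) points, using a parabolic
    approximation of the suspension curve (accurate for small sag)."""
    pts = np.array([p_start, p_way, p_end], dtype=float)
    x, y, z = pts[:, 0], pts[:, 1], pts[:, 2]
    coef = np.polyfit(x, z, 2)       # exact quadratic z(x) through the 3 points
    xs = np.linspace(x[0], x[2], n)  # sample positions along the span
    ys = np.interp(xs, x, y)         # y varies (nearly) linearly along the span
    zs = np.polyval(coef, xs)
    return np.column_stack([xs, ys, zs])
```

The returned points can be merged into the point cloud to fill the missing portions 206 and 207.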
- FIG. 8 is a system configuration diagram according to the embodiment.
- an aerial line extraction system 800 is configured with a normal information processing apparatus (for example, a server) including a processing device, a storage device, an input device, and an output device.
- the processing device 801 includes an area-of-interest cropping unit 802 , an element segmenting unit 803 , an element display unit 804 , and an aerial line supplementation unit 805 .
- These configurations are implemented by the processing device executing software stored in the storage device.
- the storage device further stores a three-dimensional point cloud file 806 , an area definition file 807 , and a result file 808 as data.
- a keyboard/mouse 809 and a display 810 which are general input and output devices are provided.
- the aerial line extraction system 800 may have, instead of or in addition to the above, various configurations known for an information processing apparatus.
- the configuration of the aerial line extraction system 800 will be described in relation to the processing described in FIG. 4 .
- the three-dimensional point cloud data input in step S 401 is stored in the storage device as a three-dimensional point cloud file 806 .
- the three-dimensional point cloud data includes point cloud data of roads, utility poles, electric lines, and the like. Data inputting can be performed from a known data input port.
- Data defining the size of each area are stored in the area definition file 807 .
- the data include, for example, cropping information of the area of interest (the sizes of the portions above and below the area of interest, and the distance between the utility pole and the area of interest) and the thickness T of the slices of the segmented area. These are determined and stored in advance by the user. A plurality of types of data may be stored so that the user can select the type of data at the time of use.
- the display 810 is used for displaying an image and displays the image as illustrated in FIG. 7 .
- the keyboard/mouse 809 is an example of an input device.
- the input device designates a utility pole, a point, or the like in the area-of-interest cropping unit 802 .
- the input device performs a process of designating element buttons in the manipulation button region 751 on the element display unit 804 .
- the result file 808 stores the point cloud of the supplemented electric lines.
- the area-of-interest cropping unit 802 operates.
- the area-of-interest cropping unit 802 displays an image as illustrated in FIG. 2 on the display 810 .
- the area-of-interest cropping unit 802 obtains coordinate information of the two utility poles.
- the area-of-interest cropping unit 802 obtains the coordinates defining the area of interest 501 from the utility pole coordinates by referring to the area definition file 807 .
- the area-of-interest cropping unit 802 extracts the point cloud in the area of interest 501 from the three-dimensional point cloud data stored in the three-dimensional point cloud file 806 .
- the area-of-interest cropping unit 802 converts the extracted three-dimensional point cloud data to a local coordinate system, if necessary.
- the extracted three-dimensional point cloud data is transmitted to the element segmenting unit 803 .
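The conversion to the local coordinate system performed by the area-of-interest cropping unit amounts to a translation and a rotation about the z axis so that x runs along the line between the two poles. A minimal sketch with hypothetical names:

```python
import math

def to_local(points, p1, p2):
    """Convert world (x, y, z) points to the local coordinate system of the
    embodiment: x along the line from pole p1 to pole p2, z kept as the
    gravity (height) axis, and y perpendicular to both."""
    theta = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    c, s = math.cos(theta), math.sin(theta)
    out = []
    for x, y, z in points:
        tx, ty = x - p1[0], y - p1[1]  # translate pole 1 to the origin
        out.append((c * tx + s * ty,   # local x: along the pole line
                    -s * tx + c * ty,  # local y: across the line
                    z))                # local z: unchanged (gravity direction)
    return out
```

The inverse conversion (rotate by −theta, then translate back) restores world coordinates for steps such as S407.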
- in step S404, the element segmenting unit 803 slices the area of interest 501 at regular intervals to generate subdivided areas (slices). After that, in step S405, the element segmenting unit 803 counts the number of point clouds for each slice and creates the histogram described in FIG. 6.
- the element segmenting unit 803 determines the segmentation planes 603 at the valley portions of the histogram in step S406, converts the segmentation planes 603 to the world coordinates as necessary in step S407, and transmits the segmentation plane parameters (a, b, c, d) to the element display unit 804.
- the element display unit 804 displays the three-dimensional point cloud data on the display 810 as illustrated in FIG. 7 .
- the element display unit 804 creates a list of the segmented areas 604 interposed between two adjacent segmentation planes 603 and reflects the list on the screen in association with the element of an element selection button.
- the element display unit 804 displays the three-dimensional point clouds included in the respective segmented areas as element 1, element 2, and element 3 on the display 810 as point clouds that can be selected with a mouse or the like.
- the aerial line supplementation unit 805 performs calculation for supplementing the aerial line on the basis of the point selected by the user in step S 408 .
- a well-known method can be employed for the supplementation.
- the supplemented three-dimensional point cloud data obtained as a result is output as the result file 808 to the storage device or the outside of the system.
- the aerial line extraction system 800 may be configured as a single server, or any portion of the input device, the output device, the processing device, and the storage device may be implemented on a separate server connected via a network.
- functions implemented in software can equivalently be realized by hardware such as a Field Programmable Gate Array (FPGA) or an Application Specific Integrated Circuit (ASIC).
- FIG. 9 is a diagram describing the concept of processing S 406 by the element segmenting unit 803 .
- after extracting the histogram 601 , the element segmenting unit 803 searches the values of the histogram 601 from the smaller side to the larger side of the y-axis.
- a location where the histogram value exceeds the threshold value 901 is defined as a rising edge 902 , and a location where the histogram value falls below the threshold value is defined as a falling edge 903 .
- the element segmenting unit 803 detects a rising edge 902 that follows a falling edge 903 and sets the midpoint between them as a segmentation plane 603 . In addition, the first rising edge 902 a and the last falling edge 903 c are set as segmentation planes 603 .
- the histogram may instead be searched from the larger side of the y-axis to the smaller side; in this case, the correspondence between the rising edge 902 and the falling edge 903 is reversed.
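The edge search of processing S 406 can be sketched as follows (an illustrative NumPy sketch over the slice counts, with hypothetical names and simplified threshold handling). It returns segmentation positions at the midpoints of the valleys, plus the first rising edge and the last falling edge:

```python
import numpy as np

def segmentation_positions(counts, centers, threshold):
    """Find segmentation positions from a slice-count histogram (S 406).

    counts:  number of points per slice
    centers: coordinate (e.g. y value) of each slice
    A rising edge is the first slice whose count exceeds the threshold;
    a falling edge is the last slice above it before the count drops.
    """
    counts = np.asarray(counts)
    above = counts > threshold
    rising = np.where(~above[:-1] & above[1:])[0] + 1
    falling = np.where(above[:-1] & ~above[1:])[0]
    positions = []
    # Midpoint of each valley: a falling edge followed by a rising edge.
    for f in falling:
        later = rising[rising > f]
        if later.size:
            positions.append((centers[f] + centers[later[0]]) / 2.0)
    # The first rising edge and the last falling edge also become planes.
    if rising.size:
        positions.insert(0, centers[rising[0]])
    if falling.size:
        positions.append(centers[falling[-1]])
    return positions
```
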
- two threshold values, a large threshold value and a small threshold value, may be set as the threshold value 901 . By determining a rising edge when the histogram value crosses the small threshold value and then the large threshold value in that order, and by determining a falling edge when it crosses the large threshold value and then the small threshold value in that order, the peaks and valleys can be determined more accurately.
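The double-threshold idea is essentially hysteresis (a Schmitt trigger over the histogram values). A minimal sketch with hypothetical names, assuming the search proceeds from the smaller to the larger side of the axis:

```python
def hysteresis_edges(counts, low, high):
    """Detect rising and falling edges with two thresholds (hysteresis).

    A rising edge is registered only when the value crosses the small
    threshold and then the large threshold in that order; a falling
    edge only when it crosses the large and then the small threshold.
    Fluctuations that stay between the two thresholds are ignored.
    """
    edges = []
    inside_peak = counts[0] >= high
    for i, value in enumerate(counts):
        if not inside_peak and value >= high:
            inside_peak = True
            edges.append(("rising", i))
        elif inside_peak and value <= low:
            inside_peak = False
            edges.append(("falling", i))
    return edges
```
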
- the lead-in line is an electric line connecting a utility pole to a consumer and usually denotes a line from the utility pole to a lead-in-line attachment point attached to the eaves of a house or the like.
- most of the system configuration and the processing flow may be configured similarly to the first embodiment; only the portions different from the first embodiment will be described below.
- FIG. 10 is a display example of three-dimensional point cloud data including a lead-in line.
- the lead-in line 1001 from a utility pole 204 b is connected to the lead-in-line attachment point of a consumer 1002 .
- point cloud data of trees 1003 is also included.
- this case can be handled by changing the area-of-interest cropping processing S 402 and the area-of-interest slicing processing S 404 in the processing of the first embodiment illustrated in the flow of FIG. 4 .
- FIG. 11 is a conceptual diagram in which the three-dimensional point cloud including the lead-in line in FIG. 10 is viewed from above.
- Lead-in lines 1001 a and 1001 b are led from the utility pole 204 b to consumers 1002 a and 1002 b .
- a cylinder centered on the utility pole 204 b is defined as the area of interest.
- the numerical values of the radius r and the height H of the cylinder are stored as the area definition file 807 .
- the subsequent processing may be performed after converting to a local coordinate system in which the center of the utility pole 204 b is the origin.
- the cylinder is segmented into a plurality of subdivided areas by a plurality of subdivision planes that are perpendicular to the ground and segment the cylinder at equal angles in the circumferential direction; for example, the cylinder may be segmented into 360 subdivided areas at one-degree intervals.
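The angular subdivision of the cylinder can be sketched by binning each point's azimuth angle around the pole center (an illustrative sketch with hypothetical names, assuming the pole axis is vertical):

```python
import numpy as np

def angular_histogram(points, center, bins=360):
    """Count points per angular sector of a cylindrical area of interest.

    Each sector is bounded by two vertical subdivision planes through
    the pole center; with bins=360 the sectors are one degree wide.
    """
    points = np.asarray(points, dtype=float)
    dx = points[:, 0] - center[0]
    dy = points[:, 1] - center[1]
    theta = np.degrees(np.arctan2(dy, dx)) % 360.0  # azimuth in [0, 360)
    counts, edges = np.histogram(theta, bins=bins, range=(0.0, 360.0))
    return counts, edges
```
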
- the counting processing step S 405 of the point cloud in each subdivided area may be basically similar to that of the first embodiment.
- FIG. 12 shows a histogram obtained from the three-dimensional point cloud data of FIGS. 10 and 11 .
- in the first embodiment, the horizontal axis is the y-axis, but in the third embodiment, the horizontal axis is defined as the angle θ.
- the portions in which the point cloud 205 of aerial lines, the lead-in line 1001 , or the trees 1003 exist appear as peaks in the histogram 601 . Therefore, similarly to the first embodiment, the segmentation planes 603 can be extracted.
- the cylinder is segmented into four segmented areas 604 .
- a segmented area 604 d includes the lead-in line 1001 a and the point cloud (element A) of the consumer 1002 a .
- a segmented area 604 e includes the lead-in line 1001 b and the point cloud (element B) of the consumer 1002 b .
- a segmented area 604 f includes a point cloud (element C) of the trees 1003 .
- a segmented area 604 g includes the point cloud 205 of aerial lines and the point cloud (element D) of a utility pole 204 a.
- FIG. 13 is a display example of the point cloud selection processing step S 408 in the flow of FIG. 4 .
- the element display unit 804 displays a display screen 1301 on the display 810 .
- Manipulation buttons 1302 are displayed on the display screen, making it possible to selectively display the point cloud (element) included in each of the segmented areas. For example, when the button of the element A is designated with the manipulation buttons 1302 , a screen 1303 A displaying the element A is displayed. Similarly, by designating the buttons of the element B, the element C, and the element D, screens 1303 B, 1303 C, and 1303 D displaying the respective elements are displayed. Therefore, the user can easily designate a desired point cloud.
- aerial lines such as electric lines often overlap with each other when viewed from the side, so the target point cannot be directly selected; in many such situations, the techniques of the embodiments are required.
- the present invention relates to a method of processing three-dimensional point cloud data and is particularly applicable to industries that extract aerial lines from three-dimensional point cloud data.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- General Engineering & Computer Science (AREA)
- Software Systems (AREA)
- Electromagnetism (AREA)
- Processing Or Creating Images (AREA)
- Image Analysis (AREA)
- Electric Cable Installation (AREA)
- Optical Radar Systems And Details Thereof (AREA)
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JPJP2018-158445 | 2018-08-27 | ||
JP2018158445A JP7039420B2 (ja) | 2018-08-27 | 2018-08-27 | Aerial line extraction system and method |
JP2018-158445 | 2018-08-27 | ||
PCT/JP2019/001046 WO2020044589A1 (ja) | 2018-08-27 | 2019-01-16 | Aerial line extraction system and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210142074A1 US20210142074A1 (en) | 2021-05-13 |
US11495026B2 true US11495026B2 (en) | 2022-11-08 |
Family
ID=69645132
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/047,416 Active 2039-06-04 US11495026B2 (en) | 2018-08-27 | 2019-01-16 | Aerial line extraction system and method |
Country Status (4)
Country | Link |
---|---|
US (1) | US11495026B2 (ja) |
JP (1) | JP7039420B2 (ja) |
KR (1) | KR102366062B1 (ja) |
WO (1) | WO2020044589A1 (ja) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7407648B2 (ja) * | 2020-04-15 | 2024-01-04 | 株式会社日立製作所 | Object recognition device and object recognition method |
CN111652957B (zh) * | 2020-06-03 | 2023-04-11 | 中铁二院工程集团有限责任公司 | Method for generating linear engineering design drawings |
CN111796298B (zh) * | 2020-07-06 | 2023-01-10 | 贵州电网有限责任公司 | Automatic point supplementation method for laser LiDAR power line point clouds |
US11521357B1 * | 2020-11-03 | 2022-12-06 | Bentley Systems, Incorporated | Aerial cable detection and 3D modeling from images |
WO2022176160A1 (ja) * | 2021-02-19 | 2022-08-25 | 日本電信電話株式会社 | Overhead cable identification method and overhead cable identification system |
JPWO2022176159A1 (ja) * | 2021-02-19 | 2022-08-25 | ||
WO2023135717A1 (ja) * | 2022-01-14 | 2023-07-20 | 日本電信電話株式会社 | Device, method, and program for creating a three-dimensional model |
WO2023135718A1 (ja) * | 2022-01-14 | 2023-07-20 | 日本電信電話株式会社 | Device, method, and program for creating a three-dimensional model |
CN117392270B (zh) * | 2023-12-12 | 2024-03-15 | 长沙能川信息科技有限公司 | Wire fitting method, system, and computer device based on laser point clouds |
Citations (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050007450A1 (en) | 2002-12-13 | 2005-01-13 | Duane Hill | Vehicle mounted system and method for capturing and processing physical data |
JP2009068951A (ja) | 2007-09-12 | 2009-04-02 | Mitsubishi Electric Corp | Overhead wire management system |
JP2010218362A (ja) | 2009-03-18 | 2010-09-30 | Geo Technical Laboratory Co Ltd | Map image processing device, map image processing method, and computer program |
US8244026B2 (en) * | 2008-01-09 | 2012-08-14 | Tiltan Systems Engineering Ltd. | Apparatus and method for automatic airborne LiDAR data processing and mapping using data obtained thereby |
US20130300740A1 (en) * | 2010-09-13 | 2013-11-14 | Alt Software (Us) Llc | System and Method for Displaying Data Having Spatial Coordinates |
US20140177928A1 (en) | 2011-05-16 | 2014-06-26 | Ergon Energy Corporation Limited | Method and system for processing image data |
US20140192050A1 (en) * | 2012-10-05 | 2014-07-10 | University Of Southern California | Three-dimensional point processing and model generation |
JP2015001901A (ja) | 2013-06-17 | 2015-01-05 | 日本電信電話株式会社 | Point cloud analysis processing device, point cloud analysis processing method, and program |
US9235763B2 (en) * | 2012-11-26 | 2016-01-12 | Trimble Navigation Limited | Integrated aerial photogrammetry surveys |
US9406138B1 (en) * | 2013-09-17 | 2016-08-02 | Bentley Systems, Incorporated | Semi-automatic polyline extraction from point cloud |
US9542738B2 (en) * | 2014-01-31 | 2017-01-10 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US20170248969A1 (en) * | 2016-02-29 | 2017-08-31 | Thinkware Corporation | Method and system for providing route of unmanned air vehicle |
US10008123B2 (en) * | 2015-10-20 | 2018-06-26 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US20180218214A1 (en) * | 2015-08-06 | 2018-08-02 | Accenture Global Services Limited | Condition detection using image processing |
US10380423B2 (en) * | 2017-10-31 | 2019-08-13 | Pilot Fiber Inc. | Utility infrastructure mapping |
US10539676B2 (en) * | 2017-03-22 | 2020-01-21 | Here Global B.V. | Method, apparatus and computer program product for mapping and modeling a three dimensional structure |
US11029211B2 (en) * | 2015-12-09 | 2021-06-08 | Flir Systems, Inc. | Unmanned aerial system based thermal imaging systems and methods |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11402509B2 (en) * | 2017-09-04 | 2022-08-02 | Commonwealth Scientific And Industrial Research Organisation | Method and system for use in performing localisation |
-
2018
- 2018-08-27 JP JP2018158445A patent/JP7039420B2/ja active Active
-
2019
- 2019-01-16 WO PCT/JP2019/001046 patent/WO2020044589A1/ja active Application Filing
- 2019-01-16 US US17/047,416 patent/US11495026B2/en active Active
- 2019-01-16 KR KR1020207028756A patent/KR102366062B1/ko active IP Right Grant
Patent Citations (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20050007450A1 (en) | 2002-12-13 | 2005-01-13 | Duane Hill | Vehicle mounted system and method for capturing and processing physical data |
WO2005017550A2 (en) | 2002-12-13 | 2005-02-24 | Utah State University Research Foundation | A vehicle mounted system and method for capturing and processing physical data |
JP2009068951A (ja) | 2007-09-12 | 2009-04-02 | Mitsubishi Electric Corp | Overhead wire management system |
US8244026B2 (en) * | 2008-01-09 | 2012-08-14 | Tiltan Systems Engineering Ltd. | Apparatus and method for automatic airborne LiDAR data processing and mapping using data obtained thereby |
JP2010218362A (ja) | 2009-03-18 | 2010-09-30 | Geo Technical Laboratory Co Ltd | Map image processing device, map image processing method, and computer program |
US20130300740A1 (en) * | 2010-09-13 | 2013-11-14 | Alt Software (Us) Llc | System and Method for Displaying Data Having Spatial Coordinates |
JP2014520307A (ja) | 2011-05-16 | 2014-08-21 | Ergon Energy Corporation Limited | Method and system for processing image data |
US20140177928A1 (en) | 2011-05-16 | 2014-06-26 | Ergon Energy Corporation Limited | Method and system for processing image data |
US20140192050A1 (en) * | 2012-10-05 | 2014-07-10 | University Of Southern California | Three-dimensional point processing and model generation |
US9235763B2 (en) * | 2012-11-26 | 2016-01-12 | Trimble Navigation Limited | Integrated aerial photogrammetry surveys |
JP2015001901A (ja) | 2013-06-17 | 2015-01-05 | 日本電信電話株式会社 | Point cloud analysis processing device, point cloud analysis processing method, and program |
US9406138B1 (en) * | 2013-09-17 | 2016-08-02 | Bentley Systems, Incorporated | Semi-automatic polyline extraction from point cloud |
US9542738B2 (en) * | 2014-01-31 | 2017-01-10 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
US20180218214A1 (en) * | 2015-08-06 | 2018-08-02 | Accenture Global Services Limited | Condition detection using image processing |
US10008123B2 (en) * | 2015-10-20 | 2018-06-26 | Skycatch, Inc. | Generating a mission plan for capturing aerial images with an unmanned aerial vehicle |
US11029211B2 (en) * | 2015-12-09 | 2021-06-08 | Flir Systems, Inc. | Unmanned aerial system based thermal imaging systems and methods |
US20170248969A1 (en) * | 2016-02-29 | 2017-08-31 | Thinkware Corporation | Method and system for providing route of unmanned air vehicle |
US10539676B2 (en) * | 2017-03-22 | 2020-01-21 | Here Global B.V. | Method, apparatus and computer program product for mapping and modeling a three dimensional structure |
US10380423B2 (en) * | 2017-10-31 | 2019-08-13 | Pilot Fiber Inc. | Utility infrastructure mapping |
Non-Patent Citations (1)
Title |
---|
Korean Office Action dated Jul. 19, 2021 for Korean Patent Application No. 10-2020-7028756. |
Also Published As
Publication number | Publication date |
---|---|
US20210142074A1 (en) | 2021-05-13 |
JP7039420B2 (ja) | 2022-03-22 |
WO2020044589A1 (ja) | 2020-03-05 |
KR20200128133A (ko) | 2020-11-11 |
JP2020035001A (ja) | 2020-03-05 |
KR102366062B1 (ko) | 2022-02-23 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: HITACHI SOLUTIONS, LTD., JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKANO, SADAKI;KIMURA, NOBUTAKA;MARUYAMA, KISHIKO;AND OTHERS;SIGNING DATES FROM 20200907 TO 20200929;REEL/FRAME:054048/0175 |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION DISPATCHED FROM PREEXAM, NOT YET DOCKETED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |