WO2018042551A1 - 不要物除去システム、不要物除去方法及びプログラム - Google Patents
Unnecessary object removal system, unnecessary object removal method, and program
- Publication number
- WO2018042551A1 (PCT/JP2016/075485)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target area
- ground surface
- unnecessary
- point cloud
- altitude
- Prior art date
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C7/00—Tracing profiles
- G01C7/02—Tracing profiles of land surfaces
- G01C7/04—Tracing profiles of land surfaces involving a vehicle which moves along the profile to be traced
Definitions
- the present invention relates to an unnecessary object removal system, an unnecessary object removal method, and a program for removing unnecessary objects on the ground surface from data in a target area.
- LIDAR (Light Detection and Ranging)
- in a known configuration, a target area is irradiated with a laser, the reflection points of the laser are acquired as three-dimensional coordinates, a number of grids are derived on the horizontal plane of the target area,
- and a topographic model of the target area is created by extracting the coordinate with the lowest elevation value from the vertical coordinate distribution of the point cloud data within each grid (see Patent Document 1).
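- As a rough illustration, the prior-art grid-based extraction can be sketched as follows. The cell size and the dictionary output are assumptions chosen for illustration; Patent Document 1 does not prescribe a data layout.

```python
def grid_min_dem(points, cell=5.0):
    """Derive a crude terrain model by keeping the lowest elevation per grid cell.

    `points` is an iterable of (x, y, z) laser reflection points and `cell`
    a hypothetical horizontal grid spacing. Returns a dict mapping each
    occupied grid index (ix, iy) to the minimum z found in that cell.
    """
    dem = {}
    for x, y, z in points:
        key = (int(x // cell), int(y // cell))
        # keep only the lowest elevation value seen in this grid cell
        if key not in dem or z < dem[key]:
            dem[key] = z
    return dem
```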
- with Patent Document 1, although a terrain model of the target area can be created, the user himself or herself must correct for unnecessary objects such as trees when extracting only the ground surface of the target area.
- an object of the present invention is to provide an unnecessary object removal system, an unnecessary object removal method, and a program that allow a user to extract the ground surface without performing a correction operation, by removing unnecessary objects from the acquired point cloud data.
- the present invention provides the following solutions.
- the invention according to the first feature is an unnecessary object removal system that removes unnecessary objects on the ground surface from data of a target area
- Point cloud data acquisition means for acquiring point cloud data of the target area
- Image analysis means for image analysis of image data obtained by photographing the target area
- Elevation information acquisition means for acquiring elevation information of the target area based on position information of the target area
- Ground surface altitude grasping means for grasping the altitude of the ground surface based on the acquired elevation information
- Unnecessary object determining means for determining the unnecessary object based on the result of the image analysis and a portion different from the altitude of the ground surface based on the elevation information
- An unnecessary object removing means for removing the determined unnecessary object from the point cloud data
- An unnecessary object removing system is provided.
- according to the invention of the first feature, the unnecessary object removal system for removing unnecessary objects on the ground surface from data of a target area acquires point cloud data of the target area, performs image analysis on image data obtained by photographing the target area, acquires elevation information of the target area based on position information of the target area, grasps the altitude of the ground surface based on the acquired elevation information, determines the unnecessary object based on the result of the image analysis and on a portion differing from the altitude of the ground surface based on the elevation information, and removes the determined unnecessary object from the point cloud data.
- the invention according to the first feature falls into the category of an unnecessary object removal system, but the same actions and effects are exhibited in other categories, such as a method or a program.
- the invention according to a second feature includes a three-dimensional coordinate grasping means for grasping the three-dimensional coordinates of the target area based on the point cloud data;
- the ground surface altitude grasping means grasps the altitude by treating the lowest of the height coordinates of the three-dimensional coordinates as the ground surface, providing the unnecessary object removal system according to the first feature.
- according to the invention of the second feature, the unnecessary object removal system grasps the three-dimensional coordinates of the target area based on the point cloud data, and grasps the altitude by treating the lowest of the height coordinates as the ground surface.
- the invention according to a third feature includes three-dimensional coordinate grasping means for grasping the three-dimensional coordinates of the target area based on the point cloud data, and the ground surface altitude grasping means grasps the altitude by treating the height coordinates of the most populated range, obtained when the height coordinates of the three-dimensional coordinates are divided into predetermined ranges, as the ground surface.
- according to the invention of the third feature, the unnecessary object removal system grasps the three-dimensional coordinates of the target area based on the point cloud data and, when the height coordinates are divided into predetermined ranges, grasps the altitude by treating the height coordinates of the most populated range as the ground surface.
- the invention according to a fourth feature is an unnecessary object removal method for removing unnecessary objects on the ground surface from data of a target area, comprising the steps of: acquiring point cloud data of the target area; performing image analysis on image data obtained by photographing the target area; acquiring elevation information of the target area based on position information of the target area; grasping the altitude of the ground surface based on the acquired elevation information; determining the unnecessary object based on the result of the image analysis and a portion differing from the altitude of the ground surface based on the elevation information; and removing the determined unnecessary object from the point cloud data.
- the invention according to the fifth feature provides a program for causing an unnecessary object removal system for removing unnecessary objects on the ground surface from data of a target area to execute the steps of: acquiring point cloud data of the target area; performing image analysis on image data obtained by photographing the target area; acquiring elevation information of the target area based on position information of the target area; grasping the altitude of the ground surface based on the acquired elevation information; determining the unnecessary object based on the result of the image analysis and a portion differing from the altitude of the ground surface based on the elevation information; and removing the determined unnecessary object from the point cloud data.
- according to the present invention, it is possible to provide an unnecessary object removal system, an unnecessary object removal method, and a program that allow a user to extract the ground surface without performing a correction operation.
- FIG. 1 is a diagram showing an outline of the unnecessary object removal system 1.
- FIG. 2 is an overall configuration diagram of the unnecessary object removal system 1.
- FIG. 3 is a functional block diagram of the computer 10 and the flying object 100.
- FIG. 4 is a flowchart showing an unnecessary object removal process executed by the computer 10 and the flying object 100.
- FIG. 5 is a diagram schematically illustrating an example of image data in which the computer 10 recognizes the ground surface and an object.
- FIG. 6 is a diagram illustrating an example of image data obtained by removing unnecessary objects generated by the computer 10.
- FIG. 1 is a diagram for explaining an outline of an unnecessary object removal system 1 which is a preferred embodiment of the present invention.
- the unnecessary object removal system 1 includes a computer 10 and a flying object 100.
- the number of flying objects 100 is not limited to one and may be plural.
- the computer 10 is not limited to a real device and may be a virtual device.
- each process described later may be implemented by either or both of the computer 10 and the flying object 100.
- the computer 10 is a computer device capable of data communication with the flying object 100.
- the flying object 100 is a flying object such as a drone or an aircraft capable of data communication with the computer 10.
- the flying object 100 has a LIDAR and an imaging device such as a camera.
- the flying object 100 flies over the target area and acquires area data consisting of point cloud data, image data, and position information (step S01).
- the flying object 100 acquires the ground surface of the target area and the point cloud data of the object existing on the ground surface by LIDAR.
- the flying object 100 captures an image of the target area and acquires image data.
- the flying object 100 acquires position information of the target area from the GPS or the like.
- the flying object 100 transmits the area data to the computer 10 (step S02).
- the computer 10 receives the area data.
- the computer 10 analyzes the image data and recognizes the object and the ground (step S03).
- the computer 10 recognizes the object and the ground surface based on the RGB values of the image data. For example, based on the RGB value of the ground surface, the computer 10 recognizes a region having a different RGB value as an object.
- the computer 10 classifies the object and the ground surface by drawing a contour line at the boundary between the identified object and the ground surface. This contour line indicates the contour of the object.
- the computer 10 acquires the altitude information of the target area based on the position information (step S04).
- the computer 10 acquires elevation information corresponding to the position information of the target area from, for example, various databases.
- the computer 10 grasps the altitude of the ground surface of the target area based on the acquired point cloud data (step S05). For example, the computer 10 grasps the three-dimensional coordinates of the target area based on the point cloud data and treats the lowest of their height coordinates as the altitude of the ground surface. Alternatively, when the height coordinates of the grasped three-dimensional coordinates are divided into predetermined ranges, the computer 10 treats the height coordinates of the most populated range as the altitude of the ground surface.
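- The two ground-altitude heuristics just described can be sketched as follows; the (N, 3) array layout and the histogram bin size are assumptions for illustration, not the patent's specification.

```python
import numpy as np

def ground_elevation_min(points):
    """Treat the lowest height coordinate as the ground altitude.

    `points` is an (N, 3) array of X, Y, Z coordinates (an assumed layout).
    Effective when the target area has no holes or pits.
    """
    return float(np.min(points[:, 2]))

def ground_elevation_mode(points, bin_size=1.0):
    """Treat the most populated height range as the ground altitude.

    Divides the Z coordinates into bins of `bin_size` (a hypothetical
    spacing) and returns the centre of the bin containing the most points.
    Effective when the ground surface occupies the largest area of the scene.
    """
    z = points[:, 2]
    edges = np.arange(z.min(), z.max() + bin_size, bin_size)
    counts, _ = np.histogram(z, bins=edges)
    i = int(np.argmax(counts))
    return float((edges[i] + edges[i + 1]) / 2)
```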
- the computer 10 determines an unnecessary object such as an object existing on the ground surface based on the altitude of the ground surface grasped from the point cloud data and the altitude information acquired from the position information (step S06). For example, the computer 10 compares the altitude of the ground surface with the height coordinates of the three-dimensional coordinates of the point cloud data, and extracts a portion having an error in height. Then, the computer 10 determines whether or not this error matches the contour of the object identified by the image data.
- the computer 10 removes the determined unnecessary object from the point cloud data (step S07).
- the computer 10 removes the point cloud data corresponding to the object from the acquired point cloud data, and generates point cloud data from which the unnecessary object has been removed by smoothly connecting the heights along the removed contour line with a plane.
- FIG. 2 is a diagram showing a system configuration of an unnecessary object removal system 1 which is a preferred embodiment of the present invention.
- the unnecessary object removal system 1 includes a computer 10, a flying object 100, and a public line network (Internet network, third and fourth generation communication network, etc.) 5.
- the number of flying objects 100 is not limited to one and may be plural.
- the computer 10 is not limited to a real device and may be a virtual device.
- each process described later may be implemented by either or both of the computer 10 and the flying object 100.
- the computer 10 is the above-described computer device having the functions described below.
- the flying object 100 is the above-described flying object having the functions described later.
- FIG. 3 is a functional block diagram of the computer 10 and the flying object 100.
- the computer 10 includes, as the control unit 11, a CPU (Central Processing Unit), a RAM (Random Access Memory), a ROM (Read Only Memory), and the like, and, as the communication unit 12, a device for enabling communication with other devices,
- for example a WiFi (Wireless Fidelity) compatible device compliant with IEEE 802.11.
- the computer 10 includes various data analysis devices, various data processing devices, and the like as the processing unit 13.
- in the computer 10, the control unit 11 reads a predetermined program, thereby realizing the data reception module 20 and the altitude information acquisition module 21 in cooperation with the communication unit 12. Further, in the computer 10, the control unit 11 reads a predetermined program, thereby realizing, in cooperation with the processing unit 13, the image analysis module 30, the altitude grasping module 31, the unnecessary object determination module 32, the unnecessary object removal module 33, the point cloud data generation module 34, and the image generation module 35.
- the flying object 100 includes a CPU, RAM, ROM, and the like as the control unit 110, and a WiFi compatible device for enabling communication with other devices as the communication unit 120.
- the flying object 100 includes a flying device such as a propeller and a motor as the flying unit 130.
- the flying object 100 includes, as the acquisition unit 140, devices such as an imaging device (an image sensor, a lens, and the like), a LIDAR device that acquires point cloud data (a laser, a scanner, an optical system, a light receiver, electronics, and the like), and a GPS or inertial guidance device.
- the data transmission module 150 is realized in cooperation with the communication unit 120 by the control unit 110 reading a predetermined program.
- the control unit 110 reads a predetermined program, thereby realizing the flight module 160 in cooperation with the flying unit 130.
- the point cloud data acquisition module 170, the imaging module 171, and the position information acquisition module 172 are realized in cooperation with the acquisition unit 140 by the control unit 110 reading a predetermined program.
- FIG. 4 is a diagram illustrating a flowchart of the unnecessary object removal process executed by the computer 10 and the flying object 100. The processing executed by the modules of each device described above will be described together with this processing.
- first, the flight module 160 flies over the target area (step S10).
- in step S10, the flight module 160 flies along a flight path based on a preset program, or based on a flight instruction from a control terminal or information terminal (not shown).
- the point cloud data acquisition module 170 acquires point cloud data of the target area during flight (step S11).
- the point cloud data acquisition module 170 acquires point cloud data by LIDAR.
- This point cloud data is three-dimensional data.
- the point cloud data acquisition module 170 acquires point cloud data over the entire target area. That is, the point cloud data acquisition module 170 acquires the ground surface of the target area and the point cloud data of an object existing on the ground surface.
- this point cloud data is expressed in data formats such as latitude, longitude, and altitude obtained from GPS.
- the point cloud data may be data other than the data format described above, or may be any one or a combination of a plurality of data formats described above.
- the imaging module 171 captures an image of the target area during the flight (step S12).
- the imaging module 171 captures the entire view of the target area, or divides the target area into predetermined sections and captures an image of each section.
- the flying object 100 acquires an image of the target area by taking an image.
- the location information acquisition module 172 acquires its location information from GPS or the like (step S13).
- in step S13, the position information acquisition module 172 acquires position information at the point where the imaging module 171 captured the image.
- the position information acquisition module 172 acquires the position information of the center of the captured entire scene when the imaging module 171 captures the entire scene.
- the position information acquisition module 172 acquires position information of the center of each section when the shooting module 171 captures each section.
- the position information is acquired as, for example, latitude and longitude. Note that the position information acquisition module 172 may acquire position information by a method other than that described above.
- the position information may be in a format other than latitude and longitude.
- the order of steps S11 to S13 described above may be different. For example, after executing the process of step S12, the process of step S11 or step S13 may be executed, or after executing the process of step S13, the process of step S11 or step S12 may be executed.
- the data transmission module 150 transmits the area data, namely the point cloud data, the image data, and the position information acquired by the processing in steps S11 to S13, to the computer 10 (step S14).
- the data receiving module 20 receives area data.
- the image analysis module 30 performs image analysis on the image data (step S15).
- the image analysis module 30 extracts RGB values of the image data. Note that the image analysis module 30 may extract the feature amount of the image data.
- the image analysis module 30 recognizes the ground surface and objects present in the image data based on the result of the image analysis (step S16). In step S16, the image analysis module 30 recognizes the ground surface and the objects based on differences in color: it recognizes the ground surface based on preset or generally typical RGB values of the ground surface, and recognizes an area having an RGB value different from that of the ground surface as an object.
- when the image analysis module 30 recognizes an RGB value similar to that of the ground surface, it determines whether a boundary exists between the region of the similar RGB value and the region of the ground-surface RGB value. If a boundary exists, it determines the similar-RGB-value region to be an object; if no boundary exists, it determines that region to be the ground surface.
- when feature amounts are extracted, the image analysis module 30 may recognize the ground surface and objects based on a preset feature amount of the ground surface. At this time, the image analysis module 30 recognizes the ground surface based on the preset feature amount of the ground surface, and recognizes a region having a different feature amount as an object. The image analysis module 30 may also recognize an object based on a preset feature amount of the object, recognizing a region having such a feature amount as an object. Further, the image analysis module 30 may recognize a region where no feature amount exists as the ground surface.
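- The colour-based recognition above can be sketched as follows; the per-channel `tolerance` is a hypothetical threshold, since the text only speaks of RGB values that "differ" from the ground surface.

```python
import numpy as np

def object_mask(image, ground_rgb, tolerance=30):
    """Flag pixels whose colour differs from a preset ground-surface RGB.

    `image` is an (H, W, 3) uint8 array, `ground_rgb` the assumed RGB value
    of the ground surface, and `tolerance` a hypothetical per-channel
    threshold. Returns a boolean mask that is True where a pixel is
    treated as part of an object rather than the ground surface.
    """
    diff = np.abs(image.astype(int) - np.asarray(ground_rgb, dtype=int))
    # a pixel is an object pixel if any channel deviates beyond the tolerance
    return np.any(diff > tolerance, axis=2)
```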
- FIG. 5 is a diagram schematically illustrating an example of image data in which the image analysis module 30 recognizes the ground surface and an object.
- the image analysis module 30 recognizes the ground surface 300 and the object 310 based on the color of the image data as described above.
- the image analysis module 30 recognizes a region having an RGB value different from the RGB value of the ground surface 300 as a region of the object 310 and recognizes a boundary between the ground surface 300 and the object 310.
- the image analysis module 30 separates the ground surface 300 and the object 310 from each other by drawing an outline 320 at the boundary between the ground surface 300 and the object 310.
- This outline 320 is a rough shape of the object 310 and indicates the outline of the object 310.
- when part of the boundary is unclear, the image analysis module 30 may draw the contour line 320 by inferring the unclear portions from the portions where the boundary is clear.
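- The contour line along the object boundary can be illustrated with a crude mask-based trace: a pixel of the object mask is treated as a contour pixel when at least one 4-neighbour lies outside the object. This is a stand-in sketch, not the patent's drawing procedure.

```python
import numpy as np

def contour_pixels(mask):
    """Mark boundary pixels of a boolean object mask.

    `mask` is an (H, W) boolean array where True means "object". A True
    pixel is on the contour when any of its 4-neighbours is False or
    falls outside the image. Returns a boolean mask of contour pixels.
    """
    h, w = mask.shape
    padded = np.zeros((h + 2, w + 2), dtype=bool)
    padded[1:-1, 1:-1] = mask
    # True only where all four neighbours are also inside the object
    interior = (padded[:-2, 1:-1] & padded[2:, 1:-1] &
                padded[1:-1, :-2] & padded[1:-1, 2:])
    return mask & ~interior
```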
- the altitude information acquisition module 21 acquires the altitude information of the target area based on the position information (step S17).
- the altitude information acquisition module 21 acquires altitude information corresponding to the position information of the target area from, for example, various databases managed by public institutions, national institutions, private organizations, and the like. This is realized by acquiring the corresponding altitude information from these databases based on the acquired latitude, longitude, address, and the like.
- the altitude information acquisition module 21 may directly acquire its own altitude information, or may acquire altitude information by a method other than the example described above.
- the altitude grasping module 31 grasps the height coordinates of the target area based on the acquired point cloud data (step S18).
- the altitude grasping module 31 grasps the three-dimensional coordinates of the target area based on the point cloud data.
- the three-dimensional coordinates are orthogonal coordinates consisting of a horizontal X coordinate and Y coordinate, and a vertical Z coordinate.
- the altitude grasping module 31 grasps the altitude of the ground surface and the object based on the height coordinates. For example, the altitude grasping module 31 grasps the lowest coordinate among the three-dimensional coordinates of the point cloud data as the altitude of the ground surface. This is effective when there are no holes or the like on the ground surface.
- alternatively, when the height coordinates of the three-dimensional coordinates are divided into predetermined ranges, the altitude grasping module 31 grasps the height coordinates of the most populated range as the altitude of the ground surface.
- the predetermined range is, for example, 10 steps of height. This approach is effective when the ground surface occupies the largest area compared with the objects.
- the altitude grasping module 31 may also grasp the altitude of the ground surface by methods other than the examples described above.
- the unnecessary object determination module 32 extracts a portion having an error in height based on the altitude obtained from the point cloud data and the altitude information acquired from the position information (step S19). In step S19, the unnecessary object determination module 32 compares the altitude of the ground surface and the object grasped from the point cloud data with the altitude information acquired from the position information, and extracts a portion where an error exists.
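- The error-extraction step can be sketched as follows; the `threshold` tolerance is a hypothetical parameter, since the text only speaks of "a portion having an error in height".

```python
import numpy as np

def height_anomalies(points, ground_alt, threshold=0.5):
    """Extract points whose height departs from the grasped ground altitude.

    `points` is an (N, 3) array of X, Y, Z coordinates (an assumed layout),
    `ground_alt` the ground altitude grasped from the point cloud or the
    elevation information, and `threshold` a hypothetical tolerance.
    Returns the indices of points treated as candidate unnecessary objects.
    """
    return np.flatnonzero(points[:, 2] - ground_alt > threshold)
```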
- the unnecessary object determination module 32 determines whether or not the extracted error matches the contour drawn in the image data (step S20). In step S20, the unnecessary object determination module 32 determines whether or not the portion where the extracted error exists matches the contour line.
- if the unnecessary object determination module 32 determines in step S20 that they do not match (NO in step S20), it determines that the error is not due to an object and ends this process. This applies, for example, to ridges or irregularities on the ground surface.
- if the unnecessary object determination module 32 determines in step S20 that they match (YES in step S20), the unnecessary object removal module 33 removes the area surrounded by this contour line from the point cloud data as an unnecessary object (step S21).
- in step S21, the unnecessary object determination module 32 specifies the point cloud data corresponding to the area surrounded by the outline in the image data.
- the unnecessary object determination module 32 specifies the position of the point cloud data corresponding to this region based on, for example, the position information of the spot where the image was taken, the positional relationship in the image data, and the like.
- the unnecessary object determination module 32 removes the specified point cloud data.
- the point cloud data generation module 34 smoothly connects the heights along the removed contour line with a plane, generating point cloud data from which the unnecessary object has been removed (step S22).
- the point cloud data generation module 34 supplements the area of the removed point cloud data with the remaining point cloud data.
- the point cloud data generation module 34 smoothly connects the remaining point cloud data.
- the point cloud data generation module 34 may generate the unnecessary-object-removed point cloud data by applying part or all of the remaining point cloud data to the area of the removed point cloud data, or may generate it by other methods.
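- The gap-filling step can be sketched with inverse-distance weighting; the patent says only that the heights are connected "smoothly with a plane", so IDW is a substitute technique chosen for illustration.

```python
import numpy as np

def fill_removed_region(remaining, removed_xy, power=2.0):
    """Fill the removed area with heights interpolated from remaining points.

    `remaining` is an (N, 3) array of kept points and `removed_xy` an
    (M, 2) array of the XY positions whose Z must be regenerated. Each
    missing height is an inverse-distance-weighted average of the
    remaining heights, yielding a smooth fill over the removed region.
    """
    filled = []
    for x, y in removed_xy:
        d = np.hypot(remaining[:, 0] - x, remaining[:, 1] - y)
        w = 1.0 / np.maximum(d, 1e-9) ** power  # guard against zero distance
        z = float(np.sum(w * remaining[:, 2]) / np.sum(w))
        filled.append([x, y, z])
    return np.array(filled)
```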
- the image generation module 35 generates image data from which unnecessary objects have been removed based on the point cloud data from which unnecessary objects have been removed (step S23).
- in step S23, the image generation module 35 generates image data from which the unnecessary objects are removed by superimposing part or all of the ground surface portion of the image data onto the position of the image data corresponding to the area where the unnecessary objects were removed from the point cloud data.
- alternatively, the image generation module 35 may remove the data existing at the position of the image data corresponding to the area where the unnecessary objects were removed, and overwrite the removed positions with data corresponding to the ground surface.
- the image generation module 35 may generate image data from which unnecessary objects are removed by other methods.
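- A minimal sketch of the overwrite approach follows; it replaces the masked pixels with a single sampled ground colour, whereas the patent describes superimposing part of the actual ground imagery, which would copy a real patch instead.

```python
import numpy as np

def overwrite_with_ground(image, mask, ground_rgb):
    """Overwrite removed-object pixels with a ground-surface colour.

    `image` is an (H, W, 3) uint8 array, `mask` a boolean array flagging
    the removed-object pixels, and `ground_rgb` an assumed sampled ground
    colour. Returns a new image; the input image is left unchanged.
    """
    out = image.copy()
    out[mask] = ground_rgb  # broadcast the ground colour over masked pixels
    return out
```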
- FIG. 6 is a diagram illustrating an example of image data from which unnecessary objects generated by the image generation module 35 are removed.
- the image generation module 35 generates a ground surface 400.
- the ground surface 400 is obtained by removing the object 310 existing in FIG. 5 and superimposing a part of the ground surface 300 on the position of the object 310.
- the image generation module 35 removes the unnecessary object 310 from the image data shown in FIG. 5 and generates image data of only the ground surface 400.
- the means and functions described above are realized by a computer (including a CPU, an information processing apparatus, and various terminals) reading and executing a predetermined program.
- the program is provided in a form recorded on a computer-readable recording medium such as a flexible disk, CD (CD-ROM, etc.), DVD (DVD-ROM, DVD-RAM, etc.).
- the computer reads the program from the recording medium, transfers it to the internal storage device or the external storage device, stores it, and executes it.
- the program may be recorded in advance in a storage device (recording medium) such as a magnetic disk, an optical disk, or a magneto-optical disk, and provided from the storage device to a computer via a communication line.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Description
Claims (5)
- An unnecessary object removal system for removing unnecessary objects on the ground surface from data of a target area, comprising:
point cloud data acquisition means for acquiring point cloud data of the target area;
image analysis means for performing image analysis on image data obtained by photographing the target area;
elevation information acquisition means for acquiring elevation information of the target area based on position information of the target area;
ground surface altitude grasping means for grasping the altitude of the ground surface based on the acquired elevation information;
unnecessary object determination means for determining the unnecessary object based on the result of the image analysis and a portion differing from the altitude of the ground surface based on the elevation information; and
unnecessary object removal means for removing the determined unnecessary object from the point cloud data.
- The unnecessary object removal system according to claim 1, further comprising three-dimensional coordinate grasping means for grasping the three-dimensional coordinates of the target area based on the point cloud data,
wherein the ground surface altitude grasping means grasps the altitude by treating the lowest of the height coordinates of the three-dimensional coordinates as the ground surface.
- The unnecessary object removal system according to claim 1, further comprising three-dimensional coordinate grasping means for grasping the three-dimensional coordinates of the target area based on the point cloud data,
wherein the ground surface altitude grasping means grasps the altitude by treating the height coordinates of the most populated range, obtained when the height coordinates of the three-dimensional coordinates are divided into predetermined ranges, as the ground surface.
- An unnecessary object removal method for removing unnecessary objects on the ground surface from data of a target area, comprising the steps of:
acquiring point cloud data of the target area;
performing image analysis on image data obtained by photographing the target area;
acquiring elevation information of the target area based on position information of the target area;
grasping the altitude of the ground surface based on the acquired elevation information;
determining the unnecessary object based on the result of the image analysis and a portion differing from the altitude of the ground surface based on the elevation information; and
removing the determined unnecessary object from the point cloud data.
- A program for causing an unnecessary object removal system for removing unnecessary objects on the ground surface from data of a target area to execute the steps of:
acquiring point cloud data of the target area;
performing image analysis on image data obtained by photographing the target area;
acquiring elevation information of the target area based on position information of the target area;
grasping the altitude of the ground surface based on the acquired elevation information;
determining the unnecessary object based on the result of the image analysis and a portion differing from the altitude of the ground surface based on the elevation information; and
removing the determined unnecessary object from the point cloud data.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2018536586A JP6457159B2 (ja) | 2016-08-31 | 2016-08-31 | 不要物除去システム、不要物除去方法及びプログラム |
PCT/JP2016/075485 WO2018042551A1 (ja) | 2016-08-31 | 2016-08-31 | 不要物除去システム、不要物除去方法及びプログラム |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2016/075485 WO2018042551A1 (ja) | 2016-08-31 | 2016-08-31 | 不要物除去システム、不要物除去方法及びプログラム |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018042551A1 true WO2018042551A1 (ja) | 2018-03-08 |
Family
ID=61301747
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2016/075485 WO2018042551A1 (ja) | 2016-08-31 | 2016-08-31 | 不要物除去システム、不要物除去方法及びプログラム |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6457159B2 (ja) |
WO (1) | WO2018042551A1 (ja) |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002063580A * | 2000-08-22 | 2002-02-28 | Asia Air Survey Co Ltd | Extended inter-image matching method using free-form windows |
JP2002074323A * | 2000-09-01 | 2002-03-15 | Kokusai Kogyo Co Ltd | Method and system for creating a three-dimensional urban space model |
US20020035553A1 * | 2000-07-20 | 2002-03-21 | Kim Seung Bum | Intelligent interpolation methods for automatic generation of an accurate digital elevation model |
JP2006323608A * | 2005-05-18 | 2006-11-30 | Kozo Keikaku Engineering Inc | Apparatus and method for creating models of groups of three-dimensional structures, and three-dimensional model creation system |
JP2008164481A * | 2006-12-28 | 2008-07-17 | Mitsubishi Electric Corp | Elevation model generation device, elevation model generation method, and elevation model generation program |
JP2009014643A * | 2007-07-09 | 2009-01-22 | Asahi Koyo Kk | Three-dimensional shape extraction device, method, and program |
JP2011158278A * | 2010-01-29 | 2011-08-18 | Pasuko:Kk | Method and device for filtering laser data |
JP2013088188A * | 2011-10-14 | 2013-05-13 | Fuji Architect Co Ltd | Method for investigating the form of a three-dimensional measurement object |
US20140267250A1 * | 2013-03-15 | 2014-09-18 | Intermap Technologies, Inc. | Method and apparatus for digital elevation model systematic error correction and fusion |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2002236019A * | 2001-02-08 | 2002-08-23 | Pasuko:Kk | Ground surface extraction processing system |
JP4438418B2 * | 2004-01-13 | 2010-03-24 | Asahi Koyo Kk | Three-dimensional data processing method and device |
2016
- 2016-08-31 JP JP2018536586A patent/JP6457159B2/ja active Active
- 2016-08-31 WO PCT/JP2016/075485 patent/WO2018042551A1/ja active Application Filing
Non-Patent Citations (1)
Title |
---|
XU, JINGZHONG ET AL.: "High-precision DEM reconstruction based on airborne LiDAR point clouds", Proceedings of SPIE 9158, Remote Sensing of the Environment: 18th National Symposium on Remote Sensing of China, 14 May 2014 (2014-05-14), pages 915808-1 - 915808-8 * |
Also Published As
Publication number | Publication date |
---|---|
JPWO2018042551A1 (ja) | 2018-12-20 |
JP6457159B2 (ja) | 2019-01-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2020207166A1 (zh) | Object detection method and apparatus, electronic device, and storage medium | |
JP6144826B2 (ja) | Interactive and automatic 3D object scanning method for creating databases | |
CN113192179B (zh) | Three-dimensional reconstruction method based on binocular stereo vision | |
CN112710318A (zh) | Map generation method, path planning method, electronic device, and storage medium | |
US9922049B2 (en) | Information processing device, method of processing information, and program for processing information | |
JP6259959B1 (ja) | Drone control system, drone control method, and program | |
JP2015114954A (ja) | Captured image analysis method | |
WO2019080768A1 (zh) | Information processing device, aerial photography path generation method, program, and recording medium | |
US10645297B2 (en) | System, method, and program for adjusting angle of camera | |
JP2022514429A (ja) | Calibration method, apparatus, system, and device for image acquisition equipment, and storage medium | |
JP6775748B2 (ja) | Computer system, position estimation method, and program | |
KR102475790B1 (ko) | Map production platform apparatus and map production method using the same | |
CN113987246A (zh) | Automatic picture naming method, apparatus, medium, and electronic device for drone inspection | |
JP6457159B2 (ja) | Unnecessary object removal system, unnecessary object removal method, and program | |
WO2019150418A1 (ja) | Drone, extermination method, and program | |
CN109242900B (zh) | Focal plane positioning method, processing apparatus, focal plane positioning system, and storage medium | |
CN114627395B (zh) | Multi-rotor UAV angle analysis method, system, and terminal based on nested targets | |
CN113312435A (zh) | High-precision map updating method and device | |
CN111890358B (zh) | Binocular obstacle avoidance method, apparatus, storage medium, and electronic apparatus | |
CN113674331A(zh) | Image alignment method and apparatus, electronic device, and computer-readable storage medium | |
CN110852278B (zh) | Ground marking line recognition method, device, and computer-readable storage medium | |
US10553022B2 (en) | Method of processing full motion video data for photogrammetric reconstruction | |
KR101967284B1 (ko) | Apparatus and method for aligning multiple images | |
KR102300541B1 (ko) | Spatial information generation method and apparatus using GeoAI | |
US12025423B2 (en) | Camera information calculation device and system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 2018536586 Country of ref document: JP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 16915112 Country of ref document: EP Kind code of ref document: A1 |
NENP | Non-entry into the national phase |
Ref country code: DE |
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 11/06/2019) |
122 | Ep: pct application non-entry in european phase |
Ref document number: 16915112 Country of ref document: EP Kind code of ref document: A1 |