WO2014170060A1 - Method for measuring the volume of a cluster of materials - Google Patents

Method for measuring the volume of a cluster of materials

Info

Publication number
WO2014170060A1
Authority
WO
WIPO (PCT)
Prior art keywords
materials
drone
cluster
stereoscopic images
acquired
Prior art date
Application number
PCT/EP2014/053873
Other languages
French (fr)
Inventor
Frédéric Serre
Original Assignee
Delta Drone
Priority date
Filing date
Publication date
Priority to FR1353513A (published as FR3004801A1)
Priority to FR1353513
Application filed by Delta Drone
Publication of WO2014170060A1


Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical means
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C39/00Aircraft not otherwise provided for
    • B64C39/02Aircraft not otherwise provided for characterised by special use
    • B64C39/024Aircraft not otherwise provided for characterised by special use of the remote controlled vehicle type, i.e. RPV
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLYING SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/02Arrangements or adaptations of signal or lighting devices
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLYING SUITS; PARACHUTES; ARRANGEMENTS OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D47/00Equipment not otherwise provided for
    • B64D47/08Arrangements of cameras
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04Interpretation of pictures
    • G01C11/06Interpretation of pictures by comparison of two or more pictures of the same area
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/50Depth or shape recovery
    • G06T7/55Depth or shape recovery from multiple images
    • G06T7/593Depth or shape recovery from multiple images from stereo images
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/60Analysis of geometric attributes
    • G06T7/62Analysis of geometric attributes of area, perimeter, diameter or volume
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C2201/00Unmanned aerial vehicles; Equipment therefor
    • B64C2201/02Unmanned aerial vehicles; Equipment therefor characterized by type of aircraft
    • B64C2201/027Flying platforms
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C2201/00Unmanned aerial vehicles; Equipment therefor
    • B64C2201/12Unmanned aerial vehicles; Equipment therefor adapted for particular use
    • B64C2201/123Unmanned aerial vehicles; Equipment therefor adapted for particular use for imaging, or topography
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C2201/00Unmanned aerial vehicles; Equipment therefor
    • B64C2201/14Unmanned aerial vehicles; Equipment therefor characterised by flight control
    • B64C2201/141Unmanned aerial vehicles; Equipment therefor characterised by flight control autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C2201/00Unmanned aerial vehicles; Equipment therefor
    • B64C2201/14Unmanned aerial vehicles; Equipment therefor characterised by flight control
    • B64C2201/141Unmanned aerial vehicles; Equipment therefor characterised by flight control autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS]
    • B64C2201/145Unmanned aerial vehicles; Equipment therefor characterised by flight control autonomous, i.e. by navigating independently from ground or air stations, e.g. by using inertial navigation systems [INS] using satellite radio beacon positioning systems, e.g. GPS
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10016Video; Image sequence
    • G06T2207/10021Stereoscopic video; Stereoscopic image sequence
    • GPHYSICS
    • G06COMPUTING; CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing

Abstract

The invention concerns a method for measuring the volume of a cluster of materials, comprising: - defining (20) a flight plan of a drone (10) for flying over a cluster (2) of materials; - acquiring (22), by the drone, when flying over the cluster of materials along the defined flight plan, a plurality of stereoscopic images of the cluster of materials, the acquired stereoscopic images being suitable for forming a digital terrain model representing the surface topography of the cluster of materials; - combining (28) said acquired images to form the digital terrain model; - calculating (32) the volume of the cluster of materials from the digital terrain model that has been formed.

Description

METHOD FOR MEASURING THE VOLUME OF A CLUSTER OF MATERIALS

[001] The invention relates to a method for measuring the volume of a cluster of materials.

[002] In the mining industry, it is common to store materials in the form of large clusters, typically deposited in the open. It is often necessary to measure the volume of such clusters of materials, in order to evaluate the quantity of materials present on a given site.

[003] For example, in a quarry, it may be necessary to measure the volume of a mass of aggregates deposited in the open. This can be important for taking an inventory of the quarry at a given moment. In addition, better knowledge of the stocks makes it possible to better manage the operation of the quarry itself, for example for extraction or sale.

[004] Generally, such a volume measurement is performed manually by surveyors, which is costly in time and manpower. This is why such a measurement is rarely carried out. In addition, such a measurement usually requires the surveyors to climb onto the pile, which presents a major accident risk.

[005] The following documents are also known from the state of the art:

- ES 2366717 A1;

- US 2010/250125 A1;

- JP 2008186145 A;

- EP 2333481 A1;

- US 2006/082590 A1;

- EP 1100048 A1.

[006] There is therefore a need for a method for reliably measuring the volume of a cluster of materials, with increased speed and at a reduced cost.

 [007] The invention therefore relates to a method of measuring the volume of a mass of materials according to claim 1.

[008] The drone allows a digital model of the pile of materials to be acquired faster than a team of expert surveyors, while maintaining sufficient precision to then determine the volume of this pile of materials reliably. Indeed, the volume measurement with a drone can be done in less than an hour, whereas the measurement by an expert surveyor takes days, or even weeks, before a result is obtained.

[009] In addition, the use of the drone to fly over the pile of materials is more economical and less complicated to implement than the use of an imaging satellite or of a piloted aircraft such as a helicopter. Embodiments of the invention may include one or more of the features of any of the dependent claims 2 to 5.

 The invention will be better understood on reading the description which follows, given solely by way of nonlimiting example and with reference to the drawings in which:

- Figure 1 is a schematic illustration, in a perspective view, of a cluster of materials;

- Figure 2 is a schematic representation of a drone that can be used to measure the volume of the cluster of materials of Figure 1;

- Figure 3 is a flowchart of a method for measuring the volume of the cluster of materials of Figure 1 by means of the drone of Figure 2;

- Figure 4 is a flow diagram of a step of the method of Figure 3.

In these figures, the same references are used to designate the same elements.

 In the following description, the features and functions well known to those skilled in the art are not described in detail.

Figure 1 shows a cluster 2 of materials, deposited on a terrain 4. Here, this cluster 2 is formed of aggregates of homogeneous nature. The particle size of this cluster corresponds here to the granular class "2/8" as defined by the standard "NF EN 13285". The ground 4 is here flat.

 Advantageously, an optical reference pattern 6, also called reference target, is placed on the cluster 2. This pattern 6 will be described in more detail in the following.

A drone 10 is able to fly over the cluster 2 in order to measure its volume.

Figure 2 shows in greater detail an example of such a drone 10 and of a control unit 12 of this drone 10. The term drone ("Unmanned Aerial Vehicle") here designates an aircraft of reduced dimensions, able to travel without a human pilot on board, for example autonomously according to a predefined flight plan.

This drone 10 is here a rotary-wing drone, for example a quadrotor. This drone 10 is thus able to remain hovering. This drone 10 comprises an optical imaging device 14, capable of acquiring stereoscopic images. For example, the drone 10 is able to move at an altitude less than or equal to 3 m or 7 m or 15 m or 20 m. This drone 10 has a mass less than 10 kg or 5 kg and a wingspan less than 3 m or 2.5 m or 2 m. This drone 10 is in particular capable of transmitting data, such as data acquired by the device 14, to the unit 12. This drone 10 is also able to take off and move autonomously, for example following a flight plan transmitted by the unit 12. This drone 10 advantageously comprises a geolocation device 16. Such a geolocation device is able to provide the geographical coordinates of the position occupied by the drone 10. This geographical location is here expressed in the form of coordinates of a satellite positioning system, such as GPS ("Global Positioning System") coordinates. Such a device 16 therefore here comprises a GPS receiver.

The unit 12 is suitable:

- for transmitting instructions, such as a flight plan or a take-off command, to the drone 10, and

- for receiving data coming from the drone 10, such as data coming from the device 14.

This unit 12 here comprises a microcomputer equipped with a communication interface and with control software for the drone 10.

 The device 14 here comprises a stereoscopic optical camera.

 An example of a method for measuring the volume occupied by cluster 2 will now be described with reference to the flowchart of FIG. 3 and using FIGS. 1, 2 and 4.

During a step 20, a flight plan of the drone 10 is defined so that the drone 10 flies over the cluster 2. For example, the flight plan is automatically defined by means of the unit 12. Here, geographic coordinates of the portion of the terrain 4 comprising the cluster 2 are provided. For example, a plurality of waypoints through which the drone must pass while flying over the cluster 2 are defined, so as to be able to acquire a set of stereoscopic images illustrating the whole of the outer surface of the cluster 2. For example, the flight plan is defined so that the drone 10 describes a circle around the cluster 2, in order to acquire images of all the side faces of the cluster 2. Each waypoint is here identified by its geographical coordinates and by its altitude. The flight plan is thus defined so that the drone 10 passes through each of these waypoints, preferably following a path of reduced length.
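Purely by way of illustration (this sketch is not part of the original application), such a circular arrangement of waypoints around the cluster could be generated as follows; the centre coordinates, radius, altitude and number of waypoints are assumed example values.

```python
import math

def circular_flight_plan(center_lat, center_lon, radius_m, altitude_m, n_points=12):
    """Waypoints (latitude, longitude, altitude) on a circle around the cluster.

    Uses a rough local flat-earth approximation: ~111,320 m per degree of
    latitude and ~111,320 * cos(latitude) m per degree of longitude.
    """
    m_per_deg_lat = 111_320.0
    m_per_deg_lon = 111_320.0 * math.cos(math.radians(center_lat))
    waypoints = []
    for k in range(n_points):
        theta = 2.0 * math.pi * k / n_points
        dlat = radius_m * math.sin(theta) / m_per_deg_lat
        dlon = radius_m * math.cos(theta) / m_per_deg_lon
        waypoints.append((center_lat + dlat, center_lon + dlon, altitude_m))
    return waypoints

# Example: 12 waypoints on a 30 m circle flown at 15 m altitude (assumed values).
plan = circular_flight_plan(45.7640, 4.8357, radius_m=30.0, altitude_m=15.0)
```

Visiting consecutive waypoints around the circle already yields a path of reduced length, as required by the flight plan described above.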

 Then, during a step 22, a plurality of stereoscopic images of the cluster 2 is acquired by means of the drone. Figure 4 shows in more detail an example of this step 22.

 First, during an operation 24, the defined flight plan is transmitted to the drone 10, for example by means of the unit 12. Then, during an operation 26, a take-off order is transmitted to the drone 10.

Then, during an operation 28, the drone 10 moves over the cluster 2, according to the flight plan received during the operation 24. A plurality of stereoscopic images of the cluster 2 is then acquired using the device 14. In this example, these stereoscopic images are acquired periodically, as the drone moves, with a predetermined periodicity. For example, a stereoscopic image is acquired every 0.5 s or every 0.1 s. The stereoscopic images are advantageously acquired so as to have a pairwise mutual overlap. For example, two stereoscopic images acquired consecutively by the device 14 have a mutual overlap. Two images are said to have a mutual overlap if respective portions of these two images represent the same scene. For example, these images have a mutual overlap over a portion of their area greater than or equal to 20% or 40% or 50% or 80% of their total area. The device 16 here records the geographical coordinates of the position occupied by the drone 10 during the acquisition of each image by the device 14. In the remainder of the description, the expression "coordinates of an image" will be used to designate these coordinates for each stereoscopic image. Advantageously, the altitude of the drone 10 is also recorded at each acquisition.
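The link between the acquisition periodicity and the desired overlap can be illustrated with a short sketch (not part of the original application); the altitude, field of view, ground speed and overlap target used below are assumed example values for a nadir-looking camera.

```python
import math

def max_capture_interval_s(altitude_m, fov_deg, speed_m_s, min_overlap=0.8):
    """Longest interval between two shots that still guarantees the requested
    forward overlap between consecutive nadir-looking stereoscopic images."""
    footprint_m = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)  # along-track ground footprint
    max_spacing_m = footprint_m * (1.0 - min_overlap)                        # allowed ground distance between shots
    return max_spacing_m / speed_m_s

# Example: 15 m altitude, 60 degree field of view, 2 m/s ground speed, 80 % overlap
# gives roughly 1.7 s between images.
print(round(max_capture_interval_s(15.0, 60.0, 2.0, 0.8), 2), "s between images")
```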

 Advantageously, during an operation 30, calibration data are recorded, in order to subsequently determine the scale of the acquired stereoscopic images.

For example, at least one copy of an optical reference pattern 6, and preferably three copies, are placed on the cluster 2 prior to the acquisition of the stereoscopic images during the operation 28. To simplify, only one copy of this pattern 6 is shown in Figure 1.

This reference optical pattern 6 is visible from the location of the drone 10 while this drone flies over the cluster 2 according to the flight plan, and can thus be recorded on the stereoscopic images. This reference optical pattern has known dimensions, so as to serve as a dimensional scale on the acquired stereoscopic images.

In this example, the pattern 6 is placed on the cluster 2. This pattern 6 is here a cross-shaped pattern drawn on a panel. Operation 30 is performed in conjunction with the operation 28 of acquiring the stereoscopic images.

Then, during an operation 32, the stereoscopic images acquired during the operation 28 are received by the unit 12, each in the form of a digital stereoscopic image. Advantageously, the geographical coordinates of each of these stereoscopic images, as well as their altitude, are also received by the unit 12.

 At the end of step 22, there is therefore a set of stereoscopic images taken at different positions, illustrating the entire outer surface of the cluster 2.

 Then, during a step 40, a surface topography of the cluster 2 is formed from the stereoscopic images acquired during step 22.

By topography of a surface is here meant a set of data representing, at any point of a surface, the geographical elevation of this surface. This topography is, for example, a digital terrain model (or "digital elevation model"). Such a digital terrain model, for a given object, is for example represented, in digital form, by a cloud of points. To each of these points correspond the position and the average altitude of the portion of the object corresponding to this point.

In this example, these stereoscopic images are combined by means of a photogrammetry method to form the digital terrain model. This combination is for example carried out by means of the software "Agisoft PhotoScan Professional" version 0.8, distributed by the company "Agisoft". The dimensions of these stereoscopic images are calibrated from the dimensions of the pattern 6 as it appears on these stereoscopic images and from the known real dimensions of this pattern 6.
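As an illustrative sketch (not part of the original application, and independent of any particular photogrammetry software), the scale calibration from a reference pattern of known size amounts to a simple ratio between real and model dimensions; the numerical values below are assumed.

```python
def scale_factor(known_size_m, measured_size_model_units):
    """Factor mapping model units to metres, from one reference pattern."""
    return known_size_m / measured_size_model_units

def rescale_points(points, factor):
    """Apply a uniform scale to a point cloud given as (x, y, z) triples."""
    return [(x * factor, y * factor, z * factor) for x, y, z in points]

# Example: a 1.0 m arm of the cross-shaped pattern measures 0.43 model units in
# the reconstruction, so one model unit corresponds to about 2.33 m.
s = scale_factor(1.0, 0.43)
# Note that volumes scale with the cube of the factor: V_metres = V_model * s**3.
```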

At the end of this step 40, a digital terrain model of the cluster 2 is available. In this example, the digital terrain model obtained for the cluster 2 comprises a cloud of 52,000 points, distributed with an average density of 95 points per m² over the surface of the cluster 2.

Then, during a step 50, the volume of the cluster 2 is calculated from the digital terrain model obtained during step 40. This calculation is here done automatically, by first interpolating, from the cloud of points, a continuous surface forming the envelope of the cluster 2, and then calculating the volume delimited by the union of this continuous surface with the surface of the ground 4.
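A minimal sketch of such a volume calculation over a flat ground 4 is given below (not part of the original application); it uses a simple grid average of the point cloud rather than the interpolation of a continuous surface described above, and the cell size and ground level are assumed parameters.

```python
from collections import defaultdict

def volume_from_point_cloud(points, ground_z=0.0, cell_size=0.2):
    """Volume between the surveyed surface of the cluster and a flat ground plane.

    points    : iterable of (x, y, z) coordinates in metres (the digital terrain model)
    ground_z  : altitude of the flat ground 4
    cell_size : side, in metres, of the square grid cells used for the integration
    """
    cells = defaultdict(list)
    for x, y, z in points:
        cells[(int(x // cell_size), int(y // cell_size))].append(z)
    volume = 0.0
    for heights in cells.values():
        mean_height = sum(heights) / len(heights)          # average surface height in this cell
        volume += max(mean_height - ground_z, 0.0) * cell_size ** 2
    return volume
```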

In this example, the volume of the cluster 2 calculated in step 50 is equal to 395.71 m³. In order to illustrate the efficiency of this process, the volume of this cluster 2 was measured independently by a team of surveyors. This team obtained a value of 404 m³ for the volume of the cluster 2, a difference of only 2% compared to the value obtained using the process. The method therefore has sufficient accuracy and reliability, while being faster and less expensive to implement.
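As a check on the figures above, the relative deviation is |404 − 395.71| / 404 ≈ 8.29 / 404 ≈ 2%, which corresponds to the difference quoted.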

 Finally, here, in a step 60, the value of the calculated volume is provided on a communication interface.

 Many other embodiments are possible.

The method can be used to measure the volume of a plurality of clusters of different materials deposited on the ground 4. In this case, the volume of each of these clusters can be determined. The flight plan is then defined so that the drone 10 flies over each of these clusters in succession, so as to separately acquire stereoscopic images of each cluster on which the other clusters are not visible. In another variant, several clusters may appear on a given stereoscopic image. In this case, during step 40, the clusters are automatically identified, in order to build separate digital terrain models for each of these clusters. For example, different optical reference patterns are used to identify each of the clusters.

The pattern 6 may have a different shape and/or different dimensions. Several copies of the pattern 6 can be used on the cluster 2. Several reference patterns, different from one another, can be placed on the cluster 2.

The device 16 may be different. For example, the geolocation of the drone 10 is provided by means of a radio triangulation method. For this purpose, at least three radio transmitters are arranged on the ground 4 at known locations. The device 16 comprises a radio receiver able to receive the radio signals emitted by these transmitters and to determine geographical coordinates from these received radio signals. These radio signals are for example emitted according to the so-called ultra-wideband (UWB) technology.
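This radio-triangulation variant can be illustrated by a standard two-dimensional trilateration sketch (not part of the original application); the transmitter positions and measured distances below are assumed example values.

```python
def trilaterate(p1, d1, p2, d2, p3, d3):
    """2-D position of the drone from its distances to three transmitters
    placed at known positions p1, p2, p3 on the ground."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a, b = 2.0 * (x2 - x1), 2.0 * (y2 - y1)
    c = d1**2 - d2**2 - x1**2 + x2**2 - y1**2 + y2**2
    d, e = 2.0 * (x3 - x2), 2.0 * (y3 - y2)
    f = d2**2 - d3**2 - x2**2 + x3**2 - y2**2 + y3**2
    det = a * e - b * d          # zero if the three transmitters are collinear
    return ((c * e - b * f) / det, (a * f - c * d) / det)

# Example: transmitters at (0, 0), (100, 0) and (0, 100) metres; a receiver at
# (50, 50) is about 70.7 m from each of them (assumed values).
print(trilaterate((0, 0), 50 * 2**0.5, (100, 0), 50 * 2**0.5, (0, 100), 50 * 2**0.5))
```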

The acquisition periodicity of the images during the operation 28 may be different. This periodicity can for example be defined with respect to the distance traveled by the drone. An image can be acquired every 2 m or every meter.

The calibration data can be recorded differently. For example, the pattern 6 placed on the cluster 2 in the form of a panel is replaced by a predefined image projected by the drone onto the cluster 2. In this case, the operation 30 consists in projecting, during the acquisition of each stereoscopic image, this predefined image onto the cluster 2, so that this projected predefined image is visible on the acquired stereoscopic images. This predefined image here has known dimensions, and is then used, during step 40, to calibrate the distances on each of the images. For this purpose, the drone 10 comprises a light source capable of projecting such a predefined image.

For example, such a source comprises two laser pointers, each emitting a laser beam along a rectilinear trajectory. These laser pointers are arranged so that these two rectilinear paths are parallel to each other and separated from one another by a predefined distance. The projection of these two laser beams onto the cluster 2 leads to the formation, on the surface of this cluster 2, of two luminous spots spaced from each other by a known distance. This separation between the two spots is then measured on the acquired stereoscopic images to allow the calibration of the distances on the stereoscopic images.
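As an illustrative sketch (not part of the original application), the image scale deduced from the two parallel laser spots is simply the known real spacing divided by the spacing measured on the image; the numerical values below are assumed.

```python
def metres_per_pixel(known_spacing_m, measured_spacing_px):
    """Image scale deduced from two laser spots with a known real spacing."""
    return known_spacing_m / measured_spacing_px

# Example: spots 0.50 m apart appear 125 pixels apart in an image (assumed values),
# i.e. 4 mm per pixel; a feature 800 pixels wide then spans about 3.2 m.
scale = metres_per_pixel(0.50, 125)
feature_width_m = 800 * scale
```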

In a variant, these laser pointers may be arranged in such a way that the trajectories of their respective beams are divergent, with a known divergence angle. The two spots therefore have, on the cluster 2, a spacing that varies according to the distance between the drone 10 and the cluster 2. In this case, the calibration comprises an additional operation of determining the distance between the drone 10 and the cluster 2, in order to then calculate the distance separating the two spots.

According to another variant, the operation 30 does not require a reference optical pattern. In this case, the calibration of a stereoscopic image is performed according to:

- the coordinates of this stereoscopic image;

- the altitude of the drone 10 recorded for this stereoscopic image; and

- the distance at which the drone 10 was from the cluster 2 at the time of acquisition of this stereoscopic image.

This distance is for example measured by means of a rangefinder carried on board the drone 10, and recorded during the acquisition of this stereoscopic image.
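Both calibration variants, the divergent laser beams and the rangefinder-based calibration, rest on simple geometric relations, sketched below for illustration (not part of the original application); the baseline, divergence angle, camera pixel pitch and focal length are assumed example values based on a pinhole camera model.

```python
import math

def spot_spacing_m(baseline_m, divergence_deg, distance_m):
    """Separation of the two laser spots on the cluster, for beams that diverge
    symmetrically by a known total angle from a known baseline at the drone."""
    return baseline_m + 2.0 * distance_m * math.tan(math.radians(divergence_deg) / 2.0)

def ground_sample_distance_m(pixel_pitch_m, focal_length_m, distance_m):
    """Ground size covered by one image pixel, from the measured drone-to-surface
    distance (pinhole camera model)."""
    return pixel_pitch_m * distance_m / focal_length_m

# Assumed example values: 0.20 m baseline, 2 degree total divergence, 12 m range,
# 4.8 micrometre pixels and a 16 mm lens.
print(spot_spacing_m(0.20, 2.0, 12.0))                 # about 0.62 m between the spots
print(ground_sample_distance_m(4.8e-6, 0.016, 12.0))   # about 3.6 mm per pixel
```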

Claims

1. Method for measuring the volume of a cluster of materials, characterized in that this method comprises:

- the definition (20) of a flight plan of a drone (10) for flying over a cluster (2) of materials;

- the acquisition (22), by the drone, during the overflight of the cluster of materials according to the defined flight plan, of a plurality of stereoscopic images of the cluster of materials, the acquired stereoscopic images being able to form a digital terrain model representing the surface topography of the cluster of materials;

- the recording (30) of calibration data for determining the scale of the acquired stereoscopic images, this recording comprising the positioning of an optical reference pattern (6) on the cluster of materials prior to the acquisition of the stereoscopic images;

- the drone having a light source configured to project a predefined optical pattern onto the cluster of materials;

- the recording of the calibration data comprising the projection of the predefined pattern onto the cluster of materials by means of the light source, this predefined optical pattern forming the reference optical pattern;

- at least one of the acquired stereoscopic images of the cluster of materials comprising said optical reference pattern;

- the combination (28) of said acquired images to form the digital terrain model;

- the calculation (32) of the volume of the cluster of materials from the digital terrain model thus formed.
2. Method according to claim 1, wherein the light source comprises two laser pointers, each of these pointers being configured to emit laser radiation along a rectilinear trajectory, said trajectories being parallel to each other and separated from each other by a predefined distance.
3. Method according to claim 1, wherein the drone comprises:

- a geolocation device (16), able to provide geographical coordinates indicating the position of the drone;

- a rangefinder, able to measure the distance separating the drone from the surface of the cluster of materials;

and wherein the recording of the calibration data comprises, during the acquisition of the stereoscopic images, the determination of the distance separating the drone from the surface of the cluster of materials by means of the rangefinder.
4. Method according to any one of claims 1 to 3, wherein the drone comprises a stereoscopic camera (14).
5. Method according to any one of claims 1 to 4, wherein the acquired stereoscopic images have pairwise common portions.
PCT/EP2014/053873 2013-04-18 2014-02-27 Method for measuring the volume of a cluster of materials WO2014170060A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
FR1353513A FR3004801A1 (en) 2013-04-18 2013-04-18 Method for measuring the volume of a material amas
FR1353513 2013-04-18

Publications (1)

Publication Number Publication Date
WO2014170060A1 (en) 2014-10-23

Family

ID=49111329

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2014/053873 WO2014170060A1 (en) 2013-04-18 2014-02-27 Method for measuring the volume of a cluster of materials

Country Status (2)

Country Link
FR (1) FR3004801A1 (en)
WO (1) WO2014170060A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017170651A1 (en) * 2016-03-31 2017-10-05 住友重機械工業株式会社 Work management system for construction machine, and construction machine

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1100048A1 (en) * 1999-11-12 2001-05-16 Société N3DI S.A.R.L. Automatic building process of a digital model using stereoscopic image couples
US20060082590A1 (en) * 2004-10-14 2006-04-20 Stevick Glen R Method and apparatus for dynamic space-time imaging system
JP2008186145A (en) * 2007-01-29 2008-08-14 Mitsubishi Electric Corp Aerial image processing apparatus and aerial image processing method
US20100250125A1 (en) * 2007-07-04 2010-09-30 Kristian Lundberg Arrangement and method for providing a three dimensional map representation of an area
ES2366717A1 (en) * 2008-09-03 2011-10-25 Universidad De Sevilla Information collection equipment in works and infrastructures based on an untripulated air vehicle.
EP2333481A1 (en) * 2009-11-27 2011-06-15 Thales Optoelectronic system and method for creating three-dimensional identification images

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10217207B2 (en) 2016-01-20 2019-02-26 Ez3D, Llc System and method for structural inspection and construction estimation using an unmanned aerial vehicle
WO2017186515A1 (en) * 2016-04-25 2017-11-02 Siemens Aktiengesellschaft Aircraft for scanning an object, and system for damage analysis for the object
CN109073498A (en) * 2016-04-25 2018-12-21 西门子股份公司 For the mobile aircraft of sweep object and the system of the Failure analysis for object

Also Published As

Publication number Publication date
FR3004801A1 (en) 2014-10-24

Similar Documents

Publication Publication Date Title
Neitzel et al. Mobile 3D mapping with a low-cost UAV system
US7693617B2 (en) Aircraft precision approach control
Xiang et al. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV)
Lin et al. Mini-UAV-borne LIDAR for fine-scale mapping
US20060004496A1 (en) Enhanced vertical situation display
Colomina et al. Unmanned aerial systems for photogrammetry and remote sensing: A review
JP2017501383A (en) Method and apparatus for correcting plane conditions in real time
CN101228412B (en) System and method for data mapping and map discrepancy reporting
Grenzdörffer et al. The photogrammetric potential of low-cost UAVs in forestry and agriculture
Eisenbeiss A mini unmanned aerial vehicle (UAV): system overview and image acquisition
EP2362289B1 (en) Methods and systems for displaying predicted downpath parameters in a vertical profile display
US9007461B2 (en) Aerial photograph image pickup method and aerial photograph image pickup apparatus
KR20110027654A (en) Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features
Bendea et al. Low cost UAV for post-disaster assessment
US20190094862A1 (en) STRUCTURE FROM MOTION (SfM) PROCESSING FOR UNMANNED AERIAL VEHICLE (UAV)
ES2623372T3 (en) Apparatus and method for landing a rotary wing aircraft
EP2560152B1 (en) Aircraft vision system including a runway position indicator
EP1897081B1 (en) Perspective view conformal traffic targets display
US9336568B2 (en) Unmanned aerial vehicle image processing system and method
CN202549080U (en) Fusion system of radar data, flight plan data and ADS-B data
Küng et al. The accuracy of automatic photogrammetric techniques on ultra-light UAV imagery
WO2006019417A1 (en) Method and apparatus for displaying attitude, heading, and terrain data
EP2724204B1 (en) Method for acquiring images from arbitrary perspectives with uavs equipped with fixed imagers
CA2596063A1 (en) Precision approach guidance system and associated method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14706865

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase in:

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 14706865

Country of ref document: EP

Kind code of ref document: A1