IL278930A - Location identification based on terrain model with distance measurement

Location identification based on terrain model with distance measurement

Info

Publication number
IL278930A
Authority
IL
Israel
Prior art keywords
camera
terrain
location
distance
point
Prior art date
Application number
IL278930A
Other languages
Hebrew (he)
Inventor
Yifrach Aharon
Original Assignee
Israel Aerospace Ind Ltd
Yifrach Aharon
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Israel Aerospace Ind Ltd, Yifrach Aharon filed Critical Israel Aerospace Ind Ltd
Priority to IL278930A priority Critical patent/IL278930A/en
Priority to PCT/IL2021/051356 priority patent/WO2022107126A1/en
Publication of IL278930A publication Critical patent/IL278930A/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T17/00 - Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T17/05 - Geographic models
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00 - Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/04 - Interpretation of pictures
    • G01C11/06 - Interpretation of pictures by comparison of two or more pictures of the same area
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 - Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching

Landscapes

  • Engineering & Computer Science (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Computer Graphics (AREA)
  • Theoretical Computer Science (AREA)
  • Automation & Control Theory (AREA)
  • Instructional Devices (AREA)
  • Radar Systems Or Details Thereof (AREA)
  • Position Fixing By Use Of Radio Waves (AREA)
  • Measurement Of Optical Distance (AREA)

Description

LOCATION IDENTIFICATION BASED ON TERRAIN MODEL WITH DISTANCE MEASUREMENT

TECHNICAL FIELD

The presently disclosed subject matter relates to geographic location determination, and in particular to systems for identifying location without dependence on external devices.
BACKGROUND

Problems of geographic location determination have been recognized in the conventional art, and various techniques have been developed to provide solutions.
GENERAL DESCRIPTION

According to one aspect of the presently disclosed subject matter there is provided a method of identifying a current location within a geographic region, the method comprising: a) obtaining, by a processor, a regional terrain model (RTM), the RTM comprising data indicative of three or more vertical elevations, each vertical elevation being indicative of a height of terrain in a respective cell of the geographic region; b) obtaining, by the processor, for each of at least two reference terrain points, data indicative of: a three-dimensional (3D) direction vector of the reference terrain point relative to the current location and an associated distance, thereby giving rise to local area terrain elevation data; c) identifying, by the processor, a location of the geographic region as the current location, the identifying being in accordance with, at least: a. data derivative of the local area terrain elevation data, and b. the data indicative of the three or more vertical elevations.
In addition to the above features, the method according to this aspect of the presently disclosed subject matter can comprise one or more of features (i) to (xii) listed below, in any desired combination or permutation which is technically possible: (i) the data derivative of the local area terrain elevation data comprises: for each terrain point of the at least two reference terrain points: 1. a vertical elevation of the reference terrain point; and 2. a two-dimensional (2D) direction vector of the reference terrain point relative to the current location, and an associated distance. (ii) the identifying the location of the geographic region as the current location is in accordance with a method comprising: a. selecting a location in the geographic region, thereby giving rise to a candidate location; b. for each terrain point of the at least two reference terrain points: i. selecting a cell of the geographic region, in accordance with, at least: the candidate location, the two-dimensional direction vector of the reference terrain point relative to the current location, and the associated distance, thereby giving rise to a terrain point candidate cell, ii. calculating a difference between the vertical elevation of the respective reference terrain point, and a vertical elevation of the respective terrain point candidate cell, indicated by the data comprised in the RTM, thereby giving rise to a calculated difference for each reference terrain point; and c. calculating a metric of matching of the candidate location, in accordance with the calculated differences for each of the at least two reference terrain points.
(iii) the method additionally comprises: d) repeating a) - c) for one or more additional candidate locations, thereby giving rise to a plurality of candidate locations and respective calculated metrics of difference; and e) identifying a candidate location of the plurality of candidate locations as the current location, in accordance with, at least, the respective calculated metrics of matching. (iv) the metric of matching is calculated in accordance with the sum of squares of the calculated differences. (v) the identifying comprises selecting a candidate location of the plurality of candidate locations which gave rise to a best calculated metric of matching. (vi) the identifying comprises selecting a candidate location of the plurality of candidate locations which gave rise to a calculated metric of matching meeting a matching threshold. (vii) the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point utilizes a LIDAR. (viii) the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point utilizes a laser rangefinder (LRF). (ix) the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point, comprises: a. capturing, by a first camera in a first location and with a first orientation, a first digital image, the first digital image comprising pixels depicting the respective terrain point, and capturing, by a second camera in a second location and with a second orientation, a second digital image, the second digital image comprising pixels depicting the respective terrain point; the second camera location being separated from the first camera location by an inter-camera spacing, the second camera orientation being substantially parallel to the first camera orientation, thereby giving rise to two captured digital images; b. 
registering, by a processor, one of the captured digital images to the other captured digital image, thereby giving rise to at least one of: a horizontal pixel-shift distance, and a vertical pixel-shift distance; and c. calculating, by the processor, data indicative of a distance from the current location to the respective terrain point in accordance with, at least: i) at least one of: the horizontal pixel-shift distance, and the vertical pixel-shift distance, and ii) the inter-camera spacing. (x) the obtaining data indicative of a three-dimensional direction comprises: providing a three-dimensional direction in accordance with at least one of: the first camera orientation and the second camera orientation. (xi) the method additionally comprises, prior to the registering: i) calibrating the first camera and the second camera, thereby giving rise to at least one of a group consisting of: a) data indicative of a difference between x-axis rotation of the first camera and x-axis rotation of the second camera, b) data indicative of a difference between y-axis rotation of the first camera and y-axis rotation of the second camera, and c) data indicative of a difference between z-axis rotation of the first camera and z-axis rotation of the second camera; ii) adjusting at least one of the digital images in accordance with at least one of a group consisting of: a) the difference between x-axis rotation of the first camera and x-axis rotation of the second camera, b) the difference between y-axis rotation of the first camera and y-axis rotation of the second camera, and c) the difference between z-axis rotation of the first camera and z-axis rotation of the second camera. (xii) the first camera and the second camera are the same camera.
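The distance calculation of feature (ix) can be sketched in code. The following is an illustrative sketch only, assuming a pinhole camera model with a known focal length expressed in pixels (`focal_length_px` is an assumed parameter, not named in the text); the text itself only specifies that distance is derived from the pixel-shift distance and the inter-camera spacing.

```python
def distance_from_pixel_shift(pixel_shift_px: float,
                              inter_camera_spacing_m: float,
                              focal_length_px: float) -> float:
    """Classic stereo relation: distance = focal_length * baseline / disparity.

    `pixel_shift_px` is the horizontal (or vertical) pixel-shift distance
    obtained by registering one captured image to the other.
    """
    if pixel_shift_px <= 0:
        raise ValueError("pixel shift must be positive for a finite distance")
    return focal_length_px * inter_camera_spacing_m / pixel_shift_px

# Example: cameras 1 m apart, 1000 px focal length, 4 px shift -> 250 m
print(distance_from_pixel_shift(4.0, 1.0, 1000.0))  # 250.0
```

Note that a larger inter-camera spacing or focal length makes the same pixel shift correspond to a larger distance, which is why the spacing must be known to the calculation.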
According to a further aspect of the presently disclosed subject matter there is provided a system of identifying a current location within a geographic region, the system comprising a first processing circuitry configured to: a) obtain a regional terrain model (RTM), the RTM comprising data indicative of three or more vertical elevations, each vertical elevation being indicative of a height of terrain in a respective cell of the geographic region; b) obtain for each of at least two reference terrain points, data indicative of: a three-dimensional (3D) direction vector of the reference terrain point relative to the current location and an associated distance, thereby giving rise to local area terrain elevation data; c) identify a location of the geographic region as the current location, the identifying being in accordance with, at least: a. data derivative of the local area terrain elevation data, and b. the data indicative of the three or more vertical elevations.
This aspect of the disclosed subject matter can further optionally comprise one or more of features (i) to (xii) listed above with respect to the method, mutatis mutandis, in any desired combination or permutation which is technically possible.
This aspect of the disclosed subject matter can further optionally comprise the following features: a first camera in a first location and with a first orientation, configured to capture a first digital image comprising pixels depicting the respective terrain point; a second camera in a second location and with a second orientation, the second camera location being separated from the first camera location by an inter-camera spacing, the second camera orientation being substantially parallel to the first camera orientation, the second camera being configured to capture a second digital image comprising pixels depicting the respective terrain point; a distance measurement subsystem, operably connected to the processing circuitry, the first camera, and the second camera, wherein the distance measurement subsystem comprises a second processing circuitry, and wherein the second processing circuitry is configured to: a. register one of the captured digital images to the other captured digital image, thereby giving rise to at least one of: a horizontal pixel-shift distance, and a vertical pixel-shift distance, and b. calculate data indicative of a distance from the current location to the respective terrain point in accordance with, at least: i) at least one of: the horizontal pixel-shift distance, and the vertical pixel-shift distance, and ii) the inter-camera spacing.
In addition to the above features, the system according to this aspect of the presently disclosed subject matter can comprise feature (xiii): (xiii) the second processing circuitry is the first processing circuitry.
According to another aspect of the presently disclosed subject matter there is provided a computer program product comprising a non-transitory computer readable storage medium retaining program instructions, which, when read by a processing circuitry, cause the processing circuitry to perform a method of identifying a current location within a geographic region, the method comprising: a) obtaining, by a processor, a regional terrain model (RTM), the RTM comprising data indicative of three or more vertical elevations, each vertical elevation being indicative of a height of terrain in a respective cell of the geographic region; b) obtaining, by the processor, for each of at least two reference terrain points, data indicative of: a three-dimensional (3D) direction vector of the reference terrain point relative to the current location and an associated distance, thereby giving rise to local area terrain elevation data; c) identifying, by the processor, a location of the geographic region as the current location, the identifying being in accordance with, at least: a. data derivative of the local area terrain elevation data, and b. the data indicative of the three or more vertical elevations.
This aspect of the disclosed subject matter can optionally comprise one or more of features (i) to (xii) listed above with respect to the method, mutatis mutandis, in any desired combination or permutation which is technically possible.
BRIEF DESCRIPTION OF THE DRAWINGS

In order to understand the invention and to see how it can be carried out in practice, embodiments will be described, by way of non-limiting examples, with reference to the accompanying drawings, in which:

FIG. 1 illustrates an example deployment scenario for a location identification system utilizing a Digital Terrain Model (DTM), in accordance with some embodiments of the presently described subject matter.
FIG. 2 illustrates a block diagram of an example DTM-based location identification system with its components, in accordance with some embodiments of the presently described subject matter.
FIG. 3 illustrates an example logical representation of a digital terrain model, in accordance with some embodiments of the presently described subject matter.
FIG. 4 illustrates a flow diagram of an example method of location identification using a digital terrain model, in accordance with some embodiments of the presently disclosed subject matter.

FIG. 5 illustrates a relationship between a three-dimensional direction and distance to a terrain point, and a corresponding 2-dimensional horizontal direction and vertical elevation, in accordance with some embodiments of the presently disclosed subject matter.
FIG. 6 is a logical illustration of a sparse local terrain model, in accordance with some embodiments of the presently disclosed subject matter.
FIG. 7 illustrates a flow diagram of an example method of identifying a location in the geographic region as the current location, in accordance with some embodiments of the presently disclosed subject matter.
FIG. 8 illustrates a block diagram of an example passive parallax-based distance measurement subsystem usable in a location identification system, in accordance with some embodiments of the presently disclosed subject matter.
FIG. 9 illustrates a flow diagram of an example method of determining a distance and 3D direction vector of a terrain point relative to the current location, in accordance with some embodiments of the presently disclosed subject matter.
DETAILED DESCRIPTION

In the following detailed description, numerous specific details are set forth in order to provide a thorough understanding of the invention. However, it will be understood by those skilled in the art that the presently disclosed subject matter may be practiced without these specific details. In other instances, well-known methods, procedures, components and circuits have not been described in detail so as not to obscure the presently disclosed subject matter.
Unless specifically stated otherwise, as apparent from the following discussions, it is appreciated that throughout the specification discussions utilizing terms such as "processing", "computing", "comparing", "determining", "calculating", "receiving", "providing", "obtaining", "sensing", "capturing" or the like, refer to the action(s) and/or process(es) of a computer that manipulate and/or transform data into other data, said data represented as physical, such as electronic, quantities and/or said data representing the physical objects. The term "computer" should be expansively construed to cover any kind of hardware-based electronic device with data processing capabilities including, by way of non-limiting example, the processor disclosed in the present application.
The terms "non-transitory memory" and "non-transitory storage medium" used herein should be expansively construed to cover any volatile or non-volatile computer memory suitable to the presently disclosed subject matter.
The operations in accordance with the teachings herein may be performed by a computer specially constructed for the desired purposes or by a general-purpose computer specially configured for the desired purpose by a computer program stored in a non- transitory computer-readable storage medium.
Embodiments of the presently disclosed subject matter are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the presently disclosed subject matter as described herein.
Attention is now directed to FIG. 1, which illustrates an example deployment scenario for a location identification system utilizing a Digital Terrain Model (DTM), in accordance with some embodiments of the presently described subject matter.
Location identification system 100 can be located at a position on earth surface 110. Earth surface 110 can be flat, or can have hills, inclines, trees, bodies of water etc.
Location identification system 100 can be – for example – located within a manned or autonomous vehicle, or on a person (for example in a pack carried on the person’s back).
Terrain points (e.g. tops of hills) such as terrain points 120A and 120B can be present.
Terrain points 120A and 120B can each be located at a particular three-dimensional direction and distance from location identification system 100. Terrain points 120A and 120B can also have particular elevations (relative to e.g. sea level) associated with them.

In some embodiments, location identification system 100 can determine its geographical location without utilizing a global positioning system (GPS). In some embodiments, location identification system 100 can measure distances and directions to terrain points and detect the current location using the terrain point measurements in conjunction with a DTM. This can be beneficial - for example - in a situation where there is no GPS signal, or where the signal is not reliable or is subject to interference.
As used herein, a DTM can be a digital data structure that includes elevation data that represents terrain. As used herein, the term DTM non-exclusively encompasses both a digital terrain model that represents the bare ground surface without objects like plants and buildings, as well as a digital surface model (DSM) that represents the earth's surface including objects on it.
DTMs are used for a variety of tasks such as geomorphological analysis, modeling water flow, etc.
A DTM can be implemented in various ways, such as a raster (a grid of square cells, also known as a heightmap when representing elevation), a vector-based triangular irregular network, a different suitable implementation, etc.
Attention is now directed to FIG. 2, which illustrates a block diagram of an example DTM-based location identification system with its components, in accordance with some embodiments of the presently described subject matter.
Location identification system 100 can include processing circuitry 200.
Processing circuitry 200 can include processor 210 and memory 220.
Processor 210 can be a suitable hardware-based electronic device with data processing capabilities, such as, for example, a general purpose processor, digital signal processor (DSP), a specialized Application Specific Integrated Circuit (ASIC), one or more cores in a multicore processor etc. Processor 210 can also consist, for example, of multiple processors, multiple ASICs, virtual processors, combinations thereof etc.
Memory 220 can be, for example, a suitable kind of volatile or non-volatile storage, and can include, for example, a single physical memory component or a plurality of physical memory components. Memory 220 can also include virtual memory. Memory 220 can be configured to, for example, store various data used in computation.
Processor 210 can be configured to execute several functional modules in accordance with computer-readable instructions implemented on a non-transitory computer-readable storage medium. Such functional modules are referred to hereinafter as comprised in the processor. These modules can include, for example, digital terrain model 240, distance measurement unit 230, and location calculation unit 250.
Digital terrain model 240 can be a suitable type of data structure that includes data pertaining to the elevations of terrain within a mapped region, as described above with reference to FIG. 1. A non-limiting example of such a data structure is provided below with reference to FIG. 3. Digital terrain model 240 can be stored, for example, in a suitable memory or non-volatile storage mechanism – as known in the art.
Distance measurement subsystem 295 can be a suitable type of system that can measure or estimate distances and 3-dimensional directions from the current location to various terrain points. In some embodiments, distance measurement subsystem 295 utilizes a light detection and ranging system such as the system commonly known as LIDAR. In some embodiments, distance measurement subsystem 295 utilizes a laser rangefinder (LRF) to determine distance, while using a suitable mechanism for determining 3-dimensional directions. In some embodiments, distance measurement subsystem 295 is a passive parallax-based distance measurement system such as – for example – the system described below with reference to FIGs. 8-9.
From a particular location of location identification system 100, various terrain features (e.g. hills, inclines etc.) can be – for example – within line-of-sight. Distance measurement subsystem 295 can measure three-dimensional directions and distances to these terrain points.
Distance measurement unit 230 can be operably connected to distance measurement subsystem 295 and can send/receive data (e.g. measurement data) to/from distance measurement subsystem 295. In some embodiments, distance measurement unit 230 also manages distance measurement subsystem 295 (e.g. directs or activates the specific components for measuring distances and directions to particular locations).
Location calculation unit 250 can then utilize data (e.g. measurement data) from distance measurement unit 230 - in combination with elevation data from digital terrain model 240 - to determine the current location. Location calculation unit 250 can do this – for example - using methods described below with reference to FIG. 4 and FIG. 7.
Among the advantages of such methods is that they enable detection of current location in situations where GPS is not available or not reliable, and that they can provide a location detection system that is entirely based on passive components - which can consequently resist detection during deployment (e.g. in tactical situations) and can have low power consumption.
Display unit 260 can be any suitable type of internal or external screen or other output device. Display unit 260 can display a map, coordinate data, or some other indication of the current location as determined by e.g. location calculation unit 250. In some embodiments, display unit 260 is part of processing circuitry 200. In some embodiments, display unit 260 is an external component that is – for example - operably connected to processing circuitry 200.
Attention is now directed to FIG. 3, which illustrates an example logical representation of a digital terrain model, in accordance with some embodiments of the presently described subject matter.
A digital terrain model 300 can include data pertaining to a particular region on the earth’s surface. Accordingly, there can be geographical coordinates (e.g. longitude and/or latitude) associated with the edges of the DTM. DTM 300 can have a width 320 (e.g. in the East-West axis) and length 310 (e.g. in the North-South axis).
By way of non-limiting example: a DTM 300 can describe a 10 km x 10 km area, and consist of a grid of 1000 x 1000 cells, where each cell pertains to a region of 10 meters x 10 meters.

A DTM 300 can include a group of cells (such as cell 330). Each cell can be associated with a particular geographic coordinate (e.g. longitude and latitude) denoting e.g. the position of the center of the cell or one of the corners of the cell. Each cell can be associated with a length and width, e.g. a cell might pertain to a geographic square area of 10 meters x 10 meters, or 5 meters x 5 meters, or some other value. In FIG. 3, the cells containing dots denote additional undepicted cells.
Each cell can be associated with a terrain elevation (e.g. in the example of FIG. 3 elevation value 340 of cell 330 is 20 meters, while elevation value 350 is 35 meters) i.e. a measurement of elevation (e.g. relative to sea level or some other reference elevation) of the terrain corresponding to the cell. In some embodiments, the terrain elevation of the cell denotes the highest elevation of the terrain in the cell. In some embodiments, the terrain elevation of the cell denotes the height of the terrain in the center of the cell. In some embodiments, the terrain elevation of the cell denotes an average height or some other data pertaining to the height of the terrain corresponding to the cell.
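As a rough illustration of such a raster DTM, the following sketch (a hypothetical class, not taken from the patent text) stores one elevation value per cell and looks up the cell containing a given map point:

```python
class RasterDTM:
    """Toy raster DTM: a grid of square cells, one elevation value per cell."""

    def __init__(self, elevations, cell_size_m, origin_xy=(0.0, 0.0)):
        self.elevations = elevations    # elevations[row][col], metres above a reference
        self.cell_size_m = cell_size_m  # width/length of each square cell
        self.origin_xy = origin_xy      # map coordinates of the corner of cell (0, 0)

    def elevation_at(self, x_m, y_m):
        """Elevation of the cell containing map point (x_m, y_m)."""
        col = int((x_m - self.origin_xy[0]) // self.cell_size_m)
        row = int((y_m - self.origin_xy[1]) // self.cell_size_m)
        return self.elevations[row][col]

# A tiny 2 x 2 grid of 10 m cells, echoing the 20 m / 35 m values of FIG. 3:
dtm = RasterDTM([[20.0, 35.0], [25.0, 30.0]], cell_size_m=10.0)
print(dtm.elevation_at(12.0, 3.0))  # falls in row 0, column 1 -> 35.0
```

The 10 km x 10 km example above would correspond to a 1000 x 1000 `elevations` grid with `cell_size_m=10.0`.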
In some embodiments, the terrain elevation of the cell denotes the elevation including surface elements such as trees or buildings. In some embodiments, the terrain elevation of the cell denotes the elevation of the earth surface, without trees, buildings etc.
The cells of the geographic region represented in the DTM can be of various shapes and/or various sizes.
Attention is now directed to FIG. 4, which illustrates a flow diagram of an example method of location identification using a digital terrain model, in accordance with some embodiments of the presently disclosed subject matter.
The method described in FIG. 4 operates, in some embodiments, by measuring distances from the current location to various terrain points and then, after processing, attempting to correlate the resulting data with the data of the DTM. Even though the terrain points are not necessarily the high points of the terrain, the method described herein can identify the current location with sufficient accuracy.

Location identification system 100 can identify a current location - for example - in the course of autonomous or human-driven vehicle navigation, or in response to a command from a human user.
The processing circuitry 200 (e.g. location calculation unit 250) can obtain (410) a digital terrain model pertaining to a region in which it is located. The terms "regional digital terrain model" or "regional terrain model" (RTM) used herein can denote a DTM pertaining to a particular geographical region.
By way of non-limiting example, an unmanned vehicle including location identification system 100 can obtain a stored DTM 240 that was preloaded into memory 220 or storage (e.g. when a vehicle is launched into an area). Alternatively, location identification system 100 can receive a DTM via wireless communication or another suitable mechanism.
For reasons of clarity, the ensuing description in some cases describes a DTM that is implemented as a raster representing the terrain of the region as consisting of three or more cells, and includes - for each of the cells - data indicative of a vertical elevation value, each vertical elevation value being indicative of height of terrain in the cell, such as in the example presented above with reference to FIG. 3.
It will be clear to one skilled in the art how the teachings herein can be extended to utilize other implementations of DTMs that include data indicative of three or more vertical elevations, each vertical elevation being indicative of a height of terrain in a respective cell of the geographic region.
Processing circuitry 200 (e.g. distance measurement unit 230) can then – for some number of terrain points (e.g. two or more) - obtain (420) data indicative of the 3-dimensional (3D) direction vector between the current location and the respective terrain point, and of the associated distance from the current location to the respective terrain point along the vector (e.g. x, y, z coordinates relative to the current location, angular azimuth/angular elevation/distance, or another suitable notation indicative of the 3D direction and distance).

A terrain point can be a suitable kind of point or feature in the environment surrounding location identification system 100. In some embodiments, the terrain points are line-of-sight earth features such as hilltops, or points on flat or inclined ground, terrain protrusions such as rocks etc. The term "reference terrain points" can herein refer to these terrain points measured relative to the current location for location determination.
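Under the azimuth/elevation-angle/distance notation mentioned above, a measurement can be converted into relative Cartesian components with basic trigonometry (the FIG. 5 relationship). The following is an illustrative sketch; the function name and the convention of azimuth measured clockwise from north are assumptions, not taken from the patent text.

```python
import math

def measurement_to_components(azimuth_rad, elevation_rad, distance_m):
    """Convert an azimuth / elevation-angle / distance measurement into
    (east_m, north_m, up_m) offsets of the terrain point relative to the
    current location."""
    horizontal = distance_m * math.cos(elevation_rad)  # ground-plane distance
    up = distance_m * math.sin(elevation_rad)          # vertical height difference
    east = horizontal * math.sin(azimuth_rad)
    north = horizontal * math.cos(azimuth_rad)
    return east, north, up

# A terrain point 1000 m away, due north, 30 degrees above the horizontal:
east, north, up = measurement_to_components(0.0, math.radians(30.0), 1000.0)
print(round(up, 1))  # 500.0 m higher than the current location
```

The `up` component gives the terrain point's height difference relative to the current location, while `(east, north)` gives the 2D direction vector and horizontal distance used in the matching step.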
The processing circuitry 200 (e.g. distance measurement unit 230) can utilize distance measurement subsystem 295 to measure the 3-dimensional direction and distance to a reference terrain point ("measuring" as used herein can also include estimating). In some embodiments, distance measurement subsystem 295 is a LIDAR system. In some embodiments, distance measurement subsystem 295 utilizes a laser rangefinder (LRF) to determine distance, while using a suitable mechanism for determining 3-dimensional directions. In some embodiments, distance measurement subsystem 295 is a passive parallax-based system such as the one described below with reference to FIGs. 8-9. In some embodiments, distance measurement subsystem 295 is a different suitable system. Alternatively, in some embodiments, the terrain point measurements can be collected by a separate system and entered manually by a human.
Distance measurement subsystem 295 can measure e.g. 5 reference terrain points, or 10 reference terrain points or some other number of reference terrain points for use by processing circuitry 200 (e.g. distance measurement unit 230). Generally speaking: a larger number of reference terrain points can result in a more accurate determination of location.
It is noted that it is not required that distance measurement subsystem 295 measure reference terrain points that are local points of maximum height on the earth surface – though measurement of such points can result in more accurate results. The term "local area terrain elevation data" used herein refers to measured terrain point 3D direction and distance information.

After the distances/directions to reference terrain points have been measured, the processing circuitry 200 (e.g. location calculation unit 250) can identify (430) a location of the geographic region as the current location from: a. data derivative of, at least, the local area terrain elevation data, and b. the data comprised in the DTM that is indicative of the three or more vertical elevations.
It is noted that "data derivative of" a particular prior data, as used herein, can include the prior data itself.
It is noted that – in various embodiments – the accuracy of the identification of a location as the current location can depend on various factors, including:
- the number of terrain point measurements and the accuracy of these measurements
- the number of candidate locations and the distances between such candidate locations (e.g. as described with reference to FIG. 7 below)
Accordingly, in some embodiments the identified location of the geographic region can be within the same cell as the current location. In some embodiments, the identified location can be within e.g. 200 meters, 100 meters, or some other distance from the current location.
Generally speaking, the processing circuitry 200 (e.g. location calculation unit 250) can seek a match (e.g. a best match as indicated by a metric of matching) between the local area terrain elevation data and the terrain as described by the DTM. When the best match is found, the current location is indicated by the matched reference terrain points (in conjunction with the measured directions/distances).
By way of non-limiting example, the processing circuitry 200 (e.g. location calculation unit 250) can compute a vertical elevation (for example: relative to sea level or a different fixed elevation) for each of at least two of the reference terrain points, and also compute a two-dimensional direction vector and associated distance from the current location to the respective reference terrain point. An example of this computation is described hereinbelow with reference to FIG. 5. The processing circuitry 200 (e.g. location calculation unit 250) can then compute a best match between the computed vertical elevations of the reference terrain points and corresponding vertical elevations indicated by the data comprised in the DTM. Some example methods for computing such a best match are described below, with reference to FIG. 7.
Alternatively, other methods can be used to infer the current location from the data of the DTM and the local area terrain elevation data. Some of these methods do not directly utilize vertical elevations. For example, the local area terrain elevation data can be represented as a continuous height function in 2 variables. In this case processing circuitry 200 (e.g. location calculation unit 250) can find a best match by correlating the function to the raster of the raster-implemented DTM - as is evident to one versed in the art of digital terrain modeling.
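As a minimal illustration of matching local elevation data against a raster DTM, the sketch below slides a small elevation patch over a toy raster and picks the offset with the smallest sum of squared differences. All names and the toy data are illustrative assumptions, not the patent's implementation; a real system would match the measured sparse data against a full-scale DTM.

```python
import numpy as np

def best_patch_offset(dtm: np.ndarray, patch: np.ndarray) -> tuple:
    """Slide a small local-elevation patch over a DTM raster and return
    the (row, col) offset whose cells match best (smallest sum of
    squared differences)."""
    ph, pw = patch.shape
    best, best_rc = np.inf, (0, 0)
    for r in range(dtm.shape[0] - ph + 1):
        for c in range(dtm.shape[1] - pw + 1):
            ssd = np.sum((dtm[r:r + ph, c:c + pw] - patch) ** 2)
            if ssd < best:
                best, best_rc = ssd, (r, c)
    return best_rc

# A toy 6x6 DTM with a distinctive bump at rows 2-3, cols 3-4.
dtm = np.full((6, 6), 20.0)
dtm[2:4, 3:5] = [[30.0, 35.0], [30.0, 30.0]]
patch = np.array([[30.0, 35.0], [30.0, 30.0]])
print(best_patch_offset(dtm, patch))  # -> (2, 3)
```

A normalized cross-correlation would serve equally well where elevations carry an unknown vertical offset (e.g. when the system's own elevation is unknown).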
Attention is now directed to FIG. 5, which illustrates a relationship between a three-dimensional direction and distance to a terrain point, and a corresponding 2-dimensional horizontal direction and vertical elevation, in accordance with some embodiments of the presently disclosed subject matter.
As described hereinabove with reference to FIG. 4, the processing circuitry 200 (e.g. distance measurement unit 230) can utilize distance measurement subsystem 295 to measure the distance 515 and 3-dimensional (3D) direction 525 from the current location 500 to a terrain point 510. From the measured distance 515 and measured 3D direction 525, processing circuitry 200 (e.g. location calculation unit 250) can calculate vertical elevation 545 and horizontal direction/distance 535 using trigonometric operations, as is known in the art.
In some embodiments, processing circuitry 200 (e.g. distance measurement unit 230) can utilize an altimeter or other mechanism to determine its own elevation relative to sea level. In this case, processing circuitry 200 (e.g. distance measurement unit 230) can calculate the vertical elevations of the terrain points relative to e.g. sea level. In some other embodiments, processing circuitry 200 (e.g. distance measurement unit 230) lacks awareness of its own elevation, so it can calculate the vertical elevations of the terrain points relative to its own elevation.
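The trigonometric decomposition illustrated in FIG. 5 can be sketched as follows. The function and field names are illustrative, and encoding the measured 3D direction as azimuth/elevation angles is one possible convention, not a requirement of the disclosed subject matter.

```python
import math

def decompose_measurement(distance_m: float, azimuth_deg: float,
                          elevation_deg: float) -> dict:
    """Split a measured slant distance and 3D direction (azimuth and
    elevation angles) into a vertical elevation difference and a 2D
    horizontal direction/distance."""
    el = math.radians(elevation_deg)
    az = math.radians(azimuth_deg)
    vertical = distance_m * math.sin(el)    # vertical elevation (545)
    horizontal = distance_m * math.cos(el)  # horizontal distance (535)
    # East/north components of the 2D direction vector
    east = horizontal * math.sin(az)
    north = horizontal * math.cos(az)
    return {"vertical": vertical, "horizontal": horizontal,
            "east": east, "north": north}

m = decompose_measurement(1000.0, azimuth_deg=90.0, elevation_deg=30.0)
print(round(m["vertical"], 1), round(m["horizontal"], 1))  # -> 500.0 866.0
```

If the system's own elevation above sea level is known, the computed vertical value can simply be added to it to obtain an absolute terrain point elevation.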
Attention is now directed to FIG. 6, which depicts a logical illustration of a sparse local terrain model in accordance with some embodiments of the presently disclosed subject matter.
The term "sparse local terrain model" is herein utilized to refer to a logical collection of terrain point data including the vertical elevation of particular terrain points, and organized by horizontal coordinates (relative to the current location) of the terrain point. A logical illustration of a sparse local terrain model appears below, with reference to FIG. 6.
Processing circuitry 200 (e.g. distance measurement unit 230) can search subsets of cells of the DTM to find a best match between the vertical elevations indicated by the regional DTM and the vertical elevations indicated by the sparse local terrain model.
Sparse local terrain model 600 can have a width 620 (e.g. in the East-West axis) that is smaller than or the same as the DTM width 320. Sparse local terrain model 600 can have a length 610 (e.g. in the North-South axis) that is smaller than or the same as the DTM length. Sparse local terrain model 600 can include a current location cell 630 (i.e. a cell that indicates the location of the current location relative to other cells). Sparse local terrain model 600 can include terrain point cells. Terrain point cells can include the calculated vertical elevation of the terrain point (e.g. 214 meters).
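One possible in-memory representation of such a sparse local terrain model is a mapping from horizontal cell offsets (relative to the current-location cell) to computed elevations, with unmeasured cells simply absent. The names and values below are illustrative only.

```python
# Sparse local terrain model: (east_cells, north_cells) -> elevation in
# meters; cells without a measured terrain point are simply not stored.
sparse_model = {
    (0, 0): None,     # the current-location cell itself
    (3, -2): 214.0,   # terrain point: 3 cells east, 2 cells south
    (-1, 4): 230.5,   # another measured terrain point (illustrative)
}

def elevation_at(model: dict, offset: tuple):
    """Return the stored elevation for a cell offset, or None if that
    cell holds no measured terrain point."""
    return model.get(offset)

print(elevation_at(sparse_model, (3, -2)))  # -> 214.0
```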
Attention is now directed to FIG. 7, which illustrates a flow diagram of an example method of identifying a location in the geographic region as the current location, in accordance with some embodiments of the presently disclosed subject matter.
The processing circuitry 200 (e.g. location calculation unit 250) can select (710) a location for evaluation (e.g. the precise center of the cell, or a coordinate within the cell, etc.), thereby giving rise to a candidate location. The term "location" can refer to a geographic area within the cell (or crossing cells), and distance from such a "location" can be computed from any place within this area.
The processing circuitry 200 (e.g. location calculation unit 250) can next, for each of the reference terrain points, select (720) a cell with a 2D direction and associated distance from the candidate location that matches (or approximates) the 2D direction and associated distance of the reference terrain point relative to the current location (for example: as calculated from a 3D direction vector and distance determined by distance measurement subsystem 295). This cell is hereinafter termed the "terrain point candidate cell" for a respective terrain point.
More generally, the processing circuitry 200 (e.g. location calculation unit 250) can select each candidate terrain point cell, in accordance with, at least: the candidate location, the two-dimensional direction vector of the reference terrain point relative to the current location, and the associated distance.
The processing circuitry 200 (e.g. location calculation unit 250) can next, for each of the reference terrain points, calculate (730) the difference between a vertical elevation of the reference terrain point and the vertical elevation of the candidate terrain point cell as indicated by the data of the DTM.
The processing circuitry 200 (e.g. location calculation unit 250) can then calculate (740) a metric of matching of the candidate location from the per-terrain point calculated differences.
The metric of matching can be, for example, in accordance with the sum of the per-reference terrain point calculated differences.
The metric of matching can also be, for example, in accordance with the sum of squares of the per-reference terrain point calculated differences. By way of non-limiting example, processing circuitry 200 (e.g. location calculation unit 250) can compute the following formula:

Metric of matching = Σ (i = 1 to n) |E_m(i) − E_d(i)|

where E_m denotes the vertical elevation of a reference terrain point, e.g. as measured (or derived from measurement), E_d denotes the vertical elevation of the respective candidate terrain point as indicated in the DTM, and n denotes the number of reference terrain point cells.
In such cases, a smaller result for the metric of matching is indicative of a better match between the candidate terrain point cells and the measured terrain points.
Similarly, in such cases, the smallest result for the metric of matching can be indicative of a best match between the candidate terrain point cells and the measured terrain points.
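A minimal sketch of the sum-of-absolute-differences variant of the metric above (names are illustrative; the sum-of-squares variant differs only in squaring each term):

```python
def metric_of_matching(measured: list, dtm_values: list) -> float:
    """Sum of absolute per-terrain-point elevation differences; a
    smaller value indicates a better match between the candidate
    terrain point cells and the measured terrain points."""
    return sum(abs(em - ed) for em, ed in zip(measured, dtm_values))

# Candidate A matches the measurements better than candidate B.
measured = [214.0, 230.0, 198.0]
print(metric_of_matching(measured, [214.0, 231.0, 197.0]))  # -> 2.0
print(metric_of_matching(measured, [220.0, 210.0, 190.0]))  # -> 34.0
```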
Optionally: processing circuitry 200 (e.g. location calculation unit 250) can repeat (750) the process of selecting a candidate location, selecting terrain point candidate cells, and calculating a metric of matching for the candidate location based on per-terrain point calculated differences. Processing circuitry 200 (e.g. location calculation unit 250) can perform this repetition for one or more additional candidate locations.
Processing circuitry 200 (e.g. location calculation unit 250) can select a candidate location of the plurality of candidate locations, in accordance with, at least, the respective calculated metrics of matching, thereby giving rise to an identifying of the selected candidate location as the current location.
In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) utilizes a single location of each cell of the DTM (e.g. the center of the cell) as the candidate location for calculating respective metrics of matching. In some other embodiments, processing circuitry 200 (e.g. location calculation unit 250) utilizes more than one location within some or all of the cells of the DTM as candidate locations for calculating respective metrics of matching.
In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) calculates respective metrics of matching, and selects the first candidate location for which the metric of matching meets a matching threshold. In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) calculates all the respective metrics of matching, and then selects the candidate location with the best metric of matching.
In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) utilizes some method for identifying potentially matching candidate locations, calculates respective metrics of matching for the identified potentially matching candidate locations, and then selects the candidate location with the best metric of matching or the first candidate location for which the metric of matching meets a matching threshold.
In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) identifies groups of potentially matching candidate terrain point cells without first identifying potentially matching candidate locations, calculates respective metrics of matching for the groups of potentially matching candidate terrain point cells, and then selects the group of potentially matching candidate terrain point cells with the best metric of matching, or the first group of potentially matching candidate terrain point cells for which the metric of matching meets a matching threshold. The processing circuitry 200 (e.g. location calculation unit 250) can then infer the current location from the selected group of potentially matching candidate terrain point cells.
In some embodiments, processing circuitry 200 (e.g. location calculation unit 250) uses another method for inferring the current location.
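Putting steps 710-760 together, an exhaustive candidate-location search might look like the following sketch. The toy data, names, and cell-indexing convention are illustrative assumptions; real regional DTMs would call for the pruning strategies described above rather than a full scan.

```python
import math

def find_best_location(dtm, cell_size, measurements):
    """Evaluate every cell center as a candidate location, look up the
    candidate terrain point cell implied by each measured 2D
    direction/distance, and keep the candidate with the smallest metric
    of matching (sum of absolute elevation differences).

    `measurements` is a list of (east_m, north_m, elevation_m) tuples
    giving each terrain point's horizontal offset and vertical
    elevation relative to the current location."""
    rows, cols = len(dtm), len(dtm[0])
    best_metric, best_cell = math.inf, None
    for r in range(rows):
        for c in range(cols):
            metric, in_bounds = 0.0, True
            for east, north, elev in measurements:
                tr = r - round(north / cell_size)  # north decreases row index
                tc = c + round(east / cell_size)
                if not (0 <= tr < rows and 0 <= tc < cols):
                    in_bounds = False
                    break
                metric += abs(elev - dtm[tr][tc])
            if in_bounds and metric < best_metric:
                best_metric, best_cell = metric, (r, c)
    return best_cell, best_metric

# Toy 4x4 DTM (30 m cells); one terrain point 30 m east of the true cell.
dtm = [[20, 20, 20, 20],
       [20, 30, 35, 20],
       [20, 20, 20, 20],
       [20, 20, 20, 20]]
print(find_best_location(dtm, 30.0, [(30.0, 0.0, 35.0)]))  # -> ((1, 1), 0.0)
```

With more reference terrain points the minimum becomes sharper, which is one way to see why a larger number of measurements improves the accuracy of the identified location.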
Attention is now directed to FIG. 8, which illustrates a block diagram of an example passive parallax-based distance measurement subsystem usable in a location identification system, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 800 can include processor 805 and memory 815. These components can be as described above with reference to FIG. 2. Image processing unit 815 can be comprised in the processing circuitry 800, and can implement methods for image processing and determination of object distance, as described below with reference to FIG. 9. In some embodiments, processing circuitry 800 is distinct from processing circuitry 200. In some embodiments, processing circuitry 800 is identical to or comprised in processing circuitry 200.
Processing circuitry 800 can be operably connected to left camera 810 and right camera 820, and can receive image data from them. In some embodiments, processing circuitry 800 (e.g. image processing unit 815) can control left camera 810 and/or right camera 820. Left camera 810 and right camera 820 can be any type of suitable device for producing digital images of terrain. Left camera 810 and right camera 820 can be separated by an intercamera spacing 875. Left camera 810 and right camera 820 can have a particular, substantially identical, 3D orientation in space (given, for example, as degrees of azimuth and elevation). Thus, left camera 810 and right camera 820 are substantially parallel. In this context, being "substantially parallel" and having "substantially identical" orientation in space denotes that the left camera orientation and right camera orientation are close enough to each other so that – after any adjustment to compensate for x-axis between-camera rotation, y-axis between-camera rotation, and z-axis between-camera rotation – parallax-based distance detection is accurate enough for the particular application in which it is being used.
Left camera 810 and right camera 820 can generate images with a particular digital image width 885 (e.g. in pixels). Left camera 810 and right camera 820 can generate images with angle of view 865. In some embodiments, left camera 810 and right camera 820 can be the same camera e.g. a single camera can be moved to capture the respective digital images.
Attention is now directed to FIG. 9, which illustrates a flow diagram of an example method of determining a distance and 3D direction vector of terrain point relative to the current location, in accordance with some embodiments of the presently disclosed subject matter.
Processing circuitry 800 (for example: image processing unit 815) can capture (910) a first digital image – including pixels depicting the terrain point - from a first location and with a first orientation (for example: using left camera 810).
Processing circuitry 800 (for example: image processing unit 815) can next capture (920) a second digital image – including pixels depicting the terrain point - from a second location and with a second orientation (for example: using right camera 820).
The camera capturing the first digital image and the camera capturing the second digital image can be separated by a particular inter-camera spacing, and can be substantially parallel to each other.
Optionally: Processing circuitry 800 (for example: image processing unit 815) can adjust (930) at least one of the two images to compensate for intercamera rotation in the x, y or z axis. More specifically: data indicative of a difference between x-axis rotation of one camera and x-axis rotation of the other camera can be derived, for example, from calibrating the first camera and second camera, as known in the art (and differences of y-axis and/or z-axis rotation can be derived similarly). Processing circuitry 800 (for example: image processing unit 815) can then adjust (930) at least one of the two images in accordance with the derived difference of rotation in x-axis, y-axis, and/or z-axis.
Processing circuitry 800 (for example: image processing unit 815) can register (940) one of the digital images to the other (i.e. registering the first image to the second or the second to the first), resulting in a horizontal and/or vertical pixel shift distance.
These shift distances can indicate the distance (in pixels) that the pixels depicting the terrain point have been displaced in one image compared to the other. Processing circuitry 800 (for example: image processing unit 815) can perform registration using techniques known in the art.
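One registration technique known in the art is phase correlation, sketched below with NumPy. The function name and the whole-pixel implementation are illustrative assumptions; production systems often add subpixel refinement or use feature-based registration instead.

```python
import numpy as np

def pixel_shift(img_a: np.ndarray, img_b: np.ndarray) -> tuple:
    """Estimate the (row, col) translation of img_b relative to img_a
    by phase correlation: the normalized cross-power spectrum of two
    translated images has an impulse at the shift."""
    cross = np.fft.fft2(img_b) * np.conj(np.fft.fft2(img_a))
    cross /= np.abs(cross) + 1e-12            # keep phase only
    corr = np.fft.ifft2(cross).real
    dr, dc = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint correspond to negative shifts (FFT wrap-around)
    if dr > img_a.shape[0] // 2:
        dr -= img_a.shape[0]
    if dc > img_a.shape[1] // 2:
        dc -= img_a.shape[1]
    return int(dr), int(dc)

rng = np.random.default_rng(0)
a = rng.random((64, 64))
b = np.roll(a, shift=5, axis=1)  # b is a shifted 5 pixels to the right
print(pixel_shift(a, b))         # -> (0, 5)
```

The returned horizontal component is the pixel-shift distance fed into the distance calculation described next.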
Processing circuitry 800 (for example: image processing unit 815) can then calculate (950) data indicative of the distance of the terrain point (e.g. the distance itself), in accordance with the horizontal and/or vertical pixel shift distance, the intercamera spacing, and image width.
In some embodiments, processing circuitry 800 (for example: image processing unit 815) performs the calculation according to the following formula:

Distance = (B × X) / (2 × tan(Θ/2) × pixel shift distance)

where:
Θ – camera angle of view
X – picture resolution in pixels
B – intercamera spacing

Processing circuitry 800 (for example: image processing unit 815) determines the 3D direction vector of the terrain point relative to the current location by utilizing the orientation of one of the cameras (as the orientations are substantially identical) or both of the cameras (e.g. averaging the orientations) at the time of the image capture.
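The parallax distance formula above can be sketched directly; here X / (2·tan(Θ/2)) plays the role of a focal length expressed in pixels. The function name and sample parameters are illustrative.

```python
import math

def parallax_distance(pixel_shift_px: float, image_width_px: float,
                      angle_of_view_deg: float, baseline_m: float) -> float:
    """Distance to the terrain point from a stereo pair:
    (B * X) / (2 * tan(theta/2) * pixel shift distance)."""
    focal_px = image_width_px / (2.0 * math.tan(math.radians(angle_of_view_deg) / 2.0))
    return baseline_m * focal_px / pixel_shift_px

# 2 m baseline, 4000 px wide image, 60 deg angle of view, 10 px disparity:
d = parallax_distance(pixel_shift_px=10, image_width_px=4000,
                      angle_of_view_deg=60, baseline_m=2.0)
print(round(d, 1))  # -> 692.8
```

Note that distance is inversely proportional to the pixel shift, so a wider intercamera spacing improves range resolution for distant terrain points.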
It is to be understood that the invention is not limited in its application to the details set forth in the description contained herein or illustrated in the drawings. The invention is capable of other embodiments and of being practiced and carried out in various ways.
Hence, it is to be understood that the phraseology and terminology employed herein are for the purpose of description and should not be regarded as limiting. As such, those skilled in the art will appreciate that the conception upon which this disclosure is based may readily be utilized as a basis for designing other structures, methods, and systems for carrying out the several purposes of the presently disclosed subject matter.
It will also be understood that the system according to the invention may be, at least partly, implemented on a suitably programmed computer. Likewise, the invention contemplates a computer program being readable by a computer for executing the method of the invention. The invention further contemplates a non-transitory computer-readable memory tangibly embodying a program of instructions executable by the computer for executing the method of the invention.
Those skilled in the art will readily appreciate that various modifications and changes can be applied to the embodiments of the invention as hereinbefore described without departing from its scope, defined in and by the appended claims.

Claims (17)

1. A method of identifying a current location within a geographic region, the method comprising:
a) obtaining, by a processor, a regional terrain model (RTM), the RTM comprising data indicative of three or more vertical elevations, each vertical elevation being indicative of a height of terrain in a respective cell of the geographic region;
b) obtaining, by the processor, for each of at least two reference terrain points, data indicative of: a three-dimensional (3D) direction vector of the reference terrain point relative to the current location and an associated distance, thereby giving rise to local area terrain elevation data;
c) identifying, by the processor, a location in the geographic region as the current location, the identifying being in accordance with, at least:
a. data derivative of the local area terrain elevation data, and
b. the data indicative of the three or more vertical elevations.
2. The method of claim 1, wherein the data derivative of the local area terrain elevation data comprises, for each terrain point of the at least two reference terrain points:
i) a vertical elevation of the reference terrain point; and
ii) a two-dimensional (2D) direction vector of the reference terrain point relative to the current location, and an associated distance.
3. The method of claim 2, wherein identifying the location of the geographic region as the current location is in accordance with a method comprising:
a) selecting a location in the geographic region, thereby giving rise to a candidate location;
b) for each terrain point of the at least two reference terrain points:
a. selecting a cell of the geographic region, in accordance with, at least: the candidate location, the two-dimensional direction vector of the reference terrain point relative to the current location, and the associated distance, thereby giving rise to a terrain point candidate cell,
b. calculating a difference between the vertical elevation of the respective reference terrain point, and a vertical elevation of the respective terrain point candidate cell, indicated by the data comprised in the RTM, thereby giving rise to a calculated difference for each reference terrain point; and
c) calculating a metric of matching of the candidate location, in accordance with the calculated differences for each of the at least two reference terrain points.
4. The method of claim 3, additionally comprising:
d) repeating a) - c) for one or more additional candidate locations, thereby giving rise to a plurality of candidate locations and respective calculated metrics of matching; and
e) selecting a candidate location of the plurality of candidate locations, in accordance with, at least, the respective calculated metrics of matching, thereby giving rise to an identifying of the selected candidate location as the current location.

5. The method of any of claims 3-4, wherein the metric of matching is calculated in accordance with the sum of squares of the calculated differences.
6. The method of claim 4, wherein the selecting comprises selecting a candidate location which gave rise to a best calculated metric of matching.
7. The method of claim 4, wherein the selecting comprises selecting a candidate location which gave rise to a calculated metric of matching meeting a matching threshold.
8. The method of any of claims 1-7, wherein the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point utilizes a LIDAR.
9. The method of any of claims 1-7, wherein the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point utilizes a laser range finder.
10. The method of any of claims 1-7, wherein the obtaining data indicative of the 3D direction and associated distance from the current location to the respective terrain point comprises:
a. capturing, by a first camera in a first location and with a first orientation, a first digital image, the first digital image comprising pixels depicting the respective terrain point, and capturing, by a second camera in a second location and with a second orientation, a second digital image, the second digital image comprising pixels depicting the respective terrain point; the second camera location being separated from the first camera location by an inter-camera spacing, the second camera orientation being substantially parallel to the first camera orientation, thereby giving rise to two captured digital images;
b. registering, by a processor, one of the captured digital images to the other captured digital image, thereby giving rise to at least one of: a horizontal pixel-shift distance, and a vertical pixel-shift distance; and
c. calculating, by the processor, data indicative of a distance from the current location to the respective terrain point in accordance with, at least:
i) at least one of: the horizontal pixel-shift distance, and the vertical pixel-shift distance, and
ii) the inter-camera spacing.
11. The method of claim 10, wherein the obtaining data indicative of a three-dimensional direction comprises: providing a three-dimensional direction in accordance with at least one of: the first camera orientation and the second camera orientation.
12. The method of any of claims 10-11, additionally comprising, prior to the registering:
i) calibrating the first camera and the second camera, thereby giving rise to at least one of a group consisting of:
a) data indicative of a difference between x-axis rotation of the first camera and x-axis rotation of the second camera,
b) data indicative of a difference between y-axis rotation of the first camera and y-axis rotation of the second camera, and
c) data indicative of a difference between z-axis rotation of the first camera and z-axis rotation of the second camera;
ii) adjusting at least one of the digital images in accordance with at least one of a group consisting of:
a) the difference between x-axis rotation of the first camera and x-axis rotation of the second camera,
b) the difference between y-axis rotation of the first camera and y-axis rotation of the second camera, and
c) the difference between z-axis rotation of the first camera and z-axis rotation of the second camera.
13. The method of any of claims 10-12, wherein the first camera and the second camera are the same camera.
14. A system of identifying a current location within a geographic region, the system comprising a first processing circuitry configured to perform a method in accordance with any of claims 1-9.

15. The system of claim 14, additionally comprising:
a first camera in a first location and with a first orientation, configured to capture a first digital image comprising pixels depicting the respective terrain point;
a second camera in a second location and with a second orientation, the second camera location being separated from the first camera location by an inter-camera spacing, the second camera orientation being substantially parallel to the first camera orientation, the second camera being configured to capture a second digital image comprising pixels depicting the respective terrain point; and
a distance measurement subsystem, operably connected to the processing circuitry, the first camera, and the second camera, wherein the distance measurement subsystem comprises a second processing circuitry configured to:
a. register one of the captured digital images to the other captured digital image, thereby giving rise to at least one of: a horizontal pixel-shift distance, and a vertical pixel-shift distance, and
b. calculate data indicative of a distance from the current location to the respective terrain point in accordance with, at least:
i) at least one of: the horizontal pixel-shift distance, and the vertical pixel-shift distance, and
ii) the inter-camera spacing.
16. The system of claim 15, wherein the second processing circuitry is the first processing circuitry.
17. A computer program product comprising a non-transitory computer readable storage medium retaining program instructions, which, when read by a processing circuitry, cause the processing circuitry to perform a computerized method of identifying a current location within a geographic region in accordance with any of claims 1-13.

For the applicant, Israel Aerospace Industries Ltd.

[Drawings: FIG. 1 – location identification system 100 and terrain points 120A, 120B on earth surface 110; FIG. 2 – block diagram of processing circuitry 200 (processor 210, memory 220, distance measurement unit 230, digital terrain model 240, location calculation unit 250, display unit 260) and distance measurement subsystem 295; FIG. 3 – example raster DTM with per-cell elevations; FIG. 4 – flow diagram of steps 410-430; FIG. 5 – measured distance 515 and 3D direction 525 to terrain point 510, with vertical elevation 545 and horizontal direction/distance 535; FIG. 6 – sparse local terrain model 600 with current position cell 630; FIG. 7 – flow diagram of steps 710-760; FIG. 8 – passive parallax-based distance measurement subsystem (processing circuitry 800, left camera 810, right camera 820, angle of view 865, intercamera spacing 875, digital image width 885, terrain feature 895); FIG. 9 – flow diagram of steps 910-950.]
Mader et al. An integrated flexible self-calibration approach for 2D laser scanning range finders applied to the Hokuyo UTM-30LX-EW
Ostrowski et al. Analysis of point cloud generation from UAS images
Kupervasser et al. Robust positioning of drones for land use monitoring in strong terrain relief using vision-based navigation
Madeira et al. Accurate DTM generation in sand beaches using mobile mapping
CN112050830B (en) Motion state estimation method and device
Ishii et al. Autonomous UAV flight using the Total Station Navigation System in Non-GNSS Environments
Meguro et al. Omni-directional Motion Stereo Vision based on Accurate GPS/INS Navigation System
Del Pizzo et al. Assessment of shoreline detection using UAV
CN112556725B (en) Detection method for relative accuracy of portable unmanned aerial vehicle non-control point survey chart