WO2022096576A1 - Method, computer program, and apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other - Google Patents
Method, computer program, and apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other
- Publication number
- WO2022096576A1 (PCT/EP2021/080650)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- aerial vehicle
- image data
- camera system
- relative position
- determining
- Prior art date
Links
- 238000000034 method Methods 0.000 title claims abstract description 84
- 238000004590 computer program Methods 0.000 title claims description 9
- 238000012545 processing Methods 0.000 claims description 18
- 230000004807 localization Effects 0.000 description 10
- 230000006870 function Effects 0.000 description 7
- 230000001419 dependent effect Effects 0.000 description 6
- 238000004891 communication Methods 0.000 description 3
- 238000013459 approach Methods 0.000 description 2
- 238000003491 array Methods 0.000 description 2
- 238000013500 data storage Methods 0.000 description 2
- 238000010586 diagram Methods 0.000 description 2
- 238000010801 machine learning Methods 0.000 description 2
- 238000005352 clarification Methods 0.000 description 1
- 239000011159 matrix material Substances 0.000 description 1
- 238000012986 modification Methods 0.000 description 1
- 230000004048 modification Effects 0.000 description 1
- 230000003287 optical effect Effects 0.000 description 1
- 230000003068 static effect Effects 0.000 description 1
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/005—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 with correlation of navigation data from several sources, e.g. map or contour matching
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/20—Instruments for performing navigational calculations
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
- G05D1/104—Simultaneous control of position or course in three dimensions specially adapted for aircraft involving a plurality of aircrafts, e.g. formation flying
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0004—Transmission of traffic-related information to or from an aircraft
- G08G5/0008—Transmission of traffic-related information to or from an aircraft with other aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0017—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information
- G08G5/0021—Arrangements for implementing traffic-related aircraft activities, e.g. arrangements for generating, displaying, acquiring or managing traffic information located in the aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0052—Navigation or guidance aids for a single aircraft for cruising
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0047—Navigation or guidance aids for a single aircraft
- G08G5/0069—Navigation or guidance aids for a single aircraft specially adapted for an unmanned aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/0073—Surveillance aids
- G08G5/0086—Surveillance aids for monitoring terrain
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G5/00—Traffic control systems for aircraft, e.g. air-traffic control [ATC]
- G08G5/04—Anti-collision systems
- G08G5/045—Navigation or guidance aids, e.g. determination of anti-collision manoeuvers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- Embodiments of the present disclosure relate to a method, a computer program, and an apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other.
- It is known to use satellite- or barometer-based positioning systems for navigating encountering aerial vehicles.
- Some applications may require a higher accuracy in locating encountering aerial vehicles than satellite- or barometer-based positioning systems can provide.
- the accuracy in vertical direction may be too low in satellite- and barometer-based positioning systems for some applications.
- the present disclosure relates to a method for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other.
- the method comprises receiving first image data of a first camera system attached to the first aerial vehicle and second image data of a second camera system attached to the second aerial vehicle. Further, the method provides for determining the relative position using a geometric relation of the first and the second image data.
- The first and the second camera system can each be understood as a device for recording single or multiple images (e.g. a film).
- The first and the second camera system, for example, comprise a single camera, a stereo camera, or a multi-camera system/camera array including an (optical) photo camera or a film camera.
- The first/second aerial vehicle, for example, is an airplane, a helicopter, an unmanned aerial vehicle (UAV), or the like.
- the first and the second camera system can be attached below the first and the second aerial vehicle, respectively, and may record an environment below the respective first or second aerial vehicle.
- The geometric relation, for example, is determined from similarities present in the first and the second image data.
- the geometric relation may indicate a relative location and/or an orientation of the first and the second camera system to each other and, thus, the relative position of the first and the second aerial vehicle.
- the relative position particularly may be indicative of a relative altitude of the first and the second aerial vehicle to each other. Additionally, or alternatively, the relative position indicates a transversal/horizontal relative position of the first and the second aerial vehicle to each other.
- the method can be applied for positioning/locating the first and the second aerial vehicle and in some applications to avoid collisions of encountering aerial vehicles and to navigate aerial vehicles based on their relative position.
- the above method can provide a sub-m-accuracy in locating the first and the second aerial vehicle.
- The accuracy can be higher than the accuracy of satellite- or barometer-based positioning systems/concepts.
- the above method is not limited to one first and one second aerial vehicle but can be applied for locating more than two aerial vehicles.
- the first and the second aerial vehicle can be equipped with multiple cameras providing the first and/or the second image data.
- The first and the second image data may each include one or more images.
- determining the relative position comprises identifying a plurality of features present in both the first and the second image data using computer vision and determining first coordinates of the features in the first image data and second coordinates of the features in the second image data. For determining the geometric relation, the method can provide for using the first and the second coordinates.
- the features relate to objects captured by both the first and the second camera system.
- the features may relate to static and particularly “unique”, “recognizable”, and/or “striking” objects within the environment of the aerial vehicles.
- the objects are parts of buildings, plants, parking vehicles, infrastructure objects (e.g. streets), or the like.
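Such shared features are typically found by extracting local descriptors around salient points in both images and matching them across the images. The following is a minimal, illustrative sketch of nearest-neighbour descriptor matching with Lowe's ratio test; the function name and the ratio threshold are assumptions for illustration, not part of the disclosure:

```python
import numpy as np

def match_features(desc1, desc2, ratio=0.8):
    """Nearest-neighbour descriptor matching with Lowe's ratio test.
    desc1: (N, D), desc2: (M, D) float arrays, M >= 2.
    Returns a list of index pairs (i, j) with desc1[i] matched to desc2[j]."""
    desc2 = np.asarray(desc2, dtype=float)
    matches = []
    for i, d in enumerate(np.asarray(desc1, dtype=float)):
        dists = np.linalg.norm(desc2 - d, axis=1)
        j, k = np.argsort(dists)[:2]          # nearest and second nearest
        if dists[j] < ratio * dists[k]:       # keep only distinctive matches
            matches.append((i, int(j)))
    return matches
```

In practice, the descriptors would come from a feature detector (e.g. corner or blob features) applied to the first and the second image data.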
- the first and the second image data each may include a pixel array, wherein each pixel of the first and the second image data refers to coordinates of a respective (two-dimensional) coordinate system. Accordingly, the features in the image data relate to the first and the second coordinates in the (respective coordinate system of) the first and the second image data.
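Before estimating the geometric relation, such pixel coordinates are commonly converted into normalized camera coordinates by undoing the camera intrinsics. A hedged sketch, assuming a simple pinhole model with known focal lengths and principal point and ignoring lens distortion (names are illustrative):

```python
import numpy as np

def pixel_to_normalized(pts, fx, fy, cx, cy):
    """Convert pixel coordinates (N, 2) into normalized camera coordinates
    by removing the pinhole intrinsics: focal lengths fx, fy (in pixels)
    and principal point (cx, cy). Lens distortion is ignored here."""
    pts = np.asarray(pts, dtype=float)
    return np.column_stack(((pts[:, 0] - cx) / fx, (pts[:, 1] - cy) / fy))
```

A pixel at the principal point maps to (0, 0); a pixel one focal length to the right maps to (1, 0).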
- The first and second coordinates are used as input to, for example, the so-called “(normalized) five-point algorithm” or “(normalized) eight-point algorithm” from computer vision for determining the geometric relation.
- the geometric relation includes the so-called “essential matrix” resulting from the five-point or eight-point algorithm.
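As an illustration of how the essential matrix can be obtained from such point correspondences, the following is a minimal NumPy sketch of the linear eight-point algorithm. It assumes noiseless correspondences in normalized camera coordinates and omits the Hartley normalization and outlier rejection (e.g. RANSAC) that practical implementations would add:

```python
import numpy as np

def eight_point_essential(x1, x2):
    """Estimate the essential matrix E from N >= 8 correspondences of
    normalized image coordinates x1 <-> x2 (each of shape (N, 2)), such
    that [u2, v2, 1] @ E @ [u1, v1, 1] is approximately zero."""
    x1 = np.asarray(x1, dtype=float)
    x2 = np.asarray(x2, dtype=float)
    A = np.zeros((x1.shape[0], 9))
    for i in range(x1.shape[0]):
        u1, v1 = x1[i]
        u2, v2 = x2[i]
        # One row of the linear system A @ vec(E) = 0
        A[i] = [u2 * u1, u2 * v1, u2, v2 * u1, v2 * v1, v2, u1, v1, 1.0]
    # vec(E) is the right singular vector with the smallest singular value
    _, _, Vt = np.linalg.svd(A)
    E = Vt[-1].reshape(3, 3)
    # Project onto valid essential matrices: two equal singular values, one zero
    U, S, Vt = np.linalg.svd(E)
    s = (S[0] + S[1]) / 2.0
    return U @ np.diag([s, s, 0.0]) @ Vt
```

The resulting essential matrix can then be decomposed into a relative rotation and a translation direction between the two camera systems, which is the geometric relation the method uses.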
- the method comprises synchronizing the first and the second camera system for synchronously recording the first and the second image data.
- The first and the second camera system, for example, communicate via radio signals to synchronize with each other.
- At least one of the first and the second aerial vehicle is an unmanned aerial vehicle.
- the method provides for checking whether fields of view of the first and the second camera system overlap by comparing image data of the first and the second camera system.
- the method further can comprise adjusting, if the fields of view do not overlap, a pose of the first and/or the second camera system.
- the image data of the first and the second camera system can be examined for similarities (e.g. features being present in the image data of the first and the second camera system).
- The method comprises receiving first scaled positional data of the first aerial vehicle and second scaled positional data of the second aerial vehicle using a satellite-based positioning system and/or a barometer-based positioning system and deriving a scaled absolute position of the first and the second aerial vehicle to each other based on the relative position, the first, and the second scaled positional data.
- The relative position of the first and the second aerial vehicle is a “non-dimensional” or “unscaled” measure.
- the first and the second positional data can be used as reference data to scale the relative position, i.e. to map the relative position to an absolute scale, to derive the scaled absolute position of the aerial vehicles.
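The scaling step can be illustrated as follows: the relative translation recovered from the essential matrix is only defined up to scale, so the metric baseline between the two GNSS positions can serve as the reference length. This is a sketch under the assumption that a single baseline measurement is used; a real system would fuse many measurements:

```python
import numpy as np

def scale_relative_position(t_unit, pos1_gnss, pos2_gnss):
    """Scale the (unit-norm) relative translation recovered from the
    essential matrix to metric units, using the GNSS baseline between
    the two vehicles as the reference length. Illustrative helper."""
    baseline = np.linalg.norm(np.asarray(pos2_gnss, dtype=float)
                              - np.asarray(pos1_gnss, dtype=float))
    return np.asarray(t_unit, dtype=float) * baseline
```

Note that the *direction* of the scaled result still comes from the image-based estimate, which is what provides the sub-m-accuracy; only the overall length is taken from the positional data.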
- the method is executed on the first or the second aerial vehicle.
- the first and/or the second aerial vehicle can be equipped with an apparatus configured to execute the above method. This enables stand-alone applications in aerial vehicles where the aerial vehicles, for example, do not communicate with an external data processing apparatus.
- the method is executed on an external server separate from the first and the second aerial vehicle.
- The first and the second camera system, for example, communicate the first and the second image data to the external server.
- the external server thus can determine the relative position according to the above method using the first and the second image data.
- the present disclosure relates to a computer program comprising instructions, which, when the computer program is executed by a processor, cause the processor to carry out the aforementioned method.
- the present disclosure relates to an apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other.
- the apparatus comprises at least one interface and a data processing circuitry.
- The (at least one) interface is configured to receive first image data from a first camera system attached to the first aerial vehicle and second image data from a second camera system attached to the second aerial vehicle, and the data processing circuitry is configured to determine a geometric relation of the first and the second image data and determine the relative position using the geometric relation of the first and the second image data.
- Fig. 1 shows a flowchart schematically illustrating a method for an image-based localization of aerial vehicles
- Fig. 2 shows a block diagram schematically illustrating an apparatus providing an image-based localization of aerial vehicles
- Fig. 3a and 3b illustrate a first application scenario of the method/apparatus
- Fig. 4 illustrates a second application scenario of the method/apparatus.
- A localization of aerial vehicles with sub-m-accuracy, i.e. an accuracy of less than one meter, is desired, for example, to avoid collisions between aerial vehicles operating in close proximity (e.g. within less than one meter of each other).
- satellite- or barometer-based positioning systems cannot provide sub-m-accuracy. In particular, those positioning systems may not be able to provide sub-m-accuracy in vertical direction.
- A basic idea of the present disclosure is an image-based positioning concept using image data from cameras attached to encountering aerial vehicles to locate the aerial vehicles with sub-m-accuracy.
- Fig. 1 shows a flowchart schematically illustrating a method 100 for an image-based localization of aerial vehicles.
- Method 100 comprises receiving 110 first image data of a first camera system attached to the first aerial vehicle and second image data of a second camera system attached to the second aerial vehicle.
- method 100 comprises determining 120 a geometric relation of the first and the second image data and determining 130 the relative position using the geometric relation of the first and the second image data.
- the first and the second camera system can be mounted to the first and the second aerial vehicle, such that their respective field of view points diagonally downwards in flight direction. Accordingly, when the first and the second aerial vehicle approach each other, the fields of view of the first and the second camera system may (at least partly) overlap with each other.
- the first and the second image data can include multiple features being present in the first and the second image data. Due to different perspectives of the camera systems, the features relate to different coordinates in a respective coordinate system of the first and the second image data.
- the respective coordinate system for example, refers to a location of the first and the second camera system, respectively. A comparison of the coordinates can deliver the geometric relation between the coordinate systems and thus the relative position of the aerial vehicles.
- method 100 allows a localization of the first and the second aerial vehicles with sub-m-accuracy in vertical and horizontal direction.
- Method 100 can be executed iteratively for tracking the relative position of aerial vehicles operating in close proximity (e.g. within less than one meter of each other), encountering, and/or passing each other.
- Fig. 2 shows a block diagram schematically illustrating an apparatus 200 providing an image-based localization of aerial vehicles. To this end, the apparatus 200 can execute method 100.
- the apparatus 200 comprises an interface 210 and a data processing circuitry 220.
- the interface 210 is configured to receive first image data from a first camera system attached to the first aerial vehicle and second image data from a second camera system attached to the second aerial vehicle.
- The interface 210, for example, receives the first and/or the second image data via a radio signal or a wired connection to the first and the second camera system, respectively.
- the interface 210 is coupled to the data processing circuitry 220 to provide the data processing circuitry 220 with the first and the second image data.
- the data processing circuitry 220 is configured to determine the geometric relation of the first and the second image data and determine the relative position using the geometric relation of the first and the second image data, as stated above with reference to method 100.
- Fig. 3a and Fig. 3b illustrate a first application scenario where a first unmanned aerial vehicle (UAV) 310 and a second UAV 320 encounter each other.
- the first and the second UAV 310 and 320 are equipped with a first camera system 330 and a second camera system 340, respectively.
- both the first and the second UAV 310 and 320 are further equipped with a communication system (not shown).
- UAV 310 can use its communication system to communicate its position to UAV 320.
- UAV 320 can communicate its position to UAV 310.
- Both UAV 310 and 320 can detect their mutual presence in close proximity (e.g. in a range of up to 50 m, depending on the range of the communication system).
- the positions can be determined using respective barometer- or satellite-based positioning systems on board the UAV 310 and 320, respectively.
- The satellite-based positioning system of UAV 310 and 320, for example, communicates with multiple satellites 360 to measure the position of UAV 310 and 320, respectively.
- However, the accuracy of such positioning systems in vertical direction may not be sufficient (e.g. not better than one meter) to operate UAV 310 and 320 at a vertical distance of less than two meters from each other.
- method 100 can be applied for a more accurate localization than possible with the satellite- or barometer-based positioning system.
- at least one of UAV 310 and 320 is equipped with the apparatus 200.
- UAV 310 and UAV 320 are further equipped with a camera system 330 and 340, respectively, to record first and second image data of their environment.
- For recording the first and the second image data synchronously, the camera systems 330 and 340, in step 106, synchronize with each other and communicate, in step 108, a shutter time to record the first and the second image data.
- actual shutter times of UAV 310 and 320 may differ from each other and the communicated shutter time. Errors or uncertainties of the resulting relative position, which are induced by differences of the actual shutter times, can be compensated based on the UAVs' velocities determined using satellite-based (e.g. GNSS) or inertial sensors.
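A first-order version of this compensation might look as follows, assuming constant velocity over the shutter-time offset; the function and variable names are illustrative, not part of the disclosure:

```python
def correct_for_shutter_offset(rel_pos, v2, dt):
    """First-order compensation of a shutter-time offset: if camera 2
    triggered dt seconds late, the measured relative position x2 - x1
    contains the extra motion v2 * dt of vehicle 2, which is subtracted
    to refer both images to the same instant (constant velocity assumed)."""
    return [p - v * dt for p, v in zip(rel_pos, v2)]
```

With the velocities taken from GNSS or inertial sensors, this reduces the position error induced by the differing actual shutter times.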
- The UAVs 310 and 320 communicate a direction where to steer the respective field of view 332/342 such that the fields of view 332 and 342 have a (maximal expected) overlap.
- the UAVs 310 and 320 can use their positions determined by the satellite-based positioning system to derive those directions.
- The camera systems 330 and 340, i.e. their respective fields of view 332/342, point diagonally downwards such that the fields of view 332 and 342 partly overlap in an area 350.
- area 350 is assumed to be planar in the present application scenario.
- image data of the first and the second camera system can be compared to check whether their fields of view overlap and the camera systems can be adjusted or realigned if the fields of view do not overlap.
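One way to realize such a check is to count descriptor correspondences between the two images and compare the count against the minimum needed by the estimator (eight points for the eight-point algorithm). Both thresholds below are illustrative assumptions:

```python
import numpy as np

def fields_of_view_overlap(desc1, desc2, min_matches=8, max_dist=0.1):
    """Heuristic overlap check: the fields of view are taken to overlap if
    at least min_matches descriptors from image 1 have a close counterpart
    in image 2 (eight being the minimum required by the eight-point
    algorithm). min_matches and max_dist are illustrative values."""
    desc2 = np.asarray(desc2, dtype=float)
    count = 0
    for d in np.asarray(desc1, dtype=float):
        # Does this descriptor have a near neighbour in the other image?
        if np.min(np.linalg.norm(desc2 - d, axis=1)) < max_dist:
            count += 1
    return count >= min_matches
```

If the check fails, the camera poses can be adjusted and the check repeated on fresh image data.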
- In step 109, the camera systems 330 and 340 synchronously record first and second image data.
- UAV 320 is equipped with the apparatus 200 comprising the interface 210 for receiving 110 the first and the second image data from the camera systems 330 and 340 and the data processing circuitry 220 for determining 120 the geometric relation of the first and the second image data and determining 130 the relative position using the geometric relation.
- the data processing circuitry 220 can identify a plurality of features present in both the first and the second image data using computer vision.
- The features, for example, relate to multiple objects in area 350.
- Further data processing of the data processing circuitry 220 includes determining 120 first coordinates of the features in the first image data and second coordinates of the features in the second image data and determining 130 the geometric relation between the first and the second image data using the first and the second coordinates.
- the data processing circuitry 220 uses the first and the second coordinates as input to the eight-point algorithm providing the relative position of the UAVs 310 and 320 in a common coordinate system 370.
- The skilled person having the benefit of the present disclosure will appreciate that other approaches for determining the relative position using computer vision or machine learning can be used, e.g. other machine learning algorithms.
- In step 140, the UAVs 310 and 320 communicate their relative position, for example, to initiate and coordinate an evasive maneuver, if necessary.
- the relative position can be indicative of a “scale-free” relative altitude and relative horizontal position of the UAVs 310 and 320 to each other.
- the relative altitude or horizontal position is sufficient to determine the evasive maneuver.
- The evasive maneuver, for example, provides for opposed movements of the UAVs 310 and 320 in vertical direction (e.g. UAV 310 rises by 20 cm and UAV 320 sinks by 20 cm).
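The opposed vertical movements can be sketched as splitting the missing vertical separation equally between the two UAVs. This is a simplified illustration, not the disclosed control law:

```python
def opposed_vertical_maneuver(required_sep_m, current_sep_m):
    """Split the vertical-separation deficit equally between the two UAVs:
    one climbs by half the deficit, the other descends by the same amount.
    Returns (climb, descend) offsets in meters."""
    deficit = max(0.0, required_sep_m - current_sep_m)
    return deficit / 2.0, -deficit / 2.0
```

For a required separation of 0.4 m with no current vertical separation, this yields a 0.2 m rise for one UAV and a 0.2 m sink for the other.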
- the apparatus 200 derives a “scaled” absolute position of the first and the second aerial vehicle to each other from the relative position using scaled positional data (e.g. longitude and latitude) of the UAVs 310 and 320 as reference data.
- the satellite-based positioning system provides the scaled positional data.
- the above method 100 and apparatus 200 allow a more accurate localization of the UAVs 310 and 320 than satellite- or barometer-based positioning systems.
- method 100 and the apparatus 200 allow a more accurate localization of the UAVs 310 and 320 in vertical direction than satellite- or barometer-based positioning systems. In turn, this allows the operation of more UAVs in a given volume of airspace and the determination of more efficient trajectories of the UAVs.
- Fig. 4 illustrates a second application scenario where fields of view 332a and 342a of the camera systems 330 and 340 are blocked by a building 380 such that the fields of view 332a and 342a have no overlap.
- The data processing circuitry 220 detects that the fields of view have no overlap by comparing image data of the camera systems 330 and 340.
- The UAVs 310 and 320 can realign the camera systems 330 and 340 (e.g. using actuators) such that their adjusted fields of view 332b and 342b overlap with each other in the area 350’. Subsequently, the apparatus 200 can determine their relative position in accordance with the above concept using the adjusted fields of view 332b and 342b.
- a method for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other comprising: receiving first image data of a first camera system attached to the first aerial vehicle and second image data of a second camera system attached to the second aerial vehicle; determining a geometric relation of the first and the second image data; and determining the relative position using the geometric relation of the first and the second image data.
- determining the relative position comprises: identifying a plurality of features present in both the first and the second image data using computer vision; determining first coordinates of the features in the first image data and second coordinates of the features in the second image data; and determining the geometric relation of the first and the second image data using the first and the second coordinates.
- Method of (1) or (2) wherein the method comprises synchronizing the first and the second camera system for synchronously recording the first and the second image data.
- Method of any one of (1) to (3) wherein at least one of the first and the second aerial vehicle is an unmanned aerial vehicle.
- Method of any one of (1) to (4), the method comprising: checking whether fields of view of the first and the second camera system overlap by comparing image data of the first and the second camera system; and adjusting, if the fields of view do not overlap, a pose of the first and/or the second camera system.
- Method of any one of (1) to (6) wherein the method comprises: receiving first scaled positional data of the first aerial vehicle and second scaled positional data of the second aerial vehicle using a satellite-based navigation system; and deriving a scaled absolute position of the first and the second aerial vehicle to each other based on the relative position, the first, and the second scaled positional data.
- a computer program comprising instructions, which, when the computer program is executed by a processor, cause the processor to carry out the method of any one of (1) to (9).
- An apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other comprising: at least one interface configured to receive first image data from a first camera system attached to the first aerial vehicle and second image data from a second camera system attached to the second aerial vehicle; and a data processing circuitry configured to determine a geometric relation of the first and the second image data; and determine the relative position using the geometric relation of the first and the second image data.
- Examples may further be or relate to a (computer) program including a program code to execute one or more of the above methods when the program is executed on a computer, processor, or other programmable hardware component.
- steps, operations, or processes of different ones of the methods described above may also be executed by programmed computers, processors, or other programmable hardware components.
- Examples may also cover program storage devices, such as digital data storage media, which are machine-, processor- or computer-readable and encode and/or contain machine-executable, processor-executable or computer-executable programs and instructions.
- Program storage devices may include or be digital storage devices, magnetic storage media such as magnetic disks and magnetic tapes, hard disk drives, or optically readable digital data storage media, for example.
- Other examples may also include computers, processors, control units, (field) programmable logic arrays ((F)PLAs), (field) programmable gate arrays ((F)PGAs), graphics processor units (GPU), application-specific integrated circuits (ASICs), integrated circuits (ICs) or system-on-a-chip (SoCs) systems programmed to execute the steps of the methods described above.
- aspects described in relation to a device or system should also be understood as a description of the corresponding method.
- a block, device or functional aspect of the device or system may correspond to a feature, such as a method step, of the corresponding method.
- aspects described in relation to a method shall also be understood as a description of a corresponding block, a corresponding element, a property or a functional feature of a corresponding device or a corresponding system.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Automation & Control Theory (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Theoretical Computer Science (AREA)
- Computer Networks & Wireless Communication (AREA)
- Traffic Control Systems (AREA)
- Navigation (AREA)
Abstract
The present disclosure relates to a method for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other. The method comprises receiving first image data of a first camera system attached to the first aerial vehicle and second image data of a second camera system attached to the second aerial vehicle. Further, the method provides for determining the relative position using a geometric relation of the first and the second image data.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/034,381 US20230401741A1 (en) | 2020-11-05 | 2021-11-04 | Method, computer program, and apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other |
EP21805508.5A EP4241043A1 (fr) | 2020-11-05 | 2021-11-04 | Procédé, programme informatique et appareil pour déterminer une position relative d'un premier véhicule aérien et d'au moins un second véhicule aérien l'un par rapport à l'autre |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20206023.2 | 2020-11-05 | ||
EP20206023 | 2020-11-05 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022096576A1 true WO2022096576A1 (fr) | 2022-05-12 |
Family
ID=73138760
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2021/080650 WO2022096576A1 (fr) | 2020-11-05 | 2021-11-04 | Procédé, programme informatique et appareil pour déterminer une position relative d'un premier véhicule aérien et d'au moins un second véhicule aérien l'un par rapport à l'autre |
Country Status (3)
Country | Link |
---|---|
US (1) | US20230401741A1 (fr) |
EP (1) | EP4241043A1 (fr) |
WO (1) | WO2022096576A1 (fr) |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180213208A1 (en) * | 2017-01-25 | 2018-07-26 | Samsung Electronics Co., Ltd. | Method and apparatus for determining stereoscopic multimedia information |
WO2020040679A1 (fr) * | 2018-08-22 | 2020-02-27 | I-Conic Vision Ab | Procédé et système correspondant pour générer des modèles à base de vidéo d'une cible telle qu'un événement dynamique |
2021
- 2021-11-04 US US18/034,381 patent/US20230401741A1/en active Pending
- 2021-11-04 WO PCT/EP2021/080650 patent/WO2022096576A1/fr unknown
- 2021-11-04 EP EP21805508.5A patent/EP4241043A1/fr active Pending
Also Published As
Publication number | Publication date |
---|---|
EP4241043A1 (fr) | 2023-09-13 |
US20230401741A1 (en) | 2023-12-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10788830B2 (en) | Systems and methods for determining a vehicle position | |
US11112252B2 (en) | Sensor fusion for accurate localization | |
US10515458B1 (en) | Image-matching navigation method and apparatus for aerial vehicles | |
CN110927708B (zh) | Calibration method, apparatus, and device for an intelligent roadside unit | |
KR20200044420A (ko) | Position estimation method and apparatus | |
US7408629B2 (en) | Passive measurement of terrain parameters | |
Hosseinpoor et al. | Pricise target geolocation and tracking based on UAV video imagery | |
US20110282580A1 (en) | Method of image based navigation for precision guidance and landing | |
Dumble et al. | Efficient terrain-aided visual horizon based attitude estimation and localization | |
Kinnari et al. | GNSS-denied geolocalization of UAVs by visual matching of onboard camera images with orthophotos | |
Ruchanurucks et al. | Automatic landing assist system using IMU+PnP for robust positioning of fixed-wing UAVs | |
Bhamidipati et al. | Integrity monitoring of Graph-SLAM using GPS and fish-eye camera | |
Andert et al. | On the safe navigation problem for unmanned aircraft: Visual odometry and alignment optimizations for UAV positioning | |
CN114119752A (zh) | Robot localization method bridging indoor and outdoor environments based on GNSS and vision | |
Eitner et al. | Development of a navigation solution for an image aided automatic landing system | |
US20230401741A1 (en) | Method, computer program, and apparatus for determining a relative position of a first aerial vehicle and at least one second aerial vehicle to each other | |
Gu et al. | SLAM with 3dimensional-GNSS | |
EP3964863A1 (fr) | Method and device for estimating a movement state | |
Trusheim et al. | Cooperative localisation using image sensors in a dynamic traffic scenario | |
Grishin | Precision estimation of camera position measurement based on docking marker observation | |
Cho et al. | Analysis in long-range apriltag pose estimation and error modeling | |
Kurz et al. | Generation of Reference Vehicle Trajectories in real-world Situations using Aerial Imagery from a Helicopter | |
Krasuski et al. | Algorithms for improving the position determination of an UAV equipped with a single-frequency GPS receiver for low-altitude photogrammetry | |
Dill et al. | Integration of 3D and 2D imaging data for assured navigation in unknown environments | |
US20230115712A1 (en) | Estimation of Target Location and Sensor Misalignment Angles |
Legal Events
Code | Title | Description |
---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21805508; Country of ref document: EP; Kind code of ref document: A1 |
NENP | Non-entry into the national phase | Ref country code: DE |
ENP | Entry into the national phase | Ref document number: 2021805508; Country of ref document: EP; Effective date: 20230605 |