IL320776A - A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both - Google Patents
- Publication number
- IL320776A
- Authority
- IL
- Israel
- Prior art keywords
- marker
- component
- thermal
- vehicle
- relative
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/20—Image preprocessing
- G06V10/24—Aligning, centring, orientation detection or correction of the image
- G06V10/245—Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
Description
DESCRIPTION

A MULTIMODAL FIDUCIAL MARKER, A HETEROGENEOUS PERCEPTION APPARATUS AND A MULTIMODAL SYSTEM COMPRISING BOTH

TECHNICAL FIELD

[0001] The present disclosure relates generally to detection systems and methods. More particularly, the present application relates to systems, methods and devices for aiding unmanned or manned vehicles in detecting target locations.

BACKGROUND

[0002] Unmanned vehicles are autonomous robots that have been used in several applications due to their ability to maintain very stable navigation and to approach areas that are not easily accessible, resulting in the collection of high-quality data. Said vehicles may be of an aquatic, ground-based or air-based type, and their main application areas include several industries, such as cinematography, the military, agriculture and surveillance. Furthermore, several inspection tasks are being carried out by said robotic vehicles, representing a reduction of costs and of the total time consumed. The usual cycle of a robotic vehicle mission includes its departure from a base station, the execution of its task, and the return to the base station or to another target location.

[0003] An important feature that allows autonomous operation is the capability of the vehicle to precisely land/dock on a predefined target location. A crucial task for enabling the vehicle to successfully land/dock on the desired target location is its ability to autonomously detect and recognize said target location in real time. If the vehicle misses the target location, it can compromise the continuity of the mission, as well as damage the vehicle's equipment.

[0004] In the case of aerial unmanned vehicles, such as drones, the landing accuracy of 1-3 m of the traditional GPS-based method does not satisfy the high precision needed for several applications, which require centimetre-level accuracy. Moreover, the performance of traditional visual approaches does not feature the required robustness and precision to operate in challenging weather and lighting conditions. In the most extreme scenarios, these difficulties are equally felt by manned vehicles.

[0005] To achieve precise docking/landing of an unmanned or manned vehicle, it is necessary to accurately retrieve the relative pose between the vehicle and the detected target location. Several robotic applications resort to artificial markers to label positions of interest and to help an unmanned vehicle, or an automatic manoeuvre module of a manned vehicle, to retrieve its relative position. The most common types of artificial markers used include visual markers, infrared (IR) beacons, thermal markers and retroreflective markers. But, independently of the type of markers and of the sensors used, relative pose estimation requires awareness of the pose of the markers in a fixed coordinate frame. Usually, knowledge of the real size of the marker is also necessary for the scale factor. With this information, combined with the correct association between the detected marker and its real pose, it is possible to estimate the relative distance from the vehicle to the target location.

[0006] When a calibrated camera sensor is used, the task of calculating its position from a marker (3D reference points) and its projection onto the image plane (2D projection points) is known as the Perspective-n-Point (PnP) problem. There are different solutions able to solve this problem, whose performance depends on the type of markers used. Each of these methods allows estimating the transformation from the camera frame to the marker frame. The pose of the vehicle is then calculated by knowing the transformation from the camera frame to the vehicle frame, and from the marker frame to the target location frame. Unlike a camera sensor, a range sensor, such as a 3D LiDAR (Light Detection And Ranging), can directly perceive the depth of the surrounding environment. This characteristic makes it easy to calculate the relative pose between the vehicle and a marker, which can be detected by analysing the LiDAR's point cloud.
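The chain of frame transformations outlined above can be sketched as follows. This is an illustrative outline only, not part of the disclosed apparatus: the frame names, numeric offsets and helper functions are all hypothetical, and 4x4 homogeneous transforms are used, where T_a_b maps coordinates expressed in frame b to frame a.

```python
# Hypothetical sketch: composing the PnP result with known calibration
# transforms to obtain the vehicle pose relative to the target location.

def mat_mul(A, B):
    """Multiply two 4x4 matrices given as nested lists."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def rigid_inverse(T):
    """Invert a rigid transform: R -> R^T, t -> -R^T t."""
    R = [row[:3] for row in T[:3]]
    t = [T[i][3] for i in range(3)]
    Rt = [[R[j][i] for j in range(3)] for i in range(3)]
    t_inv = [-sum(Rt[i][j] * t[j] for j in range(3)) for i in range(3)]
    return [Rt[0] + [t_inv[0]], Rt[1] + [t_inv[1]],
            Rt[2] + [t_inv[2]], [0, 0, 0, 1]]

def translation(x, y, z):
    """Pure-translation homogeneous transform (identity rotation)."""
    return [[1, 0, 0, x], [0, 1, 0, y], [0, 0, 1, z], [0, 0, 0, 1]]

# PnP gives the marker pose in the camera frame; calibration gives the
# camera pose in the vehicle frame; the marker pose in the target frame
# is known. All values below are made up for illustration.
T_cam_marker    = translation(0.0, 0.0, 2.0)   # marker 2 m in front of camera
T_vehicle_cam   = translation(0.1, 0.0, 0.0)   # camera offset on the vehicle
T_target_marker = translation(0.0, 0.0, 0.0)   # marker placed at the target

# Vehicle pose relative to the target location:
T_target_vehicle = mat_mul(T_target_marker,
                           mat_mul(rigid_inverse(T_cam_marker),
                                   rigid_inverse(T_vehicle_cam)))
```

With these toy values the vehicle sits 2 m behind the marker along the camera axis, which the resulting translation column reflects.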
[0007] Many systems resort to visual information to identify visual markers for a multiplicity of tasks, including the detection of docking/landing areas. Several studies have been performed in the area of computer vision that are able to increase the docking/landing accuracy. The most common approaches use monochromatic markers placed on the docking/landing target. Each marker presents a layout that encodes a unique identification by a binary code. They are generally based on regular geometric shapes (such as squares), making any of them distinguishable from any other marker. In robotics, some of the most popular fiducial markers are the ARTag, AprilTag and ArUco. The quadrangular shape of these markers allows extracting the camera pose from their four corners, as long as the camera is properly calibrated. In summary, these markers are detected by firstly extracting the edges of an image collected by a visual camera, followed by filtering the contours that make up a polygon with four vertices, and finally by extracting the respective binary code. Circular-shaped markers, such as the CCTag and the STag, are also used in several robotics applications. Although there are optimization techniques to improve the detectability of these markers, their major drawback is that they are dependent on lighting and environment conditions, only performing efficiently indoors or in daylight with favourable weather.
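The final step of the detection pipeline described above can be sketched in Python. This is a minimal illustration, not any real tag family's codec: after the quadrangular contour has been found and rectified, each inner cell is sampled as black (0) or white (1) and the grid is read row-major into a binary identifier. The grid size and cell values below are invented for the example.

```python
# Hypothetical sketch of binary-code extraction from a rectified marker grid.

def decode_marker(cells):
    """cells: 2D list of 0/1 samples from the rectified marker interior.
    Reads the cells row-major as the bits of an integer ID."""
    bits = [bit for row in cells for bit in row]
    marker_id = 0
    for bit in bits:
        marker_id = (marker_id << 1) | bit
    return marker_id

# Illustrative 4x4 pattern (a real tag family adds border and parity cells):
grid = [
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 0, 0],
    [0, 1, 0, 1],
]
print(decode_marker(grid))  # row-major bits 1011001011000101 -> 45765
```

Real tag libraries additionally check a solid border, try the four rotations of the grid, and match the bits against a dictionary with error correction.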
[0008] Another type of marker used for the detection of target location areas consists in the utilization of Light-Emitting Diodes (LEDs). In this case, the vehicle must be equipped with a sensor that is able to capture the light emitted by the LEDs. The most common approaches propose the use of a beacon with IR LEDs and of an IR camera for the target location detection. The major drawback of these IR marker approaches is that they are only suitable for indoor operations, given their sensitivity to sunlight. Another disadvantage is related to the short range of the beams and to the limitation of the operation range that the angle of radiation implies, given that the tilting of the camera produces high localization errors.

[0009] Consequently, conventional solutions consist of unimodal methods with a reduced potential of generalization. These approaches are limited to controlled scenarios with favourable conditions.

[0010] In summary, for vehicles to operate autonomously and effectively, it is mandatory to endow them with precise target location abilities. The unmanned or manned vehicle has to be able to detect the target location and to perform the required manoeuvre to get there without compromising its own safety and the integrity of its surroundings.

[0011] However, current solutions do not present the required robustness and reliability for accurate detection and navigation to the target location in highly demanding scenarios, particularly due to their inadequacy to perform accordingly in challenging lighting and weather conditions, including day and night operations.

[0012] These facts are disclosed in order to illustrate the technical problem addressed by the present disclosure.
GENERAL DESCRIPTION

[0013] The present document discloses a multimodal fiducial marker for relative pose estimation comprising a first component and a second component arranged to provide a surface comprising a first section of the first component and a second section of the second component; a heat source arranged to heat the surface by thermal conduction through said components; the first section having a reflectivity index different from a reflectivity index of the second section; and the first component having a thermal conductivity coefficient different from the thermal conductivity coefficient of the second component; wherein said sections are arranged as a pattern of geometric shapes encoding data.

[0014] In an embodiment, in the marker according to the previous claim, the pattern of geometric shapes is a binary code, in particular a thermal light-retroreflective binary identification pattern.

[0015] In an embodiment, in the marker according to any of the previous claims, said reflectivity index is a visible-light reflectivity index.

[0016] In an embodiment, in the marker according to any of the previous claims, said surface is substantially non-reflective in the infrared-light spectrum.
BRIEF DESCRIPTION OF THE DRAWINGS

[0017] The following figures provide preferred embodiments for illustrating the disclosure and should not be seen as limiting the scope of the invention.

[0018] Figure 1: representation of an embodiment of the multimodal fiducial marker of the present application, where the numerical reference signs represent: 1 – marker's surface first section; 2 – marker's surface second section; 3 – heat source; 4 – multimodal fiducial marker.

[0019] Figure 2: representation of an embodiment of the heterogeneous perception apparatus of the multimodal system described in the present application, where the numerical reference signs represent: 5 – 3D-LiDAR unit; 6 – visual camera unit; 7 – thermal camera unit; 8 – heterogeneous perception apparatus.

[0020] Figure 3: illustrative example of the multimodal system described in the present application, where the numerical reference signs represent: 4 – multimodal fiducial marker; 8 – heterogeneous perception apparatus; 9 – unmanned vehicle (air-based type); 10 – target location.

[0021] Figure 4: flowchart depicting the position estimation procedure executed by the heterogeneous perception apparatus, in an embodiment of the multimodal system described in the present application, where the numerical reference signs represent: 5 – 3D-LiDAR unit; 6 – visual camera unit; 7 – thermal camera unit; 8 – heterogeneous perception apparatus; 8.1 – processor-based device; 9 – unmanned vehicle (air-based type); 10 – target location.

[0022] Figures 5A, 5B: thermal image tests comparing an acrylic marker with the multimodal fiducial marker, where the numerical reference signs represent: 4 – multimodal fiducial marker; 503 – acrylic marker; 505 – a hot spot.

[0023] Figure 6: thermal image tests for testing the IR reflection, where the numerical reference sign represents: 601 – an IR reflection.

[0024] Figure 7: illustrative acrylic marker and the multimodal fiducial marker, where the numerical reference signs represent: 4 – multimodal fiducial marker; 503 – acrylic marker.
DETAILED DESCRIPTION

[0025] It is therefore an object of the present application a multimodal system to aid unmanned or manned vehicles in accurately detecting and navigating to target locations.

[0026] Such a system therefore provides the ability to operate under severe environment and light conditions (such as at different heights, with intense sunlight, in no-light or dark environments, thus including operations subject to rain and fog), by means of a multimodal approach that resorts to photometric and radiometric data, in order to perform a robust, redundant and reliable detection of the vehicle's target location.

[0027] For that purpose, the system includes at least one multimodal fiducial marker and a heterogeneous perception apparatus to be coupled to the vehicle. The multimodal fiducial marker can be detected and localized through the analysis of visual, thermal and point cloud data. This is an active marker that is able to improve the relative localization of vehicles, which is very relevant for navigational manoeuvres, especially in unmanned vehicles. The heterogeneous perception apparatus, in its turn, collects both photometric and radiometric data, resorting to cameras and range sensors. The data is fused and combined by means of a particular method also described in the present application.

[0028] Thus, the multimodal system of the present application improves the situational awareness of the vehicles, increasing their detection capabilities and navigational abilities. In an advantageous configuration, the system comprises:
- at least one multimodal fiducial marker as described in the present application, each marker positioned at a target location; and
- at least one heterogeneous perception apparatus as described in the present application.
[0029] The multimodal fiducial marker of the present application is adapted to create a unique thermal retroreflective binary identification pattern. For that purpose, in an advantageous configuration, the marker comprises:
- a surface of a defined geometry, having a first section of a first component and at least a second section of a second component, such sections being arranged in order to form the marker's layout, which is configured to encode a unique identification pattern in binary code, each of the first and the second components being of a different binary colour;
- the first component having a reflectivity index different from the reflectivity index of the second component; and
- a heat source operable to heat the marker's surface; and wherein the first and the second components have different thermal conductivity coefficients.

[0030] The heterogeneous perception apparatus of the present application is adapted to be coupled to a vehicle and configured to detect the multimodal fiducial marker described in the present application, which is positioned in a vehicle's target location. In an advantageous configuration, the apparatus comprises:
- a visual camera unit configured to collect image data in the visible light spectrum of a target location area and to estimate the marker's pose relative to the visual camera unit's coordinate system;
- a thermal camera unit configured to collect both thermal and radiometric data of the target location area and to estimate the marker's pose relative to the thermal camera unit's coordinate system;
- a 3D-LiDAR unit configured to collect range and radiometric data from the target location area and to estimate the marker's pose relative to the 3D-LiDAR unit's coordinate system;
- a processor-based device: programmed to process the marker's pose estimation data obtained by the visual, thermal and 3D-LiDAR units, to determine a relative pose estimation between the apparatus and the marker, and to determine the location of the marker, the relative positioning between the visual and thermal camera units and the 3D-LiDAR unit being known; and operable to transmit said location information to the vehicle.
[0031] It is also an object of the present application a method for detecting multimodal fiducial markers using the heterogeneous perception apparatus coupled to a vehicle. The method comprises the following steps:
i. scanning a target location area using the heterogeneous perception apparatus to obtain target location data;
ii. identifying a marker from the target location data;
iii. collecting image, thermal, radiometric and range data from the marker using the sensor units of the apparatus;
iv. estimating the marker's pose relative to the coordinate system of each sensor unit of the apparatus;
v. determining the relative pose estimation between the apparatus and the marker based on the estimated poses of the marker obtained in iv., and determining the corresponding location of the marker;
vi. transmitting the location information of the marker to the vehicle.
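The steps above can be sketched as a control-flow skeleton. This is purely illustrative and not the disclosed implementation: the function names, the canned sensor readings and the simple averaging stand-in for step v are all hypothetical placeholders for the real units and fusion method.

```python
# Hypothetical skeleton of the detection method (steps i-vi).

def detect_and_localize(scan, fuse, transmit):
    """scan: returns {sensor_name: (x, y, z) or None}  (steps i-iv)
    fuse: combines per-sensor poses into one estimate  (step v)
    transmit: hands the fused location to the vehicle  (step vi)"""
    per_sensor_poses = scan()
    detections = {name: pose for name, pose in per_sensor_poses.items()
                  if pose is not None}
    if not detections:
        return None                      # no unit saw the marker
    fused = fuse(detections)
    transmit(fused)
    return fused

# Stand-in readings: the LiDAR misses the marker in this toy scenario.
readings = {"visual": (1.0, 0.0, 4.0), "thermal": (1.1, 0.1, 4.2), "lidar": None}
sent = []
pose = detect_and_localize(
    scan=lambda: readings,
    fuse=lambda d: tuple(sum(p[i] for p in d.values()) / len(d)
                         for i in range(3)),   # plain mean as a placeholder
    transmit=sent.append,
)
```

In the disclosed system, step v would use the weighted-average fusion described later in the detailed description rather than the plain mean used here.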
[0032] Figure 1 shows a representation of an embodiment of the multimodal fiducial marker of the present application.

[0033] In an object of the present application, a multimodal fiducial marker (4) for relative pose estimation of a vehicle is described. The marker (4) is operational for any type of vehicle (9), whether manned or unmanned, of an aquatic, ground-based or air-based type.

[0034] The marker comprises a surface of a defined geometry, that is, a geometry that is pre-defined and is recognizable by a perception apparatus and by the relative pose detection and estimation algorithms. As an example, the marker (4) has a quadrangular shape with dimensions 0.22 x 0.22 x 0.02 meters, being detectable by first extracting the edges of an image collected by the apparatus' visual camera, followed by filtering the contours that make up a polygon with four vertices. Circular-shaped markers may also be used.

[0035] The marker's surface has a first section (1) of a first component and at least a second section (2) of a second component, such sections and respective components being of a different binary colour and arranged in such a way as to form a marker-specific layout that is able to encode a unique identification pattern in binary code. Additionally, the first component has a reflectivity index different from the reflectivity index of the second component.

[0036] The marker (4) also comprises a heat source (3) operable to heat the marker's surface, and the first and the second components have different thermal conductivity coefficients.

[0037] In view of comprising such a set of technical features, the marker (4) is an active marker that is adapted to create a unique thermal retroreflective binary identification pattern, allowing it to be detectable and localizable through the analysis of visual, thermal and point cloud data. This active marker improves the relative localization of vehicles (9) endowed with the heterogeneous perception apparatus (8) of the present application, in particular for precise landing or docking manoeuvres (depending on the type of vehicle (9)).

[0038] In an alternative embodiment of the marker (4) of the present application, its surface's geometry is a planar geometry. More particularly, the first section (1) and at least the second section (2) are arranged on the two-dimensional surface of the marker (4). Consequently, the marker (4) is adapted to create a two-dimensional thermal retroreflective binary identification pattern.

[0039] Alternatively, the marker's geometry is a spatial geometry. More particularly, the first and at least the second sections (1, 2) are arranged as a multiplicity of differently shaped elements. Consequently, the marker (4) is adapted to create a three-dimensional thermal retroreflective binary identification pattern.

[0040] In another alternative embodiment of the marker (4), the binary code used to encode the marker's unique identification pattern can be any one of a library of binary codes, such as an ArUco, an AprilTag or an ARTag code.

[0041] In another alternative embodiment of the marker (4), the first component is of a white colour and the second component is of a black colour. Optionally, the first component is of a blue colour and the second component is of a red colour. Other combinations of colours for the first and second components are presented by way of example: green/blue, yellow/brown or light gray/dark gray.
[0042] In another embodiment of the marker (4), the first component has a reflectivity index of at least 70% and the second component is of a non-retroreflective type. Alternatively, the first component comprises a layer of retroreflective material having a reflectivity index of at least 70%, said layer of retroreflective material being applied on top of at least a first component's material. Said first component's material may have a thermal conductivity of at least 88 W·m⁻¹·K⁻¹. In respect of the second component, it comprises at least one material of a non-retroreflective type, said material having a maximum thermal conductivity of 0.38 W·m⁻¹·K⁻¹. Optionally, the first material is aluminium and the second material is cork.

[0043] Finally, in another embodiment of the marker (4), the heat source (3) is operable to heat the marker's surface to a temperature of at least 100 °C. The heat source (3) may be an electrically heated bed powered by a mains plug or by a battery unit.

[0044] In this way, combining all the technical characteristics related to the binary coding, provided by components of different colours, and to the retroreflectivity and thermal properties, it is possible to achieve a synergistic effect allowing this single marker (4) to be detected and localized in adverse environments, representing a compact and easy-to-use solution for multiple applications.
[0045] In an object of the present application, a heterogeneous perception apparatus (8) is described, to be coupled to a vehicle (9) and configured to detect the multimodal fiducial marker (4) already described, said marker (4) being positioned in a vehicle's target location (10).

[0046] The apparatus (8) is adapted to perceive multimodal information, to allow the vehicle (9) to which it is coupled to successfully land/dock in adverse environments. Moreover, the apparatus is designed for the severe offshore environment and obtains both photometric and radiometric data, such as visual, thermal and point cloud information.

[0047] The apparatus comprises:
- a visual camera unit (6) configured to collect image data in the visible light spectrum of a target location area and to estimate the marker's pose relative to the visual camera unit's coordinate system;
- a thermal camera unit (7) configured to collect both thermal and radiometric data of the target location area and to estimate the marker's pose relative to the thermal camera unit's coordinate system;
- a 3D-LiDAR unit (5) configured to collect range and radiometric data from the target location area and to estimate the marker's pose relative to the 3D-LiDAR unit's coordinate system;
- a processor-based device (8.1): programmed to process the marker's pose estimation data obtained by the visual, thermal and 3D-LiDAR units (6, 7, 5), to determine a relative pose estimation between the apparatus (8) and the marker (4), and to determine the location (10) of the marker (4), the relative positioning between the visual and thermal camera units (6, 7) and the 3D-LiDAR unit (5) being known; and operable to transmit said location information to the vehicle (9).

[0048] Since the visual camera (6) collects images in the visible light spectrum and the thermal camera (7) gathers both thermal and radiometric information of the scene, the apparatus represents a more robust sensory approach that does not depend on the lighting conditions. On the other hand, the 3D LiDAR (5) directly acquires range data from the surrounding environment using laser beams, represented as point clouds. While the camera sensors (6, 7) collect denser data for close-range procedures, the 3D LiDAR (5) has a larger field of view and range, suitable for long-range operations. Thus, the apparatus (8) not only allows collecting multimodal and complementary information about a target location, but also, being coupled to the vehicle (9), plays an important and significant role in navigation manoeuvres, increasing the situational awareness of a scenario of operation and contributing to a safer operation of the vehicle (9).
[0049] The pose estimation given by each sensor (5, 6, 7) needs to be fused to output a single, redundant localization of the detected marker. For that purpose, a weighted average is applied, which ensures a short processing time and increases the computational efficiency on an embedded system, guaranteeing real-time detection and relative pose estimation. Particularly, and in another embodiment of the apparatus (8), the processor-based device (8.1) is programmed to determine the relative pose estimation between the apparatus (8) and the marker (4), $P_{rel}$, based on the following method:

$$P_{rel,j} = \frac{\sum_{i} \lambda_{i}\,\omega_{i,j}\,P_{i,j}}{\sum_{i} \lambda_{i}\,\omega_{i,j}}, \qquad i \in \{V, T, L\},\; j \in \{x, y, z\},$$

$$\lambda_{i} \in \{0, 1\}, \qquad \omega_{i,j} \in [0, 1], \qquad \sum_{i} \omega_{i,j} = 1 \;\;\forall j,$$

wherein:
$P_{rel} = (P_{rel,x}, P_{rel,y}, P_{rel,z})$ is the position estimation of the apparatus (8) in relation to the marker (4);
$P_{i} = (x_{i}, y_{i}, z_{i})$ is the estimation of the relative position of the marker (4) given by the visual camera unit ($V$), the thermal camera unit ($T$) and the 3D-LiDAR unit ($L$);
$\lambda_{i}$ is a Boolean variable that equals 1 when its corresponding sensor detects the marker (4) and equals 0 otherwise; and
$\omega_{i,j}$ is a dynamic weight, calculated according to the following expression:

$$\omega_{i,j} = \bar{e}_{i,j} + \sigma_{i,j}, \qquad i \in \{V, T, L\},\; j \in \{x, y, z\},$$

wherein $\bar{e}$ represents the mean error and $\sigma$ the standard deviation.
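The weighted average described above can be sketched directly in Python. The sensor readings, detection flags and weight values below are illustrative only; the weights are assumed to have already been computed from each sensor's characterized error and normalized per axis, as the constraint requires.

```python
# Illustrative implementation of the per-axis weighted-average fusion:
# i ranges over the sensors {V, T, L}, j over the axes {x, y, z}.

def fuse_poses(estimates, detected, weights):
    """estimates[i] = (x, y, z) from sensor i; detected[i] in {0, 1};
    weights[i] = per-axis weights. Returns the fused (x, y, z)."""
    fused = []
    for j in range(3):  # axes x, y, z
        num = sum(detected[i] * weights[i][j] * estimates[i][j]
                  for i in estimates)
        den = sum(detected[i] * weights[i][j] for i in estimates)
        fused.append(num / den)
    return tuple(fused)

# Made-up readings: the LiDAR (L) did not detect the marker, so its
# detection flag is 0 and it drops out of both sums.
estimates = {"V": (1.0, 2.0, 10.0), "T": (1.2, 2.2, 10.4), "L": (0.8, 1.8, 9.6)}
detected  = {"V": 1, "T": 1, "L": 0}
weights   = {"V": (0.5, 0.5, 0.5), "T": (0.5, 0.5, 0.5), "L": (0.0, 0.0, 0.0)}
print(fuse_poses(estimates, detected, weights))  # approx. (1.1, 2.1, 10.2)
```

Because the detection flag zeroes out a missing sensor in both the numerator and the denominator, the fused estimate degrades gracefully when only a subset of the three units sees the marker.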
[0050] In an object of the present application, a method for detecting multimodal fiducial markers (4) is described, said markers (4) being positioned in a target location (10), using a heterogeneous perception apparatus (8) coupled to a vehicle (9). The method comprises:
i. scanning a target location area using the heterogeneous perception apparatus (8) to obtain target location data;
ii. identifying a marker (4) from the target location data;
iii. collecting image, thermal, radiometric and range data from the marker (4) using the sensor units (5, 6, 7) of the apparatus (8);
iv. estimating the marker's pose relative to the coordinate system of each sensor unit (5, 6, 7) of the apparatus (8);
v. determining the relative pose estimation between the apparatus (8) and the marker (4) based on the estimated poses of the marker (4) obtained in iv., and determining the corresponding location (10) of the marker (4);
vi. transmitting the location information of the marker (4) to the vehicle (9).

[0051] The present application also describes a multimodal system comprising:
- at least one multimodal fiducial marker (4) as described in the present application, each marker (4) positioned at a target location (10); and
- at least one heterogeneous perception apparatus as described in the present application.

[0052] More particularly, the system comprises one or more vehicles (9), each vehicle (9) having one heterogeneous perception apparatus coupled to it, and the system being configured to operate according to the method for detecting multimodal fiducial markers described in the present application. The vehicle (9) may be of an unmanned or manned type and of an aquatic, ground-based or air-based type. Optionally, the vehicle (9) is a drone, a vessel or an automated guided vehicle.

[0053] The complementarity of the apparatus (8) and the marker (4) increases robustness and redundancy in target location detection and in precise landing/docking tasks, when compared with other standard and limited systems, especially in complex scenarios of operation, where variables such as altitude, lighting conditions and marker occlusions, caused by environment conditions, degrade the detection rate of state-of-the-art systems.

[0054] Based on the technical description made, below are presented, by way of example, several scenarios of application of the system, where the multimodal fiducial marker (4) and the heterogeneous perception apparatus (8) are used for relative pose estimation and manoeuvring of an unmanned or manned vehicle (9).
[0055] Drone landing:

[0056] The use of the marker (4) and the apparatus (8) for detecting and localizing landing areas allows a safe, accurate and reliable landing. This autonomous skill is valid for several rotating-wing drones (9) to which the apparatus (8) is to be coupled, both manned and unmanned (fully autonomous and remotely operated).

[0057] Package delivery (aerial):

[0058] The drone (9) has the apparatus (8) coupled to its structure, enabling the detection of the marker (4) placed in a specific target location (10), landing and delivery of the package. Another possibility consists of detecting the target (10) and dropping the package in the air (with or without a parachute).

[0059] Windfarm inspection:

[0060] The precise landing ability that the use of the system provides enables a drone (9), with an apparatus (8) coupled to its structure, to be resident in a windfarm (both onshore and offshore), which allows more frequent and broader inspections. In this case, the marker (4) is placed on the turbine structure itself, or on a platform suitable for the drone landing.

[0061] Docking vessels and/or ground vehicles:

[0062] The use of the system enables accurate relative localization of the docking station (10) for both surface vessels and ground vehicles (9), such as rovers and AGVs, assisting in the docking manoeuvre.
[0063] Tests were performed comparing an acrylic marker of a 4mm thick plate with thedisclosed multimodal fiducial marker for the same dimensions, code, and thermal bedas the heat source. id="p-64" id="p-64"
id="p-64"
[0064] It was observed that after 5:10 min the inside of the acrylic marker was getting warmer, but this did not seem to affect the detection; at around 5:35 min the corners of the acrylic marker were blistering/deforming with the increasing temperature; and at 6:30 min the acrylic marker was visibly blistering/deforming further, becoming quickly and easily deformable.
[0065] The thermal radiation reflection test was carried out using a soldering iron as a hot body (~450 °C), which was moved above the marker so that the cameras only picked up indirect/reflected radiation.
[0066] The effect of the hot body's reflection can be seen, noting that the detection rate worsens considerably over time. The effect would be more significant if the hot object were larger.
[0067] Performing the same reflection test on the disclosed multimodal fiducial marker, one could see the reflection of the soldering iron on the room floor in the thermal image, without any effect on detection.
[0068] In summary: over time the acrylic marker heats up and there is no longer the thermal contrast needed to detect the code; the acrylic marker has thermal-radiation-reflective properties, which allows "hot" artifacts to negatively influence the detection of the code; and the acrylic marker is not completely flat, which reduces the accuracy (and even the detection) of estimating the location of the marker.
[0069] More generally, there are several problems with acrylic fiducial markers, namely: they are not mechanically robust to heating, as they can warp, nor to atmospheric conditions in outdoor environments; although the acrylic acts as a filter at first, it continuously heats up, especially at the center of the acrylic marker, and therefore the temperature contrast tends to get significantly worse over time; furthermore, acrylic is not a visual marker and is not radiometric at the frequency of LiDARs.
[0070] The term "comprising" whenever used in this document is intended to indicate the presence of stated features, integers, steps, components, but not to preclude the presence or addition of one or more other features, integers, steps, components, or groups thereof.
[0071] The disclosure should not be seen in any way as restricted to the embodiments described, and a person with ordinary skill in the art will foresee many possibilities for modifications thereof. The above-described embodiments are combinable.
[0072] The following claims further set out particular embodiments of the disclosure.
Claims (18)
1. A multimodal fiducial marker (4) for relative pose estimation comprising a first component and a second component arranged to provide a surface comprising a first section (1) of the first component and a second section (2) of the second component;
a heat source (3) arranged to heat the surface by thermal conduction through said components;
the first section having a reflectivity index different from a reflectivity index of the second section; and
the first component having a thermal conductivity coefficient different from the thermal conductivity coefficient of the second component;
wherein said sections are arranged as a pattern of geometric shapes encoding data.
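As a concrete illustration of claim 1's encoding principle, the sketch below maps a binary identification code onto a grid of surface cells, each cell occupied by either the retroreflective, thermally conductive first component or the non-retroreflective, insulating second component. All names, the 4x4 code, and the material descriptions are illustrative assumptions, not the claimed design:

```python
# Illustrative sketch only (assumed names and values, not the patented design):
# claim 1 encodes data as a pattern of geometric shapes formed by two components
# with different reflectivity and thermal conductivity. Here each bit of a
# binary code selects which component occupies a cell of the marker surface.

MATERIALS = {
    1: "aluminium (retroreflective first component, high thermal conductivity)",
    0: "cork (non-retroreflective second component, low thermal conductivity)",
}

def marker_layout(code_bits, side):
    """Arrange a flat list of bits, row-major, into a side x side cell grid."""
    if len(code_bits) != side * side:
        raise ValueError("code length must fill the grid exactly")
    return [code_bits[r * side:(r + 1) * side] for r in range(side)]

def describe_cell(grid, row, col):
    """Report which component occupies a given cell of the marker surface."""
    return MATERIALS[grid[row][col]]

# A hypothetical 4x4 (16-bit) identification pattern.
bits = [1, 0, 1, 1,
        0, 1, 0, 0,
        1, 1, 0, 1,
        0, 0, 1, 0]
grid = marker_layout(bits, 4)
print(describe_cell(grid, 0, 0))  # cell (0, 0) holds the first component
```

Because the two components differ in both reflectivity and thermal conductivity, the same grid is readable as a visual pattern, a thermal pattern, and (via retroreflectivity) a LiDAR-intensity pattern.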
2. The marker (4) according to the previous claim, wherein the pattern of geometric shapes is a binary code, in particular a thermal light-retroreflective binary identification pattern.
3. The marker (4) according to any of the previous claims, wherein said reflectivity index is a visible-light reflectivity index.
4. The marker (4) according to any of the previous claims, wherein said surface is substantially non-reflective in the infrared-light spectrum.
5. The marker (4) according to claim 1, wherein the surface's geometry is a planar geometry, and wherein the first section (1) and at least the second section (2) are arranged on the two-dimensional surface of the marker (4), the marker (4) being adapted to create a two-dimensional thermal retroreflective binary identification pattern; or
wherein the surface's geometry is a spatial geometry, and wherein the first and at least the second sections (1, 2) are arranged as a multiplicity of different shapes, the marker (4) being adapted to create a three-dimensional thermal retroreflective binary identification pattern.
6. The marker (4) according to any of the previous claims, wherein the binary code used to encode the marker's unique identification pattern is an ArUco or an AprilTag or an ARTag code.
7. The marker (4) according to any of the previous claims, wherein the first component is of a white colour and the second component is of a black colour; optionally, the first component is of a blue colour and the second component is of a red colour.
8. The marker (4) according to any of the previous claims, wherein the first component has a reflectivity index of at least 70% and the second component is of a non-retroreflective type.
9. The marker (4) according to any of the previous claims 1 to 4, wherein the first component comprises a layer of retroreflective material having a reflectivity index of at least 70%, said layer of retroreflective material being applied on top of at least a first component's base material; and
the second component being comprised of at least one material of a non-retroreflective type.
10. The marker (4) according to claim 6, wherein the first component's material has a thermal conductivity of at least 88 W·m⁻¹·K⁻¹, and the second component's material has a maximum thermal conductivity of 0.38 W·m⁻¹·K⁻¹; optionally, the first material is aluminium and the second material is cork.
11. The marker (4) according to any of the previous claims, wherein the heat source (3) is operable to heat the marker's surface to at least a temperature of 100 °C; preferably the heat source (3) is an electric heated bed, the heated bed being powered by an electrical source plug or a battery unit.
12. A heterogeneous perception apparatus (8) adapted to be coupled to a Vehicle (9) and configured to detect a multimodal fiducial marker (4) according to any of the claims 1 to 8; said marker (4) being positioned in a target location (10); the apparatus (8) comprising:
a visual camera unit (6) configured to collect image data in a visible light spectrum of a target location area and to estimate the marker's pose relative to a visual camera unit's coordinate system;
a thermal camera unit (7) configured to collect both thermal and radiometric data of the target location area and to estimate the marker's pose relative to a thermal camera unit's coordinate system;
a 3D-LiDAR unit (5) configured to collect range and radiometric data from the target location area and to estimate the marker's pose relative to a 3D-LiDAR unit's coordinate system; and
a processor-based device (8.1):
programmed to process the marker's pose estimation data obtained by the visual, thermal and 3D-LiDAR units (6, 7, 5), to determine a relative pose estimation between the apparatus (8) and the marker (4), and to determine the location (10) of the marker (4); the relative positioning between the visual and thermal camera units (6, 7) and the 3D-LiDAR unit (5) being known; and
operable to transmit said location information to the Vehicle (9).
13. Apparatus according to claim 9, wherein the processor-based device (8.1) is programmed to determine the relative pose estimation between the apparatus (8) and the marker (4), x_M^A, based on the following method:

x_M^A = ( Σ_i λ_i · ω_{i,j} · x_i ) / ( Σ_i λ_i · ω_{i,j} ), ∀ j

with i ∈ {V, T, L}, j ∈ {x, y, z}, λ_i ∈ {0, 1}, ω_{i,j} ∈ [0, 1], and Σ_i ω_{i,j} = 1 ∀ j;

wherein
x_M^A = (x_M^A, y_M^A, z_M^A) is the position estimation of the apparatus (8) in relation to the marker (4);
x_i = (x_i, y_i, z_i) is the estimation of the relative position of the marker (4) given by the visual camera unit (V), the thermal camera unit (T) and the 3D-LiDAR unit (L);
λ_i is a Boolean variable that equals 1 when its corresponding sensor detects the marker (4) and equals 0 otherwise; and
ω_{i,j} is a dynamic weight, calculated according to the following expression:

ω_{i,j} = ē_{i,j} + σ_{i,j}, i ∈ {V, T, L}, j ∈ {x, y, z}

wherein ē represents the mean error and σ the standard deviation.
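Claim 13's per-axis weighted average can be sketched in a few lines. The following Python is an illustrative reading of the claimed formula, not the patented implementation; all sensor estimates, detection flags and weights are invented for the example (the per-axis weights sum to 1 across sensors, as the claim requires):

```python
# Illustrative reading of the weighted fusion in claim 13 (all numbers and the
# dict-based API are assumptions for this sketch). Each sensor i in {V, T, L}
# contributes its estimate x_i, gated by a detection flag lambda_i and scaled
# by a per-axis dynamic weight omega_{i,j}.

SENSORS = ("V", "T", "L")  # visual camera, thermal camera, 3D-LiDAR
AXES = ("x", "y", "z")

def fuse_pose(estimates, detected, weights):
    """Per-axis weighted average: sum(l_i * w_ij * x_ij) / sum(l_i * w_ij)."""
    fused = {}
    for j in AXES:
        num = sum(detected[i] * weights[i][j] * estimates[i][j] for i in SENSORS)
        den = sum(detected[i] * weights[i][j] for i in SENSORS)
        if den == 0:
            raise ValueError("no sensor detected the marker")
        fused[j] = num / den
    return fused

# Hypothetical scenario: the LiDAR misses the marker (lambda_L = 0), so the
# fused estimate blends only the visual and thermal readings; the LiDAR's
# stale values are gated out of both the numerator and the denominator.
est = {"V": {"x": 1.0, "y": 2.0, "z": 5.0},
       "T": {"x": 1.2, "y": 2.2, "z": 5.4},
       "L": {"x": 9.9, "y": 9.9, "z": 9.9}}
lam = {"V": 1, "T": 1, "L": 0}
w = {"V": dict.fromkeys(AXES, 0.4),
     "T": dict.fromkeys(AXES, 0.4),
     "L": dict.fromkeys(AXES, 0.2)}
pose = fuse_pose(est, lam, w)
print(pose)  # V and T contribute equally; L does not contribute
```

Dividing by the sum of the active weights renormalizes the average whenever a sensor drops out, which is what makes the Boolean gating λ_i safe.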
14. Method for detecting multimodal fiducial markers (4) according to any of the claims 1 to 8, said markers (4) being positioned in a target location (10), using a heterogeneous perception apparatus (8) according to claim 9 or 10, the apparatus (8) being coupled to a Vehicle (9); the method comprising:
scanning a target location area using the heterogeneous perception apparatus (8) to obtain target location data;
identifying a marker (4) from the target location data;
collecting image, thermal, radiometric and range data from the marker (4) using the sensor units (5, 6, 7) of the apparatus (8);
estimating the marker's pose relative to the coordinate system of each sensor unit (5, 6, 7) of the apparatus (8);
determining the relative pose estimation between the apparatus (8) and the marker (4) based on the estimated poses of the marker (4), and determining the corresponding location (10) of the marker (4); and
transmitting the location information of the marker (4) to the Vehicle (9).
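The sequence of steps in claim 14 can be outlined as a minimal loop with stubbed sensing. Every function name and returned number below is an invented illustration, not the patented implementation; a real apparatus would derive the per-sensor poses from its camera and LiDAR units and fuse them with the weighting of claim 13:

```python
# Outline of the claim-14 method: scan -> identify -> estimate per-sensor
# poses -> fuse -> transmit. All stubs and values are assumptions.

def scan_target_area():
    """Stub: heterogeneous scan returning raw multimodal target-location data."""
    return {"image": "...", "thermal": "...", "cloud": "..."}

def identify_marker(data):
    """Stub: report whether a fiducial pattern is present in the scan."""
    return data is not None

def estimate_per_sensor_poses(data):
    """Stub: marker position in each sensor's coordinate system (V, T, L)."""
    return {"V": (1.0, 2.0, 5.0), "T": (1.2, 2.2, 5.4), "L": (1.1, 2.1, 5.2)}

def fuse(poses):
    """Stub fusion: an unweighted per-axis mean stands in for claim 13's weights."""
    n = len(poses)
    return tuple(sum(p[j] for p in poses.values()) / n for j in range(3))

def detect_and_report(send_to_vehicle):
    """Scan, identify, estimate, fuse, and transmit the marker location."""
    data = scan_target_area()
    if not identify_marker(data):
        return None
    location = fuse(estimate_per_sensor_poses(data))
    send_to_vehicle(location)  # transmit the location to the Vehicle (9)
    return location

received = []
location = detect_and_report(received.append)
print(location)
```

The early return when no marker is identified mirrors the claim's ordering: pose estimation and transmission only happen after a marker (4) is found in the target location data.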
15. A multimodal system comprising:
at least one multimodal fiducial marker (4) according to any of the claims 1 to 8, each marker (4) positioned at a target location (10); and
at least one heterogeneous perception apparatus (8) according to claim 9 or 10.
16. The system according to claim 12, further comprising one or more Vehicles (9), each vehicle (9) having one heterogeneous perception apparatus (8) coupled thereto; the system being configured to operate according to the method of claim 11.
17. The system according to claim 13, wherein the Vehicle (9) is of an unmanned or manned type, and of an aquatic, ground-based or air-based type; optionally, the vehicle (9) is a drone or a vessel or an automated guided vehicle.
18. Use of a multimodal fiducial marker (4) of any of the claims 1 to 8 and the heterogeneous perception apparatus (8) of claims 9 or 10, for relative pose estimation and manoeuvre of an unmanned or manned vehicle (9).

Roy S. Melzer, Adv., Patent Attorney, G.E. Ehrlich (1995) Ltd., 35 HaMasger Street, Sky Tower, 13th Floor, Tel Aviv 6721407
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PT11832822 | 2022-11-10 | | |
| EP22212945.4A EP4369308A1 (en) | 2022-11-10 | 2022-12-12 | A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both |
| PCT/IB2023/061384 WO2024100616A1 (en) | 2022-11-10 | 2023-11-10 | A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| IL320776A true IL320776A (en) | 2025-07-01 |
Family
ID=89121872
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| IL320776A IL320776A (en) | 2022-11-10 | 2023-11-10 | A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both |
Country Status (6)
| Country | Link |
|---|---|
| EP (1) | EP4616377A1 (en) |
| JP (1) | JP2025537290A (en) |
| CN (1) | CN120266172A (en) |
| AU (1) | AU2023378882A1 (en) |
| IL (1) | IL320776A (en) |
| WO (1) | WO2024100616A1 (en) |
- 2023-11-10 EP EP23818550.8A patent/EP4616377A1/en active Pending
- 2023-11-10 IL IL320776A patent/IL320776A/en unknown
- 2023-11-10 WO PCT/IB2023/061384 patent/WO2024100616A1/en not_active Ceased
- 2023-11-10 AU AU2023378882A patent/AU2023378882A1/en active Pending
- 2023-11-10 JP JP2025526859A patent/JP2025537290A/en active Pending
- 2023-11-10 CN CN202380076418.8A patent/CN120266172A/en active Pending
Also Published As
| Publication number | Publication date |
|---|---|
| AU2023378882A1 (en) | 2025-05-15 |
| WO2024100616A1 (en) | 2024-05-16 |
| EP4616377A1 (en) | 2025-09-17 |
| CN120266172A (en) | 2025-07-04 |
| JP2025537290A (en) | 2025-11-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| Alam et al. | A survey of safe landing zone detection techniques for autonomous unmanned aerial vehicles (UAVs) | |
| Claro et al. | Artuga: A novel multimodal fiducial marker for aerial robotics | |
| Kalinov et al. | High-precision uav localization system for landing on a mobile collaborative robot based on an ir marker pattern recognition | |
| US12136232B2 (en) | System for detecting a foreign object on a runway and a method thereof | |
| Rudol et al. | Human body detection and geolocalization for UAV search and rescue missions using color and thermal imagery | |
| CN105197252B (en) | A kind of SUAV landing method and system | |
| US20180273173A1 (en) | Autonomous inspection of elongated structures using unmanned aerial vehicles | |
| CN109472831A (en) | Obstacle identification and ranging system and method for road roller construction process | |
| CN110599546A (en) | Method, system, device and storage medium for acquiring three-dimensional space data | |
| CN105405126B (en) | A kind of multiple dimensioned vacant lot parameter automatic calibration method based on single camera vision system | |
| CN109242890A (en) | Laser speckle system and method for aircraft | |
| CN110705485A (en) | Traffic signal lamp identification method and device | |
| CN114494997B (en) | A robot-assisted flame identification and positioning method | |
| JP4448233B2 (en) | Landing point search device, flying object using the same, and landing point evaluation device | |
| Stary et al. | Optical detection methods for laser guided unmanned devices | |
| Lim et al. | Autonomous multirotor UAV search and landing on safe spots based on combined semantic and depth information from an onboard camera and LiDAR | |
| CN107424156A (en) | Unmanned plane autonomous formation based on Fang Cang Owl eye vision attentions accurately measures method | |
| Hartley et al. | Using roads for autonomous air vehicle guidance | |
| CN114255264B (en) | Multi-base-station registration method and device, computer equipment and storage medium | |
| CN114252859A (en) | Method, apparatus, computer equipment and storage medium for determining target area | |
| IL320776A (en) | A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both | |
| KR20230127536A (en) | Map matching apparatus and method for indoor autonomous driving of a mobile robot | |
| CN114252869A (en) | Multi-base-station cooperative sensing method and device, computer equipment and storage medium | |
| EP4369308A1 (en) | A multimodal fiducial marker, a heterogeneous perception apparatus and a multimodal system comprising both | |
| Veneruso et al. | Analysis of ground infrastructure and sensing strategies for all-weather approach and landing in Urban Air Mobility |