WO2022249181A1 - Drone with landing capability on uneven terrain - Google Patents

Drone with landing capability on uneven terrain

Info

Publication number
WO2022249181A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
landing
leg
orientation
propellers
Prior art date
Application number
PCT/IL2022/050554
Other languages
English (en)
Inventor
David PARPARA
Original Assignee
Hevendrones Ltd.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hevendrones Ltd. filed Critical Hevendrones Ltd.
Publication of WO2022249181A1


Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C25/00Alighting gear
    • B64C25/02Undercarriages
    • B64C25/08Undercarriages non-fixed, e.g. jettisonable
    • B64C25/10Undercarriages non-fixed, e.g. jettisonable retractable, foldable, or the like
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C25/00Alighting gear
    • B64C25/02Undercarriages
    • B64C25/08Undercarriages non-fixed, e.g. jettisonable
    • B64C25/10Undercarriages non-fixed, e.g. jettisonable retractable, foldable, or the like
    • B64C25/18Operating mechanisms
    • B64C25/26Control or locking systems therefor
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64CAEROPLANES; HELICOPTERS
    • B64C25/00Alighting gear
    • B64C25/32Alighting gear characterised by elements which contact the ground or similar surface 
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64DEQUIPMENT FOR FITTING IN OR TO AIRCRAFT; FLIGHT SUITS; PARACHUTES; ARRANGEMENT OR MOUNTING OF POWER PLANTS OR PROPULSION TRANSMISSIONS IN AIRCRAFT
    • B64D45/00Aircraft indicators or protectors not otherwise provided for
    • B64D45/04Landing aids; Safety measures to prevent collision with earth's surface
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U60/00Undercarriages
    • B64U60/20Undercarriages specially adapted for uneven terrain
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B64AIRCRAFT; AVIATION; COSMONAUTICS
    • B64UUNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U60/00Undercarriages
    • B64U60/50Undercarriages with landing legs

Definitions

  • the present disclosure relates to the field of Unmanned Aerial Vehicles (UAV). More particularly, the disclosure relates to a drone with landing capability on uneven terrain.
  • UAV Unmanned Aerial Vehicles
  • drones are used for performing various tasks, such as transportation, shooting, photographing, etc.
  • the drone may be required to land in various areas, some of which are uneven terrain, such as mountains, hills, a street with a sidewalk and different objects, etc.
  • a drone configured for landing on uneven terrain, comprising: a drone body comprising a plurality of propellers; a plurality of multi-sectional landing legs, each landing leg comprising at least one joint and a servo motor for operating the joint to adjust the orientation of each landing leg; an image sensor configured on an underside of the drone; and a processor configured to: analyze images of a landing region topography captured by the image sensor; extrapolate topographic information from the images; based on the topographic information, determine an appropriate landing orientation for each leg, whereby the drone body and plurality of propellers are horizontally balanced; and instruct each servo motor to operate a respective joint so as to cause each leg to reach its appropriate landing orientation.
  • the drone may further comprise a memory, the memory storing information regarding at least one of (1) dimensions of known objects and (2) detailed topographic contours within a potential landing area of the drone.
  • the processor may be designed to extrapolate the topographic information based on object identification based on image processing of images captured by the image sensor.
  • the processor may be adapted to extrapolate the topographic information based on analysis of one or more of shades, shadows, or shadow directions cast on the terrain, and/or based on comparison of shades or shadows cast by items of known height with shadows of items of unknown height.
  • the drone may further comprise a light source on an underside of the drone, wherein the image sensor is configured to capture reflections of light cast from the light source and extrapolate topographic information regarding the landing area based on the reflections.
  • the multi-sectional landing legs may be telescopic legs, and each servo motor may be configured to extend or retract a telescopic leg.
  • the multi-sectional landing legs may be jointed legs, and each servo motor is configured to rotate a joint.
  • Each multi-sectional landing leg may further comprise a foot, which may comprise a spike or a ribbed surface at a distal end thereof.
  • the drone may comprise a flight stabilization system including a gyroscope and an accelerometer, and wherein, during a flight process, the processor is configured to selectively deliver power to each of the propellers in order to stabilize the drone based on orientation data received from the gyroscope and accelerometer, and wherein, during a landing process, the processor is configured to selectively deliver power to each of the propellers in order to maintain the drone in a desired orientation prior to landing.
  • a flight stabilization system including a gyroscope and an accelerometer
  • the processor may be configured to control the landing process such that each leg touches down on the landing region substantially simultaneously.
  • a method of landing a drone on uneven terrain comprising: capturing images of a landing area with an image sensor located on an underside of a drone body; analyzing the captured images; extrapolating topographic information from the images; based on the topographic information, determining an appropriate landing orientation for each leg, whereby the drone body and plurality of propellers are horizontally balanced; and instructing each servo motor to operate a respective joint so as to cause each leg to reach its appropriate landing orientation.
  • the stored information may be regarding (1) dimensions of known objects and (2) detailed topographic contours within a potential landing area of the drone.
  • the extrapolating step may comprise performing object identification based on image processing of images captured by the image sensor, and/or analyzing one or more of shades, shadows, or shadow directions cast on the terrain, and/or comparing shades or shadows cast by items of known height with shadows of items of unknown height.
  • the method may further comprise the step of capturing reflections of light cast from a light source located on an underside of the drone, and wherein the extrapolating step comprises extrapolating topographic information regarding the landing area based on the reflections.
  • the method may further comprise the step of adaptively stabilizing the drone during flight, wherein the adaptively stabilizing step comprises receiving orientation data from an accelerometer and a gyroscope, and, on the basis of the data, selectively delivering power to one or more propellers, and wherein the method further comprises selectively delivering power to each of the propellers in order to maintain the drone in a desired orientation prior to landing.
  • the method may further comprise the step of controlling the landing such that each leg touches down on the landing region substantially simultaneously.
  • FIG. 1 illustrates an embodiment of a drone capable of landing on uneven terrain, according to embodiments of the present disclosure
  • FIG. 2 schematically illustrates the flow of information and control operations of the drone of FIG. 1 during flight, according to embodiments of the present disclosure
  • FIG. 3 schematically illustrates flow of information and control operations of the drone of FIG. 1 during landing, according to embodiments of the present disclosure
  • FIG. 4 schematically illustrates steps in a method of landing a drone on uneven terrain, according to embodiments of the present disclosure
  • FIG. 5 depicts the drone of FIG. 1 landing on a flat surface with various obstacles, according to embodiments of the present disclosure
  • FIG. 6 depicts the drone of FIG. 1 landing on a continuously uneven surface, according to embodiments of the present disclosure
  • FIGS. 7A-7D schematically illustrate various techniques for determining the height of different areas of a landing surface, according to embodiments of the present disclosure.
  • drone 1 includes a body 2.
  • Body 2 houses a processor, communication modules such as RF and GPS, and various sensors, as will be described in further depth herein.
  • the drone body 2 is connected to six propellers 3 by six arms 3a.
  • the use of six propellers 3 is merely exemplary, and there may be another number of propellers, such as four or eight, as is known to those of skill in the art.
  • drone body 2 includes a load 4, which may be released when desired.
  • the load 4 may be a container with cargo for delivery.
  • the container may contain a fire extinguishing compound, which may be released over a fire.
  • Drone 1 has a plurality of multi-sectional landing legs 5, each landing leg comprising at least one joint 10 and a servo motor 11 at the joint 10 for operating the joint 10.
  • the multi-sectional landing legs are jointed legs, and each servo motor is configured to rotate a joint.
  • the landing legs 5 are foldable at the joint 10, which connects between upper leg 12 and lower leg 14.
  • the landing legs 5 may alternatively be telescopic, consisting of connected sections, with at least a lower section being extendible and retractable relative to an upper section at the joint.
  • Landing legs 5 further include a foot 9.
  • the foot 9 includes a ribbed surface at a bottom end thereof, which serves to stabilize the legs 5 on the landing surface.
  • the particular configuration of the feet 9 depicted in FIG. 1 is merely exemplary, and other embodiments are also possible.
  • foot 9 may take the form of a spike at a distal end thereof.
  • FIG. 2 and FIG. 3 schematically indicate the processes of operation of drone 1, both in a regular flying mode 200 and in a landing mode 250.
  • the drone 1 includes various sensors for assisting the drone 1 in flying, including accelerometer 202 and gyroscope 204. The data from these sensors is delivered to a processor 208.
  • the drone 1 also includes communication circuitry for receiving instructions from a user (e.g., via radio frequency).
  • Processor 208 calculates the 6D orientation of the drone 1. On the basis of the orientation and the desired direction of travel of drone 1, the processor 208 issues instructions to motors of each of the propellers in order to stabilize the drone 1 and cause the drone 1 to move in accordance with the user's instructions.
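As an illustrative sketch only (not from the disclosure; the function and gain are hypothetical), the selective power delivery described above can be modeled as a simple proportional mixer for a six-propeller layout, where arms on the "low" side of a tilted body receive more power:

```python
import math

def mix_motor_power(base_power, roll_err, pitch_err, k=0.5):
    """Return per-propeller power for six arms spaced 60 degrees apart.

    roll_err / pitch_err are attitude errors (radians) from the fused
    gyroscope/accelerometer data; a propeller on the low side of the
    drone receives more power so the body is driven back to level.
    """
    powers = []
    for i in range(6):
        angle = math.radians(60 * i)  # arm direction in the body frame
        # Correction is largest for arms aligned with the tilt axis.
        correction = k * (roll_err * math.sin(angle) + pitch_err * math.cos(angle))
        powers.append(max(0.0, min(1.0, base_power + correction)))
    return powers
```

In a real flight controller this proportional term would be one part of a full PID loop per axis; the sketch only shows how an attitude error maps to asymmetric power across the propellers.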
  • the flying mode 200 includes an automatic stabilization system for maintaining stability of the drone 1 when a payload is released from the drone.
  • the stabilization system makes anticipatory changes in the orientation of the drone 1, during predicted behaviors of adding or releasing a payload.
  • Exemplary automatic stabilization systems are disclosed in PCT Application IL2021/051076, filed September 2, 2021, entitled “A System for Drones Stabilization With Improved Flight Safety, Including Weapons Carrying Drones," and PCT Application IL2020/050952, filed September 2, 2020, entitled “A System for Drones Stabilization, With Improved Flight Safety,” both of which are assigned to the assignee of the present application, and the contents of which are incorporated by reference as if fully set forth herein.
  • FIG. 3 illustrates the operation of drone 1 in a landing mode 250.
  • the landing mode 250 relies on different sensors and inputs for determining the height of different portions of a landing region.
  • the term "landing region” refers to a geometric area within which any of the legs 5 are expected to rest when the drone 1 is landed.
  • the term “landing region” includes flat areas with obstacles, such as a sidewalk, as well as areas with highly variable heights along the geometric area, such as a sand dune.
  • drone 1 may include a downward-directed image sensor 252.
  • the image sensor 252 (which is typically configured on an underside of the drone) may be used to capture images of the landing region.
  • the processor 208 may further include image processing software configured to extrapolate from the captured images information regarding the height of the landing region.
  • Image sensor 252 may be, for example, a CMOS sensor or a CCD sensor.
  • Image sensor 252 may be configured to capture images in the visible and/or infrared ranges.
  • the drone 1 may optionally further include an infrared light source, such as an LED, in order to provide sufficient infrared illumination.
  • drone 1 may include a LIDAR apparatus 254.
  • LIDAR apparatus 254 is configured to direct a laser toward the landing region, to capture a reflection of the laser beam from the landing region, to calculate a time of flight from the transmission of the beam to the receipt of the beam, and to determine, on the basis of the time of flight, the distance between the drone 1 and the landing region.
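The time-of-flight relation just described reduces to a one-line computation; the sketch below (illustrative names, not from the disclosure) shows it, where the division by two accounts for the round trip of the beam:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def lidar_distance(time_of_flight_s):
    """Distance from drone to surface, given the round-trip time of the beam."""
    return SPEED_OF_LIGHT * time_of_flight_s / 2.0
```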
  • memory 256 of the drone 1 may be pre-loaded with data that may be used to determine the height of a landing region.
  • This pre-loaded data may include, for example, high-resolution topographic maps. These high-resolution topographic maps may be used, in conjunction with a GPS of the drone, to determine the relative height of different points within the landing region.
  • the pre-loaded data may further include the heights of common features of landing regions within the flight route of the drone. For example, a drone 1 that flies in urban settings may be pre-programmed with the heights of curbs relative to the streets. In addition, the drone 1 may be programmed with heights of common features of urban settings, such as street lights, which may be useful in determining the heights of other objects, as will be discussed further herein.
  • the data from all of these inputs is directed to processor 208.
  • the processor 208 extrapolates topographic information.
  • the topographic information in particular includes the relative heights of different points within a landing region.
  • the images from the image sensor 252 may be analyzed based on the shades, shadows, or shadow directions captured by the image sensor, in the manner described below in connection with FIGS. 7A-7C.
  • the processor 208 may perform image recognition on the images captured by the image sensor 252, and use the pre-loaded data to determine heights of various identified objects.
  • the processor 208 instructs provision of power to the propellers 210, as in the flight mode 200.
  • the flight mode is used to maintain balance of the drone 1 as the drone 1 is descending.
  • processor 208 further instructs each servo motor 11 to operate its respective joint 10 (to adjust the orientation of each landing leg), so as to cause each landing leg to reach its appropriate landing orientation.
  • each of the propellers 3 is evenly balanced relative to a center of gravity of the drone 1.
  • This balancing is accomplished through each leg 5 having a different length relative to the body 2 of the drone, due to the bending or extending of the legs at the joints 10.
  • the flight mode 200 and landing mode 250 work cooperatively.
  • extension of legs 5 of the drone 1 causes the center of gravity of the drone 1 to change.
  • the automatic stabilization system may be pre-programmed with the required power delivery to each of the propellers 3 in response to movement of each of the legs 5. The landing is thus performed while maintaining the balance of the drone 1.
  • the processor is configured to control the landing process such that each leg touches down on the landing region substantially simultaneously.
  • this ensures that the drone 1 is not tipped off balance, even slightly, during the landing process.
  • FIG. 4 depicts steps of a method 400 of landing drone 1, according to embodiments of the present disclosure.
  • the drone 1 performs topographic analysis of the landing area/region. As discussed above in connection with FIG. 3, the topographic analysis may be performed through one or more of image processing, LIDAR, or retrieval of pre-stored topographic data.
  • the processor 208 determines the desired landing orientation of each leg.
  • the drone 1 determines the height required for each leg in order to ensure that the drone 1 is horizontally balanced when landing, and determines the angular and linear extension of each leg that is required in order to achieve this landing orientation (as shown in Figs. 7A-7D).
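As a hypothetical sketch of this per-leg computation (not taken from the disclosure): given the terrain height under each foot from the topographic analysis, each leg is extended so that every foot meets its ground point while the body stays level at a fixed clearance above the highest contact point:

```python
def leg_extensions(terrain_heights, clearance=0.3):
    """terrain_heights: height of the ground under each foot, in metres.

    Returns how far each leg must extend downward from the body plane
    so that every foot touches its ground point with the body horizontal.
    """
    body_height = max(terrain_heights) + clearance  # altitude of the body plane
    return [body_height - h for h in terrain_heights]
```

A real implementation would then convert each linear extension into joint angles for the servo motor at joint 10, subject to the leg's minimum and maximum reach; the sketch stops at the geometry.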
  • the processor balances the drone 1 around the center of gravity of the drone 1.
  • the processor sets the orientation of each of the landing legs 5. As discussed, the processor adjusts delivery of power to the drone 1 so as to maintain the balance of the drone even as the landing legs 5 are extended.
  • the processor further reduces power to each of the propellers, thereby landing the drone.
  • FIG. 5 illustrates drone 1 in a landing position on a sidewalk 501 with obstacles.
  • the obstacles include pallet 502 and brick 503.
  • Legs 5a and 5c are resting on sidewalk 501.
  • Leg 5b is folded to match the leg's height to the height of pallet 502, in such a way that the drone 1 remains level and does not overturn.
  • Leg 5d is folded to match the leg's height to the height of brick 503, located on the other side, in such a way that the drone 1 remains level and does not overturn.
  • FIG. 6 shows the drone 1 outdoors in an area of sand dunes 601, in which every point in the landing region is at a different height.
  • the folding landing legs 5 conform to the different heights and angles of the sand dunes.
  • FIGS. 7A-7D depict various ways that the processor may use images captured by the image sensor in order to determine relative height of points within a landing region.
  • FIG. 7A schematically depicts drone 1 with image sensor 252.
  • the scene includes a block 504 of unknown height, which is within a desired landing region.
  • the scene further includes a light pole 506 of known height, whose height is stored in the memory of the drone 1.
  • the block 504 casts a shadow 505, and the light pole 506 casts a shadow 507. Because the lengths of the shadows are proportional to the heights of the objects casting them, it is possible to extrapolate the height of the block 504 based on the dimensions of the shadow 505, together with the dimensions of pole 506 and shadow 507.
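The proportion just described is a single ratio; the sketch below (illustrative names only) expresses it, assuming both shadows are cast by the same light source at the same angle:

```python
def height_from_shadow(shadow_len_unknown, shadow_len_known, height_known):
    """Estimate an object's height from its shadow, calibrated against a
    reference object of known height under the same sun angle."""
    return shadow_len_unknown * height_known / shadow_len_known
```

For example, if the light pole is 8 m tall and casts a 4 m shadow, a block casting a 1 m shadow is about 2 m tall.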
  • drone 1 includes a light source 20 with a lens and a pattern 22 configured thereon.
  • the pattern 22 partially obstructs the light source, thereby causing the casting of a shadow on the landing surface.
  • the size of the pattern on the landing surface is determined by the characteristics of the lens as well as the distance of the drone 1 from the surface.
  • Based on the appearance of the shadow on block 504, as captured by image sensor 252, the processor is able to calculate the distance of the drone 1 from the block 504, and thus the height of the block 504 relative to the landing surface.
  • the processor may further rely on the heights of known objects such as pole 506, and the appearance of the pattern on the known objects, when making its calculation.
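A hypothetical sketch of the projected-pattern calculation (assuming a simple linear projection model, which the disclosure does not specify): the pattern's apparent size grows linearly with distance from the light source, so one calibration pair lets the processor invert an observed size into a distance, and differences in distance give relative height:

```python
def distance_from_pattern(observed_size, ref_size, ref_distance):
    """Invert the linear pattern-size model: a pattern of ref_size at
    ref_distance appears observed_size at the returned distance."""
    return ref_distance * observed_size / ref_size

def relative_height(surface_distance, block_distance):
    """The block's top is closer to the drone than the surrounding surface;
    the difference is the block's height above that surface."""
    return surface_distance - block_distance
```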
  • landing region 701 is divided into different sub-regions 711-719.
  • Each sub-region has a slightly different height, and thus is a slightly different distance from the image sensor.
  • the intensity of reflection of light back into the image sensor varies slightly from point to point.
  • the processor is equipped with an Artificial Intelligence (AI)-based image-processing functionality that is able to convert the differences in shading into distances, for example, by generating stereoscopic or depth views from which distances may be derived.
  • AI Artificial Intelligence
  • FIG. 7D illustrates the use of LIDAR in order to determine the topographic contours of the landing region 701.
  • Each sub-region 711-719 has a different time of flight for return of the laser to the drone, which corresponds to a different distance from the drone, and thus a different height.
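Putting the per-sub-region times of flight together, relative heights follow directly; the sketch below (illustrative, not from the disclosure) converts a map of round-trip times into heights above the lowest sub-region:

```python
SPEED_OF_LIGHT = 299_792_458.0  # m/s

def heights_from_tof(tof_by_region):
    """tof_by_region: {sub-region id: round-trip time in seconds}.

    The shortest round trip marks the highest sub-region; heights are
    returned relative to the lowest point in the landing region.
    """
    dists = {r: SPEED_OF_LIGHT * t / 2.0 for r, t in tof_by_region.items()}
    lowest = max(dists.values())  # the farthest point is the lowest
    return {r: lowest - d for r, d in dists.items()}
```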

Landscapes

  • Engineering & Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Mechanical Engineering (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention concerns a drone designed to land on uneven terrain, comprising a drone body comprising a plurality of propellers; a plurality of multi-sectional landing legs, each landing leg comprising at least one joint and a servo motor for operating the joint to adjust the orientation of each landing leg; an image sensor arranged on an underside of the drone; and a processor configured to: analyze images of a landing region topography captured by the image sensor; extrapolate topographic information from the images; based on the topographic information, determine an appropriate landing orientation for each leg, whereby the drone body and the plurality of propellers are horizontally balanced; and instruct each servo motor to operate a respective joint so as to cause each leg to reach its appropriate landing orientation.
PCT/IL2022/050554 2021-05-25 2022-05-25 Drone with landing capability on uneven terrain WO2022249181A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IL283434 2021-05-25
IL283434A IL283434A (en) 2021-05-25 2021-05-25 A drone with the ability to land on uneven terrain

Publications (1)

Publication Number Publication Date
WO2022249181A1 WO2022249181A1 (fr) 2022-12-01

Family

ID=84229586

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2022/050554 WO2022249181A1 (fr) Drone with landing capability on uneven terrain

Country Status (2)

Country Link
IL (1) IL283434A (fr)
WO (1) WO2022249181A1 (fr)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170137118A1 (en) * 2015-03-18 2017-05-18 Amazon Technologies, Inc. Adjustable landing gear assembly for unmanned aerial vehicles
KR101762536B1 (ko) * 2016-03-18 2017-08-04 Kyung Hee University Industry-Academic Cooperation Foundation Drone for obstacle avoidance and slope landing, and control method therefor
US20190127052A1 (en) * 2016-06-22 2019-05-02 SZ DJI Technology Co., Ltd. Systems and methods of aircraft walking systems
CN111661316A (zh) * 2020-08-08 2020-09-15 Nanjing University of Aeronautics and Astronautics Variant hexarotor UAV with terrain-adaptive takeoff/landing and walking functions


Also Published As

Publication number Publication date
IL283434A (en) 2022-12-01

Similar Documents

Publication Publication Date Title
US11124292B2 (en) Systems and methods of aircraft walking systems
US9878779B2 (en) Unmanned aerial vehicle and landing method thereof
US9641810B2 (en) Method for acquiring images from arbitrary perspectives with UAVs equipped with fixed imagers
US11323687B2 (en) Sensing on UAVs for mapping and obstacle avoidance
CN109425265B (zh) 飞行器成像与瞄准系统
US11727679B2 (en) Automatic terrain evaluation of landing surfaces, and associated systems and methods
US8666571B2 (en) Flight control system for flying object
US8554395B2 (en) Method and system for facilitating autonomous landing of aerial vehicles on a surface
US9936133B2 (en) Gimbaled camera object tracking system
Thurrowgood et al. A biologically inspired, vision‐based guidance system for automatic landing of a fixed‐wing aircraft
Fankhauser et al. Collaborative navigation for flying and walking robots
JP6895835B2 (ja) Guide information display device and crane
KR101651600B1 (ko) Unmanned flying drone with automatic landing function using stereo cameras
Nieuwenhuisen et al. Collaborative object picking and delivery with a team of micro aerial vehicles at MBZIRC
CN103744390B (zh) 无人机电力线路巡检的协同控制方法
EP3649046A1 (fr) Portable integrated unmanned aerial vehicle
KR20190065355A (ko) System and method for height control of a movable object
KR102625240B1 (ko) Autonomous mobile body for handling articles using markers, and method therefor
WO2022249181A1 (fr) Drone with landing capability on uneven terrain
CN106542105B (zh) 飞行器移动降落方法和系统
WO2019010922A1 (fr) Information processing device, flying object, transport network generation method, transport method, program, and recording medium
CN112731918B (zh) 一种基于深度学习检测跟踪的地面无人平台自主跟随系统
JP2022149017A (ja) Unmanned mobile body
Kuntz et al. Autonomous cargo transport system for an unmanned aerial vehicle, using visual servoing
Fuchslocher et al. Concept and Implementation of a Tele-operated Robot for ELROB 2016

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22810791

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22810791

Country of ref document: EP

Kind code of ref document: A1