WO2019021045A1 - Method and system enabling an action of an industrial robot based on a parameter - Google Patents

Method and system enabling an action of an industrial robot based on a parameter

Info

Publication number
WO2019021045A1
Authority
WO
WIPO (PCT)
Prior art keywords
interface
robot
control logic
parameters
industrial robot
Prior art date
Application number
PCT/IB2017/055833
Other languages
English (en)
Inventor
Ashish Sureka
David Carroll Shepherd
Patrick Francis
Original Assignee
Abb Schweiz Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Abb Schweiz Ag
Publication of WO2019021045A1

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1671Programme controls characterised by programming, planning systems for manipulators characterised by simulation, either to verify existing program or to create and verify new program, CAD/CAM oriented, graphic oriented programming systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B25HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25JMANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00Programme-controlled manipulators
    • B25J9/16Programme controls
    • B25J9/1656Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1658Programme controls characterised by programming, planning systems for manipulators characterised by programming language
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40099Graphical user interface for robotics, visual robot user interface
    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05BCONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00Program-control systems
    • G05B2219/30Nc systems
    • G05B2219/40Robotics, robotics mapping to robotics vision
    • G05B2219/40392Programming, visual robot programming language

Definitions

  • the present invention generally relates to control of industrial robots, and more specifically to performing operations with industrial robots based on parameters.
  • Industrial robots find applications across a wide variety of industries such as painting, packaging, manufacturing, automotive, and many others.
  • the industrial robot needs to be programmed (e.g. controlled with a controller of the robot).
  • Robot programming can be divided into two types: online programming and offline programming.
  • Online programming requires direct interaction with a physical robot using a teach-pendant or manual lead through.
  • Offline programming consists of programming a robot using an offline simulation and programming system consisting of a virtual controller and CAD (computer aided design) models. Both the online programming and offline programming methods and systems are widely used and each has its own advantages and disadvantages.
  • Specifying spatial coordinates in a three-dimensional environment, consisting of target positions and orientations, is fundamental to defining the movement or motion of a robot (or components thereof).
  • One typically needs to provide an accurate tool center point position (for example, the tool center point of a gripper attached to the robot as an end-effector), the orientation of the tool, and the axis configuration and position of any external axis in three-dimensional space, to create a target for industrial robots or manipulators and instruct the robot.
  • Inputting such parameters is not a trivial task, and adds to the complexity of configuring the industrial robot.
  • Various aspects of the present invention relate to a system and method for performing an operation with an industrial robot.
  • the operation can be an operation or part of an industrial process such as painting, packaging, assembling and so forth. Further, an operation may involve one or several tasks such as pick, grab, move, drop and so forth.
  • the system comprises one or more interfaces for displaying graphical objects, and receiving a selection of graphical objects (e.g. from a user) for defining control flows.
  • a graphical object can be a block-based object (as in a visual programming system), which is associated with a task of the operation of the robot.
  • An interface of the one or more interfaces is also configured to receive parameters associated with the operation of the robot. Examples of parameters include, but need not be limited to, position, speed and orientation. Such parameters are associated with graphical objects (i.e. tasks). Accordingly, specific parameters can be defined for specific tasks (that are associated with the graphical objects).
  • Such interfaces may also render a simulation of the industrial robot or a component thereof, which can be used to select the parameters for the operation (or task). Alternately, there may be a direct connection with the industrial robot (or component thereof such as a teach-pendant unit) for receiving the parameters.
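The pairing of graphical objects (tasks) with parameters received at a second interface can be sketched as follows. This is a minimal Python illustration under stated assumptions, not the patented implementation; the class name, parameter keys, and values are all hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class GraphicalObject:
    """A block on the visual-programming canvas, associated with one robot task."""
    task: str                                        # e.g. "pick", "move", "drop"
    parameters: dict = field(default_factory=dict)   # e.g. position, speed, orientation

# Parameters received at the second interface (e.g. by jogging the robot,
# or by selecting a pose in a 3D simulation) are attached to the block:
pick = GraphicalObject("pick")
pick.parameters["position"] = (350.0, 0.0, 600.0)   # tool center point x, y, z (mm)
pick.parameters["speed"] = 200                      # mm/s
```

In this sketch, a default value could be kept as-is, or overwritten per task, matching the idea that specific parameters are defined for specific tasks.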
  • the system comprises a first interface for displaying a plurality of graphical objects; and receiving a selection of two or more graphical objects of the displayed graphical objects, for defining a control flow for the operation of the industrial robot.
  • the system comprises a second interface for receiving the one or more parameters as input for the operation.
  • the system comprises one or more processors for defining the control flow for the operation of the robot.
  • the control flow is defined with one or more control logic for performing one or more tasks of the operation.
  • the control logic for a task may be fetched from a database (e.g. with control logics for various tasks defined already) and modified. The modification of a control logic may be based on a parameter of the one or more parameters received at the second interface.
  • the system has a third interface for displaying the one or more control logic, and receiving an input for modifying a control logic of the one or more control logic.
  • the control logic may be rendered on the first interface itself.
  • the one or more processors of the system also link the one or more control logic according to the two or more graphical objects selected at the first interface to define the control flow.
  • the interface(s) and processor(s) may be part of same / separate devices.
  • the interface(s) and processor(s) are part of the same computer system (or robot programming system).
  • the interface may be a graphical user interface of a computer system (e.g. a laptop, a tablet device, a smartphone etc.) while the processor may be on a server, connected to the interface and a controller of the robot.
  • the one or more processors of the system are also configured to transmit the one or more control logic for execution with the controller to control the industrial robot.
  • This controller may be part of the robot, or connected with the robot.
  • the controller is a virtual controller (e.g. cloud-based controller).
  • the method for performing the operation comprises receiving selection of two or more graphical objects at the first interface of the robot programming system, for defining a control flow for the operation of the industrial robot.
  • the method also comprises receiving the one or more parameters as input at the second interface of the robot programming system.
  • the method further comprises defining the control flow in the robot programming system.
  • the control flow is defined with one or more control logic for performing one or more tasks of the operation with the industrial robot, based on the one or more parameters received at the second interface.
  • defining the control flow comprises linking the one or more control logic according to the two or more graphical objects selected at the first interface.
  • the method comprises transmitting the one or more control logic for execution with the controller to control the industrial robot.
  • FIG. 1 is a simplified representation of a system for performing an operation with an industrial robot, in accordance with an embodiment of the invention.
  • FIG. 2 is an illustration of a first interface of the system, in accordance with an embodiment of the invention.
  • FIG. 3 is a simplified representation of one or more modules of the system, in accordance with an embodiment of the invention.
  • Fig. 4 is a simplified representation of sub-modules of a linking module of the system, in accordance with an embodiment of the invention.
  • Fig. 5 is a flowchart of the method of performing the operation with the robot, in accordance with an embodiment of the invention.
  • Fig. 1 illustrates a system (100) for performing an operation with the industrial robot (or robot).
  • the system comprises a first interface (102), a second interface (104) and a processor (116).
  • the first interface can be understood as being similar to an interface of a visual programming system, which displays various graphical objects that can be put together to define a control flow for the operation of the robot.
  • a graphical object may be understood as a visual representation (e.g. a block) that represents a task (or a step or sequence of steps) involved in an operation.
  • such tasks are defined in robot programming systems using text-based programming languages such as RAPID.
  • such tasks can be easily defined by using the graphical objects to define the control flow for the operation with the robot.
  • the first interface comprises a library (106) and a canvas (108).
  • the library is a collection of graphical objects (such as 110), which have been defined for the purpose of defining operations (or tasks thereof) of the robot. Defining a graphical object can include defining the control logic required for performing the corresponding task with the robot. Such control logic definition is typically done in a text-based language such as RAPID. As the graphical objects are already defined, a user trying to configure the robot for the operation can simply pick and use the available graphical objects.
  • graphical objects can be linked (joined / snapped) to define the control flow for the operation with the robot.
  • Graphical objects can be loaded from the library (e.g. by dragging and dropping, or by snapping and dropping) and linked on the canvas (e.g. joined) to define the control flow.
  • the library can have graphical objects associated with tasks such as 'begin and end task' , 'position and reference frame' , 'path and target' , 'motion' and 'control'. From these, any number of objects can be selected to define the control flow.
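A block library of the kind described above could be sketched as a mapping from categories to named blocks, each carrying a text-based (RAPID-style) template. This is an assumed structure for illustration only; the category names are taken from the text, but the templates and function are hypothetical.

```python
# Hypothetical block library: each category holds named blocks, and each block
# carries the RAPID-style template that implements its task (placeholders
# such as {target} are filled in later from user-supplied parameters).
BLOCK_LIBRARY = {
    "begin and end task": {"begin": "PROC main()", "end": "ENDPROC"},
    "path and target":    {"target": "CONST robtarget {name} := {value};"},
    "motion":             {"move_linear": "MoveL {target}, {speed}, fine, tool0;"},
}

def load_block(category, name):
    """Fetch a block's template, as when the user drags it from the library."""
    return BLOCK_LIBRARY[category][name]
```

Loading a motion block would then hand back a template whose parameters are still open, ready to be bound to values from the second interface.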
  • a path and target object is selected (see 202, 204).
  • An object may be edited as well when loaded from the library. For instance as shown in Fig. 2, which arm of the robot to select, whether to have linear motion etc. can be specified as a parameter value after selecting the graphical object.
  • When a graphical object is loaded on the first interface, one or more parameters (or targets) may need to be modified.
  • a user can simply take the provided parameters (e.g. default values), or select parameters as needed for the task.
  • the parameters can be selected using the second interface (i.e. 104 of Fig. 1).
  • the second interface can be an interface of the robot (e.g. connected to a teach-pendant unit (TPU) or a manipulator etc.), which can receive parameters as input (e.g. from a user) from the robot (or TPU, or other component of the robot).
  • the second interface may alternately be an interface of a simulation system (e.g. 3D simulation system), which can be used to simply select the desired parameter(s) for the task or operation based on a simulation of the robot (110) (or a component thereof (e.g. tool tip 112)).
  • the parameters (targets) of the task can be created by specifying each parameter such as a position, orientation, and / or axis (e.g. configuration of robot / external axis) from the second interface (e.g. as coordinates, angles etc. of tool tip (112)).
  • the user can select an object like path and target, and then switch to the second interface to set the target (or parameters) by jogging the robot.
  • the values of the parameters (or targets), namely the pick position, intermediate position and place position, can be provided through the second interface.
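Providing pick, intermediate and place targets through a second interface could look like the following sketch, where each pose read from the robot or simulator is stored under a target name. The function, dictionary, and pose format are assumptions for illustration, not the patent's actual data model.

```python
# Sketch: the second interface reports the current tool-center-point pose
# (e.g. while the user jogs the robot); each reading becomes a named target.
recorded_targets = {}

def record_target(name, pose):
    """Store the pose read from the robot/simulator under a target name."""
    recorded_targets[name] = pose

# Hypothetical poses: position in mm plus an orientation quaternion.
record_target("pick_pos",  {"xyz": (350, 0, 600),   "quat": (1, 0, 0, 0)})
record_target("mid_pos",   {"xyz": (350, 200, 700), "quat": (1, 0, 0, 0)})
record_target("place_pos", {"xyz": (350, 400, 600), "quat": (1, 0, 0, 0)})
```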
  • the first interface and the second interface may be rendered on one device, or on separate devices. For instance, there may be separate computers rendering the first and second interfaces.
  • the first interface may be provided on a smartphone / tablet device, while the second interface may be provided on a monitor.
  • the system also comprises the processor(s) (such as 116).
  • the one or more processors are configured for defining the control flow for the operation of the robot.
  • the control flow is defined with one or more control logic for performing one or more tasks of the operation.
  • the control logic for a task may be fetched from the database (e.g. with control logics for various tasks defined already) and modified.
  • the modification of a control logic may be based on a parameter of the one or more parameters received at the second interface.
  • the system has a third interface for displaying the one or more control logic, and receiving an input for modifying a control logic of the one or more control logic.
  • the control logic may be rendered on the first interface itself.
  • the one or more processors of the system also link the one or more control logic according to the two or more graphical objects selected at the first interface to define the control flow.
  • the system includes one or more modules such as a receiving module (302), a linking module (304) and an execution module (306), as shown in Fig. 3.
  • the modules of the system can be implemented with hardware / software on one device (or component) or spread across several devices.
  • the receiving module can be part of one device (e.g. first interface), while the linking and execution modules can be in the same device (e.g. with the processor).
  • the receiving module receives selection of two or more graphical objects (e.g. 116a and 116b).
  • the linking module is configured to define the control flow for performing the operation with the industrial robot by linking the one or more control logic (e.g. as a linker program to link the control logic as per the graphical objects selected by the user).
  • while defining the control logic for a task, the corresponding parameter provided as input is utilized; the control logic required for each task is generated this way.
  • the various control logic for the different tasks are combined as per the flow defined in the first interface (on the canvas).
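Combining the per-task control logic in canvas order can be sketched as a simple concatenation of code fragments. This is an illustrative assumption about the linker's behavior; the function name and block representation are hypothetical, and the RAPID lines are examples only.

```python
def link_control_logic(blocks):
    """Combine per-task code fragments in the order the blocks were
    joined on the canvas, producing one control-flow program."""
    fragments = [b["code"] for b in blocks]
    return "\n".join(fragments)

# Order in which the user snapped the blocks together on the canvas:
canvas_order = [
    {"task": "begin", "code": "PROC main()"},
    {"task": "move",  "code": "  MoveL pick_pos, v200, fine, tool0;"},
    {"task": "end",   "code": "ENDPROC"},
]
program = link_control_logic(canvas_order)
```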
  • the linking module can also be configured to determine the compatibility of the control logic (or software code(s) (e.g. in RAPID)) corresponding to the graphical objects while combining the control logic.
  • the linking module may have one or more sub-modules, as shown in Fig. 4.
  • the linking module includes a fetching sub-module (402), a combining sub- module (404) and an error determination sub-module (406).
  • the fetching sub-module is configured to fetch control logic (or software code(s)) corresponding to a graphical object (e.g. 116A, 116B).
  • the combining sub-module is configured to combine the control logic corresponding to the graphical objects as selected by the user to generate the control flow (or a combined control logic).
  • the error determination sub-module is configured to perform an error check on the generated control flow (or combined control logic).
  • the error determination sub-module can perform at least one of syntactic analysis or semantic analysis on the control logic.
  • the error determination sub-module is configured to automatically correct one or more identified syntactic errors and/or semantic errors. In a further implementation, the error determination sub-module is configured to suggest corrections for syntactic errors and/or semantic errors determined in the generated control logic(s) for displaying in a suggestion or error window (not shown) in the robot programming environment 300.
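A toy version of such an error check might scan the combined program for simple syntactic problems and phrase each finding as a suggestion. The rules below are invented for illustration (statement terminators, matched PROC/ENDPROC); a real checker for RAPID would be far more complete.

```python
def check_control_flow(program):
    """Tiny syntax check in the spirit of the error-determination sub-module:
    move statements must end with ';' and an opened PROC must be closed."""
    errors = []
    lines = [ln.strip() for ln in program.splitlines() if ln.strip()]
    for i, line in enumerate(lines, 1):
        if line.startswith(("MoveL", "MoveJ")) and not line.endswith(";"):
            errors.append(f"line {i}: missing ';' (suggest appending one)")
    if any(ln.startswith("PROC") for ln in lines) and "ENDPROC" not in lines:
        errors.append("PROC without matching ENDPROC")
    return errors
```

The returned messages could feed the suggestion or error window mentioned above, or drive an automatic-correction pass.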
  • the control flow (i.e. the one or more control logic combined together) thus generated is executed using the execution module (306).
  • the execution module may be provided on a simulator (e.g. 110 of the second interface) or virtual controller, or to the controller of the robot (i.e. passed to the control of the robot to perform the actual operation as desired by the user).
  • the system may further include a display module (308).
  • the display module is configured to display the robot programming system in accordance with the teachings of the present invention.
  • the display module is further configured to display the robot programming system (e.g. remotely) as a programming interface on a display of a device of the user (e.g. on a laptop, computer, smartphone or tablet device of the user).
  • the robot programming system can be incorporated as a plug-in or an add-on to an existing system for programming / simulation (e.g. RobotStudio).
  • a programming / simulation system includes several in-built functionalities to support programming of the industrial robot, and can be configured to support extensions to other functionalities for developing control programs for the industrial robot.
  • FIG. 5 is a flowchart of the method of performing the operation with the robot, in accordance with an embodiment of the invention.
  • selection of a first graphical object and at least one further graphical object are received at the first interface for defining the control flow.
  • drag and drop is adopted to cause selections of the graphical objects. Mechanisms other than drag and drop (e.g. inserting from a list of available graphical objects) can also be adopted for causing selections.
  • the method further comprises receiving the one or more parameters as input at the second interface of the robot programming system at 504.
  • the user can perform an activity with the robot, e.g. jog the robot, to feed the parameter values for various positions (e.g. pick position, intermediate position, drop position etc.). Alternately, the user can select these parameters on the 3D simulation system, and the same get defined for the graphical object.
  • the control flow is defined in the robot programming system at 506.
  • the control logic can be defined (e.g. by fetching corresponding text-based instructions (in RAPID) from the database) and modified according to the parameter provided from the second interface. This is done for each task (in the operation), and the control flow is defined with the one or more control logic defined for performing the one or more tasks with the industrial robot. Defining the control flow also comprises linking the one or more control logic according to the two or more graphical objects selected at the first interface (e.g. with the linking module).
  • only those graphical objects can be linked which are semantically correct, i.e., which are logical to combine. For example, while defining the control logic for the operation to be performed with the industrial robot, one graphical object corresponding to identifying an object at a first target location is selected, and another graphical object corresponding to dropping the object at a second target location is selected. In this case, the action of picking is required in between: if the object has not been picked, it cannot be dropped. Therefore, linking the two graphical objects directly is not possible, and hence the control logic for pick and drop will not be defined.
  • Once the control flow (or control logic) is generated, it can be displayed to the user for verification (e.g. on the first interface). Alternately, the verification can be done with the error determination module (306). The verification may be an optional step, and the generated control flow (or combined control logic) can be provided directly to perform the operation at the robot or at a simulation thereof.
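The pick-before-drop rule described above can be sketched as a precondition table consulted before two blocks are linked. The table and function are illustrative assumptions; the patent does not specify this mechanism.

```python
# Sketch of the semantic check: a task may only be linked into the flow
# if its precondition task already appears earlier in the flow.
PRECONDITIONS = {"drop": "pick"}   # task -> task that must come before it

def can_link(flow, next_task):
    """Return True if next_task may be appended to the existing flow."""
    required = PRECONDITIONS.get(next_task)
    return required is None or required in flow
```

With this rule, appending "drop" to a flow that never picked the object would be refused, so the invalid control logic is never generated.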
  • the method comprises transmitting the one or more control logic (i.e. the combined control logic) for execution with the controller to control the industrial robot. Thereafter, at 510, the control flow thus generated is executed.
  • the execution can be at the controller of the industrial robot or a robot simulator, to control the operation of the robot to perform the operation.
  • a user can easily configure an industrial robot for a specific operation by dragging and dropping block(s) to the canvas for creating a target (parameter).
  • the user can then switch context to the second interface (or simulation environment) of the programming system and set the target (e.g. by jogging the robot).
  • the target block is joined with the move or the path block (graphical object), resulting in an interlocking of the target block as a parameter to the move instruction.
  • the name of the target is specified both in the 3D simulator and the target block. Accordingly, the exact position and orientation information is read from the second interface, and syntactically correct text-based code (e.g. in RAPID) is generated.
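Generating the text-based code for one target plus its move instruction could be sketched as below. The RAPID declaration format follows the common `robtarget` layout, but the function, defaults, and bracketed configuration/external-axis fields here are illustrative assumptions rather than the patent's generator.

```python
def emit_rapid(target_name, xyz, quat, speed="v200"):
    """Generate RAPID-style text for one target and a linear move, using the
    pose read from the second interface. Names and defaults are illustrative."""
    x, y, z = xyz
    decl = (f"CONST robtarget {target_name} := "
            f"[[{x},{y},{z}],[{quat[0]},{quat[1]},{quat[2]},{quat[3]}],"
            f"[0,0,0,0],[9E9,9E9,9E9,9E9,9E9,9E9]];")
    move = f"MoveL {target_name}, {speed}, fine, tool0;"
    return decl + "\n" + move

code = emit_rapid("pick_pos", (350, 0, 600), (1, 0, 0, 0))
```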
  • the invention extends the capability of current offline programming systems for industrial robots.
  • the invention enables easy programming for industrial robots, thereby democratizing application development and opening robotics programming for the masses.
  • the invention simplifies robot configuration for technicians / users who have knowledge about the domain and process, but do not have specialized programming skills.
  • As the invention enables easy programming of different components as well, it is especially useful for programming dual-arm robots and manipulators, which pose unique technical challenges compared to single-arm robots.

Landscapes

  • Engineering & Computer Science (AREA)
  • Robotics (AREA)
  • Mechanical Engineering (AREA)
  • Software Systems (AREA)
  • Manipulator (AREA)
  • Numerical Control (AREA)

Abstract

The invention relates to a system and method for performing an action with an industrial robot on the basis of one or more parameters. The method comprises: receiving a selection of two or more graphical objects to define a control flow for the action, each graphical object being associated with a task of the action; receiving one or more parameters as input at a second interface for the action, each parameter being associated with at least one graphical object; defining the control flow with one or more control logics for performing the action with the industrial robot, on the basis of said parameter(s); and linking said control logic according to the two or more graphical objects selected at the first interface. Furthermore, said control logic is transmitted to a controller for execution.
PCT/IB2017/055833 2017-07-26 2017-09-26 Procédé et système permettant une action d'un robot industriel fondée sur un paramètre WO2019021045A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN201741026528 2017-07-26
IN201741026528 2017-07-26

Publications (1)

Publication Number Publication Date
WO2019021045A1 (fr) 2019-01-31

Family

ID=60162217

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2017/055833 WO2019021045A1 (fr) 2017-07-26 2017-09-26 Procédé et système permettant une action d'un robot industriel fondée sur un paramètre

Country Status (1)

Country Link
WO (1) WO2019021045A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111752573A (zh) * 2020-07-03 2020-10-09 中山市恺特自动化科技有限公司 工业机器人通用编程方法及编程器
IT202100001268A1 (it) * 2021-01-25 2022-07-25 Yk Robotics S R L Sistema e metodo di configurazione e programmazione di una cella robotica

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126151A1 (en) * 2000-06-13 2002-09-12 National Instruments Corporation System and method for graphically creating a sequence of motion control, machine vision, and data acquisition (DAQ) operations
WO2006043873A1 (fr) * 2004-10-20 2006-04-27 Abb Research Ltd Systeme et procede de programmation de robot industriel
DE102012004983A1 (de) * 2012-03-14 2013-09-19 Hermann Müller Verfahren zur grafikbasierten Roboterprogrammierung eines Mehrachs-Roboters
US20150273685A1 (en) * 2014-04-01 2015-10-01 Bot & Dolly, Llc Software Interface for Authoring Robotic Manufacturing Process

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020126151A1 (en) * 2000-06-13 2002-09-12 National Instruments Corporation System and method for graphically creating a sequence of motion control, machine vision, and data acquisition (DAQ) operations
WO2006043873A1 (fr) * 2004-10-20 2006-04-27 Abb Research Ltd Systeme et procede de programmation de robot industriel
DE102012004983A1 (de) * 2012-03-14 2013-09-19 Hermann Müller Verfahren zur grafikbasierten Roboterprogrammierung eines Mehrachs-Roboters
US20150273685A1 (en) * 2014-04-01 2015-10-01 Bot & Dolly, Llc Software Interface for Authoring Robotic Manufacturing Process

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Selogica control system - Comprehensive management for injection moulding technology", ARBURG FOCUS, 1 October 2013 (2013-10-01), pages 1 - 16, XP055465553, Retrieved from the Internet <URL:https://www.arburg.com/fileadmin/redaktion/mediathek/prospekte/arburg_selogica_522776_en_gb.pdf> [retrieved on 20180409] *

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111752573A (zh) * 2020-07-03 2020-10-09 中山市恺特自动化科技有限公司 工业机器人通用编程方法及编程器
CN111752573B (zh) * 2020-07-03 2021-11-09 中山市恺特自动化科技有限公司 工业机器人通用编程方法及编程器
IT202100001268A1 (it) * 2021-01-25 2022-07-25 Yk Robotics S R L Sistema e metodo di configurazione e programmazione di una cella robotica

Similar Documents

Publication Publication Date Title
JP6676286B2 (ja) 情報処理方法、および情報処理装置
Sanfilippo et al. Controlling Kuka industrial robots: Flexible communication interface JOpenShowVar
Chitta MoveIt!: an introduction
US9387589B2 (en) Visual debugging of robotic tasks
JP6950347B2 (ja) 情報処理装置、情報処理方法およびプログラム
US20190143524A1 (en) Programming assistance apparatus, robot system, and method for generating program
KR102645817B1 (ko) 로봇의 비헤이비어 관리 방법 및 장치
Sanfilippo et al. JOpenShowVar: an open-source cross-platform communication interface to kuka robots
EP3864480B1 Marquage d'objet en soutien de tâches réalisées par des machines autonomes
CN111452042B (zh) 一种机械臂的控制方法、系统及控制终端
WO2018176025A1 (fr) Système et procédé de conception de systèmes autonomes
WO2019021045A1 (fr) Procédé et système permettant une action d'un robot industriel fondée sur un paramètre
WO2019021044A1 (fr) Procédé et système destinés à effectuer une manœuvre au moyen d'un robot industriel
Martinez et al. Setup of the yaskawa sda10f robot for industrial applications, using ros-industrial
Lages et al. An architecture for controlling the barrett wam robot using ros and orocos
Zou et al. Development of robot programming system through the use of augmented reality for assembly tasks
Park et al. Development of Digital twin for Plug-and-Produce of a Machine tending system through ISO 21919 interface
Denoun et al. Grasping robot integration and prototyping: The grip software framework
US20220172107A1 (en) Generating robotic control plans
Wojtynek et al. InteractiveWorkspace Layout focusing on the reconfiguration with collaborative robots in modular production systems
Vahrenkamp et al. High-level robot control with ArmarX
EP4254098A1 Commande d'un systéme d'automatisation comprenant une pluralité de machines
Sartori et al. Agile visual programming of a multi-robot cyber-physical system with operator interaction
Burgh-Oliván et al. ROS-Based Multirobot System for Collaborative Interaction
Kopacek et al. Robot retrofitting by using LinuxCNC complemented with arduino/RaspberryPI

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17787960

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17787960

Country of ref document: EP

Kind code of ref document: A1