WO2018210656A1 - Augmented reality for collaborative interventions - Google Patents

Augmented reality for collaborative interventions

Info

Publication number
WO2018210656A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
controller
space
shared portion
shared
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/EP2018/062013
Other languages
English (en)
French (fr)
Inventor
Ashish PANSE
Molly Lara FLEXMAN
Atul Gupta
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips NV filed Critical Koninklijke Philips NV
Priority to EP18726934.5A priority Critical patent/EP3625649A1/en
Priority to US16/613,607 priority patent/US11069146B2/en
Priority to JP2019563618A priority patent/JP2020520521A/ja
Priority to CN201880047279.5A priority patent/CN110914789A/zh
Publication of WO2018210656A1 publication Critical patent/WO2018210656A1/en
Anticipated expiration (legal status: Critical)
Ceased (current legal status: Critical)

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00 Manipulating 3D models or images for computer graphics
    • G06T19/006 Mixed reality
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/017 Head mounted
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/40 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/63 ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for local operation
    • G PHYSICS
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H80/00 ICT specially adapted for facilitating communication between medical practitioners or patients, e.g. for collaborative diagnosis, therapy or health monitoring
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods
    • A61B2017/00017 Electrical control of surgical instruments
    • A61B2017/00216 Electrical control of surgical instruments with eye tracking or head position tracking control
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2048 Tracking techniques using an accelerometer or inertia sensor
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B2090/365 Correlation of different images or relation of image positions in respect to the body augmented reality, i.e. correlating a live optical image with another image
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/371 Surgical systems with images on a monitor during operation with simultaneous use of two cameras
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 Surgical systems with images on a monitor during operation
    • A61B2090/372 Details of monitor hardware
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50 Supports for surgical instruments, e.g. articulated arms
    • A61B2090/502 Headgear, e.g. helmet, spectacles
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/0132 Head-up displays characterised by optical features comprising binocular systems
    • G02B2027/0134 Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0101 Head-up displays characterised by optical features
    • G02B2027/014 Head-up displays characterised by optical features comprising information/image processing systems
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01 Head-up displays
    • G02B27/0179 Display position adjusting means not related to the information to be displayed
    • G02B2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye

Definitions

  • FIG. 7 illustrates a 3D space with a shared virtual object fixed at a location within the 3D space for augmented reality for collaborative interventions, in accordance with an aspect of the present disclosure
  • FIG. 10 illustrates a control table for augmented reality for collaborative interventions, in accordance with a representative embodiment
  • FIG. 14 illustrates timelines for augmented reality for collaborative interventions, in accordance with a representative embodiment
  • FIG. 16B illustrates another process flow for augmented reality for collaborative interventions, in accordance with representative embodiments.
  • mapping and registering can also be performed at the time of the AR session.
  • a 3D space can also be dynamically mapped after an augmented reality session begins, such as by using the sensors of augmented reality devices to collect parameters of a 3D space and then using image recognition to register the physical parameters and physical items in the 3D space.
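As an illustration of this dynamic-mapping step, the sketch below runs sensor frames from AR devices through an image recognizer and registers the detections into a map of the space. All names here (PhysicalItem, SpaceMap, map_space_dynamically, the recognize callback) are hypothetical, not taken from the disclosure.

```python
# Hypothetical sketch of dynamic 3D-space mapping: sensor frames from the
# augmented reality devices are passed through an image recognizer, and each
# detected physical item is registered into a map of the 3D space.
from dataclasses import dataclass, field
from typing import Callable, Iterable, List, Tuple

@dataclass
class PhysicalItem:
    label: str                            # e.g. "operating table", from image recognition
    position: Tuple[float, float, float]  # center in the room's coordinate frame
    extent: Tuple[float, float, float]    # bounding-box dimensions

@dataclass
class SpaceMap:
    items: List[PhysicalItem] = field(default_factory=list)

    def register(self, item: PhysicalItem) -> None:
        self.items.append(item)

def map_space_dynamically(sensor_frames: Iterable[dict],
                          recognize: Callable[[dict], List[dict]]) -> SpaceMap:
    """Map the 3D space after the AR session begins, using device sensors."""
    space_map = SpaceMap()
    for frame in sensor_frames:
        for det in recognize(frame):      # image recognition on each sensor frame
            space_map.register(
                PhysicalItem(det["label"], det["position"], det["extent"]))
    return space_map
```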
  • an augmented reality session is initiated.
  • An augmented reality session may be considered to start when a first subject wearing a head-mountable device enters a 3D space.
  • multiple augmented reality sessions may take place simultaneously, wherein each of multiple different subjects wearing head-mountable devices individually enters the 3D space.
  • the head-mountable devices may be pre-programmed or preauthorized to access the shared portion of the 3D space which is occupied by a virtual reality object.
  • augmented reality sessions may correspond to a medical intervention occurring in a pre-mapped operating room serving as the 3D space, and each subject authorized to access the virtual reality object may access information displayed via the virtual reality object during the medical intervention.
  • the first visual information and the first shared portion are not restricted to head-mountable devices. Rather, head-mountable devices are readily explained in the context of a 3D space such as an operating room and are used as examples for convenience herein.
  • the virtual reality object occupying the first shared portion may be displayed to any authorized subject with a view of the 3D space that includes the shared portion. For example, even a remote user watching the 3D space via a camera may have access to the first shared portion when the remote user is so authorized.
  • users with access to a first shared portion may use projected images, transparent heads-up displays, and other forms of augmented reality devices.
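The access model in the preceding items amounts to a per-subject authorization check on the shared portion. A minimal sketch follows, assuming hypothetical SharedPortion and subject-ID names; a real system would tie authorization to the pre-programmed or pre-authorized devices described above.

```python
# Hypothetical sketch of per-subject access to a shared portion of the 3D
# space: the virtual object in the shared portion is visible to any
# authorized subject with a view of that portion, including a remote user
# watching via camera.
from dataclasses import dataclass, field
from typing import Set

@dataclass
class SharedPortion:
    region_id: str
    authorized: Set[str] = field(default_factory=set)  # pre-authorized subject IDs

    def authorize(self, subject_id: str) -> None:
        self.authorized.add(subject_id)

    def visible_to(self, subject_id: str, has_view: bool) -> bool:
        # has_view may come from a head-mountable device, a projected image,
        # a transparent heads-up display, or a remote camera view.
        return has_view and subject_id in self.authorized
```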
  • the 3D space 300 is an enclosure such as an operating room, and may be pre-mapped so that every physical object in the 3D space 300 is mapped in preparation for the augmented reality for collaborative interventions described herein.
  • the pre-mapping of the 3D space may therefore be used to provide each of the five subjects 381-385 with augmented reality. That is, the pre-mapping provides physical constraints that cannot be altered, whereas virtual objects can be provided in the 3D space in locations that do not conflict with the pre-mapped physical constraints.
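One way to read "locations that do not conflict with the pre-mapped physical constraints" is as a collision test against the registered items. The sketch below, reusing the hypothetical SpaceMap above, uses an axis-aligned bounding-box overlap check.

```python
# Hypothetical placement check: a virtual object may occupy a location only
# if its bounding box does not intersect any pre-mapped physical item.
def boxes_overlap(pos_a, ext_a, pos_b, ext_b) -> bool:
    """Axis-aligned bounding-box intersection test in 3D (center + extent)."""
    return all(2 * abs(pos_a[i] - pos_b[i]) < (ext_a[i] + ext_b[i]) for i in range(3))

def placement_is_valid(position, extent, space_map: "SpaceMap") -> bool:
    return not any(boxes_overlap(position, extent, item.position, item.extent)
                   for item in space_map.items)
```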
  • a medical equipment computer #1 381 is used to obtain and display medical information from sensors on or around the patient.
  • An augmented reality interface 386 is used to selectively provide the medical information to personnel via augmented reality.
  • the transparent optical displays 211a, 211b may, for example, simultaneously allow subjects to view the physical world and artificially generated virtual objects.
  • the transparent optical displays 211a, 211b may include, for example, transparent and polarized OLEDs, light-guide optical elements, and similar materials arranged in a matrix that can be individually and logically controlled, i.e., without a projected beam as in FIG. 4.
  • Examples of the elements and materials that can be used for the transparent optical displays 211a, 211b include electroluminescent display elements, liquid crystal display (LCD) elements, waveguides, and reflective coatings.
  • the computer system 600 may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment.
  • the computer system 600 can also be implemented as or incorporated into various devices, such as a head-mountable device, a stationary computer, a mobile computer, a personal computer (PC), a laptop computer, a tablet computer, a wireless smart phone, a personal digital assistant (PDA), a communications device, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine.
  • different shared regions may be highlighted in different colors, such as a yellow border or shading for a first shared region controlled by a user and a green border or shading for a second shared region controlled by the same user.
  • Other screens may be unshared, i.e., for the user's personal viewing only.
  • a dedicated shared region may be the same fixed location in the room for all users with access to the dedicated shared region, as was the case with shared virtual object 710 in FIG. 7, and as may be useful when, for example, a virtual screen is overlaid onto an existing physical object in the 3D space. Additionally, a screen within a dedicated shared region may be oriented perpendicular to each user in a 3D space, as determined using sensors of the augmented reality devices used by the users. Additionally, a 3D hologram may be shared as an augmented reality object.
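The perpendicular orientation described above is essentially a billboard rotation toward each user, and the per-region highlight colors reduce to a lookup. A minimal sketch with hypothetical names:

```python
import math

# Hypothetical per-region highlight colors for shared regions controlled by
# one user, e.g. yellow for the first region and green for the second.
HIGHLIGHT_COLORS = {"shared_region_1": "yellow", "shared_region_2": "green"}

def screen_yaw_toward_user(screen_pos, user_pos) -> float:
    """Yaw (radians, about the vertical axis; assumes a y-up frame) that
    turns a virtual screen's normal toward the user, orienting the screen
    perpendicular to the user's line of sight as reported by the sensors
    of the user's augmented reality device."""
    dx = user_pos[0] - screen_pos[0]
    dz = user_pos[2] - screen_pos[2]
    return math.atan2(dx, dz)
```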
  • FIG. 13 illustrates another control process for augmented reality for collaborative interventions, in accordance with a representative embodiment.
  • the process starts with a default display of default visual information at S1302.
  • the default visual information may be from a feed designated by a controller of a shared portion of the 3D space, such as a feed from a piece of equipment monitoring a patient or a feed showing fixed images such as X-ray images.
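The default-display step can be summarized as a display that falls back to the controller-designated feed. The class below is a hypothetical sketch of that behavior, with the S1302 default taken as a constructor argument:

```python
# Hypothetical sketch of the default-display behavior: the shared display
# shows the feed designated by the controller of the shared portion (e.g. a
# patient-monitor feed or fixed X-ray images) until another feed is selected,
# and reverts to that default on reset.
class SharedDisplay:
    def __init__(self, default_feed: str):
        self.default_feed = default_feed
        self.current_feed = default_feed

    def select_feed(self, feed: str) -> None:
        self.current_feed = feed

    def reset_to_default(self) -> None:
        self.current_feed = self.default_feed
```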

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • Primary Health Care (AREA)
  • Biomedical Technology (AREA)
  • Epidemiology (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Urology & Nephrology (AREA)
  • Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Pathology (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)
PCT/EP2018/062013 2017-05-16 2018-05-09 Augmented reality for collaborative interventions Ceased WO2018210656A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP18726934.5A EP3625649A1 (en) 2017-05-16 2018-05-09 Augmented reality for collaborative interventions
US16/613,607 US11069146B2 (en) 2017-05-16 2018-05-09 Augmented reality for collaborative interventions
JP2019563618A JP2020520521A (ja) 2017-05-16 2018-05-09 Augmented reality for collaborative interventions
CN201880047279.5A CN110914789A (zh) 2017-05-16 2018-05-09 Augmented reality for collaborative interventions

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762506916P 2017-05-16 2017-05-16
US62/506,916 2017-05-16

Publications (1)

Publication Number Publication Date
WO2018210656A1 true WO2018210656A1 (en) 2018-11-22

Family

ID=62244456

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2018/062013 Ceased WO2018210656A1 (en) 2017-05-16 2018-05-09 Augmented reality for collaborative interventions

Country Status (5)

Country Link
US (1) US11069146B2 (en)
EP (1) EP3625649A1 (en)
JP (1) JP2020520521A (en)
CN (1) CN110914789A (en)
WO (1) WO2018210656A1 (en)

Families Citing this family (21)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20200021668A1 (en) * 2018-07-13 2020-01-16 Merge Labs, Inc. Dynamic augmented reality collaboration system using a trackable three-dimensional object
EP4014888B1 (en) * 2019-08-15 2024-11-06 FUJIFILM Corporation Ultrasonic system and method for controlling ultrasonic system
GB2593473B (en) * 2020-03-23 2024-09-04 Cmr Surgical Ltd Virtual console for controlling a surgical robot
EP3926642A1 (en) * 2020-06-17 2021-12-22 Koninklijke Philips N.V. Dynamic positioning for a virtual display
JP7187520B2 (ja) * 2020-11-25 2022-12-12 SoftBank Corp. Glasses-type device, management server, and program
US12118677B2 (en) * 2020-12-22 2024-10-15 Arkh, Inc. Spatially aware environment interaction
JP2022102885A (ja) * 2020-12-25 2022-07-07 FUJIFILM Business Innovation Corp. Information processing apparatus, information processing system, and program
US20220223270A1 (en) * 2021-01-08 2022-07-14 Expanded Existence, Llc System and method for medical procedure room supply and logistics management
JP7713189B2 (ja) * 2021-02-08 2025-07-25 Sightful Computers Ltd Content sharing in extended reality
US12324708B2 (en) 2021-02-17 2025-06-10 Derek Duong Augmented reality dental surgery
US11915377B1 (en) * 2021-02-18 2024-02-27 Splunk Inc. Collaboration spaces in networked remote collaboration sessions
US12112435B1 (en) * 2021-02-18 2024-10-08 Splunk Inc. Collaboration spaces in extended reality conference sessions
US12086920B1 (en) 2021-02-18 2024-09-10 Splunk Inc. Submesh-based updates in an extended reality environment
US12106419B1 (en) 2021-02-18 2024-10-01 Splunk Inc. Live updates in a networked remote collaboration session
US12488529B1 (en) 2021-02-18 2025-12-02 Cisco Technology, Inc. Mesh retexturing in an extended reality environment
US12056416B2 (en) * 2021-02-26 2024-08-06 Samsung Electronics Co., Ltd. Augmented reality device and electronic device interacting with augmented reality device
KR20230160926A (ko) * 2021-03-31 2023-11-24 Snap Inc. User-defined contextual spaces
CN113706718B (zh) * 2021-07-21 2024-10-29 广州中智达信科技有限公司 Augmented reality collaboration method, system, and application
EP4595015A1 (en) 2022-09-30 2025-08-06 Sightful Computers Ltd Adaptive extended reality content presentation in multiple physical environments
CN116392245A (zh) * 2023-04-07 2023-07-07 河北瑞鹤医疗器械有限公司 Surgical navigation system and surgical navigation method
CN116473674A (zh) * 2023-04-07 2023-07-25 河北瑞鹤医疗器械有限公司 Surgical navigation system and surgical navigation method

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2007108776A2 (en) * 2005-12-31 2007-09-27 Bracco Imaging S.P.A. Systems and methods for collaborative interactive visualization of 3d data sets over a network ('dextronet')
JP5776201B2 (ja) * 2011-02-10 2015-09-09 Sony Corporation Information processing device, information sharing method, program, and terminal device
EP3654146A1 (en) * 2011-03-29 2020-05-20 QUALCOMM Incorporated Anchoring virtual images to real world surfaces in augmented reality systems
JP6252004B2 (ja) * 2013-07-16 2017-12-27 Seiko Epson Corporation Information processing apparatus, information processing method, and information processing system
JP6264087B2 (ja) * 2014-02-21 2018-01-24 Sony Corporation Display control device, display device, and display control system
JP6328579B2 (ja) * 2015-03-13 2018-05-23 FUJIFILM Corporation Virtual object display system, display control method therefor, and display control program
WO2016158000A1 (ja) * 2015-03-30 2016-10-06 Sony Corporation Information processing apparatus, information processing method, and information processing system
CN105117021A (zh) * 2015-09-24 2015-12-02 深圳东方酷音信息技术有限公司 Method for generating virtual reality content and playback device

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130293468A1 (en) * 2012-05-04 2013-11-07 Kathryn Stone Perez Collaboration environment using see through displays
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects
WO2016133644A1 (en) * 2015-02-20 2016-08-25 Covidien Lp Operating room and surgical site awareness

Also Published As

Publication number Publication date
JP2020520521A (ja) 2020-07-09
US20200105068A1 (en) 2020-04-02
EP3625649A1 (en) 2020-03-25
CN110914789A (zh) 2020-03-24
US11069146B2 (en) 2021-07-20

Similar Documents

Publication Publication Date Title
US11069146B2 (en) Augmented reality for collaborative interventions
KR102833872B1 (ko) Methods for manipulating objects in an environment
JP7713597B2 (ja) Devices, methods, and graphical user interfaces for capturing and displaying media
US11740757B2 (en) Virtual cover for user interaction in augmented reality
KR20250065914A (ko) Methods for depth conflict mitigation in three-dimensional environments
CN108780360B (zh) Virtual reality navigation
CN107209386B (zh) Augmented reality field-of-view object follower
US20210398316A1 (en) Systematic positioning of virtual objects for mixed reality
CN108369449A (zh) Third-party holographic portal
CN116648683A (zh) Method and system for selecting objects
US20250118036A1 (en) Systems and methods for region-based presentation of augmented content
CN119512362A (zh) Method for displaying user interface elements related to media content
CN120712546A (zh) Method for displaying user interface objects in a three-dimensional environment
CN120447805A (zh) User interfaces for managing content sharing in a three-dimensional environment
CN119948434A (zh) User interfaces including representations of environments
CN118435158A (zh) Devices, methods, and graphical user interfaces for capturing and displaying media

Legal Events

Code Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 18726934; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2019563618; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2018726934; Country of ref document: EP; Effective date: 20191216)