EP3227743A1 - Method for calibrating an augmented reality visual rendering system comprising at least one display device that is partially transparent with respect to the user thereof, and associated system - Google Patents
Info
- Publication number
- EP3227743A1 (application EP15804385.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- curve
- dimensional
- calibration
- display device
- user
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2504—Calibration devices
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60J—WINDOWS, WINDSCREENS, NON-FIXED ROOFS, DOORS, OR SIMILAR DEVICES FOR VEHICLES; REMOVABLE EXTERNAL PROTECTIVE COVERINGS SPECIALLY ADAPTED FOR VEHICLES
- B60J9/00—Devices not provided for in one of main groups B60J1/00 - B60J7/00
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0189—Sight systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/327—Calibration thereof
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/014—Head-up displays characterised by optical features comprising information/image processing systems
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B2027/0178—Eyeglass type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0179—Display position adjusting means not related to the information to be displayed
- G02B2027/0181—Adaptation to the pilot/driver
Definitions
- a method of calibrating an augmented reality visual rendering system comprising at least one partially transparent display device, relative to its user, and associated system
- the present invention relates to a method for calibrating an augmented reality visual rendering system comprising at least one partially transparent display device, relative to its user, and to an associated system.
- augmented reality visual rendering systems comprising at least one partially transparent display device are known, such as "Google Glass", which is described in patent application US 2013/0044042. Augmented reality visual rendering systems such as augmented reality windshields are also known.
- Such systems often include partially transparent display devices, often loosely called semi-transparent.
- Partially transparent display devices include semi-transparent goggles, semi-transparent screens, semi-transparent surface projection systems, etc.
- the advantage of these semi-transparent display devices lies in the fact that the user continues to perceive the real environment without latency and from the same point of view that they would have without the display device.
- the display device can be used to highlight an item in the actual scene.
- the activated display area must then be projected onto the retina of the user's eye in the same place as the image of the real object. It is then necessary to calibrate the "eye-display device" system in order to have a model characterizing how an image displayed on the display device is projected onto the retina of the eye.
- the position of the display device with respect to the eye can vary from one user to another (for example because of differences in inter-pupillary distance between users), or even from one use to the next for the same user.
- a calibration performed in the factory therefore cannot provide an optimal alignment of the image displayed on the glasses with the user's natural vision. It is therefore useful to allow the user to calibrate the device, or refine a previous calibration, in order to obtain a calibration adapted to the current conditions of use.
- the main technical problem is to provide an ergonomic solution that allows the user to carry out a calibration, or fine-tune the factory calibration, at the site of use.
- this method assumes that the user's eye is placed in the same location as the camera, whereas the glasses can be worn by users whose head morphology varies.
- the calibration process requires a mannequin head equipped with a camera. While this constraint is compatible with factory calibration, it is not compatible with calibration outside the factory. Other known approaches rely on user interactions to perform a calibration.
- the system successively displays points on the glasses, and the user must align each of these 2D points (in the two-dimensional space of the display device) with a 3D point (in the user's space) whose coordinates are known in the reference frame of the user's head.
- This procedure has several disadvantages. First, for the calibration to be accurate over the whole screen, the test pattern must cover the entire screen or be moved as the calibration process progresses. Such a procedure is therefore possible in the context of a factory calibration but is not suitable for a calibration procedure performed at the site of operation of the device.
- This step, which requires the user to remain stationary, is performed a large number of times (once for each association between a 3D point and a 2D point), which generally causes significant muscle fatigue for the user and makes the calibration process long to perform. This lack of ergonomics can then cause the end user to reject the device.
- the main limitation of this solution is that the user has to move his hand and head to align the pen tip with the displayed 2D point, then remain still while the alignment is validated; during this time the user must move neither hand nor head. Accurate calibration requires repeating this step for dozens of 2D points, which quickly causes muscle fatigue and makes the calibration process slow.
- An object of the invention is to overcome these problems.
- a method of calibrating an augmented reality visual rendering system comprising at least one partially transparent display device, with respect to its user, comprising performing, at least once, the following steps:
- the present invention reduces the number of user interactions, and thus allows for calibration in a shorter time and more ergonomically.
- said step of mapping the three-dimensional curve to the two-dimensional calibration curve displayed on the display device comprises the substeps of:
- the method further comprises, iteratively, the following steps consisting of:
- the calibration is improved without requiring the user to repeat the pointing gesture. For example, said iteration is stopped when a convergence criterion is reached.
- the calibration curve(s) comprise at least one curve having at least one discontinuity, and/or at least one polygonal curve.
- the calibration curve(s) comprise at least one curve whose display on the display device ensures that no point of the display device is at a distance from the curve exceeding 25% of the diagonal of the display.
- the calibration curve(s) comprise at least one Hilbert curve, and/or at least one Peano curve, and/or at least one Lebesgue curve.
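The patent does not say how such space-filling curves are generated. As a non-authoritative sketch, a Hilbert curve of a given order can be built with the classic d2xy index-to-coordinate construction and scaled to the display; this also makes the 25%-of-diagonal coverage criterion above easy to verify. The screen size (1280x720), the random-sampling check, and all function names are illustrative assumptions, not part of the patent.

```python
import math
import random

def hilbert_points(order):
    """Vertices of a Hilbert curve of the given order on a (2^order x 2^order)
    grid, using the classic d2xy index-to-coordinate conversion."""
    n = 2 ** order
    pts = []
    for d in range(n * n):
        x = y = 0
        t = d
        s = 1
        while s < n:
            rx = 1 & (t // 2)
            ry = 1 & (t ^ rx)
            if ry == 0:                       # rotate the quadrant when needed
                if rx == 1:
                    x, y = s - 1 - x, s - 1 - y
                x, y = y, x
            x += s * rx
            y += s * ry
            t //= 4
            s *= 2
        pts.append((x, y))
    return pts

def coverage_ok(points, width, height, frac=0.25, samples=2000, seed=0):
    """Check that no sampled screen position is farther from the curve's
    vertices than `frac` of the screen diagonal (cf. the 25% criterion)."""
    rng = random.Random(seed)
    limit = frac * math.hypot(width, height)
    for _ in range(samples):
        px, py = rng.uniform(0, width), rng.uniform(0, height)
        if min(math.hypot(px - x, py - y) for x, y in points) > limit:
            return False
    return True

# Scale an order-3 curve (8x8 grid) to a hypothetical 1280x720 display.
n = 2 ** 3
curve = [(x / (n - 1) * 1280, y / (n - 1) * 720) for x, y in hilbert_points(3)]
```

For this curve, `coverage_ok(curve, 1280, 720)` returns True: the order-3 curve visits every cell of an 8x8 grid, so no screen point is more than about half a cell diagonal (roughly 105 pixels) from a vertex, well under 25% of the 1469-pixel diagonal.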
- said calibration applies to an augmented reality goggle system.
- said calibration can be applied to an augmented reality windshield system.
- an augmented reality visual rendering system adapted to implement the method as previously described.
- the augmented reality visual rendering system is an augmented reality goggle system or an augmented reality windshield system.
- the example mainly described uses a single calibration curve, in a non-limiting manner; alternatively, a plurality of calibration curves can be used successively.
- Figure 1 schematically illustrates a system and method according to one aspect of the invention.
- FIG. 1 shows the user's eye 1, the user's reference frame 2, a partially transparent display device 3 (in this case a partially transparent screen of augmented reality glasses) linked in motion to the user's head according to a rigid transformation that is constant over time, and a reference frame 4 of the partially transparent screen 3.
- a two-dimensional calibration curve CE2D is displayed on the partially transparent screen 3.
- a pointing device 5, in this case a stylus, is also shown, with its tip 6.
- the displays of the two-dimensional calibration curve CE2D and of the three-dimensional curve are represented.
- a projected two-dimensional curve CP2D on the semi-transparent screen 3 can, for example, be used as an intermediate curve for matching the three-dimensional curve C3D and the two-dimensional calibration curve CE2D displayed on the semi-transparent screen 3.
- the pointing device 5 provides a 3D position expressed in the reference frame 4 rigidly linked to the partially transparent screen 3.
- the stylus 5 and the glasses 3 are located by the same tracking device, such as a camera attached to the glasses, or a magnetic or acoustic tracking device whose receivers are fixed on the glasses and on the stylus 5. The positions of the glasses and of the stylus are therefore known in a common reference frame 4 (that of the tracking device).
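Since the stylus position is measured in the tracker's frame but needed in the screen's reference frame 4, a homogeneous rigid transform can perform the change of frame. A minimal sketch, where the transform name, its value, and the point coordinates are illustrative assumptions:

```python
import numpy as np

def to_screen_frame(T_screen_from_tracker, p_tracker):
    """Express a 3D point measured in the tracker's frame in the screen's
    reference frame 4, via a 4x4 homogeneous rigid transform."""
    ph = np.append(np.asarray(p_tracker, dtype=float), 1.0)  # homogeneous coords
    return (T_screen_from_tracker @ ph)[:3]

# Hypothetical transform: tracker frame offset by (0.1, 0.0, -0.05) m from the screen frame.
T = np.eye(4)
T[:3, 3] = [0.1, 0.0, -0.05]
tip_in_screen = to_screen_frame(T, [0.2, 0.3, 0.5])
```

The same helper applies whether the tracker is a camera rigidly attached to the glasses or an external magnetic/acoustic system, as long as its calibration to frame 4 is known.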
- U0 denotes the parametric or non-parametric representation of the calibration curve.
- the user is prompted to move the 3D pointing device 5 so that the trajectory described by its tip 6 and the 2D calibration curve displayed on the partially transparent screen 3 appear aligned from the point of view of the user's eye 1.
- the 3D trajectory is stored by the system; this trajectory is denoted V0.
- the pointing device 5 can be a stylus whose tip 6 can be located using a magnetic, optical, mechanical, acoustic or laser tracking device, etc.
- the pointing device can also be the user's finger, whose end can be located using an optical system such as a 3D camera, a stereovision head, etc.
- an approximate calibration of the device can be used to perform the sub-step of projecting the 3D curve C3D (the 3D trajectory of the pointer 5) onto the partially transparent screen 3, providing a projected 2D curve CP2D and an association between points of the 3D curve and points of the projected 2D curve CP2D.
- each point of the 3D trajectory C3D can then be associated with the point of the calibration curve CE2D closest to its corresponding projection on the projected curve CP2D.
- the trajectory V0 is sampled into a set of 3D points.
- each point Xj0 is projected in 2D onto the partially transparent screen 3 according to the current calibration, providing a 2D point denoted Yj0, as illustrated in FIG. 2.
- each point Yj0 is then associated with the point Zj0 of the calibration curve CE2D displayed on the semi-transparent screen 3 (of parametric or non-parametric representation U0) that is closest to it; the distance used can be any norm, such as the Euclidean norm, the L1 norm, the infinity norm, etc., as shown in FIG. 3. If no approximate calibration of the device is available, the geometric properties of the 2D calibration curve CE2D can be used to establish this correspondence. Thus, if the displayed calibration curve CE2D is polygonal, the 3D trajectory C3D of the pointer 5 can be analyzed in order to identify the points of the trajectory that correspond to the vertices of a polygonal curve.
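A minimal sketch of this nearest-point association, assuming the displayed calibration curve has been densely sampled into 2D points (the function name and the Euclidean norm are choices for illustration; any of the norms mentioned above could be substituted):

```python
import math

def associate(projected_pts, curve_pts):
    """Associate each projected 2D point Yj with the closest point Zj of the
    (densely sampled) displayed calibration curve, under the Euclidean norm."""
    return [min(curve_pts, key=lambda z: math.dist(y, z)) for y in projected_pts]
```

For example, `associate([(0.0, 0.0), (4.0, 4.1)], [(1.0, 0.0), (4.0, 4.0)])` returns `[(1.0, 0.0), (4.0, 4.0)]`.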
- a transfer function can be estimated and then used to associate each point of the 3D trajectory C3D with a point of the 2D calibration curve CE2D.
- the new calibration can be estimated using a nonlinear optimization algorithm (gradient descent, Levenberg-Marquardt, dog-leg, etc.) to modify these initial values so that the respective projections {Yj0} of the points {Xj0} according to the new calibration are closer to the respective points {Zj0} of the displayed 2D calibration curve CE2D associated with them.
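The patent names the optimizers without detailing a projection model. As an illustrative stand-in, the sketch below refines a deliberately simplified model p = (f, cx, cy) (an assumption, not the patent's calibration model) by Gauss-Newton with a numeric Jacobian, pulling the projections {Yj} of the 3D points {Xj} toward their associated curve points {Zj}:

```python
import numpy as np

def project(p, X):
    """Project 3D points X (N x 3, z > 0) with a simplified model p = (f, cx, cy).
    This stands in for the system's real eye-display calibration model."""
    f, cx, cy = p
    return np.stack([f * X[:, 0] / X[:, 2] + cx,
                     f * X[:, 1] / X[:, 2] + cy], axis=1)

def refine(p0, X, Z, iters=10, eps=1e-6):
    """Gauss-Newton refinement: adjust p so the projections of X move toward
    the associated calibration-curve points Z (N x 2)."""
    p = np.asarray(p0, dtype=float)
    for _ in range(iters):
        r = (project(p, X) - Z).ravel()          # stacked residuals
        J = np.empty((r.size, p.size))
        for k in range(p.size):                  # numeric Jacobian, column by column
            dp = np.zeros_like(p)
            dp[k] = eps
            J[:, k] = ((project(p + dp, X) - Z).ravel() - r) / eps
        p = p - np.linalg.lstsq(J, r, rcond=None)[0]
    return p
```

Because this toy model happens to be linear in its parameters, a single Gauss-Newton step already solves the least-squares problem; the patent's real calibration model would generally be nonlinear and need several iterations (or Levenberg-Marquardt damping for robustness).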
- the step of matching the 3D curve to the 2D calibration curve displayed on the semi-transparent screen, and the calibration step, can be repeated to increase the quality of the calibration.
- the result of the calibration step is used as the approximate calibration of the next iteration.
- since this calibration is more accurate than the one used in the step of matching the path C3D and the displayed calibration curve CE2D at the previous iteration, the new mapping of the path C3D to the displayed calibration curve CE2D will be performed with greater precision.
- This iterative improvement of the mapping makes it possible to improve the precision of the calibration of the next iteration.
- the iterative process therefore makes it possible to improve both the mapping between the path C3D and the displayed calibration curve CE2D, and the calibration.
- the steps of displaying a calibration curve and recording a three-dimensional curve described by the user by means of a three-dimensional pointing device can be repeated several times, with different curves before a calibration step.
- a curve displayed on the screen may vary in shape each time, before performing the mapping step between a 3D curve and the corresponding calibration curve.
- Yji denotes the projection of a sampled point Xji according to the current calibration.
- each point Yji is then associated with the nearest corresponding 2D point of the 2D calibration curve, this point being denoted Zji.
- a computer program can be written in any form of programming language, including compiled or interpreted languages, and the computer program can be deployed in any form, including as a stand-alone program or as a subroutine, element, or other unit suitable for use in a computing environment.
- a computer program can be deployed to run on one computer or multiple computers at a single site or spread across multiple sites and interconnected by a communications network.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Mechanical Engineering (AREA)
- Processing Or Creating Images (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1461866A FR3029649B1 (en) | 2014-12-03 | 2014-12-03 | METHOD FOR CALIBRATING AN INCREASED REALITY VISUAL RESTITUTION SYSTEM COMPRISING AT LEAST ONE PARTIALLY TRANSPARENT DISPLAY DEVICE, IN RELATION TO ITS USER, AND ASSOCIATED SYSTEM |
PCT/EP2015/078139 WO2016087407A1 (en) | 2014-12-03 | 2015-12-01 | Method for calibrating an augmented reality visual rendering system comprising at least one display device that is partially transparent with respect to the user thereof, and associated system |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3227743A1 true EP3227743A1 (en) | 2017-10-11 |
Family
ID=52988163
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP15804385.1A Withdrawn EP3227743A1 (en) | 2014-12-03 | 2015-12-01 | Method for calibrating an augmented reality visual rendering system comprising at least one display device that is partially transparent with respect to the user thereof, and associated system |
Country Status (4)
Country | Link |
---|---|
US (1) | US10415959B2 (en) |
EP (1) | EP3227743A1 (en) |
FR (1) | FR3029649B1 (en) |
WO (1) | WO2016087407A1 (en) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114593688B (en) * | 2022-03-03 | 2023-10-03 | 惠州Tcl移动通信有限公司 | Three-dimensional measurement method and device based on AR (augmented reality) glasses, AR glasses and storage medium |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020105484A1 (en) | 2000-09-25 | 2002-08-08 | Nassir Navab | System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality |
US9285592B2 (en) | 2011-08-18 | 2016-03-15 | Google Inc. | Wearable device with input and output structures |
TWI486629B (en) * | 2012-11-21 | 2015-06-01 | Ind Tech Res Inst | Optical-see-through head mounted display system and interactive operation |
US9477315B2 (en) * | 2013-03-13 | 2016-10-25 | Honda Motor Co., Ltd. | Information query by pointing |
US10019057B2 (en) * | 2013-06-07 | 2018-07-10 | Sony Interactive Entertainment Inc. | Switching mode of operation in a head mounted display |
-
2014
- 2014-12-03 FR FR1461866A patent/FR3029649B1/en not_active Expired - Fee Related
-
2015
- 2015-12-01 EP EP15804385.1A patent/EP3227743A1/en not_active Withdrawn
- 2015-12-01 US US15/524,988 patent/US10415959B2/en not_active Expired - Fee Related
- 2015-12-01 WO PCT/EP2015/078139 patent/WO2016087407A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US10415959B2 (en) | 2019-09-17 |
WO2016087407A1 (en) | 2016-06-09 |
FR3029649A1 (en) | 2016-06-10 |
US20170322020A1 (en) | 2017-11-09 |
FR3029649B1 (en) | 2016-12-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3101624B1 (en) | Image processing method and image processing device | |
Itoh et al. | Interaction-free calibration for optical see-through head-mounted displays based on 3d eye localization | |
EP2715662B1 (en) | Method for localisation of a camera and 3d reconstruction in a partially known environment | |
CN105917292B (en) | Utilize the eye-gaze detection of multiple light sources and sensor | |
US9773349B2 (en) | Active parallax correction | |
FR2980681A3 (en) | METHOD OF DETERMINING OCULAR AND OPTICAL MEASUREMENTS OF INTEREST FOR THE MANUFACTURE AND MOUNTING OF GLASSES OF CORRECTIVE GLASSES USING A CAMERA IN A NON-CONTROLLED CONTEXT | |
WO2017074662A1 (en) | Tracking of wearer's eyes relative to wearable device | |
EP2923227B1 (en) | Method for automatically recognising a moving magnetic object | |
EP2828834A2 (en) | Model and method for producing photorealistic 3d models | |
US9733764B2 (en) | Tracking of objects using pre-touch localization on a reflective surface | |
EP2884372B1 (en) | Method for locating mobile magnetic objects presented before an array of magnetometers | |
EP3145405B1 (en) | Method of determining at least one behavioural parameter | |
US20170147142A1 (en) | Dynamic image compensation for pre-touch localization on a reflective surface | |
EP3227743A1 (en) | Method for calibrating an augmented reality visual rendering system comprising at least one display device that is partially transparent with respect to the user thereof, and associated system | |
EP2994813B1 (en) | Method for controlling a graphical interface for displaying images of a three-dimensional object | |
FR3065097B1 (en) | AUTOMATED METHOD FOR RECOGNIZING AN OBJECT | |
EP2626837A1 (en) | System for creating three-dimensional representations from real models having similar and pre-determined characteristics | |
EP3724749B1 (en) | Device for augmented reality application | |
WO2022013492A1 (en) | Computer method, device and programme for assisting in the positioning of extended reality systems | |
EP3038354B1 (en) | A method for displaying images or videos | |
Pham | Integrating a Neural Network for Depth from Defocus with a Single MEMS Actuated Camera | |
FR3048108B1 (en) | METHOD FOR RECOGNIZING THE DISPOSITION OF A HAND IN AN IMAGE STREAM | |
Lima et al. | Device vs. user-perspective rendering in AR applications for monocular optical see-through head-mounted displays | |
FR3043295A1 (en) | SPACE ENHANCED REALITY DEVICE FOR OFFICE ENVIRONMENT | |
EP2725806A1 (en) | Device and method for calibrating stereoscopic display parameters of a virtual reality system |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20170601 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20210701 |