EP3776490A1 - Method and system for displaying a 3d model - Google Patents
Method and system for displaying a 3D model
- Publication number
- EP3776490A1
- Authority
- EP
- European Patent Office
- Prior art keywords
- parts
- model
- control device
- user
- selected area
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Links
- 238000000034 method Methods 0.000 title claims abstract description 36
- 238000004590 computer program Methods 0.000 claims description 7
- 230000003190 augmentative effect Effects 0.000 claims description 3
- 230000033001 locomotion Effects 0.000 description 4
- 230000000875 corresponding effect Effects 0.000 description 2
- 238000012800 visualization Methods 0.000 description 2
- 229910000831 Steel Inorganic materials 0.000 description 1
- 238000007792 addition Methods 0.000 description 1
- 230000005540 biological transmission Effects 0.000 description 1
- 238000004891 communication Methods 0.000 description 1
- 230000001419 dependent effect Effects 0.000 description 1
- 229920000136 polysorbate Polymers 0.000 description 1
- 238000009877 rendering Methods 0.000 description 1
- 239000010959 steel Substances 0.000 description 1
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41805—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by assembly
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05B—CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
- G05B19/00—Programme-control systems
- G05B19/02—Programme-control systems electric
- G05B19/418—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
- G05B19/41885—Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM] characterised by modeling, simulation of the manufacturing system
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/016—Exploded view
Definitions
- the present invention relates to a method for representing a 3D model of an object and a system for displaying such a 3D model of an object.
- in virtual reality environments and/or augmented reality environments, an object can be represented as a 3D model. It may be desirable for a user to view the individual parts of the object in the 3D model, for example to understand how the parts are assembled. For this purpose, it is desirable to display the 3D model in such a way that a visualization of the individual parts of the illustrated object is simplified.
- an object of the present invention is to provide an improved representation of a 3D model of an object.
- a method for displaying a 3D model of an article comprising a plurality of parts arranged at original positions comprises:
- the representation of the 3D model can be changed dynamically and interactively by the user operating the control device. A representation of the 3D model can thus be improved.
- This improved rendering of the 3D model may allow the user to better locate a particular portion of the object, such as a particular screw.
- the user can also better localize a machine in a 3D model of a complex industrial plant. The user can thus "look" in particular at the subject, and the user can know better how the selected parts are assembled, for example, allowing the user to better understand how the object works.
- the article may include a device of an industrial plant, for example, an electric motor.
- the article may be both an electronic device and a mechanical object.
- the object may also be an industrial plant with several machines.
- the 3D model is in particular a 3D representation of the object.
- the 3D model can form a realistic representation of the object.
- the 3D model may be a CAD model.
- the plurality of parts is in particular assembled in such a way that they form the object or a part of the object.
- the parts are, for example, screws, cylinders, housing parts, valves, pistons or the like.
- the parts may also be entire machines, for example motors or machines of an industrial plant.
- the original positions of the parts may be positions in which the parts are assembled to form the article or a part thereof.
- the representation of the object with its parts in the original positions corresponds in particular to a true-to-life and/or realistic representation of the object.
- the control device is actuated, for example, by the user moving the control device and / or actuating a button of the control device.
- the control device may also detect movements of the user and thereby be actuated.
- the control device may be formed as a motion sensor.
- the selected area of the 3D model is, for example, a 3D area of the 3D model.
- the selected area is the area of the object that the user wants to visualize in detail.
- the selected area is, for example, spherical or cuboid.
- the parts of the object that are in the selected area form the selected parts.
- the selection of the selected area leads in particular to the fact that the selected parts are displayed in end positions instead of their original positions.
- the end positions of the selected parts differ in particular from their original positions. In the end positions, the selected parts can be displayed in such a way that distances between the selected parts increase. The distances between the selected parts are in particular larger when the parts are in the end positions than when they are in their original positions.
- the selected parts are in particular represented in a 3D exploded view.
- the 3D model is presented in such a way that the parts of the object that are outside the selected area are displayed in their original positions.
- the selected parts are displayed in original positions.
- the selected parts can thus be highlighted against the non-selected parts outside the selected area.
- the presentation of the 3D model can be further improved.
- the selected parts are moved away from their original positions such that distances of the selected parts to a reference point within the selected area increase.
- the reference point is located in particular in the middle of the selected area.
- the extent to which the distance of a selected part to the reference point is increased is, in particular, proportional to the distance between the reference point and the selected part in its original position.
- the selected parts which are located near the reference point in their original positions are thus in particular moved less than the selected parts which are located further away from the reference point in their original positions.
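The proportional displacement described above can be sketched in a few lines. The function name and the scale factor are illustrative assumptions, not taken from the patent:

```python
import numpy as np

def explode(original_positions, reference_point, factor=2.0):
    """Move each selected part radially away from the reference point.

    The displacement is proportional to the part's original distance
    from the reference point, so parts near the reference point move
    less than parts located further away.
    """
    positions = np.asarray(original_positions, dtype=float)
    offsets = positions - reference_point      # radial vector to each part
    return reference_point + factor * offsets  # end positions, scaled outward

# Two parts around a reference point at the origin; with factor=2.0
# each part's distance to the reference point doubles.
end_positions = explode([[1.0, 0.0, 0.0], [0.0, 2.0, 0.0]],
                        np.zeros(3), factor=2.0)
```

With any factor greater than 1, all pairwise distances between the selected parts also grow, which produces the exploded-view effect.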
- the 3D model is displayed in a virtual reality environment (VR environment) and / or in an augmented reality environment (AR environment).
- VR environment: virtual reality environment
- AR environment: augmented reality environment
- the 3D model is presented in an environment in which it is displayed along with additional information, such as a predetermined text or a predetermined image.
- the 3D model can also be displayed on a 3D screen. This can be a 3D screen of a headset, in particular a VR headset or an AR headset.
- the control device emits virtual beams such that they are only visible in the VR environment and / or in the AR environment and serve to select the selected area when the control device is moved.
- the virtual beams are, in particular, beams that are visible only in the VR environment and/or in the AR environment. They are visible, for example, only to a user with a corresponding headset.
- the virtual rays may resemble the light rays of a flashlight.
- in selecting the selected area, the user directs the virtual rays toward the area of the object that he wishes to select.
- in selecting the selected area, the user directs the virtual beams, in particular, at the 3D model representation of the object.
- the virtual beams in the VR environment and / or the AR environment are frusto-conically emitted by the controller, and a portion of the object cut by the virtual beams forms the selected area.
- the reference point is arranged on a central axis of a truncated cone formed by the frustoconical beams.
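One way to picture the frustoconical selection is a membership test for the truncated cone of virtual beams. The function name and geometry conventions below are assumptions for illustration, not part of the patent:

```python
import numpy as np

def in_selection_cone(point, apex, axis, half_angle_rad, near, far):
    """Return True if `point` lies inside the truncated cone emitted
    from `apex` along the unit vector `axis`, between the depths
    `near` and `far` measured along the central axis."""
    v = np.asarray(point, dtype=float) - apex
    along = float(np.dot(v, axis))             # depth along the central axis
    if not near <= along <= far:               # outside the frustum's depth range
        return False
    radial = np.linalg.norm(v - along * axis)  # distance from the central axis
    return bool(radial <= along * np.tan(half_angle_rad))

apex = np.zeros(3)
axis = np.array([0.0, 0.0, 1.0])               # beam direction of the control device
selected = in_selection_cone([0.0, 0.5, 3.0], apex, axis,
                             half_angle_rad=np.pi / 6, near=1.0, far=5.0)
```

Parts whose positions pass this test would form the selected parts; all others remain in their original positions.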
- the control device is actuated such that:
- a position of the reference point is selected
- a distance of the control device from the reference point is selected
- a size of the selected area is determined.
- the position of the reference point can be selected by moving the control device.
- the distance of the control device to the reference point is selected in particular by making a setting on a setting unit of the control device.
- the extent of increasing the distances between the selected parts can be selected by means of a further setting unit on the control device.
- the size of the selected area can be changed for example by determining an opening angle of the truncated cone.
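A minimal sketch of how the reference point could be derived from the control device's pose and the selected distance. The names `controller_pos`, `controller_dir` and `d` are illustrative assumptions:

```python
import numpy as np

def reference_point(controller_pos, controller_dir, d):
    """Place the reference point at distance d along the central axis
    of the virtual beam cone emitted by the control device."""
    axis = np.asarray(controller_dir, dtype=float)
    axis = axis / np.linalg.norm(axis)   # normalize to a unit central axis
    return np.asarray(controller_pos, dtype=float) + d * axis

# Control device at the origin pointing along +z, reference point 3 units away.
ref = reference_point([0.0, 0.0, 0.0], [0.0, 0.0, 2.0], d=3.0)
```

Moving the control device changes `controller_pos` and `controller_dir`, and thus the position of the reference point; adjusting `d` slides the reference point along the beam axis.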
- the method further comprises:
- the selection of the selected area is cancelled by the user actuating the control device again, for example by moving it away from the selected area.
- upon renewed actuation and/or movement of the control device, the user selects a new selected area.
- the newly selected parts of the newly selected area may then be presented in end positions in which they are moved away from their original positions such that distances between the newly selected parts increase.
- the method further comprises:
- the user may select one of the selected parts and view it more closely, for example.
- the user can also obtain properties about the predetermined part.
- the selection of the predetermined part can take place without the user knowing the name of the part or the part hierarchy.
- the 3D model is displayed in such a way that a transparency of at least some of the parts of the object, in particular the non-selected parts, is increased.
- the remaining parts can be better visualized. If the transparency of the non-selected parts is increased, the selected parts can be better viewed without the unselected parts hiding the selected parts.
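The transparency adjustment could be sketched as follows; the function name and the alpha values are assumptions for illustration:

```python
def fade_unselected(part_ids, selected_ids, selected_alpha=1.0, faded_alpha=0.3):
    """Return an opacity per part: selected parts stay opaque while the
    transparency of all non-selected parts is increased, so the selected
    parts are not hidden by the parts around them."""
    return {pid: (selected_alpha if pid in selected_ids else faded_alpha)
            for pid in part_ids}

# Hypothetical part identifiers; only "cylinder_2" is in the selected area.
alphas = fade_unselected(["screw_1", "cylinder_2", "piston_3"], {"cylinder_2"})
```

Swapping the two alpha arguments gives the opposite variant mentioned above, in which the selected parts are faded and the remaining parts stand out.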
- a computer program product, such as a computer program means, may for example be provided or supplied as a storage medium.
- the system includes:
- a control device configured to be actuated by a user to select a selected area of the 3D model, the parts of the object located in the selected area forming selected parts;
- a display device for displaying the 3D model in such a way that the selected parts are presented in end positions in which they are moved away from their original positions such that distances between the selected parts increase.
- the respective device for example the control device or the display device, can be implemented in terms of hardware and / or software technology.
- the respective device may be configured as a device or as part of a device, for example as a computer or as a microprocessor or as a controller of a vehicle.
- the respective device may be designed as a computer program product, as a function, as a routine, as part of a program code or as an executable object.
- a control device for the system according to the second aspect or according to an embodiment of the second aspect for selecting a selected area of a 3D model of an object upon actuation by a user is proposed.
- the control device is flashlight-shaped and includes:
- a selection unit for selecting a predetermined part of the selected parts
- a determination unit for determining a position and / or a size of the selected area.
- the flashlight-shaped form of the control device is particularly advantageous because the control device can thereby be held by the user and operated with a single hand. Furthermore, actuating the control device is in particular similar to operating a flashlight and is therefore intuitive.
- the extent unit is in particular a slide button.
- the determination unit may be formed as a rotatable ring on the control device.
- the extent unit and the determination unit can also be designed as buttons, for example.
- the position and/or size of the selected area can also be selected by means of voice control and/or text input on the determination unit.
- Fig. 1 shows a first illustration of a system for displaying a 3D model
- Fig. 2 shows a second illustration of the system for displaying a 3D model
- Fig. 3 shows an example of an illustrated object
- Fig. 4 shows a method for displaying a 3D model according to a first embodiment
- Fig. 5 shows a method for displaying a 3D model according to a second embodiment
- Fig. 6 shows a control device according to an embodiment
- the system 20 comprises a control device 10 and a display device 2.
- the display device 2 is a screen 2 of a VR headset (not shown) which displays 3D images.
- a 3D model 1 of an object 3 is displayed on the screen 2.
- the 3D model 1 is a representation of the object 3.
- the object 3 is in the present case a motor of an industrial plant. It comprises a large number of parts 4, for example screws, cylinders and pistons.
- the parts 4 are shown schematically as blocks for the sake of clarity.
- the object 3 comprises 28 parts 4.
- the parts 4 are shown in their original position.
- the control device 10 is flashlight-shaped and is actuated by a user taking it in the hand and moving it. The actuation of the control device 10 will be described in more detail below with reference to FIG. 6.
- the system 20 is adapted to execute a method for representing a 3D model 1.
- a method for representing a 3D model 1 is shown, for example, in FIG. 4, which illustrates a method of displaying a 3D model 1 according to a first embodiment. In the following, the method will be described with reference to FIGS. 1, 2, 3 and 4.
- in a step S1, the control device 10 is operated by a user 7 to select a selected area 5 of the 3D model 1.
- the user 7 takes the control device 10 into the hand 13 and moves it such that virtual beams 11 emitted by the control device 10 are emitted in the direction of the 3D model 1.
- the virtual beams 11 are only visible in the VR environment, ie with the VR headset.
- the user 7 moves the control device 10 in his hand 13 such that the frustoconically emitted virtual rays 11 intersect the object 3.
- the area of the object 3 within the frustoconical beams 11 forms the selected area 5. It is an area that the user 7 wants to visualize more precisely.
- the side surfaces of the selected parts 14 are shown dotted.
- the selected area comprises eight selected parts 14.
- in a step S2, the 3D model 1 is displayed in such a way that the selected parts 14 are displayed in end positions.
- Fig. 2 shows how the selected parts 14 are shown in the end positions.
- the selected parts 14 are moved away from their original positions (FIG. 1) such that distances between the individual selected parts 14 are increased.
- the selected parts are thereby moved away from a reference point 6 which is located in the middle of the selected area 5.
- the unselected parts 4 are still displayed in their original positions.
- the user 7 sees the selected parts 14 "fly apart". This allows the user 7 to see the selected parts 14 better. In particular, he also sees selected parts 14 that were previously hidden by other parts 4.
- Fig. 3 shows schematically how the selected parts 14 are moved.
- reference numerals 14u denote the selected parts 14 which are in their original positions.
- reference numerals 14e denote the selected parts 14 which are in their end positions.
- the system 20 may alternatively also perform a method of presenting a 3D model 1 according to a second embodiment. Such a method will be described below with reference to FIG. 5.
- the method steps S1 and S2 are identical to those of the method according to the first embodiment (FIG. 4).
- in a step S3, the user 7 selects a predetermined part from the selected parts 14 by means of the control device 10. Properties of the predetermined part are displayed on the screen 2, so that the user 7 receives information about the predetermined part.
- in a step S4, the control device 10 is actuated again by the user 7, so that the selection of the selected area 5 is cancelled.
- the user 7 moves the control device 10 away from the selected area 5, so that the virtual beams 11 no longer intersect the object 3 in the selected area 5.
- the selected parts 14 are shown in their end positions only as long as the user 7 points at the selected area 5 with the control device 10.
- in a step S5, the 3D model 1 is again displayed such that the previously selected parts 14 are shown in their original positions again.
- the previously selected parts 14 are moved together again, so that a distance between the respective previously selected parts 14 and the reference point 6 is reduced again.
- the steps S1-S5 can be repeated as often as desired. Thereby, the user 7 can sequentially select and explore individual areas of the object 3.
- the control device 10 is flashlight-shaped and therefore very intuitive to use.
- the beams 11 emitted by the control device 10 are emitted frustoconically with an opening angle.
- the opening angle is adjustable by the user 7 rotating the adjusting ring 16. By varying the opening angle, a size of the selected area 5 can be changed.
- the frusto-conical rays 11 are emitted along a central axis MA.
- the reference point 6 is located on this central axis MA.
- a distance d between the reference point 6 and the control device 10 can be set by the user 7 by means of a slide button 15 of the control device 10.
- the slide button 15 and the adjusting ring 16 in particular form a determination unit.
- a depth h of the selected area 5 can be determined by the user 7 by means of a voice command.
- the control device 10 is thus operable with a single hand 13. Furthermore, it is possible to provide a haptic input device for adjusting the depth h, such as a slide button. Furthermore, a two-dimensional touchpad can also be used to set both the distance d and the depth h. In addition, in some embodiments the depth h can also be adjusted by turning the control device 10 about its longitudinal axis.
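Taken together, the distance d, the depth h and the opening angle fully parameterize the truncated selection cone. One plausible reading, sketched below, centers the depth range of the selected area 5 on the reference point 6; that centering, like the class and field names, is an assumption rather than something the patent states:

```python
from dataclasses import dataclass
import math

@dataclass
class SelectionSettings:
    distance_d: float         # distance from control device to reference point 6
    depth_h: float            # depth of the selected area 5 along the axis
    opening_angle_deg: float  # full opening angle of the beam cone

    def depth_range(self):
        """Near/far depth of the selected area, centered on the reference point."""
        return (self.distance_d - self.depth_h / 2,
                self.distance_d + self.depth_h / 2)

    def radius_at(self, depth):
        """Radius of the cone cross-section at a given depth along the axis."""
        return depth * math.tan(math.radians(self.opening_angle_deg) / 2)

settings = SelectionSettings(distance_d=4.0, depth_h=2.0, opening_angle_deg=60.0)
near, far = settings.depth_range()
```

Rotating the adjusting ring 16 would change `opening_angle_deg`, the slide button 15 would change `distance_d`, and the voice command would change `depth_h`.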
- the control device 10 further comprises an actuating unit for switching the control device 10 on and off and/or an extent unit for selecting an extent of increasing the distances between the selected parts 14.
- the object 3 may for example also be any machine of an industrial plant or an entire industrial plant.
- the parts 4 of the object 3 can also be arranged within the object 3 differently than shown in Fig. 1.
- the 3D model 1 can alternatively be displayed on a normal 3D screen or in an AR environment. It is also conceivable to display some of the parts 4, for example the non-selected parts 4, with an increased transparency.
- the control device 10 may also be configured in the form of a remote control with a plurality of buttons. Alternatively, the control device 10 may also be a gesture recognition device that detects movements of the user 7.
- the described flashlight-shaped control device 10 can be modified. It may, for example, have various buttons for setting the distance d and/or the opening angle.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Manufacturing & Machinery (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computer Graphics (AREA)
- Theoretical Computer Science (AREA)
- Quality & Reliability (AREA)
- Automation & Control Theory (AREA)
- Architecture (AREA)
- Processing Or Creating Images (AREA)
- User Interface Of Digital Computer (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018207987.0A DE102018207987A1 (en) | 2018-05-22 | 2018-05-22 | Method and system for displaying a 3D model |
PCT/EP2019/061889 WO2019224009A1 (en) | 2018-05-22 | 2019-05-09 | Method and system for displaying a 3d model |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3776490A1 true EP3776490A1 (en) | 2021-02-17 |
Family
ID=66685565
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP19727840.1A Withdrawn EP3776490A1 (en) | 2018-05-22 | 2019-05-09 | Method and system for displaying a 3d model |
Country Status (6)
Country | Link |
---|---|
US (1) | US20210200192A1 (en) |
EP (1) | EP3776490A1 (en) |
JP (1) | JP2021524632A (en) |
CN (1) | CN112119431A (en) |
DE (1) | DE102018207987A1 (en) |
WO (1) | WO2019224009A1 (en) |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3374122B2 (en) * | 2000-05-29 | 2003-02-04 | ウエストユニティス株式会社 | Article assembly / disassembly movement display system |
JP2003076724A (en) * | 2001-09-04 | 2003-03-14 | Toyota Keeramu:Kk | Apparatus and method for automatic production of disassembling drawing, and recording media |
US7173996B2 (en) * | 2004-07-16 | 2007-02-06 | General Electric Company | Methods and apparatus for 3D reconstruction in helical cone beam volumetric CT |
JP2007018173A (en) * | 2005-07-06 | 2007-01-25 | Canon Inc | Image processing method and image processor |
US8452435B1 (en) * | 2006-05-25 | 2013-05-28 | Adobe Systems Incorporated | Computer system and method for providing exploded views of an assembly |
CN101542420A (en) * | 2006-11-27 | 2009-09-23 | 皇家飞利浦电子股份有限公司 | 3D control of data processing through handheld pointing device |
WO2011026268A1 (en) * | 2009-09-02 | 2011-03-10 | Autodesk, Inc. | Automatic explode based on occlusion |
US8717360B2 (en) * | 2010-01-29 | 2014-05-06 | Zspace, Inc. | Presenting a view within a three dimensional scene |
JP5300777B2 (en) * | 2010-03-31 | 2013-09-25 | 株式会社バンダイナムコゲームス | Program and image generation system |
US9552673B2 (en) * | 2012-10-17 | 2017-01-24 | Microsoft Technology Licensing, Llc | Grasping virtual objects in augmented reality |
US9652115B2 (en) * | 2013-02-26 | 2017-05-16 | Google Inc. | Vertical floor expansion on an interactive digital map |
JP6676071B2 (en) * | 2015-08-04 | 2020-04-08 | グーグル エルエルシー | Input via Context-Dependent Hand Collision with Objects in Virtual Reality |
JP6860776B2 (en) * | 2016-06-30 | 2021-04-21 | キヤノンマーケティングジャパン株式会社 | Virtual space controller, its control method, and program |
EP3301652A1 (en) * | 2016-09-29 | 2018-04-04 | Dassault Systèmes | Computer-implemented method of generating and displaying an exploded view |
-
2018
- 2018-05-22 DE DE102018207987.0A patent/DE102018207987A1/en not_active Ceased
-
2019
- 2019-05-09 WO PCT/EP2019/061889 patent/WO2019224009A1/en unknown
- 2019-05-09 US US17/056,812 patent/US20210200192A1/en not_active Abandoned
- 2019-05-09 CN CN201980034020.1A patent/CN112119431A/en active Pending
- 2019-05-09 EP EP19727840.1A patent/EP3776490A1/en not_active Withdrawn
- 2019-05-09 JP JP2020565318A patent/JP2021524632A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
CN112119431A (en) | 2020-12-22 |
WO2019224009A1 (en) | 2019-11-28 |
DE102018207987A1 (en) | 2019-11-28 |
JP2021524632A (en) | 2021-09-13 |
US20210200192A1 (en) | 2021-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
DE69130198T2 (en) | Image display systems | |
DE69623105T2 (en) | IMAGE PROCESSING AND IMAGE PROCESSOR | |
DE69607189T2 (en) | SAFE AND INEXPENSIVE COMPUTER PERIPHERAL DEVICES WITH POWER FEEDBACK FOR CONSUMPTION APPLICATIONS | |
DE69724416T2 (en) | HAND CONTROL WITH USER RETURN MECHANISM | |
DE60319847T2 (en) | JOYSTICK WITH VARIABLE COMPLIANCE WITH COMPENSATION ALGORITHMS | |
DE69737275T2 (en) | Video game system with vertical matrix of cursor images | |
DE102010043412A1 (en) | Display control device for a remote control device | |
DE69523717T2 (en) | METHOD AND DEVICE FOR CONTROLLING THE DIRECTION OF AN OBJECT | |
DE102013004692B4 (en) | 3D input device with an additional rotary controller | |
DE69129614T2 (en) | Process for calculating and displaying the luminance in graphics computers | |
DE102010030974A1 (en) | Apparatus and method for managing the functions of peripheral devices | |
DE10045117A1 (en) | Freely specifiable geometry and sound control | |
DE19632223A1 (en) | Method for modifying three-dimensional objects | |
WO2016041683A1 (en) | Display and operating device, especially for a motor vehicle, operating element, and motor vehicle | |
DE102004017148A1 (en) | Operator input device with feedback | |
DE102013011818B4 (en) | Simulator for a working machine | |
EP3259907A1 (en) | Method and device for the distortion-free display of an area surrounding a vehicle | |
DE69522907T2 (en) | Method and device for displaying a pointer along a two-dimensional representation of a computer-generated three-dimensional surface | |
DE102007057332A1 (en) | Domestic appliance display device | |
EP3163358B1 (en) | Visualisation device | |
DE102014009701B4 (en) | Method for operating virtual reality glasses and system with virtual reality glasses | |
EP3716014B1 (en) | Transfer of a condition between vr environments | |
WO2019224009A1 (en) | Method and system for displaying a 3d model | |
DE112004001937T5 (en) | Haptic input device for generating control information | |
DE102014200299A1 (en) | Method and device for operating a vehicle, computer program, computer program product |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: UNKNOWN |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20201110 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20221201 |