EP3776490A1 - Method and system for presenting a 3D model - Google Patents

Method and system for presenting a 3D model

Info

Publication number
EP3776490A1
EP3776490A1 (application EP19727840.1A)
Authority
EP
European Patent Office
Prior art keywords
parts
model
control device
user
selected area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP19727840.1A
Other languages
German (de)
English (en)
Inventor
Rebecca Johnson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Siemens AG
Original Assignee
Siemens AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Siemens AG
Publication of EP3776490A1
Legal status: Withdrawn

Classifications

    • G — PHYSICS
    • G05 — CONTROLLING; REGULATING
    • G05B — CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 — Programme-control systems
    • G05B 19/02 — Programme-control systems electric
    • G05B 19/418 — Total factory control, i.e. centrally controlling a plurality of machines, e.g. direct or distributed numerical control [DNC], flexible manufacturing systems [FMS], integrated manufacturing systems [IMS] or computer integrated manufacturing [CIM]
    • G05B 19/41805 — Total factory control characterised by assembly
    • G05B 19/41885 — Total factory control characterised by modeling, simulation of the manufacturing system
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06T — IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 19/00 — Manipulating 3D models or images for computer graphics
    • G06T 19/20 — Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 2200/00 — Indexing scheme for image data processing or generation, in general
    • G06T 2200/24 — Indexing scheme involving graphical user interfaces [GUIs]
    • G06T 2219/00 — Indexing scheme for manipulating 3D models or images for computer graphics
    • G06T 2219/016 — Exploded view

Definitions

  • the present invention relates to a method for representing a 3D model of an object and a system for displaying such a 3D model of an object.
  • an object can be represented as a 3D model in virtual reality environments and/or augmented reality environments. A user may wish to view the individual parts of the object in the 3D model, for example to understand how the parts are assembled. For this purpose, it is desirable to display the 3D model in a way that simplifies the visualization of the individual parts of the depicted object.
  • an object of the present invention is to provide an improved representation of a 3D model of an object.
  • a method for displaying a 3D model of an object comprising a plurality of parts arranged at original positions comprises: operating a control device by a user to select a selected area of the 3D model, the parts of the object located in the selected area forming selected parts; and displaying the 3D model such that the selected parts are presented in end positions in which they are moved away from their original positions such that distances between the selected parts increase.
  • the representation of the 3D model can be changed dynamically and interactively by the user operating the control device. The representation of the 3D model can thus be improved.
  • This improved rendering of the 3D model may allow the user to better locate a particular portion of the object, such as a particular screw.
  • the user can also better localize a machine in a 3D model of a complex industrial plant. In particular, the user can "look into" the object and better see how the selected parts are assembled, allowing the user to better understand how the object works.
  • the article may include a device of an industrial plant, for example, an electric motor.
  • the article may be both an electronic device and a mechanical object.
  • the object may also be an industrial plant with several machines.
  • the 3D model is in particular a 3D representation of the object.
  • the 3D model can form a realistic representation of the object.
  • the 3D model may be a CAD model.
  • the plurality of parts is in particular assembled such that the parts together form the object or a part of it.
  • the parts are, for example, screws, cylinders, housing parts, valves, pistons or the like.
  • the parts may also be entire machines, for example motors or machines of an industrial plant.
  • the original positions of the parts may be positions in which the parts are assembled to form the article or a part thereof.
  • the representation of the object with its parts in the original positions corresponds in particular to a true-to-life and/or realistic representation of the object.
  • the control device is actuated, for example, by the user moving the control device and / or actuating a button of the control device.
  • the control device may also detect movements of the user and thereby be actuated.
  • the control device may be designed as a motion sensor.
  • the selected area of the 3D model is, for example, a 3D area of the 3D model.
  • the selected area is the area of the object that the user wants to visualize in detail.
  • the selected area is, for example, spherical or cuboid.
  • the parts of the object that are in the selected area form the selected parts.
  • the selection of the selected area leads in particular to the fact that the selected parts are displayed in end positions instead of their original positions.
  • the end positions of the selected parts differ in particular from their original positions. In the end positions, the selected parts can be displayed in such a way that distances between the selected parts increase. The distances between the selected parts are in particular larger when the parts are in the end positions than when they are in the original positions.
  • the selected parts are in particular presented in a 3D exploded view.
  • the 3D model is presented in such a way that the parts of the object that are outside the selected area are displayed in their original positions.
  • the selected parts, by contrast, are displayed in their end positions.
  • the selected parts can thus be highlighted against the non-selected parts outside the selected area.
  • the presentation of the 3D model can be further improved.
  • the selected parts are moved away from their original positions such that the distances of the selected parts to a reference point within the selected area increase.
  • the reference point is located in particular in the middle of the selected area.
  • the extent to which the distance of a selected part to the reference point is increased is, in particular, proportional to the distance between the reference point and the selected part in its original position.
  • the selected parts that are near the reference point in their original positions are thus in particular moved less than the selected parts that are located farther from the reference point in their original positions.
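The proportional displacement described above can be sketched in Python (an illustrative sketch only, not the patented implementation; the function name, the data layout, and the `factor` scale parameter are assumptions):

```python
import numpy as np

def explode(original_positions, reference_point, factor=0.5):
    """Move each selected part away from the reference point by an amount
    proportional to its original distance from that point (hypothetical sketch)."""
    originals = np.asarray(original_positions, dtype=float)
    ref = np.asarray(reference_point, dtype=float)
    offsets = originals - ref  # vectors from the reference point to each part
    # End position = original + factor * offset: parts near the reference
    # point are moved less than parts located farther away.
    return originals + factor * offsets

# Two parts at distances 1 and 2 from a reference point at the origin
end = explode([(1.0, 0.0, 0.0), (2.0, 0.0, 0.0)], (0.0, 0.0, 0.0), factor=0.5)
# distances to the reference point grow to 1.5 and 3.0 respectively
```

Because the displacement is a fixed fraction of each part's offset vector, relative positions are preserved and the exploded arrangement stays recognizable.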
  • the 3D model is displayed in a virtual reality environment (VR environment) and / or in an augmented reality environment (AR environment).
  • the 3D model is presented in an environment in which it is displayed along with additional information, such as a predetermined text or a predetermined image.
  • the 3D model can also be displayed on a 3D screen. This can be a 3D screen of a headset, in particular a VR headset or an AR headset.
  • the control device emits virtual beams such that they are only visible in the VR environment and / or in the AR environment and serve to select the selected area when the control device is moved.
  • the virtual beams are, in particular, beams that are visible only in the VR environment and/or the AR environment. They are visible, for example, only to a user with a corresponding headset.
  • the virtual rays may resemble the light rays of a flashlight.
  • in selecting the selected area, the user directs the virtual beams toward the area of the object that he wishes to select.
  • in particular, the user directs the virtual beams at the representation of the object in the 3D model.
  • the virtual beams in the VR environment and/or the AR environment are emitted frustoconically by the control device, and the portion of the object intersected by the virtual beams forms the selected area.
  • the reference point is arranged on a central axis of the truncated cone formed by the frustoconical beams.
  • the control device is actuated such that:
  • a position of the reference point is selected
  • a distance of the control means is selected from the reference point
  • a size of the selected area is determined.
  • the position of the reference point can be selected by moving the control device.
  • the distance of the control device to the reference point is selected in particular by making a setting on the control device.
  • the extent to which the distances between the selected parts are increased can be selected by means of a further setting unit on the control device.
  • the size of the selected area can be changed for example by determining an opening angle of the truncated cone.
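A minimal containment test for such a frustoconical selection might look as follows (a sketch under assumptions: the function name and the `near`/`far` bounds delimiting the truncated cone are not from the patent):

```python
import numpy as np

def in_selection_cone(point, apex, axis, half_angle_deg, near=0.0, far=float("inf")):
    """Return True if a part's center lies inside the virtual beam cone.
    apex: position of the control device; axis: viewing direction;
    half_angle_deg: half the opening angle (hypothetical parameter names)."""
    v = np.asarray(point, dtype=float) - np.asarray(apex, dtype=float)
    axis = np.asarray(axis, dtype=float)
    axis = axis / np.linalg.norm(axis)
    along = float(np.dot(v, axis))  # distance along the central axis
    if not near <= along <= far:
        return False
    radial = float(np.linalg.norm(v - along * axis))  # distance from the axis
    return radial <= along * np.tan(np.radians(half_angle_deg))

# A part straight ahead of the device is selected; one far off-axis is not
assert in_selection_cone((0.0, 0.0, 2.0), (0, 0, 0), (0, 0, 1), half_angle_deg=15)
assert not in_selection_cone((2.0, 0.0, 2.0), (0, 0, 0), (0, 0, 1), half_angle_deg=15)
```

Widening `half_angle_deg` (the opening-angle setting) admits more parts and thus grows the selected area, matching the behavior described above.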
  • the method further comprises:
  • the selection of the selected area is canceled by the user actuating the control device again, for example by moving it away from the selected area.
  • upon renewed actuation and/or movement of the control device, the user selects a new selected area.
  • the newly selected parts of the newly selected area may then be presented in end positions in which they are moved away from their original positions such that distances between the newly selected parts increase.
  • the method further comprises:
  • the user may select one of the selected parts and view it more closely, for example.
  • the user can also retrieve properties of the predetermined part.
  • the predetermined part can be selected without the user knowing the name of the part or its place in the part hierarchy.
  • the 3D model is displayed in such a way that a transparency of at least some of the parts of the object, in particular the non-selected parts, is increased.
  • the remaining parts can then be visualized better. If the transparency of the non-selected parts is increased, the selected parts can be viewed better without the non-selected parts hiding the selected parts.
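The transparency adjustment could be sketched like this (illustrative only; the render-state dictionary layout and the alpha value are assumptions, not part of the patent):

```python
def apply_transparency(parts, selected_ids, unselected_alpha=0.2):
    """Make non-selected parts semi-transparent so the selected parts
    remain clearly visible (hypothetical render-state layout)."""
    for part_id, state in parts.items():
        state["alpha"] = 1.0 if part_id in selected_ids else unselected_alpha
    return parts

parts = {"screw": {"alpha": 1.0}, "piston": {"alpha": 1.0}}
apply_transparency(parts, selected_ids={"piston"})
# the non-selected screw is now rendered semi-transparent
```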
  • a computer program product, such as a computer program means, may for example be provided or delivered as a storage medium, e.g.
  • the system includes:
  • a control device configured to be actuated by a user to select a selected area of the 3D model, the parts of the object located in the selected area forming selected parts;
  • a display device for displaying the 3D model such that the selected parts are presented in end positions in which they are moved away from their original positions such that distances between the selected parts increase.
  • the respective device for example the control device or the display device, can be implemented in terms of hardware and / or software technology.
  • the respective device may be configured as a device or as part of a device, for example as a computer or as a microprocessor or as a controller of a vehicle.
  • the respective device may be designed as a computer program product, as a function, as a routine, as part of a program code or as an executable object.
  • a control device for the system according to the second aspect or according to an embodiment of the second aspect for selecting a selected area of a 3D model of an object upon actuation by a user is proposed.
  • the control device is flashlight-shaped and includes:
  • a selection unit for selecting a predetermined part of the selected parts
  • a determination unit for determining a position and / or a size of the selected area.
  • the flashlight-shaped form of the control device is particularly advantageous because the control device can thereby be grasped by the user and operated with a single hand. Furthermore, actuating the control device is similar to operating a flashlight and is therefore intuitive.
  • the extent unit is in particular a slide button.
  • the determination unit may be formed as a rotatable ring on the control device.
  • the extent unit and the determination unit can also be designed as buttons, for example.
  • the position and/or size of the selected area can also be set by voice control and/or text input on the determination unit.
  • Fig. 1 shows a first illustration of a system for displaying a 3D model
  • Fig. 2 is a second illustration of the system for displaying a 3D model
  • Fig. 3 shows an example of an illustrated subject
  • Fig. 4 shows a method for displaying a 3D model according to a first embodiment
  • Fig. 5 shows a method for displaying a 3D model according to a second embodiment
  • Fig. 6 shows a control device according to an embodiment
  • the system 20 comprises a control device 10 and a display device 2.
  • the display device 2 is a screen 2 of a VR headset (not shown) that displays 3D images.
  • a 3D model 1 of an object 3 is displayed on the screen 2.
  • the 3D model 1 is a representation of the object 3.
  • the object 3 is in the present case a motor of an industrial plant. It comprises a large number of parts 4, for example screws, cylinders and pistons.
  • the parts 4 are shown schematically as blocks for the sake of clarity.
  • the object 3 comprises 28 parts 4.
  • the parts 4 are shown in their original position.
  • the control device 10 is flashlight-shaped and is actuated by a user taking it in the hand and moving it. The actuation of the control device 10 is described in more detail below with reference to Fig. 6.
  • the system 20 is adapted to execute a method for representing a 3D model 1.
  • a method for representing a 3D model 1 is shown, for example, in Fig. 4, which illustrates a method for displaying a 3D model 1 according to a first embodiment. In the following, the method is described with reference to Figs. 1, 2, 3 and 4.
  • in a step S1, the control device 10 is operated by a user 7 to select a selected area 5 of the 3D model 1.
  • the user 7 takes the control device 10 into the hand 13 and moves it such that virtual beams 11 emitted by the control device 10 are emitted in the direction of the 3D model 1.
  • the virtual beams 11 are only visible in the VR environment, i.e. with the VR headset.
  • the user 7 moves the control device 10 in his hand 13 such that the frustoconically emitted virtual beams 11 intersect the object 3.
  • the area of the object 3 within the frustoconical beams 11 forms the selected area 5. It is an area that the user 7 wants to visualize more precisely.
  • the side surfaces of the selected parts 14 are shown dotted.
  • the selected area comprises eight selected parts 14.
  • in a step S2, the 3D model 1 is displayed in such a way that the selected parts 14 are displayed in end positions.
  • Fig. 2 shows how the selected parts 14 are shown in the end positions.
  • the selected parts 14 are moved away from their original positions (FIG. 1) such that distances between the individual selected parts 14 are increased.
  • the selected parts are thereby moved away from a reference point 6, which is located in the middle of the selected area 5.
  • the unselected parts 4 are still displayed in their original positions.
  • the user 7 sees the selected parts 14 "fly apart". This allows the user 7 to better see the selected parts 14. In particular, he also sees selected parts 14 that were previously hidden by other parts 4.
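The "fly apart" transition can be sketched as a linear interpolation between each part's original and end position (a hypothetical helper; the patent does not prescribe a particular animation):

```python
def interpolate(origin, end, t):
    """Position of a part at animation time t in [0, 1]:
    t = 0 gives the original position, t = 1 the end position."""
    return tuple(o + t * (e - o) for o, e in zip(origin, end))

# Halfway through the transition a part sits midway between its positions
assert interpolate((0.0, 0.0, 0.0), (2.0, 4.0, 0.0), 0.5) == (1.0, 2.0, 0.0)
```

Running the same interpolation with t decreasing from 1 to 0 would move the parts back together, as described for step S5 below.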
  • Fig. 3 shows schematically how the selected parts 14 are moved.
  • reference numeral 14u denotes the selected parts 14 in their original positions.
  • Reference numeral 14e denotes the selected parts 14 in their end positions.
  • the system 20 may alternatively also perform a method of presenting a 3D model 1 according to a second embodiment. Such a method will be described below with reference to FIG. 5.
  • the method steps S1 and S2 are identical to those of the method according to the first embodiment (FIG. 4).
  • in a step S3, the user 7 selects a predetermined part from the selected parts 14 using the control device 10. Properties of the predetermined part are displayed on the screen 2, so that the user 7 receives information about the predetermined part.
  • in a step S4, the control device 10 is actuated again by the user 7, so that the selection of the selected area 5 is canceled.
  • the user 7 moves the control device 10 away from the selected area 5, so that the virtual beams 11 no longer intersect the object 3 in the selected area 5.
  • the selected parts 14 are shown in their end positions only as long as the user 7 points the control device 10 at the selected area 5.
  • in a step S5, the 3D model 1 is displayed again in such a way that the previously selected parts 14 are displayed in their original positions.
  • the previously selected parts 14 are moved together again, so that the distance between each previously selected part 14 and the reference point 6 is reduced again.
  • the steps S1-S5 can be repeated as often as desired. Thereby, the user 7 can sequentially select and explore individual areas of the object 3.
  • the control device 10 is flashlight-shaped and therefore very intuitive to use.
  • the beams 11 emitted by the control device 10 are emitted frustoconically with an opening angle.
  • the opening angle is adjustable by the user 7 rotating the adjusting ring 16. By varying the opening angle, the size of the selected area 5 can be changed.
  • the frusto-conical rays 11 are emitted along a central axis MA.
  • the reference point 6 is located on this central axis MA.
  • the user 7 can set a distance d between the reference point 6 and the control device 10 by means of a slide button 15 of the control device 10.
  • the slide button 15 and the adjusting ring 16 in particular form a determination unit.
  • a depth h of the selected area 5 can be determined by the user 7 by means of a voice command.
  • the control device 10 is thus operable with a single hand 13. Furthermore, it is possible to provide a haptic input device for adjusting the depth h, such as a slide button. Furthermore, a two-dimensional touchpad can also be used to set both the distance d and the depth h. In addition, in some embodiments the depth h can also be adjusted by turning the control device 10 about its longitudinal axis.
  • the control device 10 further comprises an actuating unit for switching the control device 10 on and off and/or an extent unit for selecting an extent by which the distances between the selected parts 14 are increased.
  • the object 3 may for example also be any machine of an industrial plant or an entire industrial plant.
  • the parts 4 of the object 3 can also be arranged within the object 3 differently than shown in Fig. 1.
  • the 3D model 1 can alternatively be displayed on a normal 3D screen or in an AR environment. It is also conceivable to display some of the parts 4, for example the non-selected parts 4, with increased transparency.
  • the control device 10 may also be configured in the form of a remote control with a plurality of buttons. Alternatively, the control device 10 may also be a motion detection device that detects movements of the user 7.
  • the described flashlight-shaped control device 10 can be modified. It may, for example, have various buttons for setting the distance d and/or the opening angle.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Manufacturing & Machinery (AREA)
  • Software Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Graphics (AREA)
  • Computer Hardware Design (AREA)
  • Quality & Reliability (AREA)
  • Automation & Control Theory (AREA)
  • Architecture (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for presenting a 3D model (1) of an object (3) comprising a plurality of parts (4) arranged in original positions. The method comprises the following steps: operating (S1) a control device (10) by a user (7) to select a selected area (5) of the 3D model (1), the parts (4) of the object (3) located in the selected area (5) forming selected parts (14); and presenting (S2) the 3D model (1) such that the selected parts (14) are presented in end positions in which they are moved away from their original positions such that the distances between the selected parts (14) increase. The selected parts are presented in such a way that a user can see them better.
EP19727840.1A 2018-05-22 2019-05-09 Procédé et système de présentation d'un modèle 3d Withdrawn EP3776490A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018207987.0A DE102018207987A1 (de) 2018-05-22 2018-05-22 Verfahren und System zum Darstellen eines 3D-Modells
PCT/EP2019/061889 WO2019224009A1 (fr) 2018-05-22 2019-05-09 Procédé et système de présentation d'un modèle 3d

Publications (1)

Publication Number Publication Date
EP3776490A1 true EP3776490A1 (fr) 2021-02-17

Family

ID=66685565

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19727840.1A Withdrawn EP3776490A1 (fr) 2018-05-22 2019-05-09 Procédé et système de présentation d'un modèle 3d

Country Status (6)

Country Link
US (1) US20210200192A1 (fr)
EP (1) EP3776490A1 (fr)
JP (1) JP2021524632A (fr)
CN (1) CN112119431A (fr)
DE (1) DE102018207987A1 (fr)
WO (1) WO2019224009A1 (fr)

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3374122B2 (ja) * 2000-05-29 2003-02-04 ウエストユニティス株式会社 物品組立・分解移動表示システム
JP2003076724A (ja) * 2001-09-04 2003-03-14 Toyota Keeramu:Kk 分解図自動作成装置、分解図自動作成方法及びその記録媒体
US7173996B2 (en) * 2004-07-16 2007-02-06 General Electric Company Methods and apparatus for 3D reconstruction in helical cone beam volumetric CT
JP2007018173A (ja) * 2005-07-06 2007-01-25 Canon Inc 画像処理方法、画像処理装置
US8452435B1 (en) * 2006-05-25 2013-05-28 Adobe Systems Incorporated Computer system and method for providing exploded views of an assembly
JP2010511221A (ja) * 2006-11-27 2010-04-08 コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ 手持型ポインティング装置を介したデータ処理の三次元制御
WO2011026268A1 (fr) * 2009-09-02 2011-03-10 Autodesk, Inc. Explosion automatique basée sur l'occlusion
US8717360B2 (en) * 2010-01-29 2014-05-06 Zspace, Inc. Presenting a view within a three dimensional scene
JP5300777B2 (ja) * 2010-03-31 2013-09-25 株式会社バンダイナムコゲームス プログラム及び画像生成システム
US9552673B2 (en) * 2012-10-17 2017-01-24 Microsoft Technology Licensing, Llc Grasping virtual objects in augmented reality
US9652115B2 (en) * 2013-02-26 2017-05-16 Google Inc. Vertical floor expansion on an interactive digital map
EP4374942A3 (fr) * 2015-08-04 2024-07-10 Google LLC Entrée par l'intermédiaire de collisions sensibles au contexte de mains avec des objets en réalité virtuelle
JP6860776B2 (ja) * 2016-06-30 2021-04-21 キヤノンマーケティングジャパン株式会社 仮想空間制御装置、その制御方法、及びプログラム
EP3301652A1 (fr) * 2016-09-29 2018-04-04 Dassault Systèmes Procédé informatique de génération et d'affichage d'un éclaté

Also Published As

Publication number Publication date
DE102018207987A1 (de) 2019-11-28
WO2019224009A1 (fr) 2019-11-28
JP2021524632A (ja) 2021-09-13
CN112119431A (zh) 2020-12-22
US20210200192A1 (en) 2021-07-01

Similar Documents

Publication Publication Date Title
DE69724416T2 (de) Zeigersteuerung mit benutzerrückführungsmechanismus
DE60319847T2 (de) Joystick mit variabler nachgiebigkeit mit kompensationsalgorithmen
DE69737275T2 (de) Videospielsystem mit vertikaler Matrix aus Cursorbildern
DE69307419T2 (de) Elektronisches Bremspedalnachstellung-Gerät und Verfahren dazu
DE102010043412A1 (de) Anzeigesteuervorrichtung für eine Fernsteuervorrichtung
DE102013004692B4 (de) 3D-Eingabegerät mit einem zusätzlichen Drehregler
DE102010030974A1 (de) Vorrichtung und Verfahren zum Verwalten der Funktionen von Peripheriegeräten
DE19632223A1 (de) Verfahren zum Modifizieren dreidimensionaler Objekte
DE102004017148A1 (de) Bedienereingabegerät mit Rückmeldung
DE102013011818B4 (de) Simulator für eine Arbeitsmaschine
EP1194005B1 (fr) Méthode de réglage de la caractéristique de transmission d'un circuit éléctronique
WO2016131452A1 (fr) Procédé et dispositif d'affichage sans distorsion d'un environnement d'un véhicule
DE102007057332A1 (de) Hausgerätanzeigevorrichtung
EP3163358B1 (fr) Dispositif de visualisation
DE102014009701B4 (de) Verfahren zum Betreiben einer Virtual-Reality-Brille und System mit einer Virtual-Reality-Brille
EP3776490A1 (fr) Procédé et système de présentation d'un modèle 3d
DE102014009299A1 (de) Verfahren zum Betreiben einer Virtual-Reality-Brille und System mit einer Virtual-Reality-Brille
WO2017054894A1 (fr) Système de commande interactif et procédé de réalisation d'une opération de commande sur un système de commande intéractif
EP3716014B1 (fr) Transfert d'un état entre des environnements de la réalité virtuelle
WO2006061185A1 (fr) Procede pour realiser des dessins techniques a partir de modeles tridimensionnels comportant au moins deux elements tridimensionnels en collision
DE102013219145A1 (de) Medizinsystem
DE3921300A1 (de) Verfahren zum drehen eines objekts in dreidimensionaler darstellung
DE102005028103A1 (de) Verfahren zur Darstellung flexibler längenerstreckter Volumenobjekte
DE102014200299A1 (de) Verfahren und Vorrichtung zum Betreiben eines Fahrzeugs, Computerprogramm, Computer-Programmprodukt
DE3786547T2 (de) Steuerknüppel.

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201110

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20221201