WO2014108150A2 - User interface for handwritten character input into a device - Google Patents

User interface for handwritten character input into a device

Info

Publication number
WO2014108150A2
WO2014108150A2 (PCT/EP2013/003528)
Authority
WO
WIPO (PCT)
Prior art keywords
character
detected
input
motor vehicle
camera
Prior art date
Application number
PCT/EP2013/003528
Other languages
German (de)
English (en)
Other versions
WO2014108150A3 (fr)
Inventor
Michael SCHLITTENBAUER
Lorenz Bohrer
Martin Roehder
Original Assignee
Audi Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Audi Ag filed Critical Audi Ag
Publication of WO2014108150A2
Publication of WO2014108150A3

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means

Definitions

  • The invention relates to a method for operating a user interface of a device that provides handwriting recognition for contactless input of handwritten characters.
  • The invention also includes a motor vehicle with such an operator interface, a computer program product for providing parts of such an operator interface in a mobile terminal, in particular a smartphone or a tablet PC, and such a mobile terminal.
  • In a motor vehicle, it is usually provided that individual functions of the motor vehicle, such as entering a navigation destination into a navigation device, are invoked by pressing defined function keys provided for this purpose.
  • To sense such key presses, an electrical or capacitive switch is necessary for each key today.
  • The keys are also localized and fixed. Searching for the buttons, or for control panels on a touch-sensitive surface, for example a so-called touchpad or touchscreen, can distract the driver from the traffic situation in an undesirable manner while driving.
  • Furthermore, each key requires a relatively expensive evaluation unit. The more equipment variants of a motor vehicle there must be, the more different buttons must be installed and connected accordingly. This makes variant diversity costly.
  • Character recognition may be provided instead of hardware buttons or buttons emulated on a touchscreen.
  • With it, a user can draw individual characters, that is to say letters, numbers or even special characters, in succession on one and the same touch-sensitive control panel with a finger or with a pen.
  • Automatic handwriting recognition, also referred to as text recognition or optical character recognition, can then recognize the drawn letters, for example by means of pattern recognition on the respectively drawn character trace, and pass corresponding digital information on to the navigation device.
  • Automatic handwriting recognition allows the recognition of single characters, or even of several characters drawn one after another in a single character trace or intermittently.
  • US 2011/0254765 A1 discloses an automatic handwriting recognition system in which a user can write with one hand freely in space, i.e. in the air, independently of any hand-held item, and this movement of the hand is detected as a 3D motion trajectory by an optical sensor. The 3D motion trajectory is projected onto a plane. The 2D image of the 3D motion trajectory thus formed is then fed to automatic handwriting recognition.
  • The system assumes that the user is standing in front of the optical sensor when performing his hand movements, executing writing movements as if on a blackboard.
  • The system can detect when the user finishes an entry with a special gesture.
  • In the case of fully written words with several letters, a statistical analysis can be used to check which part of the 3D motion trajectory belongs to the actual character input.
  • The object of the invention is to provide a user interface for a device by means of which characters can be entered into the device within a small space.
  • In the method according to the invention, automatic handwriting recognition is likewise used in a manner known per se to recognize a character that a user has drawn by hand with an input element, for example his finger or a drawing instrument.
  • The character trace represents one or more characters in the usual way as a two-dimensional, i.e. flat, image, as the user has drawn it.
  • A character recognition device then generates, in a known manner and as a function of the 2D character trace, at least one recognized character, which is output at an output, for example, and the recognized characters are provided as character input to the device to be operated.
  • Text input is thus possible by a gesture, without touching an input surface.
  • According to the method, a 3D motion trajectory of the input element within a predetermined spatial area is detected by a sensor device.
  • The coordinates of the points that make up the motion trajectory can differ in all three spatial directions; the trajectory is therefore a three-dimensional structure.
  • The detected 3D motion trajectory is then projected by the sensor device onto a two-dimensional projection plane, and the projection of the 3D motion trajectory is transmitted as the 2D character trace to the character recognition device.
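The projection step described above can be sketched as an orthogonal projection onto a plane through the origin. The following Python/NumPy sketch is illustrative only; the function and variable names are assumptions, not from the patent:

```python
import numpy as np

def project_trajectory(points_3d, basis_u, basis_v):
    """Orthogonally project 3D trajectory points onto the plane spanned
    by the orthonormal vectors basis_u and basis_v (plane through the
    origin). Returns an (N, 2) array: the 2D character trace."""
    pts = np.asarray(points_3d, dtype=float)
    # The 2D coordinates are the components along the in-plane basis.
    return np.column_stack([pts @ basis_u, pts @ basis_v])

# Example: project onto the y-z plane (plane normal along x), as in the
# vehicle coordinate system described in the embodiment below.
traj = np.array([[0.10, 0.0, 0.0],
                 [0.20, 0.5, 1.0],
                 [0.15, 1.0, 0.0]])
trace_2d = project_trajectory(traj,
                              np.array([0.0, 1.0, 0.0]),   # in-plane u (y)
                              np.array([0.0, 0.0, 1.0]))   # in-plane v (z)
```

For a plane through the origin, projecting thus amounts to dropping the coordinate along the plane normal and keeping the two in-plane coordinates.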
  • According to the invention, the 3D motion trajectory is now detected with a time-of-flight camera (TOF camera), from a viewing direction whose direction vector lies in the projection plane, runs parallel to it, or is arranged at an acute angle to the projection plane of less than 45°, preferably less than 30°.
  • The TOF camera observes the user's drawing finger from above or below, for example, while the user draws the characters upright in space.
  • It is possible, for example, to arrange the 3D camera in the headliner of the motor vehicle and to image the driver from above by means of the 3D camera.
  • Because the 3D motion trajectory drawn in this way is captured by a TOF camera, the person does not have to pay attention to the angle at which the movement is performed relative to the 3D camera.
  • The detection of the 3D motion trajectory by means of the TOF camera takes place with sufficient reliability in all three spatial directions.
  • In one embodiment, the TOF camera is part of a stereo camera arrangement. This advantageously avoids the shading effect known for TOF cameras, which arises when the TOF camera detects an object located freely in space from only one side, so that the depth dimensions of the object along the optical axis of the TOF camera cannot be detected.
  • Another way to compensate for shading is a hand model, as is known, for example, from the "Kinect"® product of the company Microsoft®.
  • An inclination of the projection plane in space can also be adapted to the respectively detected 3D motion trajectory.
  • For this purpose, an envelope of the 3D motion trajectory is determined.
  • For example, the dimensions of the edges of a virtual cuboid can be determined which completely encloses (envelops) the 3D motion trajectory and out of which the 3D motion trajectory would protrude if the dimension of one of its edges were reduced.
  • The expansion direction along which the envelope has its smallest extent is then determined, and the projection plane is set perpendicular to this expansion direction.
  • This expansion direction can also be determined, for example, by means of a principal component analysis (PCA) of the coordinates of the points describing the 3D motion trajectory.
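Determining the direction of smallest extent by PCA, as mentioned above, amounts to taking the eigenvector of the point cloud's covariance matrix with the smallest eigenvalue. A minimal sketch in Python/NumPy (the function name is an illustrative assumption, not from the patent):

```python
import numpy as np

def projection_plane_normal(points_3d):
    """Direction of smallest extent of a 3D trajectory, found via PCA:
    the covariance eigenvector with the smallest eigenvalue. The
    projection plane is then taken perpendicular to this direction."""
    pts = np.asarray(points_3d, dtype=float)
    centered = pts - pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov(centered.T))
    return eigvecs[:, 0]  # eigenvalues ascend: column 0 is the smallest

# A trajectory drawn roughly in the y-z plane: its smallest extent is
# along x, so the recovered normal points (up to sign) along the x axis.
traj = np.array([[0.01, 0.0, 0.0],
                 [0.02, 0.5, 1.0],
                 [0.00, 1.0, 0.2],
                 [0.01, 0.3, 0.8]])
normal = projection_plane_normal(traj)
```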
  • A further advantage results if the detection of the 3D motion trajectory is started, for each detection operation, when the sensor device detects that the input element penetrates into said spatial area, and is terminated again when the sensor device detects that the input element leaves the spatial area.
  • For example, the spatial area may be defined as a cuboid volume in space. Based on the coordinates of the input element, for example a fingertip, it can then be checked whether the fingertip is inside this cuboid or not. The user can then start the detection of a 3D motion trajectory by dipping his fingertip into the volume, and stop the detection process by pulling his hand out of the volume.
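The entry/exit logic for such a cuboid spatial area can be sketched as follows (a hypothetical Python illustration; `gate_trajectory` and the coordinate layout are assumptions, not from the patent):

```python
def inside_cuboid(p, lo, hi):
    """True if point p = (x, y, z) lies within the axis-aligned cuboid
    spanned by the corner points lo and hi."""
    return all(l <= c <= h for c, l, h in zip(p, lo, hi))

def gate_trajectory(fingertip_positions, lo, hi):
    """Collect the part of a tracked fingertip path inside the cuboid:
    detection starts when the fingertip dips into the volume and ends
    when it leaves the volume again."""
    segment, capturing = [], False
    for p in fingertip_positions:
        if inside_cuboid(p, lo, hi):
            capturing = True
            segment.append(p)
        elif capturing:
            break  # the input element has left the spatial area
    return segment

# Fingertip enters the unit cuboid, draws, and exits again.
positions = [(0.0, 0.0, 2.0), (0.5, 0.5, 0.5), (0.6, 0.5, 0.5), (2.0, 0.0, 0.0)]
segment = gate_trajectory(positions, lo=(0, 0, 0), hi=(1, 1, 1))
```

Only the two positions inside the volume would be kept as the 3D motion trajectory for recognition.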
  • Another way to reliably differentiate between an input procedure on the one hand and a hand movement that is not intended to cause an input on the other is achieved, according to one embodiment of the method, by detecting the 3D motion trajectory in the spatial area only if the sensor device detects that a hand of the user is at least partially within the spatial area and at least one finger of the hand has a respective predetermined finger position.
  • In other words, detection of the 3D motion trajectory is only triggered when the user makes a certain finger gesture while drawing by hand.
  • The invention also includes a motor vehicle.
  • The motor vehicle according to the invention has an operator interface with a character recognition device, that is to say automatic handwriting recognition, for handwritten character input into a component of the motor vehicle.
  • The component may be, for example, a navigation device or an infotainment system.
  • The character recognition device can be an embodiment known per se for 2D character trace recognition.
  • The motor vehicle according to the invention is characterized in that a TOF camera for handwritten character input is provided at the operator interface and the operator interface is designed to carry out an embodiment of the method according to the invention.
  • The camera is preferably arranged in a headliner of the motor vehicle.
  • With this arrangement, no disadvantages result in the detection of a 2D character trace for the character recognition device when the method according to the invention is used.
  • On the basis of the invention, contactless text input can be provided not only in a motor vehicle but also in a mobile terminal, for example a smartphone or a tablet PC.
  • To this end, the invention provides a computer program product with a program stored on at least one storage medium, which is designed, when the program is executed by a processor device of the mobile terminal, to perform an embodiment of the method according to the invention on the basis of camera data from a camera of the mobile terminal.
  • The invention also includes a mobile terminal which has an embodiment of the computer program product according to the invention.
  • This mobile terminal can then be operated in such a way that hands-free, contactless character input is possible.
  • Fig. 1 is a block diagram of an optical sensor device which may be installed in an embodiment of the motor vehicle according to the invention.
  • FIG. 2 shows a sketch of an operating procedure, as is possible for an operator on the basis of an embodiment of the method according to the invention.
  • The examples shown represent preferred embodiments of the invention.
  • FIG. 1 shows an optical sensor device 10 and a reproduction device 12 of a motor vehicle, for example a passenger car.
  • The reproduction device 12 can be, for example, an infotainment system, an audio system, a navigation system, a television system, a telephone, a combination instrument or a head-up display.
  • The sensor device 10 comprises a measuring device 14 and a calculation unit 16.
  • The measuring device 14 comprises an optical sensor 18, which may for example be a TOF camera or PMD camera (PMD: photon mixing detector).
  • The optical sensor 18 may also be a stereo arrangement, for example. In the example shown in FIG. 1, it is assumed that the optical sensor 18 is a PMD camera.
  • The optical sensor 18 may be arranged, for example, in a headliner of the motor vehicle.
  • The optical sensor 18 may be configured in a manner known per se, i.e. with a light source 20, e.g. an infrared light source, which illuminates a detection area 22, for example a space above a center console of the motor vehicle. If an object is present in it, for example a hand 24 of the driver of the motor vehicle, the electromagnetic radiation emitted by the light source 20 is reflected back by the hand 24 to a sensor array 26. By means of the sensor array 26, 3D image data can then be generated which indicate 3D coordinates for individual surface elements of the hand 24. The 3D image data are transmitted from the measuring device 14 to the calculation unit 16.
  • The calculation unit 16 may be, for example, a control unit of the motor vehicle.
  • There, the signals are evaluated, and the evaluated data are then made available to the vehicle, for example by being transmitted to the reproduction device 12.
  • Limbs, such as a hand, can be segmented from the 3D image data, whereby, for example, the position of a fingertip in the detection area 22 can be determined.
  • Segmentation algorithms known per se can be used for this purpose.
  • The 3D image data of the sensor array 26 of the optical sensor 18 may also represent a sequence of consecutive 3D images, i.e. movements of the hand 24 can likewise be detected with the optical sensor 18. By tracing the trajectory of, for example, the fingertip in this 3D image sequence, in particular by tracking the position and the speed of the fingertip, a motion gesture indicated by the fingertip can be extrapolated from the trajectory.
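Tracking the position and speed of the fingertip across consecutive 3D frames could look like this in outline (Python/NumPy; the frame interval `dt` and the function name are illustrative assumptions, not from the patent):

```python
import numpy as np

def fingertip_speeds(positions, dt):
    """Per-frame fingertip speed from a sequence of segmented 3D
    positions; dt is the frame interval in seconds."""
    pts = np.asarray(positions, dtype=float)
    deltas = np.diff(pts, axis=0)               # displacement between frames
    return np.linalg.norm(deltas, axis=1) / dt  # metres per second

# Three consecutive fingertip positions at 10 frames per second.
speeds = fingertip_speeds([[0.00, 0.00, 0.0],
                           [0.03, 0.00, 0.0],
                           [0.03, 0.04, 0.0]], dt=0.1)
```

A gesture classifier could then threshold or pattern-match these speeds alongside the positions.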
  • In FIG. 2, an interior 28 of a motor vehicle, such as a passenger car, is shown. Depicted are a center console 30 with a gear selector lever 32 and a dashboard 34. A driver can draw with his hand 24 freely in a space above the center console 30 and thus specify, for example, a place name for the selection of a navigation destination in the reproduction device 12, in this case a navigation device or a navigation module in an infotainment system. He can concentrate fully on the traffic, i.e. he is not distracted from his driving task. The character input also requires no electrical evaluation units for buttons or a touchpad.
  • The contactless automatic character recognition is provided by means of the optical sensor device 10, which represents an operating interface for the reproduction device 12.
  • The optical sensor 18 of the measuring device 14 can be arranged, for example, in a headliner above the center console 30.
  • An optical axis of a camera of the optical sensor can point vertically or obliquely downward.
  • A direction vector V of the optical axis is shown in FIG. 2.
  • The optical sensor 18 films the interior 28 above the center console 30. From the 3D image data thus generated, the calculation unit 16 extracts the image information belonging to the hand 24 and thus determines a position P of the fingertip of the hand 24.
  • The position P of the fingertip is a feature that is tracked by the calculation unit 16 over time. In the example, the driver, with his finger outstretched, has drawn the letter A in the air with his fingertip.
  • The position P of the fingertip has thereby described a trajectory 36, which has been detected by the calculation unit 16 on the basis of the 3D image data of the measuring device 14. The calculation unit monitors whether the position P is within a predetermined spatial area 38.
  • The spatial region 38 can be described, for example, by coordinates of its corners and edges in a coordinate system 40.
  • The calculation unit 16 detects an entry point 40 of the trajectory 36 into the spatial area 38 and an exit point 42 out of it, for example by means of a coordinate comparison.
  • The portion of the trajectory 36 located within the spatial area 38 forms a 3D motion trajectory 44, on the basis of which the character recognition takes place.
  • The 3D motion trajectory 44 is projected onto a projection plane 46 by the calculation unit 16. This takes place computationally within the calculation unit 16 and is illustrated in FIG. 2 within the spatial region 28 only for the sake of clarity.
  • The direction vector V of the optical axis may lie in the projection plane 46 or be aligned parallel to the projection plane 46.
  • An angle enclosed by the direction vector V and the projection plane 46 may also have a value between 0° and 45°.
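This angle condition can be checked numerically: the angle between a vector and a plane is the complement of its angle to the plane normal. A small illustrative Python/NumPy check (the vector values are made up for the example, not from the patent):

```python
import numpy as np

def angle_to_plane_deg(direction, plane_normal):
    """Angle between a direction vector and a plane, in degrees
    (0° means the vector lies in the plane). It is the complement of
    the angle between the vector and the plane normal."""
    d = direction / np.linalg.norm(direction)
    n = plane_normal / np.linalg.norm(plane_normal)
    return float(np.degrees(np.arcsin(abs(d @ n))))

# Optical axis pointing obliquely downward; projection plane = y-z plane
# (plane normal along x), so the tilt toward x sets the angle.
v = np.array([0.3, 0.0, -1.0])
angle = angle_to_plane_deg(v, np.array([1.0, 0.0, 0.0]))
```

For the example vector the angle stays well below the 45° (preferably 30°) bound stated by the method.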
  • In the example, the projection plane 46 is the y-z plane of a coordinate system K, as defined by the vehicle vertical axis and the vehicle lateral axis.
  • An orientation of the projection plane 46 in space can also be adjusted depending on the 3D motion trajectory 44.
  • The projection of the 3D motion trajectory 44 onto the projection plane 46 results in a 2D character trace 48, which is fed to a 2D handwriting recognition known from the prior art and realized in the calculation unit 16. Through the handwriting recognition, the 2D character trace 48 is assigned in a manner known per se to a recognized character 50, in this case the correctly recognized letter A.
  • The recognized character 50 is then output to the reproduction device 12 as a character input.
  • The example shows how certain features or body parts of the occupant can be traced (tracked) by the TOF camera, thanks to the precise spatial localization of the occupant, and how the trajectory obtained thereby can be evaluated.
  • For example, the tip of the outstretched finger can be tracked.
  • The y-z plane of the trajectory then reflects the drawn letter.

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to a method for operating a user interface for contactless character input into a device (12), wherein a sensor device (10) determines a 2D character trace (48) that represents at least one character (50) drawn by hand by a user using an input element (24), and a character recognition device (16) generates, depending on the 2D character trace (48), at least one recognized character (50) as character input and passes it on to the device (12). The object of the invention is to provide a user interface for a device that allows characters to be entered into the device within a small space. To this end, the sensor device (10) detects, by means of a time-of-flight (TOF) camera (26), a 3D motion trajectory (44) of the input element (24) in a predetermined spatial area (38), and the 3D motion trajectory (44) is projected onto a two-dimensional projection plane (46). The projection of the 3D motion trajectory (44) is then transmitted as the 2D character trace (48) to the character recognition device (16). The image acquisition can also be carried out from above.
PCT/EP2013/003528 2013-01-08 2013-11-22 User interface for handwritten character input into a device WO2014108150A2 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102013000072.6A DE102013000072A1 (de) 2013-01-08 2013-01-08 Bedienschnittstelle für eine handschriftliche Zeicheneingabe in ein Gerät
DE102013000072.6 2013-01-08

Publications (2)

Publication Number Publication Date
WO2014108150A2 true WO2014108150A2 (fr) 2014-07-17
WO2014108150A3 WO2014108150A3 (fr) 2014-12-04

Family

ID=49680973

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2013/003528 WO2014108150A2 (fr) 2013-11-22 User interface for handwritten character input into a device

Country Status (2)

Country Link
DE (1) DE102013000072A1 (fr)
WO (1) WO2014108150A2 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112036315A (zh) * 2020-08-31 2020-12-04 北京百度网讯科技有限公司 字符识别方法、装置、电子设备及存储介质
CN112540683A (zh) * 2020-12-08 2021-03-23 维沃移动通信有限公司 智能指环及手写文字的识别方法和电子设备

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102014224618A1 (de) * 2014-12-02 2016-06-02 Robert Bosch Gmbh Verfahren und Vorrichtung zum Betreiben einer Eingabevorrichtung
CN105872729A (zh) * 2015-04-21 2016-08-17 乐视致新电子科技(天津)有限公司 识别操作事件的方法和装置
DE102015010421A1 (de) 2015-08-11 2017-02-16 Daimler Ag Dreidimensionale Erfassung des Fahrzeuginnenraums

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001059975A2 (fr) * 2000-02-11 2001-08-16 Canesta, Inc. Procede et appareil destines a entrer des donnees a l'aide d'un dispositif d'entree virtuelle
US20060159344A1 (en) * 2002-12-26 2006-07-20 Xiaoling Shao Method and system for three-dimensional handwriting recognition
US20110254765A1 (en) * 2010-04-18 2011-10-20 Primesense Ltd. Remote text input using handwriting
US20110286676A1 (en) * 2010-05-20 2011-11-24 Edge3 Technologies Llc Systems and related methods for three dimensional gesture recognition in vehicles
DE102011089195A1 (de) * 2011-06-30 2013-01-03 Johnson Controls Gmbh Vorrichtung und Verfahren zur berührungslosen Erfassung von Gegenständen und/oder Personen und von diesen ausgeführten Gesten und/oder Bedienvorgängen

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8817087B2 (en) * 2010-11-01 2014-08-26 Robert Bosch Gmbh Robust video-based handwriting and gesture recognition for in-car applications



Also Published As

Publication number Publication date
DE102013000072A1 (de) 2014-07-10
WO2014108150A3 (fr) 2014-12-04


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13798575

Country of ref document: EP

Kind code of ref document: A2

DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase

Ref document number: 13798575

Country of ref document: EP

Kind code of ref document: A2