EP3762654A1 - Interaktionsmodul

Interaktionsmodul (Interaction Module)

Info

Publication number
EP3762654A1
Authority
EP
European Patent Office
Prior art keywords
interaction module
image
projector
camera
work surface
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19708261.3A
Other languages
German (de)
English (en)
French (fr)
Inventor
Markus Helminger
Gerald Horst
Philipp Kleinlein
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BSH Hausgeraete GmbH
Original Assignee
BSH Hausgeraete GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BSH Hausgeraete GmbH
Publication of EP3762654A1
Legal status: Pending

Classifications

    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C7/00 Stoves or ranges heated by electric energy
    • F24C7/08 Arrangement or mounting of control or safety devices
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F24 HEATING; RANGES; VENTILATING
    • F24C DOMESTIC STOVES OR RANGES; DETAILS OF DOMESTIC STOVES OR RANGES, OF GENERAL APPLICATION
    • F24C3/00 Stoves or ranges for gaseous fuels
    • F24C3/12 Arrangement or mounting of control or safety devices
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F27 FURNACES; KILNS; OVENS; RETORTS
    • F27D DETAILS OR ACCESSORIES OF FURNACES, KILNS, OVENS, OR RETORTS, IN SO FAR AS THEY ARE OF KINDS OCCURRING IN MORE THAN ONE KIND OF FURNACE
    • F27D21/00 Arrangements of monitoring devices; Arrangements of safety devices
    • F27D21/02 Observation or illuminating devices
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F27 FURNACES; KILNS; OVENS; RETORTS
    • F27D DETAILS OR ACCESSORIES OF FURNACES, KILNS, OVENS, OR RETORTS, IN SO FAR AS THEY ARE OF KINDS OCCURRING IN MORE THAN ONE KIND OF FURNACE
    • F27D21/00 Arrangements of monitoring devices; Arrangements of safety devices
    • F27D21/02 Observation or illuminating devices
    • F27D2021/026 Observation or illuminating devices using a video installation

Definitions

  • The invention relates to an interaction module.
  • In particular, the invention relates to an interaction module for dynamically displaying information on a work surface.
  • An interaction module comprises a projector arranged to project an image onto a work surface and an optical scanning device to detect a gesture.
  • For example, the projector may be used to project a control onto the work surface, and the scanning device may determine when a user touches the control with a finger.
  • A predetermined action can then be triggered, for example switching a device in the area of the work surface on or off.
  • The interaction module can be used in particular in the area of a work surface of a kitchen, and the control function can relate to a kitchen appliance, for example a stove, an oven or an extractor hood.
  • An object underlying the present invention is to provide an improved interaction module.
  • The invention achieves this object by means of the subject matter of the independent claims. The dependent claims set out preferred embodiments.
  • An interaction module comprises a projector which is set up to project a first image onto a work surface, and a camera which is set up to capture a second image of an object placed on the work surface.
  • The work surface usually has a horizontal surface, and the interaction module can be mounted above this surface.
  • The interaction module can use the camera to provide the second image.
  • The projector can support the function of the camera. For example, the projector may illuminate the object while the camera is taking the second image. In particular, if the interaction module is used in the area of a kitchen, a dish prepared there can be photographed promptly and with little effort.
  • The projector may be configured to project a position marker onto the work surface, the position marker indicating a scanning range of the camera.
  • The position marker may comprise a spot, a crosshair or an icon on which the object can be centered.
  • A viewfinder or similar output device can therefore be omitted.
  • The user can easily and precisely position the object in the scanning area of the camera. By displaying the position marker at a predetermined location, second images of different objects can be captured from the same perspective, so that the images can be compared more easily.
  • The position marker may give an indication of a limitation of the scanning range of the camera in the plane of the work surface.
  • The position marker can run along an outline of the area that can be imaged by means of the camera.
  • The outline can also run inside or outside the imageable area. The user can thereby more easily compose the image to be taken, for example by partially or wholly bringing an additional object such as a spice, an item of cutlery or an ingredient into the scanning area.
  • Preferably, the optical axes of the camera and the projector are close to each other, so that the position marker is visible on a part of the object if the object is not completely within the scanning range.
  • The camera and the projector are preferably mounted above the work surface, so that the object is located between the interaction module and the work surface. If the area that can be captured by the camera is completely illuminated by means of the projector, this effectively creates a cone or pyramid of light which at least partially illuminates the usually three-dimensional object. If a section of the object protrudes from this three-dimensional body of light, this is immediately recognizable by a user.
  • The position marker may lie within the area that can be imaged by the camera, in which case an outwardly projecting, unlit portion of the object will not be visible on the later second image.
  • Alternatively, the position marker may illuminate a region outside the imageable area, in which case an outwardly projecting portion of the object that is illuminated will not appear on the later second image.
  • The optical axes of the camera and the projector can be considered close if they are less than 20 cm, more preferably less than 15 cm, even more preferably less than approximately 10 cm apart. These distances reflect the usual proportions of a kitchen work surface, which may have a depth of about 60 to 65 cm and a clear height (for example, up to an overhead cabinet or extractor hood) of about 45 to 80 cm.
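How such a position marker could be derived is sketched below: a homography maps the corners of the camera's imageable area into projector pixels, where the outline can then be drawn. This is an illustrative sketch only; the calibration points, resolutions and the OpenCV-based approach are assumptions, not taken from the patent.

```python
import numpy as np
import cv2

# Calibration: four points on the work surface as seen by the camera (pixels),
# and the projector pixels that illuminate those same physical points.
camera_pts = np.array([[0, 0], [1920, 0], [1920, 1080], [0, 1080]], dtype=np.float32)
projector_pts = np.array([[210, 140], [1710, 150], [1700, 940], [220, 930]], dtype=np.float32)

# Homography mapping camera coordinates to projector coordinates.
H, _ = cv2.findHomography(camera_pts, projector_pts)

def scan_range_outline(H, width=1920, height=1080):
    """Map the border of the camera's imageable area into projector space."""
    corners = np.array([[[0, 0]], [[width, 0]], [[width, height]], [[0, height]]],
                       dtype=np.float32)
    return cv2.perspectiveTransform(corners, H).reshape(-1, 2)

# Draw the outline into the frame that the projector will display.
frame = np.zeros((1080, 1920, 3), dtype=np.uint8)
outline = scan_range_outline(H).astype(np.int32).reshape(-1, 1, 2)
cv2.polylines(frame, [outline], isClosed=True, color=(255, 255, 255), thickness=4)
```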
  • The interaction module may additionally include an optical scanning device arranged to detect a gesture of a user in an area above the work surface.
  • The interaction module can be set up to control a household appliance, more preferably a kitchen appliance.
  • The interaction module can also be used to control the camera.
  • For example, a control surface for the time-delayed release of the camera may be displayed, and the second image may be captured a predetermined time after a touch of the control surface by the user has been detected.
  • The camera can thus be operated easily and hygienically, even if the user's hands are not clean.
  • The optical scanning device may also be adapted to detect the touch of the control surface with another object, such as a wooden spoon or another utensil.
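A minimal sketch of the time-delayed release described above, assuming a `capture_fn` callback that actually takes the picture; the three-second default is an illustrative choice:

```python
import threading

def trigger_with_delay(capture_fn, delay_s=3.0):
    """Release the camera a predetermined time after the button touch,
    giving the user time to withdraw the hand from the scanning area."""
    timer = threading.Timer(delay_s, capture_fn)
    timer.start()
    return timer  # can be cancelled if the user aborts before the delay elapses

# Usage, wired to the touch gesture detected by the optical scanning device:
# trigger_with_delay(camera.capture, delay_s=3.0)
```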
  • Preferably, the first image projected by the projector comprises a representation of the second image. This allows precise control of the recorded second image. A user can thereby adapt the composition of the second image to his ideas in a particularly simple manner.
  • Preferably, the representation of the second image is arranged outside the scanning range of the camera.
  • The scanning range of the camera in this case is smaller than the area of the work surface onto which the projector can project. This avoids feedback problems in which the representation of the second image projected onto the work surface is captured again by the camera and reprojected, which can lead to an infinite picture-in-picture effect, especially when the picture content changes.
  • Controls and buttons are projected by the projector outside the scanning range of the camera. They are monitored by means of the optical scanning device in order to detect user gestures.
  • The scanning device is arranged in the interaction module. When the scanning device detects a user's finger approaching a button, corresponding control commands are triggered.
  • Such control commands may include recording or saving an image of the camera, or an optical change of the background image or the object illumination by the projector.
  • The projected buttons can be embodied, for example, as virtual pressure switches or rotary or sliding actuators.
  • The virtual buttons are preferably arranged in the vicinity of the representation of the second image.
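The layout constraint (preview and virtual buttons outside the camera's scanning range) can be checked with simple rectangle arithmetic. The coordinates below are invented for illustration:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Rect:
    x: int
    y: int
    w: int
    h: int

    def intersects(self, other: "Rect") -> bool:
        # Two axis-aligned rectangles overlap unless one is entirely
        # to the left/right or above/below the other.
        return not (self.x + self.w <= other.x or other.x + other.w <= self.x or
                    self.y + self.h <= other.y or other.y + other.h <= self.y)

scan_range = Rect(400, 200, 1100, 700)   # area the camera images
preview    = Rect(20, 200, 340, 255)     # representation of the second image
buttons    = [Rect(20, 480, 150, 60), Rect(200, 480, 150, 60)]

# Preview and buttons must lie outside the scan range, otherwise the projected
# preview would be captured and reprojected in an endless loop.
assert not scan_range.intersects(preview)
assert all(not scan_range.intersects(b) for b in buttons)
```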
  • Preferably, the projector is configured to illuminate the object with light of a predetermined spectrum.
  • The spectrum comprises different wavelength ranges of visible light, which can be rendered at different intensities.
  • For example, a cold light, a warm light or a colored light can be provided.
  • A spectrum adapted for food photography can be used to produce a realistic or pleasing second image of a dish.
  • The projector may also be configured to illuminate different portions of the object with different predetermined spectra. For example, if the object comprises a plate with meat and salad, the meat can be illuminated in reddish to brownish light tones, while the salad can be highlighted in greenish to yellowish shades. As a result, the user can see better, even before the second image is taken, which colors will be visible on the image afterwards.
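One conceivable way to realize region-dependent spectra is to build the projector frame from per-region masks and RGB tints, as in this sketch; the mask shapes and color values are arbitrary assumptions:

```python
import numpy as np

def illumination_image(shape, regions):
    """Build the projector frame: each masked region gets its own RGB tint,
    the rest stays neutral white."""
    frame = np.full((*shape, 3), 255, dtype=np.uint8)
    for mask, rgb in regions:
        frame[mask] = rgb
    return frame

h, w = 1080, 1920
meat = np.zeros((h, w), dtype=bool)
meat[300:600, 700:1100] = True           # region covering the meat
salad = np.zeros((h, w), dtype=bool)
salad[300:600, 1150:1500] = True         # region covering the salad

frame = illumination_image((h, w), [(meat, (200, 120, 90)),     # reddish-brown
                                    (salad, (170, 220, 120))])  # greenish-yellow
```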
  • The projector may also be configured to project a predetermined background around the object.
  • The background can include a color, a texture or a pattern. Additional objects, such as cutlery or a floral decoration, can also be projected onto the work surface.
  • The interaction module can also have an interface for receiving a background to be projected.
  • One or more backgrounds can be stored in a data memory. A user can thereby reuse his or her preferred backgrounds or, for example, consistently use a particular background with a watermark or a personal logo. The user can optionally select the background to be projected from several backgrounds stored in the data memory.
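A background memory of this kind could look like the following sketch; the class and method names are hypothetical, not defined by the patent:

```python
class BackgroundStore:
    """Sketch of a background memory: named backgrounds, uploaded via the
    module's interface and selectable for projection."""

    def __init__(self):
        self._backgrounds = {}

    def upload(self, name, image):
        # Received via the interface, e.g. from a smartphone or a cloud service.
        self._backgrounds[name] = image

    def names(self):
        # Offered to the user for selection, e.g. via projected buttons.
        return sorted(self._backgrounds)

    def select(self, name):
        return self._backgrounds[name]
```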
  • The interaction module can comprise a data memory which is set up to store a cooking recipe.
  • The interaction module may comprise a processing device which is set up to associate the second image with a recipe in the data memory. This allows the user to save the second image of a more or less successful attempt at the recipe for later use. The image can serve as a reminder or for the long-term refinement of the recipe.
  • The interaction module may further comprise an interface for providing the second image, for example to a social network. This allows the user to share the results of his efforts, for example within a social group. He can thus learn or teach the preparation of food in an improved way.
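The association between a second image and a stored recipe might be modeled as simply as this; the `Recipe` structure and the file path are illustrative assumptions:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Recipe:
    title: str
    images: list = field(default_factory=list)  # second images of past attempts

def attach_image(recipe: Recipe, image_path: str) -> None:
    """Associate a captured second image with a recipe in the data memory."""
    recipe.images.append((datetime.now().isoformat(), image_path))

goulash = Recipe("Gulasch")
attach_image(goulash, "/media/2019-02-25_goulash.jpg")
```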
  • A method of using an interaction module described herein comprises steps of projecting a first image onto a work surface by means of the projector, and capturing a second image of an object placed on the work surface by means of the camera.
  • The method can be carried out, in particular, completely or partially by means of a processing device, which can be included in the interaction module.
  • A part of the method may be in the form of a computer program product with program code means for carrying out the corresponding part of the method when the part runs on a processor.
  • The computer program product can also be stored on a computer-readable medium.
  • FIG. 1 shows an exemplary system with an interaction module;
  • FIG. 2 shows a flowchart of an exemplary method.
  • FIG. 1 shows an exemplary system 100 with an interaction module 105.
  • The interaction module 105 is mounted in the region of a work surface 110, wherein the work surface 110 may comprise, in particular, a table or worktop with a substantially horizontal surface.
  • The interaction module 105 is preferably mounted at a distance of at least about 35 cm above the work surface 110.
  • The interaction module 105 can be attached, in particular, to the underside of a piece of furniture or an appliance that is fastened in an area above the work surface 110.
  • In the depth direction, the distance of the interaction module 105 from a contact surface, in particular a wall, can be approximately 20 cm, for example.
  • The furniture or appliance may be attached to this contact surface.
  • The interaction module 105 may be configured to control a device, in particular a household appliance, in response to a gesture of a user.
  • The interaction module 105 can be provided, in particular, for use in a kitchen, and an exemplary appliance to be controlled can comprise, for example, an extractor hood 115.
  • The interaction module 105 comprises a projector 120, a camera 125, an optional scanning device 130 and usually a processing device 135. Furthermore, a data memory 140 and/or an interface 145 for, in particular, wireless data transmission may optionally be provided.
  • The projector 120, the camera 125 and the scanning device 130 are directed substantially to matching areas of the work surface 110.
  • A button can be projected onto the work surface 110 by means of the projector 120.
  • A user may touch the button with his or her finger; the touch can be detected by means of the scanning device 130 and converted into a corresponding control signal.
  • In this way, a device such as the extractor hood 115 can be controlled.
  • The projector 120 is usually set up to display any content, including moving pictures.
  • It is proposed to additionally equip the interaction module 105 with the camera 125 in order to take a picture of an object 150 arranged on the work surface 110.
  • The object 150 is exemplified here by a cooked dish, presented in a bowl on a plate together with a spoon.
  • The dish may have been prepared by a user with the assistance of the technical equipment of the kitchen shown, in particular the interaction module 105.
  • Before serving, the user can capture an electronic image of his work and optionally store it in the data memory 140 or provide it externally via the interface 145, for example to a service, in particular in a cloud, or to a social network.
  • Image capture is supported by the projector 120.
  • For example, a position marker can be projected onto the work surface 110 in order to give the user an impression of which area of the work surface 110 can be imaged by the camera 125.
  • The position marker may include, for example, a spot, a crosshair, a dot, a Siemens star or another figure on which the object 150 can be centered.
  • The position marker may also indicate a limitation of the imageable range. For this purpose, for example, the entire area of the work surface 110 that can be scanned by the camera 125 can be illuminated by means of the projector 120.
  • The projector 120 and the camera 125 are preferably mounted as close to each other as possible within the interaction module 105, so that it can be assumed to a good approximation that only those sections of the object 150 that are illuminated by the projector 120 will appear on the image.
  • Conversely, the position marking may lie outside the range that can be imaged by the camera 125, so that precisely those portions of the object 150 which will fall outside the image section are illuminated.
  • In the example shown, two sections 155 lie outside the imageable area. A user can perceive this through the lighting and decide for himself whether he is satisfied with this cropping or not.
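Whether, and how much of, the object protrudes from the imageable area could be estimated from two masks over the work surface, as in this toy sketch (grid size and shapes are invented):

```python
import numpy as np

def cropped_fraction(object_mask: np.ndarray, imageable_mask: np.ndarray) -> float:
    """Fraction of the object that lies outside the camera's imageable area,
    i.e. the sections that the position-marker lighting reveals as cut off."""
    outside = object_mask & ~imageable_mask
    return float(outside.sum()) / max(int(object_mask.sum()), 1)

# Toy example: a 10x10 work-surface grid; the camera images columns 0..6 only.
imageable = np.zeros((10, 10), dtype=bool)
imageable[:, :7] = True
obj = np.zeros((10, 10), dtype=bool)
obj[3:7, 4:9] = True

print(cropped_fraction(obj, imageable))  # 0.4: two columns of the object are cut off
```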
  • The object 150 may be illuminated by the projector 120 while the image is being captured, or the scene may be supplemented with a projected image or pattern that may extend onto the object 150 itself or the work surface 110.
  • For example, a pattern reminiscent of a tablecloth may be projected.
  • An additional object can also be projected into the area of the image.
  • The projector 120 can also be used to illuminate the object 150, wherein in particular a light intensity and/or a light temperature can be adapted to the object 150 to be recorded or to a user specification.
  • A portion, a partial object or a detail of the object 150 may also be removed from the image by projection or made inconspicuous.
  • The camera 125 may be triggered by a user making a corresponding gesture within a scanning area of the scanning device 130.
  • The scanning area can be as similar as possible to, and ideally coincident with, the recording area of the camera 125 or the projection area of the projector 120.
  • The image projected by the projector 120 may be overlaid with a button that the user can touch to control the taking of an image.
  • Preferably, the camera 125 is triggered with a time delay, in order to give the user time to remove his hand from the recording range of the camera 125 and to allow the projector 120 to remove the displayed button.
  • Preferably, the first image projected by the projector 120 comprises a representation of the second image, wherein the representation of the second image is arranged outside the scanning region of the camera 125.
  • In the vicinity of this representation, virtual buttons or operating elements are arranged which enable the user to trigger the camera 125 to record or store the second image, as well as to change the image background.
  • An operation of the virtual operating elements by the user is recognized by evaluating the gestures of the user detected by the scanning device 130.
  • A completed image can be stored in the data memory 140. In addition, it can be assigned, for example, to a recipe, which can likewise be stored in the data memory 140.
  • The image can also be provided to the outside by means of the interface 145, for example to a portable mobile computer (smartphone, laptop), a storage or processing service, or a social network.
  • FIG. 2 shows a flow chart of an exemplary method 200.
  • the method 200 can be executed in particular by means of the interaction module 105 and more preferably by means of the processing device 135.
  • In a first step 205, a background, a pattern, the image of an object 150 or other image information may be uploaded into the interaction module 105. From a collection of predetermined and/or user-defined backgrounds, one or more can later be selected for projection.
  • The object 150 can then be detected in the area of the work surface 110.
  • The detection can take place by means of the camera 125, by means of the scanning device 130 or by a user specification.
  • In one embodiment, the user specification can be made by gesture control, for which purpose a control surface is projected onto the work surface 110 by the projector 120, touched there by the user, and the touch is detected by means of the scanning device 130.
  • A position marker may be projected onto the work surface 110 to make it easier for the user to position the object 150 within the imaging area of the camera 125.
  • An indication, for example for further user guidance, can also be projected.
  • One or more buttons for further controlling the method 200 may also be projected.
  • Furthermore, a mark may be projected onto the object 150 that includes a garnishing or dividing proposal. This is particularly useful for a round object such as a cake, a pizza or a fruit.
  • For example, a pattern may be projected onto a cake to make it easier for the user to divide it into a predetermined number of equal parts. The number of parts can be predetermined or, in particular, selected in dialog form.
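The dividing proposal for a round object reduces to elementary geometry; a sketch, with the center point, radius and number of parts chosen arbitrarily:

```python
import math

def dividing_lines(center, radius, parts):
    """End points of radial cut lines that divide a round object into
    `parts` equal sectors."""
    cx, cy = center
    step = 2 * math.pi / parts
    return [(cx + radius * math.cos(k * step), cy + radius * math.sin(k * step))
            for k in range(parts)]

# Twelve equal slices for a cake centered at projector pixel (960, 540):
for x, y in dividing_lines((960, 540), radius=300, parts=12):
    pass  # draw a line from the center to (x, y) in the projected image
```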
  • A background may also be projected in the region of the object 150.
  • The background may have been uploaded beforehand in step 205, or may be otherwise predetermined or dynamically generated.
  • In addition, effect lighting can be output by means of the projector 120.
  • The effect lighting can be varied in particular in brightness, color spectrum, light temperature or tint.
  • The effect lighting can also influence the output of the background.
  • The camera 125 may then take an image of the object 150.
  • While the image is taken, the object 150 and/or a surrounding area of the work surface 110 are preferably illuminated by means of the projector 120.
  • The completed image may be assigned to another object in an optional step 235.
  • For example, the image can be assigned to a recipe, another image or further information that can be kept, in particular, in the data memory 140.
  • Finally, the image can be provided, in particular by means of the interface 145.
  • The providing may include storing or sending the image, for example to a social network. Before sending, the user may be given the opportunity to confirm the sending, to change the picture, to add a text or to perform other usual editing.
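Read end to end, the method 200 could be orchestrated roughly as follows. The `module` object and all of its method names are hypothetical placeholders for the components described above, not an API defined by the patent:

```python
def method_200(module):
    """Sketch of the flow of method 200 against a hypothetical module API."""
    background = module.receive_background()        # step 205: upload or selection
    module.project_position_marker()                # help positioning the object 150
    while not module.object_detected():             # camera, scanner or user input
        module.wait()
    module.project_background(background)           # background around the object
    module.set_effect_lighting(color_temperature_k=4500, brightness=0.8)
    image = module.capture_after_delay(seconds=3)   # step 230: take the second image
    module.assign(image, module.current_recipe())   # optional step 235: link to recipe
    module.provide(image)                           # store or send, e.g. social network
```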

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Mechanical Engineering (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Chemical & Material Sciences (AREA)
  • Combustion & Propulsion (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
EP19708261.3A 2018-03-07 2019-02-25 Interaktionsmodul Pending EP3762654A1 (de)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018203349.8A DE102018203349A1 (de) 2018-03-07 2018-03-07 Interaktionsmodul
PCT/EP2019/054526 WO2019170447A1 (de) 2018-03-07 2019-02-25 Interaktionsmodul

Publications (1)

Publication Number Publication Date
EP3762654A1 (de) 2021-01-13

Family

ID=65628749

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19708261.3A Pending EP3762654A1 (de) 2018-03-07 2019-02-25 Interaktionsmodul

Country Status (5)

Country Link
US (1) US20200408411A1 (en)
EP (1) EP3762654A1 (de)
CN (1) CN111788433B (zh)
DE (1) DE102018203349A1 (de)
WO (1) WO2019170447A1 (de)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7369627B2 (ja) * 2020-01-15 2023-10-26 リンナイ株式会社 加熱調理器

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100231506A1 (en) * 2004-09-07 2010-09-16 Timothy Pryor Control of appliances, kitchen and home
WO2006038577A1 (ja) * 2004-10-05 2006-04-13 Nikon Corporation プロジェクタ装置を有する電子機器
JP5347673B2 (ja) * 2009-04-14 2013-11-20 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
US8549418B2 (en) * 2009-12-23 2013-10-01 Intel Corporation Projected display to enhance computer device use
US9733789B2 (en) * 2011-08-04 2017-08-15 Eyesight Mobile Technologies Ltd. Interfacing with a device via virtual 3D objects
US20130044912A1 (en) * 2011-08-19 2013-02-21 Qualcomm Incorporated Use of association of an object detected in an image to obtain information to display to a user
US9606992B2 (en) * 2011-09-30 2017-03-28 Microsoft Technology Licensing, Llc Personal audio/visual apparatus providing resource management
CN102508578B (zh) * 2011-10-09 2015-07-22 清华大学深圳研究生院 投影定位装置及方法、交互系统和交互方法
DE102013200372A1 (de) * 2013-01-14 2014-07-17 BSH Bosch und Siemens Hausgeräte GmbH Kochfeld, Küchenarbeitsplatte mit einem integrierten Kochfeld und Küchenzeile
CN103914152B (zh) * 2014-04-11 2017-06-09 周光磊 三维空间中多点触控与捕捉手势运动的识别方法与系统
DE102014007172A1 (de) * 2014-05-15 2015-11-19 Diehl Ako Stiftung & Co. Kg Vorrichtung zur Bedienung eines elektronischen Geräts
CN106371593B (zh) * 2016-08-31 2019-06-28 李姣昂 一种投影交互式书法练习系统及其实现方法
CN106873789B (zh) * 2017-04-20 2020-07-07 歌尔科技有限公司 一种投影系统

Also Published As

Publication number Publication date
WO2019170447A1 (de) 2019-09-12
CN111788433B (zh) 2022-12-27
CN111788433A (zh) 2020-10-16
DE102018203349A1 (de) 2019-09-12
US20200408411A1 (en) 2020-12-31

Similar Documents

Publication Publication Date Title
EP3593683B1 (de) Steuerung eines hausgeräts
EP3500798A1 (de) Feststellen eines bräunungsgrads von gargut
EP1505350A2 (de) Verfahren und Vorrichtung zur Bedienung eines Kochsystems
DE102013110644B3 (de) Verfahren zum Garen von Lebensmitteln in einem Gargerät
DE102014115966A1 (de) Beleuchtungsvorrichtung
DE102017209841A1 (de) Anzeigesystem, Dunstabzug und Verfahren zur Anzeige zumindest eines Zustandes auf einem Kochfeld
DE112014002627T5 (de) Ophthalmologische Abbildungsvorrichtung undophthalmologische Bildanzeigevorrichtung
DE102018221749A1 (de) Backofen und Steuerverfahren
DE102015014700A1 (de) Medizinische Beleuchtungsvorrichtung und medizinische Gestensteuerungsvorrichtung
EP3762654A1 (de) Interaktionsmodul
DE19731303A1 (de) Verfahren und Vorrichtung zum kontaktlosen, helmfreien Messen der Blickrichtung von Augen bei größeren und schnelleren Kopf- und Augenbewegungen
EP3203153B1 (de) Küchengerät mit einer beleuchtungseinheit und verfahren zur betätigung einer beleuchtungseinheit
DE102011002577A1 (de) Fernsteuerungseinrichtung zur Steuerung einer Vorrichtung anhand eines beweglichen Objektes sowie Schnittstellen-Modul zur Kommunikation zwischen Modulen einer derartigen Fernsteuerungseinrichtung oder zwischen einem der Module und einer externen Vorrichtung
EP4274997A1 (de) Verfahren zum bestimmen eines garzeiteindes von gargut sowie haushaltsgargerät
EP3948091B1 (de) Verfahren zum zubereiten eines garguts mit optisch angezeigten gargutzonen des garguts, gargerät und computerprogrammprodukt
EP2790151B1 (de) Verfahren zur echtzeitfähigen Materialschätzung und zur materialbasierten Bildsegmentierung in elektronischen Bildsequenzen
DE102018214391A1 (de) Interaktionseinrichtung
BE1030917B1 (de) Gargerät und Verfahren zum Betreiben eines Gargeräts
BE1029940B1 (de) Küchengerät, System mit einem Küchengerät und einer externen Computereinheit und Verfahren zum Betrieb des Küchengeräts oder des Systems
EP4176426A1 (de) Lebensmittelzubereitungsvorrichtung und verfahren zur lebensmittelzubereitung
DE112020003104T5 (de) Datenverarbeitungssystem und Verfahren zur Datenverarbeitung
EP4118382A1 (de) Einstellen eines ziel-bräunungsgrads an einem haushalts-gargerät
DE102023209367A1 (de) Betreiben eines Gargeräts mit einer digitalen Garraum-Farbkamera
EP3938710A1 (de) Optisches erkennen eines garguts
DE102020214872A1 (de) Haushaltsgerät mit einer Bedieneinheit und Verfahren zum Anzeigen von Inhalten geräteexterner Datenquellen

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20201007

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20230117