EP2816536B1 - Empty goods return machine - Google Patents

Empty goods return machine

Info

Publication number
EP2816536B1
Authority
EP
European Patent Office
Prior art keywords
vending machine
recognition unit
reverse vending
gesture recognition
user
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP13172507.9A
Other languages
German (de)
French (fr)
Other versions
EP2816536A1 (en)
Inventor
Roy Gergs
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wincor Nixdorf International GmbH
Original Assignee
Wincor Nixdorf International GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wincor Nixdorf International GmbH filed Critical Wincor Nixdorf International GmbH
Priority to EP13172507.9A (EP2816536B1)
Priority to DK13172507.9T (DK2816536T3)
Publication of EP2816536A1
Application granted
Publication of EP2816536B1
Legal status: Not-in-force
Anticipated expiration

Classifications

    • G PHYSICS
    • G07 CHECKING-DEVICES
    • G07F COIN-FREED OR LIKE APPARATUS
    • G07F7/00Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus
    • G07F7/06Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles
    • G07F7/0609Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles by fluid containers, e.g. bottles, cups, gas containers

Definitions

  • The invention relates to a reverse vending machine according to the preamble of claim 1.
  • Such a reverse vending machine comprises a processing device for accepting or dispensing an empties object or for issuing a receipt.
  • Via such a reverse vending machine, returnable deposit bottles or deposit-bearing empties crates can be returned: a user throws or pushes a deposit bottle or an empties crate into a receiving device of the reverse vending machine configured for this purpose and in return is paid an amount corresponding to the deposit value of the returned empties, or receives a receipt (deposit voucher) over the deposit value for payout at a checkout.
  • In conventional reverse vending machines, an acceptance device is activated when a user throws or pushes an empties object into the acceptance device.
  • As a result of the activation, a conveyor of the acceptance device is set in motion, which feeds the empties object to a detection unit and then forwards it for storage or compaction (in the case of disposable plastic bottles).
  • After returning empties objects, a user must additionally operate a suitable input device, for example a button, to have the deposit paid out or to receive a receipt. This requires, on the one hand, providing such an input device on the reverse vending machine and, on the other hand, an additional operating step for the user when returning empties objects.
  • In the return station for reusable containers known from DE 298 12 678 U1, a switch-on device for switching on a detection unit is provided.
  • The switch-on device can be formed, for example, by a light barrier, a light scanner or a sensor which detects the approach of a reusable container to be returned and as a result switches on a detection unit.
  • DE 10 2011 109 392 A1 discloses a vending machine with a goods-dispensing device in which a user interface with a touchscreen is provided.
  • The touchscreen serves, in a touch-sensitive manner, to enter commands and additionally to display information.
  • DE 10 2010 040 177 A1 discloses a vending or take-back machine, for example for taking back empty bottles, in which a touch screen is provided.
  • FR 2 970 797 A1 discloses a device referred to as an interactive facade, in which a touch of the facade can be detected by means of image matrix sensors.
  • From WO 2009/021227 A2, a take-back system for recyclable goods is known in which control can be performed, for example, by means of data gloves whose movements are converted into computer gestures.
  • US 2007/0086764 A1 describes a gesture control for a camera.
  • The object of the present invention is to provide a reverse vending machine and a method for operating a reverse vending machine that allow operation for returning an empties object that is as simple and as comfortable for the user as possible.
  • Accordingly, in the reverse vending machine, a gesture recognition unit is additionally provided which is designed to recognize an operating gesture of a user located in a detection area of the gesture recognition unit and to convert it into a control command for controlling the processing device.
  • The present invention is based on the idea of using a gesture recognition unit to control a reverse vending machine.
  • The gesture recognition unit is designed to recognize at least one operating gesture of a user.
  • The operating gesture may consist of a movement of the user (for example, the movement of an arm bringing an empties object toward an acceptance device), a viewing direction of the user, a facial expression of the user, a posture of the user or the like, possibly in combination with an acoustic utterance of the user.
  • The gesture recognition unit detects such a gesture and evaluates it in order to generate, in dependence on the operating gesture, a control command that is then used to control the processing device, for example an acceptance device for accepting an empties object or an output device for dispensing money or issuing a receipt. Depending on the control command, an acceptance device can, for example, be activated or stopped again, money corresponding to a deposit value or a receipt can be dispensed, or instructions can be output to a user, for example on how to insert an empties object correctly or on further operation.
  • The gesture recognition unit is designed to recognize an operating gesture within a detection area of the gesture recognition unit. If a user is at least partially within the detection area, an operating gesture of the user can be evaluated and converted into a suitable control command.
  • The detection area advantageously lies in front of a housing of the reverse vending machine and covers an area in which a user typically stands in order to feed an empties object to an acceptance device of the reverse vending machine or to perform some other operating action on the reverse vending machine.
  • By means of the gesture recognition unit, for example, a spatial area with a radius of up to four meters, preferably up to two meters, in front of the reverse vending machine can be covered, the gesture recognition unit capturing a spatial volume and thus recognizing operating gestures of a user in three dimensions.
  • The sensor device is generally designed to detect an operating gesture in three-dimensional space.
  • The sensor device can, for example, be designed as described in US 2008/0240502 A1, US 7,433,024 B2, US 2009/0185274 A1 or US 2010/0020078 A1, the contents of which are hereby incorporated in full.
  • The projection device of the sensor device can be designed, for example, to generate an infrared projection image.
  • The projection device serves to project a projection image in the form of a predetermined pattern into the detection space, so that reflections of the projected projection image can be captured by means of the recording device and a depth image can be generated from the projection image captured in this way, which contains pinpoint information about the distance of an object from the sensor device.
  • The depth image is generated by evaluating distortions of the projected projection image caused by reflections at an object located in the detection area.
  • The projection device projects a predetermined pattern, for example a dot or line pattern, into the space of the detection area, where it is reflected at objects within the detection area.
  • The projected projection image is recorded by the recording device, and depth information is calculated from the deviation between the reflected pattern and the known, predetermined pattern projected by the projection device.
  • Because the projection device and likewise the recording device operate in the infrared wavelength range, the calculation of the depth information is insensitive to daylight.
  • The recording device may, for example, have a frame rate of 30 fps for recording moving images and a resolution in the full-HD range (1920 x 1080 pixels) or lower (for example 1280 x 1024 pixels).
  • The camera is preferably designed as a color camera (RGB camera).
  • The camera thus serves to record a color image; it may, for example, have a frame rate of 30 fps and a resolution in the full-HD range (1920 x 1080 pixels) or lower (for example 1280 x 1024 pixels).
  • The sensor device preferably comprises at least two spatially spaced microphones for the spatial detection of an acoustic signal.
  • By means of two or more microphones, an acoustic signal can thus be recorded spatially, so that the acoustic source can also be localized in space.
  • The sensor device with its individual sensors can be fixedly arranged on a housing of the reverse vending machine. It is, however, also conceivable for the sensor device to be adjustable by motor relative to the housing, for example by being tiltable about a horizontal pivot axis over a predetermined angular range. This allows the detection area to be adapted, for example, to the presence and behavior of a user, for example to the body height of a user or his approach to the reverse vending machine, and the adjustment can take place automatically as a result of sensor signals detected by the sensor device.
  • The gesture recognition unit can be designed, for example, to recognize, within the detection area, an operating gesture performed by a user, for example the approach of an empties object toward an acceptance device or the termination of an insertion process or the like, in order to control the reverse vending machine in dependence on a recognized operating gesture.
  • By means of the gesture recognition unit, it is recognized, in accordance with the claim, whether an empties object that is brought up or inserted is one that can be accepted and processed at the reverse vending machine. Depending on this recognition, an empties object can then be rejected or accepted at an early stage.
  • The provision of a separate detection unit on an acceptance device of the reverse vending machine can thus be dispensed with.
  • The object is further achieved by a method for operating a reverse vending machine.
  • The reverse vending machine has a processing device for accepting or dispensing an empties object or for issuing a receipt. It is provided that a gesture recognition unit recognizes an operating gesture of a user located in a detection area of the gesture recognition unit and converts it into a control command for controlling the processing device.
  • Fig. 1 shows a schematic view of a reverse vending machine 1 that is configured for the return, by a user N, of an empties object F in the form of a reusable or disposable bottle (see Fig. 3) or for the return of an empties crate.
  • For this purpose, the reverse vending machine 1 has an acceptance device 10 in the form of an acceptance shaft for pushing or throwing in an empties object F in the form of a bottle, and an acceptance device 11 for placing an empties crate.
  • Via the acceptance devices 10, 11, inserted empties objects F can be fed to a detection unit for recognizing the empties object F and then be transported by means of a suitable conveyor to a storage location or a compacting unit (in the case of disposable plastic bottles).
  • The reverse vending machine 1 serves, in a manner known per se, to accept empties objects and in return pays a user N a deposit value in cash via an output device 12, or issues a receipt (deposit voucher) by means of which the user N can have the deposit value paid out at a checkout.
  • The reverse vending machine 1 illustrated in Fig. 1 has a gesture recognition unit 2 for detecting an operating gesture of a user N in a detection area E (see Fig. 3).
  • The gesture recognition unit 2 is designed to recognize an operating gesture, for example a movement of the user N, a posture of the user N, a viewing direction of the user N or the like, and to convert it into a control command on the basis of which the acceptance devices 10, 11 or the output device 12 (which realize the processing devices of the reverse vending machine 1) can be controlled.
  • Such gesture recognition units are known today, for example, from mobile phones or video game consoles.
  • The gesture recognition unit 2 of the reverse vending machine 1 can, for example, recognize that a user N directs his gaze at a particular acceptance device 10, 11, turns toward an acceptance device 10, 11, or brings an empties object F toward an acceptance device 10, 11.
  • As a result, the gesture recognition unit 2 can generate a control command that activates one or more acceptance devices 10, 11 or the output device 12 and thus readies them for the acceptance of an empties object F or causes the dispensing of a deposit or a deposit voucher.
  • By using the gesture recognition unit 2, operation of the reverse vending machine 1 by a user N becomes simple and intuitive.
  • In particular, the user N does not have to press any keys or operate any other input devices.
  • Because he behaves and moves in an intuitive way to return an empties object F, the reverse vending machine 1 is controlled automatically, without any special, conscious interaction of the user N via other input devices being required.
  • An embodiment of a gesture recognition unit 2 is shown schematically in Fig. 2.
  • The gesture recognition unit 2 has, for example, two microphones 20, 21, which are arranged at a distance from one another and serve to record a three-dimensional sound field.
  • The gesture recognition unit 2 further has a projection device 22 for projecting an infrared projection image into the detection area E.
  • Such an infrared projection image P, as shown schematically in Fig. 4, may, for example, take the form of a predetermined pattern, for example a dot or line pattern. Reflections of the projection image P at a user N or another object in the detection area E, which lies in front of the reverse vending machine 1 (see Fig. 3), are recorded by means of a recording device 23 designed in the manner of an infrared camera with a frame rate of, for example, 30 fps (frames per second).
  • From the deviations between the recorded projection image P and the known projected pattern, depth information can be calculated in the manner of a depth image, so that the distance of a user N, and of his body regions and limbs, from the gesture recognition unit 2 can be determined precisely and with high spatial resolution.
  • The gesture recognition unit 2 further has a camera 24 in the form of a color camera (RGB camera) for recording color images with a frame rate of, for example, 30 fps.
  • The camera 24 is thus designed to record images in the visible light range, so that color and brightness information can be obtained and evaluated via the camera 24.
  • The microphones 20, 21, the projection device 22, the recording device 23 and the camera 24 together realize a sensor device of the gesture recognition unit 2 for acquiring sensor signals.
  • The sensor signals, that is to say the signals of the microphones 20, 21, the recording device 23 and the camera 24, are supplied to a control device 27, for example in the form of a chip, and are evaluated by the control device 27 to generate one or more suitable control commands.
  • By means of the control device 27, the acceptance devices 10, 11 or the output device 12 can then, for example, be driven to perform a predetermined action.
  • The control device 27 may be connected to a memory 28.
  • Predetermined operating-gesture patterns may be stored in the memory 28 in the manner of a database, so that recognition can be carried out on the basis of a comparison of a recorded operating gesture with a previously stored operating-gesture pattern, from which a control command can then be generated.
  • The control device 27 also interacts with loudspeakers 25, 26 and a screen 29 arranged on a housing 13 of the reverse vending machine 1.
  • Via the loudspeakers 25, 26, for example, acoustic output can be provided in order to guide a user N in operating the reverse vending machine 1, to generate warnings or to output other acoustic feedback, for example a warning tone or the like.
  • Via the screen 29, operating instructions can be output in a comparable manner, or advertising or the like can be played.
  • The sensor device 20-24 of the gesture recognition unit 2 serves to perform gesture recognition within a detection area E that lies in front of a housing 13 of the reverse vending machine 1.
  • Gesture recognition can be performed here, for example, within an area with a radius of up to four meters, for example up to two meters, around the gesture recognition unit 2, so that only gestures of a user N who has approached the reverse vending machine 1 are detected and recognized.
  • The detection area E extends spatially in three dimensions and may, for example, have the shape of a detection funnel.
  • The sensor device 20-24 can be arranged on the housing 13 of the reverse vending machine 1 such that its individual sensors (microphones 20, 21, projection device 22, recording device 23, camera 24) or the sensor device as a whole are adjustable.
  • The adjustability may be such that the sensor device 20-24 as a whole, or individual sensors 20-24 independently of one another, can be pivoted about a horizontal pivot axis S (see Fig. 3), so that the height of the detection area E can be adjusted automatically, for example in dependence on the body height of a user N.
  • By means of the gesture recognition unit provided, fundamentally quite different operating gestures of a user can be recognized and evaluated, so that operation of a reverse vending machine can proceed entirely without (conscious) user interaction.
  • A reverse vending machine can, for example, be activated as a whole as soon as a user enters the detection area.
  • An acceptance device can be activated when the user moves an empties object in the direction of the acceptance device. And when the user subsequently directs his gaze, for example, toward an output device, the deposit or a deposit voucher can be dispensed without the user having to operate an input device of any kind, such as a button or a key.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)

Description

The invention relates to a reverse vending machine according to the preamble of claim 1.

Such a reverse vending machine comprises a processing device for accepting or dispensing an empties object or for issuing a receipt.

Via such a reverse vending machine, for example, returnable deposit bottles or deposit-bearing empties crates can be returned: a user throws or pushes a deposit bottle or an empties crate into a receiving device of the reverse vending machine configured for this purpose and in return is paid an amount corresponding to the deposit value of the returned empties, or receives a receipt (deposit voucher) over the deposit value for payout at a checkout.

In conventional reverse vending machines, an acceptance device is activated, for example, when a user throws or pushes an empties object into the acceptance device. As a result of the activation, for example, a conveyor of the acceptance device is set in motion, which feeds the empties object to a detection unit and then forwards it for storage or compaction (in the case of disposable plastic bottles). Because the acceptance device is only activated once the empties object has been thrown or pushed into, for example, a receiving shaft of the acceptance device, the activation is comparatively sluggish and may involve an (albeit short) waiting time for the user.

After returning empties objects, a user must additionally operate a suitable input device, for example a button, to have the deposit value paid out or to receive a receipt. This requires, on the one hand, providing such an input device on the reverse vending machine and, on the other hand, an additional operating step for the user when returning empties objects.

In the return station for reusable containers known from DE 298 12 678 U1, a switch-on device for switching on a detection unit is provided. The switch-on device can be formed, for example, by a light barrier, a light scanner or a sensor which detects the approach of a reusable container to be returned and as a result switches on a detection unit.

DE 10 2011 109 392 A1 discloses a vending machine with a goods-dispensing device in which a user interface with a touchscreen is provided. The touchscreen serves, in a touch-sensitive manner, to enter commands and additionally to display information.

US 2010/0103131 A1 and WO 2012/072835 A1 each disclose a vending machine with a touchscreen device.

DE 10 2010 040 177 A1 discloses a vending or take-back machine, for example for taking back empty bottles, in which a touch screen is provided.

FR 2 970 797 A1 discloses a device referred to as an interactive facade, in which a touch of the facade can be detected by means of image matrix sensors.

From WO 2009/021227 A2, a take-back system for recyclable goods is known in which control can be performed, for example, by means of data gloves whose movements are converted into computer gestures.

From the publication by M. Morales et al., "Vendor Profile: Omek Interactive: Gesture Recognition and Machine Vision Technology for Intelligent Systems", XP055085512, August 2012, a gesture control for a vending machine is known.

US 2007/0086764 A1 describes a gesture control for a camera.

The object of the present invention is to provide a reverse vending machine and a method for operating a reverse vending machine that allow operation for returning an empties object that is as simple and as comfortable for the user as possible.

This object is achieved by a subject-matter having the features of claim 1.

Accordingly, in the reverse vending machine, a gesture recognition unit is additionally provided which is designed to recognize an operating gesture of a user located in a detection area of the gesture recognition unit and to convert it into a control command for controlling the processing device.

The present invention is based on the idea of using a gesture recognition unit to control a reverse vending machine. The gesture recognition unit is designed to recognize at least one operating gesture of a user. The operating gesture may consist of a movement of the user (for example, the movement of an arm bringing an empties object toward an acceptance device), a viewing direction of the user, a facial expression of the user, a posture of the user or the like, possibly in combination with an acoustic utterance of the user. The gesture recognition unit detects such a gesture and evaluates it in order to generate, in dependence on the operating gesture, a control command that is then used to control the processing device, for example an acceptance device for accepting an empties object or an output device for dispensing money or issuing a receipt. Depending on the control command, an acceptance device can, for example, be activated or stopped again, money corresponding to a deposit value or a receipt can be dispensed, or instructions can be output to a user, for example on how to insert an empties object correctly or on further operation.
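
Purely as an illustrative sketch (in Python) of the step from a recognized operating gesture to a control command for the processing device, and not as part of the patent disclosure: the gesture and command names below are hypothetical, and the patent does not prescribe any particular mapping.

    from enum import Enum, auto
    from typing import Optional

    class Gesture(Enum):
        APPROACH_ACCEPTANCE_DEVICE = auto()  # arm brings an empties object toward an acceptance device
        GAZE_AT_OUTPUT_DEVICE = auto()       # user looks toward the output device
        LEAVE_DETECTION_AREA = auto()        # user steps out of the detection area

    class Command(Enum):
        ACTIVATE_ACCEPTANCE = auto()
        ISSUE_DEPOSIT_OR_RECEIPT = auto()
        STOP_ACCEPTANCE = auto()

    # hypothetical lookup table: which control command follows which recognized gesture
    GESTURE_TO_COMMAND = {
        Gesture.APPROACH_ACCEPTANCE_DEVICE: Command.ACTIVATE_ACCEPTANCE,
        Gesture.GAZE_AT_OUTPUT_DEVICE: Command.ISSUE_DEPOSIT_OR_RECEIPT,
        Gesture.LEAVE_DETECTION_AREA: Command.STOP_ACCEPTANCE,
    }

    def to_control_command(gesture: Gesture) -> Optional[Command]:
        """Convert a recognized operating gesture into a control command, if one is defined."""
        return GESTURE_TO_COMMAND.get(gesture)

    print(to_control_command(Gesture.APPROACH_ACCEPTANCE_DEVICE))  # Command.ACTIVATE_ACCEPTANCE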

The gesture recognition unit is designed to recognize an operating gesture within a detection area of the gesture recognition unit. If a user is at least partially within the detection area, an operating gesture of the user can be evaluated and converted into a suitable control command. The detection area advantageously lies in front of a housing of the reverse vending machine and covers an area in which a user typically stands in order to feed an empties object to an acceptance device of the reverse vending machine or to perform some other operating action on the reverse vending machine.

Thus, by means of the gesture recognition unit, for example, a spatial area with a radius of up to four meters, preferably up to two meters, in front of the reverse vending machine can be covered, the gesture recognition unit capturing a spatial volume and thus recognizing operating gestures of a user in three dimensions.
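
A minimal, purely illustrative check (in Python) of whether a tracked user position lies within such a detection area might look as follows; the two-meter radius is taken from the text above, while the coordinate convention and the hemispherical shape are assumptions.

    import math

    DETECTION_RADIUS_M = 2.0  # preferred radius in front of the machine, per the description

    def user_in_detection_area(x_m: float, y_m: float, z_m: float) -> bool:
        """Return True if a 3D point (meters, sensor-centered coordinates) lies
        inside an assumed hemispherical detection area in front of the housing."""
        in_front = z_m > 0.0  # assumed: +z points away from the housing, toward the user
        distance = math.sqrt(x_m**2 + y_m**2 + z_m**2)
        return in_front and distance <= DETECTION_RADIUS_M

    # example: a user standing about 1.5 m in front of the machine is inside the area
    print(user_in_detection_area(0.2, -0.3, 1.5))  # True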

The gesture recognition unit preferably comprises at least one sensor device for providing a sensor signal and a control device for generating a control command from the sensor signal. The sensor device may, for example, comprise

  • at least one microphone for detecting an acoustic signal,
  • at least one projection device for generating a projection image in the detection area and a recording device for recording the projected projection image from the detection area, and/or
  • a camera for recording an image of visible light from the detection area.

The sensor device is thus designed to record different optical and/or acoustic signals from the detection area in order to recognize an operating gesture on the basis of the received signals and to convert it into a suitable control command.

The sensor device is generally designed to detect an operating gesture in three-dimensional space. The sensor device can, for example, be designed as described in US 2008/0240502 A1, US 7,433,024 B2, US 2009/0185274 A1 or US 2010/0020078 A1, the contents of which are hereby incorporated in full.

The projection device of the sensor device can be designed, for example, to generate an infrared projection image. The projection device serves to project a projection image in the form of a predetermined pattern into the detection space, so that reflections of the projected projection image can be captured by means of the recording device and a depth image can be generated from the projection image captured in this way, which contains pinpoint information about the distance of an object from the sensor device.

The depth image is generated by evaluating distortions of the projected projection image caused by reflections at an object located in the detection area. The projection device projects a predetermined pattern, for example a dot or line pattern, into the space of the detection area, where it is reflected at objects within the detection area. The projected projection image is recorded by the recording device, and depth information is calculated from the deviation between the reflected pattern and the known, predetermined pattern projected by the projection device.
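
As an illustrative sketch of this principle only (a generic structured-light / active-triangulation calculation, not code from the patent): for a rectified projector-camera pair, the depth of a matched pattern dot follows from the lateral shift (disparity) between where the dot is observed and where it appears in the known reference pattern. The focal length and baseline values below are assumptions.

    def depth_from_disparity(disparity_px: float,
                             focal_length_px: float = 580.0,  # assumed IR camera focal length in pixels
                             baseline_m: float = 0.075) -> float:  # assumed projector-camera baseline
        """Active-triangulation depth estimate (meters) for one matched pattern dot.

        disparity_px: horizontal shift in pixels between the dot recorded by the
        IR recording device and the dot's position in the known reference pattern.
        """
        if disparity_px <= 0:
            raise ValueError("disparity must be positive for a finite depth")
        return focal_length_px * baseline_m / disparity_px

    # example: a dot shifted by 29 pixels corresponds to roughly 1.5 m distance
    print(round(depth_from_disparity(29.0), 2))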

Because the projection device and likewise the recording device operate in the infrared wavelength range, the calculation of the depth information is insensitive to daylight.

The recording device may, for example, have a frame rate of 30 fps for recording moving images and a resolution in the full-HD range (1920 x 1080 pixels) or lower (for example 1280 x 1024 pixels).

The camera is preferably designed as a color camera (RGB camera). The camera thus serves to record a color image; it may, for example, have a frame rate of 30 fps and a resolution in the full-HD range (1920 x 1080 pixels) or lower (for example 1280 x 1024 pixels).

The sensor device preferably comprises at least two spatially spaced microphones for the spatial detection of an acoustic signal. By means of two or more microphones, an acoustic signal can thus be recorded spatially, so that the acoustic source can also be localized in space.
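
As a purely illustrative sketch (not taken from the patent), two spaced microphones allow the direction of a sound source to be estimated from the time difference of arrival between the two channels; the microphone spacing and sample rate below are assumptions, and the two signals are assumed to be synchronized and of equal length.

    import numpy as np

    SPEED_OF_SOUND = 343.0   # m/s at room temperature
    MIC_SPACING_M = 0.2      # assumed distance between the two microphones
    SAMPLE_RATE_HZ = 48_000  # assumed audio sample rate

    def direction_of_arrival(left: np.ndarray, right: np.ndarray) -> float:
        """Estimate the azimuth (radians, 0 = straight ahead) of a sound source
        from two synchronized microphone signals via cross-correlation (TDOA)."""
        corr = np.correlate(left, right, mode="full")
        lag_samples = int(np.argmax(corr)) - (len(right) - 1)
        tau = lag_samples / SAMPLE_RATE_HZ  # time difference of arrival in seconds
        # clamp to the physically possible range before taking the arcsine
        x = np.clip(SPEED_OF_SOUND * tau / MIC_SPACING_M, -1.0, 1.0)
        return float(np.arcsin(x))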

The sensor device with its individual sensors can be fixedly arranged on a housing of the reverse vending machine. It is, however, also conceivable for the sensor device to be adjustable by motor relative to the housing, for example by being tiltable about a horizontal pivot axis over a predetermined angular range. This allows the detection area to be adapted, for example, to the presence and behavior of a user, for example to the body height of a user or his approach to the reverse vending machine, and the adjustment can take place automatically as a result of sensor signals detected by the sensor device.

The gesture recognition unit can be designed, for example, to recognize, within the detection area, an operating gesture performed by a user, for example the approach of an empties object toward an acceptance device or the termination of an insertion process or the like, in order to control the reverse vending machine in dependence on a recognized operating gesture. In this context, it is also conceivable and possible to combine the gesture recognition unit with a detection unit for detecting and recognizing an empties object. By means of the gesture recognition unit, it is recognized, in accordance with the claim, whether an empties object that is brought up or inserted is one that can be accepted and processed at the reverse vending machine. Depending on this recognition, an empties object can then be rejected or accepted at an early stage. The provision of a separate detection unit on an acceptance device of the reverse vending machine can thus be dispensed with.

The object is further achieved by a method for operating a reverse vending machine. The reverse vending machine has a processing device for accepting or dispensing an empties object or for issuing a receipt. It is provided that a gesture recognition unit recognizes an operating gesture of a user located in a detection area of the gesture recognition unit and converts it into a control command for controlling the processing device.

With regard to advantages and advantageous embodiments, reference is made to the above, which applies analogously to the method.

The idea underlying the invention will be explained in more detail below with reference to the exemplary embodiments illustrated in the figures, in which:

Fig. 1
shows a schematic view of a reverse vending machine;
Fig. 2
shows a schematic view of a sensor device of the reverse vending machine for providing a gesture recognition unit;
Fig. 3
shows a schematic view of a user at a reverse vending machine; and
Fig. 4
shows a schematic view of a projection image projected by means of a projection device for capture by a recording device.

Fig. 1 shows a schematic view of a reverse vending machine 1 that is configured for the return, by a user N, of an empties object F in the form of a reusable or disposable bottle (see Fig. 3) or for the return of an empties crate.

For this purpose, the reverse vending machine 1 has an acceptance device 10 in the form of an acceptance shaft for pushing or throwing in an empties object F in the form of a bottle, and an acceptance device 11 for placing an empties crate. Via the acceptance devices 10, 11, inserted empties objects F can, for example in a manner known per se, be fed to a detection unit for recognizing the empties object F and then be transported by means of a suitable conveyor to a storage location or a compacting unit (in the case of disposable plastic bottles).

The reverse vending machine 1 serves, in a manner known per se, to accept empties objects and in return pays a user N a deposit value in cash via an output device 12, or issues a receipt (deposit voucher) by means of which the user N can have the deposit value paid out at a checkout.

The reverse vending machine 1 illustrated in Fig. 1 has a gesture recognition unit 2 for detecting an operating gesture of a user N in a detection area E (see Fig. 3). The gesture recognition unit 2 is designed to recognize an operating gesture, for example a movement of the user N, a posture of the user N, a viewing direction of the user N or the like, and to convert it into a control command on the basis of which the acceptance devices 10, 11 or the output device 12 (which realize the processing devices of the reverse vending machine 1) can be controlled.

Such gesture recognition units are known today, for example, from mobile phones or video game consoles. For example, the gesture recognition unit 2 of the reverse vending machine 1 can recognize that a user N directs his gaze at a particular acceptance device 10, 11, turns toward an acceptance device 10, 11, or brings an empties object F toward an acceptance device 10, 11. As a result, the gesture recognition unit 2 can generate a control command that activates one or more acceptance devices 10, 11 or the output device 12 and thus readies them for the acceptance of an empties object F or causes the dispensing of a deposit or a deposit voucher.

By using the gesture recognition unit 2, operation of the reverse vending machine 1 by a user N becomes simple and intuitive. In particular, the user N does not have to press any keys or operate any other input devices. Because he behaves and moves in an intuitive way to return an empties object F, the reverse vending machine 1 is controlled automatically, without any special, conscious interaction of the user N via other input devices being required.

An exemplary embodiment of a gesture recognition unit 2 is shown schematically in Fig. 2. The gesture recognition unit 2 has, for example, two microphones 20, 21, which are arranged at a distance from one another and serve to record a three-dimensional sound field. Because the microphones 20, 21 are spaced apart, not only can an acoustic signal as such be picked up, but the acoustic signal can also be localized within the detection area E.

The gesture recognition unit 2 further has a projection device 22 for projecting an infrared projection image into the detection area E. Such an infrared projection image P, as shown schematically in Fig. 4, may, for example, take the form of a predetermined pattern, for example a dot or line pattern. Reflections of the projection image P at a user N or another object in the detection area E, which lies in front of the reverse vending machine 1 (see Fig. 3), are recorded by means of a recording device 23 designed in the manner of an infrared camera with a frame rate of, for example, 30 fps (frames per second). From the deviations between the recorded projection image P and the known pattern of the projection image P projected by the projection device 22, depth information can then be calculated in the manner of a depth image, so that the distance of a user N, and of his body regions and limbs, from the gesture recognition unit 2 can be determined precisely and with high spatial resolution.

The gesture recognition unit 2 further has a camera 24 in the form of a color camera (RGB camera) for recording color images with a frame rate of, for example, 30 fps. The camera 24 is thus designed to record images in the visible light range, so that color and brightness information can be obtained and evaluated via the camera 24.

The microphones 20, 21, the projection device 22, the recording device 23 and the camera 24 together realize a sensor device of the gesture recognition unit 2 for acquiring sensor signals. The sensor signals, that is to say the signals of the microphones 20, 21, the recording device 23 and the camera 24, are supplied to a control device 27, for example in the form of a chip, and are evaluated by the control device 27 to generate one or more suitable control commands. By means of the control device 27, the acceptance devices 10, 11 or the output device 12 can then, for example, be driven to perform a predetermined action.

The control device 27 may be connected to a memory 28. Predetermined operating-gesture patterns may be stored in the memory 28 in the manner of a database, so that recognition can be carried out on the basis of a comparison of a recorded operating gesture with a previously stored operating-gesture pattern, from which a control command can then be generated.
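
Purely as an illustrative sketch of such a comparison against stored gesture patterns (the patent does not specify a matching algorithm): a recorded gesture could be represented as a short trajectory of 3D hand positions and matched to the nearest stored template, accepting it only below a distance threshold. The template trajectories and the threshold below are hypothetical.

    from typing import Optional
    import numpy as np

    # hypothetical stored operating-gesture patterns: name -> trajectory of (x, y, z) samples in meters
    TEMPLATES = {
        "approach_acceptance_device": np.linspace([0.0, 1.2, 1.5], [0.0, 1.0, 0.3], num=20),
        "withdraw_object": np.linspace([0.0, 1.0, 0.3], [0.0, 1.2, 1.5], num=20),
    }
    MATCH_THRESHOLD_M = 0.15  # assumed mean deviation allowed for a match

    def match_gesture(trajectory: np.ndarray) -> Optional[str]:
        """Compare a recorded trajectory (N x 3) against the stored patterns and
        return the name of the best match, or None if nothing is close enough."""
        best_name, best_dist = None, float("inf")
        for name, template in TEMPLATES.items():
            # resample the recorded trajectory to the template length
            idx = np.linspace(0, len(trajectory) - 1, num=len(template)).astype(int)
            dist = float(np.mean(np.linalg.norm(trajectory[idx] - template, axis=1)))
            if dist < best_dist:
                best_name, best_dist = name, dist
        return best_name if best_dist <= MATCH_THRESHOLD_M else None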

Die Steuereinrichtung 27 wirkt zudem zusammen mit an einem Gehäuse 13 des Leergutautomaten 1 angeordneten Lautsprechern 25, 26 und einem Bildschirm 29. Über die Lautsprecher 25, 26 kann beispielsweise eine akustische Ausgabe erfolgen, um einen Nutzer N zur Bedienung des Leergutrückgabeautomaten 1 anzuleiten, Warnhinweise zu erzeugen oder sonstige akustische Rückmeldungen, beispielsweise einen Warnton oder dergleichen, auszugeben. Über den Bildschirm 29 können in vergleichbarer Weise Bedienungshinweise ausgegeben werden, oder es kann Werbung oder dergleichen abgespielt werden.The controller 27 also acts together with arranged on a housing 13 of the reverse vending machine 1 speakers 25, 26 and a screen 29. About the speakers 25, 26, for example, an acoustic output to guide a user N to operate the empties return vending machine 1, warnings generate or other acoustic feedback, such as a warning sound or the like, output. About the screen 29 operating instructions can be issued in a comparable manner, or it can be played advertising or the like.

As shown schematically in Fig. 3, the sensor device 20-24 of the gesture recognition unit 2 serves to carry out gesture recognition within a detection area E located in front of a housing 13 of the reverse vending machine 1. Gesture recognition can be carried out, for example, within an area with a radius of up to four metres, for example up to two metres, around the gesture recognition unit 2, so that only gestures of a user N who has approached the reverse vending machine 1 are captured and recognized. The detection area E extends spatially in three dimensions and may, for example, have the shape of a detection funnel.
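
Whether a detected person lies within the detection area E can be reduced to a distance test and, for a funnel-shaped area, an angle test relative to the gesture recognition unit. The geometry below is a minimal sketch under an assumed coordinate frame; the radius merely echoes the example values above and the opening angle is an assumption.

import math
from typing import Tuple

MAX_RADIUS_M = 2.0           # example radius from the description (up to 4 m possible)
HALF_OPENING_ANGLE_DEG = 35  # assumed half-angle of the funnel-shaped area


def in_detection_area(user_pos: Tuple[float, float, float]) -> bool:
    """Check whether a point (x, y, z) in metres, given in a frame with the
    gesture recognition unit 2 at the origin and the funnel axis along +x,
    lies inside the assumed detection funnel E."""
    x, y, z = user_pos
    if x <= 0.0:
        return False  # behind the sensor plane
    distance = math.sqrt(x * x + y * y + z * z)
    if distance > MAX_RADIUS_M:
        return False
    # angle between the user direction and the funnel axis
    angle = math.degrees(math.acos(x / distance))
    return angle <= HALF_OPENING_ANGLE_DEG


if __name__ == "__main__":
    print(in_detection_area((1.5, 0.3, 0.1)))   # inside the funnel -> True
    print(in_detection_area((3.0, 0.0, 0.0)))   # too far away -> False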

The sensor device 20-24 can be arranged on the housing 13 of the reverse vending machine 1 so as to be adjustable, either with its individual sensors (microphones 20, 21, projection device 22, recording device 23, camera 24) or as a whole. The adjustability may be such that the sensor device 20-24 as a whole, or individual sensors 20-24 independently of one another, can be pivoted about a horizontal pivot axis S (see Fig. 3), so that the height of the detection area E can be adjusted automatically, for example depending on the body height of a user N.
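
The automatic height adaptation about the pivot axis S can be pictured as choosing a tilt angle so that the centre of the detection area points at the user's upper body. The mounting height, target body fraction and assumed user distance in the following sketch are illustrative values only and are not taken from the patent.

import math

SENSOR_MOUNT_HEIGHT_M = 1.6    # assumed height of the sensor device above the floor
TARGET_BODY_FRACTION = 0.85    # aim roughly at shoulder/head height of the user
ASSUMED_USER_DISTANCE_M = 1.2  # assumed typical distance of the user from the housing


def tilt_angle_for_user(user_height_m: float,
                        user_distance_m: float = ASSUMED_USER_DISTANCE_M) -> float:
    """Return the pivot angle (degrees) about the horizontal axis S needed to
    point the detection area at the user; positive values tilt upwards."""
    target_height = TARGET_BODY_FRACTION * user_height_m
    vertical_offset = target_height - SENSOR_MOUNT_HEIGHT_M
    return math.degrees(math.atan2(vertical_offset, user_distance_m))


if __name__ == "__main__":
    for height in (1.55, 1.75, 1.95):
        print(f"user {height:.2f} m -> tilt {tilt_angle_for_user(height):+.1f} deg")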

The idea underlying the invention is not limited to the embodiments described above, but can in principle also be implemented in embodiments of an entirely different kind.

Using the gesture recognition unit provided, fundamentally very different operating gestures of a user can be recognized and evaluated, so that operation of a reverse vending machine can proceed entirely without (conscious) interaction by the user.

For example, a reverse vending machine can be activated as a whole as soon as a user enters a detection area. An acceptance device can be activated when the user moves an empties object in the direction of the acceptance device. And if the user then directs his gaze, for example, towards an output device, the deposit money or a deposit receipt can be issued without the user having to operate an input device of any kind, such as a button or a key.
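
This hands-free operating sequence maps naturally onto a small state machine: idle, activated when a user enters the detection area, accepting while an empties object moves towards an acceptance device, and paying out when the user's gaze turns to the output device. The event names and transitions in this Python sketch are assumptions; the patent does not prescribe any particular control flow.

from enum import Enum, auto


class State(Enum):
    IDLE = auto()
    ACTIVE = auto()        # user has entered the detection area
    ACCEPTING = auto()     # empties object is moving towards an acceptance device
    PAYOUT = auto()        # deposit money or a receipt is being issued


# Hypothetical event names emitted by the gesture recognition unit.
TRANSITIONS = {
    (State.IDLE, "user_entered_area"): State.ACTIVE,
    (State.ACTIVE, "empties_moved_towards_acceptor"): State.ACCEPTING,
    (State.ACCEPTING, "empties_accepted"): State.ACTIVE,
    (State.ACTIVE, "gaze_at_output_device"): State.PAYOUT,
    (State.PAYOUT, "user_left_area"): State.IDLE,
}


def step(state: State, event: str) -> State:
    """Advance the machine; unknown events leave the state unchanged."""
    return TRANSITIONS.get((state, event), state)


if __name__ == "__main__":
    state = State.IDLE
    for event in ("user_entered_area", "empties_moved_towards_acceptor",
                  "empties_accepted", "gaze_at_output_device", "user_left_area"):
        state = step(state, event)
        print(f"{event:32s} -> {state.name}")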

Operation by a user thus becomes simple, intuitive and comfortable, and inertia-related waiting times for the user can largely be avoided, for example because acceptance devices or output devices can be activated or actuated early on the basis of a recognized gesture.

LIST OF REFERENCE NUMBERS

1       Reverse vending machine
10, 11  Acceptance device
12      Output device
13      Housing
2       Gesture recognition unit
20, 21  Microphone
22      Projection device
23      Recording device
24      Camera
25, 26  Loudspeaker
27      Control device
28      Memory
29      Screen
E       Detection area
F       Empties object
N       User
P       Projection image

Claims (10)

  1. Reverse vending machine (1), comprising
    - a processing device (10, 11, 12) for receiving or outputting an empties object (F) or for issuing a receipt,
    characterized by
    a gesture recognition unit (2) that is configured to recognize a control gesture of a user (N) located in a capturing region (E) of the gesture recognition unit (2) and to convert said gesture into a control command for controlling the processing device (10, 11, 12), wherein the gesture recognition unit (2) is configured to capture a space and to recognize control gestures of a user in a three-dimensional manner, wherein the gesture recognition unit (2) is configured for recognizing an approaching empties object (F) so as to reject or to accept the empties object (F) depending on the recognition, wherein the reverse vending machine (1) recognizes, using the gesture recognition unit (2), whether the approaching empties object (F) is an empties object (F) that can be received and processed at the reverse vending machine (1).
  2. Reverse vending machine (1) according to Claim 1, characterized in that the gesture recognition unit (2) has a sensor device (20-24) for providing a sensor signal and a control device (27) for generating a control command from the sensor signal.
  3. Reverse vending machine (1) according to Claim 2, characterized in that the sensor device (20-24) has
    - at least one microphone (20, 21) for capturing an acoustic signal,
    - at least one projection device (22) for generating a projection image (P) in the capturing region (E) and a recording device (23) for recording the projected projection image (P) from the capturing region (E) and/or
    - a camera for recording an image of visible light from the capturing region (E).
  4. Reverse vending machine (1) according to Claim 3, characterized in that the projection device (22) is configured to generate an infrared projection image (P) and the recording device (23) is configured to record the projected infrared projection image (P).
  5. Reverse vending machine (1) according to Claim 3 or 4, characterized in that the gesture recognition unit (2) is configured to derive spatial information relating to the user (N) located in the capturing region (E) from the projected projection image (P) that is recorded by the recording device (23).
  6. Reverse vending machine (1) according to one of Claims 3 to 5, characterized in that the camera (24) is configured as a colour camera.
  7. Reverse vending machine (1) according to one of Claims 3 to 6, characterized in that the sensor device (20-24) has at least two microphones (20, 21), which are spaced apart from one another, for spatially capturing an acoustic signal.
  8. Reverse vending machine (1) according to one of Claims 3 to 7, characterized in that the sensor device (20-24) is displaceable relative to a housing (13) of the reverse vending machine (1).
  9. Reverse vending machine (1) according to one of the preceding claims, characterized in that the gesture recognition unit (2) is configured to recognize
    - a control gesture carried out by the user (N) in the capturing region (E) for inserting an empties object (F),
    - an entrance of the user (N) into the capturing region (E) and/or
    - an approach of an empties object (F) to the processing device (10, 11).
  10. Method for operating a reverse vending machine (1) having a processing device (10, 11, 12) for receiving or outputting an empties object (F) or for issuing a receipt,
    characterized in that
    a gesture recognition unit (2) recognizes a control gesture of a user (N) located in a capturing region (E) of the gesture recognition unit (2) and converts said gesture into a control command for controlling the processing device (10, 11, 12), wherein the gesture recognition unit (2) is configured to capture a space and to recognize control gestures of a user in a three-dimensional manner, wherein the gesture recognition unit (2) recognizes an approaching empties object (F) and rejects or accepts the empties object (F) depending on the recognition, wherein the reverse vending machine (1) recognizes, using the gesture recognition unit (2), whether the approaching empties object (F) is an empties object (F) that can be received and processed at the reverse vending machine (1).
EP13172507.9A 2013-06-18 2013-06-18 Empty goods return machine Not-in-force EP2816536B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP13172507.9A EP2816536B1 (en) 2013-06-18 2013-06-18 Empty goods return machine
DK13172507.9T DK2816536T3 (en) 2013-06-18 2013-06-18 Returnable Packaging Machine

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP13172507.9A EP2816536B1 (en) 2013-06-18 2013-06-18 Empty goods return machine

Publications (2)

Publication Number Publication Date
EP2816536A1 EP2816536A1 (en) 2014-12-24
EP2816536B1 true EP2816536B1 (en) 2016-05-18

Family

ID=48625948

Family Applications (1)

Application Number Title Priority Date Filing Date
EP13172507.9A Not-in-force EP2816536B1 (en) 2013-06-18 2013-06-18 Empty goods return machine

Country Status (2)

Country Link
EP (1) EP2816536B1 (en)
DK (1) DK2816536T3 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015110758A1 (en) * 2015-07-03 2017-01-05 Mathias Jatzlauk Visualization system with gesture control

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE29812678U1 (en) 1998-07-16 1998-10-29 Sauer Winfried Return station for reusable containers
US7697827B2 (en) * 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
JP4917615B2 (en) 2006-02-27 2012-04-18 プライム センス リミティド Range mapping using uncorrelated speckle
WO2008087652A2 (en) 2007-01-21 2008-07-24 Prime Sense Ltd. Depth mapping using multi-beam illumination
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
WO2009021227A2 (en) * 2007-08-09 2009-02-12 Recyclebank, Llc Drop-off recycling system and method thereof
WO2009093228A2 (en) 2008-01-21 2009-07-30 Prime Sense Ltd. Optical designs for zero order reduction
US8463430B2 (en) * 2008-10-23 2013-06-11 Utique, Inc Interactive and 3-D multi-senor touch selection interface for an automated retail store, vending machine, digital sign, or retail display
DE102010040177A1 (en) * 2010-09-02 2012-03-08 Sielaff Gmbh & Co. Kg Automatenbau Reverse vending machine has control device that switches touch screen from article delivery/return mode to maintenance mode when door is in open state
WO2012072835A1 (en) * 2010-12-03 2012-06-07 Lopez De Aragon Lopez Diego Device for automatic purchase of products, in particular precious metals
FR2970797B1 (en) * 2011-01-25 2013-12-20 Intui Sense TOUCH AND GESTURE CONTROL DEVICE AND METHOD FOR INTERPRETATION OF THE ASSOCIATED GESTURE
DE102011109392B4 (en) * 2011-08-04 2013-05-23 Findbox Gmbh Vending Machine
CN102981615B (en) * 2012-11-05 2015-11-25 瑞声声学科技(深圳)有限公司 Gesture identifying device and recognition methods

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
None *

Also Published As

Publication number Publication date
DK2816536T3 (en) 2016-08-29
EP2816536A1 (en) 2014-12-24

Similar Documents

Publication Publication Date Title
WO2009071121A2 (en) Interaction arrangement for interaction between a display screen and a pointer object
EP3073452B1 (en) Method for monitoring and controlling an access control system
EP2070008A1 (en) Automatic recognition apparatus
EP3298594B1 (en) System and method for processing value documents
CN104867190B (en) Vehicle access management system for stereo garage
EP3494551B1 (en) Card machine with a security user interface
WO2016030105A1 (en) Device for acquisition of biometric features of a face of a person
EP2015263B1 (en) Goods selection unit
EP2816536B1 (en) Empty goods return machine
WO1999041713A1 (en) Method for monitoring the exploitation process of an apparatus and self-service device monitored according to said method
WO2014108150A2 (en) User interface for handwritten character input in a device
CN104094297A (en) Method for triggering an action of a device authorised by a paper document
EP3025215B1 (en) Method and device for processing value documents
DE102016014420B3 (en) Shop with facilities for the prevention of return fraud and procedures for the prevention of return fraud in shops
EP3719764B1 (en) Vending machine and method
EP3896663A1 (en) Self-service machine
EP0488952B1 (en) Method and device for collecting articles, e.g. empty beverage cans
WO2017108560A1 (en) Display device and operating device
EP4010883A1 (en) Apparatus for the automated return of a good and method for operating the apparatus
EP3611659A1 (en) Device for providing a plurality of biometric features of a plurality of people in a group
DE202007009844U1 (en) card reader
EP3837623A1 (en) Method for capturing and subsequently generating data for a user of a self-service terminal
AT13756U1 (en) vending machine
DE102016003625A1 (en) Method and device for contactless gesture-controlled operation of a user interface
DE202016106755U1 (en) Cashier system for goods

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20140526

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

17Q First examination report despatched

Effective date: 20150327

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

INTG Intention to grant announced

Effective date: 20160201

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

Free format text: NOT ENGLISH

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 4

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

Free format text: LANGUAGE OF EP DOCUMENT: GERMAN

Ref country code: AT

Ref legal event code: REF

Ref document number: 801066

Country of ref document: AT

Kind code of ref document: T

Effective date: 20160615

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 502013003015

Country of ref document: DE

REG Reference to a national code

Ref country code: NL

Ref legal event code: FP

REG Reference to a national code

Ref country code: DK

Ref legal event code: T3

Effective date: 20160822

REG Reference to a national code

Ref country code: SE

Ref legal event code: TRGR

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

REG Reference to a national code

Ref country code: NO

Ref legal event code: T2

Effective date: 20160518

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160919

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: RS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160819

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: BE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 502013003015

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SM

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

26N No opposition filed

Effective date: 20170221

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160630

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160630

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 5

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160618

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20130618

REG Reference to a national code

Ref country code: FR

Ref legal event code: PLFP

Year of fee payment: 6

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: LU

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20160618

Ref country code: MK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20160518

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: NL

Payment date: 20190527

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DK

Payment date: 20190524

Year of fee payment: 7

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: FR

Payment date: 20190522

Year of fee payment: 7

REG Reference to a national code

Ref country code: DK

Ref legal event code: EBP

Effective date: 20200630

REG Reference to a national code

Ref country code: NL

Ref legal event code: MM

Effective date: 20200701

REG Reference to a national code

Ref country code: AT

Ref legal event code: MM01

Ref document number: 801066

Country of ref document: AT

Kind code of ref document: T

Effective date: 20200618

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: NL

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200701

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200630

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 502013003015

Country of ref document: DE

Representative=s name: VIERING, JENTSCHURA & PARTNER MBB PATENT- UND , DE

Ref country code: DE

Ref legal event code: R082

Ref document number: 502013003015

Country of ref document: DE

Representative=s name: KILBURN & STRODE LLP, NL

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: AT

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200618

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DK

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20200630

REG Reference to a national code

Ref country code: DE

Ref legal event code: R082

Ref document number: 502013003015

Country of ref document: DE

Representative=s name: VIERING, JENTSCHURA & PARTNER MBB PATENT- UND , DE

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: SE

Payment date: 20220623

Year of fee payment: 10

Ref country code: NO

Payment date: 20220621

Year of fee payment: 10

Ref country code: GB

Payment date: 20220623

Year of fee payment: 10

Ref country code: DE

Payment date: 20220530

Year of fee payment: 10

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 502013003015

Country of ref document: DE

REG Reference to a national code

Ref country code: NO

Ref legal event code: MMEP

REG Reference to a national code

Ref country code: SE

Ref legal event code: EUG

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20230618