DK2816536T3 - Returnable Packaging Machine - Google Patents

Returnable Packaging Machine

Info

Publication number
DK2816536T3
Authority
DK
Denmark
Prior art keywords
return
recognition unit
gesture recognition
gesture
user
Prior art date
Application number
DK13172507.9T
Other languages
Danish (da)
Inventor
Roy Gergs
Original Assignee
Wincor Nixdorf Int Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wincor Nixdorf Int Gmbh
Application granted
Publication of DK2816536T3

Links

Classifications

    • GPHYSICS
    • G07CHECKING-DEVICES
    • G07FCOIN-FREED OR LIKE APPARATUS
    • G07F7/00Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus
    • G07F7/06Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles
    • G07F7/0609Mechanisms actuated by objects other than coins to free or to actuate vending, hiring, coin or paper currency dispensing or refunding apparatus by returnable containers, i.e. reverse vending systems in which a user is rewarded for returning a container that serves as a token of value, e.g. bottles by fluid containers, e.g. bottles, cups, gas containers

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • User Interface Of Digital Computer (AREA)
  • Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)

Description

The invention relates to a reverse vending machine according to the preamble of Claim 1.
Such a reverse vending machine comprises a processing device for receiving or outputting an empties object or for issuing a receipt.
Via such a reverse vending machine, for example, deposit bottles or empties crates to which a deposit is applied can be returned, wherein a user introduces or inserts a deposit bottle or an empties crate into a receiving device of the reverse vending machine designed for this purpose and in return has an amount corresponding to the deposit value of the returned empties object paid out or receives a receipt (voucher) for the deposit value for payment at a cashier.
In conventional reverse vending machines, a receiving device is activated, for example, when a user introduces or inserts an empties object into the receiving device. As a result of the activation, for example, a conveyor device of the receiving device is set into motion, which supplies the empties object to a recognition unit and then conveys it onward for storage or compacting (in the case of disposable plastic bottles). Because the activation of the receiving device only takes place once the empties object has been introduced or inserted into a receiving shaft of the receiving device, the activation is comparatively sluggish and may involve a waiting time (although brief) for the user.
Once the return of empties objects is complete, a user additionally has to actuate a suitable input device, for example, a knob, to have the deposit value paid out or to receive a receipt. This requires, on the one hand, the provision of such an input device on the reverse vending machine and, on the other hand, an additional control step for the user during the return of empties objects.
In a return station for reusable containers, which is known from DE 298 12 678 U1, an activation device is provided for activating a recognition unit. The activation device can be formed, for example, by a light barrier, a light scanner, or a sensor, which recognizes an approach of a reusable container to be returned and turns on a recognition unit as a result. DE 10 2011 109 392 A1 discloses a vending machine having a product output device, in which a user interface having a touchscreen is provided. The touchscreen is used in a touch-sensitive manner for inputting commands and, in addition, for displaying items of information. A vending machine having a touchscreen device is known from each of US 2010/0103131 A1 and WO 2012/072835 A1. DE 10 2010 040 177 A1 discloses a vending machine or reverse vending machine, for example, for returning empty bottles, in which a touchscreen is provided. FR 2 970 797 A1 discloses a device designated as an interactive façade, in which a touch of a façade can be detected by means of image matrix sensors. A return system for recycled goods is known from WO 2009/021227 A2, in which control can be performed, for example, by means of data gloves, the movement of which is converted into computer gestures. A gesture controller for a vending machine is known from a publication by M. Morales et al., "Vendor Profile: Omek Interactive: Gesture Recognition and Machine Vision Technology for Intelligent Systems", XP055085512, August 2012. US 2007/0086764 A1 describes a gesture controller for a camera.
The object of the present invention is to provide a reverse vending machine and a method for operating a reverse vending machine which enable operation for returning an empties object that is as simple and as pleasant as possible for a user.
This object is achieved by subject matter having the features of Claim 1.
Accordingly, in a reverse vending machine, a gesture recognition unit is additionally provided, which is configured to recognize a control gesture of a user located in a capturing region of the gesture recognition unit and to convert said gesture into a control command for controlling the processing device.
The present invention proceeds from the concept of using a gesture recognition unit for controlling a reverse vending machine. The gesture recognition unit is configured to recognize at least one control gesture of a user. The control gesture can consist in this case of a movement of the user (for example, the movement of an arm to move an empties object toward a receiving device), a viewing direction of the user, a facial expression of the user, a physical posture of the user, or the like, possibly in combination with an acoustic expression of the user. The gesture recognition unit captures such a gesture and analyzes it to generate a control command as a function of the control gesture, which control command is then used for activating the processing device, for example, a receiving device for receiving an empties object or an output device for issuing money or issuing a receipt. Then, for example, a receiving device can be activated or also stopped again as a function of the control command, money corresponding to a deposit value or a receipt can be issued, or suggestions can be output to a user, for example, for the correct input of an empties object or for further control.
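Purely for illustration, the conversion from a recognized control gesture to a control command can be pictured as a small dispatch table, as in the following Python sketch; the gesture and command names are hypothetical and not taken from the patent.

```python
from enum import Enum, auto

class Gesture(Enum):
    APPROACH_WITH_BOTTLE = auto()   # arm moves an empties object toward a receiving device
    LOOK_AT_OUTPUT = auto()         # viewing direction toward the outputting device
    STEP_AWAY = auto()              # user leaves the capturing region

class Command(Enum):
    ACTIVATE_RECEIVER = auto()
    ISSUE_RECEIPT = auto()
    STOP_RECEIVER = auto()

# Hypothetical dispatch table: recognized control gesture -> control command for the processing device.
GESTURE_TO_COMMAND = {
    Gesture.APPROACH_WITH_BOTTLE: Command.ACTIVATE_RECEIVER,
    Gesture.LOOK_AT_OUTPUT: Command.ISSUE_RECEIPT,
    Gesture.STEP_AWAY: Command.STOP_RECEIVER,
}

def convert(gesture: Gesture) -> Command:
    """Convert a recognized control gesture into a control command."""
    return GESTURE_TO_COMMAND[gesture]

if __name__ == "__main__":
    print(convert(Gesture.APPROACH_WITH_BOTTLE))  # Command.ACTIVATE_RECEIVER
```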
The gesture recognition unit is configured in this case to recognize a control gesture inside a capturing region of the gesture recognition unit. If a user is at least partially located inside the capturing region, a control gesture of the user can thus be analyzed and converted into a suitable control command. The capturing region is advantageously placed in front of a housing of the reverse vending machine in this case and captures a region in which a user is typically located to supply an empties object to a receiving device of the reverse vending machine or to perform another control action on the reverse vending machine.
Thus, by means of the gesture recognition unit, for example, a spatial region having a radius of up to 4 m, preferably up to 2 m, in front of the reverse vending machine can be captured, wherein the gesture recognition unit captures a space and therefore recognizes control gestures of a user in a three-dimensional manner.
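As an illustrative sketch only (assuming a coordinate frame with the gesture recognition unit at the origin and the z-axis pointing away from the housing), membership in such a capturing region can be pictured as a simple radius test on a three-dimensional position; the 2 m value is the preferred radius named above.

```python
import math

CAPTURE_RADIUS_M = 2.0  # preferred capturing radius in front of the machine (up to 4 m possible)

def in_capturing_region(x: float, y: float, z: float) -> bool:
    """Return True if a 3-D point (metres, machine at origin, z pointing
    away from the housing) lies inside the assumed capturing region."""
    return z > 0.0 and math.sqrt(x * x + y * y + z * z) <= CAPTURE_RADIUS_M

print(in_capturing_region(0.3, -0.2, 1.5))  # True: user standing roughly 1.5 m in front
print(in_capturing_region(0.0, 0.0, 3.5))   # False with the preferred 2 m radius
```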
The gesture recognition unit preferably comprises at least one sensor device for providing a sensor signal and a control device for generating a control command from the sensor signal. The sensor device can have for this purpose, for example,
  • at least one microphone for capturing an acoustic signal,
  • at least one projection device for generating a projection image in the capturing region and a recording device for recording the projected projection image from the capturing region, and/or
  • a camera for recording an image of visible light from the capturing region.
The sensor device is therefore configured to record different optical and/or acoustic signals from the capturing region, to recognize a control gesture on the basis of the received signals and convert said gesture into a suitable control command.
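Purely as an illustration of this split between sensor device and control device, the following Python sketch bundles the raw sensor signals into one structure and lets a control device derive a command from them; all names, the 0.5 m threshold, and the decision rule are hypothetical assumptions, not taken from the patent.

```python
from dataclasses import dataclass
from typing import Optional, Sequence

@dataclass
class SensorSignals:
    """Raw signals delivered by the sensor device in one capture cycle."""
    audio: Sequence[float]                  # samples from the microphone(s)
    depth_image: Sequence[Sequence[float]]  # depth map derived from projection + recording device
    rgb_image: Sequence[Sequence[int]]      # colour image from the camera

class ControlDevice:
    """Turns sensor signals into a control command for the processing device."""

    def generate_command(self, signals: SensorSignals) -> Optional[str]:
        # Hypothetical rule: any close-range activity in the depth map is
        # treated as an approach and activates the receiving device.
        nearest = min((d for row in signals.depth_image for d in row), default=float("inf"))
        if nearest < 0.5:  # metres
            return "ACTIVATE_RECEIVER"
        return None

device = ControlDevice()
signals = SensorSignals(audio=[0.0], depth_image=[[1.2, 0.4], [2.0, 1.8]], rgb_image=[[0]])
print(device.generate_command(signals))  # ACTIVATE_RECEIVER
```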
The sensor device is configured in general for capturing a control gesture in a three-dimensional space. The sensor device can be designed in this case, for example, as in US 2008/0240502 A1, US 7,433,024 B2, US 2009/0185274 A1, or US 2010/0020078 A1, the content of the disclosure of which is also incorporated in its entirety.
The projection device of the sensor device can be configured, for example, to generate an infrared projection image. The projection device is used in this case to project a projection image in the manner of a predetermined pattern into the capturing space, to capture reflections of the projected projection image by means of the recording device and to generate a depth image on the basis of the projected projection image thus captured, which contains very accurate items of information about the distance of an object from the sensor device.
The depth image is generated in this case in that distortions of the projected projection image as a result of reflections on an object located in the capturing region are analyzed. The projection device projects a predetermined pattern into the space of the capturing region in this case, for example, a point pattern or bar pattern, which is reflected on objects inside the capturing region. The projected projection image is recorded by the recording device, wherein items of depth information are calculated from the deviation between the reflected pattern and the known, predetermined pattern of the projection image projected by the projection device.
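The underlying triangulation can be illustrated with a short sketch: the shift (disparity) of a projected pattern point between its expected and observed image position maps to a depth via Z = f·b/d. The focal length and projector-to-camera baseline used below are illustrative assumptions, not values from the patent.

```python
def depth_from_disparity(disparity_px: float,
                         focal_length_px: float = 580.0,
                         baseline_m: float = 0.075) -> float:
    """Structured-light triangulation: a projected pattern point observed
    shifted by `disparity_px` pixels from its expected position lies at a
    depth Z = f * b / d.  Focal length and baseline are assumed values."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# A dot of the pattern shifted by 29 pixels corresponds to roughly 1.5 m distance.
print(round(depth_from_disparity(29.0), 2))
```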
Because the projection device and also the recording device operate in the infrared wavelength range, the calculation of the items of depth information is insensitive to daylight.
The recording device can have, for example, a frame rate of 30 FPS for recording moving images and a resolution in the full HD range (1920 x 1080 pixels) or lower (for example, 1280 x 1024 pixels).
The camera is preferably configured as a colour camera (RGB camera). The camera is therefore used to record a colour image, wherein the camera can have, for example, a frame rate of 30 FPS and a resolution in the full HD range (1920 x 1080 pixels) or also less (for example, 1280 x 1024 pixels).
The sensor device preferably comprises at least two microphones, which are spaced apart from one another, for spatially capturing an acoustic signal. By means of two or more microphones, an acoustic signal can therefore be spatially recorded, so that a spatial localization of the acoustic source can also be performed.
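A common way to localize an acoustic source with two spaced microphones is the arrival-time difference between them; the sketch below illustrates this, with the 0.2 m microphone spacing being an assumed value rather than one from the patent.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s at room temperature

def direction_of_arrival(delay_s: float, mic_spacing_m: float = 0.2) -> float:
    """Estimate the bearing (degrees, 0 = straight ahead) of a sound source
    from the arrival-time difference between two spaced microphones,
    using sin(theta) = c * dt / d.  The 0.2 m spacing is an assumption."""
    ratio = SPEED_OF_SOUND * delay_s / mic_spacing_m
    ratio = max(-1.0, min(1.0, ratio))  # clamp against measurement noise
    return math.degrees(math.asin(ratio))

# Sound reaching the second microphone about 0.29 ms earlier -> source roughly 30 degrees off-axis.
print(round(direction_of_arrival(0.29e-3), 1))
```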
The sensor device, with its individual sensors, can be arranged fixedly on a housing of the reverse vending machine. However, it is also conceivable that the sensor device can be displaced by a motor in relation to the housing, for example, in that it can be inclined about a horizontal pivot axis within a predetermined angle range. This enables the capturing region to be adapted, for example, as a function of the presence and the behaviour of the user, for example, the physical size of a user or the approach of said user to the reverse vending machine, wherein the displacement can be performed automatically as a result of sensor signals captured by the sensor device.
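As a purely illustrative sketch of such an automatic adaptation, the tilt angle about the horizontal pivot axis could be derived from the user's head height and distance and clamped to a predetermined angle range; the mounting height and angle limit below are assumptions, not values from the patent.

```python
import math

SENSOR_HEIGHT_M = 1.4   # assumed mounting height of the sensor device on the housing
MAX_TILT_DEG = 20.0     # assumed predetermined angle range about the horizontal pivot axis

def tilt_for_user(user_head_height_m: float, user_distance_m: float) -> float:
    """Tilt angle (degrees, positive = upwards) that points the sensor device
    at the user's head, clamped to the permitted pivot range."""
    angle = math.degrees(math.atan2(user_head_height_m - SENSOR_HEIGHT_M, user_distance_m))
    return max(-MAX_TILT_DEG, min(MAX_TILT_DEG, angle))

print(round(tilt_for_user(1.9, 1.5), 1))   # tall user close by -> tilt up ~18 degrees
print(round(tilt_for_user(1.2, 1.5), 1))   # shorter user -> tilt down ~-7.6 degrees
```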
The gesture recognition unit can be configured, for example, to recognize a control gesture carried out by a user in the capturing region, for example, the approach of an empties object to a receiving device or the ending of an introduction procedure or the like, in order to control the reverse vending machine as a function of a recognized control gesture. In this context, it is also conceivable and possible to combine the gesture recognition unit with a recognition unit for capturing and recognizing an empties object. According to the claim, the gesture recognition unit recognizes whether an empties object which has approached or been inserted is an empties object which can be received and processed at the reverse vending machine. An empties object can then be rejected or accepted early as a function of the recognition by the gesture recognition unit. The provision of a separate recognition unit on a receiving device of the reverse vending machine can therefore be omitted.
The object is furthermore achieved by a method for operating a reverse vending machine. The reverse vending machine has a processing device for receiving or outputting an empties object or for issuing a receipt. It is provided in this case that a gesture recognition unit recognizes a control gesture of a user located in a capturing region of the gesture recognition unit and converts said gesture into a control command for controlling the processing device.
Reference is made to the above statements with regard to advantages and advantageous embodiments, which also applies similarly to the method.
The fundamental concept of the invention will be explained in greater detail hereafter on the basis of the exemplary embodiments illustrated in the figures. In the figures
Figure 1 shows a schematic view of a reverse vending machine;
Figure 2 shows a schematic view of a sensor device of the reverse vending machine for providing a gesture recognition unit;
Figure 3 shows a schematic view of a user at a reverse vending machine; and
Figure 4 shows a schematic view of a projection image, which is projected by means of a projection device, for capturing by a recording device.
Figure 1 shows a schematic view of a reverse vending machine 1, which is configured for the return of an empties object F in the form of a reusable or disposable bottle (see Figure 3) by a user N or for the return of an empties crate.
For this purpose, the reverse vending machine 1 has a receiving device 10 in the form of a receiving shaft for introducing or inserting an empties object F in the form of a bottle and a receiving device 11 for putting in an empties crate. Empties objects F which are introduced or inserted can be supplied via the receiving devices 10, 11 in a manner known per se, for example, to a recognition unit for recognizing the empties object F and subsequently can be conveyed by means of a suitable conveying device to a storage site or a compacting unit (in the case of disposable plastic bottles).
The reverse vending machine 1 is used in a way known per se for receiving empties objects and in return pays a user N a deposit value in money via an outputting device 12 or issues a receipt (voucher), by means of which the user N can have the deposit value paid out at a cashier device.
The reverse vending machine 1 shown in Figure 1 has a gesture recognition unit 2 for capturing a control gesture of a user N in a capturing region E (see Figure 3). The gesture recognition unit 2 is configured to recognize a control gesture, for example, a movement of the user N, a physical posture of the user N, a viewing direction of the user N, or the like and converts said gesture into a control command, on the basis of which the receiving devices 10, 11 or the outputting device 12 (which implement the processing devices of the reverse vending machine 1) can be activated.
Such gesture recognition units are presently known, for example, from mobile telephones or also from videogame consoles. For example, the gesture recognition unit 2 of the reverse vending machine 1 can recognize that a user N directs his vision onto a specific receiving device 10, 11, turns toward a receiving device 10, 11, or an empties object F approaches a receiving device 10, 11. As a result, the gesture recognition unit 2 can generate a control command, which activates one or more receiving devices 10, 11 or the outputting device 12 and therefore makes them ready to receive an empties object F or causes them to issue deposit money or a deposit receipt.
By using the gesture recognition unit 2, the operation of the reverse vending machine 1 by a user N becomes simple and intuitive. The user N in particular does not have to press buttons or actuate any other input devices. Because he behaves and moves in an intuitive manner for the return of an empties object F, the reverse vending machine 1 is controlled automatically, without a separate, intentional interaction of the user N via other input devices being necessary.
An exemplary embodiment of a gesture recognition unit 2 is schematically shown in Figure 2. The gesture recognition unit 2 has, for example, two microphones 20, 21, which are arranged spaced apart from one another and are used to record a three-dimensional sound field. Because the microphones 20, 21 are spaced apart from one another, not only can an acoustic signal be recorded per se, but rather the acoustic signal can also be located in the capturing region E.
The gesture recognition unit 2 furthermore has a projection device 22 for projecting an infrared projection image into the capturing region E. Such an infrared projection image P, as is schematically shown in Figure 4, can assume the form of a predetermined pattern, for example, a point pattern or bar pattern. Reflections of the projection image P on a user N or another object in the capturing region E, which is placed in front of the reverse vending machine 1 (see Figure 3), are recorded by means of a recording device 23, which is configured like an infrared camera having, for example, a frame rate of 30 FPS (FPS: frames per second). Items of depth information can then be calculated from the deviations between the recorded projection image P and the previously known pattern of the projection image P, which has been projected by the projection device 22, in the manner of a depth image, so that the distance of a user N and his body regions and limbs can be determined by the gesture recognition unit 2 in a spatially high-resolution manner with high accuracy.
The gesture recognition unit 2 furthermore has a camera 24 in the form of a colour camera (RGB camera) for recording colour images at a frame rate of 30 FPS, for example. The camera 24 is therefore configured to record images in the range of visible light, so that items of colour and brightness information can be obtained and analyzed via the camera 24.
The microphones 20, 21, the projection device 22, the recording device 23, and the camera 24 together implement a sensor device of the gesture recognition unit 2 for capturing sensor signals. The sensor signals, i.e., the signals of the microphones 20, 21, the recording device 23, and the camera 24, are supplied to a control device 27, for example, in the form of a chip, and are analyzed by the control device 27 to generate one or more suitable control commands. Then, for example, the receiving devices 10, 11 or the outputting device 12 can be activated to execute a predetermined action by means of the control device 27.
The control device 27 can be connected to a memory 28. For example, predetermined control gesture patterns can be stored in the memory 28 in the manner of a database, so that as a function of a comparison of a recorded control gesture to a previously stored control gesture pattern, a recognition can be carried out, on the basis of which a control command can then be generated.
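The comparison of a recorded control gesture to previously stored control gesture patterns can be illustrated, for example, as a nearest-neighbour lookup over feature vectors; the database contents, the feature representation, and the threshold in the following sketch are hypothetical assumptions.

```python
import math
from typing import Dict, Optional, Sequence

# Hypothetical database of stored control gesture patterns, each reduced to a
# short feature vector (e.g. averaged joint displacements over the gesture).
GESTURE_DATABASE: Dict[str, Sequence[float]] = {
    "approach_with_bottle": [0.8, 0.1, 0.6],
    "turn_toward_outlet":   [0.1, 0.7, 0.2],
    "step_back":            [-0.5, 0.0, -0.4],
}

def recognize(recorded: Sequence[float], threshold: float = 0.5) -> Optional[str]:
    """Compare a recorded gesture feature vector against the stored patterns
    and return the best match, or None if nothing is close enough."""
    def dist(a: Sequence[float], b: Sequence[float]) -> float:
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    best = min(GESTURE_DATABASE, key=lambda name: dist(recorded, GESTURE_DATABASE[name]))
    return best if dist(recorded, GESTURE_DATABASE[best]) <= threshold else None

print(recognize([0.75, 0.15, 0.55]))  # "approach_with_bottle"
print(recognize([9.0, 9.0, 9.0]))     # None: no stored pattern matches
```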
The control device 27 additionally interacts with loudspeakers 25, 26 and a display screen 29, which are arranged on a housing 13 of the reverse vending machine 1. For example, an acoustic output can take place via the loudspeakers 25, 26 to introduce a user N to the control of the reverse vending machine 1, to generate warning notifications, or to output other acoustic feedback, for example, a warning tone or the like. In a comparable manner, operating instructions can be output via the display screen 29, or advertising or the like can be played back.
As is schematically shown in Figure 3, the sensor device 20-24 of the gesture recognition unit 2 is used to carry out gesture recognition within a capturing region E, which is placed in front of a housing 13 of the reverse vending machine 1. A gesture recognition can be carried out in this case, for example, within a region having a radius of up to 4 m, for example, up to 2 m, around the gesture recognition unit 2, so that only gestures of a user N who has approached the reverse vending machine 1 are captured and recognized. The capturing region E extends spatially in three dimensions in this case and can have the form of a capturing funnel, for example.
The sensor device 20-24 can be arranged on the housing 13 of the reverse vending machine 1 so that it is displaceable, either with its individual sensors (microphones 20, 21, projection device 22, recording device 23, camera 24) or as a whole. The displaceability in this case can be such that the sensor device 20-24 as a whole, or individual ones of the sensors 20-24 separately from one another, can be pivoted about a horizontal pivot axis S (see Figure 3), so that the height of the capturing region E can be automatically adapted, for example, as a function of the physical size of a user N.
The fundamental concept of the invention is not restricted to the above-described exemplary embodiments, but rather may also be implemented in principle in entirely different embodiments.
By means of the provided gesture recognition unit, in principle entirely different control gestures of a user can be recognized and analyzed, so that control of a reverse vending machine can run completely without (intentional) interaction of a user.
Thus, for example, a reverse vending machine can be activated as a whole as soon as a user steps into a capturing region. A receiving device can be activated when the user executes a movement of an empties object in the direction of the receiving device. When the user subsequently directs his vision, for example, in the direction of an outputting device, an issuance of the deposit money or a deposit receipt can thus be performed, without the user having to actuate an input device of any type, for example, a knob or a button.
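Such a hands-free return flow can be pictured as a small state machine driven by recognized gesture events; the states, event names, and transitions in the following sketch are illustrative assumptions only, not a definitive implementation.

```python
from enum import Enum, auto

class State(Enum):
    IDLE = auto()
    READY = auto()        # user inside the capturing region, machine activated
    RECEIVING = auto()    # a receiving device accepts empties objects
    PAYING_OUT = auto()   # deposit money or a deposit receipt is being issued

# Hypothetical transition table for the hands-free flow described above.
TRANSITIONS = {
    (State.IDLE, "user_enters_region"): State.READY,
    (State.READY, "bottle_moved_toward_receiver"): State.RECEIVING,
    (State.RECEIVING, "gaze_at_output_device"): State.PAYING_OUT,
    (State.PAYING_OUT, "user_leaves_region"): State.IDLE,
}

def step(state: State, gesture_event: str) -> State:
    """Advance the machine state for a recognized gesture event."""
    return TRANSITIONS.get((state, gesture_event), state)

s = State.IDLE
for event in ("user_enters_region", "bottle_moved_toward_receiver", "gaze_at_output_device"):
    s = step(s, event)
print(s)  # State.PAYING_OUT
```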
The control by a user therefore becomes simple, intuitive, and comfortable, wherein inertia-related waiting times can be substantially avoided for a user, for example, in that activation of receiving devices or outputting devices can take place early on the basis of a recognized gesture.
List of reference numerals
1 reverse vending machine
10, 11 receiving device
12 outputting device
13 housing
2 gesture recognition unit
20, 21 microphone
22 projection device
23 recording device
24 camera
25, 26 loudspeaker
27 control device
28 memory
29 display screen
E capturing region
F empties object
N user
P projection image

Claims (11)

1. Reverse vending machine (1), comprising a processing device (10, 11, 12) for receiving or outputting an empties object (F) or for issuing a receipt, characterized by a gesture recognition unit (2), which is configured to recognize a control gesture of a user (N) located in a capturing region (E) of the gesture recognition unit (2) and to convert said gesture into a control command for controlling the processing device (10, 11, 12), wherein the gesture recognition unit (2) is configured to capture and recognize a control gesture of a user in a three-dimensional manner, wherein the gesture recognition unit (2) is configured to recognize an approaching empties object (F) in order to reject or accept the empties object (F) as a function of the recognition, wherein the reverse vending machine (1), using the gesture recognition unit (2), recognizes whether the approaching empties object (F) is an empties object (F) which can be received and processed at the reverse vending machine (1).

2. Reverse vending machine (1) according to Claim 1, characterized in that the gesture recognition unit (2) has a sensor device (20-24) for providing a sensor signal and a control device (27) for generating a control command from the sensor signal.

3. Reverse vending machine (1) according to Claim 2, characterized in that the sensor device (20-24) has
- at least one microphone (20, 21) for capturing an acoustic signal,
- at least one projection device (22) for generating a projection image (P) in the capturing region (E) and a recording device (23) for recording the projected projection image (P) from the capturing region (E), and/or
- a camera for recording an image of visible light from the capturing region (E).

4. Reverse vending machine (1) according to Claim 3, characterized in that the projection device (22) is configured to generate an infrared projection image (P) and the recording device (23) is configured to capture the projected infrared projection image (P).

5. Reverse vending machine (1) according to Claim 3 or 4, characterized in that the gesture recognition unit (2) is configured to derive spatial information about the user (N) located in the capturing region (E) from the projected projection image (P) recorded by the recording device (23).

6. Reverse vending machine (1) according to any one of Claims 3 to 5, characterized in that the camera (24) is configured as a colour camera.

7. Reverse vending machine (1) according to any one of Claims 3 to 6, characterized in that the sensor device (20-24) has at least two microphones (20, 21), which are spaced apart from one another, for spatially capturing an acoustic signal.

8. Reverse vending machine (1) according to any one of Claims 3 to 7, characterized in that the sensor device (20-24) can be displaced in relation to a housing (13) of the reverse vending machine (1).

9. Reverse vending machine (1) according to any one of the preceding claims, characterized in that the gesture recognition unit (2) is configured to recognize
- a control gesture carried out by the user (N) in the capturing region (E) for introducing an empties object (F),
- the entry of the user (N) into the capturing region (E), and/or
- an empties object (F) approaching the processing device (10, 11).

10. Method for operating a reverse vending machine (1) having a processing device (10, 11, 12) for receiving or outputting an empties object (F) or for issuing a receipt, characterized in that a gesture recognition unit (2) recognizes a control gesture of a user (N) located in a capturing region (E) of the gesture recognition unit (2) and converts said gesture into a control command for controlling the processing device (10, 11, 12), wherein the gesture recognition unit (2) is configured to capture and recognize a control gesture of a user in a three-dimensional manner, wherein the gesture recognition unit (2) recognizes an approaching empties object (F) and rejects or accepts the empties object (F) as a function of the recognition, wherein the reverse vending machine (1), using the gesture recognition unit (2), recognizes whether the approaching empties object (F) is an empties object (F) which can be received and processed at the reverse vending machine (1).

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
EP13172507.9A EP2816536B1 (en) 2013-06-18 2013-06-18 Empty goods return machine

Publications (1)

Publication Number Publication Date
DK2816536T3 true DK2816536T3 (en) 2016-08-29

Family

ID=48625948

Family Applications (1)

Application Number Title Priority Date Filing Date
DK13172507.9T DK2816536T3 (en) 2013-06-18 2013-06-18 Returnable Packaging Machine

Country Status (2)

Country Link
EP (1) EP2816536B1 (en)
DK (1) DK2816536T3 (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102015110758A1 (en) * 2015-07-03 2017-01-05 Mathias Jatzlauk Visualization system with gesture control

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE29812678U1 (en) 1998-07-16 1998-10-29 Sauer Winfried Return station for reusable containers
US7697827B2 (en) * 2005-10-17 2010-04-13 Konicek Jeffrey C User-friendlier interfaces for a camera
WO2007096893A2 (en) 2006-02-27 2007-08-30 Prime Sense Ltd. Range mapping using speckle decorrelation
WO2008087652A2 (en) 2007-01-21 2008-07-24 Prime Sense Ltd. Depth mapping using multi-beam illumination
US8150142B2 (en) 2007-04-02 2012-04-03 Prime Sense Ltd. Depth mapping using projected patterns
WO2009021227A2 (en) * 2007-08-09 2009-02-12 Recyclebank, Llc Drop-off recycling system and method thereof
WO2009093228A2 (en) 2008-01-21 2009-07-30 Prime Sense Ltd. Optical designs for zero order reduction
US8463430B2 (en) * 2008-10-23 2013-06-11 Utique, Inc Interactive and 3-D multi-senor touch selection interface for an automated retail store, vending machine, digital sign, or retail display
DE102010040177A1 (en) * 2010-09-02 2012-03-08 Sielaff Gmbh & Co. Kg Automatenbau Reverse vending machine has control device that switches touch screen from article delivery/return mode to maintenance mode when door is in open state
WO2012072835A1 (en) * 2010-12-03 2012-06-07 Lopez De Aragon Lopez Diego Device for automatic purchase of products, in particular precious metals
FR2970797B1 (en) * 2011-01-25 2013-12-20 Intui Sense TOUCH AND GESTURE CONTROL DEVICE AND METHOD FOR INTERPRETATION OF THE ASSOCIATED GESTURE
DE102011109392B4 (en) * 2011-08-04 2013-05-23 Findbox Gmbh Vending Machine
CN102981615B (en) * 2012-11-05 2015-11-25 瑞声声学科技(深圳)有限公司 Gesture identifying device and recognition methods

Also Published As

Publication number Publication date
EP2816536B1 (en) 2016-05-18
EP2816536A1 (en) 2014-12-24

Similar Documents

Publication Publication Date Title
US10001845B2 (en) 3D silhouette sensing system
TWI653563B (en) Projection touch image selection method
KR101092909B1 (en) Gesture Interactive Hologram Display Appatus and Method
US10380814B1 (en) System for determining entry of user to an automated facility
US8166421B2 (en) Three-dimensional user interface
US11657617B1 (en) Presentation of a user interface for confirming unreliable group data
EP2672880B1 (en) Gaze detection in a 3d mapping environment
CN102473041B (en) Image recognition device, operation determination method, and program
JP6730076B2 (en) Product sales data processing device, product sales data processing system and program
CN108268134B (en) Gesture recognition device and method for taking and placing commodities
CN101379455A (en) Input device and its method
JP6664920B2 (en) Surveillance camera system and surveillance method
US20130127705A1 (en) Apparatus for touching projection of 3d images on infrared screen using single-infrared camera
US9052750B2 (en) System and method for manipulating user interface by 2D camera
KR101623495B1 (en) Apparatus for selling item
US11030980B2 (en) Information processing apparatus, information processing system, control method, and program
JP7254072B2 (en) virtual image display
WO2016053320A1 (en) Gesture based manipulation of three-dimensional images
KR20190066916A (en) Facial recognition type cafe system and method of providing service of threrof
EP3055742A1 (en) System for controlling an industrial machine comprising a touch interface integrated into a glass pane
JP2022528023A (en) Accounting methods, equipment and systems
KR101575063B1 (en) multi-user recognition multi-touch interface apparatus and method using depth-camera
CN106255977A (en) For performing the apparatus and method of variable data acquisition procedure
DK2816536T3 (en) Returnable Packaging Machine
CN108885499A (en) Non-touch controls graphic user interface