EP2668556A2 - Tactile and gestural control device and associated method for interpreting gestures - Google Patents
Tactile and gestural control device and associated method for interpreting gestures
- Publication number
- EP2668556A2 (application EP12705365.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- facade
- interest
- wall
- pointer
- region
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F19/00—Complete banking systems; Coded card-freed arrangements adapted for dispensing or receiving monies or the like and posting such transactions to existing accounts, e.g. automatic teller machines
- G07F19/20—Automatic teller machines [ATMs]
- G07F19/207—Surveillance aspects at ATMs
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0416—Control or interface arrangements specially adapted for digitisers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F9/00—Details other than those peculiar to special kinds or types of apparatus
- G07F9/02—Devices for alarm or indication, e.g. when empty; Advertising arrangements in coin-freed apparatus
- G07F9/023—Arrangements for display, data presentation or advertising
- G07F9/0235—Arrangements for display, data presentation or advertising the arrangements being full-front touchscreens
Definitions
- The invention relates to any device comprising an interactive or tactile facade allowing a user to interact directly with said facade to obtain a service.
- More specifically, but without limitation, the invention relates to vending machines for food (snacks) and hot or cold beverages. It also concerns interactive panels or shop windows for delivering location information, prices, advertisements, etc.
- Vending is a privileged channel for the food industry; in particular, it allows substantial margins. To maximize the attractiveness of such dispensers, manufacturers have chosen to equip their main facade with a touch screen. To reduce the costs associated with implementing this technology and to increase the robustness of the facade, stereovision may be preferred.
- The latter technique consists in arranging, around a generally translucent wall, image sensors and an illuminated or backlit frame in order to detect a finger, or more generally a pointer, near said wall. Using known methods such as triangulation, the position of the pointer on the wall can be determined precisely. Actions can be preprogrammed for particular pointer positions in order to trigger a service.
- There are also gesture recognition systems that offer many features, especially in gaming applications.
- However, these systems do not offer a touch function to validate certain actions.
- To choose and/or validate an operation, the user must generally perform complex or stereotyped gestures (hands, head, etc.) so that the system can interpret them correctly.
- The risks of confusion, slow execution or repeated gestures are real.
- The invention makes it possible to overcome all the disadvantages raised by the known solutions.
- The invention mainly consists in adapting touch control systems whose applications require a high degree of precision when interpreting the presence of pointers on an interactive wall. Thanks to the invention, it is possible to increase the attractiveness of such devices as well as their ability to interact with a user, without degrading the precision with which validation or purchase commands, for example, are interpreted.
- To this end, the invention relates to a device comprising:
- capture means for capturing at least one image of the environment bordering the external face of the wall of said facade and delivering a digital representation of said environment;
- The invention provides that said digital representation comprises a first region of interest describing the immediate vicinity of the wall of the facade and a second region of interest, adjacent to the first, describing the distant surroundings of said wall.
- The invention further provides that the analysis means detect, determine the position of, and/or interpret the displacement of a pointer by jointly exploiting the content of said first and second regions of interest.
- the interactive facade comprises a frame surrounding the outer face of the wall, said frame carrying the capture means;
- the first region of interest describes a representation of all or part of the frame of the facade. Alternatively, it can be provided that:
- the interactive facade comprises a frame surrounding the outer face of the wall, said frame carrying first capture means for delivering a first digital representation describing all or part of the frame of the facade;
- the device comprises second capture means for delivering a second digital representation describing the distant surroundings of the external face of the wall of the facade;
- said first and second digital representations respectively constitute the first and second regions of interest of a digital representation, thus aggregated, of the environment bordering the wall of the facade, which is exploited by the analysis means.
- The analysis means of such a device may interpret the presence of one or more pointers only if they detect such a presence in the second region of interest.
- The invention further provides that a device according to the invention may include means for controlling the lighting of the immediate vicinity of the wall of the facade.
- The invention also provides a method for interpreting touch and gesture commands, implemented by the processing means of a device according to the invention.
- Such a method comprises:
- The method may comprise a step, prior to the implementation of tactile actions, for evaluating the concordance of the estimated Cartesian coordinates in space of the "border" pixels representative of a pointer detected at the boundary of the first and second regions of interest, in order to confirm or deny the continuity of a pointer crossing the two regions of interest;
- In that case, the step for triggering a tactile action is implemented only if the concordance evaluation step attests to the capture of a crossing pointer.
- The method of interpretation may further comprise:
- The invention furthermore provides that such a method may comprise a step, prior to capturing the environment bordering the facade, for parameterizing the processing means so as to associate at least:
- A method according to the invention, implemented by a device comprising means for controlling the lighting of the immediate vicinity of the wall of the interactive facade, may include a step, prior to the capture of the environment surrounding the facade wall, for operating said lighting control means as soon as a pointer has been detected substantially at the boundary of the first and second regions of interest.
- The step for calibrating and setting up the capture means of such a method may then consist in providing the nominal acquisition parameters of the capture means so that the latter can deliver a digital representation of the first region of interest capable of revealing the possible presence of a pointer in the immediate vicinity of the wall of the interactive facade while said vicinity is lit.
- FIG. 1 shows a known device with an interactive facade, in the form of a product vending machine;
- FIG. 2 presents an interactive facade according to the state of the art;
- FIG. 3 shows two examples of images obtained by means of matrix image sensors of an interactive facade according to the state of the art;
- FIG. 4 shows a gesture recognition system according to the state of the art;
- FIGS. 5a and 5b respectively show two types of tactile and gestural control device according to the invention;
- FIGS. 6a to 6d respectively show scenes of capture of a user's gestures according to the invention;
- FIG. 7 illustrates embodiments of a method, implemented by a device according to the invention, for interpreting a user's gestures.
- Figure 1 shows an example of a known interactive device that delivers foodstuffs (sweets, canned drinks, etc.).
- The device 1a comprises means for making a payment: a coin mechanism 3 or a credit card reader, and a receptacle 4 for collecting any change or a purchase receipt.
- It further comprises an interactive facade 10 covering a flat screen 7 on which information is displayed.
- The active face 7a of the screen 7 (as opposed to its rear face 7b) faces the translucent wall of the interactive facade 10.
- A light source 2 diffuses light onto the frame of the facade to backlight it and thus enhance it.
- A user can point 20 (with a finger, for example) to an area of said facade and control the display of information on the screen 7.
- Processing means 30 exploit the content of images captured in the immediate vicinity of the wall of the interactive facade to detect the presence of a pointer and interpret its displacement.
- The processing means trigger predetermined operations, previously associated with particular areas of the wall, to display and/or deliver products according to the location of said pointer on the wall.
- The user can thus collect a product, for example a drink, via a dispensing receptacle 5.
- The device may also be a dispenser of magazines directly visible on their display rack through the wall of the interactive facade.
- Figure 2 describes an interactive facade 10 as used in the previous example.
- The facade comprises a wall 14, generally translucent, surrounded by a frame 15. On the upper part 11 of the frame are arranged, in this example, three matrix image sensors 12a, 12b and 12c.
- Each provides a two-dimensional image of a region of the frame 15, called a region of interest, as shown in Figure 3, which presents two rectangular polygonal images 15a and 15b captured by the sensors 12a and 12b, whose respective fields of view 12v and 12w cannot, on their own, cover the entire wall 14 of the facade.
- A frame is not strictly required.
- The image delivered by the matrix sensor(s) is then that of the ground or of any body limiting the capture field. For optimized detection, however, the presence of a frame is preferred.
- A known facade 10 preferably comprises one or more light sources dedicated to this use.
- In this example, four light-emitting diodes 13a, 13b, 13c and 13d are placed in close proximity to the image sensors.
- Identical diodes may also be arranged all around the frame, for example if the surface of the facade is large. This provides a backlight 2 similar to that of the device described in connection with Figure 1.
- The processing means 30 of a device integrating the interactive facade exploit a set 21 of points extracted from images such as the images 15a and 15b respectively taken by the sensors 12a and 12b. From information delivered during a calibration stage, and from the set 21, the means 30 implement a triangulation function to determine the Cartesian coordinates in space (x, y and z) of a pointer in the immediate vicinity of the wall of the facade. Triangulation requires that at least two sensors capture said pointer. With prior-art techniques, an intense and inhomogeneous illumination of the wall, related to excessive sunlight for example, can greatly reduce the ability to detect and position a pointer.
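- By way of illustration only, the sketch below shows how such a triangulation can be computed, assuming two calibrated sensors whose positions on the frame and bearing angles toward the pointer are already known; the function, its parameters and the numeric values are hypothetical, not taken from the patent.

```python
import math

def triangulate(p1, theta1, p2, theta2):
    """Estimate the (x, y) position of a pointer seen by two sensors.

    p1, p2         -- known (x, y) positions of two image sensors on the frame
    theta1, theta2 -- bearing angles (radians) from each sensor toward the
                      pointer, derived from the pixel column where it appears
    Returns the intersection of the two lines of sight, or None when the
    rays are near-parallel and no reliable position can be computed.
    """
    d1 = (math.cos(theta1), math.sin(theta1))      # direction of sight, sensor 1
    d2 = (math.cos(theta2), math.sin(theta2))      # direction of sight, sensor 2
    denom = d1[0] * d2[1] - d1[1] * d2[0]          # cross product of directions
    if abs(denom) < 1e-9:
        return None                                # parallel rays: no intersection
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    t1 = (dx * d2[1] - dy * d2[0]) / denom         # distance along the first ray
    return (p1[0] + t1 * d1[0], p1[1] + t1 * d1[1])

# Two sensors on the upper edge of a 1 m wide frame, both seeing a finger:
print(triangulate((0.0, 0.0), math.radians(-45), (1.0, 0.0), math.radians(-135)))
# -> (0.5, -0.5): the pointer sits mid-width, 0.5 m below the upper edge
```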
- The shadow cast by a pointer, or by any object close to the wall, can lead to inadvertent detection.
- The backlighting of the frame, combined with a calibration of the exposure parameters of the image sensors, makes it possible to maximize the detection capability of the interactive facade.
- The means 30 being capable of associating predetermined actions with respective zones of the wall 14 of the facade 10, the exploitation of the images delivered by the sensors 12a, 12b and 12c thus makes it possible to make the wall 14 "tactile".
- When a pointer, such as a hand 20, is captured by one or more image sensors of the facade 10, the means 30 trigger the appropriate action.
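- As a hedged sketch of this association between wall zones and predetermined operations (the zone bounds and action names below are invented for illustration), the triangulated position can simply be looked up in a preprogrammed table:

```python
# Hypothetical mapping of rectangular wall zones (in metres, facade frame
# of reference) to the operations preprogrammed by the means 30
ZONES = [
    {"bounds": (0.0, 0.0, 0.5, 1.0), "action": "display_product_info"},
    {"bounds": (0.5, 0.0, 1.0, 1.0), "action": "dispense_product"},
]

def action_for(x, y):
    """Return the operation associated with the zone containing (x, y)."""
    for zone in ZONES:
        x0, y0, x1, y1 = zone["bounds"]
        if x0 <= x < x1 and y0 <= y < y1:
            return zone["action"]
    return None  # pointer detected outside any active zone

print(action_for(0.25, 0.4))  # -> 'display_product_info'
```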
- Figure 4 illustrates such a gesture recognition system.
- One or more image sensors 18 capture a stream of images of the environment 102 surrounding said sensors.
- A control device includes a processing unit 30 for analyzing the captured images and triggering graphic, textual or sound output on an interface 19, for example a living-room television implementing a video game.
- The means 30 are able to interpret the movements or gestures of the person thus filmed.
- The means 30 can reproduce them on the TV screen 19 by means of an avatar, or interpret the movements of an arm or hands to select a program, etc.
- The means 30 calculate an optical flow corresponding to the difference between two images captured a few moments apart. From it, the speed and orientation of a set of pixels can be deduced.
- The control unit discriminates between the different gestures of the user (lateral movements, jumps, etc.) and associates them with predetermined actions. A particular movement can thus trigger a validation or cancellation action, or actuate an image carousel to select an item among several.
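- As an illustration, such an optical flow can be computed with a standard library as sketched below; cv2.calcOpticalFlowFarneback and cv2.cartToPolar are real OpenCV functions, while the frames, the threshold and the notion of "dominant motion" are assumptions made for the example.

```python
import cv2
import numpy as np

def dominant_motion(prev_frame, next_frame, min_speed=2.0):
    """Return the mean (speed, direction) of moving pixels between two frames.

    prev_frame, next_frame -- grayscale images captured a few moments apart
    min_speed              -- pixels/frame below which motion counts as noise
    """
    # Dense Farneback optical flow: one (dx, dy) vector per pixel
    flow = cv2.calcOpticalFlowFarneback(prev_frame, next_frame, None,
                                        pyr_scale=0.5, levels=3, winsize=15,
                                        iterations=3, poly_n=5, poly_sigma=1.2,
                                        flags=0)
    magnitude, angle = cv2.cartToPolar(flow[..., 0], flow[..., 1])
    moving = magnitude > min_speed              # keep clearly moving pixels only
    if not np.any(moving):
        return None                             # no significant gesture
    return float(magnitude[moving].mean()), float(angle[moving].mean())

# A mean angle near 0 or pi suggests a lateral movement; a burst of vertical
# motion could be discriminated as a jump, and so on.
```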
- Gesture recognition is usually rather coarse, but provides a great wealth of expression.
- FIG. 5a shows a user 20 able to interact with an interactive shop window 1b of a store, for example, said window 1b being adapted according to the invention into a device with tactile and gestural controls.
- This window has an interactive facade 10 similar to the facade described in connection with Figure 2.
- The image sensor(s) of the facade are adapted to capture not only the immediate vicinity 101 of the wall of the facade 10 but also the more distant surroundings 102.
- FIG. 5b describes a variant in which a user 20 can interact with a container of products, for example foodstuffs, also comprising an interactive facade 10 similar to that described with reference to FIG. 2.
- The device is capable of interpreting a user's touch commands, enabling him to select a product by designating a particular area of the wall of the interactive facade, with a finger for example.
- The conventional tactile recognition techniques previously described, implemented by the container, allow it to accurately detect the presence and position of a pointer in the immediate vicinity of the wall of the facade. For this, digital representations from the facade capture means, calibrated to capture the immediate vicinity 101 of the latter, are exploited by processing means 30, not shown in Figure 5b.
- Such a device is further adapted - according to the invention - to include capture means complementary to those of the interactive facade 10.
- The complementary capture means are able to capture the distant surroundings 102 of said facade 10.
- The gestures of a user 20 of the device can thus be captured.
- With his hands, the user can choose ingredients and determine the proportions of a mixture while remaining at a distance from the wall of the facade. He can also validate his order by pointing at a suitable area of the facade with a finger, for example, positioning it so that it comes into contact with the wall of the facade 10.
- The complementary capture means consist, for example, of a camera 18 (not shown in FIG. 5b) positioned above the facade 10.
- The devices of FIGS. 5a and 5b are thus able to obtain images describing both the immediate vicinity 101 of the interactive facade and its distant surroundings 102. They are also able to trigger actions adapted to the gestures thus captured and interpreted.
- The invention thus multiplies the possibilities of interaction between a user and an interactive device with gestural and tactile controls.
- The processing means 30 of an interactive shop window 1b or of a container are thus also adapted - in accordance with the invention - to analyze the digital representation of the environment surrounding the external wall of the interactive facade by considering that said digital representation comprises a first region of interest ROIt describing the immediate vicinity 101 of the wall of the facade and a second region of interest ROIg, adjacent to the first, describing the distant surroundings 102 of said wall.
- Said means 30 are further arranged to detect, position and/or unambiguously interpret the displacement of a pointer by jointly exploiting the content of said first and second regions of interest.
- The region ROIt corresponds to a region of interest we will call "tactile", and the region ROIg to a region of interest we will call "gestural".
- Each of these two regions of interest ROIt and ROIg can be exploited by the processing means 30 in a known manner, as described above.
- The processing means 30 of a device according to the invention are, however, able to characterize and interpret a user's gesture differently according to whether it is detected in only one of the regions of interest ROIt and ROIg, or in both.
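- As a minimal sketch (assuming a single camera whose frame covers both zones and a border f fixed at calibration time; the row index is hypothetical), dissociating ROIt and ROIg can amount to slicing the captured image at the border:

```python
import numpy as np

BORDER_ROW = 40   # hypothetical row index of the border f, set at calibration

def split_regions(image):
    """Split a captured frame into the tactile and gestural regions of interest.

    image -- 2D array whose first BORDER_ROW rows image the immediate
             vicinity 101 of the wall, the remaining rows the distant 102
    """
    roi_t = image[:BORDER_ROW, :]   # ROIt: "tactile" region of interest
    roi_g = image[BORDER_ROW:, :]   # ROIg: "gestural" region of interest
    return roi_t, roi_g
```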
- FIGS. 6a to 6c describe the method for characterizing the gesture of a user whose hand is detected in one or both regions of interest.
- Prior to the detection of a pointer, the means 30 are parameterized to associate one or more particular actions, called "gestural actions", with each gesture characterizable in the ROIg zone. They are also set to associate one or more particular actions, called "tactile actions", with a position or a movement of a pointer in the ROIt zone.
- The invention provides that priority of analysis and interpretation is given to the exploitation of the tactile region of interest ROIt.
- Failing a detection there, the processing means 30 interpret gestures possibly captured in the ROIg zone.
- Figure 6a illustrates a hand 20 detected only in the ROIg zone. Its movements G1 or G2 can be discriminated by the processing means 30 and then interpreted to trigger the gestural actions previously associated with such theoretical movements.
- FIG. 6b describes a situation in which the hand 20 now performs a displacement G3, gradually bringing it closer to the wall of the interactive facade. As long as the border f between the two zones ROIt and ROIg is not crossed, the gestural interpretation of the hand continues.
- FIG. 6c now describes the continuation of the movement G3 previously described.
- One of the fingers of the user's hand eventually comes into contact with the wall of the interactive facade.
- The respective contents of the digital representations of the ROIt and ROIg zones inform the means 30 that a pointer is now detected in the ROIt zone.
- Said means 30 now interpret the user's gesture and translate it into tactile actions. Only the position of the finger in contact with the wall, or its displacement in the immediate vicinity of the wall, is interpreted by the processing means 30. The user can then issue tactile commands resulting in validation actions, or actions that require a high degree of precision in their interpretation.
- Figure 6d shows a situation in which several pointers are captured simultaneously.
- A first pointer 20 crosses the ROIg and ROIt zones.
- The interpretation of the gesture of the pointer 20 results in the implementation of tactile actions only.
- A second pointer 20g is detected only in the region ROIg.
- The processing means 30 interpret the movements of said pointer 20g by implementing gestural actions.
- A third pointer 20f is detected only in the ROIt region.
- The invention provides a variant embodiment in which, while maintaining priority for the detection of a pointer in the so-called tactile region ROIt, an additional continuity check of said pointer can be implemented by the means 30: it verifies that a pointer is detected in the adjacent zone ROIg whose Cartesian coordinates in space, ct and cg (taken respectively at the adjoining ends of the neighboring zones ROIt and ROIg), are substantially equivalent at the border f of the two zones. If so - the case of the pointer 20 in FIG. 6d - the processing means 30 validate the detection of a pointer in the tactile zone ROIt. Conversely - the case of the pointer 20f in Figure 6d - the means 30 invalidate the detection and simply ignore this pointer 20f.
- Such a "false" pointer 20f can result, for example, from chewing gum maliciously stuck on the wall of the facade, or from an insect of sufficient size whose capture could lead to an irrelevant pointer detection.
- Only pointers crossing the border f of the ROIt and ROIg zones are thus considered as pointers whose positioning and displacement are interpreted as tactile actions.
- A method 200 for interpreting the gestures of a user according to the invention is presented in connection with FIG. 7. Such a method is implemented by the processing means 30 of a tactile and gestural control device comprising an interactive facade.
- The method comprises a first step 201 for capturing one or more images of the environment bordering the interactive facade. It also includes a step 202 for analyzing the content of the digital representation ROIg of the distant surroundings 102 of the facade.
- The method comprises a step 204 for detecting and characterizing the gesture of a pointer possibly present in said region of interest ROIg. If no pointer is detected in this region, the method ends 210: no gestural or tactile action is performed.
- The method also comprises a step 203 for analyzing the content of the digital representation ROIt of the immediate vicinity of the facade, the region ROIt being adjacent to ROIg.
- The method comprises a step 205 for detecting and characterizing the displacement of a pointer possibly present in the region of interest ROIt. If a pointer has previously been detected 204 in the ROIg region then, according to the invention, the processing means 30 examine the results of the analysis 203 of the content of the ROIt region. If these results do not reveal the presence of a pointer in ROIt, the processing means implement the appropriate gestural action(s) 206 in order to interpret the remote gestures of the user. In the opposite case, if the presence of a pointer is confirmed 205 in the ROIt region, the processing means implement the appropriate tactile action(s) 207 to interpret only the position and/or the displacement of the pointer in the immediate vicinity 101 of the interactive facade.
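- One iteration of this interpretation logic might be sketched as follows; detect_pointer is a deliberately naive hypothetical stub (bright-pixel centroid), and the returned labels merely stand in for the gestural actions 206 and tactile actions 207:

```python
import numpy as np

def detect_pointer(roi, threshold=200):
    """Hypothetical stub: centroid of bright pixels, or None if none found."""
    ys, xs = np.nonzero(roi > threshold)   # bright pixels stand in for a pointer
    if xs.size == 0:
        return None
    return float(xs.mean()), float(ys.mean())

def interpret(frame, border_row=40):
    """Sketch of one pass through steps 202-207 of the method 200."""
    roi_t, roi_g = frame[:border_row], frame[border_row:]   # steps 202/203
    if detect_pointer(roi_g) is None:                       # step 204
        return "no_action"                                  # step 210
    if detect_pointer(roi_t) is not None:                   # step 205
        return "tactile_action"    # step 207: priority to the ROIt region
    return "gestural_action"       # step 206: interpret the remote gesture
```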
- Such a method may further comprise an additional concordance test 208 prior to the implementation of tactile actions.
- This test is performed by the means 30 when a pointer has been detected (204, 205) simultaneously in the ROIt and ROIg regions.
- This test consists in evaluating the concordance of the estimated Cartesian coordinates in space of the "border" pixels of the pointers detected at the border f of the two regions of interest ROIt (coordinates ct) and ROIg (coordinates cg). It makes it possible to confirm or invalidate the continuity of a pointer crossing the two regions of interest.
- If the continuity is confirmed, the tactile actions 207 are implemented.
- A tolerance threshold can be defined to characterize the border f between the two regions of interest ROIt and ROIg and/or to estimate the concordance of the coordinates ct and cg characterizing a crossing pointer.
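- A sketch of this concordance test 208 is given below, assuming each detection exposes the (x, y) coordinates of its pixels closest to the border f; the coordinate layout and the tolerance value are hypothetical:

```python
def is_crossing(ct, cg, tolerance=5.0):
    """Concordance test 208 (sketch): are two detections one crossing pointer?

    ct        -- (x, y) of the border pixels of the detection in ROIt, at f
    cg        -- (x, y) of the border pixels of the detection in ROIg, at f
    tolerance -- maximum distance (in pixels) for the two detections to be
                 deemed the same pointer crossing the border f
    """
    dx, dy = ct[0] - cg[0], ct[1] - cg[1]
    return (dx * dx + dy * dy) ** 0.5 <= tolerance

# The genuine pointer 20 of FIG. 6d matches across f, whereas a chewing gum
# stuck on the wall (pointer 20f) has no ROIg counterpart and is ignored.
print(is_crossing((120.0, 39.0), (121.5, 41.0)))   # -> True
```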
- The digital representations of the two regions of interest can be obtained from a single sensor (for example 12b), from a single set of sensors [12a, 12b, 12c], or from two sets of sensors [12a, 12b, 12c] and 18 respectively dedicated to capturing the immediate vicinity and the distant surroundings of the interactive facade.
- The method 200 described in connection with FIG. 7 may then comprise a preliminary step 209 for calibrating the different capture means. Let us first take the case of a device 1b whose capture means deliver a global representation of the environment surrounding the outer wall of the interactive facade.
- A parameterization of the processing means 30 is necessary during the calibration 209 of the capture means. It suffices to determine the limits of the two respective zones and thus materialize the border f.
- Once the ROIt and ROIg zones are dissociated, the processing means 30 analyze them with the techniques suited to detecting and interpreting pointer movements, in order to trigger the appropriate gestural or tactile actions.
- Lighting is provided to allow capture of the environment surrounding the facade wall under nominal acquisition parameters.
- The nominal acquisition parameters make it possible to capture the scene in the distant surroundings 102 of the facade and to trigger gestural actions 206.
- A backlighting of the facade is also provided such that the presence of a pointer in the immediate vicinity of the facade is not capturable (because of pixel saturation, for example) under the nominal acquisition parameters.
- When the processing means 30 detect the crossing of the border f, the acquisition parameters are switched to "touch" mode to take the backlight into account and make a pointer capturable in the immediate vicinity of the facade. In doing so, only the ROIt region is then able to reveal the presence of a pointer and thus trigger tactile actions.
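- This switch of acquisition parameters might be sketched as below; the parameter values and the camera-control interface are assumptions, only the principle (a "touch" profile that copes with the backlight once the border f is crossed) comes from the text:

```python
# Hypothetical acquisition profiles; values are illustrative only
NOMINAL_MODE = {"exposure_ms": 20.0, "gain": 4.0}  # distant scene 102, gestures 206
TOUCH_MODE   = {"exposure_ms": 2.0,  "gain": 1.0}  # backlit vicinity 101, touch 207

class Acquisition:
    """Tracks the acquisition mode of the capture means (sketch)."""

    def __init__(self, camera):
        self.camera = camera               # hypothetical camera-control object
        self.set_mode(NOMINAL_MODE)

    def set_mode(self, mode):
        self.camera.configure(**mode)      # assumed camera API, not a real library
        self.mode = mode

    def on_border_crossing(self, crossed):
        # "touch" mode while a pointer crosses the border f, nominal otherwise
        self.set_mode(TOUCH_MODE if crossed else NOMINAL_MODE)
```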
- Alternatively, the invention provides that the illumination may itself be controlled while the acquisition parameters of the capture means remain invariant.
- The frame of the interactive facade may then be illuminated (or backlit) only upon detection of a crossing of the border f between the two regions of interest.
- As long as the frame is not lit, the processing means 30 cannot detect pointers that would be present only in the immediate vicinity of the facade.
- Upon a crossing of the border f, the frame is illuminated (or backlit).
- The detection 205 of a pointer in the immediate vicinity then becomes possible.
- Incidentally, the lighting (or backlighting) of the facade warns the user that he is about to issue a touch command, presumably to validate an order and/or make a purchase.
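- This variant with invariant acquisition parameters reduces to driving the frame lighting, as in the sketch below; the LED-driver interface is again a hypothetical placeholder:

```python
class FrameLighting:
    """Drives the frame lighting of the interactive facade (variant sketch)."""

    def __init__(self, driver):
        self.driver = driver   # hypothetical LED-driver object with on()/off()
        self.lit = False

    def on_border_crossing(self, crossing):
        # Light the frame only while a pointer crosses the border f: detection
        # 205 in the immediate vicinity is possible only while the frame is lit,
        # and the light also warns the user that a touch command is expected.
        if crossing and not self.lit:
            self.driver.on()
            self.lit = True
        elif not crossing and self.lit:
            self.driver.off()
            self.lit = False
```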
- In the case of a device whose two regions of interest are captured by two distinct sets of capture means, the processing implemented by the means 30 is somewhat different.
- The first capture means, incorporated in the interactive facade, are positioned and calibrated 209 to capture the immediate vicinity 101 of the wall of the facade.
- A backlight, of the same type as the backlighting 2 of the device 1a, may be provided.
- The second capture means can be positioned and calibrated (exposure in particular) for normal acquisition conditions. This time, the backlight may saturate the pixels relating to the immediate vicinity.
- The region ROIg is thus materialized. Any other method for determining the geometry, adjacency and alignment of the regions of interest ROIt and ROIg could be implemented without limiting the scope of the invention.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Business, Economics & Management (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- User Interface Of Digital Computer (AREA)
- Control Of Vending Devices And Auxiliary Devices For Vending Devices (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1150590A FR2970797B1 (fr) | 2011-01-25 | 2011-01-25 | Dispositif a commandes tactile et gestuelle et procede d'interpretation de la gestuelle associe |
PCT/FR2012/050150 WO2012101373A2 (fr) | 2011-01-25 | 2012-01-24 | Dispositif à commandes tactile et gestuelle et procédé d'interprétation de la gestuelle associé |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2668556A2 true EP2668556A2 (fr) | 2013-12-04 |
Family
ID=45755389
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12705365.0A Withdrawn EP2668556A2 (fr) | 2011-01-25 | 2012-01-24 | Dispositif à commandes tactile et gestuelle et procédé d'interprétation de la gestuelle associé |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2668556A2 (fr) |
FR (1) | FR2970797B1 (fr) |
WO (1) | WO2012101373A2 (fr) |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2816536B1 (fr) * | 2013-06-18 | 2016-05-18 | Wincor Nixdorf International GmbH | Automate de reprise pour bouteilles consignées |
CN103530943A (zh) * | 2013-09-27 | 2014-01-22 | 申烽 | 一种售餐机 |
EP3220322A1 (fr) * | 2014-04-03 | 2017-09-20 | Cubic Corporation | Distributeur automatique commandé à distance |
JP6335695B2 (ja) * | 2014-07-09 | 2018-05-30 | キヤノン株式会社 | 情報処理装置、その制御方法、プログラム、及び記憶媒体 |
2011
- 2011-01-25: FR FR1150590A patent/FR2970797B1/fr not_active Expired - Fee Related
2012
- 2012-01-24: WO PCT/FR2012/050150 patent/WO2012101373A2/fr active Application Filing
- 2012-01-24: EP EP12705365.0A patent/EP2668556A2/fr not_active Withdrawn
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20010012001A1 (en) * | 1997-07-07 | 2001-08-09 | Junichi Rekimoto | Information input apparatus |
US20020071277A1 (en) * | 2000-08-12 | 2002-06-13 | Starner Thad E. | System and method for capturing an image |
US20050277071A1 (en) * | 2004-06-14 | 2005-12-15 | Microsoft Corporation | Method for controlling an intensity of an infrared source used to detect objects adjacent to an interactive display surface |
US20090102800A1 (en) * | 2007-10-17 | 2009-04-23 | Smart Technologies Inc. | Interactive input system, controller therefor and method of controlling an appliance |
EP2258587A1 (fr) * | 2008-03-19 | 2010-12-08 | Denso Corporation | Dispositif d'entrée de commande pour véhicule |
Also Published As
Publication number | Publication date |
---|---|
WO2012101373A3 (fr) | 2014-06-26 |
FR2970797B1 (fr) | 2013-12-20 |
FR2970797A1 (fr) | 2012-07-27 |
WO2012101373A2 (fr) | 2012-08-02 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | Free format text: ORIGINAL CODE: 0009012 |
| AK | Designated contracting states | Kind code of ref document: A2; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| AX | Request for extension of the european patent | Extension state: BA ME |
| DAX | Request for extension of the european patent (deleted) | |
| R17D | Deferred search report published (corrected) | Effective date: 20140626 |
| 17P | Request for examination filed | Effective date: 20150105 |
| RBV | Designated contracting states (corrected) | Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
| 17Q | First examination report despatched | Effective date: 20170627 |
| STAA | Information on the status of an ep patent application or granted ep patent | Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
| 18D | Application deemed to be withdrawn | Effective date: 20200801 |