EP2686760A2 - Dispositif interactif robuste aux ombres portées - Google Patents
- Publication number
- EP2686760A2 (application EP12714805.4A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- wall
- representation
- facade
- region
- interest
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07F—COIN-FREED OR LIKE APPARATUS
- G07F9/00—Details other than those peculiar to special kinds or types of apparatus
- G07F9/02—Devices for alarm or indication, e.g. when empty; Advertising arrangements in coin-freed apparatus
- G07F9/023—Arrangements for display, data presentation or advertising
- G07F9/0235—Arrangements for display, data presentation or advertising the arrangements being full-front touchscreens
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/0325—Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked up image
Definitions
- The invention relates to a device comprising an image acquisition and processing system for an interactive or touch-sensitive facade, enabling a user to interact directly with said facade to obtain a service.
- The invention relates more specifically, but in a non-limiting manner, to vending machines for food (snacks) and hot or cold beverages. It also concerns interactive panels or shop windows for delivering location information, prices, advertisements, etc.
- Automatic vending is a preferred distribution channel for the food industry; in particular, it allows substantial margins.
- Vending machines are sometimes equipped with an interactive facade exploiting the technique of stereovision.
- This technique consists in providing, around a generally translucent wall, sensors for capturing a region of interest describing the immediate vicinity of the wall, in order to detect a finger, for example, or more generally a pointer near said wall.
- An illuminated or even backlit frame surrounding the wall generally serves as the reference for pointer detection.
- Using known methods, such as triangulation, it is possible to determine precisely the position of the pointer on the wall. Actions can be preprogrammed for a particular pointer position to trigger a specific service.
- The invention relates, according to a first object, to a device comprising an image acquisition and processing system for an interactive facade, said facade comprising a wall and image capture means for providing a digital representation of a region of interest describing the immediate vicinity of the wall, said system comprising means for analyzing said representation and detecting the presence of a pointer near said wall.
- The device comprises means having a projecting portion raised relative to the plane of the outer wall of the facade, the depth of said relief being less than the depth of the region of interest, so that the digital representation provided by the capture means comprises a first band describing the means having the projecting portion and a second band describing the far field not masked by said means.
- The means for analyzing a digital representation are arranged to implement a method for identifying a set of pixels, each having at least one side in common with one or more others and of substantially identical light intensities, reflecting the capture of a pointer crossing the region of interest from the far field to the outer wall of the facade.
- The means having a protruding portion may consist of a projecting frame encircling all or part of the wall of the facade, the depth of the projecting portion of said frame being strictly less than that of the captured region of interest.
- The invention furthermore relates to a method for adapting a device comprising an image acquisition and processing system for an interactive facade, said facade comprising a wall and image capture means for providing a digital representation of a region of interest describing the immediate vicinity of the wall, said system comprising means for analyzing said representation and detecting the presence of a pointer near said wall.
- FIG. 1 shows an interactive facade according to the state of the art
- FIG. 2 shows two examples of images obtained by means of a matrix image sensor according to the state of the art
- FIG. 3 describes a functional system making it possible, from an interactive facade, to detect a pointer close to it;
- FIG. 4 shows an example of a device adapted to host an interactive facade
- FIG. 5 describes an interactive facade illuminated directly by an ambient light source
- FIG. 6 presents capture means for existing image acquisition and image processing systems
- FIG. 9a illustrates the exploitation of an image captured by a system for acquiring and processing images of an existing device
- FIG. 7 describes an example of an interactive device according to the invention.
- FIG. 8 shows a capture scene by an image acquisition and image processing system embedded in a device according to the invention
- FIGS. 9b and 10 illustrate the use of an image captured by a system for acquiring and processing images of a device according to the invention
- FIG. 11 shows an exemplary method implemented by an image acquisition and image processing system embedded in a device according to the invention.
- FIG 1 shows an interactive facade 10 used in a device, for example an interactive totem.
- a vending machine for drinks or, more generally, food ...
- the facade 10 has a wall 14, generally translucent, surrounded by a frame 15.
- Image capture means 12a, 12b and 12c each take the form of a matrix image sensor (or camera), such as the sensor 12 described later in connection with FIG. 8.
- Each sensor provides a two-dimensional image of a region of the frame 15, called the region of interest.
- FIG. 2 illustrates, by way of example, two images, a polygonal one 15a and a rectangular one 15b, captured using the capture means 12a and 12b, whose respective fields of view ch12a and ch12b do not by themselves cover the entire wall 14 of the facade.
- The facade 10 may include one or more light sources dedicated to this use.
- four light-emitting diodes 13a, 13b, 13c and 13d are placed in the immediate vicinity of the image capture means.
- Identical diodes are also arranged all around the frame, especially if the surface of the facade is large.
- The sensors of interactive facades are sometimes mainly sensitive in the near infrared, to reduce sensitivity to changes in ambient brightness. In this case, the diodes emit at these same wavelengths.
- The arrangement of the sensors 12a, 12b and 12c makes it possible to make the wall 14 touch-sensitive. Thus, as soon as a pointer, such as a hand P, approaches the wall, it is captured by one or more of the capture means of the facade 10.
- FIG. 3 illustrates a block diagram of a device comprising an acquisition and image processing system for an interactive facade.
- A processing unit 30 exploits a set 21 of points whose coordinates are expressed in a two-dimensional space (u, v), resulting from the images 15a, 15b and 15c taken respectively by the capture means 12a, 12b and 12c.
- From information 23 delivered during a parameterization step, and from the set 21, the means 30 implement a triangulation function to determine the spatial position 15z in three dimensions (x, y and z) of a pointer near the wall of a facade. To implement a triangulation function, at least two capture means must be able to capture said pointer; a simplified sketch of the principle follows.
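By way of illustration only, the following sketch reduces the triangulation to the facade plane, assuming each capture means has already been converted into a known position in that plane and a bearing angle toward the pointer; the function name and conventions are assumptions, not the patent's own implementation.

```python
import math

def triangulate_planar(pos_a, angle_a, pos_b, angle_b):
    """Intersect two rays cast in the facade plane from sensors A and B
    (known positions) along the measured bearing angles, in radians.
    Returns the pointer position (x, y) in that plane, or None when the
    rays are parallel."""
    xa, ya = pos_a
    xb, yb = pos_b
    dxa, dya = math.cos(angle_a), math.sin(angle_a)
    dxb, dyb = math.cos(angle_b), math.sin(angle_b)
    denom = dxa * dyb - dya * dxb
    if abs(denom) < 1e-9:
        return None  # degenerate configuration: both bearings are parallel
    t = ((xb - xa) * dyb - (yb - ya) * dxb) / denom
    return (xa + t * dxa, ya + t * dya)

# Example: sensors in two corners of the frame, the pointer seen at
# 45 degrees from the first and 135 degrees from the second.
print(triangulate_planar((0.0, 0.0), math.radians(45),
                         (1.0, 0.0), math.radians(135)))  # -> approximately (0.5, 0.5)
```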
- means 31 can trigger an appropriate action to deliver a service to a user.
- Figure 4 describes an example of a device hosting an interactive facade.
- a device 1 of the "interactive showcase” type delivers information: geographical positioning, subway map, jukebox etc.
- the device 1 may comprise means for making a payment (coin mechanism 3 and currency receptacle 4).
- It further comprises an interactive facade 10, generally large, which can cover a flat screen 7 on which information is displayed.
- the active face 7a of the screen 7 (as opposed to its rear face 7b) faces the translucent wall of the interactive facade 10.
- a light source 2 diffuses light on the frame of the facade to backlight and thus highlight the latter.
- A user can point (P) at an area of said facade and thus control the display of information on the screen 7.
- It can also, alternatively, collect a product, such as a digital medium for example, via a distribution receptacle 5.
- In a variant, the device 1 is a food vending machine. According to this variant, the products are not directly visible on display; visual information (photos, videos, composition, etc.) and/or advertising is presented instead.
- A device comprising an interactive facade can be installed in various places where the ambient brightness may be variable or extreme and may generate shadows likely to be detected as false pointers by the image acquisition and processing system for the interactive facade.
- FIG. 5 illustrates a situation in which an interactive facade - such as the facade 10 described with reference to FIG. 1 - is subjected to ambient lighting S likely to alter the detection capacity of an existing system. This same situation will also be exploited to illustrate the contribution of the invention.
- The interactive facade 10 described in FIG. 5 comprises a wall 14 surrounded by a frame 15 whose projecting portion forms a relief relative to the plane of the outer wall of the facade.
- the depth of said relief (or the depth of the projecting portion of the frame) is hc.
- An image of the frame captured by capture means 12 is the reference for a pointer detection process in the immediate vicinity of the wall.
- the capture field of the means 12 is schematized by the lines cc1 and cc2.
- Intense illumination - symbolized by a sun S whose rays form an extremely bright beam schematized by the lines s1 and s2 - partially illuminates the facade.
- A pointer P - in the form of the index finger of a hand - moves in the immediate vicinity of the wall.
- the region of interest captured by the means 12 corresponds to a portion of the frame 15 embraced by the capture field.
- the capture means 12 are configured or parameterized so that the depth of said region of interest is identical to that of the frame 15: hc.
- This portion of the frame constitutes the reference for the detection process implemented by the image acquisition and processing system for the interactive facade.
- FIG. 6 depicts an image captured by capture means 12 of a facade exposed to intense illumination such as the facade 10 described with reference to FIG. 5.
- The capture means 12 described in FIG. 6 consist of a single matrix sensor.
- Said capture means 12 could be multiple such as the means 12a, 12b and 12c described in connection with Figure 1.
- the single sensor 12 is configured to capture only the frame 15 as a region of interest.
- The rectangular image I thus obtained mainly comprises pixels reflecting the reference frame, as well as two regions za and zb respectively reflecting the capture of the pointer P and that of its drop shadow P'.
- Said regions za and zb respectively represent two distinct sets of pixels having at least one side in common with one or more others and whose light intensities are substantially identical within each set.
- a system as described in connection with FIG. 3 translates such an image I delivered by the sensor 12 into a representation such as that described with reference to FIG. 9a.
- Said rectangular representation consists of 3 rows L1 to L3 of 16 pixels each (i.e. 16 columns of pixels C1 to C16). Within a row, pixels can be discriminated according to a criterion of luminosity or luminous intensity.
- Said brightness criterion is expressed on a scale of 256 values: 0 for the darkest pixels and 255 for the brightest. Another scale of values could be used.
- The capture means 12 deliver a representation I close to the representation Ir of the reference frame, except for a few pixels whose respective light intensities may be greater or less than the average brightness of the pixels representing the reference frame. This difference in light intensity is used to reveal the possible presence of a pointer more or less bright than the reference frame.
- A threshold δ - predetermined or parameterizable - is applied to the average light intensity value of a reference (frame or floor) to detect a pointer.
- An image acquisition and processing system for an interactive facade can subtract from the representation I of a scene describing the immediate vicinity of the facade (matrix image at the bottom left of FIG. 9a) the representation Ir of an identical scene in which only the reference is captured (matrix image at the top left of FIG. 9a); a sketch of this subtraction follows.
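A minimal sketch of this reference subtraction and thresholding, assuming the representations I and Ir are 8-bit grayscale NumPy arrays and that the threshold δ is a plain intensity value passed as `delta`; the function names are illustrative assumptions.

```python
import numpy as np

def difference_image(I, Ir):
    """Per-pixel absolute difference between the live capture I and the
    reference capture Ir (both 2-D uint8 arrays on a 0-255 scale)."""
    return np.abs(I.astype(np.int16) - Ir.astype(np.int16)).astype(np.uint8)

def candidate_pixels(I, Ir, delta):
    """Boolean mask of pixels departing from the reference by at least
    delta, i.e. possibly belonging to a pointer or to its shadow."""
    return difference_image(I, Ir) >= delta
```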
- the invention aims to design or adapt devices comprising an interactive facade to eliminate the disadvantages related in particular to the environment.
- A device designed according to the invention has particularly competitive manufacturing costs (removal of light sources on the facade, use of sensors sensitive in the visible spectrum, etc.).
- the invention provides that the capture means 12 (or even means 12a, 12b and 12c described with reference to FIG. 1) are configured to capture a region of interest with a relief.
- a device 1 comprises an interactive facade provided with capture means 12. The latter are configured to capture a region of interest 101 constituted by the ground F on which the device 1 rests. The depth of the region of interest is noted hr.
- The device is provided with projecting means 15, in the form of a strip protruding under the wall or a projecting frame surrounding all or part of said wall of the facade 10.
- the depth hc of the protruding part of said means 15 is less than that hr of the captured region of interest.
- the depth hc of the relief formed by said projecting portion relative to the plane of the outer wall of the facade is less than the depth of the region of interest hr.
- The capture means 12 capture an image I, as described in FIG. 8, of a facade exposed to intense lighting such as the facade 10 described in connection with FIG. 5.
- the capture means 12 described in Figure 8 consist of a single matrix sensor.
- Said capture means 12 could be multiple such as the means 12a, 12b and 12c described in connection with Figure 1.
- The single sensor 12 is configured to capture a portion 101 of the ground F on which the device rests.
- The capture field also encompasses the frame or the strip 15.
- The delivered representation I comprises - in the absence of pointers - two bands: one, 1100, describing the immediate vicinity 100 of the wall and reflecting the means 15 thus captured, and the other, 1101, describing the far field and reflecting the captured region 101 of the ground F on which the interactive device rests.
- These two bands 1100 and 1101 constitute the mixed frame-ground reference system for the detection process.
- When a pointer P crosses the region of interest, it is captured. The same applies to its shadow P'.
- The image I thus obtained mainly comprises the pixels reflecting the reference, as well as three regions za, zb and zc.
- Said regions za, zb and zc respectively correspond to three distinct sets of pixels, each having at least one side in common with one or more others and whose light intensities are substantially identical within each set.
- The region za reflects the capture of the pointer P. Since the pointer crosses the whole region of interest, the region za spans all the rows of pixels of the matrix representation I.
- zb is a set of pixels, each having at least one side in common with one or more others, partially covering the band reflecting the projecting means 15.
- zc is a set of pixels, each having at least one side in common with one or more others, partially covering the band reflecting the ground reference F.
- A device - comprising an interactive facade 10 as described with reference to FIG. 1, with a projecting frame 15 surrounding all or part of the wall of the facade - has been adapted so that the capture means - traditionally configured to capture only said reference frame - are parameterized to capture slightly beyond said frame, so that the region of interest thus defined covers not only the frame 15 but also an area of the ground F on which the device rests.
- the depth hr of the region of interest is thus greater than the depth hc of the relief formed by the protruding portion of the frame 15.
- Two adjacent bands, respectively reflecting the immediate vicinity 100 of the wall (the projecting part of the frame 15) and the far field 101 (the ground F), would result from the capture of said region of interest comprising a relief.
- A device must also be adapted so that the image analysis method implemented by the image acquisition and processing system for the interactive facade includes a step seeking to distinguish, from a reference frame, a set of pixels, each having at least one side in common with one or more others, of substantially identical light intensities, reflecting a pointer crossing the region of interest from the far field to the wall of the facade.
- A drop shadow is systematically "broken" by the presence of the relief. It does not translate into a set of pixels, each having at least one side in common with one or more others, spanning all the rows of a digital representation delivered by the capture means. The same applies to a user's feet: since these do not cover the projecting frame, their capture does not result in a set of pixels, each having at least one side in common with one or more others, spanning all the rows of a digital representation delivered by the capture means. Drop shadows and/or any pointer that does not completely cross the region of interest from the far field to the immediate vicinity of the wall of the facade are therefore ignored by, or remain invisible to, the image acquisition and processing system for interactive facades of a device according to the invention.
- FIG. 9b illustrates a processing of images similar to that commented previously in connection with FIG. 9a.
- Figure 11 shows the main steps of said method.
- a first image analysis method comprises a first step for receiving 201 from capture means 12 a reference numerical representation Ir of the captured region of interest in the absence of any pointer in the vicinity of the wall of the facade.
- the image Ir thus delivered comprises, by way of example, four rows L1 to L4 of 16 pixels.
- the image resulting from the capture made by the means 12 may therefore comprise a substantially larger number of rows with respect to an image captured according to the state of the art.
- The image Ir, describing a mixed reference frame (projecting means 15 and a portion of the ground F) relative to the region of interest comprising a relief, results in two bands such as those previously described: rows L1 and L2 describing the immediate vicinity of the wall (and therefore the projecting means 15), and rows L3 and L4 describing the far field (thus the ground F).
- the analysis method comprises a step for receiving 202 from the capture means a second digital representation I of the region of interest.
- the image I reproduces a scene similar to that described with reference to FIG. 8.
- The capture of a pointer P is reflected in all the pixels of columns C12 and C13.
- A method implemented by the processing means of the image acquisition and processing system according to the invention also comprises a step 203 for developing a third digital representation I' by subtracting the reference representation Ir from said second representation I.
- each pixel of this representation I ' can be expressed - by way of example - according to a discrete scale of 256 values (0 - 255).
- a value of 0 characterizes the darkest pixels and the brightest 255 pixels.
- The pixels of the image I' reflecting the reference frame have a light intensity close to 0; those reflecting P or P' have a luminous intensity higher than a predetermined threshold δ, so that they can be distinguished from measurement noise.
- The luminous intensity of the pixels could alternatively be expressed on a signed scale of values between -128 and 127.
- The luminous intensity of the pixels reflecting the reference frame would then be close to 0.
- A luminous pointer would then be reflected by pixels of light intensities substantially greater than the threshold δ, and its drop shadow by pixels whose light intensities are substantially less than the negative of said threshold.
- A method according to the first embodiment of the invention further comprises a step for determining, iteratively, column by column - for example in absolute value - the lowest luminous intensity among those characterizing the pixels constituting a column.
- The method determines that this minimum intensity is substantially equal to 0 for columns C1 to C16, with the exception of columns C12 and C13, for which it is greater than the threshold δ. This difference reveals the presence of the pointer P.
- Such a method therefore comprises a step 205 for detecting a pointer as soon as the minimum luminous intensity of a column of pixels of the representation I' is greater than or equal to the predetermined threshold δ; a sketch of this column test follows.
- The shadow P' (affecting columns C3, C4, C6 and C7) is ignored.
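An illustrative sketch of this first embodiment, assuming the difference representation I' is held as a NumPy array and the threshold δ is passed as `delta` (names and data layout are assumptions):

```python
import numpy as np

def detect_crossing_columns(I_prime, delta):
    """First embodiment, sketched: keep only the columns whose minimum
    absolute intensity reaches the threshold, i.e. columns covered on
    every row, from the near band to the far band.  A drop shadow,
    confined to some rows only, never satisfies the test."""
    col_min = np.abs(I_prime.astype(np.int16)).min(axis=0)
    return np.flatnonzero(col_min >= delta).tolist()
```

On the representation of FIG. 9b, such a test would return only columns C12 and C13; the shadow columns C3, C4, C6 and C7 fail it because the shadow does not reach every row.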
- Figure 10 depicts a digital representation I'.
- P1 is a pointer relatively oblique to the plane of the wall of the facade.
- P3 is a pointer substantially orthogonal to said plane.
- P2 is a pointer that does not cross the region of interest. It may result from dirt or an insect positioned on the wall of the facade after the reference has been captured.
- the invention provides a second embodiment of an analysis method for searching for the presence of a set of pixels having at least one common side with one or more others covering all the rows of a digital representation delivered by the capture means.
- Such a method, implemented by the image acquisition and processing system for the interactive facade of a device, comprises first, second and third steps 201, 202 and 203, similar to those of the previous embodiment, to develop a representation I' as described in connection with FIG. 9b.
- A method according to the second embodiment comprises a step for determining - for example in absolute value - the luminous intensity of the pixels of said representation I'. It further comprises a step 204, implemented for any pixel of a border row of the representation I' whose luminous intensity is greater than or equal to the predetermined threshold δ, to identify at least one pixel of substantially identical luminous intensity in each of the other rows, said identified pixels having at least one side in common with one or more others.
- All the pixels having at least one side in common with one or more others and with light intensities substantially identical to the pixel of row L1 and column C12 belong to columns C12 and C13, whatever the row considered.
- All the pixels having at least one side in common with one or more others and with light intensities substantially identical to the pixel of row L1 and column C6 are confined to columns C6 and C7 and rows L1 and L2.
- Such a method comprises a step 205 for detecting a pointer if each row of pixels of the representation I' comprises at least one pixel of such a set of pixels having at least one side in common with one or more others and whose respective light intensities are substantially identical; a sketch follows.
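One possible reading of this second embodiment as code, assuming 4-connectivity, a border row at index 0 (the band nearest the wall, as in FIG. 9b) and a tolerance `tol` standing in for "substantially identical" intensities; all of these are assumptions made for the sake of the sketch.

```python
from collections import deque

def detects_crossing_pointer(I_prime, delta, tol=10):
    """Second embodiment, sketched: from each sufficiently bright pixel of
    the border row, grow the set of 4-connected pixels of substantially
    identical intensity and report a pointer as soon as one such set
    reaches every row of the representation I_prime (a list of rows)."""
    rows, cols = len(I_prime), len(I_prime[0])
    for c0 in range(cols):
        seed = abs(I_prime[0][c0])
        if seed < delta:
            continue  # border-row pixel too dark to start a search
        seen = {(0, c0)}
        queue = deque([(0, c0)])
        while queue:
            r, c = queue.popleft()
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if (0 <= nr < rows and 0 <= nc < cols
                        and (nr, nc) not in seen
                        and abs(abs(I_prime[nr][nc]) - seed) <= tol):
                    seen.add((nr, nc))
                    queue.append((nr, nc))
        if {r for r, _ in seen} == set(range(rows)):
            return True  # the grown set spans every row: crossing pointer
    return False
```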
- The invention is not limited to these exemplary embodiments of the image analysis method. Other methods could alternatively be implemented; it is sufficient for them to search for the existence of a set of pixels having at least one side in common with one or more others, whose luminous intensities are substantially identical, spanning all the rows of a captured image of the region of interest comprising a relief.
- A third example could consist in the iterative application of a mask to each pixel of the representation I', assigning to said pixel the lowest luminous intensity among those of neighboring pixels: for example the pixel of an adjacent column within the same row and the pixel of the same column but in an adjacent row.
- Such a method detects a crossing pointer only if, after this filtering, a set of pixels, each having at least one side in common with one or more others and whose light intensities are substantially identical, remains present in all rows; a sketch of one pass of such a mask follows.
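A sketch of a single pass of such a mask, under the assumption that the neighbourhood is exactly the one named above (the pixel itself, the adjacent column in the same row, the same column in the adjacent row); the number of iterations and the final all-rows check are left as in the previous embodiments.

```python
def min_mask_pass(I_prime):
    """One iteration of the mask: each pixel takes the lowest absolute
    intensity found among itself, its neighbour in the adjacent column of
    the same row and its neighbour in the adjacent row of the same column.
    I_prime is a list of rows of signed intensities."""
    rows, cols = len(I_prime), len(I_prime[0])
    out = [[0] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            values = [abs(I_prime[r][c])]
            if c + 1 < cols:
                values.append(abs(I_prime[r][c + 1]))   # adjacent column
            if r + 1 < rows:
                values.append(abs(I_prime[r + 1][c]))   # adjacent row
            out[r][c] = min(values)
    return out
```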
- Provision could also be made to classify the different pixels according to their assigned light intensity, so as to possibly characterize different pointers.
- Different pointers can also be distinguished according to the number of sets of disjoint pixels identified.
- The invention provides that the two-dimensional coordinates u and v of a detected pointer are determined by the coordinates, within the representation I', of the pixel situated at the center of the base of an identified set, said pixel belonging to the row describing the part of the region of interest closest to the outer wall of the facade; a sketch follows.
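A sketch of this coordinate extraction, assuming the identified set is available as (row, column) pairs and that row 0 is the row nearest the outer wall, as in FIG. 9b; both conventions are assumptions.

```python
def pointer_uv(pixel_set):
    """Return (u, v) for a detected pointer: the pixel at the centre of the
    base of the identified set, taken in the row nearest the outer wall
    (row 0 here).  pixel_set is an iterable of (row, col) pairs."""
    pixels = list(pixel_set)
    base_row = min(r for r, _ in pixels)
    base_cols = sorted(c for r, c in pixels if r == base_row)
    u = base_cols[len(base_cols) // 2]   # column at the centre of the base
    v = base_row                         # row nearest the wall
    return (u, v)
```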
- When the facade comprises a plurality of sensors (at least two), like the capture means 12a to 12c of the facade 10 described in connection with FIG. 1, an image acquisition and processing system according to the invention can implement a triangulation function to determine the spatial position 15z in three dimensions (x, y and z) of any pointer detected near the wall of the facade. All that is required is that said pointer be detected by at least two matrix sensors.
- the invention has been described primarily to combat the presence of drop shadows that can be detected as so many false pointers applied against an interactive facade.
- Any ray of light emitted from a source and inadvertently illuminating an interactive facade would also be ignored - like a shadow - by a device according to the invention. Indeed, such a ray would be "broken" by the projecting means embodying a relief within the region of interest.
- Such a captured ray would be reflected - like a shadow P' - by at least two disjoint sets of pixels, corresponding to at least two pointers not entirely crossing the region of interest, and would therefore be ignored.
- a device according to the invention is therefore also robust to direct radiation applied to its interactive facade.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Multimedia (AREA)
- Image Processing (AREA)
- Image Analysis (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
FR1152279A FR2972820B1 (fr) | 2011-03-18 | 2011-03-18 | Dispositif interactif robuste aux ombres portees |
PCT/FR2012/050565 WO2012127161A2 (fr) | 2011-03-18 | 2012-03-16 | Dispositif interactif robuste aux ombres portées |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2686760A2 true EP2686760A2 (fr) | 2014-01-22 |
Family
ID=45974426
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP12714805.4A Withdrawn EP2686760A2 (fr) | 2011-03-18 | 2012-03-16 | Dispositif interactif robuste aux ombres portées |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2686760A2 (fr) |
FR (1) | FR2972820B1 (fr) |
WO (1) | WO2012127161A2 (fr) |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3968477B2 (ja) * | 1997-07-07 | 2007-08-29 | ソニー株式会社 | 情報入力装置及び情報入力方法 |
JP4707034B2 (ja) * | 2006-07-07 | 2011-06-22 | 株式会社ソニー・コンピュータエンタテインメント | 画像処理方法、入力インタフェース装置 |
US8487881B2 (en) * | 2007-10-17 | 2013-07-16 | Smart Technologies Ulc | Interactive input system, controller therefor and method of controlling an appliance |
-
2011
- 2011-03-18 FR FR1152279A patent/FR2972820B1/fr not_active Expired - Fee Related
-
2012
- 2012-03-16 EP EP12714805.4A patent/EP2686760A2/fr not_active Withdrawn
- 2012-03-16 WO PCT/FR2012/050565 patent/WO2012127161A2/fr active Application Filing
Non-Patent Citations (2)
Title |
---|
None * |
See also references of WO2012127161A2 * |
Also Published As
Publication number | Publication date |
---|---|
WO2012127161A3 (fr) | 2014-09-18 |
FR2972820B1 (fr) | 2013-04-19 |
WO2012127161A2 (fr) | 2012-09-27 |
FR2972820A1 (fr) | 2012-09-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2225608B1 (fr) | Dispositif d'evaluation de la surface d'un pneumatique | |
WO2021105265A1 (fr) | Mesure de profondeur à l'aide d'un dispositif d'affichage | |
Ramesh et al. | 5d time-light transport matrix: What can we reason about scene properties? | |
US20230403906A1 (en) | Depth measurement through display | |
FR2939920A1 (fr) | Capteur matriciel | |
FR2932573A1 (fr) | Dispositif d'imagerie gamma ameliore permettant la localisation precise de sources irradiantes dans l'espace | |
EP3891658A1 (fr) | Activité de surveillance à l'aide d'une caméra multispectrale et de profondeur | |
CA3066478A1 (fr) | Marque de securite et procede de validation de l'authenticite d'une marque de securite | |
EP2946228B1 (fr) | Procede et systeme pour fournir a un dispositif mobile des informations sur sa position par rapport a une cible, robot integrant un tel systeme et tablette | |
EP2307948A2 (fr) | Dispositif interactif et procédé d'utilisation | |
GB2473239A (en) | Touch screen displays which can discriminate between near field objects and touching objects | |
WO2015033036A1 (fr) | Equipements de véhicule automobile intégrant un dispositif de mesure de distance d'objets | |
EP3388976A1 (fr) | Procede de detection de fraude | |
EP2686760A2 (fr) | Dispositif interactif robuste aux ombres portées | |
EP2668556A2 (fr) | Dispositif à commandes tactile et gestuelle et procédé d'interprétation de la gestuelle associé | |
US11893100B2 (en) | Spoof detection based on specular and diffuse reflections | |
FR2972544A1 (fr) | Systeme d'acquisition et de traitement d'images robuste pour facade interactive, facade et dispositif interactifs associes | |
US20220307981A1 (en) | Method and device for detecting a fluid by a computer vision application | |
EP3295364A1 (fr) | Système et procédé de détection optique d'intrusion, dispositif électronique, programme et support d'enregistrement correspondants | |
CN102141859A (zh) | 光学式触控显示装置及其方法 | |
JP2023531733A (ja) | バイオメトリック光学センサモジュールを備える電子デバイス | |
WO2021130204A1 (fr) | Dispositif de determination d'une face d'un de reposant sur une surface laissant passer un signal optique | |
BE1023596B1 (fr) | Système interactif basé sur des gestes multimodaux et procédé utilisant un seul système de détection | |
FR3141788A1 (fr) | Système de surveillance volumétrique d’un espace et programme d’ordinateur correspondant. | |
WO2011117505A2 (fr) | Dispositif interactif adaptable aux conditions de luminosite ambiante |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
AK | Designated contracting states |
Kind code of ref document: A2 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
AX | Request for extension of the european patent |
Extension state: BA ME |
|
DAX | Request for extension of the european patent (deleted) | ||
R17D | Deferred search report published (corrected) |
Effective date: 20140918 |
|
17P | Request for examination filed |
Effective date: 20150318 |
|
RBV | Designated contracting states (corrected) |
Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
17Q | First examination report despatched |
Effective date: 20160422 |
|
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20191220 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN |
|
18D | Application deemed to be withdrawn |
Effective date: 20200603 |