WO2013191888A1 - Device allowing tool-free interactivity with a projected image - Google Patents

Device allowing tool-free interactivity with a projected image

Info

Publication number
WO2013191888A1
Authority
WO
WIPO (PCT)
Prior art keywords
target area
height
series
projected
work surface
Prior art date 2012-06-20
Application number
PCT/US2013/043971
Other languages
English (en)
Inventor
Ronald D. Jesme
Nathaniel J. SIGRIST
Craig R. Schardt
Original Assignee
3M Innovative Properties Company
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date 2013-06-04
Publication date 2013-12-27
Application filed by 3M Innovative Properties Company filed Critical 3M Innovative Properties Company
Priority to US14/407,025 (published as US20150160741A1)
Publication of WO2013191888A1

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F 3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304 Detection arrangements using opto-electronic means
    • G06F 3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F 3/0425 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F 3/0426 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected, tracking fingers with respect to a virtual keyboard projected or printed on the surface
    • G06F 2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F 2203/041 Indexing scheme relating to G06F3/041 - G06F3/045
    • G06F 2203/04101 2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup

Definitions

  • the system may also include a second infrared beam projected from the infrared light emitting source and a second target area towards which the second infrared beam is projected.
  • the system may also include a third infrared beam projected from the infrared light emitting source and a third target area towards which the third infrared beam is projected.
  • the system may include a camera lens that collects a portion of the infrared light from the first target area and focuses the light on the multi-functional sensor device.
  • the system may also include an emitter lens positioned between the infrared light emitting source and first target area, the emitter lens focusing the infrared light onto or near the first target.
  • the system may compute a series of heights associated with the touch gesture and determine that an intended press occurred if the following conditions are met: 1) a first height in the series of heights is above a first reference height, and 2) a second height in the series of heights occurring after the first height in the series is below a second reference height.
  • the system may further determine that an intended click occurred if a further condition is also met: 3) a third height in the series of heights occurring after the second height in the series is above the first reference height.
  • the difference between the first reference height and the second reference height may be between about 0.5 cm and about 2 cm.
  • the system may compute a series of heights associated with the gesture and determine that an intended lift occurred if the following conditions are met: 1) a first height in the series of heights is below the second reference height, and 2) a second height in the series of heights occurring after the first height in the series is above the first reference height. (A minimal sketch of this press/click/lift logic follows below.)
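Taken together, the press, click, and lift conditions above form a small hysteresis state machine over the series of measured heights. The following Python sketch is a minimal illustration of that logic only; the numeric reference heights, the function name, and the list-based height stream are assumptions for illustration and are not specified by the patent.

```python
# Minimal sketch of the press/click/lift conditions described above.
# FIRST_REF (upper reference, cf. element 440) and SECOND_REF (lower
# reference, cf. element 450) are assumed values; the text only fixes
# their ordering and suggests a separation of roughly 0.5 cm to 2 cm.

FIRST_REF = 3.0   # cm above the work surface (assumed)
SECOND_REF = 2.0  # cm above the work surface (assumed)

def detect_events(heights):
    """Yield 'press' and 'lift' events from a series of measured heights.

    press: a height above FIRST_REF followed later by one below SECOND_REF.
    lift:  a height below SECOND_REF followed later by one above FIRST_REF.
    A press immediately followed by a lift constitutes a click.
    """
    was_above = False  # last reference crossed was the upper one
    was_below = False  # last reference crossed was the lower one
    for h in heights:
        if h > FIRST_REF:
            if was_below:
                yield "lift"
                was_below = False
            was_above = True
        elif h < SECOND_REF:
            if was_above:
                yield "press"
                was_above = False
            was_below = True

# A finger descends onto a target and withdraws: press then lift (a click).
print(list(detect_events([5.0, 4.0, 1.5, 1.0, 4.5])))  # ['press', 'lift']
# A finger slid onto the target below SECOND_REF, then lifted away: lift only.
print(list(detect_events([1.2, 1.0, 4.5])))            # ['lift']
```

Note how heights falling between the two references change no state: that gap is exactly the hysteresis band whose width the text bounds at roughly 0.5 cm to 2 cm.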
  • Figure 1 is a diagram of a system according to the present description.
  • Figure 2 is a diagram of a system according to the present description.
  • Figures 3a-3e illustrate sub-areas of target and non-target areas, as well as centroids of affected sub-areas.
  • Figures 4a-4c are diagrams illustrating how a touch is registered according to height.
  • Figure 5 is a top-down view of a work surface with a projected image and target areas.
  • Figure 6 is an oblique view of a work surface with a projected image and target areas.
  • what is desired is a low-cost, tool-free interactive solution that works on a wide range of surfaces, requires little computational power, consumes little power, is small in size, and does not need to be mounted on or near the interactive surface.
  • One application that could benefit from such a solution is an interactive projector that could be used in the kitchen, where the image is projected onto the countertop, and the user interacts with the content with bare hands. The user's hands and the countertop can be readily cleaned and sanitized without the need to clean a mouse, keyboard or other interactive tool.
  • the projector and interactivity-enabling components can be mounted under a cabinet above the countertop in a location that does not obstruct working surfaces.
  • Typical interactivity solutions that might meet some of the performance criteria are too large, too expensive, and consume too much power to be practical.
  • general sensing systems typically employ a complex structured lighting pattern (in some cases modulated over time, with structures dynamically varied to best resolve the sensed scene), one or more high-resolution cameras that include a high-resolution sensing array and a multi-optic lens, and significant computational power in a significantly sized package (such as that of a PC) located some distance from the camera to keep it displaced from the sensing area.
  • This computational power is interconnected with a digitizer connecting the camera to the computer.
  • the "structured lighting" employed by such general sensing systems often utilize a high number of identical illumination dots or strips, which alone are not distinguishable from one another. Thus, additional computational power is needed to resolve the ambiguity introduced by their identical nature.
  • the present description utilizes physical spatial configuration to avoid such ambiguity even if identical illumination shapes are used.
  • the present invention could also utilize illumination patches of distinct size and shape such that the system can easily distinguish them based on simple geometry.
  • one illumination patch could be a square and a second illumination patch could be a circle.
  • one illumination patch could be a solid circle, while the second illumination patch could be a circular ring of light with a void in the middle (a ring or doughnut shape).
  • the illumination patches could be shaped as any appropriate shape either by the IR light source from which the light is emitted, or by optics manipulating the illuminated light before the emitted light reaches the target area.
  • illumination patches could be distinguished by size.
  • System 100 includes an infrared light emitting source 102 that projects an infrared beam 104 towards a first target area 106.
  • the infrared light emitting source may be a light emitting diode (LED), laser or incandescent filament.
  • the system further includes a monolithic multi-functional sensor device 108.
  • the multi-functional sensor device includes an image capture function and an image processing function.
  • the infrared light emitting source 102 and multifunctional sensor device 108 are configured such that when a user provides a touch gesture near the first target area 106, the existence and position of the touch gesture is detected by the multi-functional sensor device and processed.
  • the multi-functional sensor device may, at least in part, be made up of a semiconductor chip (rather than, e.g., a CPU).
  • the image capture function may be an infrared camera.
  • the infrared camera may use an optical filter to block the projected image from a sensing array of the device.
  • the multi-functional sensor device may be a monolithic (meaning on a single crystal) semiconductor device.
  • a monolithic device includes both a sensing array to capture an image, and image processing electronics.
  • the output of this device may include the Cartesian coordinates of the centroid of bright spots captured by the sensing array.
  • the calculation of centroids results in coordinates that have sub-pixel resolution, thus adequate resolution can be obtained from a relatively low resolution sensing array.
  • This relatively low resolution of the sensing array thus requires relatively little image processing because there are relatively few image pixels to process.
  • This relatively small imaging array and relatively limited image processing requirement improves the viability for both of these functions to be realized in a monolithic silicon crystal. Solutions that implement separate sensing arrays (e.g. camera) and image processing circuits need to convey all of the pixel information from the camera to the image processing circuit, adversely affecting power, size and cost.
  • the system may also incorporate a simple microcontroller where the centroid data is subsequently evaluated to identify a touch.
  • System 100 may also include a first work surface 110 that is positioned such that the first target area 106 is located upon or near it.
  • the first work surface 110 may in at least one embodiment be a countertop.
  • work surface 110 may be absorbent of infrared light, such as that emitted from light source 102.
  • work surface 110 may scatter and/or reflect infrared light. It may be preferable for the work surface (or potentially a mat placed on the work surface) to provide improved brightness and contrast of a projected image (as noted below in system 200). The surface could also be used to improve the brightness and contrast of the IR spots.
  • System 100 may include a camera lens 112 that collects a portion of the infrared light from the first target area 106 and focuses the light on the multi-functional sensor device 108.
  • the system may further include an emitter lens 114 that is positioned between the infrared light emitting source 102 and first target area 106.
  • the emitter lens 114 serves to focus, direct, or shape the infrared light 104 onto or near the first target area 106.
  • an emitter aperture 114 may be used rather than an emitter lens, to direct light onto the target area.
  • System 200 in Fig. 2 illustrates further potential embodiments of a system according to the present description.
  • System 200 is illustrated rotated 90 degrees from the views of Figs. 1 and 4a-4c, such that the multifunctional sensor device 108 and camera lens 112 are positioned on the opposite side of the light emitting source 102 from the viewer.
  • System 200 also includes light emitting source 102, infrared beam 104, first target area 106, and multi-functional sensor device 108.
  • system 200 includes work surface 110, camera lens 112 that collects a portion of the infrared light from the first target area and focuses it on multi-functional sensor device, and emitter lens 114 that focuses infrared light onto or near the first target area 106.
  • system 200 also includes a projection device 116.
  • the projector can comprise a spatial light modulator (e.g., an LCOS panel or an array of micromirrors, such as Texas Instruments DLP), an LED, laser, or incandescent illumination source, and a projection lens.
  • the projection device may also be of the beam-scanning type, such as the laser beam scanning projectors from MicroVision, Inc. (Redmond, WA).
  • the projection device 116 may project an image onto the first work surface 110, where the projected image has a width 130. In this system, processing the existence and position of the touch gesture by the multifunctional sensor device 108 may include altering the projected image.
  • the infrared light emitting source 102 may emit a second infrared beam 118 that is projected from the infrared light emitting source towards a second target area 120.
  • Second target area 120 may be positioned proximate to first target area 106, such that it is also upon or near the first work surface 110.
  • the infrared light emitting source may additionally emit a third infrared beam 122. This infrared beam may be projected towards a third target area 124 that is also located near the first work surface 110 and first and second target areas.
  • Working surface 110 need not be made up solely of target areas.
  • Working surface 110 may also include areas that are non-target areas, i.e., areas onto which no infrared beam is projected. In the case where an IR absorbing work surface is used, a bright spot will not be detected because the IR beam 104 projected onto the surface 110 will be absorbed. However, when the IR beam 104 is intercepted by a finger (or most other objects), the beam will be partially scattered and some of this scattered light will be collected by lens 112 and focused onto the sensing array 108. The height at which the finger intercepts the beam defines the location where the scattered light will fall onto the array 108.
  • alternatively, a non-IR-absorbing, IR-scattering work surface 110 may be used.
  • the IR spot will be imaged on the sensing array 108 even when the IR beam 104 is not intercepted by a finger or other object.
  • the location of the IR spot focused onto the sensing array 108 is representative of the height of the work surface (a rough triangulation sketch of this height mapping follows below).
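Because the camera views the beam from a laterally offset position, the pixel at which the scattered spot lands encodes the height at which the beam was intercepted, by simple triangulation. The Python sketch below illustrates one such mapping; all geometry values and the function name are assumptions for illustration (the patent states the principle but gives no numeric parameters), and a practical system would more likely calibrate the pixel-to-height mapping empirically.

```python
# Rough triangulation sketch relating the spot's displacement on the
# sensing array to the height at which the beam was intercepted.
# Every numeric parameter below is an assumed example value.

BASELINE_CM = 10.0    # lateral separation of beam and camera (assumed)
FOCAL_PX = 500.0      # lens focal length expressed in pixels (assumed)
CAM_HEIGHT_CM = 50.0  # camera height above the work surface (assumed)

def height_from_offset(pixel_offset):
    """Estimate the intercept height from the spot's pixel displacement.

    pixel_offset is measured from where the spot images when the beam
    reaches the work surface (height zero). By similar triangles, a
    point at height h images with disparity FOCAL_PX * BASELINE_CM /
    (CAM_HEIGHT_CM - h); subtracting the disparity at h = 0 and solving
    for h gives the expression below.
    """
    k = FOCAL_PX * BASELINE_CM
    return CAM_HEIGHT_CM - k / (pixel_offset + k / CAM_HEIGHT_CM)

print(round(height_from_offset(0.0), 2))  # 0.0: spot at the work surface
print(round(height_from_offset(4.4), 2))  # 2.11: beam intercepted ~2 cm up
```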
  • Figures 3a - 3e illustrate a more detailed examination of how a touch gesture on one of the target areas is determined by the nature of the target areas.
  • the working surface of a device (as viewed from above) may be made up of a number of sub-areas 303.
  • Each sub-area 303 may be configured such that when a user provides a touch gesture on the work surface, the multi-functional sensor device determines which of the sub-areas were affected and, through an algorithm, computes a position of the touch gesture as the centroid 311 of the affected sub-areas 309 in order to determine whether to register a touch on a target or no touch.
  • the centroids 311 of the various spots can be found using the following equations:

    $A_T = \sum_{ij} L_{ij}$  (Equation 1)

    $X = \frac{\sum_{ij} i \, L_{ij}}{A_T}$  (Equation 2)

    $Y = \frac{\sum_{ij} j \, L_{ij}}{A_T}$  (Equation 3)

  • where $L_{ij}$ is the logic value of 1 or 0 assigned to each pixel based on a threshold, and $A_T$ is the number of affected sub-areas 309.
  • For the spot of Figure 3c, $A_T$ is equal to 9 and both $\sum_{ij} i \, L_{ij}$ and $\sum_{ij} j \, L_{ij}$ are equal to 54, so $(X, Y)$ is (54/9, 54/9) or (6, 6) according to the numbering on the X and Y axes of Figure 3c.
  • Figure 3d depicts a spot 313 that is oval; the spot may not be round due to optical aberration, system geometry, or other configuration effects.
  • Equations 1, 2 and 3 can also be used to find the centroid of the spot in Figure 3d: $A_T$ is 16, $\sum_{ij} i \, L_{ij}$ is 96, and $\sum_{ij} j \, L_{ij}$ is 104, thus $(X, Y)$ is (96/16, 104/16) or (6, 6.5). This demonstrates that the centroid can be found with sub-pixel resolution. (A direct implementation of these equations is sketched below.)
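A direct implementation of Equations 1-3 is only a few lines. The Python sketch below (using NumPy; the function name and example frame are illustrative, not from the patent) reproduces the Figure 3c example, where a 3x3 block of affected sub-areas centred at (6, 6) gives A_T = 9 and both coordinate sums equal to 54.

```python
import numpy as np

def centroid(frame, threshold):
    """Compute (X, Y) per Equations 1-3 from a raw sensor frame.

    Pixels are first reduced to logic values L_ij by thresholding; the
    centroid of the affected sub-areas then emerges with sub-pixel
    resolution even from a relatively low resolution sensing array.
    """
    L = (frame > threshold).astype(int)                # L_ij in {0, 1}
    i, j = np.meshgrid(np.arange(frame.shape[0]),
                       np.arange(frame.shape[1]), indexing="ij")
    A_T = L.sum()                                      # Equation 1
    return (i * L).sum() / A_T, (j * L).sum() / A_T    # Equations 2 and 3

# Example mirroring Figure 3c: a 3x3 spot whose affected sub-areas span
# indices 5..7 on both axes, so A_T = 9 and both sums are 54.
frame = np.zeros((12, 12))
frame[5:8, 5:8] = 1.0
print(centroid(frame, 0.5))  # (6.0, 6.0)
```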
  • Figures 4a-4c provide a more detailed illustration of how the system determines whether a touch at a target area has occurred.
  • the infrared light emitting source 402 emits light towards work surface 410.
  • the work surface or image surface may be understood as located at a height of zero, illustrated by element 460.
  • when the IR light is not interfered with by a user, some of it is directed towards and imaged (possibly via a lens 406) at a first location 462 on the multifunctional sensor device 408.
  • the multi-functional sensor device may include an array of sensors.
  • the system will then be able to determine that an intended touch has occurred at the target area. As noted earlier, this determination and processing may occur solely through the multi-functional sensor device, or through the multifunctional sensor device in conjunction with a simple microcontroller.
  • As the finger enters the picture, it will be measured above a first reference height 440. It will then move downward until it is located immediately below a second reference height 450.
  • height 460 is simply height zero, that is, the height of the work surface and projection surface.
  • Height 450 is chosen to be a bit above the maximum expected finger thickness of a user.
  • Height 440 is generally chosen to be approximately 1 cm above height 450. This distance may, in some embodiments, be chosen in a range from 0.5 cm to 2 cm. If height 450 is too low, the top surface of the finger will be too high to enable a touch event to occur. If height 440 is too high, then taller objects on the work surface 410 could trigger a transition through the height zone.
  • Keeping height 440 relatively close in height to height 450 aids in preventing items on the work surface from interfering with the touch system, by preventing tall objects from erroneously being interpreted as a touch transition through the height zone. Similarly, this algorithm prevents short or thin objects placed on the work surface or countertop from erroneously being interpreted as a touch.
  • the system may instead register simply a "lift" touch event, potentially when a finger is swiped or slid onto a target area and then lifted away from the surface.
  • the multi-functional sensor device may determine that an intended lift occurred if a first height in the series of heights is below a second reference height (i.e. 450), and a second height in the series of heights that occurs after the first height in the series of heights is above a first reference height (i.e. 440).
  • the array of sensors in the multi-functional device will not only extend in a first direction (e.g. the direction along which the height of a touch event may be considered), but also in a lateral direction, such that different target areas laterally spaced from one another may be sensed by the array.
  • the layout of one such plurality of target areas is illustrated in Figures 5 and 6.
  • Figure 5 provides a top-down view of a work surface that illustrates how a work surface could appear to a user.
  • work surface is illustrated by element 562.
  • an image 564 may be projected. This image could include, e.g., various recipes, steps of a recipe, decoration templates, an internet browser, or an electronic photo album. Within the projected image may be five (or any other number of) boxes 565, 566, 567, 568 and 569 that correspond to commands or prompts the user may wish to enter by touch gestures.
  • Corresponding to each box is a target area towards which an infrared beam of light is projected.
  • When a user touches one of these boxes, whether by a press, click, double-click or slide/lift, the touch is registered by the multifunctional sensor device, and the projected image 564 and potentially the target areas may change in response.
  • An oblique view of this is illustrated in Figure 6 with system 600.
  • the illumination patches sent from the IR light sources may be of different size and shapes and may in fact be distinguished from one another based on their respective sizes and shapes.
  • a change in shape of a given IR beam's illumination patch at the target area may indicate that a touch has occurred.
  • a sensor device may read that no touch is occurring when a circle of IR light is being reflected towards it (where, e.g., the work surface area is highly reflective), but indicate that a touch has occurred when a touch "blocks" an inner portion of the light, such that a ring shape, rather than a circle shape, is registered by the sensor device (a minimal sketch of this disc-versus-ring discrimination follows below).
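The following Python sketch illustrates one way such a disc-versus-ring test could look after thresholding the frame to logic values: a solid spot has its centroid pixel lit, while a ring (its centre blocked by a fingertip) does not. The function name and the decision rule are illustrative assumptions, not the patent's specified method.

```python
import numpy as np

def patch_is_ring(L):
    """Return True if the thresholded patch L_ij appears ring-shaped.

    L is a 2-D array of 0/1 logic values. For a solid disc the pixel at
    the patch centroid is lit; for a ring the centre is dark, which is
    read here as a touch 'blocking' the inner portion of the patch.
    """
    ii, jj = np.nonzero(L)
    if ii.size == 0:
        return None                # no patch imaged at all
    ci, cj = int(round(ii.mean())), int(round(jj.mean()))
    return L[ci, cj] == 0

disc = np.zeros((9, 9), dtype=int)
disc[3:6, 3:6] = 1                 # solid 3x3 patch
ring = disc.copy()
ring[4, 4] = 0                     # same patch with its centre blocked
print(patch_is_ring(disc), patch_is_ring(ring))  # False True
```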

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention relates to devices that allow interaction with an image, more specifically tool-free interaction with a projected image.
PCT/US2013/043971 2012-06-20 2013-06-04 Device allowing tool-free interactivity with a projected image WO2013191888A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/407,025 US20150160741A1 (en) 2012-06-20 2013-06-04 Device allowing tool-free interactivity with a projected image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261661898P 2012-06-20 2012-06-20
US61/661,898 2012-06-20

Publications (1)

Publication Number Publication Date
WO2013191888A1 (fr) 2013-12-27

Family

ID=48670797

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/043971 WO2013191888A1 (fr) 2012-06-20 2013-06-04 Device allowing tool-free interactivity with a projected image

Country Status (2)

Country Link
US (1) US20150160741A1 (fr)
WO (1) WO2013191888A1 (fr)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9411432B2 (en) * 2013-04-18 2016-08-09 Fuji Xerox Co., Ltd. Systems and methods for enabling gesture control based on detection of occlusion patterns
WO2015047223A1 (fr) * 2013-09-24 2015-04-02 Hewlett-Packard Development Company, L.P. Identifying a target touch region of a touch surface based on an image
JP6665415B2 (ja) * 2015-03-30 2020-03-13 Seiko Epson Corporation Projector and projector control method


Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5439347B2 (ja) * 2010-12-06 2014-03-12 Hitachi Consumer Electronics Co., Ltd. Operation control device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2003054683A2 (fr) * 2001-12-07 2003-07-03 Canesta Inc. Improved light interface for electronic devices
WO2008011361A2 (fr) * 2006-07-20 2008-01-24 Candledragon, Inc. Interfacing with a user
GB2466497A * 2008-12-24 2010-06-30 Light Blue Optics Ltd A touch sensitive holographic image display device for holographically projecting a touch sensitive displayed image at an acute angle onto a surface
EP2400367A2 (fr) * 2010-06-28 2011-12-28 Pantech Co., Ltd. Apparatus for processing an interactive three-dimensional object

Also Published As

Publication number Publication date
US20150160741A1 (en) 2015-06-11

Similar Documents

Publication Publication Date Title
US7534988B2 (en) Method and system for optical tracking of a pointing object
US10275096B2 (en) Apparatus for contactlessly detecting indicated position on reproduced image
US8907894B2 (en) Touchless pointing device
US8922526B2 (en) Touch detection apparatus and touch point detection method
US7782296B2 (en) Optical tracker for tracking surface-independent movements
US20100201637A1 (en) Touch screen display system
US20110102319A1 (en) Hybrid pointing device
KR101109834B1 (ko) Sensing module and optical sensing system having the same
EP2302491A2 (fr) Optical touch system and method
US20070103436A1 (en) Optical tracker with tilt angle detection
WO2012070950A1 (fr) Camera-based multi-touch interaction and illumination system and method
JP6721875B2 (ja) Non-contact input device
US20130038577A1 (en) Optical touch device and coordinate detection method thereof
US20150253934A1 (en) Object detection method and calibration apparatus of optical touch system
WO2013035553A1 (fr) User interface display device
WO2017170027A1 (fr) Image recognition apparatus, image recognition method, and image recognition unit
US9639209B2 (en) Optical touch system and touch display system
US20150160741A1 (en) Device allowing tool-free interactivity with a projected image
JP2006202291A (ja) Optical slide pad
US9886105B2 (en) Touch sensing systems
US9007346B2 (en) Handwriting system and sensing method thereof
WO2008130145A1 (fr) Touch screen apparatus and method using a laser and optical fibers
US8912482B2 (en) Position determining device and method for objects on a touch device having a stripped L-shaped reflecting mirror and a stripped retroreflector
US9519380B2 (en) Handwriting systems and operation methods thereof
KR20120057146A (ko) Optical touch input device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13730716

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13730716

Country of ref document: EP

Kind code of ref document: A1