WO2015030795A1 - Touch input association

Touch input association

Info

Publication number
WO2015030795A1
Authority
WO
WIPO (PCT)
Prior art keywords
touch sensitive
touch
horizontal
mat
touch input
Prior art date
2013-08-30
Application number
PCT/US2013/057549
Other languages
French (fr)
Inventor
Bradley Suggs
Original Assignee
Hewlett Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2013-08-30
Publication date
2015-03-05
Application filed by Hewlett Packard Development Company, L.P.
Priority to PCT/US2013/057549
Publication of WO2015030795A1

Classifications

    • G06F3/04883 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for entering handwritten data, e.g. gestures, text
    • G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/038 — Control and interface arrangements for pointing devices, e.g. drivers or device-embedded control circuitry
    • G06F3/0421 — Digitisers characterised by opto-electronic transducing means, by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G06F3/0425 — Digitisers characterised by opto-electronic transducing means using a single imaging device (e.g. a video camera) for tracking the absolute position of objects with respect to an imaged reference surface
    • G06F3/0487 — Interaction techniques based on GUIs using specific features provided by the input device
    • G06F3/0488 — Interaction techniques based on GUIs using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F1/1639 — Constructional details of portable computers related to the display arrangement, the display being based on projection
    • G06F1/1647 — Constructional details of portable computers related to the display arrangement, including at least an additional display
    • G06F2203/04803 — Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F3/1423 — Digital output to display device; controlling a plurality of local displays, e.g. CRT and flat panel display

Abstract

In one example in accordance with the present disclosure, a method conducted by a projective computing system is provided. The method includes displaying a first interface on a vertical display, projecting a second interface on a horizontal touch sensitive mat, receiving a touch input modification request, and changing the touch input association from the horizontal touch sensitive mat to the vertical display such that a touch input on the horizontal touch sensitive mat controls the first interface displayed on the vertical display.

Description

TOUCH INPUT ASSOCIATION

BACKGROUND

[0001] Computer systems typically employ a display or multiple displays which are mounted on a support stand and/or are incorporated into some other component of the computer system. For displays employing touch sensitive technology (e.g., touch screens), it is often desirable for a user to interact directly with such displays in order to fully utilize such touch technology during system operations. However, optimum ergonomic placement of a display for simply viewing an image thereon is often at odds with such placement for engaging in touch interaction therewith. Thus, users desiring to use a single computer system for both traditional viewing applications as well as touch interactive applications often encounter difficulties in positioning and/or utilizing such systems.

BRIEF DESCRIPTION OF THE DRAWINGS

[0002] For a detailed description of various examples, reference will now be made to the accompanying drawings in which:

[0003] Figure 1 is a schematic perspective view of an example of a projective computer system in accordance with the principles disclosed herein;

[0004] Figure 2 is another schematic perspective view of the computer system of Figure 1 in accordance with the principles disclosed herein;

[0005] Figure 3 is a schematic side view of the computer system of Figure 1 in accordance with the principles disclosed herein;

[0006] Figure 4 is a schematic front view of the computer system of Figure 1 in accordance with the principles disclosed herein;

[0007] Figure 5 is a schematic side view of the computer system of Figure 1 during operation in accordance with the principles disclosed herein;

[0008] Figure 6 is a schematic front view of the system of Figure 1 during operation in accordance with the principles disclosed herein;

[0009] Figure 7 is a black box circuit diagram of the computer system of Figure 1 in accordance with the principles disclosed herein;

[00010] Figure 8 is a process flow diagram of processes conducted by the system of Figure 1 in accordance with the principles disclosed herein;

[00011] Figure 9 is a schematic view showing a marker provided on the vertical display of the computer system of Figure 1 when an input device is touching or proximate to the horizontal touch sensitive mat in accordance with the principles disclosed herein; and

[00012] Figure 10 is a schematic view showing a clone on the vertical display projected onto the horizontal touch sensitive mat of the computer system of Figure 1 in accordance with the principles disclosed herein.

NOTATION AND NOMENCLATURE

[00013] Certain terms are used throughout the following description and claims to refer to particular system components. As one skilled in the art will appreciate, computer companies may refer to a component by different names. This document does not intend to distinguish between components that differ in name but not function. In the following discussion and in the claims, the terms "including" and "comprising" are used in an open-ended fashion, and thus should be interpreted to mean "including, but not limited to...." Also, the term "couple" or "couples" is intended to mean either an indirect or direct connection. Thus, if a first device couples to a second device, that connection may be through a direct electrical or mechanical connection, through an indirect electrical or mechanical connection via other devices and connections, through an optical electrical connection, or through a wireless electrical connection. As used herein, the term "approximately" means plus or minus 10%. In addition, as used herein, the phrase "user input device" refers to any suitable device for providing an input, by a user, into an electrical system such as, for example, a mouse, keyboard, a hand (or any finger thereof), a stylus, a pointing device, etc. Furthermore, the term "vertical" is intended to mean upright and approximately perpendicular to the plane of the horizon. In addition, the term "horizontal" is intended to mean approximately parallel to the plane of the horizon.

DETAILED DESCRIPTION

[00014] The following discussion is directed to various examples of the disclosure. Although one or more of these examples may be preferred, the examples disclosed should not be interpreted, or otherwise used, as limiting the scope of the disclosure, including the claims. In addition, one skilled in the art will understand that the following description has broad application, and the discussion of any example is meant only to be descriptive of that example, and not intended to intimate that the scope of the disclosure, including the claims, is limited to that example.

[00015] Referring now to Figures 1-4, a projective computing system 100 in accordance with the principles disclosed herein is shown. In this example, system 100 generally comprises a support structure 110, a computing device 150, a projector unit 180, and a touch sensitive mat 200. Computing device 150 may comprise any suitable computing device while still complying with the principles disclosed herein. For example, in some implementations, device 150 may comprise an electronic display, a smartphone, a tablet, an all-in-one computer (i.e., a display that also houses the computer's board), or some combination thereof. In this example, device 150 is an all-in-one computer that includes a central axis or center line 155, a first or top side 150a, a second or bottom side 150b axially opposite the top side 150a, a front side 150c extending axially between the sides 150a, 150b, and a rear side also extending axially between the sides 150a, 150b and generally radially opposite the front side 150c. A display 152 defines a viewing surface and is disposed along the front side 150c to project images for viewing and interaction by a user (not shown). In some examples, display 152 is not touch sensitive. In other examples, display 152 includes touch sensitive technology such as, for example, resistive, capacitive, acoustic wave, infrared (IR), strain gauge, optical, acoustic pulse recognition, or some combination thereof. Therefore, throughout the following description, display 152 may periodically be referred to as a touch sensitive or non-touch sensitive surface or display. In addition, in some examples, device 150 further includes a camera 154 that is to take images of a user while he or she is positioned in front of display 152. In some implementations, camera 154 is a web camera. Further, in some examples, device 150 also includes a microphone or similar device that is arranged to receive sound inputs (e.g., voice) from a user during operation.

[00016] Referring still to Figures 1-4, support structure 110 includes a base 120, an upright member 140, and a top 160. Base 120 includes a first or front end 120a, and a second or rear end 120b. During operation, base 120 engages with a support surface 15 to support the weight of at least a portion of the components (e.g., member 140, unit 180, device 150, top 160, etc.) of system 100 during operation. In this example, front end 120a of base 120 includes a raised portion 122 that is slightly separated above the support surface 15, thereby creating a space or clearance between portion 122 and surface 15. As will be explained in more detail below, during operation of system 100, one side of mat 200 is received within the space formed between portion 122 and surface 15 to ensure proper alignment of mat 200. However, it should be appreciated that in other examples, other suitable alignment methods or devices may be used while still complying with the principles disclosed herein.

[00017] Upright member 140 includes a first or upper end 140a, a second or lower end 140b opposite the upper end 140a, a first or front side 140c extending between the ends 140a, 140b, and a second or rear side 140d opposite the front side 140c and also extending between the ends 140a, 140b. The lower end 140b of member 140 is coupled to the rear end 120b of base 120, such that member 140 extends substantially upward from the support surface 15.

[00018] Top 160 includes a first or proximate end 160a, a second or distal end 160b opposite the proximate end 160a, a top surface 160c extending between the ends 160a, 160b, and a bottom surface 160d opposite the top surface 160c and also extending between the ends 160a, 160b. Proximate end 160a of top 160 is coupled to upper end 140a of upright member 140 such that distal end 160b extends outward therefrom. As a result, in the example shown in Figure 2, top 160 is supported only at end 160a and thus is referred to herein as a "cantilevered" top. In some examples, base 120, member 140, and top 160 are all monolithically formed; however, it should be appreciated that in other examples, base 120, member 140, and/or top 160 may not be monolithically formed while still complying with the principles disclosed herein.

[00019] Referring still to Figures 1-4, mat 200 includes a central axis or centerline 205, a first or front side 200a, and a second or rear side 200b axially opposite the front side 200a. In this example, a touch sensitive surface 202 is disposed on mat 200 and is substantially aligned with the axis 205. Surface 202 may comprise any suitable touch sensitive technology for detecting and tracking one or multiple touch inputs by a user in order to allow the user to interact with software being executed by device 150 or some other computing device (not shown). For example, in some implementations, surface 202 may utilize known touch sensitive technologies such as, for example, resistive, capacitive, acoustic wave, infrared, strain gauge, optical, acoustic pulse recognition, or some combination thereof while still complying with the principles disclosed herein. In addition, in this example, surface 202 extends over only a portion of mat 200; however, it should be appreciated that in other examples, surface 202 may extend over substantially all of mat 200 while still complying with the principles disclosed herein. Furthermore, in some examples discussed further below with reference to Figures 7-10, a touch input on the mat 200 may be associated with either the horizontal interface projected on the mat 200 or with the vertical interface provided by the display 152, depending on the user's preference. Hence, a user may optionally utilize touch on the horizontal mat 200 to control the interface provided by the vertical display 152.

[00020] During operation, mat 200 is aligned with base 120 of structure 110, as previously described, to ensure proper alignment thereof. In particular, in this example, rear side 200b of mat 200 is placed between the raised portion 122 of base 120 and support surface 15 such that rear end 200b is aligned with front side 120a of base 120, thereby ensuring proper overall alignment of mat 200, and particularly surface 202, with other components within system 100. In some examples, mat 200 is aligned with device 150 such that the center line 155 of device 150 is substantially aligned with center line 205 of mat 200; however, other alignments are possible. In addition, as will be described in more detail below, in at least some examples surface 202 of mat 200 and device 150 are electrically coupled to one another such that user inputs received by surface 202 are communicated to device 150. Any suitable wireless or wired electrical coupling or connection may be used between surface 202 and device 150 such as, for example, Wi-Fi, BLUETOOTH®, ultrasonic, electrical cables, electrical leads, electrical spring-loaded pogo pins with magnetic holding force, or some combination thereof, while still complying with the principles disclosed herein. In this example, exposed electrical contacts disposed on rear side 200b of mat 200 engage with corresponding electrical pogo-pin leads within portion 122 of base 120 to transfer signals between device 150 and surface 202 during operation. In addition, in this example, the electrical contacts are held together by adjacent magnets, located in the clearance between portion 122 of base 120 and surface 15 previously described, that magnetically attract and hold (e.g., mechanically) a corresponding ferrous and/or magnetic material disposed along rear side 200b of mat 200.

[00021] Referring specifically now to Figure 3, projector unit 180 comprises an outer housing 182, and a projector assembly 184 disposed within housing 182. Housing 182 includes a first or upper end 182a, a second or lower end 182b opposite the upper end 182a, and an inner cavity 183. In this embodiment, housing 182 further includes a coupling or mounting member 186 to engage with and support device 150 during operations. In general, member 186 may be any suitable member or device for suspending and supporting a computer device (e.g., device 150) while still complying with the principles disclosed herein. For example, in some implementations, member 186 comprises a hinge that includes an axis of rotation such that a user (not shown) may rotate device 150 about the axis of rotation to attain an optimal viewing angle therewith. Further, in some examples, device 150 is permanently or semi-permanently attached to housing 182 of unit 180. For example, in some implementations, the housing 182 and device 150 are integrally and/or monolithically formed as a single unit.

[00022] Thus, referring briefly to Figure 4, when device 150 is suspended from structure 110 through the mounting member 186 on housing 182, projector unit 180 (i.e., both housing 182 and assembly 184) is substantially hidden behind device 150 when system 100 is viewed from a viewing surface or viewing angle that is substantially facing display 152 disposed on front side 150c of device 150. In addition, as is also shown in Figure 4, when device 150 is suspended from structure 110 in the manner described, projector unit 180 (i.e., both housing 182 and assembly 184) and any image projected thereby is substantially aligned or centered with respect to the center line 155 of device 150.

[00023] Projector assembly 184 is generally disposed within cavity 183 of housing 182, and includes a first or upper end 184a, and a second or lower end 184b opposite the upper end 184a. Upper end 184a is proximate upper end 182a of housing 182 while lower end 184b is proximate lower end 182b of housing 182. Projector assembly 184 may comprise any suitable digital light projector assembly for receiving data from a computing device (e.g., device 150) and projecting an image or images (e.g., out of upper end 184a) that correspond with that input data. For example, in some implementations, projector assembly 184 comprises a digital light processing (DLP) projector or a liquid crystal on silicon (LCoS) projector, which are advantageously compact and power efficient projection engines capable of multiple display resolutions and sizes, such as, for example, standard XGA (1024 x 768) resolution with a 4:3 aspect ratio or standard WXGA (1280 x 800) resolution with a 16:10 aspect ratio. Projector assembly 184 is further electrically coupled to device 150 in order to receive data therefrom for producing light and images from end 184a during operation. Projector assembly 184 may be electrically coupled to device 150 through any suitable type of electrical coupling while still complying with the principles disclosed herein. For example, in some implementations, assembly 184 is electrically coupled to device 150 through an electric conductor, WI-FI, BLUETOOTH®, an optical connection, an ultrasonic connection, or some combination thereof. In this example, device 150 is electrically coupled to assembly 184 through electrical leads or conductors (previously described) that are disposed within mounting member 186 such that when device 150 is suspended from structure 110 through member 186, the electrical leads disposed within member 186 contact corresponding leads or conductors disposed on device 150.

[00024] Referring still to Figure 3, top 160 further includes a fold mirror 162 and a sensor bundle 164. Mirror 162 includes a highly reflective surface 162a that is disposed along bottom surface 160d of top 160 and is positioned to reflect images and/or light projected from upper end 184a of projector assembly 184 toward mat 200 during operation. Mirror 162 may comprise any suitable type of mirror or reflective surface while still complying with the principles disclosed herein. In this example, fold mirror 162 comprises a standard front surface vacuum metalized aluminum coated glass mirror that acts to fold light emitted from assembly 184 down to mat 200. In other examples, mirror 162 could have a complex aspherical curvature to act as a reflective lens element to provide additional focusing power or optical correction.

[00025] Sensor bundle 164 includes a plurality of sensors and/or cameras to measure and/or detect various parameters occurring on or near mat 200 during operation. For example, in the specific implementation depicted in Figure 3, bundle 164 includes an ambient light sensor 164a, a camera (e.g., a visual RGB 14.1 megapixel high resolution camera) 164b, a depth sensor or camera 164c, and a three dimensional (3D) user interface sensor 164d. Ambient light sensor 164a is arranged to measure the intensity of light of the environment surrounding system 100, in order to, in some implementations, adjust the exposure settings of the cameras and/or sensors (e.g., sensors 164a, 164b, 164c, 164d), and/or adjust the intensity of the light emitted from other sources throughout the system such as, for example, projector assembly 184, display 152, etc. Camera 164b may, in some instances, comprise a color camera which is arranged to take either a still image or a video of an object 40 (e.g., a document, photo, book, 2D object, and/or 3D object) disposed on mat 200. For example, the camera 164b may be a visual 14.1 megapixel RGB camera. Depth sensor 164c generally indicates when a 3D object is on the work surface. In particular, depth sensor 164c may sense or detect the presence, shape, contours, motion, and/or the 3D depth of an object (or specific feature(s) of an object) placed on mat 200 during operation. Thus, in some implementations, sensor 164c may employ any suitable sensor or camera arrangement to sense and detect a 3D object and/or the depth values of each pixel (whether infrared, color, or other) disposed in the sensor's field-of-view (FOV). For example, in some implementations sensor 164c may comprise a single infrared (IR) camera sensor with a uniform flood of IR light, a dual IR camera sensor with a uniform flood of IR light, structured light depth sensor technology, time-of-flight (TOF) depth sensor technology, or some combination thereof. User interface sensor 164d includes any suitable device or devices (e.g., a sensor or camera) for tracking a user input device such as, for example, a hand, stylus, pointing device, etc. In some implementations, sensor 164d includes a pair of cameras which are arranged to stereoscopically track the location of a user input device (e.g., a stylus) as it is moved by a user about the mat 200, and particularly about surface 202 of mat 200. In other examples, sensor 164d may also or alternatively include an infrared camera(s) or sensor(s) that is arranged to detect infrared light that is either emitted or reflected by a user input device. It should further be appreciated that bundle 164 may comprise other sensors and/or cameras either in lieu of or in addition to sensors 164a, 164b, 164c, 164d, previously described. In addition, as will be explained in more detail below, each of the sensors 164a, 164b, 164c, 164d within bundle 164 is electrically and communicatively coupled to device 150 such that data generated within bundle 164 may be transmitted to device 150 and commands issued by device 150 may be communicated to the sensors 164a, 164b, 164c, 164d during operations. As is explained above for other components of system 100, any suitable electrical and/or communicative coupling may be used to couple sensor bundle 164 to device 150 such as, for example, an electric conductor, WI-FI, BLUETOOTH®, an optical connection, an ultrasonic connection, or some combination thereof. In this example, electrical conductors are routed from bundle 164, through top 160, upright member 140, and projector unit 180 and into device 150 through the leads that are disposed within mounting member 186, previously described.

[00026] Referring now to Figures 5 and 6, during operation of system 100, light 187 is emitted from projector assembly 184 and reflected off of mirror 162 towards mat 200, thereby displaying an image and/or user interface on a projector display space 188. In this example, space 188 is substantially rectangular and is defined by a length L188 and a width W188. In some examples length L188 may equal approximately 16 inches, while width W188 may equal approximately 12 inches; however, it should be appreciated that other values for both length L188 and width W188 may be used while still complying with the principles disclosed herein. In addition, the sensors (e.g., sensors 164a, 164b, 164c, 164d) within bundle 164 include a sensed space 168 that, in at least some examples, overlaps and/or corresponds with projector display space 188, previously described. Space 168 defines the area that the sensors within bundle 164 are arranged to monitor and/or detect the conditions of in the manner previously described. In some examples, both space 188 and space 168 coincide or correspond with surface 202 of mat 200, previously described, to effectively integrate the functionality of the touch sensitive surface 202, projector assembly 184, and sensor bundle 164 within a defined area.

[00027] Referring now to Figures 5-7, in some examples, device 150 directs assembly 184 to project an image and/or user interface onto surface 202 of mat 200. In addition, device 150 may also display an image and/or user interface on the display 152 (which may or may not be the same as the image and/or user interface projected onto surface 202 by assembly 184). The image and/or user interface projected by assembly 184 may comprise information and/or images produced by software executing within device 150. A user (not shown) may then interact with the image and/or user interface displayed on surface 202 and display 152 by physically engaging the touch sensitive surface 202 of mat 200. Such interaction may take place through any suitable method such as direct interaction with a user's hand 35, through a stylus 25, or other suitable user input device(s).

[00028] As best shown in Figure 7, when a user interacts with surface 202 of mat 200, a signal is generated which is routed to device 150 through any of the electrical coupling methods and devices previously described. Once device 150 receives the signal generated within mat 200, it is routed, through internal conductor paths 153, to a processor 250 which communicates with a non-transitory computer-readable storage device 260 to generate an output signal which is then routed back to projector assembly 184 and/or display 152 to implement a change in the image and/or user interface projected onto surface 202 and/or the image and/or user interface displayed on display 152, respectively. It should also be appreciated that the processor 250 may be at least one of a central processing unit (CPU), a semiconductor-based microprocessor, a graphics processing unit (GPU), a microcontroller, or another processing device configured to fetch, decode, and/or execute instructions retrieved from the non-transitory computer-readable storage device 260. It should also be appreciated that the non-transitory computer-readable storage device 260 may correspond to any typical storage device that stores machine-readable instructions, such as programming code, software, firmware, or the like. For example, the non-transitory computer-readable storage device 260 may include one or more of a non-volatile memory, a volatile memory, and/or a storage device. Examples of non-volatile memory include, but are not limited to, electronically erasable programmable read only memory (EEPROM) and read only memory (ROM). Examples of volatile memory include, but are not limited to, static random access memory (SRAM) and dynamic random access memory (DRAM). Examples of storage devices include, but are not limited to, hard disk drives, compact disc drives, digital versatile disc drives, optical devices, and flash memory devices. In some implementations, the instructions may be part of an installation package that can be executed by the processor 250. In this case, the non-transitory computer-readable storage device 260 may be a portable medium such as a CD, DVD, or flash drive, or a memory maintained by a server from which the installation package can be downloaded and installed. In another implementation, the instructions may be part of an application or applications already installed. Here, the non-transitory computer-readable storage device 260 may include integrated memory such as a hard drive. Furthermore, in some examples, the processor 250 is integrated with the computer readable storage device 260, while in other examples, the processor 250 and the computer readable storage device 260 are discrete components.

[00029] In one example in accordance with aspects of the present disclosure, a user may toggle the touch input association such that touch input directed to the mat 200 may be associated with the horizontal user interface projected on the mat 200 by the projector assembly 184, or, alternatively, may be associated with the user interface displayed on the vertical display 152 (where the vertical display may or may not be touch sensitive). More specifically, and referring to Figure 7, a touch input modification request may be received by a touch coordination module 290 comprising processor 250 and computer readable storage device 260. In response to receiving the touch input modification request, the touch coordination module 290 may change the touch input association from the horizontal touch sensitive mat 200 to the vertical display 152 such that a touch input on the horizontal touch sensitive mat 200 controls the first interface displayed on the vertical display 152 as opposed to the second interface projected onto the mat 200. Thereafter, in response to receiving another touch input modification request, the touch coordination module 290 may revert the touch input association from the vertical display 152 to the horizontal touch sensitive mat 200 such that a touch input on the horizontal touch sensitive mat 200 controls the second interface projected on the mat 200 as opposed to the first interface displayed on the vertical display 152.
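
The patent describes this toggling behavior functionally rather than as code. Purely as an illustration, the following minimal Python sketch (not from the patent; names such as TouchCoordinationModule, Target, and route are hypothetical) models the two-state association maintained by the touch coordination module 290:

```python
from enum import Enum, auto

class Target(Enum):
    MAT_INTERFACE = auto()      # second interface, projected onto mat 200
    VERTICAL_DISPLAY = auto()   # first interface, shown on display 152

class TouchCoordinationModule:
    """Routes mat touch input to whichever interface it is currently associated with."""

    def __init__(self, first_interface, second_interface):
        self.first = first_interface    # handler for the vertical display's interface
        self.second = second_interface  # handler for the mat-projected interface
        self.association = Target.MAT_INTERFACE  # default association

    def on_modification_request(self):
        """Toggle the association immediately; no shutdown, logout, or reboot."""
        self.association = (Target.VERTICAL_DISPLAY
                            if self.association is Target.MAT_INTERFACE
                            else Target.MAT_INTERFACE)

    def route(self, touch_event):
        """Deliver a touch event from mat 200 to the currently associated interface."""
        handler = (self.first if self.association is Target.VERTICAL_DISPLAY
                   else self.second)
        handler(touch_event)

# Usage: after one request, mat touches drive the vertical display's interface.
tcm = TouchCoordinationModule(print, print)
tcm.on_modification_request()
tcm.route({"x": 0.25, "y": 0.75, "type": "tap"})
```

The property mirrored here is that a single request flips the routing in place, which is why no restart is needed.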

[00030] It should be understood that while Figure 7 depicts the touch coordination module 290 as comprising only processor 250 and computer readable storage device 260, in various examples, the touch coordination module 290 comprises additional or alternative components. For example, the touch coordination module 290 may comprise a functionally equivalent circuit like an analog circuit, a digital signal processing device circuit, an application specific integrated circuit (ASIC), or other logic devices arranged to perform the same functions as the above-mentioned processor 250 and computer readable storage device 260.

[00031] Turning now to Figure 8, this figure depicts an example process flow diagram 800 for modifying touch input assignment in accordance with an example. It should be readily apparent that the processes depicted in Figure 8 represent generalized illustrations, and that other processes may be added or existing processes may be removed, modified, or rearranged without departing from the scope and spirit of the present disclosure. Further, it should be understood that the processes may represent executable instructions stored on memory that may cause the projective computing system 100 to respond, to perform actions, to change states, and/or to make decisions. Thus, the described processes may be implemented as executable instructions and/or operations provided by a memory 260 associated with the system 100. Alternatively or in addition, the processes may represent functions and/or actions performed by functionally equivalent circuits like an analog circuit, a digital signal processing device circuit, an application specific integrated circuit (ASIC), or other logic devices associated with the system 100. Furthermore, Figure 8 is not intended to limit the implementation of the described implementations, but rather the figure illustrates functional information one skilled in the art could use to design/fabricate circuits, generate software, or use a combination of hardware and software to perform the illustrated processes.

[00032] The process 800 may begin at block 810 where a first interface is displayed on the vertical display 152. As mentioned above, the vertical display may or may not be touch sensitive. At block 820, a second interface is projected on the horizontal touch sensitive mat 200. As discussed, the projector assembly 184 projects this user interface upward and it reflects off a mirror back down to the mat 200. At block 830, a touch input modification request is received. This request may be received at the touch coordination module 290 and may be triggered by a user gesture, a button depression, a verbal command, or another user input. For example, in one implementation, the 3D sensor 164d or another sensor may detect a particular gesture (e.g., a hand/finger movement) by the user, and this may trigger the touch input modification request. This gesture may occur on or above the mat, and may occur within or outside of the region 202. In another example implementation, the input modification request may be triggered by depression/touching of a button on the system 100. For example, a button may be located on the base 120 or another portion of the system, and a user may depress/touch this button to trigger the touch modification request and thereby toggle the touch input assignment. This toggling may occur without the user needing to shut down, log out, and/or reboot, thereby providing the user with a user-friendly and seamless experience. In addition, the toggling may occur automatically and without further user interaction after, e.g., depressing/touching the button or performing the predetermined toggling gesture.
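
As an illustrative sketch only (the event shapes, trigger names, and button identifier below are invented, and TouchCoordinationModule refers to the hypothetical sketch above), each of the trigger paths just described can funnel into the same modification request:

```python
TOGGLE_GESTURE = "toggle_touch_association"  # hypothetical predetermined gesture name

def maybe_request_toggle(event, tcm):
    """Map any supported trigger onto one touch input modification request."""
    if event["kind"] == "gesture" and event["name"] == TOGGLE_GESTURE:
        tcm.on_modification_request()  # gesture seen by 3D sensor 164d, on or above the mat
    elif event["kind"] == "button" and event["id"] == "touch_toggle":
        tcm.on_modification_request()  # physical button, e.g. on base 120
    elif event["kind"] == "voice" and event["command"] == "switch touch":
        tcm.on_modification_request()  # verbal command captured by the microphone
```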

[00033] Thereafter, at block 840, after receiving the touch input modification request, the touch coordination module 290 causes the touch input association to change from the horizontal touch sensitive mat 200 to the vertical display 152 such that a touch input on the horizontal touch sensitive mat 200 controls the first interface displayed on the vertical display 152. This may be accomplished, for example, by updating register values that associate interfaces/displays with touch inputs. These registers may be read, for example, by the operating system (OS) to coordinate touch input, user interface control, and/or information displayed.

[00034] In addition to the above, the touch assignment may be reverted back in response to receiving another touch input modification request. For example, and continuing with the above example, in response to receiving another touch input modification request, the touch coordination module 290 may change the touch input association from the vertical display 152 to the horizontal touch sensitive mat 200 such that a touch input on the horizontal touch sensitive mat controls the second interface projected on the horizontal touch sensitive mat. As mentioned above, this may be accomplished, for example, by updating register values which associate interfaces/displays with touch inputs, and further may occur automatically without the user needing to shut down, log out, and/or reboot the system 100.
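
The register-value mechanism is only named, not specified, in the patent; as a loose illustration, the association could be kept in a small table that the OS consults when dispatching touch events (all names below are invented):

```python
# Hypothetical association "registers": which interface each touch source drives.
touch_association = {
    "mat_200": "projected_interface",    # default: mat touches drive the mat's interface
    "display_152": "display_interface",  # applies only if display 152 is touch sensitive
}

def apply_modification_request():
    """Flip the mat's register entry; the OS reads this table to route touch input."""
    touch_association["mat_200"] = (
        "display_interface"
        if touch_association["mat_200"] == "projected_interface"
        else "projected_interface")
```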

[00035] Also, to assist the user in understanding their finger/stylus location with respect to the vertical display 152 when the mat 200 is being utilized as a touch input, an example in accordance with the present disclosure causes a marker (e.g., a cursor, dot, etc.) to be shown on the vertical display 152 when an input device (e.g., a finger, stylus, etc.) is touching or proximate to the horizontal touch sensitive mat. Thus, as shown in Figure 9, when a user's finger 910 or stylus touches and/or is proximate to the mat 200 (e.g., within a threshold distance such as 1 inch), a marker 920 is shown on the vertical display 152 to provide an idea of finger/stylus positioning on the mat 200 with respect to the vertical display 152. In an example, the location of the user's finger and/or stylus is detected by the 3D sensor 164d or another sensor associated with the system 100. Further, in an example, the marker is a cursor, dot, star, or another symbol.
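
For illustration, placing the marker reduces to a proximity test plus a coordinate rescale from the mat to the display. The sketch below uses the 1 inch threshold from the example above and the approximate 16 x 12 inch projected space; the display resolution is an assumption:

```python
PROXIMITY_THRESHOLD_IN = 1.0  # show the marker within ~1 inch of the mat, per the example

def marker_position(mat_x, mat_y, hover_height,
                    mat_size=(16.0, 12.0), display_res=(1920, 1080)):
    """Map a finger/stylus position over the mat (inches) to marker pixel
    coordinates on the vertical display, or None when too far away."""
    if hover_height > PROXIMITY_THRESHOLD_IN:
        return None  # not touching or proximate: no marker shown
    mat_w, mat_h = mat_size       # mat surface, inches (per space 188 above)
    disp_w, disp_h = display_res  # display resolution in pixels (assumed)
    px = round(mat_x / mat_w * (disp_w - 1))
    py = round(mat_y / mat_h * (disp_h - 1))
    return px, py

# A stylus hovering 0.4 inches above the mat's center maps to the display's center.
print(marker_position(8.0, 6.0, 0.4))  # -> (960, 540)
```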

[00036] Furthermore, in an additional example depicted in Figure 10, in order to assist the user with referencing their position on the horizontal mat 200 with respect to the vertical display 152, a duplicate/clone of the first interface 1010 may be projected on all or a portion of the horizontal touch sensitive mat 200 while the touch input on the horizontal touch sensitive mat is controlling the first interface displayed on the vertical display. A user, thus, can utilize the projected duplicate/clone first interface to more easily control the actual first user interface 1020 displayed on the vertical display 152. It should be understood that while the clone/duplicate first interface is shown in Figure 10 as encompassing the entire touch mat 200, in some examples only a portion of the touch mat 200 is utilized for the clone display. For instance, the duplicate/clone first interface may be a "thumbnail" projected in the upper right hand quadrant of the touch mat 200, and the remaining portion of the touch mat may display the second interface. Such settings may be user configurable or set by default, depending on the implementation. Furthermore, in some examples, since the aspect ratio may differ between the projected display on the horizontal touch mat and the vertical display, processing may be conducted to alter the image based on the aspect ratio to provide an optimal appearance.
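
The patent does not prescribe an algorithm for this aspect-ratio processing; a standard letterbox/pillarbox fit is one plausible approach, sketched below (assumptions: inputs in inches, aspect ratio given as width/height):

```python
def fit_clone(display_aspect, region_w, region_h):
    """Largest rectangle inside the mat region (inches) that preserves the
    vertical display's aspect ratio (width / height)."""
    if display_aspect > region_w / region_h:
        # Clone is relatively wider: use full width, letterbox top and bottom.
        return region_w, region_w / display_aspect
    # Clone is relatively taller: use full height, pillarbox left and right.
    return region_h * display_aspect, region_h

# Cloning a 16:9 display onto the mat's 16 x 12 inch projected space yields a
# 16 x 9 inch image with 1.5 inch bands above and below.
print(fit_clone(16 / 9, 16.0, 12.0))  # -> (16.0, 9.0)
```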

[00037] In the manner described, the projective computer system 100 comprises a touch sensitive or non-touch sensitive vertical display (e.g., an all-in-one computer or display) to display a first interface, a horizontal touch sensitive mat communicatively coupled to the vertical display, and a projector assembly to project a second interface on the horizontal touch sensitive mat. Further, the system 100 includes a touch coordination module to receive a touch input modification request and, in response to receiving the touch input modification request, change the touch input association from the horizontal touch sensitive mat to the vertical display such that a touch input on the horizontal touch sensitive mat controls the first interface displayed on the vertical display. Among other things, this architecture provides the user with an intuitive manner of changing the touch association so that it is a more ergonomic input method, and further reduces manufacturing costs by reducing the need to have touch components included in both the horizontal and vertical surfaces. That is, the vertical display may be non-touch sensitive yet gain such capability by switching the touch association from the horizontal touch mat to the non-touch sensitive display.

[00038] While device 150 has been described as an all-in-one computer, it should be appreciated that in other examples, device 150 may further employ the use of more traditional user input devices such as, for example, a keyboard and a mouse. In addition, while sensors 164a, 164b, 164c, 164d within bundle 164 have been described as each representing a single sensor or camera, it should be appreciated that each of the sensors 164a, 164b, 164c, 164d may include multiple sensors or cameras while still complying with the principles described herein. Further, while top 160 has been described herein as a cantilevered top, it should be appreciated that in other examples, top 160 may be supported at more than one point and thus may not be cantilevered while still complying with the principles disclosed herein.

[00039] The above discussion is meant to be illustrative of the principles and various embodiments of the present disclosure. Numerous variations and modifications will become apparent to those skilled in the art once the above disclosure is fully appreciated. It is intended that the following claims be interpreted to embrace all such variations and modifications.

Claims

WHAT IS CLAIMED IS:
1. A projective computing system, comprising:
a vertical all-in-one computer to display a first interface;
a horizontal touch sensitive mat communicatively coupled to the vertical all-in-one computer;
a projector assembly to project a second interface on the horizontal touch sensitive mat; and
a touch coordination module to
receive a touch input modification request, and
in response to receiving the touch input modification request, automatically and without further input change the touch input association from the horizontal touch sensitive mat to the vertical all-in-one computer such that a touch input on the horizontal touch sensitive mat controls the first interface displayed on the vertical all-in-one computer.
2. The projective computing system of claim 1, wherein the touch input modification request is triggered by depression of a touch input toggle button.
3. The projective computing system of claim 1, wherein the touch input modification request is triggered by a gesture.
4. The projective computing system of claim 3, wherein the gesture is detected by a 3D sensor.
5. The projective computing system of claim 1, wherein the touch coordination module is further to, in response to receiving another touch input modification request, change the touch input association from the vertical all-in-one computer to the horizontal touch sensitive mat such that a touch input on the horizontal touch sensitive mat controls the second interface projected on the horizontal touch sensitive mat.
6. The projective computing system of claim 1, wherein the touch coordination module is further to cause a marker to be shown on the vertical all-in-one computer when an input device is touching or proximate to the horizontal touch sensitive mat.
7. The projective computing system of claim 6, wherein the input device is a finger or a stylus, and wherein the location of the finger or the stylus is detected by a 3D sensor.
8. The projective computing system of claim 1, wherein a duplicate of the first interface is projected on all or a portion of the horizontal touch sensitive mat while the touch input on the horizontal touch sensitive mat is controlling the first interface displayed on the vertical all-in-one computer.
9. The projective computing system of claim 1, wherein the vertical all-in-one computer display is not touch sensitive.
10. The projective computing system of claim 1, wherein the vertical all-in-one computer display is touch sensitive.
11. A method to modify touch input in a projective computing system, comprising:
displaying a first interface on a non-touch sensitive vertical display;
projecting a second interface on a horizontal touch sensitive mat;
receiving a touch input modification request; and
changing the touch input association from the horizontal touch sensitive mat to the non-touch sensitive vertical display such that a touch input on the horizontal touch sensitive mat controls the first interface displayed on the non-touch sensitive vertical display.
12. The method of claim 11, further comprising:
receiving another touch input modification request; and
changing the touch input association from the non-touch sensitive vertical display to the horizontal touch sensitive mat such that a touch input on the horizontal touch sensitive mat controls the second interface projected on the horizontal touch sensitive mat.
13. The method of claim 11, further comprising:
displaying a marker on the non-touch sensitive vertical display when an input device is touching or proximate to the horizontal touch sensitive mat.
14. The method of claim 11, further comprising:
projecting a duplicate of the first interface on all or a portion of the horizontal touch sensitive mat while touch input on the horizontal touch sensitive mat is controlling the first interface displayed on the non-touch sensitive vertical display.
15. A non-transitory computer readable storage device comprising instructions which when executed cause a projective computing system to:
receive a touch input modification request;
change the touch input association from a horizontal touch sensitive mat to a non-touch sensitive vertical display such that a touch input on the horizontal touch sensitive mat controls a first interface displayed on the non-touch sensitive vertical display;
receive another touch input modification request; and
change the touch input association from the non-touch sensitive vertical display to the horizontal touch sensitive mat such that a touch input on the horizontal touch sensitive mat controls a second interface projected on the horizontal touch sensitive mat.
PCT/US2013/057549 2013-08-30 2013-08-30 Touch input association WO2015030795A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2013/057549 WO2015030795A1 (en) 2013-08-30 2013-08-30 Touch input association

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US14/914,956 US10168897B2 (en) 2013-08-30 2013-08-30 Touch input association
EP13892742.1A EP3039515A4 (en) 2013-08-30 2013-08-30 Touch input association
CN201380079241.3A CN105492990A (en) 2013-08-30 2013-08-30 Touch input association
PCT/US2013/057549 WO2015030795A1 (en) 2013-08-30 2013-08-30 Touch input association
TW103125844A TWI530863B (en) 2013-08-30 2014-07-29 Projective computing system, method to modify touch input in the same, and related non-transitory computer readable storage device
TW105103546A TWI588736B (en) 2013-08-30 2014-07-29 Projective computing system, method to modify touch input in the same, and related non-transitory computer readable storage device

Publications (1)

Publication Number Publication Date
WO2015030795A1 (en) 2015-03-05

Family

ID=52587142

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2013/057549 WO2015030795A1 (en) 2013-08-30 2013-08-30 Touch input association

Country Status (5)

Country Link
US (1) US10168897B2 (en)
EP (1) EP3039515A4 (en)
CN (1) CN105492990A (en)
TW (2) TWI588736B (en)
WO (1) WO2015030795A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103838376B (en) * 2014-03-03 2016-09-28 深圳超多维光电子有限公司 Dimensional perspective a method and interactive system interaction
CN106178546A (en) * 2016-09-25 2016-12-07 依云智酷(北京)科技有限公司 Projection touch intelligent toy

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0626636A2 (en) 1993-03-16 1994-11-30 Hitachi, Ltd. Information processing system
US20110163944A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Intuitive, gesture-based communications with physics metaphors
WO2012009039A1 (en) * 2010-07-16 2012-01-19 Qualcomm Incorporated Methods and systems for interacting with projected user interface
US20120235922A1 (en) * 2011-03-14 2012-09-20 Lenovo (Singapore) Pte, Ltd. Central display with private secondary displays
US20120262379A1 (en) * 2011-04-12 2012-10-18 Apple Inc. Gesture visualization and sharing between electronic devices and remote displays
WO2012176142A2 (en) * 2011-06-20 2012-12-27 Chris Argiro Video-game controller assemblies designed for progressive control of actionable-objects displayed on touchscreens: expanding the method and breadth of touch-input delivery
US20130141331A1 (en) 2011-12-02 2013-06-06 Htc Corporation Method for performing wireless display control, and associated apparatus and associated computer program product

Family Cites Families (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6614422B1 (en) 1999-11-04 2003-09-02 Canesta, Inc. Method and apparatus for entering data using a virtual input device
US7710391B2 (en) 2002-05-28 2010-05-04 Matthew Bell Processing an image utilizing a spatially varying pattern
WO2003054683A2 (en) 2001-12-07 2003-07-03 Canesta Inc. User interface for electronic devices
JP2003344086A (en) 2002-05-28 2003-12-03 Pioneer Electronic Corp Touch panel device and display input device for car
US7203384B2 (en) 2003-02-24 2007-04-10 Electronic Scripting Products, Inc. Implement for optically inferring information from a planar jotting surface
US20050078092A1 (en) 2003-10-08 2005-04-14 Clapper Edward O. Whiteboard desk projection display
US7110100B2 (en) 2003-11-04 2006-09-19 Electronic Scripting Products, Inc. Apparatus and method for determining an inclination of an elongate object contacting a plane surface
US7268956B2 (en) 2003-11-24 2007-09-11 Electronic Scripting Products, Inc. Solid catadioptric lens with two viewpoints
US7038846B2 (en) 2003-11-24 2006-05-02 Electronic Scripting Products, Inc. Solid catadioptric lens with a single viewpoint
US7088440B2 (en) 2003-12-22 2006-08-08 Electronic Scripting Products, Inc. Method and apparatus for determining absolute position of a tip of an elongate object on a plane surface with invariant features
US9229540B2 (en) 2004-01-30 2016-01-05 Electronic Scripting Products, Inc. Deriving input from six degrees of freedom interfaces
US7961909B2 (en) 2006-03-08 2011-06-14 Electronic Scripting Products, Inc. Computer interface employing a manipulated object with absolute pose detection component and a display
US7826641B2 (en) 2004-01-30 2010-11-02 Electronic Scripting Products, Inc. Apparatus and method for determining an absolute pose of a manipulated object in a real three-dimensional environment with invariant features
US8542219B2 (en) 2004-01-30 2013-09-24 Electronic Scripting Products, Inc. Processing pose data derived from the pose of an elongate object
US7023536B2 (en) 2004-03-08 2006-04-04 Electronic Scripting Products, Inc. Apparatus and method for determining orientation parameters of an elongate object
US7161664B2 (en) 2004-04-13 2007-01-09 Electronic Scripting Products, Inc. Apparatus and method for optical determination of intermediate distances
US7113270B2 (en) 2004-06-18 2006-09-26 Electronic Scripting Products, Inc. Determination of an orientation parameter of an elongate object with a scan beam apparatus
US7599561B2 (en) 2006-02-28 2009-10-06 Microsoft Corporation Compact interactive tabletop with projection-vision
US7729515B2 (en) 2006-03-08 2010-06-01 Electronic Scripting Products, Inc. Optical navigation apparatus using fixed beacons and a centroid sensing device
TWI305894B (en) 2006-05-10 2009-02-01 Compal Communications Inc
US20080018591A1 (en) 2006-07-20 2008-01-24 Arkady Pittel User Interfacing
US8199117B2 (en) 2007-05-09 2012-06-12 Microsoft Corporation Archive for physical and digital objects
US9377874B2 (en) * 2007-11-02 2016-06-28 Northrop Grumman Systems Corporation Gesture recognition light and video image projector
KR101020029B1 (en) 2008-07-02 2011-03-09 삼성전자주식회사 Mobile terminal having touch screen and method for inputting key using touch thereof
US8121640B2 (en) 2009-03-19 2012-02-21 Microsoft Corporation Dual module portable devices
KR100987461B1 (en) 2009-09-21 2010-10-13 동국대학교 산학협력단 System and method for generating touch screen interface
TWI423096B (en) 2010-04-01 2014-01-11 Compal Communication Inc Projecting system with touch controllable projecting picture
TWI405106B (en) 2010-04-23 2013-08-11 Wen Jong Wu Interactive multi touch computer system and control method
EP2565751A1 (en) 2011-08-31 2013-03-06 Z124 Multi-screen display control
US8839134B2 (en) 2010-12-24 2014-09-16 Intel Corporation Projection interface techniques
US8736583B2 (en) 2011-03-29 2014-05-27 Intel Corporation Virtual links between different displays to present a single virtual object
US20130044075A1 (en) 2011-08-19 2013-02-21 Korry Electronics Co. Reconfigurable fixed function, NBC compatible integrated display and switch system
US9495012B2 (en) 2011-09-27 2016-11-15 Z124 Secondary single screen mode activation through user interface activation
JP5137150B1 (en) * 2012-02-23 2013-02-06 Wacom Co., Ltd. Handwritten information input device and portable electronic apparatus having the same
US8970709B2 (en) 2013-03-13 2015-03-03 Electronic Scripting Products, Inc. Reduced homography for recovery of pose parameters of an optical apparatus producing image data with structural uncertainty

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0626636A2 (en) 1993-03-16 1994-11-30 Hitachi, Ltd. Information processing system
US20110163944A1 (en) * 2010-01-05 2011-07-07 Apple Inc. Intuitive, gesture-based communications with physics metaphors
WO2012009039A1 (en) * 2010-07-16 2012-01-19 Qualcomm Incorporated Methods and systems for interacting with projected user interface
US20120235922A1 (en) * 2011-03-14 2012-09-20 Lenovo (Singapore) Pte, Ltd. Central display with private secondary displays
US20120262379A1 (en) * 2011-04-12 2012-10-18 Apple Inc. Gesture visualization and sharing between electronic devices and remote displays
WO2012176142A2 (en) * 2011-06-20 2012-12-27 Chris Argiro Video-game controller assemblies designed for progressive control of actionable-objects displayed on touchscreens: expanding the method and breadth of touch-input delivery
US20130141331A1 (en) 2011-12-02 2013-06-06 Htc Corporation Method for performing wireless display control, and associated apparatus and associated computer program product

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3039515A4 *

Also Published As

Publication number Publication date
CN105492990A (en) 2016-04-13
EP3039515A4 (en) 2017-04-05
US20160210039A1 (en) 2016-07-21
TWI588736B (en) 2017-06-21
EP3039515A1 (en) 2016-07-06
TWI530863B (en) 2016-04-21
US10168897B2 (en) 2019-01-01
TW201519074A (en) 2015-05-16
TW201631463A (en) 2016-09-01

Similar Documents

Publication Publication Date Title
JP5277703B2 (en) Electronic device
US9535516B2 (en) System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
CN102016713B (en) Projection of images onto tangible user interfaces
CA2636678C (en) Uniform illumination of interactive display panel
US20140101576A1 (en) Multi display device and method of providing tool therefor
KR101284496B1 (en) Transparent electronic device
US8432362B2 (en) Keyboards and methods thereof
US8289288B2 (en) Virtual object adjustment via physical object detection
JP4955684B2 (en) Input method of interactive display surface
US8947402B2 (en) Touch sensitive image display
US7173605B2 (en) Method and apparatus for providing projected user interface for computing device
US9183806B2 (en) Adjusting font sizes
US20120113044A1 (en) Multi-Sensor Device
US8593434B2 (en) Touchscreen display with plural cameras
US8611667B2 (en) Compact interactive tabletop with projection-vision
JP5396975B2 (en) Image display device
CN103827788B (en) Dynamic control of the active input region of the user interface
CN102812424B A lens system for a light-based touch screen
US8775971B2 (en) Touch display scroll control
US7970211B2 (en) Compact interactive tabletop with projection-vision
CN101833391A (en) Information processing apparatus, information processing method, and program
KR101825779B1 (en) Projection capture system and method
EP2350794A1 (en) Interactive input system with multi-angle reflecting structure
CN101351766A (en) Orientation free user interface
CN101859208A (en) Image display apparatus, image display method, and recording medium having image display program stored therein

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201380079241.3

Country of ref document: CN

121 EP: the EPO has been informed by WIPO that EP was designated in this application

Ref document number: 13892742

Country of ref document: EP

Kind code of ref document: A1

REEP Request for entry into the european phase

Ref document number: 2013892742

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 2013892742

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 14914956

Country of ref document: US

NENP Non-entry into the national phase in:

Ref country code: DE