US20060230192A1 - Display of a user interface - Google Patents
- Publication number
- US20060230192A1 (application US11/092,444)
- Authority
- US
- United States
- Prior art keywords
- user interface
- electronic device
- user
- display
- detecting
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/039—Accessories therefor, e.g. mouse pads
- G06F3/0393—Accessories for touch pads or touch screens, e.g. mechanical guides added to touch screens for drawing straight lines, hard keys overlaying touch screens or touch pads
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
Description
- Computing devices serving entertainment purposes may lack features sometimes considered desirable. They typically cannot easily interact with other electronic devices that the average consumer owns. For example, home theater personal computers frequently have difficulty communicating with mobile phones, remote controls, and the like. Furthermore, even when the computing devices are able to communicate with such other devices, they many times cannot replace them, so that the user may maintain many different devices, in contradistinction to the potential advantages promised by integrating different types of functionality in one device.
- FIG. 1 is a diagram of a perspective view of an embodiment of an interactive display system, according to an embodiment of the present disclosure.
- FIG. 2 is a diagram of an exploded view of the embodiment of the interactive display system of FIG. 1 , according to an embodiment of the present disclosure.
- FIG. 3 is a flowchart of an embodiment of a method depicting how an embodiment of an interactive display system can interact with an external physical object, according to an embodiment of the present disclosure.
- FIG. 4 is a diagram of a representative user interface corresponding to an object as displayed by an embodiment of an interactive display system, according to an embodiment of the present disclosure.
- FIG. 5 is a diagram of another representative user interface corresponding to an object as displayed by an embodiment of an interactive display system, according to an embodiment of the present disclosure.
- FIG. 6 is a rudimentary block diagram of the embodiment of the interactive display system, according to an embodiment of the present disclosure.
- FIGS. 1 and 2 show an embodiment of a display system, such as interactive display system 10 , according to an embodiment of the present disclosure.
- the interactive display system 10 is depicted in FIGS. 1 and 2 as embodied in a table 12 , with the table surface functioning as the display surface 14 .
- Multiple users, each having his or her own data-receiving device D1 through Dn, can view and access the display surface 14 by sitting around the table 12 .
- the physical embodiment of the display system 10 can take any number of forms other than that of a table.
- the interactive display system 10 may be more generally referred to as an electronic device.
- the interactive display system 10 can include a display surface 14 , a digital light processor (DLP) 16 or other projection or display device, a touch-sensitive surface 36 , and a controller 18 .
- the touch-sensitive surface 36 is typically disposed over the display surface 14 , such that the devices D 1 -Dn would be disposed through the surface 36 and onto the display surface 14 .
- the controller 18 is configured to generate electrical image signals indicative of viewable images, such as computer programs, movie videos, video games, Internet web pages, and so on, which are provided for generation to the DLP 16 .
- the DLP 16 , in response to the electrical signals, generates digital optical images that are viewable on the display surface 14 .
- the controller 18 may receive data and other information to generate the image signals from various sources, such as hard disk drives, compact discs (CD's) or digital versatile discs (DVD's) 32 , computer servers, local and/or wide area networks, the Internet, and so on.
- the controller 18 may also provide additional output in the form of projected images from an auxiliary projector 20 and sound from a speaker 22 .
- the interactive display system 10 can include a variety of other components, such as a projector 20 , configured to simultaneously project the content of the display surface 14 onto a wall-mounted screen, for instance.
- the projector 20 may display content that is different than the content displayed on the display surface 14 .
- the interactive display system 10 may also include one or more speakers 22 for producing audible sounds that accompany the visual content on the display surface 14 .
- the interactive display system 10 may include one or more devices for storing and retrieving data, such as a CD or DVD drive, hard disk drives, flash memory ports, and so on.
- the systems and methods of embodiments of the present disclosure are not limited to displaying information to a display surface 14 using a DLP 16 . Rather, any number of panel display devices having addressable pixels may be used, such as a liquid crystal display (LCD), a plasma display, or another type of flat panel display.
- the DLP 16 may also assume a variety of forms in differing embodiments of the present disclosure.
- the DLP 16 generates a viewable digital image on the display surface 14 by projecting a plurality of pixels of light onto the display surface 14 .
- Each viewable image may be made up of millions of pixels, or of a fewer or a greater number of pixels.
- Each pixel is individually controlled and addressable by the DLP 16 to have a certain color (or gray-scale).
- the combination of many light pixels of different colors (or gray-scales) on the display surface 14 generates a viewable image or “frame.” Continuous video and graphics may be generated by sequentially combining frames together, as in a motion picture.
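The pixel-and-frame model just described can be made concrete in a few lines; this is an illustrative sketch only, and the toy resolution and color values are invented.

```python
# A frame is a 2-D grid of individually addressable pixels, each with its
# own color; continuous video is a sequence of such frames. A toy 3x2
# frame with one RGB color triple per pixel stands in for the millions of
# pixels a real DLP would project.
WIDTH, HEIGHT = 3, 2

def solid_frame(color):
    """Build one frame in which every addressable pixel has the same color."""
    return [[color for _ in range(WIDTH)] for _ in range(HEIGHT)]

# A two-frame "video": a red frame followed by a green one, combined
# sequentially as in a motion picture.
video = [solid_frame((255, 0, 0)), solid_frame((0, 255, 0))]
```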
- DLP 16 includes a digital micro-mirror device (DMD) configured to vary the projection of light pixels onto the display surface 14 .
- Other embodiments could include, but are in no way limited to, diffractive light devices (DLD), as well as non-projection-type displays, such as plasma displays and liquid crystal displays (LCD's). Additionally, other display technologies could be substituted for the DLP 16 without varying from the scope of the present system and method.
- the touch-sensitive surface 36 may in one embodiment of the present disclosure be present to provide the users of the system 10 with a form of user input in addition to and/or in lieu of the devices D 1 -Dn.
- the touch-sensitive surface 36 is depicted in FIG. 1 as being separate from the display surface 14 , but in another embodiment, it may be integrated with or substitute for the display surface 14 .
- the touch-sensitive surface 36 is sensitive to the placement of physical objects, such as the fingertips of users, and so on, on the display surface 14 .
- the touch-sensitive surface 36 may employ any of a number of different types of touch-detection technology, such as resistive, capacitive, infrared, optical wave, and/or other types of touch-detection technologies.
- a back-side imaging camera renders the surface 36 touch sensitive by detecting user input on the surface 36 .
- FIG. 3 shows a method 300 for using the interactive display system 10 that has been described in conjunction with a physical object external to the system 10 , according to an embodiment of the present disclosure.
- At least some parts of the method 300 may be implemented as parts of a computer program stored on a computer-readable medium for execution by the system 10 .
- the computer program parts may be software objects, subroutines, routines, and so on.
- the computer-readable medium may be a removable or a non-removable medium, and a volatile or a non-volatile medium.
- the medium may be a semiconductor medium, such as a memory, a magnetic medium, such as a hard disk drive or a floppy disk, and/or an optical medium, such as a CD or a DVD.
- the physical object may be an electronic or a non-electronic device.
- the object may be an electronic device like a cellular phone, a remote control, or another type of electronic device.
- the object may also be a game piece, like a chess or checkers piece, or another type of non-electronic device. It should be recognized that embodiments of the present disclosure are not limited to the type of physical object that can be used in conjunction with the method 300 of FIG. 3 .
- the interactive display system 10 detects the presence of the physical object in proximate vicinity to the system 10 ( 302 ). That is, the system 10 detects the placement of the physical object on a surface of the system 10 .
- the object may be placed on the display surface 14 (or the surface 36 ) of the system 10 , such that the touch-sensitive surface 36 thereof detects the presence of the object.
- the object may have an optical interfacing mechanism, an electrical connector, or a radio frequency (RF) transceiver that allows the system 10 to detect the presence of the object upon placement of the object on a surface of the system 10 , which in such an embodiment may or may not be a touch-sensitive surface.
- the object may have a tag or a marking, such as a bar code, that when placed in appropriate proximity to the system 10 allows the system 10 to detect and retrieve the marking, upon placement of the object on a surface of the system 10 , which in such an embodiment may or may not be a touch-sensitive surface.
- the detection in 302 is accomplished without the interactive display system 10 receiving a video signal from the physical object.
- the physical object, such as a personal digital assistant (PDA), may be capable of providing a video signal to the system 10 .
- such a video signal is not the manner or mechanism by which the system 10 detects the presence of this physical object.
- the object may have an infrared port compatible with the Infrared Data Association (IrDA) standard. Placing the infrared port of the object in appropriate proximity to a corresponding port of the system 10 allows the system 10 to detect presence of the object.
- a connector of the object may be inserted into a corresponding connector of the system 10 to allow the system 10 to detect presence of the object.
- the object may emit RF signals in accordance with a proprietary or non-proprietary standard, such as Bluetooth, 802.11 a/b/g, and so on. The system 10 detects these RF signals in order to detect presence of the object.
- a bar code or other tag on the object may be placed in appropriate proximity to a scanning mechanism of the system 10 to allow the system 10 to detect presence of the object.
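The detection channels enumerated above (touch, infrared, connector, RF, bar code) can be sketched as a simple polling loop over pluggable probes; this is an illustrative sketch rather than the patent's implementation, and the channel names and payloads are invented.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class Detection:
    """Result of a presence check: which channel fired and any raw payload."""
    channel: str                   # e.g. "touch", "irda", "rf", "barcode"
    payload: Optional[str] = None  # tag contents, RF beacon id, etc.

class PresenceMonitor:
    """Polls a set of detection channels; the first one that fires wins.

    Each probe is a zero-argument callable returning a payload string
    (object detected) or None (nothing there).
    """
    def __init__(self) -> None:
        self._probes: dict[str, Callable[[], Optional[str]]] = {}

    def register(self, channel: str, probe: Callable[[], Optional[str]]) -> None:
        self._probes[channel] = probe

    def poll(self) -> Optional[Detection]:
        for channel, probe in self._probes.items():
            payload = probe()
            if payload is not None:
                return Detection(channel, payload)
        return None

# Simulated channels: nothing on the touch surface, but a bar code in view
# of the scanning mechanism.
monitor = PresenceMonitor()
monitor.register("touch", lambda: None)
monitor.register("barcode", lambda: "0123456789012")

hit = monitor.poll()
```

In a real system each probe would wrap actual hardware (the touch-sensitive surface 36, an IrDA port, an RF receiver, a scanner); the polling order here is arbitrary.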
- the interactive display system 10 detects the identity of the object ( 304 ), such as the type or class of the physical object whose presence has been detected.
- the system 10 may detect the identity as a particular kind of gaming piece, such as a particular kind of chess piece, or as a particular kind of mobile phone.
- the system 10 may be able to detect the particular mobile phone, being able to distinguish between two mobile phones of the same type. Detection of the identity of the object may be accomplished similarly as detection of the presence itself of the object is accomplished.
- the system 10 may be able to discern the identity of the object based on the footprint of the object on the display surface 14 (or on the surface 36 ), as detected by the touch-sensitive surface 36 .
- the system 10 may alternatively receive communication from the object indicating its identity, such as its type, via infrared communication, wireless communication, direct wired communication, or optically, such as via a camera detecting the identity.
- the system 10 may alternatively still determine the identity of the object by detecting and interpreting a marking on the object, such as a bar code.
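One way to realize identity detection ( 304 ) is a lookup keyed by whichever channel saw the object: footprints map to game pieces, tags map to known devices, and electronic objects announce themselves. All footprints, bar codes, and device names below are hypothetical.

```python
# Known touch-surface footprints (rough contact dimensions in mm) and
# tags, mapped to object identities. These entries are invented.
FOOTPRINT_DB = {
    (18, 18): "chess-pawn",
    (30, 30): "chess-king",
}
TAG_DB = {
    "4912345678904": "acme-remote-rc100",  # hypothetical bar code
}

def identify(channel, payload):
    """Resolve a detected object's identity from whichever channel saw it."""
    if channel == "touch":
        # Discern identity from the object's footprint on the surface.
        return FOOTPRINT_DB.get(tuple(payload), "unknown")
    if channel == "barcode":
        # Interpret a marking on the object.
        return TAG_DB.get(payload, "unknown")
    if channel in ("irda", "rf", "wired"):
        # Electronic objects communicate their own identity in the payload.
        return payload
    return "unknown"
```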
- the interactive display system 10 next displays a user interface corresponding to the object ( 306 ).
- the system 10 may display the user interface using the DLP 16 , such that the user interface is viewable on the display surface 14 .
- the system 10 may display the user interface using the auxiliary projector 20 , such that one or more users are able to view the user interface as projected on a wall, screen, or other surface external to the system 10 .
- the user interface may in one embodiment be password protected in one or more forms. For instance, to use the user interface, the proper password must first be entered by a user. Similarly, to change the user interface, the proper password must first be entered.
- the user interface displayed may take a variety of different forms.
- the display system 10 may display a user interface that is substantially identical to the built-in user interface of the object.
- cellular phones have a soft-type user interface in that menu items are displayed on built-in screens of the phones. Such menu items can be duplicated in the interface displayed by the display system 10 .
- remote controls typically have a hard-type user interface in that there are a number of physical buttons on the remote controls. Such physical buttons can be virtually duplicated within the interface displayed by the display system 10 .
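Virtually duplicating a hard-type interface amounts to laying out one virtual button per physical button. A minimal sketch follows; the button set is an assumption loosely modeled on a typical remote control, not taken from the patent.

```python
# Hypothetical physical buttons of a remote control, each to become a
# labelled virtual button in a grid rendered on the display surface.
REMOTE_BUTTONS = (["DVD", "CD", "TV", "Vol+", "Vol-", "Ch+", "Ch-"]
                  + [str(n) for n in range(10)])

def build_virtual_ui(buttons, columns=4):
    """Lay out one virtual button per physical button, row-major."""
    rows = []
    for i in range(0, len(buttons), columns):
        rows.append(buttons[i:i + columns])
    return rows

grid = build_virtual_ui(REMOTE_BUTTONS)
```

The resulting grid has the same buttons as the physical device; since they are displayed rather than mechanical, the same layout routine can serve any detected hard-type object.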
- the display system 10 may display a user interface that provides greater functionality than the user interface of the physical object itself.
- the physical object may not have a user interface. Therefore, any user interface for the object displayed by the display system 10 inherently has greater functionality than that which the object itself can provide.
- for a game piece such as a chess piece, the display system 10 may show as the user interface the type of chess piece, describe the rules on how and/or where the chess piece can be moved, show where it is in relation to other chess pieces on a chess board, and so on.
- where the game piece is a role-playing game (RPG) piece, the display system 10 may further show as the user interface how the character represented by the piece can be upgraded or enhanced, and so on.
- for a cell phone that lacks built-in address book functionality, the display system 10 may show an extended user interface that does have such address book functionality. Using such a cell phone through the display system 10 thus affords the user the ability to leverage functionality that is otherwise not provided by the cell phone itself.
- FIGS. 4 and 5 show sample user interfaces that the interactive display system 10 may display for different types of physical objects, according to varying embodiments of the present disclosure.
- the user interface 400 corresponds at least substantially identically to the hard-type user interface of a typical remote control.
- the remote control may have physical buttons corresponding to selecting DVD, CD, or TV, increasing and lowering the volume, changing the channel, as well as the numbers 0 through 9.
- the user interface 400 displayed by the display system 10 may therefore, in one embodiment, be substantially similar in function to the physical buttons of the remote control.
- the buttons of the user interface 400 are virtual buttons, since they are displayed by the display system 10 , and do not correspond to actual physical buttons or controls of the display system 10 .
- the user interface 500 extends the functionality provided by a rudimentary cellular phone.
- the cellular phone in question may not have address book functionality, and may have a small or no display.
- the user interface 500 has two columns 502 and 504 , whereas the cellular phone may only be able to show one such column of information at a time, if any, and thus provides a larger user interface than can be provided by the cellular phone itself.
- although the phone does not have address book functionality, the user interface 500 provides such address book functionality. For instance, by selecting “address book” in the left column 502 , the user is presented with a scrollable list of names in the right column 504 from which he or she can select a name having a corresponding phone number to make a phone call.
- user input is then received by the interactive display system 10 that represents user interaction with the user interface ( 308 ).
- the user may push virtual buttons of the user interface, or select menu items of the user interface, by using the touch-sensitive surface 36 or the devices D 1 -Dn of the display system 10 .
- the user input corresponding to user interaction with the user interface may be processed by the display system 10 in at least one of two different ways.
- the user input may be communicated from the display system 10 back to the physical object ( 310 ).
- the display system 10 is effectively acting as an input device and as a display device for the object.
- the user can instead use the larger display and the larger controls of the display system 10 to interact with the physical object.
- where the physical object changes the user interface in response to the user input received, such changes are mirrored on the version of the user interface for the object displayed by the display system 10 .
- the user input may be used by the display system 10 in a manner other than conveying the input to the physical object ( 312 ).
- the display system 10 can subsume the functionality provided by the physical object, providing that functionality itself without employing the physical object.
- the electronic device is thus responsive to user interaction with the user interface independent of the object.
- the display system 10 in response to user selection of the virtual controls of the user interface provided by the system 10 may appropriately send infrared signals corresponding to those controls without using the remote control itself.
- the display system 10 may have infrared emitters by which it can send infrared signals to audio/visual components like DVD and CD players without use of the remote control to send the infrared signals.
- the call may be placed by the display system 10 and not by the cellular phone.
- the system 10 may place the call through a standard phone line to which the system 10 is coupled, or over the Internet using voice-over-Internet Protocol (VoIP) technology.
- a telephone handset may be part of the system 10 for use by the user in making the phone call, or the system 10 may have an integrated microphone to detect speech by the user, and use the speaker 22 to emit sound to the user.
- the device itself can perform the functionality; for instance, the cellular phone may make the phone call initiated through the user interface provided by the system 10 and corresponding to the phone.
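The two processing paths described above, conveying the input back to the physical object ( 310 ) or handling it within the display system itself ( 312 ), can be sketched as a dispatch. The handler names and the infrared code below are invented for illustration.

```python
def route_input(event, object_link, local_handlers):
    """Route a virtual-button press either back to the physical object
    (path 310) or to the display system's own implementation (path 312).

    object_link: callable that forwards the event to the object, or None
    when the system subsumes the object's functionality.
    local_handlers: button label -> local action callable.
    """
    if object_link is not None:
        # 310: the system acts as an input/display device for the object.
        return ("forwarded", object_link(event))
    handler = local_handlers.get(event)
    if handler is None:
        return ("ignored", None)
    # 312: the system provides the functionality itself.
    return ("handled-locally", handler())

# Hypothetical local handler: the system emits an infrared code itself,
# without using the remote control. The code value is made up.
ir_codes = {"Vol+": 0x20DF40BF}
result = route_input("Vol+", None, {"Vol+": lambda: ir_codes["Vol+"]})
```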
- FIG. 6 shows a rudimentary block diagram of an embodiment of the display system 10 , according to an embodiment of the present disclosure.
- the display system 10 is depicted as including a touch-sensitive tabletop surface 602 , a display mechanism 604 , an object-control mechanism 606 , a projection mechanism 608 , and a communication mechanism 610 .
- the system 10 may have other mechanisms, in addition to and/or in lieu of those depicted in FIG. 6 .
- Each of the mechanisms 602 , 604 , 606 , and 608 includes hardware, and may also optionally include software.
- the touch-sensitive tabletop surface 602 is substantially transparent and receptive to user touch input.
- the surface 602 is substantially transparent in that it is able to pass light therethrough.
- Substantial transparency can include complete transparency, where all such light passes through the surface 602 , as well as partial opacity, where some but not all of the light passes through the surface 602 .
- Substantial transparency does not include complete opacity, where little or none of the light passes through the surface 602 .
- the surface 602 may include the display surface 14 and the surface 36 that have been described.
- the display mechanism 604 is situated underneath or above the surface 602 , and is capable of displaying information to one or more users.
- the display mechanism 604 may include the DLP 16 that has been described.
- the object-control mechanism 606 detects presence of an external physical object that is in proximate vicinity to the display system 10 .
- the object-control mechanism 606 also displays a user interface corresponding to the physical object via at least the display mechanism 604 , and receives user input via at least the touch-sensitive tabletop surface 602 representing user interaction with the user interface.
- the mechanism 606 may include the controller 18 that has been described, as well as additional user-input devices, such as the devices D 1 -Dn.
- the mechanism 606 may further download information regarding the physical object for display via the display mechanism 604 ; examples of such information are presented later in the detailed description.
- the projection mechanism 608 projects information for external display to the user.
- the projection mechanism 608 may include the auxiliary projector 20 .
- the object-control mechanism 606 may display the user interface via the projection mechanism 608 , either in addition to and/or in lieu of displaying the user interface via the display mechanism 604 .
- the communication mechanism 610 is to detect the identity and/or presence of the physical object, and/or communicate with the physical object.
- the communication mechanism 610 may include a transceiver to receive RF communication from and transmit RF communication to the object.
- the communication mechanism 610 may alternatively include a barcode reader to detect a barcode on the physical object.
- the object-control mechanism may detect the presence of a physical object via the touch-sensitive tabletop surface 602 .
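The mechanisms of FIG. 6 compose naturally, with the projection and communication mechanisms optional. A sketch of the composition, with plain strings standing in for real mechanism hardware:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DisplaySystem:
    """Composition mirroring FIG. 6: each mechanism is a pluggable part.

    The attribute names follow the description; the values here are stubs,
    not implementations of the actual hardware.
    """
    surface: object                    # touch-sensitive tabletop surface (602)
    display: object                    # display mechanism (604)
    object_control: object             # object-control mechanism (606)
    projector: Optional[object] = None # projection mechanism (608), optional
    comm: Optional[object] = None      # communication mechanism (610), optional

    def mechanisms(self):
        """All mechanisms actually present in this embodiment."""
        return [m for m in (self.surface, self.display, self.object_control,
                            self.projector, self.comm) if m is not None]

# An embodiment with only the three mandatory mechanisms.
system = DisplaySystem(surface="602", display="604", object_control="606")
```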
- the embodiments described herein can be extended and used in a variety of different applications.
- the user interface that has been described as being displayed for a detected object may be stored on a different electronic device than that which detects the object.
- Such a user interface may be stored at a server that is communicatively coupled to the electronic device over the Internet or another type of network. The user is thus able to modify or update the user interface by appropriately logging onto the server, as can be appreciated by those of ordinary skill within the art.
- the user interface has been described such that a user is able to interact with the user interface, and the results of such user interactions are directly reflected in the object itself.
- the address book of a cellular phone can be modified by the user interacting with the user interface displayed for the cellular phone.
- the user interface that is displayed may also have different formats among which the user can select, such as different or custom skins, fonts, colors, and so on.
- the user interface may further be able to be zoomed in and out for increased readability, such as by users having sight-oriented disabilities.
- the user interface may also be password protected, as has been noted above, such that certain aspects, attributes, or properties of the object can be modified if the proper password has been entered, while other aspects, attributes, or properties may not be modified without the proper password, or certain portions of the user interface are viewable by a user without entry of the proper password, while other portions of the user interface are not viewable by a user without entry of the proper password.
- the electronic device itself, as well as the object may have different passwords, such that both passwords may have to be entered in order to view the user interface for the object. Different passwords may correspond to different privilege levels, such that some users can interact with a given object in some limited way, and other users can interact with the object more fully.
- a further type of object is a car key, or fob, which upon detection brings up as the user interface for this object the maintenance records for the vehicle with which the key or fob is associated.
- the electronic device may in such an embodiment be even able to communicate with the vehicle, wirelessly or in a wired manner, to acquire the current information for the vehicle.
- Such information may include mileage, self-diagnostic information, music stored in the vehicle, navigation information, data transferred to the vehicle at another location, and so on. The user may therefore be able to store and manipulate this information at the electronic device itself, and the electronic device may be programmed to advise and allow appropriate maintenance to be scheduled with local automotive service businesses.
- another type of object is a book.
- the electronic device may be able to detect the title of the book, such as by reading the ISBN bar code thereof.
- the text of the book may then be downloaded by the electronic device from a server connected to the Internet or another network.
- the user would then be able to electronically search the text of the book.
- notes or comments from others regarding the book may be downloaded, for viewing by the user.
- Extra content regarding the book may further be downloaded, such as additional information regarding the author thereof, forums in which the user can participate in discussions regarding the book, alternate endings of the book, and so on.
Abstract
Description
- Computing devices serving entertainment purposes may lack features sometimes considered desirable. They typically cannot easily interact with other electronic devices that the average consumer owns. For example, home theater personal computers frequently have difficulty communicating with mobile phones, remote controls, and the like. Furthermore, even when the computing devices are able to communicate with such other devices, they many times cannot replace them, so that the user may maintain many different devices, in contradistinction to the potential advantages promised by integrating different types of functionality in one device.
- The drawings referenced herein form a part of the specification. Features shown in the drawing are meant as illustrative of only some embodiments of the present disclosure, and not of all embodiments of the present disclosure.
-
FIG. 1 is a diagram of a perspective view of an embodiment of an interactive display system, according to an embodiment of the present disclosure. -
FIG. 2 is a diagram of an exploded view of the embodiment of the interactive display system ofFIG. 1 , according to an embodiment of the present disclosure. -
FIG. 3 is a flowchart of an embodiment of a method depicting how an embodiment of an interactive display system can interact with an external physical object, according to an embodiment of the present disclosure. -
FIG. 4 is a diagram of a representative user interface corresponding to an object as displayed by an embodiment of an interactive display system, according to an embodiment of the present disclosure. -
FIG. 5 is a diagram of another representative user interface corresponding to an object as displayed by an embodiment of an interactive display system, according to an embodiment of the present disclosure. -
FIG. 6 is a rudimentary block diagram of the embodiment of the interactive display system, according to an embodiment of the present disclosure. - In the following detailed description of exemplary embodiments of the present disclosure, reference is made to the accompanying drawings that form a part thereof, and in which is shown by way of illustration specific exemplary embodiments in which the subject matter of the present disclosure may be practiced. These embodiments are described in sufficient detail to enable those skilled in the art to practice the subject matter of the present disclosure. Other embodiments may be utilized, and logical, mechanical, electrical, electro-optical, software/firmware and other changes may be made without departing from the spirit or scope of the present disclosure. The following detailed description is, therefore, not to be taken in a limiting sense, and the scope of the present disclosure is defined only by the appended claims.
-
FIGS. 1 and 2 show an embodiment of a display system, such asinteractive display system 10, according to an embodiment of the present disclosure. Theinteractive display system 10 is depicted inFIGS. 1 and 2 as embodied in a table 12, with the table surface functioning as thedisplay surface 14. Multiple users, each having his or her own data-receiving device D1 through Dn, can view and access thedisplay surface 14 by sitting around the table 12. It is noted that the physical embodiment of thedisplay system 10 can take any number of forms other than that of a table. Theinteractive display system 10 may be more generally referred to as an electronic device. - The
interactive display system 10 can include adisplay surface 14, a digital light processor (DLP) 16 or other projection or display device, a touch-sensitive surface 36, and acontroller 18. The touch-sensitive surface 36 is typically disposed over thedisplay surface 14, such that the devices D1-Dn would be disposed through thesurface 36 and onto thedisplay surface 14. According to one embodiment, thecontroller 18 is configured to generate electrical image signals indicative of viewable images, such as computer programs, movie videos, video games, Internet web pages, and so on, which are provided for generation to theDLP 16. TheDLP 16, in response to the electrical signals, generates digital optical (viewable) images that are viewable on thedisplay surface 14. Thecontroller 18 may receive data and other information to generate the image signals from various sources, such as hard disk drives, compact discs (CD's) or digital versatile discs (DVD's) 32, computer servers, local and/or wide area networks, the Internet, and so on. Thecontroller 18 may also provide additional output in the form of projected images from anauxiliary projector 20 and sound from aspeaker 22. - As shown in
FIGS. 1 and 2 , theinteractive display system 10 can include a variety of other components, such as aprojector 20, configured to simultaneously project the content of thedisplay surface 14 onto a wall-mounted screen, for instance. Alternatively, theprojector 20 may display content that is different than the content displayed on thedisplay surface 14. Theinteractive display system 10 may also include one ormore speakers 22 for producing audible sounds that accompany the visual content on thedisplay surface 14. Further, theinteractive display system 10 may include one or more devices for storing and retrieving data, such as a CD or DVD drive, hard disk drives, flash memory ports, and so on. - While the
interactive display system 10 is described above in the context of a display device including aDLP 16, the systems and methods of embodiments of the present disclosure are not limited to displaying information to adisplay surface 14 using aDLP 16. Rather, any number of panel display devices having addressable pixels may be used, such as a liquid crystal display (LCD), a plasma display, or another type of flat panel display. TheDLP 16 may also assume a variety of forms in differing embodiments of the present disclosure. - In general, the
DLP 16 generates a viewable digital image on thedisplay surface 14 by projecting a plurality of pixels of light onto thedisplay surface 14. Each viewable image may be made up of millions of pixels, a fewer number pixels, or a greater number of pixels. Each pixel is individually controlled and addressable by theDLP 16 to have a certain color (or gray-scale). The combination of many light pixels of different colors (or gray-scales) on thedisplay surface 14 generates a viewable image or “frame.” Continuous video and graphics may be generated by sequentially combining frames together, as in a motion picture. - One embodiment of a
DLP 16 includes a digital micro-mirror device (DMD) configured to vary the projection of light pixels onto thedisplay surface 14. Other embodiments could include, but are in no way limited to, diffractive light devices (DLD), as well as non-projection-type displays, such as plasma displays, and liquid crystal displays (LCD's). Additionally, other display technologies could be substituted for the DLP (16) without varying from the scope of the present system and method. - The touch-
sensitive surface 36 may in one embodiment of the present disclosure be present to provide the users of thesystem 10 with a form of user input in addition to and/or in lieu of the devices D1-Dn. The touch-sensitive surface 36 is depicted inFIG. 1 as being separate from thedisplay surface 14, but in another embodiment, it may be integrated with or substitute for thedisplay surface 14. The touch-sensitive surface 36 is sensitive to the placement of physical objects, such as the fingertips of users, and so on, on thedisplay surface 14. The touch-sensitive surface 36 may employ any of a number of different types of touch-detection technology, such as resistive, capacitive, infrared, optical wave, and/or other types of touch-detection technologies. In one embodiment, a back-side imaging camera renders thesurface 36 touch sensitive by detecting user input on thesurface 36. -
FIG. 3 shows amethod 300 for using theinteractive display system 10 that has been described in conjunction with a physical object external to thesystem 10, according to an embodiment of the present disclosure. At least some parts of themethod 300 may be implemented as parts of a computer program stored on a computer-readable medium for execution by thesystem 10. For example, the computer program parts may be software objects, subroutines, routines, and so on. The computer-readable medium may be a removable or a non-removable medium, and a volatile or a non-volatile medium. The medium may be a semiconductor medium, such as a memory, a magnetic medium, such as a hard disk drive or a floppy disk, and/or an optical medium, such as a CD or a DVD. - The physical object may be an electronic or a non-electronic device. For example, the object may be an electronic device like a cellular phone, a remote control, or another type of electronic device. The object may also be a game piece, like a chess or checkers piece, or another type of non-electronic device. It should be recognized that embodiments of the present disclosure are not limited to the type of physical object that can be used in conjunction with the
method 300 ofFIG. 3 . - The
interactive display system 10 detects the presence of the physical object in proximate vicinity to the system 10 (302). That is, thesystem 10 detects the placement of the physical object on a surface of thesystem 10. In one embodiment, the object may be placed on the display surface 14 (or the surface 36) of thesystem 1 0, such that the touch-sensitive surface 36 thereof detects the presence of the object. In another embodiment, the object may have an optical interfacing mechanism, electrically connective connector, or a radio frequency (RF) transceiver that allows thesystem 10 to detect the presence of the object, upon placement of the object on a surface of thesystem 10, which in such an embodiment may or may not be a touch-sensitive surface. In still another embodiment, the object may have a tag or a marking, such as a bar code, that when placed in appropriate proximity to thesystem 10 allows thesystem 10 to detect and retrieve the marking, upon placement of the object on a surface of thesystem 10, which in such an embodiment may or may not be a touch-sensitive surface. - The detection in 302 is accomplished without the
interactive display system 10 receiving a video signal from the physical object. This is not to say that the object itself cannot provide such a video signal to thesystem 10, but only that the video signal is not used by thesystem 10 to detect the object. For example, a personal digital assistant (PDA) device may be capable of providing a video signal to thesystem 10. However, such a video signal is not the manner or mechanism by which thesystem 10 detects the presence of this physical object. - Furthermore, the object may have an infrared port compatible with the Infrared Data Association (IrDA) standard. Placing the infrared port of the object in appropriate proximity to a corresponding port of the
system 10 allows thesystem 10 to detect presence of the object. As another example, a connector of the object may be inserted into a corresponding connector of thesystem 10 to allow thesystem 10 to detect presence of the object. As another example, the object may emit RF signals in accordance with a proprietary or non-proprietary standard, such as Bluetooth, 802.11 a/b/g, and so on. Thesystem 10 detects these RF signals in order to detect presence of the object. As another example, a bar code or other tag on the object may be placed in appropriate proximity to a scanning mechanism of thesystem 10 to allow thesystem 10 to detect presence of the object. - Furthermore, the
interactive display system 10 detects the identity of the object (304), such as the type or class of the physical object that the presence of which has been detected. For example, thesystem 10 may detect the identity as a particular kind of gaming piece, such as a particular kind of chess piece, or as a particular kind of mobile phone. As another example, thesystem 10 may be able to detect the particular mobile phone, being able to distinguish between two mobile phones of the same type. Detection of the identity of the object may be accomplished similarly as detection of the presence itself of the object is accomplished. - For instance, the
system 10 may be able to discern the identity of the object based on the footprint of the object on the display surface 14 (or on the surface 36), as detected by the touch-sensitive surface 36. Thesystem 10 may alternatively receive communication from the object indicating its identity, such as its type, via infrared communication, wireless communication, direct wired communication, or optically, such as via a camera detecting such identity, such as by using a camera. Thesystem 10 may alternatively still determine the identity of the object by detecting and interpreting a marking on the object, such as a bar code. - The
interactive display system 10 next displays a user interface corresponding to the object (306). Thesystem 10 may display the user interface using theDLP 16, such that the user interface is viewable on thedisplay surface 14. Alternatively, or additionally, thesystem 10 may display the user interface using theauxiliary projector 20, such that one or more users are able to view the user interface as projected on a wall, screen, or other surface external to thesystem 10. - The user interface, and thus the object itself, may in one embodiment be password protected in one or more forms. For instance, to be able to use the user interface, the proper password would be first entered in by a user. Alternatively, to be able to change the user interface, the proper password would be first entered.
- The user interface displayed may take a variety of different forms. Where the physical object itself has a built-in user interface, the
display system 10 may display a user interface that is substantially identical to the built-in user interface of the object. For example, cellular phones have a soft-type user interface in that menu items are displayed on built-in screens of the phones. Such menu items can be duplicated in the interface displayed by thedisplay system 10. As another example, remote controls typically have a hard-type user interface in that there are a number of physical buttons on the remote controls. Such physical buttons can be virtually duplicated within the interface displayed by thedisplay system 10. - Furthermore, the
display system 10 may display a user interface that provides greater functionality than the user interface of the physical object itself. For example, the physical object may not have a user interface. Therefore, any user interface for the object displayed by thedisplay system 10 inherently has greater functionality than that which the object itself can provide. For instance, a game piece, such as a chess piece, usually does not have a user interface. Therefore, thedisplay system 10 may show as the user interface for this chess piece, the type of chess piece, describe the rules on how and/or where the chess piece can be moved, where it is in relation to other chess pieces on a chess board, and so on. Where the game piece is a role-playing game (RPG) piece, thedisplay system 10 may further show as the user interface how the character represented by the piece can be upgraded or enhanced, and so on. - As another example, rudimentary cell phones may not have built-in address book functionality. Therefore, the
display system 10 may show an extended user interface that does have built-in address book functionality for such a cell phone. Using such a cell phone through thedisplay system 10 thus affords the user with the ability to leverage functionality that is otherwise not provided by the cell phone itself. -
FIGS. 4 and 5 show sample user interfaces that theinteractive display system 10 may display for different types of physical objects, according to varying embodiments of the present disclosure. InFIG. 4 , theuser interface 400 corresponds at least substantially identically to the hard-type user interface of a typical remote control. The remote control may have physical buttons corresponding to selecting DVD, CD, or TV, increasing and lowering the volume, changing the channel, as well as thenumbers 0 through 9. Theuser interface 400 displayed by thedisplay system 10 may therefore, in one embodiment, be substantially similar in function to the physical buttons of the remote control. However, the buttons of theuser interface 400 are virtual buttons, since they are displayed by thedisplay system 10, and do not correspond to actual physical buttons or controls of thedisplay system 10. - In
FIG. 5 , theuser interface 500 extends the functionality provided by a rudimentary cellular phone. The cellular phone in question may not have address book functionality, and may have a small or no display. By comparison, theuser interface 500 has twocolumns 502 and 504, whereas the cellular phone may only be able to show one such column of information at a time, if any, and thus provides for a larger user interface than which can be provided by the cellular phone itself. Furthermore, whereas the phone does not have address book functionality, theuser interface 500 provides such address book functionality. For instance, by selecting “address book” in theleft column 502, the user is presented with a scrollable list of names in the right column 504 from which he or she can select a name having a corresponding phone number to make a phone call. - Referring back to the
method 300 ofFIG. 3 , once the user interface corresponding to the physical object has been display, user input is then received by theinteractive display system 10 that represents user interaction with the user interface (308). For example, the user may push virtual buttons of the user interface, or select menu items of the user interface, by using the touch-sensitive surface 36 or the devices D1-Dn of thedisplay system 10. The user input corresponding to user interaction with the user interface may be processed by thedisplay system 10 in at least one of two different ways. - First, the user input may be communicated from the
display system 10 back to the physical object (310). As such, thedisplay system 10 is effectively acting as an input device and as a display device for the object. For example, rather than constraining him or herself to a small display with small controls of a physical object like a cellular phone, the user can instead use the larger display and the larger controls of thedisplay system 10 to interact with physical object. As the physical object changes the user interface in response to the user input received, such changes are mirrored on the version of the user interface for the object displayed by thedisplay system 10. - Second, the user input may be used by the
display system 10 in a manner other than conveying the input to the physical object (312). As such, thedisplay system 10 can subsume the functionality provided by the physical object, providing that functionality itself without employing the physical object. The electronic device is thus responsive to user interaction with the user interface independent of the object. - For example, in the case of a remote control, the
display system 10 in response to user selection of the virtual controls of the user interface provided by thesystem 10 may appropriately send infrared signals corresponding to those controls without using the remote control itself. Thedisplay system 10 may have infrared emitters by which it can send infrared signals to audio/visual components like DVD and CD players without use of the remote control to send the infrared signals. - As another example, in the case of a cellular phone, if the user attempts to place a phone call through the user interface provided by the
system 10 and corresponding to the phone, the call may be placed by thedisplay system 10 and not by the cellular phone. For instance, thesystem 10 may place the call through a standard phone line to which thesystem 10 is coupled, or over the Internet using voice-over-Internet Protocol (VolP) technology. As such, a telephone handset may be part of thesystem 10 for use by the user in making the phone call, or thesystem 10 may have an integrated microphone to detect speech by the user, and use thespeaker 22 to emit sound to the user. Alternatively, the device itself can perform the functionality; for instance, the cellular phone may make the phone call initiated through the user interfaced provided by thesystem 10 and corresponding to the phone. -
FIG. 6 shows a rudimentary block diagram of an embodiment of thedisplay system 10, according to an embodiment of the present disclosure. Thedisplay system 10 is depicted as including a touch-sensitive tabletop surface 602, adisplay mechanism 604, an object-control mechanism 606, aprojection mechanism 608, and acommunication mechanism 610. As can be appreciated by those of ordinary skill within the art, thesystem 10 may have other mechanism, in addition to and/or in lieu of those depicted inFIG. 6 . Each of themechanisms - The touch-
sensitive tabletop surface 602 is substantially transparent and receptive to user touch input. Thesurface 602 is substantially transparent in that it is able to pass light therethrough. Substantial transparency can include complete transparency, where all such light passes through thesurface 602, as well as partial opacity, where some but not all of the light passes through thesurface 602. Substantial transparency does not include complete opacity, where little or none of the light passes through thesurface 602. Thesurface 602 may include thedisplay surface 14 and thesurface 36 that have been described. Thedisplay mechanism 604 is situated underneath or above thesurface 602, and is capable of displaying information to one or more users. Thedisplay mechanism 604 may include theDLP 16 that has been described. - The object-
control mechanism 606 detects presence of an external physical object that is in proximate vicinity to thedisplay system 10. The object-control mechanism 606 also displays a user interface corresponding to the physical object via at least thedisplay mechanism 604, and receives user input via at least the touch-sensitive tabletop surface 602 representing user interaction with the user interface. Themechanism 606 may include thecontroller 18 that has been described, as well as additional user-input devices, such as the devices D1-Dn. Themechanism 606 may further download information regarding the physical object for display via thedisplay mechanism 604, examples of such information are presented later in the detailed description. - The
projection mechanism 608 projects information for external display to the user. Theprojection mechanism 608 may include theauxiliary projector 20. The object-control mechanism 606 may display the user interface via theprojection mechanism 608, either in addition to and/or in lieu of displaying the user interface via thedisplay mechanism 604. - The
communication mechanism 610 is to detect the identity and/or presence of the physical object, and/or communicate with the physical object. For example, thecommunication mechanism 610 may include a transceiver to receive RF communication from and transmit RF communication to the object. Thecommunication mechanism 610 may alternatively include a barcode reader to detect a barcode on the physical object. Where thecommunication mechanism 610, or in addition to using thecommunication mechanism 610, the object-control mechanism may detect the presence of a physical object via the touch-sensitive tabletop surface 602. - As can be appreciated by those of ordinary skill within the art, the embodiments described herein can be extended and used in a variety of different applications. For example, the user interface that has been described as being displayed for a detected object may be stored on a different electronic device than that which detects the object. Such a user interface may be stored at a server that is communicatively coupled to the electronic device over the Internet or another type of network. The user is thus able to modify or update the user interface by appropriately logging onto the server, as can be appreciated by those of ordinary skill within the art.
- The user interface has been described such that a user is able to interact with the user interface, and the results of such user interactions are directly reflected in the object itself. For instance, the address book of a cellular phone can be modified by the user interacting with the user interface displayed for the cellular phone. The user interface that is displayed may also have different formats among which the user can select, such as different or custom skins, fonts, colors, and so on. The user interface may further may be able to be zoomed in, and out, for increased readability, such as by users having sight-oriented disabilities.
- The user interface may also be password protected, as has been noted above, such that certain aspects, attributes, or properties of the object can be modified if the proper password has been entered, while other aspects, attributes, or properties may not be modified without the proper password, or certain portions of the user interface are viewable by a user without entry of the proper password, while other portions of the user interface are not viewable by a user without entry of the proper password. Indeed, the electronic device itself, as well as the object, may have different passwords, such that both passwords may have to be entered in order to view the user interface for the object. Different passwords may correspond to different privilege levels, such that some users can interact with a given object in some limited way, and other users can interact with the object more fully.
- The specific types of objects that have been described herein are just examples of the kinds of objects that can be utilized by different embodiments. Other objects and other types and kinds of objects can also be utilized, as can be appreciated by those of ordinary skill within the art, for the same or different types of usage applications and scenarios. For example, a further type of object is a car key, or fob, which upon detection brings up as the user interface for this object the maintenance records for the vehicle with which the key or fob is associated. The electronic device may in such an embodiment be even able to communicate with the vehicle, wirelessly or in a wired manner, to acquire the current information for the vehicle. Such information may include mileage, self-diagnostic information, music stored in the vehicle, navigation information, data transferred to the vehicle at another location, and so on. The user may therefore be able to store and manipulate this information at the electronic device itself, and the electronic device may be programmed to advise and allow appropriate maintenance to be scheduled with local automotive service businesses.
- As another example, another type of object is a book. The electronic device may be able to detect the title of the book, such as by reading the ISBN bar code thereof. The text of the book may then be downloaded by the electronic device from a server connected to the Internet or another network. The user would then be able to electronically search the text of the book. Furthermore, notes or comments from others regarding the book may be downloaded, for viewing by the user. Extra content regarding the book may further be downloaded, such as additional information regarding the author thereof, forums in which the user can participate in discussions regarding the book, alternate endings of the book, and so on.
- Therefore, it is noted that, although specific embodiments have been illustrated and described herein, it will be appreciated by those of ordinary skill in the art that any arrangement is calculated to achieve the same purpose may be substituted for the specific embodiments shown. This application is intended to cover any adaptations or variations of the present disclosure. Therefore, it is manifestly intended that this present disclosure be limited only by the claims and equivalents thereof.
Claims (34)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/092,444 US7639231B2 (en) | 2005-03-29 | 2005-03-29 | Display of a user interface |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US11/092,444 US7639231B2 (en) | 2005-03-29 | 2005-03-29 | Display of a user interface |
Publications (2)
Publication Number | Publication Date |
---|---|
US20060230192A1 true US20060230192A1 (en) | 2006-10-12 |
US7639231B2 US7639231B2 (en) | 2009-12-29 |
Family
ID=37084372
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/092,444 Expired - Fee Related US7639231B2 (en) | 2005-03-29 | 2005-03-29 | Display of a user interface |
Country Status (1)
Country | Link |
---|---|
US (1) | US7639231B2 (en) |
Cited By (17)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080167913A1 (en) * | 2007-01-05 | 2008-07-10 | Microsoft Corporation | Delivering content based on physical object characteristics |
US20080198138A1 (en) * | 2007-02-20 | 2008-08-21 | Microsoft Corporation | Identification of devices on touch-sensitive surface |
US20080281851A1 (en) * | 2007-05-09 | 2008-11-13 | Microsoft Corporation | Archive for Physical and Digital Objects |
US20080291283A1 (en) * | 2006-10-16 | 2008-11-27 | Canon Kabushiki Kaisha | Image processing apparatus and control method thereof |
US20090134332A1 (en) * | 2007-11-27 | 2009-05-28 | Thompson Jason R | Infrared Encoded Objects and Controls for Display Systems |
US20100182438A1 (en) * | 2009-01-20 | 2010-07-22 | Soiba Mohammed | Dynamic user interface for remote control of camera |
US20100237983A1 (en) * | 2009-03-20 | 2010-09-23 | Xerox Corporation | System and method for using concealed infrared identifiers to control motion-detecting social computing devices |
US20110145724A1 (en) * | 2009-12-15 | 2011-06-16 | Acer Incorporated | Multi-Screen Electronic Device and Reference Material Display Method Thereof |
US20110273368A1 (en) * | 2005-06-24 | 2011-11-10 | Microsoft Corporation | Extending Digital Artifacts Through An Interactive Surface |
US20120062490A1 (en) * | 2010-07-08 | 2012-03-15 | Disney Enterprises, Inc. | Game Pieces for Use with Touch Screen Devices and Related Methods |
US20120068812A1 (en) * | 2010-09-17 | 2012-03-22 | Kazuyuki Yamamoto | Information processing apparatus, information processing system, information processing method, and program |
US20130038548A1 (en) * | 2011-08-12 | 2013-02-14 | Panasonic Corporation | Touch system |
US20130117135A1 (en) * | 2009-11-27 | 2013-05-09 | Compurants Limited | Multi-user food and drink ordering system |
US8565843B1 (en) * | 2009-05-13 | 2013-10-22 | Lugovations LLC | Portable device shell |
US10293247B2 (en) | 2010-07-08 | 2019-05-21 | Disney Enterprises, Inc. | Physical pieces for interactive application using touch screen devices |
US10346529B2 (en) * | 2008-09-30 | 2019-07-09 | Microsoft Technology Licensing, Llc | Using physical objects in conjunction with an interactive surface |
US11490496B1 (en) * | 2021-09-09 | 2022-11-01 | Power Mos Electronic Limited | Interactive display system |
Families Citing this family (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2175348A1 (en) * | 2008-10-07 | 2010-04-14 | Sensitive Object | Tactile man-machine interface with data communication interface |
WO2015185629A2 (en) | 2014-06-06 | 2015-12-10 | Lego A/S | Interactive game apparatus and toy construction system |
BR112017006603B1 (en) | 2014-10-02 | 2022-10-18 | Lego A/S | GAME SYSTEM, DETECTION DEVICE FOR DETECTING THE PRESENCE OF AN IDENTIFICATION ELEMENT WITHIN A DETECTION AREA, AND METHOD FOR OPERATING A GAME SYSTEM |
US10306193B2 (en) | 2015-04-27 | 2019-05-28 | Microsoft Technology Licensing, Llc | Trigger zones for objects in projected surface model |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020050983A1 (en) * | 2000-09-26 | 2002-05-02 | Qianjun Liu | Method and apparatus for a touch sensitive system employing spread spectrum technology for the operation of one or more input devices |
US20060161846A1 (en) * | 2002-11-29 | 2006-07-20 | Koninklijke Philips Electronics N.V. | User interface with displaced representation of touch area |
US20060209016A1 (en) * | 2005-03-17 | 2006-09-21 | Microsoft Corporation | Computer interaction based upon a currently active input device |
US7397464B1 (en) * | 2004-04-30 | 2008-07-08 | Microsoft Corporation | Associating application states with a physical object |
2005-03-29: US application US11/092,444 filed; granted as US7639231B2 (en); status: not active, Expired - Fee Related
Cited By (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110273368A1 (en) * | 2005-06-24 | 2011-11-10 | Microsoft Corporation | Extending Digital Artifacts Through An Interactive Surface |
US10044790B2 (en) * | 2005-06-24 | 2018-08-07 | Microsoft Technology Licensing, Llc | Extending digital artifacts through an interactive surface to a mobile device and creating a communication channel between a mobile device and a second mobile device via the interactive surface |
US10318076B2 (en) | 2006-10-16 | 2019-06-11 | Canon Kabushiki Kaisha | Image displaying apparatus with changed menu based on detection of mobile information terminal placed thereon |
US20080291283A1 (en) * | 2006-10-16 | 2008-11-27 | Canon Kabushiki Kaisha | Image processing apparatus and control method thereof |
US9280776B2 (en) * | 2007-01-05 | 2016-03-08 | Microsoft Technology Licensing, Llc | Delivering content based on physical object characteristics |
US20080167913A1 (en) * | 2007-01-05 | 2008-07-10 | Microsoft Corporation | Delivering content based on physical object characteristics |
US20080198138A1 (en) * | 2007-02-20 | 2008-08-21 | Microsoft Corporation | Identification of devices on touch-sensitive surface |
US8063888B2 (en) * | 2007-02-20 | 2011-11-22 | Microsoft Corporation | Identification of devices on touch-sensitive surface |
US8199117B2 (en) * | 2007-05-09 | 2012-06-12 | Microsoft Corporation | Archive for physical and digital objects |
US20080281851A1 (en) * | 2007-05-09 | 2008-11-13 | Microsoft Corporation | Archive for Physical and Digital Objects |
US20090134332A1 (en) * | 2007-11-27 | 2009-05-28 | Thompson Jason R | Infrared Encoded Objects and Controls for Display Systems |
US10346529B2 (en) * | 2008-09-30 | 2019-07-09 | Microsoft Technology Licensing, Llc | Using physical objects in conjunction with an interactive surface |
US20100182438A1 (en) * | 2009-01-20 | 2010-07-22 | Soiba Mohammed | Dynamic user interface for remote control of camera |
US20100237983A1 (en) * | 2009-03-20 | 2010-09-23 | Xerox Corporation | System and method for using concealed infrared identifiers to control motion-detecting social computing devices |
US8565843B1 (en) * | 2009-05-13 | 2013-10-22 | Lugovations LLC | Portable device shell |
US20130117135A1 (en) * | 2009-11-27 | 2013-05-09 | Compurants Limited | Multi-user food and drink ordering system |
US20110145724A1 (en) * | 2009-12-15 | 2011-06-16 | Acer Incorporated | Multi-Screen Electronic Device and Reference Material Display Method Thereof |
US9274641B2 (en) * | 2010-07-08 | 2016-03-01 | Disney Enterprises, Inc. | Game pieces for use with touch screen devices and related methods |
US10293247B2 (en) | 2010-07-08 | 2019-05-21 | Disney Enterprises, Inc. | Physical pieces for interactive application using touch screen devices |
US20120062490A1 (en) * | 2010-07-08 | 2012-03-15 | Disney Enterprises, Inc. | Game Pieces for Use with Touch Screen Devices and Related Methods |
US8766766B2 (en) * | 2010-09-17 | 2014-07-01 | Sony Corporation | Information processing apparatus, information processing system, information processing method, and program |
US20120068812A1 (en) * | 2010-09-17 | 2012-03-22 | Kazuyuki Yamamoto | Information processing apparatus, information processing system, information processing method, and program |
US20130038548A1 (en) * | 2011-08-12 | 2013-02-14 | Panasonic Corporation | Touch system |
US11490496B1 (en) * | 2021-09-09 | 2022-11-01 | Power Mos Electronic Limited | Interactive display system |
Also Published As
Publication number | Publication date |
---|---|
US7639231B2 (en) | 2009-12-29 |
Similar Documents
Publication | Title |
---|---|
US7639231B2 (en) | Display of a user interface |
US8502789B2 (en) | Method for handling user input in an interactive input system, and interactive input system executing the method |
JP5535626B2 (en) | Private screen automatically distributed along the shop window |
CN102405453B (en) | Context-based state change for an adaptive input device |
US8421824B2 (en) | Hand image feedback |
CN116203731A (en) | Matching of content to a spatial 3D environment |
US20090164930A1 (en) | Electronic device capable of transferring object between two display units and controlling method thereof |
US20110154233A1 (en) | Projected display to enhance computer device use |
US20110087974A1 (en) | User interface controls including capturing user mood in response to a user cue |
CN111615682B (en) | Method and apparatus for selecting a presentation mode based on a viewing angle |
US20100070898A1 (en) | Contextual window-based interface and method therefor |
JP2007089732A (en) | Input device |
CN100432912C (en) | Mobile electronic apparatus, display method, program and graphical interface thereof |
CN101384979A (en) | Method for confirming touch input |
US20110191699A1 (en) | System and method of interfacing interactive content items and shared data variables |
US20080316145A1 (en) | Virtual shadow for physical object placed on surface |
CN102693045A (en) | Image generation device, projector, and image generation method |
KR20170057823A (en) | Method and electronic apparatus for touch input via edge screen |
US20140347264A1 (en) | Device and method for displaying an electronic document using a double-sided display |
TW201025085A (en) | Keyboard formed from a touch display, method of endowing a touch display with a keyboard function, and a device with functions of keyboard or writing pad input and image output |
US8108008B2 (en) | Electronic apparatus and controlling component and controlling method for the electronic apparatus |
TWI614706B (en) | Operation method of portable electronic apparatus |
Sciarretta et al. | Elderly and tablets: considerations and suggestions about the design of proper applications |
Buerger | Types of public interactive display technologies and how to motivate users to interact |
KR20110119639A (en) | Mouse |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: PARRY, TRAVIS; BLYTHE, MICHAEL M.; REEL/FRAME: 016431/0243. Effective date: 20050328 |
| STCF | Information on status: patent grant | Free format text: PATENTED CASE |
| FPAY | Fee payment | Year of fee payment: 4 |
| FPAY | Fee payment | Year of fee payment: 8 |
| FEPP | Fee payment procedure | Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| LAPS | Lapse for failure to pay maintenance fees | Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
| STCH | Information on status: patent discontinuation | Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
| FP | Lapsed due to failure to pay maintenance fee | Effective date: 20211229 |