US20180024420A1 - Projector capable of determining how a moving object interacts with a projected image - Google Patents
- Publication number
- US20180024420A1 (application US15/724,214)
- Authority
- US
- United States
- Prior art keywords
- visible image
- projector
- image
- light
- visible
- Prior art date
- 2013-08-15
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/48—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus
- G03B17/54—Details of cameras or camera bodies; Accessories therefor adapted for combination with other photographic or optical apparatus with projector
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B33/00—Colour photography, other than mere exposure or projection of a colour film
- G03B33/10—Simultaneous recording or projection
- G03B33/12—Simultaneous recording or projection using beam-splitting or beam-combining systems, e.g. dichroic mirrors
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3141—Constructional details thereof
- H04N9/315—Modulator illumination systems
- H04N9/3164—Modulator illumination systems using multiple light sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3179—Video signal processing therefor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/12—Picture reproducers
- H04N9/31—Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]
- H04N9/3191—Testing thereof
- H04N9/3194—Testing thereof including sensor feedback
-
- H04N13/0271—
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
Description
- This application is a continuation of U.S. patent application Ser. No. 13/968,267, filed on Aug. 15, 2013 and titled PROJECTOR FOR PROJECTING VISIBLE AND NON-VISIBLE IMAGES (“the '267 Application”), issued as U.S. Pat. No. 9,778,546 on Oct. 3, 2017. The entire disclosure of the '267 Application is hereby incorporated herein by reference.
- Projectors are conventionally designed to project a visible image in which the projected light has frequencies within the visible spectrum. For instance, conventional projectors are capable of projecting an image composed of multiple pixels, each emitted by a distinct pixel unit. For each pixel unit, the projector includes multiple light-emitting diodes (LEDs). For instance, a pixel unit might typically include a red LED, a green LED, and a blue LED. The projected image passes through optics such that the visible image is focused on a surface at a distance from the projector. Projectors are not conventionally used to project images outside of the visible spectrum.
- Embodiments described herein relate to a projector that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but an example is to provide depth information regarding physical item(s) interacting with the projected visible image.
- The projector includes multiple projecting units (e.g., one for each pixel to be displayed). Each projecting unit includes light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. The optics might include a portion that directs a reflected portion of the non-visible image (and perhaps also a reflected portion of the visible image) to a camera for capture of the reflected image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using the reflected portion of that non-visible image.
- This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- In order to describe the manner in which the above-recited and other advantages and features can be obtained, a more particular description of various embodiments will be rendered by reference to the appended drawings. Understanding that these drawings depict only sample embodiments and are not therefore to be considered to be limiting of the scope of the invention, the embodiments will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
- FIG. 1 abstractly illustrates a projection system that includes a projector that includes projecting units, each for projecting a pixel of a visible image;
- FIG. 2 illustrates a flowchart of a method for projecting an image;
- FIG. 3 illustrates a more detailed abstract diagram of a projection system, and represents an example of the projection system of FIG. 1;
- FIG. 4 abstractly illustrates a computing system that may be used to implement aspects described herein when using software;
- FIG. 5 illustrates a first physical embodiment in which the projection system is a projector mounted to a ceiling;
- FIG. 6A illustrates a side view of a second physical embodiment in which the projection system is incorporated into a cam light system; and
- FIG. 6B illustrates a bottom view of the cam light system of FIG. 6A.
- The principles described herein relate to a projection system, or a projector, that projects a visible image as well as a non-visible image. The non-visible image might be used for any purpose, but an example is to provide depth information regarding physical item(s) interacting with the projected visible image.
- The projection system includes multiple projecting units (e.g., one for each pixel to be displayed). Each projecting unit includes light-emitting elements configured to emit light in the visible spectrum. Some or all of those projecting units might also include an emitting element for emitting light in the non-visible spectrum so as to collectively emit a non-visible image. Optics may be positioned to project the visible image and the non-visible image. The optics might include a portion that directs a reflected portion of the non-visible image (and perhaps also a reflected portion of the visible image) to a camera for capture of the reflected image. A depth sensing module detects depth of surfaces within the scope of the non-visible image using the reflected portion of that non-visible image.
- FIG. 1 abstractly illustrates a projection system 100 that includes a projector that includes projecting units 101, each for projecting a pixel of a visible image. For instance, the projecting units 101 are illustrated as including three projecting units 101A, 101B and 101C, although the ellipses 101D abstractly represent that the projection system 100 would typically include many more projecting units 101, as modern projectors typically can project images of hundreds of thousands, or even millions, of pixels, and future projectors may be capable of generating even more. However, to simplify the description, only three projecting units 101A, 101B and 101C (corresponding to just three pixels) are illustrated.
- Each projecting unit has multiple light-emitting elements that are configured to emit in the visible spectrum of electromagnetic wavelengths that are visible to the human eye. For instance, projecting unit 101A is illustrated as including light-emitting elements 102A in the form of two light-emitting elements 102Aa and 102Ab, although the ellipses 102Ac represent that there may be other numbers of light-emitting elements 102A within the projecting unit 101A that are also capable of emitting light in the visible spectrum.
- The same is true of the other projecting units in the projecting units 101. For instance, projecting unit 101B is illustrated as including light-emitting elements 102B in the form of two light-emitting elements 102Ba and 102Bb, although the ellipses 102Bc represent that there may be other numbers of light-emitting elements 102B within the projecting unit 101B that are also capable of emitting light in the visible spectrum. Furthermore, projecting unit 101C is illustrated as including light-emitting elements 102C in the form of two light-emitting elements 102Ca and 102Cb, although the ellipses 102Cc represent that there may be other numbers of light-emitting elements 102C within the projecting unit 101C that are also capable of emitting light in the visible spectrum. The same may be true of the other projecting units 101 that are not illustrated and which are represented by the ellipses 101D.
- In one embodiment, the light-emitting elements 102 within each of the projecting units 101 constitute a red light-emitting element, a green light-emitting element, and a blue light-emitting element. For instance, light-emitting element 102Aa of the projecting unit 101A might be a red Light-Emitting Diode (LED), the light-emitting element 102Ab of the projecting unit 101A might be a green LED, and another light-emitting element (represented by ellipses 102Ac) of the projecting unit 101A might be a blue LED.
- Some or all of the projecting units 101 comprise further emitting elements 103 that emit light outside of the visible spectrum. The group of projecting units 101 that are capable of doing this is sometimes referred to herein as a “collection” of the projecting units 101. The collection could be all of the projecting units 101, or just a subset of the projecting units 101. The use of the term “collection” should not be construed as implying that such projecting units are collected together, as the collection may be distributed in any manner amongst the total number of projecting units.
- In FIG. 1, the projecting unit 101A is illustrated as including an emitting element 103A that emits light outside of the visible spectrum. Likewise, projecting unit 101B is illustrated as including emitting element 103B that emits light outside of the visible spectrum. Projecting unit 101C is not shown as including a corresponding emitting element 103 that emits outside of the visible spectrum, emphasizing that the broadest principles described herein do not require that all of the projecting units 101 have an emitting element 103 that emits light outside of the visible spectrum. In one embodiment, each of the emitting elements 103 might emit infra-red light.
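- For readers who prefer code, the per-pixel arrangement just described can be summarized with a small data model. The sketch below is purely illustrative and is not part of the disclosure; the class and field names are invented, and drive levels are assumed to be normalized to the range 0 to 1.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProjectingUnit:
    """One per-pixel projecting unit: red, green, and blue visible
    emitters plus an optional non-visible (e.g., infra-red) emitter."""
    red: float = 0.0       # drive level of the red LED (0..1)
    green: float = 0.0     # drive level of the green LED (0..1)
    blue: float = 0.0      # drive level of the blue LED (0..1)
    infrared: Optional[float] = None   # None if the unit lacks an IR emitter

units = [
    ProjectingUnit(0.9, 0.2, 0.1, infrared=0.8),  # like units 101A and 101B
    ProjectingUnit(0.3, 0.7, 0.5),                # like unit 101C: no IR emitter
]

# The "collection" is simply the subset of units able to emit non-visible
# light; it need not be contiguous within the pixel array.
collection = [u for u in units if u.infrared is not None]
```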
- The light-emitting elements 102 thus collectively emit a visible image 151, represented abstractly as an arrow. Likewise, the collection of emitting elements 103 thus emits a non-visible image 152, as represented by another arrow.
- FIG. 2 illustrates a flowchart of a method 200 for projecting an image. As the method 200 may be performed in the context of the projection system 100, FIGS. 1 and 2 will be described in an integrated fashion with frequent reference to each other.
- The method 200 includes emitting a portion of a visible image from each projecting unit to thereby generate a visible image (act 201). In the context of FIG. 1, this has already been described with respect to the light-emitting elements 102 of the respective projecting units 101 emitting a visible image 151. The method 200 also includes emitting a portion of a non-visible image from each of at least some of the projecting units (act 202). In the context of FIG. 1, this has already been described with respect to the emitting elements 103 of the collection of projecting units 101 emitting a non-visible image 152.
- The projection system 100 further includes optics 110 positioned to project the visible image 151 emitted by the projecting units 101, and also to project the non-visible image 152 emitted by the collection of projecting units 101 (i.e., the set of the projecting units 101 that includes an emitting element 103). The method 200 thus further includes projecting the visible image and the non-visible image through optics (act 203). The projected form of the visible image 151 is represented by projected visible image 151′. The projected form of the non-visible image 152 is represented by projected non-visible image 152′. The visible image 151 and the non-visible image 152 are projected into a field of projection 140. For instance, the field of projection 140 might be a wall, a table-top, a flat surface, or a complex surface, and might include one or more mobile objects (such as a hand or game pieces) positioned within the field of projection 140.
- A reflected portion 151″ of the projected visible image 151′ is received back into the optics 110. Likewise, a reflected portion 152″ of the projected non-visible image 152′ is received back into the optics 110. Accordingly, the method 200 further includes the optics 110 receiving a reflected portion of the visible image 151 and the non-visible image 152 (act 204). In one embodiment, the surface onto which the projected visible image 151′ and the projected non-visible image 152′ are projected may be the same surface on which the projection system 100 sits.
- A portion 151′″ of the reflected portion 151″ of the projected visible image 151′ is redirected to a camera 120 by a portion 111 of the optics 110, whereupon the camera 120 captures the portion 151′″ of the projected visible image 151′. Likewise, a portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′ is redirected to the camera 120 by the portion 111 of the optics 110, whereupon the camera 120 captures the portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′. Accordingly, the method 200 includes redirecting at least a portion of the received visible and non-visible images to a camera (act 205), and capturing the received visible and non-visible images (act 206). In some embodiments, the same camera 120 captures both the projected visible image 151′ and the reflected non-visible image 152′, though separate cameras 120 might capture each image instead.
- A depth sensing module 130 detects depth information associated with surfaces within the field of projection 140 by using the captured image information regarding the portion 151′″ of the reflected portion 151″ of the projected visible image 151′ and regarding the portion 152′″ of the reflected portion 152″ of the projected non-visible image 152′. Accordingly, the method 200 includes deriving depth information regarding one or more objects within the field of projection 140 of the visible image 151 using the captured portion 152′″ of the reflected portion 152″ of the non-visible image 152 (act 207).
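- The flow of acts 201 through 207 can also be restated as pseudocode. This is a hypothetical sketch of the control flow only; the object and method names (optics.project, depth_module.derive_depth, and so on) are placeholders rather than interfaces taken from the disclosure.

```python
def run_projection_cycle(units, optics, camera, depth_module, frame):
    """One pass through method 200 of FIG. 2 (acts 201-207)."""
    visible = [u.emit_visible(frame) for u in units]              # act 201
    non_visible = [u.emit_non_visible() for u in units
                   if u.has_ir_emitter]                           # act 202
    optics.project(visible, non_visible)                          # act 203
    reflected = optics.receive()                                  # act 204
    redirected = optics.redirect_to_camera(reflected)             # act 205
    captured = camera.capture(redirected)                         # act 206
    return depth_module.derive_depth(captured.non_visible)        # act 207
```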
- FIG. 3 illustrates a more detailed abstract diagram of a projection system 300, and represents an example of the projection system 100 of FIG. 1. Here, a blue LED 301A emits through three one-way mirrors 302A, 302B and 302C to generate a blue optical signal associated with a pixel of a visible image. Furthermore, a green LED 301B emits light that reflects off of one-way mirror 302A to be collimated with the blue optical signal and then passes through the one-way mirrors 302B and 302C to generate a green optical signal associated with the pixel of the visible image. Also, a red LED 301C emits light that reflects off of one-way mirror 302B to be collimated with the blue and green signals and that also passes through the one-way mirror 302C to generate a red optical signal associated with the pixel. The blue, green, and red optical signals are each reflected off of mirror 303, focused using focus optics 304, redirected with the Digital Micromirror Device (DMD) 305, passed through one-way mirror 306, and projected by the projection lens 307 as a pixel of the visible image 311.
- In addition, an infra-red LED 301D for that pixel emits infra-red light, which reflects off of mirror 302C to be collimated with the visible optical signals for that pixel. The infra-red light likewise reflects off of mirror 303, is focused using focus optics 304, redirected with the DMD 305, passed through one-way mirror 306, and projected by the projection lens 307 as a pixel of the non-visible image 312. In one embodiment, this pixel of the non-visible image 312 is overlaid on the same pixel of the visible image 311.
- A portion of the visible image 311 and the non-visible image 312 is reflected back through the projection lens 307, and a portion of the reflected images is then redirected by the one-way mirror 306 towards the camera 308.
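- The disclosure does not specify how the four LEDs share the single DMD 305 in time. One plausible arrangement, common in DLP-style projectors, is field-sequential operation, sketched below under that assumption; the frame rate and sub-frame ordering are illustrative only.

```python
# Hypothetical field-sequential schedule for the shared DMD 305 of FIG. 3:
# each projected frame is divided into sub-frames, one per LED 301A-301D.
FRAME_HZ = 60
SUBFRAMES = ("blue", "green", "red", "infrared")  # LEDs 301A, 301B, 301C, 301D

def subframe_schedule(frame_index: int):
    """Yield (time_offset_seconds, led_name) for each sub-frame of a frame."""
    slot = 1.0 / (FRAME_HZ * len(SUBFRAMES))
    start = frame_index / FRAME_HZ
    for i, led in enumerate(SUBFRAMES):
        yield (start + i * slot, led)

for t, led in subframe_schedule(0):
    print(f"{t * 1e3:6.3f} ms -> {led}")
```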
- The projection system and the camera 308 thus both use the same optics, and thus the assembly may be made quite small. In fact, the projection system 100 might be incorporated within a single computing system that may itself be quite small, such as a laptop, a smartphone, or an accessory to a laptop or smartphone.
- Computing systems are now increasingly taking a wide variety of forms. Computing systems may, for example, be handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered a computing system. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions that may be executed by the processor. The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems.
- As illustrated in FIG. 4, a computing system 400 includes at least one processing unit 402 and computer-readable media 404. The computer-readable media 404 may conceptually be thought of as including physical system memory, which may be volatile, non-volatile, or some combination of the two. The computer-readable media 404 also conceptually includes non-volatile mass storage. If the computing system 400 is distributed, the processing, memory and/or storage capability may be distributed as well.
- As used herein, the term “executable module” or “executable component” can refer to software objects, routines, or methods that may be executed on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads). Such executable modules may be managed code in the case of being executed in a managed environment in which type safety is enforced, and in which processes are allocated their own distinct memory objects. Such executable modules may also be unmanaged code in the case of executable modules being authored in native code such as C or C++.
- In the description that follows, embodiments are described with reference to acts that are performed by one or more computing systems. If such acts are implemented in software, one or more processors of the associated computing system that performs the act direct the operation of the computing system in response to having executed computer-executable instructions. For example, such computer-executable instructions may be embodied on one or more computer-readable media that form a computer program product. An example of such an operation involves the manipulation of data. The computer-executable instructions (and the manipulated data) may be stored in the computer-readable media 404 of the computing system 400. Computing system 400 may also contain communication channels 408 that allow the computing system 400 to communicate with other processors over, for example, network 410.
- Embodiments described herein may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments described herein also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other tangible storage medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmissions media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface controller (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including, personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
- As an example, the depth processing module 130 may be created and/or operated by the computing system 400 in response to the computing system 400 accessing a computer program product having one or more computer-readable media 404 having thereon computer-executable instructions that are structured such that, when executed by one or more processors of the computing system 400, the computing system 400 creates and/or operates the depth processing module 130.
- The depth processing module 130 might allow the computing system 400 to infer information regarding the surface on which the projection system projects in the absence of objects placed within the field of projection, while likewise detecting objects and characteristics of objects placed within the field of projection. Thus, the depth information might affect the state of the computing system 400, thereby affecting the visible image. In one embodiment, the non-visible image is a pattern that is perhaps repeated (although it may be a non-repeating pattern also) and that allows for depth information to be derived based on reflections of that pattern. Also, the depth information may be obtained from a non-visible image via phase-based or other time-of-flight methods, or any other method for determining depth information from non-visible images.
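- As a concrete instance of the time-of-flight option, a phase-based sensor modulates the non-visible emitter at a known frequency and recovers distance from the phase shift of the returning light. The sketch below shows only the core arithmetic under assumed names; a real sensor must additionally unwrap phase beyond the ambiguity range and calibrate for systematic offsets.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def tof_depth(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase shift of a modulated IR signal:
    d = c * phi / (4 * pi * f), unambiguous up to c / (2 * f)."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example: a pi/2 phase shift at 20 MHz modulation is roughly 1.87 m.
print(tof_depth(math.pi / 2, 20e6))
```

- Similarly, detecting objects placed within the field of projection can be reduced to comparing live depth frames against a baseline captured from the empty surface. The following sketch assumes depth arrives as 2-D arrays of distances in meters; the function name and threshold are illustrative, not from the disclosure.

```python
import numpy as np

def detect_interaction(baseline: np.ndarray, live: np.ndarray,
                       threshold_m: float = 0.01) -> np.ndarray:
    """Boolean mask of pixels where an object sits closer to the projector
    than the bare surface did (e.g., a hand or a game piece)."""
    return (baseline - live) > threshold_m

baseline = np.full((480, 640), 1.50)   # empty tabletop measured at 1.5 m
live = baseline.copy()
live[200:240, 300:360] = 1.40          # a hand hovering 10 cm above the table
mask = detect_interaction(baseline, live)
print(mask.sum(), "pixels show an interacting object")  # 2400
```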
- Physical embodiments of the projection system 100 will now be described, although the diversity between these two physical embodiments should convey that the projection system described herein has no real limit on its actual physical implementation.
- FIG. 5 illustrates a first physical embodiment 500 in which the projection system 100 is a projector 501 mounted to a ceiling 502 using mechanical mounts 507. Here, the projector 501 projects an image 506 onto a vertical wall surface 504. A planar light emitter 503 emits co-planar infra-red light planes and, based on reflections, provides captured depth information to the projector 501 (which depth information may supplement depth information captured within the projector 501 via its own optics). For instance, the planar light emitter 503 sends electrical signals over wiring 505, although wireless embodiments are also possible.
- FIGS. 6A and 6B illustrate a second physical embodiment in which the projection system is incorporated into a cam light, or can light, and, thus, is referred to as a “cam light system 600.” FIG. 6A illustrates a side view of the cam light system 600. The cam light system 600 includes the cam light 601 in which the projection system 100 is housed. The cam light 601 includes an exposed portion 602 that faces downward into the interior of the room whilst the remainder is generally hidden from view above the ceiling 603. A mounting plate 604 and mounting bolts 605 assist in mounting the cam light 601 within the ceiling 603. A power source 606 supplies power to the cam light 601.
- FIG. 6B illustrates a bottom view, looking up, of the exposed portion 602 of the cam light 601. A visible light projector 610 emits light downward onto a horizontal surface below the cam light system 600 (such as a table or countertop). When not projecting images, the visible light projector 610 may simply emit visible light to irradiate that portion of the room, and function as a regular cam light. However, the remote controller 615 may be used to communicate to the remote sensor 612 when the visible light projector 610 is to take on its image projection role. When projecting images, the color camera 611 captures visible images reflected from the field of projection. Optionally, an infrared light emitter 613 emits non-visible light so that the infrared camera 614 may capture reflections of that non-visible light to thereby extract depth information, and thus user interaction, within the field of projection. However, using the projection system 100 of FIG. 1, the infrared image may instead, or in addition, be projected from the visible light projector 610. Speakers 616 emit sound associated with the projected visible image. Accordingly, users can quickly transition from sitting at the dinner table having a well-illuminated dinner to a fun family game activity, without moving to a different location.
- The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scopes.
Claims (20)
Priority Applications (1)
Application Number | Publication | Priority Date | Filing Date | Title
---|---|---|---|---
US15/724,214 | US20180024420A1 (en) | 2013-08-15 | 2017-10-03 | Projector capable of determining how a moving object interacts with a projected image
Applications Claiming Priority (2)
Application Number | Publication | Priority Date | Filing Date | Title
---|---|---|---|---
US13/968,267 | US9778546B2 (en) | 2013-08-15 | 2013-08-15 | Projector for projecting visible and non-visible images
US15/724,214 | US20180024420A1 (en) | 2013-08-15 | 2017-10-03 | Projector capable of determining how a moving object interacts with a projected image
Related Parent Applications (1)
Application Number | Relation | Publication | Priority Date | Filing Date | Title
---|---|---|---|---|---
US13/968,267 | Continuation | US9778546B2 (en) | 2013-08-15 | 2013-08-15 | Projector for projecting visible and non-visible images
Publications (1)
Publication Number | Publication Date |
---|---|
US20180024420A1 (en) | 2018-01-25 |
Family
ID=52466617
Family Applications (2)
Application Number | Status | Publication | Priority Date | Filing Date | Title
---|---|---|---|---|---
US13/968,267 | Active (anticipated expiration 2034-04-22) | US9778546B2 (en) | 2013-08-15 | 2013-08-15 | Projector for projecting visible and non-visible images
US15/724,214 | Abandoned | US20180024420A1 (en) | 2013-08-15 | 2017-10-03 | Projector capable of determining how a moving object interacts with a projected image
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/968,267 Active 2034-04-22 US9778546B2 (en) | 2013-08-15 | 2013-08-15 | Projector for projecting visible and non-visible images |
Country Status (2)
Country | Link |
---|---|
US (2) | US9778546B2 (en) |
WO (1) | WO2015023987A1 (en) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10372269B2 (en) * | 2014-07-29 | 2019-08-06 | Sony Corporation | Projection display apparatus |
JP6682941B2 (en) * | 2016-03-24 | 2020-04-15 | セイコーエプソン株式会社 | projector |
CN106990651A (en) * | 2017-03-30 | 2017-07-28 | 广景视睿科技(深圳)有限公司 | Infrared projection system |
WO2019054204A1 (en) * | 2017-09-14 | 2019-03-21 | ソニー株式会社 | Image processing device and method |
CN111123625B (en) * | 2019-12-13 | 2021-05-18 | 成都极米科技股份有限公司 | Projector and projection method |
Family Cites Families (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CA2196048A1 (en) | 1994-07-28 | 1996-02-08 | Pinhas Gilboa | Computerized game board |
US6281878B1 (en) | 1994-11-01 | 2001-08-28 | Stephen V. R. Montellese | Apparatus and method for inputing data |
US5844985A (en) | 1995-09-22 | 1998-12-01 | Qualcomm Incorporated | Vertically correcting antenna for portable telephone handsets |
IL121666A (en) | 1997-08-31 | 2001-03-19 | Bronfeld Joshua | Electronic dice |
US6614422B1 (en) | 1999-11-04 | 2003-09-02 | Canesta, Inc. | Method and apparatus for entering data using a virtual input device |
US6710770B2 (en) | 2000-02-11 | 2004-03-23 | Canesta, Inc. | Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device |
GB0004351D0 (en) | 2000-02-25 | 2000-04-12 | Secr Defence | Illumination and imaging devices and methods |
US6611252B1 (en) | 2000-05-17 | 2003-08-26 | Dufaux Douglas P. | Virtual data input device |
US6650318B1 (en) | 2000-10-13 | 2003-11-18 | Vkb Inc. | Data input device |
US6832954B2 (en) | 2000-05-30 | 2004-12-21 | Namco Ltd. | Photographing game machine, photographing game processing method and information storage medium |
US8040328B2 (en) | 2000-10-11 | 2011-10-18 | Peter Smith | Books, papers, and downloaded information to facilitate human interaction with computers |
FI113094B (en) | 2000-12-15 | 2004-02-27 | Nokia Corp | An improved method and arrangement for providing a function in an electronic device and an electronic device |
US6728582B1 (en) | 2000-12-15 | 2004-04-27 | Cognex Corporation | System and method for determining the position of an object in three dimensions using a machine vision system with two cameras |
US8035612B2 (en) | 2002-05-28 | 2011-10-11 | Intellectual Ventures Holding 67 Llc | Self-contained interactive video display system |
US7103236B2 (en) | 2001-08-28 | 2006-09-05 | Adobe Systems Incorporated | Methods and apparatus for shifting perspective in a composite image |
AU2002357335A1 (en) | 2001-12-28 | 2003-07-24 | Applied Precision, Llc | Stereoscopic three-dimensional metrology system and method |
US6997803B2 (en) | 2002-03-12 | 2006-02-14 | Igt | Virtual gaming peripherals for a gaming machine |
US7385708B2 (en) | 2002-06-07 | 2008-06-10 | The University Of North Carolina At Chapel Hill | Methods and systems for laser based real-time structured light depth extraction |
US7334791B2 (en) | 2002-08-24 | 2008-02-26 | Blinky Bones, Inc. | Electronic die |
JP4230999B2 (en) | 2002-11-05 | 2009-02-25 | Disney Enterprises Incorporated | Video-operated interactive environment |
US7884804B2 (en) | 2003-04-30 | 2011-02-08 | Microsoft Corporation | Keyboard with input-sensitive display device |
CN1890004B (en) | 2003-09-05 | 2011-05-04 | 百利娱乐国际公司 | Systems, methods, and devices for monitoring card games, such as baccarat |
US7040764B2 (en) | 2003-10-23 | 2006-05-09 | Hewlett-Packard Development Company, L.P. | Projection system using ambient light |
EP1694821B1 (en) | 2003-12-11 | 2017-07-05 | Strider Labs, Inc. | Probable reconstruction of surfaces in occluded regions by computed symmetry |
US6955297B2 (en) | 2004-02-12 | 2005-10-18 | Grant Isaac W | Coordinate designation interface |
JP3904562B2 (en) | 2004-02-18 | 2007-04-11 | 株式会社ソニー・コンピュータエンタテインメント | Image display system, recording medium, and program |
JP2006051292A (en) | 2004-02-23 | 2006-02-23 | Aruze Corp | Gaming machine |
US7204428B2 (en) | 2004-03-31 | 2007-04-17 | Microsoft Corporation | Identification of object on interactive display surface by identifying coded pattern |
US7095033B2 (en) | 2004-04-27 | 2006-08-22 | Nicholas Sorge | Multi-sided die with authenticating characteristics and method for authenticating same |
US7394459B2 (en) | 2004-04-29 | 2008-07-01 | Microsoft Corporation | Interaction between objects and a virtual environment display |
US7397464B1 (en) | 2004-04-30 | 2008-07-08 | Microsoft Corporation | Associating application states with a physical object |
JP3816931B2 (en) | 2004-09-08 | 2006-08-30 | コナミ株式会社 | Video game machine for business use, server for video game machine, and video game machine system |
US7399086B2 (en) | 2004-09-09 | 2008-07-15 | Jan Huewel | Image processing method and image processing device |
US20060073891A1 (en) | 2004-10-01 | 2006-04-06 | Holt Timothy M | Display with multiple user privacy |
JP4489555B2 (en) | 2004-10-15 | 2010-06-23 | ビーエルデーオリエンタル株式会社 | Bowling game machine |
US8490971B2 (en) | 2004-10-25 | 2013-07-23 | Koninklijke Philips Electronics N.V. | Autonomous wireless die |
US7450086B2 (en) | 2005-03-14 | 2008-11-11 | Hewlett-Packard Development Company, L.P. | Projector |
US7727060B2 (en) | 2005-07-15 | 2010-06-01 | Maurice Mills | Land-based, on-line poker system |
US7911444B2 (en) | 2005-08-31 | 2011-03-22 | Microsoft Corporation | Input method for surface of interactive display |
US7599561B2 (en) | 2006-02-28 | 2009-10-06 | Microsoft Corporation | Compact interactive tabletop with projection-vision |
WO2007107874A2 (en) | 2006-03-22 | 2007-09-27 | Home Focus Development Ltd | Interactive playmat |
US20100203965A1 (en) | 2006-05-03 | 2010-08-12 | Idx, Inc. | Display device, system and methods for a craps table |
US8666366B2 (en) | 2007-06-22 | 2014-03-04 | Apple Inc. | Device activation and access |
JP4341680B2 (en) | 2007-01-22 | 2009-10-07 | セイコーエプソン株式会社 | projector |
US8545300B2 (en) | 2007-03-08 | 2013-10-01 | Roland C. Colton | System and method of tracking and displaying outcomes of a live craps game |
US8269822B2 (en) | 2007-04-03 | 2012-09-18 | Sony Computer Entertainment America, LLC | Display viewing system and methods for optimizing display view based on active tracking |
US20080280682A1 (en) | 2007-05-08 | 2008-11-13 | Brunner Kevin P | Gaming system having a set of modular game units |
US20080278894A1 (en) | 2007-05-11 | 2008-11-13 | Miradia Inc. | Docking station for projection display applications |
US20090020947A1 (en) | 2007-07-17 | 2009-01-22 | Albers John H | Eight piece dissection puzzle |
US20090029754A1 (en) | 2007-07-23 | 2009-01-29 | Cybersports, Inc | Tracking and Interactive Simulation of Real Sports Equipment |
US20090124382A1 (en) | 2007-11-13 | 2009-05-14 | David Lachance | Interactive image projection system and method |
US8007110B2 (en) | 2007-12-28 | 2011-08-30 | Motorola Mobility, Inc. | Projector system employing depth perception to detect speaker position and gestures |
US8267524B2 (en) | 2008-01-18 | 2012-09-18 | Seiko Epson Corporation | Projection system and projector with widened projection of light for projection onto a close object |
US8167700B2 (en) | 2008-04-16 | 2012-05-01 | Universal Entertainment Corporation | Gaming device |
JP6043482B2 (en) | 2008-06-03 | 2016-12-14 | Tweedletech Limited Liability Company | Intelligent board game system, game piece, how to operate intelligent board game system, how to play intelligent board game |
US7967451B2 (en) | 2008-06-27 | 2011-06-28 | Microsoft Corporation | Multi-directional image displaying device |
JP5338166B2 (en) | 2008-07-16 | 2013-11-13 | ソニー株式会社 | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method |
US9218116B2 (en) | 2008-07-25 | 2015-12-22 | Hrvoje Benko | Touch interaction with a curved display |
US20100035684A1 (en) | 2008-08-08 | 2010-02-11 | Bay Tek Games, Inc. | System and method for controlling movement of a plurality of game objects along a playfield |
US8317612B2 (en) | 2008-08-25 | 2012-11-27 | David W Guthrie | Sports net with socks and promotion method used therewith |
US8840249B2 (en) | 2008-10-31 | 2014-09-23 | Christie Digital Systems, Inc. | Method, system and apparatus for projecting visible and non-visible images |
US8226476B2 (en) | 2008-11-04 | 2012-07-24 | Quado Media Inc. | Multi-player, multi-screens, electronic gaming platform and system |
US8442304B2 (en) | 2008-12-29 | 2013-05-14 | Cognex Corporation | System and method for three-dimensional alignment of objects using machine vision |
US8425325B2 (en) | 2009-02-06 | 2013-04-23 | Apple Inc. | Automatically generating a book describing a user's videogame performance |
JP5282617B2 (en) | 2009-03-23 | 2013-09-04 | ソニー株式会社 | Information processing apparatus, information processing method, and information processing program |
US20110256927A1 (en) | 2009-03-25 | 2011-10-20 | MEP Games Inc. | Projection of interactive game environment |
US20110165923A1 (en) | 2010-01-04 | 2011-07-07 | Davis Mark L | Electronic circle game system |
US8246467B2 (en) | 2009-04-29 | 2012-08-21 | Apple Inc. | Interactive gaming with co-located, networked direction and location aware devices |
US20100285881A1 (en) | 2009-05-07 | 2010-11-11 | Microsoft Corporation | Touch gesturing on multi-player game space |
JP5273478B2 (en) | 2009-07-07 | 2013-08-28 | ソニー株式会社 | Video display device and video display system |
US8851475B2 (en) | 2009-11-12 | 2014-10-07 | Tangiamo Ab | Electronic gaming system |
US8421634B2 (en) | 2009-12-04 | 2013-04-16 | Microsoft Corporation | Sensing mechanical energy to appropriate the body for data input |
CN101776836B (en) * | 2009-12-28 | 2013-08-07 | 武汉全真光电科技有限公司 | Projection display system and desktop computer |
US8134717B2 (en) | 2010-05-21 | 2012-03-13 | LTS Scale Company | Dimensional detection system and associated method |
US8751049B2 (en) | 2010-05-24 | 2014-06-10 | Massachusetts Institute Of Technology | Kinetic input/output |
US8388146B2 (en) | 2010-08-01 | 2013-03-05 | T-Mobile Usa, Inc. | Anamorphic projection device |
US8905551B1 (en) * | 2010-12-23 | 2014-12-09 | Rawles Llc | Unpowered augmented reality projection accessory display device |
US20120223885A1 (en) | 2011-03-02 | 2012-09-06 | Microsoft Corporation | Immersive display experience |
US8784206B1 (en) | 2011-04-15 | 2014-07-22 | Wms Gaming, Inc. | Modifying presentation of three-dimensional, wagering-game content |
US20130113975A1 (en) | 2011-11-04 | 2013-05-09 | Peter Gabris | Projector Image Correction Method and System |
- 2013-08-15: US application US13/968,267 filed (issued as US9778546B2; status: Active)
- 2014-08-15: PCT application PCT/US2014/051352 filed (published as WO2015023987A1; status: Application Filing)
- 2017-10-03: US application US15/724,214 filed (published as US20180024420A1; status: Abandoned)
Patent Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20020186221A1 (en) * | 2001-06-05 | 2002-12-12 | Reactrix Systems, Inc. | Interactive video display system |
US20040183775A1 (en) * | 2002-12-13 | 2004-09-23 | Reactrix Systems | Interactive directed light/sound system |
US20060289760A1 (en) * | 2005-06-28 | 2006-12-28 | Microsoft Corporation | Using same optics to image, illuminate, and project |
US20110251905A1 (en) * | 2008-12-24 | 2011-10-13 | Lawrence Nicholas A | Touch Sensitive Holographic Displays |
US20110181553A1 (en) * | 2010-01-04 | 2011-07-28 | Microvision, Inc. | Interactive Projection with Gesture Recognition |
US20110216205A1 (en) * | 2010-03-03 | 2011-09-08 | Christie Digital Systems Usa, Inc. | Automatic calibration of projection system using non-visible light |
US20120162140A1 (en) * | 2010-12-23 | 2012-06-28 | Electronics And Telecommunications Research Institute | Method and apparatus for user interaction using pattern image |
US20130229396A1 (en) * | 2012-03-05 | 2013-09-05 | Kenneth J. Huebner | Surface aware, object aware, and image aware handheld projector |
US20140043516A1 (en) * | 2012-08-07 | 2014-02-13 | Barnesandnoble.Com Llc | Front projection ereader system |
US8933974B1 (en) * | 2012-09-25 | 2015-01-13 | Rawles Llc | Dynamic accommodation of display medium tilt |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109243223A (en) * | 2018-09-28 | 2019-01-18 | 邯郸学院 | Multimedia interactive device |
Also Published As
Publication number | Publication date |
---|---|
WO2015023987A8 (en) | 2015-04-09 |
US9778546B2 (en) | 2017-10-03 |
WO2015023987A1 (en) | 2015-02-19 |
US20150049308A1 (en) | 2015-02-19 |
Similar Documents
Publication | Title |
---|---|
US20180024420A1 (en) | Projector capable of determining how a moving object interacts with a projected image |
US10481475B2 (en) | Smart lighting device and control method thereof |
US20190028674A1 (en) | Holographic video capture and telepresence system |
JP5941146B2 (en) | Projection capture system, program and method |
CN104076914A (en) | Electronic equipment and projection display method |
CN106537247B (en) | Projection type display device |
WO2016082670A1 (en) | Intelligent projection light bulb and interactive and intelligent projection method thereof |
US10073567B2 (en) | Interactive display systems |
US20190235262A1 (en) | Holographic projection device, method, apparatus, and computer readable storage medium |
US20160381334A1 (en) | Technologies for projecting a noncontinuous image |
CN106954323B (en) | Information sharing system and method based on desk lamp |
CN104714769B (en) | Data processing method and electronic equipment |
US20180176460A1 (en) | Photo terminal stand system |
US10275092B2 (en) | Transforming received touch input |
US9367152B2 (en) | Interactive projection system and interactive image-detecting method |
US20210100336A1 (en) | Color reproduction services |
US10958851B2 (en) | Camera apparatus for indicating camera field of view |
JP2016184850A (en) | Projector and detection method |
CN108200418A (en) | The projecting method and electronic equipment of electronic equipment |
Sukthankar | Towards ambient projection for intelligent environments |
US20240137454A1 (en) | Stitching Helper For A Unified Video Stream |
CN105701922A (en) | Open type self-photo-shooting shop |
CN110830780A (en) | Projection method, projection system, and computer-readable storage medium |
CN105630071A (en) | Electronic device with shooting function |
TWM555534U (en) | Cuisine recognition system for meal box |
Legal Events
Code | Title | Description |
---|---|---|
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
AS | Assignment | Owner name: MEP TECH, INC., UTAH; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MEALING, DONALD ROY;DAVIS, MARK L.;HOOLE, ROGER H.;AND OTHERS;SIGNING DATES FROM 20140219 TO 20170507;REEL/FRAME:050605/0885 |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: FINAL REJECTION MAILED |
STPP | Information on status: patent application and granting procedure in general | Free format text: RESPONSE AFTER FINAL ACTION FORWARDED TO EXAMINER |
STPP | Information on status: patent application and granting procedure in general | Free format text: ADVISORY ACTION MAILED |
STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |