US6753879B1 - Creating overlapping real and virtual images

Info

Publication number: US6753879B1
Authority: US
Grant status: Grant
Prior art keywords: image, user, generated, computer, system
Prior art date: 2000-07-03
Legal status: Active, expires 2021-10-30 (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: US09609724
Inventor: William C. DeLeeuw
Current assignee: Intel Corp (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original assignee: Intel Corp
Priority date: 2000-07-03 (the priority date is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed)
Filing date: 2000-07-03
Publication date: 2004-06-22
Grant date: 2004-06-22

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING; COUNTING
    • G06F: ELECTRICAL DIGITAL DATA PROCESSING
    • G06F3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from the processing unit to the output unit, e.g. interface arrangements
    • G06F3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304: Detection arrangements using opto-electronic means
    • G06F3/0325: Detection arrangements using opto-electronic means using a plurality of light emitters or reflectors or a plurality of detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation or on the basis of reference deformation in the picked-up image

Abstract

A computer generated image may be displayed inside an enclosure. The enclosure may include a transreflective surface so that the user can view the image generated by the computer system overlaid with a body part of the user. The computer generated images may move relative to the known position of the body part. As a result, the illusion is created that the physical body part interacts in a meaningful way with the computer generated image.

Description

BACKGROUND

This invention relates generally to processor-based systems and particularly to processor-based systems that create virtual reality displays.

Virtual reality devices enable a user to interact with images generated by a computer system. From the user's point of view, the images may appear to be real objects. By receiving feedback about the user's movements, the computer generated images may be modified so that those images appear to interact with the human user.

For example, head mounted displays may be utilized to produce the illusion of an interaction with a virtual reality image. In addition, data generating gloves may be worn by the user to generate position signals indicative of the user's position. In some cases, relatively expensive equipment and burdensome procedures may be necessary to provide a virtual reality experience.

Thus, there is a continuing need for better ways to make a virtual reality experience available at a relatively lower cost, for example in connection with toys and other entertainment devices.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block depiction of one embodiment of the present invention;

FIG. 2 is a block depiction of the processor-based system shown in FIG. 1;

FIG. 3 is a flow chart for software in accordance with one embodiment of the present invention;

FIG. 4a is an image generated by the system shown in FIG. 1 in accordance with one embodiment of the present invention;

FIG. 4b is an image generated by the system shown in FIG. 1 in accordance with one embodiment of the present invention;

FIG. 5a is an image generated by the system of FIG. 1 in accordance with another embodiment of the present invention;

FIG. 5b is another image generated by the system of FIG. 1 in accordance with another embodiment of the present invention;

FIG. 6a shows an image generated by the system in accordance with still another embodiment of the present invention; and

FIG. 6b shows an object created using an embodiment of the present invention.

DETAILED DESCRIPTION

Referring to FIG. 1, a processor-based system 10 may include an enclosure 12, a computer display 14, and a computer system 16 in accordance with one embodiment of the present invention. The computer system 16 may be any conventional processor-based computer system including a desktop computer system, a laptop computer system or even a handheld computer system or appliance. The display 14 may be a computer monitor, a liquid crystal display or a television, as examples.

The enclosure 12 may be a rectangular enclosure including at least three openings. An opening 20 enables one or more hands 17 to be inserted into the enclosure 12. Port 26 enables computer generated images displayed on the display 14 to be seen within the enclosure 12. The port 28 enables the user, indicated by the user's eye E, to visualize the interior of the enclosure 12 monoptically or bioptically.

A transreflective or dichroic reflector 24 reflects light from the display 14 for viewing by the user E through the port 28. The reflector 24 also enables the user to directly view the user's hands 17. Thus, the reflector 24 is dichroic in that it both reflects and transmits light. As a result, the resulting image is a composite of the images created by the display 14 and the actual physical object or objects provided beneath the reflector 24, such as the user's hand 17 and any other objects within the enclosure 12. In an alternative embodiment, the display 14 may be viewed directly and the hand 17 may be viewed via reflection from the reflector 24.

A combined light source and digital imaging unit 22 may both illuminate the visual field below the reflector 24 and capture an image of the illuminated visual field. In an alternative embodiment, separate imaging and illuminating units may be used. The light generated by the unit 22 may be reflected from the user's hand 17 and passed through the dichroic reflector 24, as indicated by the arrow J. Similarly, the images generated by the display 14 may be reflected off the reflector 24 for viewing by the user E, as indicated by the arrows I.

Thus, the arrow H indicates a composite image of the actual object or objects such as the hand 17 and the virtual images created by the display 14, represented by the arrow I. The images generated by the computer system 16 may be coupled to the display 14 over a cable 18 such as a serial interface cable.

In some embodiments of the present invention, the ports 26 and 28 include anaglyphic means. Typical anaglyphic means include red/green or red/blue binocular separation devices that create the appearance of three dimensions. The view port 28 may include an anaglyphic lens such that the lens for one user eye is red colored and the lens for the other user eye is blue or green. Thus, the computer generated two dimensional image may appear to have a depth commensurate with the depth of the enclosure 12.
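
As an illustrative sketch of anaglyphic compositing (the patent names no algorithm, so the channel assignment and the red-left lens convention here are assumptions), the two eye views can be merged by channel selection:

```python
import numpy as np

def make_anaglyph(left_rgb: np.ndarray, right_rgb: np.ndarray) -> np.ndarray:
    """Merge two rendered eye views into one frame: the red channel carries
    the left view while green and blue carry the right view, so colored
    lenses separate the views again at the user's eyes."""
    out = np.zeros_like(left_rgb)
    out[..., 0] = left_rgb[..., 0]   # red channel -> left eye (red lens)
    out[..., 1] = right_rgb[..., 1]  # green channel -> right eye
    out[..., 2] = right_rgb[..., 2]  # blue channel -> right eye (blue/green lens)
    return out
```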

Alternatively, a liquid crystal shutter in the port 28 may be utilized adjacent each of the user's eyes. The computer generated images (I) may be alternated such that each eye sees a slightly different image (in a different time interval) resulting in the illusion of depth. In general, any technique for producing the illusion of three dimensional depth from a two dimensional image may be utilized in accordance with some embodiments of the present invention.
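
A minimal sketch of the time-multiplexed alternative, assuming a hypothetical render_view callable, a fixed refresh period, and a typical interpupillary distance (the patent specifies none of these):

```python
import itertools
import time

EYE_OFFSET = 0.032  # assumed half of a typical ~64 mm interpupillary distance, meters

def frame_sequential_stereo(render_view, frames=120, period=1.0 / 120.0):
    """Alternate left- and right-eye renders so a synchronized shutter shows
    each eye a slightly shifted image in its own time interval."""
    for _, eye in zip(range(frames), itertools.cycle((-EYE_OFFSET, +EYE_OFFSET))):
        render_view(horizontal_eye_shift=eye)  # offset the virtual camera per eye
        time.sleep(period)                     # stand-in for syncing with the shutter
```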

The dichroic reflector 24 may be made of any transreflective material such as Plexiglas or acrylic. The light source included with the unit 22 may be directional and may be shielded from direct observation by the user.

The interior of the enclosure 12 may be covered with a light absorbing or scattering material such as felt. The use of such a material may substantially eliminate any ability to discern features of the interior of the enclosure 12 itself.

Referring to FIG. 2, the computer system 16 may include a processor 60 coupled to a bridge 62. The bridge 62 is in turn coupled to the system memory 64 and a bus 66 in accordance with one embodiment of the present invention. The bus 66 may be coupled to a display controller 74 that is coupled to the display 14.

The bus 66 may also be coupled to a video interface 68 that receives video streams from video capture devices such as the cameras 22a, 22b and 22c. Advantageously, the imaging units 22 may be positioned around the interior of the enclosure 12, underneath the reflector 24, to capture a plurality of images of the user's hand 17 or any other objects located therein. In another embodiment, a single camera may capture video from more than one camera angle or a single camera may otherwise provide depth information.

The bus 66 is also coupled to a bridge 70 that is in turn coupled to a hard disk drive 72. The hard disk drive 72 may store software 78 for creating virtual illusions.

In an alternative embodiment, other techniques may be used to resolve the position of the user's hand rather than the use of the cameras 22 a-c. For example, ultrasonic transducers may be used like sonar. As another example, data generating gloves may be used.

Referring next to FIG. 3, the software 78 begins by receiving video from the imaging units 22, as indicated in diamond 82, in accordance with one embodiment of the present invention. The video may include a plurality of frames received as streaming video. The video may be analyzed using pattern recognition techniques, as one example, to identify the user's hand 17, as indicated in block 84. The position of the user's hand in three dimensional space may then be resolved as indicated in block 86. In some embodiments of the present invention, this may be done by comparing the frames generated from each of the imaging units 22a, 22b and 22c in order to resolve, in three dimensions, the position of the hand 17.
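
One way such multi-camera position resolution might work is linear triangulation; this sketch is an assumption, since the patent names no specific method, and the calibrated 3x4 projection matrices p1 and p2 are presumed available from a prior calibration:

```python
import numpy as np

def triangulate(p1, p2, x1, x2):
    """Recover the 3D point whose projections best match pixel x1 in camera 1
    and pixel x2 in camera 2 (direct linear transform, least squares)."""
    a = np.vstack([
        x1[0] * p1[2] - p1[0],
        x1[1] * p1[2] - p1[1],
        x2[0] * p2[2] - p2[0],
        x2[1] * p2[2] - p2[1],
    ])
    _, _, vt = np.linalg.svd(a)      # null-space solution of a @ X = 0
    xh = vt[-1]
    return xh[:3] / xh[3]            # dehomogenize to (x, y, z)
```

A third camera simply contributes two more rows to the linear system, tightening the estimate.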

Thereafter, image elements may be generated by the computer system 16 relative to the position of the hand 17, as indicated in block 88. More particularly, the video may be generated and overlaid over the apparent position of the hand 17. As a result, the generated images may appear to move relative to the hand 17, creating the appearance of actual interaction between the virtual or computer generated images and the physical hand 17.
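
A minimal sketch of that per-frame re-anchoring, assuming hypothetical get_hand_position and draw_element callables (neither appears in the patent):

```python
def overlay_loop(get_hand_position, draw_element, frames=600):
    """Each frame, re-anchor the generated element to the tracked hand so the
    element appears to follow, and react to, the physical hand."""
    for _ in range(frames):
        hx, hy, hz = get_hand_position()       # resolved as in block 86
        draw_element(x=hx, y=hy + 0.08, z=hz)  # hover the element just above the hand
```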

Referring next to FIGS. 4a and 4b, in accordance with one embodiment of the present invention, a scene 29a viewed through the port 28 may include an image of an animate being 30. The being 30 image may be generated by the display 14 and reflected from the reflector 24 for view through the port 28. The user may appear to touch the animate being 30 image as indicated by the user's finger 32. Even though the user's hand 17 is below the reflector 24, the finger 32 may actually appear to overlay the animate being 30. This may be because the depiction of the animate being 30, generated by the computer system 16, may be altered by omission of the portion where the finger 32 is located. As a result, the impression is created that the user's finger, which is actually under the reflector 24, is in fact over the image of the animate being 30.
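
The omission step can be sketched as masking, under the assumption that a boolean finger mask is available from the camera pipeline; blanked pixels reflect no light off the transreflective sheet, leaving the real finger visible through the resulting hole:

```python
import numpy as np

def occlude(rendered_rgb: np.ndarray, finger_mask: np.ndarray) -> np.ndarray:
    """Blank the rendered character wherever the segmented finger lies, so the
    reflected image yields to the directly viewed finger at those pixels."""
    out = rendered_rgb.copy()
    out[finger_mask] = 0  # black pixels contribute no reflected light
    return out
```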

In response to the user touching the apparent location of the animate being's heart 34, a new display 29b, shown in FIG. 4b, may be created that appears to be the exposed heart of the animate being 30. Thus, the user may attempt to effectively dissect the animate being 30, which, in one game play scenario, may be an alien. The user may be treated to depictions of various body parts merely by touching the location associated with a given body part image. The body part image, such as the heart 34, may be generated by the computer system 16. Again, the user's finger 32 may appear in an overlaying arrangement with respect to the heart 34 even though in fact it is physically located under the reflector 24. Similarly, a game scenario may be implemented wherein the user actually dissects an animate model so that the user can “remove” skin or other tissue to view the internal organs of the animate being.

Turning next to FIG. 5a, in accordance with another embodiment of the present invention, the view through the port 28 is indicated at 37a. In this case, a plurality of insect body part images including heads 36a and 36b, wings 40a and 40b, tails 42a and 42b, bodies 44a and 44b, and legs 38a and 38b appear to be distributed across the interior of the enclosure 12. In fact, all of the images indicated in the scene 37a are computer generated.

The user can attempt to grasp the array of images illustrated in FIG. 5a and to build an insect 44 shown in FIG. 5b. In this case, a variety of incongruous body parts may be joined together to create an image. In fact, the user can select any of the body part images and push them together to cause them to be joined. The computer system 16 then realigns the images and fixes them permanently in the positions where the user's hand 17 manipulation apparently placed them. The remaining unused body parts may then be automatically removed from view. Thus, the user can pick up the seemingly light, if not weightless, insect image elements, and physically manipulate them to cause them to join to form an insect image 44. That insect image 44, once completed, may be caused to appear to fly around the interior of the enclosure 12. Of course, in fact all of the images shown in the scenes 37a and 37b may be generated by the computer system 16 through the display 14.
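
Joining pushed-together parts can be sketched as a proximity test; the 2 cm snap threshold and the per-part position arrays are assumptions, not values from the patent:

```python
import numpy as np

SNAP_DISTANCE = 0.02  # meters; assumed threshold at which parts snap together

def find_joins(positions):
    """Return index pairs of parts the user has pushed close enough to join;
    joined parts can then be realigned and moved as a single image."""
    joins = []
    for i in range(len(positions)):
        for j in range(i + 1, len(positions)):
            if np.linalg.norm(positions[i] - positions[j]) < SNAP_DISTANCE:
                joins.append((i, j))
    return joins
```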

Turning finally to FIGS. 6a and 6b, a physical block 50 of a carvable material may be positioned beneath the reflector 24. The block 50 then becomes visible through the port 28. The user may use a knife 52 to physically carve the block 50. However, as the user is carving the block 50, an overlay image 54 is generated that appears to be located within the block 50. The location of the block 50 may be determined from the cameras 22a, 22b and 22c.

As a result, the user is able to see an image of the final object that the user wishes to carve from the block 50, apparently located within the block 50. The user can progressively carve the block 50 using the image 54 as a template. Thus, as shown in the scene 49b (FIG. 6b), after the carving is done, a physical object 54a which corresponds to the image 54 may be produced.

The apparent depth of the computer generated images may be adjusted by changing the angle of the sheet 24, the viewing angle or the location of the monitor 14. The computer generated image may be caused to appear to be located over or under the user's hand.
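
The depth relationship can be sketched with plane-mirror geometry: the virtual image of a display point is its reflection across the plane of the sheet 24, so tilting the sheet or moving the monitor shifts the apparent depth. The plane parameters here are assumptions for illustration:

```python
import numpy as np

def mirror_point(point, n, d):
    """Reflect a 3D point across the plane n . x = d (n must be unit length).
    The reflection of a display point is where the user perceives the image."""
    point = np.asarray(point, dtype=float)
    n = np.asarray(n, dtype=float)
    return point - 2.0 * (np.dot(n, point) - d) * n
```

Because reflection is its own inverse, the same function also answers the FIG. 6 question of where to draw on the display so the image 54 appears at a chosen spot inside the block 50: reflect the target point back through the plane of the sheet.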

While the present invention has been described with respect to a limited number of embodiments, those skilled in the art will appreciate numerous modifications and variations therefrom. It is intended that the appended claims cover all such modifications and variations as fall within the true spirit and scope of this present invention.

Claims (19)

What is claimed is:
1. A method comprising:
determining the location of a human controlled object;
overlaying a computer generated image for viewing over said human controlled object by reflecting one of said computer generated image and an image of said object and transmitting the other of said computer generated image and said object image; and
controlling said computer generated image so that said computer generated image appears to interact with said human controlled object.
2. The method of claim 1 including providing an enclosure for viewing a human body part and said computer generated image.
3. The method of claim 2 including providing a port in said enclosure that enables viewing of both said human body part and said computer generated image.
4. The method of claim 3 including reflecting one of said human body part or said computer generated image for viewing through said port.
5. The method of claim 1 including providing a transreflective layer that reflects one of said computer generated image and said object and transmits the other of said image and said object for viewing.
6. The method of claim 1 wherein determining the location of the object includes determining the location of said object in three dimensions.
7. The method of claim 6 wherein determining the location of said object in three dimensions includes imaging said object with at least two imaging devices.
8. The method of claim 1 including creating the impression that a user is disassembling an image into one or more parts.
9. The method of claim 1 including enabling the user to manually assemble a plurality of computer generated images into a composite whole.
10. The method of claim 1 including creating an apparent image of an object to be formed.
11. A system comprising:
an enclosure;
a first port positioned to enable a user to look into said enclosure;
a second port arranged to allow the user to insert an object into said enclosure;
a third port arranged to receive a computer-generated image; and
a transreflective sheet arranged to enable said computer-generated image and said object to be viewed through said first port in an overlapping arrangement.
12. The system of claim 11 including a processor-based system having a storage and a display device aligned with said third port, said storage storing instructions that enable said processor-based system to determine the location of the object, overlay the computer generated image for viewing over said object and control said computer generated image so that the image appears to interact with the object.
13. The system of claim 12 including a plurality of video cameras for imaging the object in said enclosure.
14. The system of claim 11 including a plurality of lights inside said enclosure underneath said transreflective sheet to illuminate the object.
15. The system of claim 11 including a device to create the appearance of three dimensions from a two dimensional image.
16. The system of claim 15 wherein at least one of said ports includes an anaglyphic device.
17. The system of claim 11 where said sheet transmits an image of the object and reflects the computer-generated image.
18. The system of claim 11 wherein said second port receives the user's hand into said enclosure.
19. An article comprising a medium storing instructions that, if executed, enable a processor-based system to:
determine the location of a human controlled object in the form of a block of sculptable material;
overlay a computer generated image for viewing over said human controlled object;
control said image so that said image appears to interact with said human controlled object;
create an apparent image of a sculpture to be formed from the object; and
display said apparent image at a predefined location within said object to enable a user to cut said image from said object.

Priority Applications (1)

Application Number: US09609724
Priority Date: 2000-07-03
Filing Date: 2000-07-03
Title: Creating overlapping real and virtual images


Publications (1)

Publication Number: US6753879B1 (en)
Publication Date: 2004-06-22

Family

ID=32469771

Family Applications (1)

Application Number: US09609724
Publication: US6753879B1 (en)
Status: Active, expires 2021-10-30
Priority Date: 2000-07-03
Filing Date: 2000-07-03
Title: Creating overlapping real and virtual images

Country Status (1)

Country Link
US (1) US6753879B1 (en)

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588139A (en) * 1990-06-07 1996-12-24 Vpl Research, Inc. Method and system for generating objects for a multi-person virtual world using data flow networks
US5083558A (en) * 1990-11-06 1992-01-28 Thomas William R Mobile surgical compartment with micro filtered laminar air flow
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
US5856811A (en) * 1996-01-31 1999-01-05 Delco Electronics Corp. Visual display and helmet assembly
US5795227A (en) * 1996-06-28 1998-08-18 Raviv; Roni Electronic game system
US6434255B1 (en) * 1997-10-29 2002-08-13 Takenaka Corporation Hand pointing apparatus
US6195104B1 (en) * 1997-12-23 2001-02-27 Philips Electronics North America Corp. System and method for permitting three-dimensional navigation through a virtual reality environment using camera-based gesture inputs
WO2000038117A1 (en) * 1998-12-23 2000-06-29 Washington State University Research Foundation Method and system for a virtual assembly design environment

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
PCT Publication #: PCT/US99/30753, International Publication #: WO 00/38117, International Publication Date: Jun. 29, 2000, Jayaram, et al., Virtual Assembly Design Environment (VADE). *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020094189A1 (en) * 2000-07-26 2002-07-18 Nassir Navab Method and system for E-commerce video editing
US20020140709A1 (en) * 2001-03-27 2002-10-03 Frank Sauer Augmented reality guided instrument positioning with modulated guiding graphics
US20030037449A1 (en) * 2001-08-23 2003-02-27 Ali Bani-Hashemi Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment
US7379077B2 (en) * 2001-08-23 2008-05-27 Siemens Corporate Research, Inc. Augmented and virtual reality guided instrument positioning using along-the-line-of-sight alignment
US20040193413A1 (en) * 2003-03-25 2004-09-30 Wilson Andrew D. Architecture for controlling a computer using hand gestures
US8745541B2 (en) 2003-03-25 2014-06-03 Microsoft Corporation Architecture for controlling a computer using hand gestures
US20100146455A1 (en) * 2003-03-25 2010-06-10 Microsoft Corporation Architecture For Controlling A Computer Using Hand Gestures
US20090268945A1 (en) * 2003-03-25 2009-10-29 Microsoft Corporation Architecture for controlling a computer using hand gestures
US9652042B2 (en) 2003-03-25 2017-05-16 Microsoft Technology Licensing, Llc Architecture for controlling a computer using hand gestures
US7755608B2 (en) * 2004-01-23 2010-07-13 Hewlett-Packard Development Company, L.P. Systems and methods of interfacing with a machine
US20050166163A1 (en) * 2004-01-23 2005-07-28 Chang Nelson L.A. Systems and methods of interfacing with a machine
DE202005001702U1 (en) * 2005-02-02 2006-06-14 Sata Farbspritztechnik Gmbh & Co.Kg Virtual coating system and spray gun
DE102005016847A1 (en) * 2005-04-12 2006-10-19 UGS Corp., Plano Three-dimensional computer-aided design object visualization method, involves determining position of user-controlled cursor on display device and displaying view on device based on position of cursor relative to another view
US20070247454A1 (en) * 2006-04-19 2007-10-25 Norbert Rahn 3D visualization with synchronous X-ray image display
US20080111310A1 (en) * 2006-11-14 2008-05-15 Lydia Parvanta Game table television and projector system, and method for same
US8090561B1 (en) 2008-08-14 2012-01-03 Jai Shin System and method for in situ display of a virtual wheel on a wheeled vehicle
US20110260967A1 (en) * 2009-01-16 2011-10-27 Brother Kogyo Kabushiki Kaisha Head mounted display
US8217856B1 (en) 2011-07-27 2012-07-10 Google Inc. Head-mounted display that displays a visual representation of physical interaction with an input interface located outside of the field of view
US9596643B2 (en) 2011-12-16 2017-03-14 Microsoft Technology Licensing, Llc Providing a user interface experience based on inferred vehicle state

Similar Documents

Publication Publication Date Title
US6062865A (en) System for training persons to perform minimally invasive surgical procedures
US4743964A (en) Method and device for recording and restitution in relief of animated video images
US20040109009A1 (en) Image processing apparatus and image processing method
US20080024597A1 (en) Face-mounted display apparatus for mixed reality environment
US5847710A (en) Method and apparatus for creating three dimensional drawings
US6317130B1 (en) Apparatus and method for generating skeleton-based dynamic picture images as well as medium storing therein program for generation of such picture images
US20020082082A1 (en) Portable game machine having image capture, manipulation and incorporation
US6633289B1 (en) Method and a device for displaying at least part of the human body with a modified appearance thereof
US20010055748A1 (en) System for training persons to perform minimally invasive surgical procedures
Azuma A survey of augmented reality
US20140361976A1 (en) Switching mode of operation in a head mounted display
US5368309A (en) Method and apparatus for a virtual video game
US20140361977A1 (en) Image rendering responsive to user actions in head mounted display
US20070060336A1 (en) Methods and systems for enabling depth and direction detection when interfacing with a computer program
US20130234934A1 (en) Three-Dimensional Collaboration
US20040208358A1 (en) Image generation system, image generation method, program, and information storage medium
US6822648B2 (en) Method for occlusion of movable objects and people in augmented reality scenes
US8072470B2 (en) System and method for providing a real-time three-dimensional interactive environment
US20110187832A1 (en) Naked eye three-dimensional video image display system, naked eye three-dimensional video image display device, amusement game machine and parallax barrier sheet
US20090238378A1 (en) Enhanced Immersive Soundscapes Production
US20100045601A1 (en) Interaction with a multi-component display
US5999641A (en) System for manipulating digitized image objects in three dimensions
US7445549B1 (en) Networked portable and console game systems
US8704879B1 (en) Eye tracking enabling 3D viewing on conventional 2D display
WO2008132724A1 (en) A method and apparatus for three dimensional interaction with autosteroscopic displays

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DELEEUW, WILLIAM C.;REEL/FRAME:010948/0043

Effective date: 20000629

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
FPAY Fee payment

Year of fee payment: 8

FPAY Fee payment

Year of fee payment: 12