US20130201285A1 - 3-d glasses with illuminated light guide - Google Patents

3-D glasses with illuminated light guide

Info

Publication number
US20130201285A1
US20130201285A1 (Application No. US 13/260,701)
Authority
US
United States
Prior art keywords
glasses
light
frame
camera
lgp
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/260,701
Inventor
Xiaodong Mao
Richard Marks
Dominic Mallinson
Eric J. Larsen
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Inc
Original Assignee
Sony Computer Entertainment Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Inc
Assigned to SONY COMPUTER ENTERTAINMENT INC. Assignment of assignors interest (see document for details). Assignors: LARSEN, ERIC J.; MALLINSON, DOMINIC; MAO, XIAODONG; MARKS, RICHARD
Publication of US20130201285A1
Assigned to SONY INTERACTIVE ENTERTAINMENT INC. Change of name (see document for details). Assignor: SONY COMPUTER ENTERTAINMENT INC.

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00, with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/02 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
    • G01B11/026 Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring distance between sensor and object
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30 Image reproducers
    • H04N13/366 Image reproducers using viewer tracking
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00 Details of stereoscopic systems
    • H04N2213/008 Aspects relating to glasses for viewing stereoscopic images

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Optics & Photonics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Position Input By Displaying (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

3-D or other glasses with an illuminated light guide for tracking with a camera are presented. The light guide can be a light guide plate (LGP) or other transparent and/or translucent material that conveys light through its interior. LEDs embedded within the frames illuminate the light guide and optionally can be dimmed or brightened depending on ambient lighting.

Description

    CROSS-REFERENCES TO RELATED APPLICATIONS
  • This application is related to Patent Cooperation Treaty (PCT) Application No. PCT/US2010/051827, filed Oct. 7, 2010, by some of the same inventors, titled “3-D Glasses With Camera Based Head Tracking” (Attorney Docket No. 026340-007800PC), filed the same day, which is incorporated by reference in its entirety for all purposes.
  • BACKGROUND
  • 1. Field of the Invention
  • Embodiments of the present invention relate to camera-based head tracking in which the person's head being tracked is wearing illuminated glasses, and, in particular, to camera-based head tracking of variable-illumination glasses for use with 3-D video display and video game entertainment devices.
  • 2. Description of the Related Art
  • Video games have become more immersive as technology progresses. Video game consoles are often produced with state-of-the-art processors, extremely fast memory, and high-end graphics cards. Input controllers have evolved from simple knobs, joysticks, and button-based controllers to accelerometer-enabled controllers that a user can swing in his or her hands or wear. Further input technologies involve a camera, usually mounted on top of a television, tracking a user's body, including tracking his or her head, torso, arms, and legs. Users can control such video games by simply moving their bodies or parts thereof. For example, a player of a skateboarding game can duck down so that he or she clears a virtual bridge.
  • Three-dimensional (3-D, or 3D) televisions help immerse users in events happening on their display screens. For such 3-D televisions, a user sometimes dons 3-D glasses. Earlier 3-D glasses included red and blue lenses for discerning an anaglyph. Shuttered 3-D glasses have lenses that rapidly and alternatingly switch between being opaque and transparent in synchronization with a display that rapidly shows left and right images. Other types of 3-D presentation technology exist. Many are similar in that they present a separate two-dimensional image to a viewer's left eye and a separate two-dimensional image to the viewer's right eye either contemporaneously or very rapidly (e.g., at 60 Hz) in order to trick the viewer's brain into interpreting the stereoscopic images as a 3-D environment.
  • Video games utilizing 3-D display technologies can immerse a player in a game through the use of 3-D effects on the screen. Furthermore, video game consoles with body tracking can use 3-D effects to coordinate a player's actual movements in the real world with his or her virtual movement in a displayed virtual world. Head tracking can be critical for games that render based on where a user's head is. For example, as a user steps toward the television in his living room, a video game console can render a 3-D virtual pyre on the television so that it appears like he moves closer to it.
  • Thus, tracking the location, orientation, and movement of a viewer or other user's head can be important for some video games, especially those that use head-tracking to render 3-D objects closer to the user. There exists a need in the art for more robust head tracking that is not too expensive for average consumers.
  • BRIEF SUMMARY
  • Devices, systems, and methods of the present disclosure are related to camera-based head tracking of a user wearing a pair of glasses such as 3-D glasses for a 3-D television. The pair of glasses has a light and a light guide, including a hollow, fiber optic, transparent/translucent foam, etc. light-conveying interior, such as a light guide plate (LGP) that conveys and disperses light through the frame. The light can be carried from the lights to all corners of the frames in order to increase visibility of the glasses to a tracking camera. The lights can optionally be turned brighter or dimmer depending on competing ambient light. Staggered retroreflector strips can also be mounted on the glasses with reflective backsides that help reflect light back into the LGP.
  • The pair of glasses can have lenticular printing, angular metamerism plastic, or mini prisms that are aligned to give a predetermined color (e.g., red) in one predetermined orientation (e.g., facing the camera, 0 degrees of yaw) and another predetermined color (e.g., violet) in another predetermined orientation (e.g., looking to the left at 90 degrees). A color camera can then be used to not only track a position of the glasses but also determine the wearer's head angle by identifying the color refracted and reflected by the mini prisms. The change in colors observed by a camera can be used to determine the rate of change of the angle (or higher derivatives) of the wearer's head without determining the absolute angle of the wearer's head.
  • Technical advantages of the methods, devices, and systems herein include robust head tracking using glasses that are visible in a variety of lighting conditions. The glasses can be inexpensively mass produced. Variable lighting on board the glasses offers less distraction to the user in low light situations and reduces power requirements. LGP can extend the lighting to all corners of the glasses frames so that its geometry is more pronounced and easier to pick up by the camera. Retroreflectors not only reflect ambient light, but their reflective backs can increase the efficiency of the LGP by reflecting light back into the LGP. Prisms and the colors they refract and reflect can be used for angle or other orientation measurements. Such prisms can also refract and reflect near-infrared light, which can often be picked up by charge-coupled device (CCD) cameras.
  • An embodiment of the present disclosure relates to a pair of glasses for tracking a wearer's head, comprising a frame, including a light guide for conveying light, a light supported by the frame, the light being adjustable in brightness (such as between a high and low brightness level), and a circuit operatively coupled to the light, the circuit configured to adjust a brightness of the light based on an amount of ambient light. The frame can include an LGP to convey light from the light, and a retroreflector with a reflective backing, supported by the frame, can lie over a section of the LGP. The circuit can be configured to adjust the brightness of the light among a predetermined number of brightness levels. Ambient light can be measured by an on-board sensor or from an off-board source. The glasses can be 3-D glasses with lenses that alternate between being opaque and transparent. The alternations can be synchronized with the lenses so that they lessen distractions to the wearer of the glasses.
  • An embodiment relates to a system for tracking a user's head, comprising a frame, including a light guide for conveying light, a light supported by the frame, the light being adjustable in brightness, a circuit operatively coupled to the lights, the circuit configured to adjust a brightness of the light, and a receiver supported by the device for receiving an indication to adjust the brightness of the light in response to ambient light. The system further comprises a camera, a circuit configured to use the camera to track the glasses, and a transmitter operatively coupled to the circuit. The circuit is configured to transmit using the transmitter to the receiver an indication to adjust the brightness of the light based on an amount of ambient light.
  • An embodiment relates to a system for determining a user's head angle, comprising glasses, a camera, and a processor configured to use the camera to image the glasses and determine an orientation of the glasses based on a color refracted and reflected by prisms supported by the frame and which are configured to refract and reflect toward the camera a first predetermined color corresponding to a first predetermined orientation of the glasses and a second predetermined color corresponding to a second predetermined orientation of the glasses. An angle of a head of a wearer of the glasses can be determined. Optionally, the prisms can be configured to refract and reflect a first predetermined near-infrared wavelength corresponding to a first predetermined orientation of the glasses and a second predetermined near-infrared wavelength corresponding to a second predetermined orientation of the glasses. The camera can image near-infrared wavelengths and the processor is configured to determine an orientation of the glasses based on a near-infrared wavelength of electromagnetic radiation reflected by the prisms.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • A further understanding of the nature and advantages of the present invention may be realized by reference to the following drawings. In the appended figures, similar components or features may have the same reference label.
  • FIG. 1A illustrates 3-D glasses in accordance with an embodiment.
  • FIGS. 1B-1E illustrate alternate cross-sections of the 3-D glasses of FIG. 1A.
  • FIG. 2 illustrates 3-D glasses in accordance with an embodiment.
  • FIG. 3 illustrates 3-D glasses in accordance with an embodiment.
  • FIG. 4A illustrates 3-D glasses with prisms in accordance with an embodiment.
  • FIG. 4B illustrates the 3-D glasses of FIG. 4A rotated.
  • FIG. 5 illustrates a prism refracting and reflecting light in accordance with an embodiment.
  • FIG. 6 illustrates head tracking for a video game console in accordance with an embodiment.
  • FIG. 7 is an example computer system suitable for use with embodiments of the invention.
  • The figures will now be used to illustrate different embodiments in accordance with the invention. The figures are specific examples of embodiments and should not be interpreted as limiting embodiments, but rather exemplary forms and procedures.
  • DETAILED DESCRIPTION
  • The present disclosure is related to tracking of a user's head using a camera and illuminated glasses. The glasses can be 3-D glasses that are synchronized in time with a television. The glasses employ a light guide, such as LGP, that conveys and disperses light from LEDs through the frame. The glasses can also have retroreflectors that not only reflect ambient light to the tracking camera, but also have a reflective backing that reflects light back into the LGP over which it is affixed. The LEDs can be dimmed or brightened depending on the ambient light. A sensor for the ambient light can be on-board or off-board the glasses. The camera itself can be used as the off-board sensor.
  • Mini prisms can be affixed to the glasses in order to refract and reflect a predetermined color in one predetermined orientation while refracting and reflecting a different predetermined color at another predetermined orientation. Lenticular printing or plastic with angular metamerism can be used instead of or in addition to mini prisms. In another embodiment, the rate of motion of the user's head can be determined by the amount of change of color over time.
  • FIG. 1A illustrates 3-D glasses in accordance with an embodiment. Although non-3-D glasses can certainly be used, this description will hereon refer to 3-D glasses. 3-D glasses 100 include frame 102, right lens 104, and left lens 106. Right and left lenses 104/106 can shutter between transparent and opaque states in sync with a 3-D display, such as a 3-D television. Frame 102 includes a light guide, here light guide plate (LGP) 110, that consists of transparent-to-translucent material (or hollowed material), such as polymethyl methacrylate (PMMA), that conveys light from four embedded LEDs 108 that shine into the LGP.
  • LEDs 108 can be controlled through circuit 112, which in turn is connected to ambient light sensor 114. If ambient light sensor 114 senses very bright light, then circuit 112 brightens LEDs 108 so that LGP 110 can be better made out by a camera that tracks the glasses. If ambient light sensor 114 senses very low light levels, then circuit 112 dims LEDs 108 so that the glow of LGP 110 does not distract the wearer.
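  • As an illustration only, the kind of decision logic circuit 112 might implement could be sketched as follows; the function names, the lux thresholds, and the 16-level brightness scale are assumptions, not details taken from the disclosure.

```python
# Hypothetical sketch of the ambient-light control described above.
# read_ambient_lux() and set_led_level() stand in for sensor 114 and a
# driver for LEDs 108; the lux thresholds are illustrative only.

BRIGHT_LUX = 300   # assumed "very bright" ambient level
DARK_LUX = 30      # assumed "very low" ambient level

def adjust_brightness(read_ambient_lux, set_led_level):
    """Brighten the LEDs in bright rooms, dim them in dark rooms."""
    lux = read_ambient_lux()
    if lux >= BRIGHT_LUX:
        set_led_level(15)   # highest of 16 assumed levels, so the camera can resolve the LGP
    elif lux <= DARK_LUX:
        set_led_level(2)    # faint glow, less distracting to the wearer
    else:
        set_led_level(8)    # middle level for ordinary lighting
```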
  • The lenses can be synchronized with the LED duty cycles. For example, when lens 104 is opaque, LEDs 108 on the right side of the glasses can be turned on, and when lens 104 is transparent, LEDs 108 on the right side of the glasses can be turned off. The on/off duty cycle of the LEDs not only saves battery power but can lessen distractions to the user.
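  • A minimal sketch of that duty-cycle synchronization is shown below; the callback and setter names are hypothetical, and extending the same rule to the left side is an assumption.

```python
# Hypothetical sketch: drive each side's LEDs only while that side's lens is
# opaque, so the glow is hidden from the wearer and battery power is saved.

def on_shutter_state(right_lens_opaque: bool, left_lens_opaque: bool,
                     set_right_leds, set_left_leds):
    set_right_leds(right_lens_opaque)  # on while right lens 104 is opaque
    set_left_leds(left_lens_opaque)    # on while left lens 106 is opaque
```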
  • Wireless receiver 116 synchronizes the 3-D glasses to a 3-D television. Receiver 116 also can receive an indicator to turn up or down the brightness of the LEDs in the glasses. In this way, an off-board ambient light sensor, such as a tracking camera, can determine whether there is enough ambient light and send an indicator, through a wireless connection, to the glasses signaling to increase or decrease the LED brightnesses. In some embodiments, an off-board source may determine that certain lights need to be turned on brighter, and the corresponding lights may be individually brightened.
  • LEDs 108 can be manufactured to emit near-infrared wavelengths that are visible to many charge-coupled device (CCD) cameras. Electromagnetic radiation from near-infrared LEDs is not visible to humans and may be less likely to distract a wearer.
  • LEDs 108 can be adjusted in brightness within a finite number of brightness levels (e.g., 0 through 15) or have an infinite number of levels within a range. With a finite number of brightness levels (e.g., 16), the glasses can communicate using various brightness levels back to the camera and associated electronics. For example, the glasses may have a low battery, and the LEDs may indicate this by shining at level 15 (highest), then level 0 (off), then 8 (middle) in rapid succession. A video game console connected with the camera can decipher this code and display “Low Battery Warning” on the screen. This may avoid the need to have a low battery indicator on-board the glasses.
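  • On the console side, decoding such a brightness code might look like the sketch below; the 15-0-8 sequence is the example given above, while the class and method names are assumptions.

```python
# Hypothetical sketch of decoding the "low battery" brightness code
# (level 15, then 0, then 8 in rapid succession) from camera observations.

from collections import deque

LOW_BATTERY_CODE = (15, 0, 8)

class BrightnessCodeDecoder:
    def __init__(self):
        self.recent = deque(maxlen=len(LOW_BATTERY_CODE))

    def observe(self, estimated_level: int) -> bool:
        """Feed one per-frame brightness estimate; return True when the code is seen."""
        self.recent.append(estimated_level)
        return tuple(self.recent) == LOW_BATTERY_CODE

# Usage: if decoder.observe(level) returns True, the console can display
# a "Low Battery Warning" on the screen.
```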
  • FIG. 1B illustrates cross section A-A for FIG. 1A. LGP 110 is one integral piece of translucent material, such as polymethyl methacrylate (PMMA) or another plastic suitable for a light guide. Cross section geometries other than rectangular can also be used, for example circular, oval, triangular, complex, etc.
  • FIG. 1C illustrates an alternate cross section A-A for FIG. 1A. In this case, translucent/transparent material 152 surrounds core material 154, which can be other translucent/transparent material, foam, air, etc.
  • FIG. 1D illustrates an alternate cross section A-A for FIG. 1A. In this case, material 152 may be opaque and configured as a c-channel, with one side open. Translucent/transparent core material 154 resides in the c-channel.
  • FIG. 1E illustrates an alternate cross section A-A for FIG. 1A. In this case, translucent/transparent material 152 has ridges 158 that can further disperse and reflect light conveyed through the frames. The ridges may be prisms as will be described later.
  • FIGS. 2 and 3 illustrate 3-D glasses according to an embodiment. In FIG. 2, LEDs 208 embedded in frame 200 illuminate LGP 210, which is partially covered by retroreflectors 218. Retroreflectors 218 are affixed in 5-millimeter (mm) wide strips over portions of LGP 210. The face of the retroreflectors reflects light, either ambient or from an illumination light, back toward the camera so that the camera can better track the glasses. A reflective backing on the rear of retroreflectors 218 helps bounce light escaping from LGP 210 back into LGP 210, which causes more internal reflections within the LGP. More light stays in the LGP and is carried further through the frame. Other variations of patterns can be used depending on what patterns are easiest for a camera to track and depending upon the flexibility of shapes for the LGP. In the exemplary embodiment, the LGP is relatively straight in the direction of LED illumination, but it is possible to curve the LGP more and convey light around the frame.
  • In FIG. 3, LEDs 308 embedded in frame 300 illuminate LGP 310. LGP 310 is partially covered by small sections of retroreflectors 318. The small sections of the retroreflectors have foil backings to reflect light back into the LGP. This configuration having smaller sections of retroreflectors may be better applied to situations in which a user is closer to a camera and the camera can therefore distinguish the retroreflectors from the LGP. In other situations, a first player can wear the glasses of FIG. 2, while a second player can wear the glasses of FIG. 3. The camera can then distinguish between players one and two by recognizing the pattern of retroreflectors on the respective user's glasses, as in the sketch below.
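  • A console-side sketch of this pattern-based player identification follows; the blob-count heuristic and its threshold are illustrative assumptions, not the disclosed method.

```python
# Hypothetical sketch: tell player one's glasses (long 5 mm strips, FIG. 2)
# from player two's glasses (many small retroreflector patches, FIG. 3)
# by the number of distinct retroreflector blobs the camera segments.

def identify_player(retroreflector_blob_count: int) -> int:
    """Map a segmented blob count to a player number (threshold is illustrative)."""
    FEW_LARGE_STRIPS = 6     # assumed upper bound for the strip pattern of FIG. 2
    if retroreflector_blob_count <= FEW_LARGE_STRIPS:
        return 1             # strip-patterned glasses -> player one
    return 2                 # many small patches -> player two
```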
  • In some embodiments, whole sections of LGP can be turned on or off. For example, the top horizontal bar of the frames can be on while the U sections that hold the lenses can be off. These sections can be optically isolated from one another, by metal foil or other opaque material, so that there is minimal bleed through from section to section.
  • FIGS. 4A and 4B illustrate 3-D glasses with prisms in accordance with an embodiment.
  • In the figures, glasses 400 have mini prisms 424 in frames 402. The mini prisms are illuminated by white light 422, which can be ambient light or a light collocated with camera 420. Camera 420 is stationary.
  • In FIG. 4A, the glasses are in front of and at zero degrees of yaw to the camera. In this configuration, the white light from white light 422 is refracted and reflected at a predetermined wavelength 426. In FIG. 4B, the glasses are still in front of the camera, but they have been yawed θ degrees relative to the camera. Because the mini prisms refract and reflect differently at this angle, the white light 422 is refracted and reflected at a different predetermined wavelength 428 back to the camera. The camera can distinguish between the wavelengths (i.e., colors) and, based on a calibration table or other lookup table, determine the angle at which the glasses are oriented with respect to the camera.
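  • One way such a calibration-table lookup could be sketched is shown below; the wavelength-to-yaw pairs are illustrative values, not calibration data from the disclosure.

```python
# Hypothetical sketch: map the dominant wavelength reflected by the mini
# prisms to a calibrated yaw angle by nearest-neighbor lookup.

# (wavelength in nanometers, yaw in degrees) -- illustrative calibration points
CALIBRATION = [
    (620, 0),    # red    -> facing the camera
    (590, 30),   # orange -> 30 degrees of yaw
    (550, 60),   # green  -> 60 degrees of yaw
    (450, 90),   # violet -> looking 90 degrees to the side
]

def yaw_from_wavelength(observed_nm: float) -> float:
    """Return the yaw angle whose calibrated wavelength is closest to the observation."""
    _, yaw = min(CALIBRATION, key=lambda entry: abs(entry[0] - observed_nm))
    return yaw
```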
  • Lenticular printing can be used instead of or in addition to mini prisms to refract and reflect light such that different colors are emitted at different angles. Plastics or other materials engineered with angular metamerism can also be used such that different angles produce different colors. Whether mini prisms, lenticular printing, angular metamerism, or other materials with similar properties are used, the color of light refracted and reflected from the materials can be calibrated with predetermined angles of pitch, yaw, and roll of the glasses.
  • FIG. 5 illustrates a single mini prism refracting and reflecting light. White light 522 enters prism 524 from the top right. Upon entry into the prism, the white light refracts and separates into colors, a phenomenon known as dispersion. The angle of separation of colors depends on the wavelength-dependent refractive index of the material out of which the prism is made. The beam reflects off of reflective portion 530 and back through the prism, until it exits prism 524 in its component colors 532.
  • The dispersion angle between two colors, for example red at angle α1 and orange at angle α2, is a function of the material used for the prism. The angles can be predetermined, calculated from the refractive index, or empirically measured from the prism. For example, when the prism is angled between 30-40 degrees, red would be refracted and reflected out the front. Therefore, a camera seeing red can infer that the prism (and the glasses to which the prism is affixed) is at a 30-40 degree angle from head on.
  • The difference between the angles, Δα = α1 − α2, can be predetermined so that a rate of change can be determined. For example, Δα/Δt ≈ ω (the angular velocity). For example, if the difference between the angles for red and orange is 12 degrees, then a camera seeing red and then orange in 1 second can determine that the prism (and the glasses to which the prism is affixed) is rotating at 12 degrees/second.
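  • Under the illustrative assumption that red and orange are calibrated 12 degrees apart, the rate estimate reduces to a simple difference quotient, as in the sketch below; the specific angle values are assumptions.

```python
# Hypothetical sketch: estimate angular velocity from two color observations
# using the calibrated angle for each color and the time between observations.

CALIBRATED_ANGLE_DEG = {"red": 35.0, "orange": 47.0}  # illustrative; 12-degree separation

def angular_velocity(color_t0: str, color_t1: str, dt_seconds: float) -> float:
    """Return approximate angular velocity in degrees per second (delta_alpha / delta_t)."""
    delta_alpha = CALIBRATED_ANGLE_DEG[color_t1] - CALIBRATED_ANGLE_DEG[color_t0]
    return delta_alpha / dt_seconds

# Example from the text: seeing red, then orange, one second apart
# gives roughly 12 degrees/second.
print(angular_velocity("red", "orange", 1.0))  # 12.0
```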
  • FIG. 6 illustrates head tracking for a video game according to an embodiment. Player's head 634 with glasses 602 is at one position and orientation at time t1, another position and orientation at time t2, and a further position and orientation at time t3. Head 634 moves by swinging down and to the right during time period t1 through t3. The player can use conventional game controller 648 to play a video game and/or use only his or her body movements to control the game. Camera 620, seated in this embodiment on top of display 642, captures the user's position and movements and feeds them to game console 644. Tracking algorithms can be integral with camera 620 or be part of software or hardware in game console 644. In other embodiments, tracking logic can be in yet a separate device.
  • As the player swings his head and enjoys the game, the system tracks player's head 634 using camera 620, which tracks illuminated glasses 602. In certain instances, however, ambient lighting may be of insufficient intensity for camera 620 to resolve glasses 602. For example, the player or his roommates may turn out the lights and draw the curtains closed to make the game experience more encompassing. Upon the light conditions deteriorating, light sensor 646, which is distinct from game console 644 and camera 620, detects or otherwise measures a lack of ambient light and sends a signal to game console 644. Game console 644 compares the measured ambient light to a threshold value. The threshold value may be preset in the factory settings of the camera or console or be adjusted by the user. In its comparison, game console 644 determines that the ambient light level is beyond the threshold value. Game console 644 then sends a wireless signal to the glasses to adjust the lights embedded in the glasses. The tracked head movements can be used as inputs to the game, to render 3-D images, etc.
  • “Beyond a threshold value” includes a value that is farther than where the threshold value delineates, whether lower than or higher than the threshold value. For example, a value of 9 is not beyond a threshold value of 10 if the threshold value establishes a ceiling or maximum. However, a value of 11 is beyond the threshold value in this threshold ceiling situation. Conversely, a value of 11 is not beyond a threshold value of 10 if the threshold of 10 establishes a floor or minimum. A value of 9 would be beyond the threshold of 10 in this threshold floor situation.
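  • A small helper capturing this "beyond a threshold" convention might look like the following sketch; the console-side usage and the command name in the comment are hypothetical.

```python
# Hypothetical sketch of the "beyond a threshold" test described above,
# covering both the ceiling and the floor interpretations.

def beyond_threshold(value: float, threshold: float, threshold_is_ceiling: bool) -> bool:
    """True if the value lies past the threshold in the direction the threshold guards."""
    if threshold_is_ceiling:
        return value > threshold   # e.g., 11 is beyond a ceiling of 10; 9 is not
    return value < threshold       # e.g., 9 is beyond a floor of 10; 11 is not

# Console-side use (FIG. 6): if the measured ambient light is beyond the
# minimum needed for tracking, signal the glasses to brighten their LEDs.
if beyond_threshold(value=5.0, threshold=10.0, threshold_is_ceiling=False):
    pass  # e.g., send_wireless_brightness_command(level=15)  -- hypothetical
```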
  • FIG. 7 illustrates an example of a hardware system suitable for implementing a device in accordance with various embodiments. This block diagram illustrates a computer system 700, such as a personal computer, video game console and associated display, mobile device, personal digital assistant, or other digital device, suitable for practicing embodiments of the invention. Computer system 700 includes a central processing unit (CPU) 705 for running software applications and optionally an operating system. CPU 705 may be made up of one or more homogeneous or heterogeneous processing cores. Memory 710 stores applications and data for use by the CPU 705. Storage 715 provides non-volatile storage and other computer readable media for applications and data and may include fixed disk drives, removable disk drives, flash memory devices, and CD-ROM, DVD-ROM, Blu-ray, HD-DVD, or other optical storage devices, as well as signal transmission and storage media. User input devices 720 communicate user inputs from one or more users to the computer system 700, examples of which may include keyboards, mice, joysticks, touch pads, touch screens, still or video cameras, and/or microphones. Network interface 725 allows computer system 700 to communicate with other computer systems via an electronic communications network, and may include wired or wireless communication over local area networks and wide area networks such as the Internet. An audio processor 730 is adapted to generate analog or digital audio output from instructions and/or data provided by the CPU 705, memory 710, and/or storage 715. The components of computer system 700, including CPU 705, memory 710, data storage 715, user input devices 720, network interface 725, and audio processor 730 are connected via one or more data buses 735.
  • A graphics subsystem 740 is further connected with data bus 735 and the components of the computer system 700. The graphics subsystem 740 includes a graphics processing unit (GPU) 745 and graphics memory 750. Graphics memory 750 includes a display memory (e.g., a frame buffer) used for storing pixel data for each pixel of an output image. Graphics memory 750 can be integrated in the same device as GPU 745, connected as a separate device with GPU 745, and/or implemented within memory 710. Pixel data can be provided to graphics memory 750 directly from the CPU 705. Alternatively, CPU 705 provides the GPU 745 with data and/or instructions defining the desired output images, from which the GPU 745 generates the pixel data of one or more output images. The data and/or instructions defining the desired output images can be stored in memory 710 and/or graphics memory 750. In an embodiment, the GPU 745 includes 3D rendering capabilities for generating pixel data for output images from instructions and data defining the geometry, lighting, shading, texturing, motion, and/or camera parameters for a scene. The GPU 745 can further include one or more programmable execution units capable of executing shader programs.
  • The graphics subsystem 740 periodically outputs pixel data for an image from graphics memory 750 to be displayed on display device 755. Display device 755 can be any device capable of displaying visual information in response to a signal from the computer system 700, including CRT, LCD, plasma, and OLED displays. Computer system 700 can provide the display device 755 with an analog or digital signal.
  • In accordance with various embodiments, CPU 705 is one or more general-purpose microprocessors having one or more processing cores. Further embodiments can be implemented using one or more CPUs with microprocessor architectures specifically adapted for highly parallel and computationally intensive applications, such as media and interactive entertainment applications.
  • The components of the system can be connected via a network, which in different embodiments may be any combination of the following: the Internet, an IP network, an intranet, a wide-area network (“WAN”), a local-area network (“LAN”), a virtual private network (“VPN”), the Public Switched Telephone Network (“PSTN”), or any other type of network supporting data communication between the devices described herein. A network may include both wired and wireless connections, including optical links. Many other examples are possible and apparent to those skilled in the art in light of this disclosure. In the discussion herein, a network may or may not be noted specifically.
  • It should be noted that the methods, systems, and devices discussed above are intended merely to be examples. It must be stressed that various embodiments may omit, substitute, or add various procedures or components as appropriate. For instance, it should be appreciated that, in alternative embodiments, the methods may be performed in an order different from that described, and that various steps may be added, omitted, or combined. Also, features described with respect to certain embodiments may be combined in various other embodiments. Different aspects and elements of the embodiments may be combined in a similar manner. Also, it should be emphasized that technology evolves and, thus, many of the elements are examples and should not be interpreted to limit the scope of the invention.
  • Specific details are given in the description to provide a thorough understanding of the embodiments. However, it will be understood by one of ordinary skill in the art that the embodiments may be practiced without these specific details. For example, well-known circuits, processes, algorithms, structures, and techniques have been shown without unnecessary detail in order to avoid obscuring the embodiments.
  • Also, it is noted that the embodiments may be described as a process which is depicted as a flow diagram or block diagram. Although each may describe the operations as a sequential process, many of the operations can be performed in parallel or concurrently. In addition, the order of the operations may be rearranged. A process may have additional steps not included in the figure.
  • Moreover, as disclosed herein, the term “memory” or “memory unit” may represent one or more devices for storing data, including read-only memory (ROM), random access memory (RAM), magnetic RAM, core memory, magnetic disk storage mediums, optical storage mediums, flash memory devices, or other computer-readable mediums for storing information. The term “computer-readable medium” includes, but is not limited to, portable or fixed storage devices, optical storage devices, wireless channels, a SIM card, other smart cards, and various other mediums capable of storing, containing, or carrying instructions or data.
  • Furthermore, embodiments may be implemented by hardware, software, firmware, middleware, microcode, hardware description languages, or any combination thereof. When implemented in software, firmware, middleware, or microcode, the program code or code segments to perform the necessary tasks may be stored in a computer-readable medium such as a storage medium. Processors may perform the necessary tasks.
  • Having described several embodiments, it will be recognized by those of skill in the art that various modifications, alternative constructions, and equivalents may be used without departing from the spirit of the invention. For example, the above elements may merely be a component of a larger system, wherein other rules may take precedence over or otherwise modify the application of the invention. Also, a number of steps may be undertaken before, during, or after the above elements are considered. Accordingly, the above description should not be taken as limiting the scope of the invention.

Claims (20)

What is claimed is:
1. A pair of glasses for tracking a wearer's head, comprising:
a frame, including a light guide for conveying light;
a light supported by the frame, the light being adjustable in brightness;
a circuit operatively coupled to the light, the circuit configured to adjust a brightness of the light based on an amount of ambient light.
2. The glasses of claim 1 wherein the light guide comprises:
a light guide plate (LGP).
3. The glasses of claim 2 further comprising:
a retroreflector supported by the frame to reflect ambient light.
4. The glasses of claim 3 wherein the retroreflector has a reflective backing and lies over a section of the LGP, the reflective backing configured to reflect light conveyed by the LGP back into the LGP.
5. The glasses of claim 2 wherein the LGP comprises polymethyl methacrylate (PMMA).
6. The glasses of claim 1 wherein the light is a light emitting diode (LED).
7. The glasses of claim 1 wherein the light is enabled to emit near-infrared wavelengths of light.
8. The glasses of claim 1 wherein the circuit is configured to adjust the brightness of the light among a predetermined number of brightness levels.
9. The glasses of claim 1 further comprising:
an ambient light sensor supported by the frame.
10. The glasses of claim 1 further comprising:
a receiver supported by the frame for receiving an indication from an off-board device to adjust the brightness of the light based on an amount of ambient light.
11. The glasses of claim 1 wherein the glasses are 3-D glasses further comprising:
a pair of lenses coupled to the frame, the lenses configured to alternate between opaque and transparent states in synchronization with a 3-D display.
12. The glasses of claim 11 wherein the circuit is configured to synchronize a duty cycle of the light with the lenses.
13. A system for tracking a user's head, comprising:
glasses comprising:
a frame, including a light guide for conveying light;
a light supported by the frame, the light being adjustable in brightness;
a circuit operatively coupled to the light, the circuit configured to adjust a brightness of the light;
a receiver supported by the frame for receiving an indication to adjust the brightness of the light in response to ambient light;
a camera;
a circuit configured to use the camera to track the glasses; and
a transmitter operatively coupled to the circuit,
wherein the circuit is configured to transmit, using the transmitter, an indication to the receiver to adjust the brightness of the light based on an amount of ambient light.
14. A system for determining a user's head angle, comprising:
glasses comprising:
a frame;
prisms supported by the frame,
wherein the prisms are configured to refract and reflect toward the camera a first predetermined color corresponding to a first predetermined orientation of the glasses and a second predetermined color corresponding to a second predetermined orientation of the glasses;
a camera;
a processor configured to use the camera to image the glasses and determine an orientation of the glasses based on a color refracted and reflected by the prisms,
wherein an angle of a head of a wearer of the glasses can be determined from the orientation of the glasses.
15. The glasses of claim 14 wherein:
the prisms are configured to refract and reflect toward the camera a first predetermined near-infrared wavelength corresponding to a first predetermined orientation of the glasses and a second predetermined near-infrared wavelength corresponding to a second predetermined orientation of the glasses;
the camera can image near-infrared wavelengths;
the processor is configured to determine an orientation of the glasses based on a near-infrared wavelength refracted and reflected by the prisms.
16. The glasses of claim 14 wherein the prisms are configured in strips on the glasses frame.
17. The glasses of claim 14 further comprising:
a light supported by the frame and configured to illuminate the prisms.
18. The glasses of claim 17 further comprising:
a circuit operatively coupled to the light, the circuit configured to adjust a brightness of the light based on an amount of ambient light.
19. The glasses of claim 14 further comprising:
a retroreflector supported by the frame to reflect ambient light.
20. The glasses of claim 19 wherein the retroreflector has a reflective backing and lies over a section of a light guide plate (LGP) of the frame, the reflective backing configured to reflect light conveyed by the LGP back into the LGP.
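To illustrate the brightness-adjustment behavior recited in claims 1, 8, and 9, here is a short hypothetical sketch that maps an ambient-light reading onto a predetermined number of brightness levels. The threshold values, level table, function name, and the choice to drive the LED harder in brighter rooms are assumptions for illustration only; none of them is recited in, or limits, the claims.

```python
# Hypothetical sketch of the claimed circuit behavior: quantize an
# ambient-light reading into a predetermined number of brightness levels.
BRIGHTNESS_LEVELS = (0.25, 0.50, 0.75, 1.00)   # assumed duty-cycle fractions
AMBIENT_THRESHOLDS = (50, 200, 800)            # assumed lux breakpoints


def led_brightness(ambient_lux: float) -> float:
    """One plausible policy: the brighter the room, the brighter the LED,
    so a tracking camera can still pick the frame out of the background."""
    level = sum(ambient_lux > t for t in AMBIENT_THRESHOLDS)
    return BRIGHTNESS_LEVELS[level]


assert led_brightness(10) == 0.25      # dim room: lowest level suffices
assert led_brightness(1000) == 1.00    # bright room: drive the LED fully
```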
US13/260,701 2010-10-07 2010-10-07 3-d glasses with illuminated light guide Abandoned US20130201285A1 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/US2010/051836 WO2012047222A1 (en) 2010-10-07 2010-10-07 3-d glasses with illuminated light guide

Publications (1)

Publication Number Publication Date
US20130201285A1 true US20130201285A1 (en) 2013-08-08

Family

ID=45924829

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/260,701 Abandoned US20130201285A1 (en) 2010-10-07 2010-10-07 3-d glasses with illuminated light guide
US12/900,403 Active 2031-04-06 US8922644B2 (en) 2010-10-07 2010-10-07 Tracking head position and orientation

Family Applications After (1)

Application Number Title Priority Date Filing Date
US12/900,403 Active 2031-04-06 US8922644B2 (en) 2010-10-07 2010-10-07 Tracking head position and orientation

Country Status (2)

Country Link
US (2) US20130201285A1 (en)
WO (1) WO2012047222A1 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130242056A1 (en) * 2012-03-14 2013-09-19 Rod G. Fleck Imaging structure emitter calibration
US20160076934A1 (en) * 2014-09-17 2016-03-17 Delphi Technologies, Inc. Vehicle optical sensor system
US9298012B2 (en) 2012-01-04 2016-03-29 Microsoft Technology Licensing, Llc Eyebox adjustment for interpupillary distance
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9581820B2 (en) 2012-06-04 2017-02-28 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US20170068120A1 (en) * 2014-02-24 2017-03-09 Lunettes Inc. Glasses type information terminal, information processing device, computer program and recording medium
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US9779643B2 (en) 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
WO2018132415A1 (en) * 2017-01-12 2018-07-19 Janssen Pharmaceutica Nv Trans-orbital infrared light therapy
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US10388073B2 (en) 2012-03-28 2019-08-20 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US20200110289A1 (en) * 2018-10-03 2020-04-09 Carlos de la Fuente Ambient light adjustable eyeglasses
US10874874B2 (en) 2019-04-15 2020-12-29 Janssen Pharmaceutica Nv Transorbital NIR light therapy device
US10926102B2 (en) 2019-04-15 2021-02-23 Janssen Pharmaceutica Nv Transorbital NIR LIGHT THERAPY DEVICES
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1982306A1 (en) * 2006-02-07 2008-10-22 France Télécom Method of tracking the position of the head in real time in a video image stream
US9575392B2 (en) 2013-02-06 2017-02-21 Apple Inc. Electronic device with camera flash structures
WO2015048030A1 (en) * 2013-09-24 2015-04-02 Sony Computer Entertainment Inc. Gaze tracking variations using visible lights or dots
WO2015048026A1 (en) 2013-09-24 2015-04-02 Sony Computer Entertainment Inc. Gaze tracking variations using dynamic lighting position
WO2015048028A1 (en) 2013-09-24 2015-04-02 Sony Computer Entertainment Inc. Gaze tracking variations using selective illumination
US10523993B2 (en) * 2014-10-16 2019-12-31 Disney Enterprises, Inc. Displaying custom positioned overlays to a viewer
US10455138B2 (en) 2015-04-20 2019-10-22 Ian Schillebeeckx Camera calibration with lenticular arrays
CN104825168B (en) * 2015-05-23 2017-04-26 京东方科技集团股份有限公司 Cervical vertebra movement measurement device and method
US10043281B2 (en) 2015-06-14 2018-08-07 Sony Interactive Entertainment Inc. Apparatus and method for estimating eye gaze location
EP3550259B1 (en) 2017-01-17 2021-11-10 National Institute of Advanced Industrial Science and Technology Marker and posture estimation method using marker
JP7253440B2 (en) * 2019-05-09 2023-04-06 東芝テック株式会社 Tracking device and information processing program

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201823A1 (en) * 2009-02-10 2010-08-12 Microsoft Corporation Low-Light Imaging Augmented With Non-Intrusive Lighting
US20120162221A1 (en) * 2009-08-06 2012-06-28 Sony Corporation Method and apparatus for stereoscopic multi-users display

Family Cites Families (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH03119821U (en) * 1990-03-22 1991-12-10
US6057811A (en) 1993-09-28 2000-05-02 Oxmoor Corporation 3-D glasses for use with multiplexed video images
JP3752323B2 (en) * 1996-09-09 2006-03-08 オリンパス株式会社 Camera ranging device
CA2307877C (en) 1997-10-30 2005-08-30 The Microoptical Corporation Eyeglass interface system
US7483049B2 (en) * 1998-11-20 2009-01-27 Aman James A Optimizations for live event, real-time, 3D object tracking
US6200713B1 (en) * 1999-07-23 2001-03-13 Eastman Kodak Company Method and apparatus for locating arrays with periodic structures relative to composite images
US20020101988A1 (en) * 2001-01-30 2002-08-01 Jones Mark A. Decryption glasses
US20040063480A1 (en) 2002-09-30 2004-04-01 Xiaoling Wang Apparatus and a method for more realistic interactive video games on computers or similar devices
JP2004287699A (en) * 2003-03-20 2004-10-14 Tama Tlo Kk Image composition device and method
US7292269B2 (en) 2003-04-11 2007-11-06 Mitsubishi Electric Research Laboratories Context aware projector
JP4279079B2 (en) 2003-04-16 2009-06-17 Sriスポーツ株式会社 Automatic golf swing tracking method
US7164518B2 (en) * 2003-10-10 2007-01-16 Yuping Yang Fast scanner with rotatable mirror and image processing system
US7123411B2 (en) * 2003-11-18 2006-10-17 Merlin Technology Limited Liability Company Reflective multi-image surface
US20070263923A1 (en) 2004-04-27 2007-11-15 Gienko Gennady A Method for Stereoscopic Measuring Image Points and Device for Carrying Out Said Method
US6857739B1 (en) * 2004-06-08 2005-02-22 Peter Watson Illuminated eyewear and a method for illuminating eyewear
US20060012974A1 (en) * 2004-07-16 2006-01-19 Chi-Yang Su Multifunctional glasses
US7295947B2 (en) * 2004-09-10 2007-11-13 Honeywell International Inc. Absolute position determination of an object using pattern recognition
US8224024B2 (en) * 2005-10-04 2012-07-17 InterSense, LLC Tracking objects with markers
WO2007117485A2 (en) 2006-04-03 2007-10-18 Sony Computer Entertainment Inc. Screen sharing method and apparatus
JP4884867B2 (en) 2006-07-25 2012-02-29 任天堂株式会社 Information processing apparatus and information processing program
GB0622451D0 (en) 2006-11-10 2006-12-20 Intelligent Earth Ltd Object position and orientation detection device
KR101549404B1 (en) * 2008-11-18 2015-09-02 삼성전자주식회사 Method and apparatus for controlling flash emmision and digital photographing apparatus using thereof

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100201823A1 (en) * 2009-02-10 2010-08-12 Microsoft Corporation Low-Light Imaging Augmented With Non-Intrusive Lighting
US20120162221A1 (en) * 2009-08-06 2012-06-28 Sony Corporation Method and apparatus for stereoscopic multi-users display

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9298012B2 (en) 2012-01-04 2016-03-29 Microsoft Technology Licensing, Llc Eyebox adjustment for interpupillary distance
US9684174B2 (en) 2012-02-15 2017-06-20 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9779643B2 (en) 2012-02-15 2017-10-03 Microsoft Technology Licensing, Llc Imaging structure emitter configurations
US9368546B2 (en) 2012-02-15 2016-06-14 Microsoft Technology Licensing, Llc Imaging structure with embedded light sources
US9726887B2 (en) 2012-02-15 2017-08-08 Microsoft Technology Licensing, Llc Imaging structure color conversion
US20130242056A1 (en) * 2012-03-14 2013-09-19 Rod G. Fleck Imaging structure emitter calibration
US9578318B2 (en) * 2012-03-14 2017-02-21 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US9807381B2 (en) 2012-03-14 2017-10-31 Microsoft Technology Licensing, Llc Imaging structure emitter calibration
US11068049B2 (en) 2012-03-23 2021-07-20 Microsoft Technology Licensing, Llc Light guide display and field of view
US10191515B2 (en) 2012-03-28 2019-01-29 Microsoft Technology Licensing, Llc Mobile device light guide display
US10388073B2 (en) 2012-03-28 2019-08-20 Microsoft Technology Licensing, Llc Augmented reality light guide display
US10478717B2 (en) 2012-04-05 2019-11-19 Microsoft Technology Licensing, Llc Augmented reality and physical games
US9717981B2 (en) 2012-04-05 2017-08-01 Microsoft Technology Licensing, Llc Augmented reality and physical games
US10502876B2 (en) 2012-05-22 2019-12-10 Microsoft Technology Licensing, Llc Waveguide optics focus elements
US9581820B2 (en) 2012-06-04 2017-02-28 Microsoft Technology Licensing, Llc Multiple waveguide imaging structure
US10192358B2 (en) 2012-12-20 2019-01-29 Microsoft Technology Licensing, Llc Auto-stereoscopic augmented reality display
US9977265B2 (en) * 2014-02-24 2018-05-22 Lunettes Inc. Glasses type information terminal, information processing device, computer program and recording medium
US20170068120A1 (en) * 2014-02-24 2017-03-09 Lunettes Inc. Glasses type information terminal, information processing device, computer program and recording medium
US9506803B2 (en) * 2014-09-17 2016-11-29 Delphi Technologies, Inc. Vehicle optical sensor system
US20160076934A1 (en) * 2014-09-17 2016-03-17 Delphi Technologies, Inc. Vehicle optical sensor system
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
WO2018132415A1 (en) * 2017-01-12 2018-07-19 Janssen Pharmaceutica Nv Trans-orbital infrared light therapy
US10960224B2 (en) 2017-01-12 2021-03-30 Janssen Pharmaceutica Nv Trans-orbital infrared light therapy
US20200110289A1 (en) * 2018-10-03 2020-04-09 Carlos de la Fuente Ambient light adjustable eyeglasses
US10874874B2 (en) 2019-04-15 2020-12-29 Janssen Pharmaceutica Nv Transorbital NIR light therapy device
US10926102B2 (en) 2019-04-15 2021-02-23 Janssen Pharmaceutica Nv Transorbital NIR LIGHT THERAPY DEVICES

Also Published As

Publication number Publication date
US20120086801A1 (en) 2012-04-12
US8922644B2 (en) 2014-12-30
WO2012047222A1 (en) 2012-04-12

Similar Documents

Publication Publication Date Title
US20130201285A1 (en) 3-d glasses with illuminated light guide
EP2609731B1 (en) Tracking head position and orientation
US8830329B2 (en) 3-D glasses with camera based head tracking
US9465226B2 (en) Automatic shutdown of 3D based on glasses orientation
CA2820950C (en) Optimized focal area for augmented reality displays
CN104956252B (en) Peripheral display for near-eye display device
US10503248B1 (en) Selective color sensing for motion tracking
US10528128B1 (en) Head-mounted display devices with transparent display panels for eye tracking
EP2486441B1 (en) 3-d glasses with camera based head tracking
US11507201B2 (en) Virtual reality
EP3878529A1 (en) Interactive entertainment system
US20180018943A1 (en) Dual display immersive screen technology
JP6945409B2 (en) Information processing methods, computers, and programs
EP3454098A1 (en) System with semi-transparent reflector for mixed/augmented reality
US20180253902A1 (en) Method executed on computer for providing object in virtual space, program for executing the method on the computer, and computer apparatus
US20180267601A1 (en) Light Projection for Guiding a User within a Physical User Area During Virtual Reality Operations
CN103565399A (en) Pupil detector
US10514544B1 (en) Tilted displays for a wide field of view
CN117999510A (en) Position tracking system and method for head-mounted display system
US10416445B1 (en) Lenses with consistent distortion profile
CN111247473B (en) Display apparatus and display method using device for providing visual cue
KR20230162090A (en) Eyewear Projector Brightness Control
Lancelle Visual Computing

Legal Events

Date Code Title Description
AS Assignment

Owner name: SONY COMPUTER ENTERTAINMENT INC., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MAO, XIAODONG;MARKS, RICHARD;MALLINSON, DOMINIC;AND OTHERS;SIGNING DATES FROM 20101011 TO 20101012;REEL/FRAME:025168/0418

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION

AS Assignment

Owner name: SONY INTERACTIVE ENTERTAINMENT INC., JAPAN

Free format text: CHANGE OF NAME;ASSIGNOR:SONY COMPUTER ENTERTAINMENT INC.;REEL/FRAME:039239/0343

Effective date: 20160401