EP2394195A2 - Method of stereoscopic 3d image capture and viewing - Google Patents

Method of stereoscopic 3d image capture and viewing

Info

Publication number
EP2394195A2
Authority
EP
European Patent Office
Prior art keywords
image
images
camera
user
data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP10739079A
Other languages
German (de)
French (fr)
Other versions
EP2394195A4 (en)
Inventor
James Mentz
Samuel Caldwell
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bit Cauldron Corp
Original Assignee
Bit Cauldron Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bit Cauldron Corp filed Critical Bit Cauldron Corp

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/20Image signal generators
    • H04N13/204Image signal generators using stereoscopic image cameras
    • H04N13/239Image signal generators using stereoscopic image cameras using two 2D image sensors having a relative position equal to or related to the interocular distance
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/341Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • G02B30/24Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N2213/00Details of stereoscopic systems
    • H04N2213/001Constructional or mechanical details

Definitions

  • the present invention relates to stereoscopic 3D image acquisition and methods and apparatus. More particularly, the present invention relates to stereoscopic 3D image capture and viewing devices incorporating radio frequency communications.
  • Another drawback is that such systems typically rely upon expensive silver or metalized reflective screens that maintain the appropriate polarization of light from the projector for the right and left eye images. Such screens are often too expensive for the average consumer. Yet another drawback is that because both left and right eye images are displayed to the user at the same time, despite the polarized glasses, right eye images are often visible to the left eye and left eye images are often visible to the right eye. This light pollution degrades the quality of the 3D images and is termed "ghosting" of 3D images. [0004] The inventors are aware of a number of techniques that may be used to reduce this ghosting effect.
  • Some techniques may include deliberate degradation of left eye images to account for right eye image ghosting and the deliberate degradation of right eye images to account for left eye image ghosting.
  • the inventors believe that such techniques are disadvantageous as they tend to reduce the contrast of objects in an image, and they may result in a visible halo around objects in the image.
  • the inventors have recognized that 3D versions of features often do not appear as aesthetically pleasing as 2D versions of such features.
  • Another approach to 3D visualization, illustrated in FIG. 1, has included the use of stereoscopic shutter glasses that are based upon physical shutters or, more commonly, liquid crystal display (LCD) technology.
  • left and right images are alternately displayed to the user, and the right and left LCD lenses alternate between a dark and a transparent state.
  • when the shutter glasses quickly alternate between transparency in the left and then the right eye in synchronicity with an image which presents alternating left and right points of view, the left and right eyes receive different 2D images and the observer experiences the perception of depth in a 3D image.
  • Fig. 1 illustrates a typical stereoscopic system.
  • such systems typically include a computer 1, an infrared transmitter 3, a display 12, and a pair of liquid crystal display glasses (LCD shutter glasses) 8.
  • computer 1 alternately provides left eye images and right eye images on signal line 2, in addition to a signal that distinguishes when the left eye image or right eye image is displayed.
  • In response to the signal, IR transmitter 3 outputs infrared data 6 that indicate when the right eye image is being output and when the left eye image is being output.
  • For example, one simple format for infrared data 6 is a simple square wave with a high signal indicating left and a low signal indicating right; another format includes an 8-bit word. Because of these different data formats, IR transmitters from one manufacturer often cannot be used with LCD glasses from another manufacturer.
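  • Purely as an illustration of the square-wave convention just described (high = left field, low = right field), the decode step can be sketched as below; the polarity and names are assumptions, not the actual encoding of any particular manufacturer.

```c
#include <stdbool.h>

/* Hypothetical decoder for the simple square-wave IR format described
 * above: a high level indicates the left-eye field, a low level the
 * right-eye field.  The 8-bit word format used by other vendors would
 * need its own, incompatible decoder, which is one reason transmitters
 * and glasses from different manufacturers often do not interoperate. */
typedef enum { EYE_LEFT, EYE_RIGHT } eye_t;

eye_t decode_square_wave(bool ir_level_high)
{
    return ir_level_high ? EYE_LEFT : EYE_RIGHT;  /* assumed polarity */
}
```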
  • infrared data 6 is received by LCD glasses 8, and in response, for example, the right LCD of the LCD glasses 8 becomes opaque and the left LCD becomes translucent (e.g. clear, neutral density), or the left LCD of the LCD glasses 8 becomes opaque and the right LCD becomes translucent.
  • when the right LCD becomes translucent, display 12 is displaying a right eye image, and when the left LCD becomes translucent, display 12 is displaying a left eye image.
  • One such limitation includes the difficulty in synchronizing the glasses to the images that are displayed. Synchronization data is typically based upon when the images are provided to the 3D display. Limitations to such approaches, as determined by the inventors, include that both latency and timing jitter are introduced as the image is processed and rendered by the 3D display device. As a result of such latency and jitter, the LCD lenses or shutters are often opened and closed at improper times, e.g. out of phase, with some of the image intended for the left eye being shown to the right eye and vice versa. This is perceivable by the user as ghosting effects. Additionally, as the inventors have determined, the phase difference is not constant and is subject to jitter, so the user may see the image brightness change or flicker undesirably.
  • One approach to reduce such latency or jitter effects has been to reduce the amount of time the left LCD shutter and the amount of time the right LCD shutter are translucent.
  • instead of the left shutter being open, for example, 50% of the time, the left shutter may be open 35% of the time, or the like. This reduction in open time should reduce the amount of ghosting.
  • the inventors recognize drawbacks to such an approach to reducing image ghosting.
  • One such drawback is the reduction in net amount of light transmitted to the user's eyes.
  • the user will perceive a darkening of the images for each eye. Accordingly, a 3D version of a feature will appear darker and duller compared to a 2D version of the feature when using IR-type shutter glasses.
  • IR LCD glasses based upon IR receivers often lose synchronization with the display as a result of stray reflections. For example, it has been observed by the inventors that IR LCD glasses may become confused as a result of sunlight reflecting from household objects; heat sources such as candles, open flames, and heat lamps; other IR remote controls (e.g. television remotes, game controllers); light sources (e.g. fluorescent lights); and the like. Additionally, it has been observed by the inventors that IR LCD glasses may also lose synchronization as a result of clothing, hair, portions of other users' bodies (e.g. the head), or the like, that temporarily obscure an IR receiver of the LCD glasses. The loss of synchronization may lead to the user seeing a series of flickering or rolling images and / or the left eye seeing the right eye image. The inventors believe these types of anomalies are highly disturbing to most users and should be inhibited or minimized.
  • manufacturers of such devices specifically instruct users to use IR LCD glasses in highly controlled environments. For example, they suggest that the 3D displays and glasses be used only in darkened rooms. The inventors believe such a solution limits the applicability and attractiveness of such 3D display devices to typical consumers. This is believed to be because most consumers do not have the luxury of a dedicated, light-controlled room for a home theater, and because most consumer entertainment rooms are multipurpose family rooms.
  • each 3D display system includes its own IR transmitter and 2D field timing and phase data. Then, when two such systems are in close proximity, a user's IR LCD glasses may receive IR transmissions from either of the 3D display systems. Because of this, although a user is viewing a first 3D display, the user's 3D glasses may be synchronizing to a different 3D display, causing the user to undesirably view flickering and rolling images.
  • the inventors of the present invention thus recognize that multiple 3D display systems cannot easily be used in applications such as for public gaming exhibitions, tournaments, or contests, trade shows, in stadiums, in restaurants or bars, or the like.
  • With respect to stereoscopic 3D image capture, the inventors have recognized that methods for capturing such images have been limited and have been beyond the reach of the average consumer.
  • the inventors of the present invention recognize that capturing of stereoscopic 3D images currently requires complex 3D imaging hardware and processing software. These systems are typically dedicated to acquiring or generating stereoscopic images. For example, the left and right 2D images that are used to form stereoscopic 3D images are generated entirely by computation (e.g. animation software) or by a pair of professional grade cameras which have been firmly affixed to each other with tightly manufactured proximity and spacing, not necessarily in the prior art.
  • the inventors believe that because current systems provide such narrow and specialized functionality, they are too "expensive" for typical consumers to purchase. The inventors believe that if the cost of 3D hardware and software capturing systems could be reduced, consumers would more readily embrace stereoscopic 3D imaging.
  • FIG. 10 illustrates a number of devices 601-603 including cameras 605, 607 and 609, that may or may not exist, that might use such software.
  • device 604 includes an off-center camera 605; device 606 includes centered camera 607, and device 603 includes camera 609.
  • portions 610 and 608 of device 603 may be reoriented or repositioned with respect to each other, as shown in dotted positions 611 and 612.
  • the present invention relates to stereoscopic 3D image capture and viewing methods and apparatus. More particularly, the present invention relates to stereoscopic 3D image viewing devices incorporating radio frequency communications.
  • a stereoscopic 3D image viewing device is based upon liquid crystal display (LCD) shutters that are synchronized to a source of 3D images.
  • the synchronization is based upon rf protocols such as Bluetooth, ZigBee radio (ZigBee Alliance), IEEE Standard 802.15.1, IEEE Standard 802.15.4, or any other type of rf communications protocol.
  • the stereoscopic 3D image viewing device may transmit data back to the source of 3D images, via the rf communications mechanism or protocol, to increase the level of synchronization between the two devices.
  • a system, method, and apparatus for perceiving stereoscopic 3D can be provided which improve the level of synchronization between the alternating images and the alternating action of the shutter glasses.
  • a system, apparatus, method, and computer-readable media are provided to enable stereoscopic viewing.
  • the physical method of connecting the display system to the stereoscopic glasses is the IEEE 802.15.4 wireless radio, ZigBee radio, or Bluetooth technology. This allows a user to move his or her head into positions that would normally lose reception of wireless transmissions (e.g. IR), thus simplifying the user experience of wearing stereoscopic glasses.
  • the wireless radio connection also has the advantage of replacing the infra-red light transmission method and its associated interference with remote controls and tendency to accept interference from natural and artificial light sources, thus enhancing the user experience.
  • a method for synchronization between the video transmitter and the shutter glasses. Synchronization is provided via a protocol that provides timing information such as a beacon offset or any series of packets that is used as the energy to excite a clock. A precision timing protocol may be utilized to provide synchronization between the transmitter and the receiver.
  • the shutter glasses and the transmission device may include executable computer programs resident in a memory that instructs a respective processor to perform specific functions or operations, such as to transmit data, to determine a latency, or the like.
  • a device for providing 3D synchronization data to a user 3D viewing device includes a receiving portion configured to receive 3D image data indicating timing of left and right images from a source of 3D data.
  • a system may include a radio frequency transmitter coupled to the receiving portion, wherein the radio frequency transmitter is configured to output 3D synchronization signals to the 3D viewing device in response to the 3D image data.
  • a method for transmitting stereoscopic display information includes receiving a plurality of 3D video synchronization signals from a source of 3D image data, and converting the plurality of 3D video synchronization signals into a plurality of wireless radio signals.
  • a method may include outputting the plurality of wireless radio signals to a pair of shutter glasses associated with a user that are adapted to receive the wireless radio signals, wherein the plurality of wireless radio signals are adapted to change the states for a pair of LCD shutters on the pair of shutter glasses, in response to the wireless radio signals.
  • a method for operating a pair of shutter glasses including a right LCD shutter and a left LCD shutter includes receiving synchronization data via radio frequency transmissions from a radio frequency transmitter, and determining shutter timings for the right LCD shutter and the left LCD shutter in response to the synchronization data.
  • a technique may include applying the shutter timings to the right LCD shutter and the left LCD shutter to enable the viewer to view right-eye images via the right LCD shutter and left-eye images via the left LCD shutter.
  • a method for transmitting stereoscopic display information includes converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device.
  • a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE Standard 802.15.1, Bluetooth, or components thereof.
  • a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE 802.15 (802.15.1-4) ZigBee radio, or components thereof.
  • a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE Standard 802.11, WiFi, or components thereof.
  • a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which contains a localized clock such that the device remains synchronous to the video display system even when the connection to the source transmitting the synchronization information is interrupted or is not present.
  • the synchronization information between the display system and the glasses or other device are determined by a precision timing protocol in which bidirectional communication of timing information occurs.
  • a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which receives synchronous information from the video display system, and a means and method of storing the delay and synchronization information in the transmitter or the video source generating computer, home theater system, or device.
  • the delay and synchronization information are stored and then transmitted to multiple devices to allow multiple users to simultaneously use the same system.
  • a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which receives synchronous information from the video display system, and a means of determining the delay and synchronization information through information contained in the display and transmitter from the display via the video signal cable.
  • a method for transmitting stereoscopic display information includes: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information.
  • a method for transmitting stereoscopic display information includes: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and visible light sources.
  • a method for transmitting stereoscopic display information includes: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and radio sources.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and radio sources.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both visible light and radio sources.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from infrared, visible light and radio sources.
  • the shutter glasses or other receiving device can incorporate a computer program which allows the device to automatically determine which source or sources of synchronization information are available and automatically use the best source or sources.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both visible light and another source, and the visible light information is also used to deduce that the correct image is going to the correct eye and that the information is not reversed such that the left image is going to the right eye and vice versa.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information where the system is capable of dynamically changing whether all viewers are sharing the same set of images or different sets of images.
  • a method for transmitting stereoscopic display information including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information where the system is capable of displaying a sequence such that the wearers of shutter glasses or other consumer electronics devices see stereoscopic content while viewers without glasses or without other consumer electronics devices see only the left or right image, a non-stereoscopic version of the content, a blank or solid colored screen, or an arbitrary piece of content such as an advertisement.
  • anti left, anti right, or combined anti left and anti right images are incorporated into the video frame sequence.
  • a method for transmitting stereoscopic display information including: a pair of shutter glasses or other consumer electronics device and a transmitter of synchronization information which has been incorporated into a mobile device, either by using an unused wireless or infrared technology on the phone or by adding additional information to a protocol the phone is already using.
  • the wireless technology is the IEEE Standard 802.15.1, Bluetooth, or components thereof.
  • the wireless technology is the ZigBee radio (compliant with IEEE Standard 802.15.4), or components thereof.
  • a method for transmitting stereoscopic display information including: a pair of shutter glasses or other consumer electronics device and a transmitter of synchronization information which has been incorporated into a mobile device which has been augmented by an external cradle, dongle or device which contains additional hardware and means of providing synchronization information or image viewing.
  • the cradle, dongle or device contains an image projector.
  • a method of using stereoscopic glasses as ordinary sunglasses is disclosed.
  • the stereoscopic glasses incorporate a visible light sensor and make automatic decisions about the appropriate level of perceived darkening.
  • the level of perceived darkening is based on computer algorithms, information about the user and the environment stored on a mobile device, information retrieved from a computer network via the mobile device, and other deductions.
  • the stereoscopic glasses and ordinary sunglasses are combined with a wireless headset, Bluetooth headset, or stereo Bluetooth headset.
  • a method of combining stereoscopic glasses with a wireless headset, Bluetooth headset, or stereo Bluetooth headset is disclosed.
  • a method of combining ordinary or automatically darkening sunglasses with a wireless headset, Bluetooth headset, or stereo Bluetooth headset is disclosed.
  • Embodiments of the present invention include an imaging device including one or more image sensors (e.g. cameras) and a communications channel.
  • the imaging device may be physically coupled to a general purpose consumer device such as a personal media player (e.g. iPod), a communications device (e.g. iPhone, Android-based phone), a mobile internet device, a processing device (e.g. netbook, notebook, desktop computers), or the like.
  • the imaging device may utilize the communications channel (e.g. Bluetooth, Wi-Fi, ZigBee radio, IR, USB, IEEE 802.15.1, IEEE 802.15.4) to provide image data from the imaging device to the consumer device.
  • the imaging device may be used independently of the consumer device to acquire stereoscopic images, and such images may be provided to the consumer device via the communications channel.
  • the consumer device may process and / or retransmit the stereoscopic images to a remote server.
  • stereoscopic images may be viewed on the consumer device and / or uploaded to the web (e.g. Facebook, MySpace, TwitPic), sent via e-mail, IM, or the like.
  • the imaging device may capture one of the left or right pair of 2D images, and an image sensor on the general purpose consumer device may be used to capture the other 2D image.
  • the imaging device may include two or more image sensors (e.g. embedded therein) and be used to capture the left and right stereoscopic pair of 2D images.
  • the pair of images is typically captured simultaneously or within a short amount of time (e.g. less than 1 second apart) to facilitate proper 3D image capture. This time period may increase when photographing still life, landscapes, or the like.
  • Embodiments of methods of stereoscopic 3D image capture could incorporate an existing phone or device, or a new piece of hardware such as a cradle or dongle for an existing phone or device.
  • Other embodiments may include a piece of software or computer readable method of using an existing or new device to capture stereoscopic 3D images.
  • Still other embodiments may include a system and method that combines these aspects in the capture of stereoscopic 3D images.
  • a consumer device for capturing stereoscopic images includes a plurality of image acquisition devices, wherein a first image acquisition device and a second image acquisition device are both approximately directed in a common direction, wherein the first image acquisition device and the second image acquisition device are displaced by a displacement, wherein the first image acquisition device is configured to capture a first image, and wherein the second image acquisition device is configured to capture a second image.
  • a system may include a user input device configured to receive an input from a user, a memory configured to store the first image and the second image, and a wired or wireless communications portion configured to transmit data to a remote device.
  • Various devices may include a processor coupled to the first image acquisition device, to the second image acquisition device, to the user input device, and to the communications portion, wherein the processor is configured to approximately contemporaneously direct acquisition of the first image by the first image acquisition device and of the second image by the second image acquisition device in response to the input from the user, wherein the processor is configured to direct storage of the first image and the second image in the memory, and wherein the processor is configured to direct the communications portion to transmit at least a portion of the first image and at least a portion of the second image to a remote device.
  • a method for capturing stereoscopic images, photos or videos on a mobile computing device wherein the mobile computing device includes at least a first camera and a second camera, and wherein a distance and an orientation between the first and the second cameras are determinable.
  • Techniques may include receiving an initiation signal from a user, while the user points the first and the second cameras in a direction of interest, and substantially simultaneously acquiring a first image with the first camera, a second image with the second camera, and camera parameters, in response to the initiation signal.
  • One process may include storing the first image, the second image and the camera parameters in a memory, and uploading at least a portion of the first image, at least a portion of the second image, and the camera parameters to a remote server.
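  • A minimal sketch of this capture flow is shown below. All of the types and functions (capture, store, upload, the assumed 65 mm baseline) are hypothetical stand-ins for whatever camera, storage and networking APIs a given mobile device actually provides; only the control flow mirrors the method described above.

```c
#include <stdio.h>

/* Hypothetical stand-ins for the device's real camera, storage and
 * networking APIs; only the control flow follows the method above. */
typedef struct { int camera_id; } image_t;
typedef struct { double baseline_mm; double toe_in_deg; } camera_params_t;

static image_t capture(int camera_id)
{
    image_t im = { camera_id };
    printf("captured image from camera %d\n", camera_id);
    return im;
}

static void store_pair(image_t l, image_t r, camera_params_t p)
{
    printf("stored pair (cameras %d,%d), baseline %.1f mm\n",
           l.camera_id, r.camera_id, p.baseline_mm);
}

static void upload_pair(image_t l, image_t r, camera_params_t p)
{
    (void)l; (void)r; (void)p;
    printf("uploaded pair and camera parameters to remote server\n");
}

/* Called on the user's initiation signal (e.g. a shutter button). */
static void on_shutter(void)
{
    camera_params_t params = { 65.0, 0.0 };   /* assumed geometry        */
    image_t left  = capture(0);               /* acquired back-to-back   */
    image_t right = capture(1);               /* to approximate          */
                                              /* simultaneity            */
    store_pair(left, right, params);
    upload_pair(left, right, params);
}

int main(void) { on_shutter(); return 0; }
```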
  • FIG. 1 is a block diagram illustrating aspects of the prior art
  • FIGS. 2A-D include block diagrams of various embodiments of the present invention illustrating elements of a system in which stereoscopic glasses are synchronized with the display device by incorporation of a wireless radio into the system;
  • FIG. 3 illustrates a block diagram of a process according to various embodiments of the present invention.
  • FIG. 4 is a timing diagram of various embodiments of the present invention illustrating a method of sending image information to a display in which the frames which compose the image are sent sequentially;
  • FIG. 5 illustrates various embodiments incorporated into a mobile phone's hardware, firmware, and software and into a pair of stereoscopic shutter glasses;
  • FIG. 6 illustrates various embodiments in which some methods are incorporated into a mobile phone, some methods are incorporated into a pair of stereoscopic shutter glasses, and other methods are incorporated into a cradle or other device that attaches to the mobile phone;
  • FIG. 7 illustrates various embodiments incorporated into a pair of stereoscopic shutter glasses combined with a mobile phone headset;
  • FIG. 8 illustrates various embodiments of the present invention
  • Fig. 9 illustrates various embodiments of the present invention
  • FIG. 10 is a diagram illustrating aspects of the prior art
  • FIG. 11 is a diagram illustrating embodiments of the present invention including methods of incorporating stereoscopic capture capabilities into a mobile device by embedded multiple cameras;
  • FIG. 12 is a diagram illustrating embodiments of the present invention including methods of incorporating stereoscopic capture capabilities into a mobile phone or other device by attaching an external dongle or fitting the device to a cradle;
  • FIG. 13 is a diagram illustrating embodiments of the present invention where multiple cameras have been incorporated into a mobile device in which the relative orientation of the cameras can be changed by way of actuating a hinge which is part of the mobile device;
  • FIG. 14 is a diagram illustrating embodiments of the present invention where the field of view of two cameras changes as a hinge is manipulated;
  • FIG. 15 is a diagram illustrating embodiments of the present invention where the fields of view of two obliquely oriented cameras are rotated and / or cropped to specific regions of interest.
  • FIGS. 2A-D illustrate various embodiments of the present invention.
  • FIGS. 2A-D illustrate various arrangements of embodiments of the present invention.
  • FIG. 2A includes a 3D source 34 of image data, a transmission device 37, a display 43, and shutter glasses 42.
  • 3D source 34 may be a computer, a Blu-ray or DVD player, a gaming console, a portable media player, set-top-box, home theater system, preamplifier, a graphics card of a computer, a cable box, or the like
  • 3D display 43 may be any 3D display device such as an LCD/Plasma/OLED display, a DLP display, a projection display, or the like.
  • transmission device 37 and shutter glasses 42 may be embodied by a product developed by the assignee of the current patent application, Bit Cauldron Corporation of Gainesville, FL.
  • shutter glasses 42 may be implemented with mechanical shutters or LCD shutters. For example, LCD shutters based upon pi-cell technology may be used.
  • 3D source 34 sends 3D display signals to display 43 through a video cable 35, typically through a standards-based interface such as VGA, DVI, HDMI, Display Port (DP), or the like.
  • Such 3D display signals are often configured as one or more interleaved full right-eye images then full left-eye images (e.g. field sequential); double wide (e.g. side by side) or double height (e.g. stacked) images including both left and right images; images interleaved with right-eye images and left-eye images on a pixel by pixel basis; or the like.
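  • The framing formats listed above can be summarized, purely for illustration, as an enumeration; the names below are invented labels rather than identifiers from any particular video interface specification.

```c
/* Illustrative labels only; real interfaces (e.g. HDMI) define their own
 * identifiers for these packing modes. */
typedef enum {
    FRAMING_FIELD_SEQUENTIAL,   /* full left frame, then full right frame */
    FRAMING_SIDE_BY_SIDE,       /* double-wide frame holding both views   */
    FRAMING_TOP_BOTTOM,         /* double-height (stacked) frame          */
    FRAMING_PIXEL_INTERLEAVED   /* left / right interleaved per pixel     */
} framing_t;
```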
  • a transmission device 37, e.g. a radio transmitter, may be inserted between the 3D source 34 or other video source and 3D display 43.
  • transmission device 37 determines 3D timing information by decoding the 3D display signals as they pass through to display 43 on signal line or cable 44.
  • transmission device 37 includes a transmitter based upon radio frequency (rf) signals.
  • the rf signals may use or may be combined with any conventional transmission protocol such as IEEE Standard 802.15.1 (e.g. Bluetooth), Wi-Fi, IEEE Standard 802.15.4 (e.g. ZigBee Alliance radio), or the like.
  • transmission device 37 may be a stand-alone device, e.g. a dongle, a USB "key," or the like, and transmission device 37 may be powered by power sources 36 and 38, self-powered, powered from the 3D data source, USB powered, or the like. In other embodiments, transmission device 37 may be incorporated into another device, such as 3D source 34, 3D display 43, a pre-amplifier, or the like.
  • FIG. 2B illustrates additional embodiments of the present invention.
  • Fig. 2B includes a source of 3D images 100, a transmission device 110, and a 3D display 120.
  • 3D image source 100 provides 3D images (e.g. double- wide or double- height images including both right and left images) to 3D display 120 via a signal line 130 such as a VGA, DVI, Display Port (DP), cable, or the like.
  • 3D image source 100 provides a synchronization signal along signal line 140 to transmission device 110.
  • 3D image source 100 includes an industry standard interface such as a VESA miniDIN-3 connector, VESA 1997.11, USB connector, or the like, to which transmission device 110 may be coupled.
  • FIG. 2C illustrates additional embodiments of the present invention.
  • Fig. 2C includes a source of 3D images 160, a transmission device 170, and a 3D display 180.
  • 3D image source 160 provides 3D images (e.g. double-wide or double- height images including both right and left images) to 3D display 180 via a signal line 190 such as a VGA, DVI, HDMI cable, Display Port (DP), or the like.
  • 3D display 180 provides a synchronization signal along signal line 200 to transmission device 170.
  • 3D display 180 includes an industry standard interface such as a VESA miniDIN-3 connector, USB connector, or the like, to which transmission device 170 may be coupled.
  • FIG. 2D illustrates other additional embodiments of the present invention.
  • Fig. 2D includes a source of 3D images 220, a transmission device 230, and a 3D display 240.
  • 3D image source 220 provides 3D images (e.g. double-wide or double-height images including both right and left images) to 3D display 240 via a signal line 250 such as a VGA, DVI, HDMI cable, Display Port (DP), or the like.
  • transmission device 230 may be disposed within 3D display 240.
  • transmission device 230 may be installed within the manufacturing facility of 3D display 240, or the like.
  • 3D display 240 may also power transmission device 230.
  • 3D display 240 provides a (derived) synchronization signal along signal line 260 to transmission device 230.
  • shutter glasses 42 include a radio receiver 41 that receives the synchronization signals 40.
  • shutter glasses 42 alternately change the properties of one lens from translucent to opaque (e.g. dark) to translucent, and of the other lens from opaque to translucent (e.g. clear) to opaque.
  • a user / viewer views 3D display images 45 from display 43 at the proper timing. More particularly, the user's right eye is exposed to a right-eye image from 3D display images 45, then the user's left eye is exposed to a left-eye image from 3D display images 45, and so on.
  • transmission device 37 based upon a radio frequency transmitter has several advantages over an infrared transmitter.
  • One advantage recognized is that radio signals can be received in many situations where an infrared signal would be blocked. For example this allows the user of a pair of 3D shutter glasses or the like, to move their head much farther away from the 3D display or transmission device than if IR were used, and allows the user to move throughout a room with a larger range of motion while maintaining synchronization with the 3D display.
  • rf transmitters allow other people or objects to pass in front of the user / viewer without interrupting the signal.
  • shutter glasses 42 may include their own localized clock. Benefits to such a configuration include that it allows shutter glasses 42 to remain approximately synchronized to display 43 even when the connection to transmission device 37 is interrupted and / or synchronization signals 40 are not received.
  • a precision timing protocol can be used so that the clock that is local to shutter glasses 42 is synchronized with a clock within transmission device 37 and / or the 3D display signals.
  • a precision timing protocol may include the transmission of data packets with a time stamp time associated with the 3D display signals to shutter glasses 42.
  • the protocol may include transmission of a data packet with a time stamp associated with shutter glasses 42 to transmission device 37.
  • shutter glasses 42 receive the time stamp from the 3D data source, compare the received time stamp to their local clock, and return a data packet with their local time stamp. Using this information, transmission device 37 can determine a round-trip time for data between transmission device 37 and shutter glasses 42.
  • the round-trip time offset is evenly divided between transmission device 37 and shutter glasses 42.
  • one or both devices are capable of determining a difference in speed or lag between the two transmissions, then a more precise determination of the relative values of both clocks (offsets) can be determined. As a result, in various embodiments, more precise synchronization between the two clocks can be established.
  • the difference in rate (e.g. frequency) between the two clocks (transmission device 37 or 3D source 34, and shutter glasses 42) can be more precisely determined. In some embodiments, if there is a low degree of consistency in the latencies, the period of time between latency determinations may be made small, e.g. once a minute; and if there is a higher degree of consistency in the latencies, the period may be increased, e.g. once every ten minutes.
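  • The round-trip exchange described above can be written out as a short worked example. The symmetric-path assumption (the round trip "evenly divided" between the two directions) and the variable names are assumptions for illustration; the arithmetic is the familiar NTP / IEEE 1588 style offset estimate.

```c
#include <stdio.h>
#include <stdint.h>

/* Sketch of the round-trip / offset calculation described above.
 * t1..t4 are microsecond timestamps: t1 = transmitter sends its time
 * stamp, t2 = glasses receive it (glasses clock), t3 = glasses send
 * their reply (glasses clock), t4 = transmitter receives the reply.
 * Assuming the two directions take equal time (the "evenly divided"
 * round trip in the text), the clock offset falls out directly. */
typedef struct {
    int64_t round_trip_us;  /* total time on the air, both directions */
    int64_t offset_us;      /* glasses clock minus transmitter clock  */
} sync_estimate_t;

static sync_estimate_t estimate_sync(int64_t t1, int64_t t2,
                                     int64_t t3, int64_t t4)
{
    sync_estimate_t e;
    e.round_trip_us = (t4 - t1) - (t3 - t2);       /* time in flight */
    e.offset_us     = ((t2 - t1) + (t3 - t4)) / 2; /* symmetric path */
    return e;
}

int main(void)
{
    /* Example numbers, purely illustrative. */
    sync_estimate_t e = estimate_sync(1000, 1600, 1650, 2050);
    printf("round trip %lld us, offset %lld us\n",
           (long long)e.round_trip_us, (long long)e.offset_us);
    return 0;
}
```

With the illustrative numbers above, the estimate is a 1000 microsecond round trip and a glasses clock running 100 microseconds ahead of the transmitter clock.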
  • Embodiments of the present invention enable the use of multiple pairs of shutter glasses 42. In such embodiments, a single pair of shutter glasses 42 may be used to determine delay and jitter as discussed above. Next, a simpler protocol, such as a unidirectional or broadcast protocol, may be used by transmission device 37 to communicate this synchronization information to the remaining pairs of shutter glasses. In various embodiments, the delay and jitter information can be stored in transmission device 37, in 3D source 34, or in another consumer electronics device generating the 3D data, either in a volatile or non-volatile manner.
  • this data may be determined using bidirectional communications on cable 44, such as the DisplayPort protocol, or the like, as illustrated in FIG. 2C.
  • Communications protocols such as the display data channel (DDC and DDC2) protocols, the PanelLink serial protocol, or similar protocols allow the display to communicate information back to the computer, home theater system, video source, or the like. In various embodiments, this serial protocol can be enhanced to provide the appropriate latency and synchronization characteristics of 3D display 43 back to 3D source 34 and / or transmission device 37.
  • these protocols can be used to determine the manufacturer, vendor, or other identifying information for 3D display 43, and a table of pre-determined synchronization information can be retrieved, either locally, across a local area network, across a network, or the like. This information may include appropriate delay and synchronization information for respective 3D displays.
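  • One way to picture the look-up just described is a small table keyed by the display's identifying information (as might be read back over DDC); the table contents and structure below are invented for illustration only.

```c
#include <stdio.h>
#include <string.h>

/* Hypothetical table mapping a display's identifying string (as might be
 * read back over DDC) to a predetermined synchronization delay.  The
 * entries are made up purely for illustration. */
typedef struct {
    const char *model;     /* identifying information from the display */
    int         delay_us;  /* predetermined display latency            */
} display_sync_entry_t;

static const display_sync_entry_t k_table[] = {
    { "EXAMPLE-120HZ-A", 4200 },
    { "EXAMPLE-120HZ-B", 6800 },
};

static int lookup_delay_us(const char *model, int fallback_us)
{
    for (size_t i = 0; i < sizeof k_table / sizeof k_table[0]; ++i)
        if (strcmp(k_table[i].model, model) == 0)
            return k_table[i].delay_us;
    return fallback_us;  /* unknown display: fall back to a default value */
}

int main(void)
{
    printf("delay = %d us\n", lookup_delay_us("EXAMPLE-120HZ-A", 5000));
    return 0;
}
```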
  • FIG. 3 illustrates a block diagram of a process according to various embodiments of the present invention. More specifically, FIG. 3 illustrates a process for synchronizing shutter glasses to a source of 3D images.
  • a 3D data source provides 3D images, step L.
  • the 3D images may be provided in any number of specific formats, such as right and left images: sequentially transmitted; packed vertically or horizontally into a single image and transmitted; combined on a pixel by pixel basis into a single image and transmitted; or the like. In other embodiments, as illustrated in Fig. 2B, the 3D data source may provide specific timing data.
  • synchronization data, such as an identifier of a timing clock resident on the 3D data source, is determined, step 310. In various embodiments, this may include a packet of data including a source time stamp, or the like.
  • the synchronization data may then be transmitted through radio frequency transmissions to a first pair of shutter glasses, step 320.
  • the shutter glasses receive the source time stamp and synchronize the operation of the right / left shutters to the synchronization data, step 330.
  • the synchronization data can then be maintained within the shutter glasses by an internal clock within such glasses, step 340.
  • the internal clock can be resynchronized.
  • Such embodiments are believed to be advantageous as the glasses need not wait for synchronization data from the 3D data source to be able to switch. Accordingly, synchronization data from the transmission device may be dropped or lost while the shutter glasses continue to operate properly. When synchronization data is reestablished, the synchronization described above may be performed.
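  • A sketch of this free-running behaviour follows; the 8333 microsecond period corresponds to one field at a 120 Hz refresh rate, and the function names and the convention that a packet marks a left field are assumptions for illustration, not the actual firmware.

```c
#include <stdbool.h>
#include <stdint.h>
#include <stdio.h>

/* Sketch: the glasses keep toggling from a local clock and only re-anchor
 * their phase when a fresh synchronization packet actually arrives. */
#define FIELD_PERIOD_US 8333   /* one field at a 120 Hz display refresh */

static int64_t next_toggle_us;
static bool    left_eye_open;

static void set_shutters(bool left_open)   /* stand-in for the LCD driver */
{
    printf("left %s / right %s\n", left_open ? "clear" : "dark",
                                   left_open ? "dark" : "clear");
}

void shutter_task(int64_t now_us, bool got_sync_packet, int64_t sync_phase_us)
{
    if (got_sync_packet) {
        /* Re-anchor the schedule to the transmitter's phase; assume the
         * packet marks the start of a left field. */
        left_eye_open  = true;
        next_toggle_us = sync_phase_us + FIELD_PERIOD_US;
        set_shutters(left_eye_open);
    } else if (now_us >= next_toggle_us) {
        /* No packet: continue on the internal clock. */
        left_eye_open   = !left_eye_open;
        next_toggle_us += FIELD_PERIOD_US;
        set_shutters(left_eye_open);
    }
}
```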
  • rf communications using the ZigBee radio occur at 2.4 GHz, the same band as most Wi-Fi transmissions.
  • embodiments of the present shutter glasses are designed to inhibit their own communications and defer to such Wi-Fi signals.
  • the shutter glasses will continue to operate autonomously until the interference stops and new synchronization data is received from the transmission device.
  • the shutter glasses may transmit data back to the rf transmission device. More specifically, the shutter glasses may transmit the received source time stamp and / or the glasses time stamp back to the transmission device via the same rf communications channel, or the like, step 350. [0095] In FIG. 3, in response to the received source time stamp and / or the glasses time stamp, and the time at which these data are received, the transmission device may determine adjustments to subsequent synchronization data that will be sent to the shutter glasses, step 360. As an example, the transmission device may determine that it should output synchronization data to the shutter glasses even before the synchronization data is determined or received from the 3D data source. As a numeric example, if it is determined that the shutter glasses lag the 3D data source by 100 microseconds, the shutter glasses may trigger their shutters 100 microseconds before the expected arrival of a synchronization pulse.
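  • In code form, the numeric example above amounts to advancing the transmit (or trigger) time by the measured lag; a minimal, illustrative helper is shown below.

```c
#include <stdint.h>

/* If the glasses are measured to lag the 3D data source by lag_us
 * (100 microseconds in the example above), the synchronization output
 * can simply be scheduled that much earlier.  Illustrative only. */
int64_t adjusted_send_time_us(int64_t nominal_send_us, int64_t lag_us)
{
    return nominal_send_us - lag_us;
}
```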
  • this adjustment to synchronization data may be used to drive 3D glasses of other viewers of the 3D image.
  • 3D glasses of other viewers in the room may also have synchronization data adjusted using the process described above.
  • the transmission device may output the synchronization data at different times for different 3D glasses.
  • the shutter glasses may verify that they are in sync. If not, the shutter glasses may adjust the frequency of their own internal clocks until a higher degree of synchronization is maintained.
  • the process may be repeated.
  • the synchronization process may be performed periodically, with the period dependent upon how well the 3D data source and the shutter clock remain in synchronization - if highly synchronized, the synchronization process may be performed at longer intervals (e.g. every 2 minutes) than if these devices continually have synchronization problems (e.g. every 10 seconds).
  • Various embodiments of the present invention may include shutter glasses or other devices that includes multiple physical methods for receiving synchronization information.
  • some embodiments may contain both an infrared and radio receiver; an infrared and visible light receiver; a radio or visible light receiver; a combination of infrared, visible light and radio receivers; or the like.
  • the shutter glasses or other receiving device may include an executable computer program that instructs a processor to automatically determine which communications channel or channels are available, and automatically use the communications channel having the strongest signal, lowest number of dropped data packets, or the like.
  • the information transmitted via visible light and the synchronization information transmitted via another transmission technology may be combined within shutter glasses 42 to deduce unknown elements of the delay in 3D display 43 and other synchronization information.
  • the data from the different communication channels are compared to more precisely synchronize 3D display 43 and shutter glasses 42.
  • the two communications channels can be used to verify that a left image displayed on 3D display 43 is going to the left eye and the right image displayed on 3D display 43 is going to the right eye. In such an example, this would prevent the error of a reversal of synchronization information somewhere in the system that results in sending the left image to the right eye and vice versa.
  • shutter glasses 42 may be used to provide a variety of new functions.
  • FIG. 4 illustrates typical video output timing where frame one 26, frame two 28, frame three 30 and frame four 32 are output sequentially.
  • left images (frames) and right images (frames) are alternately output. For example, frame one 26 is left, frame two 28 is right, frame three 30 is left, and frame four 32 is right, creating the sequence L, R, L, R of images for the user.
  • Various embodiments of the present invention may be applied to 3D displays having display rates on the order of 120Hz and higher.
  • when the refresh rate is 120 Hz, right and left images will each be displayed and refreshed at 60 Hz. Accordingly, the viewer should not be able to detect significant flickering; however, the viewer may detect a darkening of the images.
  • the higher refresh rate may enable new features, as described below.
  • more than one left image and right image may be output.
  • multiple viewers may view a 3D display, and different viewers may see different 3D images.
  • a two-viewer sequence of output images may be user 1 left, user 1 right, user 2 left, user 2 right, etc. This could be represented as: L1, R1, L2, R2.
  • shutter glasses of a first viewer will allow the first viewer to see images L1 and R1, and shutter glasses of a second viewer will allow the second viewer to see images L2 and R2.
  • other sequences are contemplated, such as L1, L2, R1, R2, and the like.
  • for a 3D display having a 240 Hz refresh rate, a viewer will see the respective right and left images at a refresh rate of 60 Hz. As noted above, this frequency should be above the typical sensitivity of the eye; however, viewers may detect a darker image. Such artifacts may be mitigated by increasing the brightness of the images, or the like.
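  • The time-slot arithmetic above can be checked with a few lines of code; the slot sequence L1, R1, L2, R2 is taken from the text, and with a 240 Hz display each of the four slots repeats at 240 / 4 = 60 Hz.

```c
#include <stdio.h>

/* Two-viewer slot sequence from the text: L1, R1, L2, R2. */
static const char *k_slots[] = { "L1", "R1", "L2", "R2" };
#define NUM_SLOTS (sizeof k_slots / sizeof k_slots[0])

int main(void)
{
    const double display_hz = 240.0;
    printf("per-eye refresh: %.0f Hz\n", display_hz / NUM_SLOTS);

    /* Two full cycles of the displayed sequence. */
    for (int frame = 0; frame < 8; ++frame)
        printf("frame %d -> %s\n", frame, k_slots[frame % NUM_SLOTS]);
    return 0;
}
```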
  • Other embodiments may be extended to additional viewers (e.g. three or more). Applications of such embodiments may include computer or console gaming, or the like. As an example, two or more viewers may initially see the same 3D image, and subsequently one or more viewers "break off" to view a different 3D image. For example, three people could be playing a multiplayer game in which all three are traveling together and see the same 3D images. Next, one player breaks away from the other players. Using the additional communications protocols disclosed in various embodiments of the present invention, that player's glasses can be reprogrammed to allow the third person to see a different 3D image. Subsequently, the third person may return to the group, and then see the same 3D image.
  • a sequence of images output by the 3D display could begin with L0-R0-L0-R0, where 0 indicates everyone in the party.
  • the 3D display could switch and output images in a sequence such as L1&2, R1&2, L3, R3; L1&2, L3, R1&2, R3; or the like.
  • the sequence may revert to L0, R0, L0, R0. In various embodiments, switching back and forth may occur with little, if any, visible interruption in the 3D images viewed by the viewer.
  • the inventors recognize that the brightness of each frame may have to be adjusted to correct for the changes in overall viewing time.
  • such sequences of images enable still other types of functionality.
  • separate anti-left, anti-right images or both may also be sent.
  • theater-goers can decide whether they care to watch the same movie or feature with or without 3D glasses; game players can play in 3D while viewers watch the same display in 2D.
  • users not utilizing embodiments of the 3D glasses may view other arbitrary images.
  • the viewer with 3D glasses may see the left image in the left eye and the right image in the right eye, and may not see the Arbitrary image.
  • FIG. 5 illustrates additional embodiments of the present invention. More specifically, FIG. 5 illustrates a general purpose consumer device (e.g. mobile phone, personal media player, laptop, or the like) capable of 3D image output. In such embodiments, the synchronization information to the shutter glasses may be provided by the consumer device including embodiments of the rf transmitter described above, or by unused or available transmitters in the consumer device. Various examples may use infrared, WiFi, Bluetooth, or the like, to provide synchronization signals to shutter glasses according to embodiments of the present invention.
  • Fig. 6 illustrates additional embodiments of the present invention wherein existing consumer devices (e.g. mobile phone) may be augmented to better support stereoscopic 3D viewing.
  • a cradle or dongle which attaches to the mobile device or holds the mobile device may be used. In such examples, the cradle or dongle may incorporate a projection system such that the image may be projected at a larger size than the screen on the mobile device.
  • the cradle or dongle or consumer device may also provide the synchronization signals to the shutter glasses.
  • the cradle or dongle may include a ZigBee radio-type transmitter (IEEE 802.15.4) that transmits the synchronization data to the shutter glasses, or the like.
  • stereoscopic shutter glasses that are to be used with the consumer device described above, can be used for other purposes.
  • if the glasses incorporate a visible light sensor, they can be worn as ordinary sunglasses but make improved automatic decisions about the appropriate level of perceived darkening.
  • This information can be based on computer algorithms, information about the user and the environment that is stored on a mobile device; information retrieved from a computer network via the mobile device, and the like.
  • Fig. 7 illustrates yet another embodiment of the present invention.
  • a user of the consumer device may desire to perform multiple functions at the same time, such as: talk on a Bluetooth headset, view stereoscopic 3D content, and wear sunglasses.
  • Embodiments illustrated in Fig. 7 may include a pair of shutter glasses 57 combined with a pair of sunglasses and a Bluetooth or stereo mobile Bluetooth headset with a left earpiece 58 and a right earpiece 55, or the like.
  • Fig. 8 illustrates various embodiments of the present invention. In particular, Fig. 8 illustrates a block diagram of various embodiments of a dongle 400 providing rf transmissions, as described above.
  • a physical interface 410 is illustrated.
  • physical interface 410 may be a DVI port, HDMI port, Display Port (DP), USB, VESA 1997.11, or the like, for coupling to a source of 3D data (e.g. computer, DVD / BluRay player, HD display, monitor, etc.).
  • the 3D data may include 3D image data
  • the 3D data may include 3D timing data.
  • an interface chip or block 420 may provide the electronic interface to physical interface 410.
  • a processing device such as a CPLD (complex programmable logic device) 430 may be used to decode 3D synchronization data from 3D image data or 3D timing data.
  • 3D synchronization data 440 is then provided to an rf interface device 450 that references a clock 440.
  • rf interface device 450 is a TI CC2530 System on a Chip that includes an 8051 MCU (processor), RAM, flash memory, and an IEEE 802.15.4 ZigBee RF transceiver.
  • the flash memory is configured to store executable computer code or instructions that direct the processor to perform various functions, as described herein.
  • the flash memory includes computer code that directs the processor to transmit the 3D synchronization data to the 3D glasses, to receive timing data back from the 3D glasses, to determine a round-trip communication latency, to adjust 3D synchronization data in response to the round-trip communication latency, and the like, as described above.
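As a rough illustration of the round-trip compensation just described, the following Python sketch averages a few ping round trips and advances the advertised shutter timing by the estimated one-way latency. The `radio` object, its `send` / `receive` methods, and the packet fields are hypothetical placeholders for whatever the transmitter firmware actually exposes; this is not the CC2530 API.

```python
import time

def estimate_one_way_latency(radio, n_samples=8):
    """Average several ping round trips and halve them (assumes a symmetric link)."""
    samples = []
    for _ in range(n_samples):
        t0 = time.monotonic()
        radio.send({"type": "ping"})           # hypothetical transceiver call
        radio.receive(expected_type="pong")    # blocks until the glasses reply
        samples.append((time.monotonic() - t0) / 2.0)
    return sum(samples) / len(samples)

def adjusted_sync_packet(frame_period_s, next_left_open_s, one_way_latency_s):
    """Advance the advertised left-shutter open time so it is correct on arrival."""
    return {
        "type": "sync",
        "frame_period_us": int(frame_period_s * 1e6),
        "left_open_in_us": int((next_left_open_s - one_way_latency_s) * 1e6),
    }
```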
  • dongle 400 may include an output port 460 driven by an output interface 470.
  • the output port may be a DVI port, HDMI port, Display Port (DP), or the like, providing 3D image data to a 3D display (e.g. a display, projector, etc.).
  • Fig. 9 illustrates various embodiments of the present invention.
  • Fig. 9 illustrates a block diagram of a pair of shutter glasses 500 according to various embodiments of the present invention.
  • Shutter glasses 500 are illustrated as including an rf interface device 510 that references a clock 540 and a pair of electronically controlled LCD shutter elements 520 and 530.
  • 3D synchronization data 550 is received in rf interface device 510.
  • rf interface device 510 is also a TI CC2530 System on a Chip that includes an 8051 MCU (processor), RAM, flash memory, and an IEEE 802.15.4 ZigBee RF transceiver.
  • the flash memory is configured to store executable computer code or instructions that direct the processor to perform various functions, as described herein.
  • the flash memory includes computer code that directs the processor to receive the 3D synchronization data, to change the states of / drive shutter elements 520 and 530 at the appropriate timing (e.g. L1 and R1 in the sequence L1, L2, R1, R2), to send clock or timing data back to a transmission device via rf communications, and the like.
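On the glasses side, the shutters can be driven from a local clock that is disciplined by the most recent synchronization packet, so brief gaps in reception do not interrupt the alternation. The sketch below assumes a simple left/right alternation and hypothetical `set_left` / `set_right` callables that open (True) or darken (False) each LCD element; real firmware would instead react to timer interrupts.

```python
import time

LEFT, RIGHT = "L", "R"

def drive_shutters(sync, set_left, set_right, duration_s=1.0):
    """Alternate the LCD shutters using the frame period and phase from `sync`.

    `sync` holds the local-clock time of a left-frame start and the per-eye
    frame period, both derived from the last received synchronization packet.
    """
    period = sync["frame_period_s"]
    t_ref = sync["left_frame_start_s"]
    t_end = time.monotonic() + duration_s
    while time.monotonic() < t_end:
        phase = ((time.monotonic() - t_ref) / period) % 2.0
        eye = LEFT if phase < 1.0 else RIGHT
        set_left(eye == LEFT)      # open only during left frames
        set_right(eye == RIGHT)    # open only during right frames
        time.sleep(period / 20.0)  # poll well inside one frame period
```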
  • Embodiments described above may be useful for hand-held consumer devices such as cell phones, personal media players, mobile internet devices, or the like. Other embodiments may also be applied to higher-end devices such as laptop computers, desktop computers, DVRs, BluRay players, gaming consoles, hand-held portable devices, or the like. Other embodiments may take advantage of existing IR transmission devices for IR shutter glasses. More specifically, in such embodiments, an IR to RF conversion portion may be added to receive the IR 3D output instructions and to convert them to RF 3D transmission signals, described above. In some embodiments, an RF receiver is thus used.
  • the RF 3D transmission signals are then transmitted to the RF 3D shutter glasses, described above.
  • Such embodiments can therefore be a simple upgrade to available IR 3D glasses transmitters.
  • feedback from shutter glasses to the transmitter device described above with regards to synchronization may be used for additional purposes.
  • One such embodiment may allow the 3D image source (e.g. a cable box, computer, or the like) to take the indication that a pair of shutter glasses are currently synchronized to mean a person is viewing the 3D content, and to provide that data back to a marketing company such as Media Metrics, Nielsen Ratings, or the like. By doing this, such market research companies may determine the number of viewers of specific 3D features, or the like.
  • FIG. 11 illustrates various embodiments of the present invention. More specifically, Fig. 11 illustrates incorporation of more than one image sensor onto a consumer device to provide stereoscopic capture capabilities as described herein.
  • a consumer device 613 such as a mobile telephone, personal media player, mobile internet device, or the like includes two imaging sensors 616 and 617 coupled to body 615.
  • imaging sensors 616 and 617 are configured to acquire left and right 2D image pairs at substantially the same time (e.g. within a second or less).
  • stereoscopic cameras 616 and 617 may be embedded directly into consumer device 613, and imaging sensors 616 and 617 may have a fixed position and orientation with respect to body 615.
  • imaging sensors 616 and 617 may be movable within body 615 (e.g. along a track or point of rotation) or may be removable from body 615.
  • cameras 616 and 617 may be affixed at a known displacement (e.g. offset or location) relative to each other. In other embodiments, the displacement between cameras 616 and 617 may be modified by the user and the displacement may be determined by consumer device 613. In various embodiments, cameras 616 and 617 may capture left and right 2D images or videos simultaneously, or may capture such images alternately at any other speed fast enough to approximate simultaneity.
  • the acquisition of such images may be initiated by the user via software and / or hardware portions of consumer device 613.
  • the acquisition may be initiated via depression of a physical switch or the selection of a "soft" button on a display of the consumer device 613.
  • executable software code stored within a memory in consumer device 613 may instruct one or more processors within consumer device 613 to acquire at least a pair of images from image sensors 616 and 617 and to store such images into a memory.
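For the software-initiated acquisition just described, the following Python sketch triggers two sensors as close to simultaneously as a general purpose operating system allows and rejects pairs whose exposures are too far apart in time. The camera objects and their `grab()` method are hypothetical stand-ins for whatever camera API the consumer device provides.

```python
import threading
import time
from dataclasses import dataclass

@dataclass
class Capture:
    image: bytes
    timestamp: float
    camera_id: str

def capture_stereo_pair(left_cam, right_cam, max_skew_s=0.05):
    """Grab one frame from each sensor and check that the pair is near-simultaneous."""
    results = {}

    def grab(cam, key):
        results[key] = Capture(cam.grab(), time.monotonic(), key)

    threads = [threading.Thread(target=grab, args=(left_cam, "left")),
               threading.Thread(target=grab, args=(right_cam, "right"))]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

    skew = abs(results["left"].timestamp - results["right"].timestamp)
    if skew > max_skew_s:
        raise RuntimeError(f"capture skew {skew:.3f}s is too large for a stereo pair")
    return results["left"], results["right"]
```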
  • the acquired left and / or right images may be displayed back to the user on a display of consumer device 613. Additionally, the left / right images may be processed by the one or more processors for display as a stereoscopic image pair.
  • the images may be combined into a static stereoscopic image, and when a lenticular lens (e.g. prismatic) is disposed on top of the display, the user may simultaneously see the left / right images with their respective left / right eyes.
  • a lenticular lens may be provided in various embodiments of the present invention, in the form of a removable sheet the user places over a display of the consumer device to view 3D images.
  • the lens may be part of a removable sleeve the user slips onto the consumer device to view 3D images.
  • the left / right images may be uploaded to another consumer device, such as a laptop, desktop, cloud storage system, television, HD monitor, or the like. In such embodiments, the right / left images may be displayed on a display in a time-interleaved manner and viewed by the viewer.
  • the optical settings and characteristics of cameras 616 and 617 may also be recorded in the memory and / or referenced.
  • Such parameters or settings may be made available to various processing software (resident upon consumer device 613, or other processing device) to further deduce, capture, or process information from cameras 616 and 617.
  • estimates of distances and other measurements may be performed in three dimensions.
  • parameters or settings may include camera parameters, e.g. shutter speed, aperture, gain, contrast, and the like, which may be measured from a left camera and applied to the right camera to normalize the captured left / right 2D images.
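A minimal sketch of that normalization step is shown below: exposure-related settings measured on one camera are copied to the other before the pair is captured. The parameter names and the `set()` method are illustrative assumptions, not a real camera API.

```python
def normalize_right_camera(left_params, right_cam):
    """Copy exposure-related settings measured on the left camera to the right camera.

    `left_params` is a dict such as {"shutter_s": 1/60, "aperture_f": 2.8,
    "gain_db": 6.0, "contrast": 0}; matching these settings keeps the
    brightness and contrast of the two 2D images comparable.
    """
    for name in ("shutter_s", "aperture_f", "gain_db", "contrast"):
        if name in left_params:
            right_cam.set(name, left_params[name])  # hypothetical setter
    return right_cam
```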
  • consumer device 613 may be vertically oriented when acquiring left / right images.
  • a consumer device may be horizontally oriented when acquiring left / right image pairs.
  • An example of this is illustrated by consumer device 614 (e.g. mobile phone) in Fig. 11, where cameras 618 and 620 are distributed along the long axis of device 621.
  • image sensors 618 and 620 may capture right / left images at substantially the same time (or the like) to enable the generation / viewing of stereoscopic images.
  • one or more additional image sensors such as camera 619 may be provided as part of the consumer device.
  • camera 619 may be directed towards the user, while cameras 618 and 620 are directed away from the user.
  • Such embodiments may be provided to capture not only a right / left image pair, but a reaction of the user.
  • the user may use consumer device 613 to record a video of a roller coaster ride in "3D" and to contemporaneously record their reactions
  • camera 619 may be installed or rotated such that cameras 618, 619 and 620 are all pointed in approximately the same direction (e.g. toward the same plane, line, or the like), either towards or away from the user.
  • display of consumer device may also be directed towards or away from the user.
  • images and camera parameters captured by cameras 618-620 may also be used as a source of stereoscopic image data, such as for 3D scene reconstruction, or the like.
  • the consumer device may include one or more segments which move with respect to each other such that the stereoscopic cameras remain horizontally oriented (e.g. level) while the display or other sections of the consumer device are rotated or manipulated. In various embodiments, the cameras may be manually leveled by the user to be horizontally disposed, and in other embodiments, the cameras may be automatically manipulated by the consumer device via feedback from one or more tilt sensors or accelerometers provided in the consumer device.
  • Fig. 12 illustrates embodiments of the present invention directed towards supplementing consumer devices having an image sensor with right / left image acquisition capabilities. In particular, FIG. 12 illustrates embodiments where stereoscopic image capture capabilities can be added to an existing consumer device (e.g. mobile phone) by means of an external cradle, dongle, or other device.
  • a consumer device 622 includes a body portion 624 coupled to a dongle 626.
  • Dongle 626 may include one or more image sensors, such as cameras 627 and 629.
  • dongle 626 provides image data, camera parameter data, or the like to consumer device 622 via a physical and / or data communications channel, such as USB or microUSB connector, wireless (e.g. IR, Bluetooth, Wi-Fi, ZigBee radio (ZigBee Alliance), IEEE Standard 802.15.4, IEEE Standard 802.15.1), docking (e.g.
  • any other method for physically restraining dongle 626 with respect to consumer device 622 is contemplated; additionally, any other transfer protocol for providing data from dongle 626 to consumer device 622 is also contemplated.
  • a user may initiate capture of right / left images on dongle 626 via one or more physical buttons on dongle 626 or consumer device 622 or soft buttons on a display of consumer device 622. Similar to the embodiments described above, executable software code operating upon consumer device 622 may direct a processing device within consumer device 622 or dongle 626 to initiate acquisition of images by cameras 627 and 629. It is contemplated that consumer device 622 may send one or more instruction signals to dongle 626 via the same physical and / or data communications channel as described above. Alternatively, other communications methods and mechanisms for instructing dongle 626 are contemplated.
  • dongle 626 initiates the capturing of one or more images from image sensors 627 and 629. Additionally, dongle 626 may capture image parameters from one or both of image sensors 627 and 629 to assist in capturing, normalizing, and / or generating of stereoscopic images. In various embodiments, such information may include the fixed or relative locations of cameras 627 and 629, optical parameters (e.g. aperture, shutter speed, focal length, ISO, focal point in the images, and the like), level or orientation information from tilt sensor 628, and the like. In various embodiments of the present invention, consumer device 622 may include functionality described above for dongle 626, such as tilt sensor 628, or the like.
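One way to picture the hand-off from dongle to phone is a small capture record that travels with the image data, as sketched below. The field names and JSON encoding are purely illustrative assumptions; the patent does not define a transfer format.

```python
import json
import time

def capture_record(left_image_id, right_image_id, camera_params, tilt_deg):
    """Bundle the metadata a consumer device would need to normalize and pair
    two images captured by an external dongle (image payloads sent separately)."""
    return json.dumps({
        "timestamp": time.time(),
        "left_image_id": left_image_id,
        "right_image_id": right_image_id,
        "camera_params": camera_params,  # e.g. aperture, shutter speed, ISO, focal length
        "tilt_deg": tilt_deg,            # reading from the dongle's tilt sensor
    })
```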
  • dongle 626 may be capable of using more than one communication protocol and may connect to other devices than consumer device 622.
  • dongle 626 may provide right / left images directly to other users' mobile phones, computers, televisions, or the like, via Bluetooth, Wi-Fi, ZigBee radio, IEEE 802.15.1, IEEE 802.15.4, IR, or the like.
  • dongle 626 may communicate with such devices either one at a time, in an interleaved manner, simultaneously, or the like.
  • Fig. 12 also illustrates additional embodiments of the present invention. More particularly, a consumer device 623 is illustrated including an external device such as cradle 634 physically holding or enveloping the body 631 of consumer device 623.
  • cradle 634 includes a single image sensor 632, although in other embodiments, more than one image sensor may be provided.
  • cradle 634 is operated such that the image sensor 630 of consumer device 623 operates in coordination with image sensor 632 to capture right / left image pairs.
  • cradle 634 may be physically coupled to consumer device 623, as illustrated, or may be physically coupled in any manner contemplated or described in the embodiments above (e.g. iPod connector, USB). As illustrated in Fig. 12, consumer device 623 is placed into an undersized opening of cradle 634, and thus consumer device 623 and cradle 634 are physically restrained with respect to each other. Further, cradle 634 may communicate image data, sensor data, and instructions to and from consumer device 623 in any manner described in the embodiments above (e.g. Bluetooth, Wi-Fi, IR).
  • additional camera information such as camera parameters and / or information from a tilt sensor 633 may be determined by cradle 634.
  • cradle 634 may also communicate such information about the optical characteristics and properties of image sensor 632 to consumer device 623. Such data may be used by consumer device 623 to coordinate the actions of image sensors 630 and 632.
  • camera or lens parameters from image sensor 632 may be used to set the parameters of image sensor 630.
  • a gain setting from image sensor 630 may be used to set a gain setting of image sensor 632
  • a shutter speed of image sensor 632 may be used to set a shutter speed of image sensor 630, and the like.
  • Fig. 13 illustrates various embodiments of the present invention.
  • a consumer device 635 is illustrated including two sections 639 and 642 and at least a pair of image sensors 640 and 641.
  • sections 639 and 642 are coupled together by a hinge or other conveyance.
  • consumer device 635 may be "folded up", consumer device 636 may be partially opened, consumer device 637 may be fully opened, or the like, as shown.
  • the displacement between sensors may vary.
  • image sensor 643 is disposed upon the side or end of section 644 and image sensor (e.g. camera) 647 is disposed upon the side or end of section 646.
  • cameras 643 and 647 are laterally displaced with respect to each other; in another case, image sensor 640 is adjacent to image sensor 641; and in another case, cameras 648 and 652 are far away from each other.
  • the orientation of the two cameras in terms of their distance relative to each other, their rotation relative to each other, the tilt of the entire system and the like, are variable. Because of this, in various embodiments, the displacements between the cameras, camera parameters, image parameters and the like may be recorded. As described in the various embodiments above, such data may be used for many purposes by the consumer device, external device (e.g. desktop computer), or the like, such as determining stereoscopic images, 3D image reconstruction, or the like.
  • an additional image sensor such as image sensor 45 may also be included and may provide all the benefits of more than two cameras described herein.
  • the additional image sensors may be fixed or rotated such that the three cameras are pointed in the same direction (e.g. toward the same plane, line, or other geometric construction) such that stereoscopic information can be deduced for multiple orientations of the device, in opposite directions, or the like.
  • Fig. 14 illustrates additional features of embodiments of the present invention illustrated in Fig. 13.
  • a field of view 663 is shown for camera 664
  • a field of view 671 is illustrated for camera 670, and the like.
  • cameras 659 and 660 that are adjacent in consumer device 657 are "pulled apart" in consumer device 654, and cameras 664 and 670 are then separated and rotated relative to each other.
  • cameras 664 and 670 may remain in the same plane as they move; however, they may also change plane with respect to each other.
  • tilt sensors 666 and 668 may be used to determine the tilt of each camera. These measurements may be referenced to determine a separation angle between cameras 664 and 670. Then, using the known geometry of the device, the linear displacement, or the like, between cameras 664 and 670 can be determined. In various embodiments, such information may be deduced by other means, such as installing a single tilt sensor and directly measuring the angle of a hinge 667, by deduction from the camera image data, or the like.
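The geometric deduction can be pictured with a simple model, sketched below: each camera sits at the end of an arm of known length from the hinge, each tilt sensor reports its section's angle, and the separation angle between the cameras is taken as the sum of the two tilts. These modeling choices are assumptions made for illustration, not the patent's own formula; a real device would use its actual mechanical dimensions.

```python
import math

def camera_baseline_from_tilt(tilt_left_deg, tilt_right_deg, arm_length_m):
    """Estimate the camera separation (baseline) of a hinged stereoscopic device.

    Assumes each camera is `arm_length_m` from the hinge and that the included
    angle between the two arms equals the sum of the two tilt readings. The
    baseline is then the chord between two points at equal radius from the hinge.
    """
    included_rad = math.radians(abs(tilt_left_deg) + abs(tilt_right_deg))
    separation_angle_deg = math.degrees(included_rad)
    baseline_m = 2.0 * arm_length_m * math.sin(included_rad / 2.0)
    return separation_angle_deg, baseline_m

# Example: 7 cm arms, each section tilted 20 degrees -> roughly a 4.8 cm baseline.
print(camera_baseline_from_tilt(20.0, 20.0, 0.07))
```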
  • Fig. 15 illustrates an example of the result of various embodiments of the present invention.
  • More particularly, FIG. 15 shows an example of image data 682 and 684 from two image sensors of a consumer device that have been separated by an arbitrary distance. In this example, image data 682 and 684 are rotated and tilted relative to each other as a result of being captured on a consumer device similar to cameras 664 and 670 in Fig. 14.
  • images captured by a user are expected to be rectangular in shape and parallel to the ground.
  • rectangles 681 and 683 represent level rectangular image information available from image data 682 and 684.
  • lines 685 and 686 illustrate that rectangles 681 and 683 are level.
  • rectangles 681 and 683 may be used to represent the right / left image pair for generating a stereoscopic 3D image.
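The level rectangles can be recovered programmatically by rotating each captured image by its measured tilt and cropping away the empty corners, as in the sketch below. It uses the Pillow imaging library and the standard largest-inscribed-rectangle formula for a rotated image; this is an illustration of the idea behind rectangles 681 and 683, not the patent's own procedure, and the same function would be applied to both images of the pair.

```python
import math
from PIL import Image

def max_level_rect(w, h, angle_rad):
    """Width and height of the largest axis-aligned rectangle fully contained
    in a w x h image after it is rotated by angle_rad (standard formula)."""
    width_is_longer = w >= h
    side_long, side_short = (w, h) if width_is_longer else (h, w)
    sin_a, cos_a = abs(math.sin(angle_rad)), abs(math.cos(angle_rad))
    if side_short <= 2.0 * sin_a * cos_a * side_long or abs(sin_a - cos_a) < 1e-10:
        x = 0.5 * side_short
        wr, hr = (x / sin_a, x / cos_a) if width_is_longer else (x / cos_a, x / sin_a)
    else:
        cos_2a = cos_a * cos_a - sin_a * sin_a
        wr = (w * cos_a - h * sin_a) / cos_2a
        hr = (h * cos_a - w * sin_a) / cos_2a
    return wr, hr

def level_crop(img, tilt_deg):
    """Rotate an image so its content is level, then crop to the valid region.

    The sign of `tilt_deg` follows whatever convention the tilt sensor uses.
    """
    w, h = img.size
    wr, hr = max_level_rect(w, h, math.radians(tilt_deg))
    leveled = img.rotate(tilt_deg, resample=Image.BICUBIC, expand=False)
    left, top = (w - wr) / 2.0, (h - hr) / 2.0
    return leveled.crop((int(left), int(top), int(left + wr), int(top + hr)))
```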
  • a consumer device 678 may display one or both of image data 681 and 683 to the user on a display as 2D images or as a 3D stereoscopic image, before or while acquiring or storing image and / or video data. In such embodiments, the user can be provided feedback as to how to reorient the image sensors with respect to each other, to capture the desired 2D image(s).
  • the inventors have determined that if images 681 and 683 do not have sufficiently overlapping subject matter, or if images 681 and 683 have narrow fields of view, a stereoscopic image formed from images 681 and 683 will not convey a significant 3D effect to the viewer.
  • the feedback from consumer device 678 may be provided in real-time to the user.
  • consumer device 678 may provide feedback in the form of tilt sensor feedback, to encourage the user to hold the device such that both cameras are more level, as illustrated by consumer device 655 in Fig. 14. In practice, the inventors have determined that if the cameras are more level with respect to each other, the rectangular size of the region of interest increases.
  • face recognition technology can also be used, either to override or to coordinate with the tilt sensor, to increase or maximize the area of a face which is captured by images 681 and 683.
  • the system can encourage the user to reorient the system manually or the system may be able to do so automatically.
  • consumer device 678 may increase the region of interest (e.g. image data 681 and 683) by encouraging the user to manually level the cameras with respect to each other; encouraging the user to manually open or close the hinge completely; encouraging the user to "zoom out" or pan upwards; etc.
  • the inventors have determined that automatic zooming out or panning upwards are particularly useful if face recognition technology is also included. As an example, such techniques would be useful to prevent images 681 and 683 from cropping out the eyebrows of the person illustrated in Fig. 15.
  • stereoscopic 3D image data may be deduced without the need for multiple camera image capture systems, such as those shown in 613, 614, 622, 623, 635, 636 or 637 if the subject is still enough that the user can generate one image such as 682 and then translate or rotate the camera to produce another image such as 684. If the subject is not sufficiently still, multiple images can still be used to deduce the stereoscopic data that is correct for a particular image.
  • a graphical display similar to the one shown in 678 may be displayed to the user, either as a flat (e.g. 2D) or stereoscopic 3D image.
  • the consumer device may then provide the user with information which allows the user to translate or rotate the camera of the consumer device into a new position to take another image.
  • the multiple images from a single camera may substitute for a set of single images from multiple cameras.
  • the consumer device determines the above-mentioned horizontal region of interest on the first image and displays information to guide the user in taking the next picture, to increase the 3D overlap or effect.
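One simple form of that guidance is to track a common feature (for example, a detected face center) in the first image and in the live preview, and to tell the user when the sideways shift of the camera has produced roughly the desired disparity. The function below is a hedged sketch: the target and tolerance fractions are illustrative defaults, not values taken from the patent, and feature detection is assumed to happen elsewhere.

```python
def second_shot_guidance(first_feature_x, preview_feature_x, frame_width_px,
                         target_disparity_frac=0.08, tolerance_frac=0.02):
    """Return a user prompt based on how far the tracked feature has shifted
    horizontally between the first captured image and the current preview."""
    shift_frac = abs(preview_feature_x - first_feature_x) / float(frame_width_px)
    if abs(shift_frac - target_disparity_frac) <= tolerance_frac:
        return "hold still and take the second picture"
    if shift_frac < target_disparity_frac:
        return "move the camera a little further sideways"
    return "you have moved too far; move back slightly"
```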
  • the user may be given written, verbal, or other instructions to take both pictures.
  • Such embodiments can also assist the user in taking the first image offset to one side instead of centered. This allows the second image to be offset to the other side, the result of which is the subject is centered in the deduced stereoscopic 3D image.
  • An example of such instructions is "Take a picture with the left eye on the viewfinder, then move the camera such that the right eye is on the viewfinder and take another picture."
  • the feedback that the user has generated both images appropriately can be provided after both images are taken and after the resulting stereoscopic 3D image is determined.
  • the camera image data is examined to provide the user with graphical feedback to assist the user in capturing the second image.
  • the user is presented with a display, audible or other information to help select the second image after the first image has been taken.
  • facial recognition technology may be useful to encourage the user to translate the entire camera such that stereoscopic image data of an entire face can be ensured, as opposed to a stereoscopic 3D image of a landscape or still life, for example.
  • the dongle described above may be operated to acquire 2D images when semi-permanently affixed to a consumer device.
  • the dongle may be operated to acquire 2D images apart from the consumer device. Subsequently, the 2D images may be provided to the consumer device using one or more of the transmission protocols described above. In such embodiments, the dongle may be stored semi-permanently affixed to consumer

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Stereoscopic And Panoramic Photography (AREA)

Abstract

A system, apparatus, method, and computer-readable media are provided for the capture of stereoscopic three dimensional (3D) images using multiple cameras or a single camera manipulated to deduce stereoscopic data. According to one method, a dongle or cradle is added to a mobile phone or other device to capture stereoscopic images. According to another method, the images are captured from cameras with oblique orientation such that the images may need to be rotated, cropped, or both to determine the appropriate stereoscopic 3D regions of interest. According to another method, a single camera is manipulated such that stereoscopic 3D information is deduced.

Description

METHOD OF STEREOSCOPIC 3D IMAGE CAPTURE AND VIEWING
BACKGROUND OF THE INVENTION
[0001] The present invention relates to stereoscopic 3D image acquisition methods and apparatus. More particularly, the present invention relates to stereoscopic 3D image capture and viewing devices incorporating radio frequency communications.
[0002] When two-dimensional images that represent left and right points of view are sensed by the respective left and right eyes of a user, the user typically experiences the perception of a 3D image from the two-dimensional images. The inventors are aware of several systems that allow users (e.g. individuals or groups) to perceive stereoscopic 3D depth in images, photos, pictures, moving pictures, videos, or the like, by the selective transmission of images to a user's eyes. Such systems include the use of display systems including light projection / reflection within a public or home theater or emissive or transmissive displays (e.g. LCD, plasma display, flat-panel display, or the like) to alternatively or simultaneously output right eye images and left eye images to a user. To view such 3D images, a variety of approaches have been provided to the user, including prisms, static polarized glasses, LCD shutter glasses, or the like. The inventors of the present invention have recognized that existing approaches have many drawbacks, as will be discussed below. [0003] One approach has been with the use of polarized glasses, where the left and right lenses have a fixed orthogonal polarization (e.g. clockwise-circular and counter-clockwise-circular polarization). The inventors of the present invention have determined that such systems have a number of drawbacks. One such drawback includes that such systems typically rely upon images provided by a light projector and thus such systems are limited for use in darkened environments. Another drawback includes that such systems typically rely upon expensive silver or metalized reflective screens that maintain the appropriate polarization of light from the projector for the right and left eye images. Such screens are often too expensive for the average consumer. Yet another drawback is that because both left and right eye images are displayed to the user at the same time, despite the polarized glasses, right eye images are often visible to the left eye and left eye images are often visible to the right eye. This light pollution degrades the quality of the 3D images and can be termed "ghosting" of 3D images. [0004] The inventors are aware of a number of techniques that may be used to reduce this ghosting effect. Some techniques may include deliberate degradation of left eye images to account for right eye image ghosting and the deliberate degradation of right eye images to account for left eye image ghosting. The inventors believe that such techniques are disadvantageous as they tend to reduce the contrast of objects in an image, and they may result in a visible halo around objects in the image. As a result of using these circular or linear polarized glasses, the inventors have recognized that 3D versions of features often do not appear as aesthetically pleasing as 2D versions of such features.
[0005] Another approach to 3D visualization has included the use of stereoscopic shutter glasses that are based upon physical shutters, or more commonly liquid crystal display (LCD) technology. With such approaches, left and right images are alternatively displayed to the user, and the right and left LCD lenses alternate between a dark and transparent state. When the shutter glasses quickly alternate between transparency in the left then right eyes in synchronicity with an image which presents alternating left and right points of view, the left and right eyes receive different 2D images and the observer experiences the perception of depth in a 3D image.
[0006] Fig. 1 illustrates a typical stereoscopic system. As illustrated, such systems typically include a computer 1, an infrared transmitter 3, a display 12, and a pair of liquid crystal display glasses (LCD shutter glasses) 8. In such systems, computer 1 alternatively provides left eye images and right eye images on signal line 2, in addition to a signal that distinguishes when the left eye image or right eye image is displayed.
[0007] In response to the signal, IR transmitter 3 outputs infrared data 6 that indicate when the right eye image is being output and when the left eye image is being output. The inventors note that many different manufacturers currently have different IR data packet definitions and protocols. For example, one simple format for infrared data is a simple square wave with a high signal indicating left and a low signal indicating right; and another format includes an 8-bit word. Because of these different data formats, IR transmitters from one manufacturer often cannot be used with LCD glasses from another manufacturer.
[0008] In various systems, infrared data 6 is received by LCD glasses 8, and in response, for example, the right LCD of the LCD glasses 8 becomes opaque and the left LCD becomes translucent (e.g. clear, neutral density), or the left LCD of the LCD glasses 8 becomes opaque and the right LCD becomes translucent. Ideally, at the same time the right LCD becomes translucent, display 12 is displaying a right eye image, and when the left LCD becomes translucent, display 12 is displaying a left eye image.
[0009] In theory, systems illustrated in Fig. 1 are expected to provide a workable, robust system. However, in practice, the inventors have determined that there are many limitations that degrade the performance of such systems and that prevent such systems from being successfully and widely adopted.
[0010] One such limitation includes the difficulty in synchronizing the glasses to the images that are displayed. Synchronization data is typically based upon when the images are provided to the 3D display. Limitations to such approaches, determined by the inventors, include that both latency and timing jitter are introduced as the image data is processed and rendered by the 3D display device. As a result of such latency and jitter, the LCD lenses or shutters are often opened and closed at improper times, e.g. out of phase, with some of the image intended for the left eye being shown to the right eye and vice versa. This is perceivable by the user as ghosting effects. Additionally, as the inventors have determined that the phase difference is not constant and is subject to jitter, the user may see the image brightness change or flicker undesirably.
[0011] One approach to reduce such latency or jitter effects has been to reduce the amount of time the left LCD shutter and the amount of time the right LCD shutter are translucent. In such approaches, instead of the left shutter being open for example 50% of the time, the left shutter may be open 35% of the time, or the like. This reduction in open time should reduce the amount of ghosting.
[0012] The inventors recognize drawbacks to such an approach to reducing image ghosting. One such drawback is the reduction in the net amount of light transmitted to the user's eyes. In particular, as the exposure time for each eye is reduced, the user will perceive a darkening of the images for each eye. Accordingly, a 3D version of a feature will appear darker and duller compared to a 2D version of the feature when using IR-type shutter glasses.
[0013] Another limitation is the use of the IR communications channel itself. The inventors of the present invention have determined that LCD glasses based upon IR receivers often lose synchronization with the display as a result of stray reflections. For example, it has been observed by the inventors that IR LCD glasses may become confused as a result of sunlight reflecting from household objects; heat sources such as candles, open flames, and heat lamps; other IR remote controls (e.g. television remotes, game controllers); light sources (e.g. fluorescent lights); and the like. Additionally, it has been observed by the inventors that IR LCD glasses may also lose synchronization as a result of clothing, hair, portions of other users' bodies (e.g. head), or the like, that temporarily obscure an IR receiver of the LCD glasses. The loss of synchronization may lead to the user seeing a series of flickering or rolling images and / or the left eye seeing the right eye image. The inventors believe these types of anomalies are highly disturbing to most users and should be inhibited or minimized.
[0014] In some cases manufacturers of such devices specifically instruct users to use IR LCD glasses in highly controlled environments. For example, they suggest that the 3D displays and glasses be used only in darkened rooms. The inventors believe such a solution limits the applicability and attractiveness of such 3D display devices to typical consumers. This is believed to be because most consumers do not have the luxury of a dedicated, light-controlled room for a home theater, and because most consumer entertainment rooms are multipurpose family rooms.
[0015] An additional drawback to such devices, determined by the inventors, is that multiple 3D display systems cannot easily be operated in the vicinity of each other. As described above, each 3D display system includes its own IR transmitter and 2D field timing and phase data. Then, when two such systems are in close proximity, a user's IR LCD glasses may receive IR transmissions from either of the 3D display systems. Because of this, although a user is viewing a first 3D display, the user's 3D glasses may be synchronizing to a different 3D display, causing the user to undesirably view flickering and rolling images. The inventors of the present invention thus recognize that multiple 3D display systems cannot easily be used in applications such as for public gaming exhibitions, tournaments, or contests, trade shows, in stadiums, in restaurants or bars, or the like.
[0016] With respect to stereoscopic 3D image capture, the inventors have recognized that methods for capturing such images have been limited and have been beyond the reach of the average consumer. The inventors of the present invention recognize that capturing of stereoscopic 3D images currently requires complex 3D imaging hardware and processing software. These systems are typically dedicated to acquiring or generating stereoscopic images. For example, the left and right 2D images that are used to form stereoscopic 3D images are generated entirely by computation (e.g. animation software) or by a pair of professional grade cameras which have been firmly affixed to each other with tightly manufactured proximity and spacing, not necessarily in the prior art. The inventors believe that because current systems provide such narrow and specialized functionality, they are too "expensive" for typical consumers to purchase. The inventors believe that if the cost of 3D hardware and software capturing systems could be reduced, consumers would more readily embrace stereoscopic 3D imaging.
[0017] The inventors understand that a number of software products have been developed for particular devices to enable a user to use a single camera to acquire "stereoscopic" images. FIG. 10 illustrates a number of devices 601-603 including cameras 605, 607 and 609, that may or may not exist, that might use such software. For example, device 604 includes an off-center camera 605; device 606 includes a centered camera 607; and device 603 includes camera 609. As shown in FIG. 10, portions 610 and 608 of device 603 may be reoriented or repositioned with respect to each other, as shown in dotted positions 611 and 612.
[0018] Problems with such approaches, determined by the inventors, include that they require the user to be very careful how they position the camera to capture two images, one after the other. If the direction in which the camera is pointing is too different between the two images, the images may not overlap, and any three-dimensional stereoscopic effect of the two images may be lost. Another problem, considered by the inventors, is that such approaches require objects in the scene to be relatively stationary. If objects move to a large degree between the two images, the three-dimensional stereoscopic effect of the two images may also be lost. Yet another problem, believed by the inventors, includes that a user cannot easily receive feedback from such software products when acquiring certain images. In particular, it is believed that a large number of stereoscopic photographs that users may wish to take using such software will be user self-portraits. However, because single camera devices typically have cameras on the opposite side of the device from the user display, the user will not be able to view any instructions from the software while taking such self-portraits. Typical examples of devices where the camera is on the opposite side of the user display include the Apple iPhone, Motorola Droid, HTC Nexus, and the like.
[0019] Accordingly, what is desired are improved methods and apparatus for improved 3D image capture without the drawbacks discussed above.
BRIEF SUMMARY OF THE INVENTION [0020] The present invention relates to stereoscopic 3D image capture and viewing methods and apparatus. More particularly, the present invention relates to stereoscopic 3D image viewing devices incorporating radio frequency communications. [0021] In various embodiments of the present invention, a stereoscopic 3D image viewing device is based upon liquid crystal display (LCD) shutters that are synchronized to a source of 3D images. In various embodiments, the synchronization is based upon rf protocols such as Bluetooth, ZigBee radio (ZigBee Alliance), IEEE Standard 802.15.1, IEEE Standard 802.15.4, or any other type of rf communications protocol. In some embodiments of the present invention, the stereoscopic 3D image viewing device may transmit data back to the source of 3D images, via the rf communications mechanism or protocol, to increase the level of synchronization between the two devices.
[0022] In various embodiments, by using a multitude of communications protocols (e.g. rf) and adding feedback from 3D shutter glasses back to the 3D image source, a system, method, and apparatus of perceiving stereoscopic 3D can be generated which improves the level of synchronization between the alternating images and the alternating action of shutter glasses. A system, apparatus, method, and computer-readable media are provided to enable stereoscopic viewing. In particular, according to one method, the physical method of connecting the display system to stereoscopic glasses is the IEEE 802.15.4 wireless radio, ZigBee radio or Bluetooth technology. This allows a user to move one's head into positions that would normally lose reception of wireless transmissions (e.g. IR) thus simplifying the user experience of wearing stereoscopic glasses. The wireless radio connection also has the advantage of replacing the infra-red light transmission method and its associated interference with remote controls and tendency to accept interference from natural and artificial light sources, thus enhancing the user experience.
[0023] According to other aspects, a method is provided for synchronization between the video transmitter and the shutter glasses. Synchronization is provided via a protocol that provides timing information such as a beacon offset or any series of packets that is used as the energy to excite a clock. A precision timing protocol may be utilized to provide synchronization between the transmitter and the receiver.
[0024] The above-described subject matter may also be implemented as a computer- controlled apparatus, a computer process, a computing system, or as an article of manufacture such as a pair of electronic glasses, an earbud or headset, a computer program product or a computer-readable medium. These and various other features will be apparent from a reading of the following Detailed Description and a review of the associated drawings. In various embodiments, the shutter glasses and the transmission device may include executable computer programs resident in a memory that instructs a respective processor to perform specific functions or operations, such as to transmit data, to determine a latency, or the like.
[0025] This Summary is provided to introduce a selection of concepts in a simplified form that are further described below in the Detailed Description. This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used to limit the scope of the claimed subject matter.
[0026] According to one aspect of the present invention, a device for providing 3D synchronization data to a user 3D viewing device is disclosed. One apparatus includes a receiving portion configured to receive 3D image data indicating timing of left and right images from a source of 3D data. A system may include a radio frequency transmitter coupled to the receiving portion, wherein the radio frequency transmitter is configured to output 3D synchronization signals to the 3D viewing device in response to the 3D image data.
[0027] According to another aspect of the present invention, a method for transmitting stereoscopic display information is disclosed. One process includes receiving a plurality of 3D video synchronization signals from a source of 3D image data, and converting the plurality of 3D video synchronization signals into a plurality of wireless radio signals. A method may include outputting the plurality of wireless radio signals to a pair of shutter glasses associated with a user that are adapted to receive the wireless radio signals, wherein the plurality of wireless radio signals are adapted to change the states for a pair of LCD shutters on the pair of shutter glasses, in response to the wireless radio signals.
[0028] According to yet another aspect of the invention, a method for operating a pair of shutter glasses including a right LCD shutter and a left LCD shutter is disclosed. One process includes receiving synchronization data via radio frequency transmissions from a radio frequency transmitter, and determining shutter timings for the right LCD shutter and the left LCD shutter in response to the synchronization data. A technique may include applying the shutter timings to the right LCD shutter and the left LCD shutter to enable the viewer to view right-eye images via the right LCD shutter and left-eye images via the left LCD shutter.
[0029] According to another aspect of the invention, a method for transmitting stereoscopic display information includes converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device. [0030] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE Standard 802.15.1, Bluetooth, or components thereof.
[0031] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE 802.15 (802.15.1-4) ZigBee radio, or components thereof.
[0032] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: converting one or more video synchronization signals into wireless radio signals; and decoding the wireless radio signal in a pair of shutter glasses or other device; wherein the wireless radio is the IEEE Standard 802.11, WiFi, or components thereof.
[0033] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which contains a localized clock such that the device remains synchronous to the video display system even when the connection to the source transmitting the synchronization information is interrupted or is not present. In some aspects, the synchronization information between the display system and the glasses or other device are determined by a precision timing protocol in which bidirectional communication of timing information occurs.
[0034] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which receives synchronous information from the video display system, and a means and method of storing the delay and synchronization information in the transmitter or the video source generating computer, home theater system, or device. In some aspects, the delay and synchronization information are stored and then transmitted to multiple devices to allow multiple users to simultaneously use the same system. [0035] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: a pair of shutter glasses or other consumer electronics device which receives synchronous information from the video display system, and a means of determining the delay and synchronization information through information contained in the display and transmitter from the display via the video signal cable.
[0036] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and visible light sources. [0037] According to another aspect of the invention, a method for transmitting stereoscopic display information includes: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and radio sources.
[0038] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both infrared and radio sources.
[0039] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both visible light and radio sources.
[0040] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from infrared, visible light and radio sources. In various aspects, the shutter glasses or other receiving device can incorporate a computer program which allows the device to automatically determine which source or sources of synchronization information are available and automatically use the best source or sources.
[0041] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information from both visible light and another source, and the visible light information is also used to deduce that correct image is going to the correct eye and that the information is not reversed such that the left image is going to the right eye and vice versa.
[0042] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information where the system is capable of dynamically changing whether all viewers are sharing the same set of images or different sets of images.
[0043] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a transmitter of synchronization information and a pair of shutter glasses or other consumer electronics device which is capable of receiving synchronization information where the system is capable of displaying a sequence such that the wearers of shutter glasses or other consumer electronics devices see stereoscopic content while viewers without glasses or without other consumer electronics devices see only the left or right image, a non-stereoscopic version of the content, a blank or solid colored screen, or an arbitrary piece of content such as an advertisement. In various aspects, anti left, anti right, or combined anti left and anti right images are incorporated into the video frame sequence.
[0044] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a pair of shutter glasses or other consumer electronics device and a transmitter of synchronization information which has been incorporated into a mobile device either by using an unused wireless or infrared technology on the phone or by adding addition information to a protocol the phone is already using. In various aspects, the wireless technology is the IEEE Standard 802.15.1, Bluetooth, or components thereof. In various aspects, the wireless technology is the IEEE Standard 802.15.3, ZigBee radio (compliant with IEEE 802.15.4), or components thereof.
[0045] According to another aspect of the invention, a method for transmitting stereoscopic display information, the method including: a pair of shutter glasses or other consumer electronics device and a transmitter of synchronization information which has been incorporated into a mobile device which has been augmented by an external cradle, dongle or device which contains additional hardware and means of providing synchronization information or image viewing. In various aspects, the cradle, dongle or device contains an image projector.
[0046] According to another aspect of the invention, a method of using stereoscopic glasses as ordinary sunglasses is disclosed. In various aspects, the stereoscopic glasses incorporate a visible light sensor and make automatic decisions about the appropriate level of perceived darkening. In various aspects, the level of perceived darkening is based on computer algorithms, information about the user and the environment stored on a mobile device, information retrieved from a computer network via the mobile device, and other deductions. In other aspects, the stereoscopic glasses and ordinary sunglasses are combined with a wireless headset, Bluetooth headset, or stereo Bluetooth headset.
[0047] According to another aspect of the invention, a method of combining stereoscopic glasses with a wireless headset, Bluetooth headset, or stereo Bluetooth headset is disclosed.
[0048] According to another aspect of the invention, a method of combining ordinary or automatically darkening sunglasses with a wireless headset, Bluetooth headset, or stereo Bluetooth headset is disclosed.
[0049] Embodiments of the present invention include an imaging device including one or more image sensors (e.g. cameras) and a communications channel. In various embodiments, the imaging device may be physically coupled to a general purpose consumer device such as a personal media player (e.g. iPod), a communications device (e.g. iPhone, Android-based phone), a mobile internet device, a processing device (e.g. netbook, notebook, desktop computers), or the like. Additionally, the imaging device may utilize the communications channel (e.g. Bluetooth, Wi-Fi, ZigBee radio, IR, USB, IEEE 802.15.1, IEEE 802.15.4) to provide image data from the imaging device to the consumer device. [0050] In other embodiments of the present invention, the imaging device may be used independently of the consumer device to acquire stereoscopic images, and such images may be provided to the consumer device via the communications channel. In turn, the consumer device may process and / or retransmit the stereoscopic images to a remote server. For example, such stereoscopic images may be viewed on the consumer device and / or uploaded to the web (e.g. Facebook, MySpace, TwitPic), sent via e-mail, IM, or the like.
[0051] In various embodiments, the imaging device may capture one of the left or right pair of 2D images, and an image sensor on the general purpose consumer device may be used to capture the other 2D image. In other embodiments, the imaging device may include two or more image sensors (e.g. embedded therein) and be used to capture the left and right stereoscopic pair of 2D images. In various embodiments, the pair of images is typically captured simultaneously or within a short amount of time apart (e.g. less than 1 second) to facilitate proper 3D image capture. This time period may increase when photographing still life, landscapes, or the like. [0052] In specific embodiments, users (e.g. consumers) may want to capture stereoscopic 3D images using a portable device such as a mobile phone, smart phone, or other device. Embodiments for methods of stereoscopic 3D image capture could incorporate an existing phone or device, or a new piece of hardware such as a cradle or dongle for an existing phone or device. Other embodiments may include a piece of software or computer readable method of using an existing or new device to capture stereoscopic 3D images. Still other embodiments may include a system and method that combines these aspects in the capture of stereoscopic 3D images.
[0053] It is with respect to these and other considerations that embodiments of systems, methods, apparatus, and computer-readable media are provided for improved image capture of stereoscopic 3D images, photos, pictures, videos, or the like.
[0054] Reference to the remaining portions of the specification, including the drawings and claims, will realize other features and advantages of various embodiments of the present invention. Further features and advantages of various embodiments of the present invention, as well as the structure and operation of various embodiments of the present invention, are described in detail below with respect to accompanying drawings.
[0055] According to one aspect of the invention, a consumer device for capturing stereoscopic images is disclosed. One apparatus includes a plurality of image acquisition devices, wherein a first image acquisition device and a second image acquisition device are both approximately directed in a common direction, wherein the first image acquisition device and the second image acquisition device are displaced by a displacement, wherein the first image acquisition device is configured to capture a first image, and wherein the second image acquisition device is configured to capture a second image. A system may include a user input device configured to receive an input from a user, a memory configured to store the first image and the second image, and a wired or wireless communications portion configured to transmit data to a remote device. Various devices may include a processor coupled to the first image acquisition device, to the second image acquisition device, to the user input device, and to the communications portion, wherein the processor is configured to approximately contemporaneously direct acquisition of the first image by the first image acquisition device and of the second image by the second image acquisition device in response to the input from the user, wherein the processor is configured to direct storage of the first image and the second image in the memory, and wherein the processor is configured to direct the communications portion to transmit at least a portion of the first image and at least a portion of the second image to a remote device.
[0056] According to another aspect of the invention, a method for capturing stereoscopic images, photos or videos on a mobile computing device, wherein the mobile computing device includes at least a first camera and a second camera, and wherein a distance and an orientation between the first and the second cameras are determinable, is disclosed. Techniques may include receiving an initiation signal from a user, while the user points the first and the second cameras in a direction of interest, and substantially simultaneously acquiring a first image with the first camera, a second image with the second camera, and camera parameters, in response to the initiation signal. One process may include storing the first image, the second image and the camera parameters in a memory, and uploading at least a portion of the first image, at least a portion of the second image, and the camera parameters to a remote server.
BRIEF DESCRIPTION OF THE DRAWINGS
[0057] In order to more fully understand the present invention, reference is made to the accompanying drawings. It is to be understood that these drawings are not to be considered limitations on the scope of the invention. The presently described embodiments and the presently understood best mode of the invention are described with additional detail through use of the accompanying drawings in which: [0058] FIG. 1 is a block diagram illustrating aspects of the prior art;
[0059] FIGS. 2A-D include block diagrams of various embodiments of the present invention illustrating the process of elements of a system in which stereoscopic glasses are synchronized with the display device by incorporation of a wireless radio into the system; [0060] FIG. 3 illustrates a block diagram of a process according to various embodiments of the present invention;
[0061] FIG. 4 is a timing diagram of various embodiments of the present invention illustrating a method of sending image information to a display in which the frames which compose the image are sent sequentially; [0062] FIG. 5 illustrates various embodiments incorporated into a mobile phone's hardware, firmware, and software and into a pair of stereoscopic shutter glasses;
[0063] FIG. 6 illustrates various embodiments in which some methods are incorporated into a mobile phone, some methods are incorporated into a pair of stereoscopic shutter glasses, and other methods are incorporated into a cradle or other device that attaches to the mobile phone; [0064] FIG. 7 illustrates various embodiments incorporated into a pair of stereoscopic shutter glasses combined with a mobile phone headset;
[0065] Fig. 8 illustrates various embodiments of the present invention; [0066] Fig. 9 illustrates various embodiments of the present invention;
[0067] FIG. 10 is a diagram illustrating aspects of the prior art;
[0068] FIG. 11 is a diagram illustrating embodiments of the present invention including methods of incorporating stereoscopic capture capabilities into a mobile device by embedding multiple cameras;
[0069] FIG. 12 is a diagram illustrating embodiments of the present invention including methods of incorporating stereoscopic capture capabilities into a mobile phone or other device by attaching an external dongle or fitting the device to a cradle;
[0070] FIG. 13 is a diagram illustrating embodiments of the present invention where multiple cameras have been incorporated into a mobile device in which the relative orientation of the cameras can be changed by way of actuating a hinge which is part of the mobile device; [0071] FIG. 14 is a diagram illustrating embodiments of the present invention where the fields of view of two cameras change as a hinge is manipulated; and
[0072] FIG. 15 is a diagram illustrating embodiments of the present invention where the fields of view of two obliquely oriented cameras are rotated and/or cropped to specific regions of interest.
DETAILED DESCRIPTION OF THE INVENTION
[0073] FIGS. 2A-D illustrate various embodiments of the present invention. In particular, FIGS. 2A-D illustrate various arrangements of embodiments of the present invention.
[0074] FIG. 2A includes a 3D source 34 of image data, a transmission device 37, a display 43, and shutter glasses 42. In various embodiments, 3D source 34 may be a computer, a Blu-ray or DVD player, a gaming console, a portable media player, set-top-box, home theater system, preamplifier, a graphics card of a computer, a cable box, or the like, and 3D display 43 may be any 3D display device such as an LCD/Plasma/OLED display, a DLP display, a projection display, or the like. In various embodiments, transmission device 37 and shutter glasses 42 may be embodied by a product developed by the assignee of the current patent application, Bit Cauldron Corporation of Gainesville, FL. In some embodiments, shutter glasses 42 may be implemented with mechanical shutters or LCD shutters. For example, LCD shutters based upon pi-cell technology may be used.
[0075] In operation, 3D source 34 sends 3D display signals to display 43 through a video cable 35, typically through a standards-based interface such as VGA, DVI, HDMI, Display Port (DP), or the like. Such 3D display signals are often configured as one or more interleaved full right-eye images then full left-eye images (e.g. field sequential); double wide (e.g. side by side) or double height (e.g. stacked) images including both left and right images; images interleaved with right-eye images and left-eye images on a pixel by pixel basis; or the like. As shown in FIG. 2A, a transmission device 37, e.g. a radio transmitter may be inserted between the 3D source 34 or other video source and 3D display 43.
[0076] In various embodiments, transmission device 37 determines 3D timing information by decoding the 3D display signals as they pass through to display 43 on signal line or cable 44. In FIG. 2A, transmission device 37 includes a transmitter based upon radio frequency (rf) signals. The rf signals may use or may be combined with any conventional transmission protocol such as IEEE Standard 802.15.1 (e.g. Bluetooth), Wi-Fi, IEEE Standard 802.15.4 (e.g. ZigBee Alliance radio), or the like. In various embodiments, synchronization signals 40 are then transmitted via antenna 39.
[0077] In various embodiments, transmission device 37 may be a stand-alone device, e.g. a dongle, a USB "key," or the like, and transmission device 37 may be powered by power sources 36 and 38, self-powered, powered from the 3D data source, USB powered, or the like. In other embodiments, transmission device 37 may be incorporated into another device, such as 3D source 34, 3D display 43, a pre-amplifier, or the like.
[0078] FIG. 2B illustrates additional embodiments of the present invention. In particular, Fig. 2B includes a source of 3D images 100, a transmission device 110, and a 3D display 120. As illustrated, 3D image source 100 provides 3D images (e.g. double-wide or double-height images including both right and left images) to 3D display 120 via a signal line 130 such as a VGA, DVI, Display Port (DP), cable, or the like. Additionally, 3D image source 100 provides a synchronization signal along signal line 140 to transmission device 110. In various embodiments, 3D image source 100 includes an industry standard interface such as a VESA miniDIN-3 connector, VESA 1997.11, USB connector, or the like, to which transmission device 110 may be coupled.
[0079] FIG. 2C illustrates additional embodiments of the present invention. In particular, Fig. 2C includes a source of 3D images 160, a transmission device 170, and a 3D display 180. As illustrated, 3D image source 160 provides 3D images (e.g. double-wide or double-height images including both right and left images) to 3D display 180 via a signal line 190 such as a VGA, DVI, HDMI cable, Display Port (DP), or the like. In turn, 3D display 180 provides a synchronization signal along signal line 200 to transmission device 170. In various embodiments, 3D display 180 includes an industry standard interface such as a VESA miniDIN-3 connector, USB connector, or the like, to which transmission device 170 may be coupled.
[0080] FIG. 2D illustrates other additional embodiments of the present invention. In particular, Fig. 2D includes a source of 3D images 220, a transmission device 230, and a 3D display 240. As illustrated, 3D image source 220 provides 3D images (e.g. double-wide or double-height images including both right and left images) to 3D display 240 via a signal line 250 such as a VGA, DVI, HDMI cable, Display Port (DP), or the like. In these embodiments, transmission device 230 may be disposed within 3D display 240. For example, transmission device 230 may be installed within the manufacturing facility of 3D display 240, or the like. In such embodiments, 3D display 240 may also power transmission device 230. Similar to the embodiments described above, 3D display 240 provides a (derived) synchronization signal along signal line 260 to transmission device 230.
[0081] In various embodiments described herein, shutter glasses 42 include a radio receiver 41 that receives the synchronization signals 40. In response to synchronization signals 40, shutter glasses 42 alternately change the properties of one lens from translucent to opaque (e.g. dark) and back to translucent, and of the other lens from opaque to translucent (e.g. clear) and back to opaque. Because the shutters of shutter glasses 42 operate under the direction of synchronization signals 40, a user / viewer views 3D display images 45 from display 43 at the proper timing. More particularly, the user's right eye is exposed to a right-eye image from 3D display images 45, and then the user's left eye is exposed to a left-eye image from 3D display images 45, etc.
[0082] The inventors of the present invention recognize that transmission device 37 based upon a radio frequency transmitter has several advantages over an infrared transmitter. One advantage recognized is that radio signals can be received in many situations where an infrared signal would be blocked. For example, this allows the user of a pair of 3D shutter glasses or the like to move their head much farther away from the 3D display or transmission device than if IR were used, and allows the user to move throughout a room with a larger range of motion while maintaining synchronization with the 3D display. As another example, rf transmitters allow other people or objects to pass in front of the user / viewer without interrupting the signal.
[0083] Another advantage goes beyond the improved range and reliability of radio technology for synchronization purposes. For example, the inventors believe that the avoidance of infrared is itself a benefit, as infrared signals can interfere with remote controls, such as those popular in households and home theater systems. Additionally, another benefit is that IR receivers are often subject to interference from, and are confused by, IR remote controls, natural and artificial light sources, and video displays themselves.
[0084] In various embodiments of the present invention, shutter glasses 42 may include their own localized clock. Benefits to such a configuration include that it allows shutter glasses 42 to remain approximately synchronized to display 43 even when the connection to transmission device 37 is interrupted and / or synchronization signals 40 are not received.
[0085] In various embodiments, a precision timing protocol can be used so that the clock that is local to shutter glasses 42 is synchronized with a clock within transmission device 37 and / or the 3D display signals. A precision timing protocol may include the transmission of data packets with a time stamp associated with the 3D display signals to shutter glasses 42. In other embodiments, the protocol may include transmission of a data packet with a time stamp associated with shutter glasses 42 to transmission device 37. In operation, shutter glasses 42 receive the time stamp from the 3D data source, compare the received time stamp to their local clock, and return a data packet with their local time stamp. Using this information, transmission device 37 can determine a round-trip time for data between transmission device 37 and shutter glasses 42. In some embodiments of the present invention, the round-trip time offset is evenly divided between transmission device 37 and shutter glasses 42. In other embodiments, if one or both devices are capable of determining a difference in speed or lag between the two transmissions, then a more precise determination of the relative values of both clocks (offsets) can be determined. As a result, in various embodiments, more precise synchronization between the two clocks can be established.
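By way of a non-limiting illustration only, the two-way time-stamp exchange described above can be sketched as follows, under the common simplifying assumption that the forward and return link latencies are equal; the function name, variable names, and numeric values are hypothetical and are not part of the disclosed system.

    # Sketch of a two-way time-stamp exchange, assuming symmetric link latency.
    # t1: transmitter clock when the sync packet is sent
    # t2: glasses clock when that packet is received
    # t3: glasses clock when the reply is sent
    # t4: transmitter clock when the reply is received
    def estimate_offset_and_delay(t1, t2, t3, t4):
        """Return (glasses_clock_offset, one_way_delay) in the same time units."""
        round_trip = (t4 - t1) - (t3 - t2)      # time actually spent in transit
        one_way_delay = round_trip / 2.0        # assumes both directions are equal
        offset = ((t2 - t1) + (t3 - t4)) / 2.0  # how far the glasses clock leads the transmitter clock
        return offset, one_way_delay

    # Example (microseconds): sent at 1000, received at 1240 (glasses clock),
    # reply sent at 1250 (glasses clock), reply received at 1100 (transmitter clock).
    offset, delay = estimate_offset_and_delay(1000, 1240, 1250, 1100)
    print(offset, delay)  # 195.0, 45.0

Where one or both devices can estimate the asymmetry between the two directions, the even split of the round trip above can be replaced by the measured split, corresponding to the more precise determination of the clock offsets described above.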
[0086] In various embodiments of the present invention, by repeating this process periodically, the difference in rate (e.g. frequency) between the two clocks (transmission device 37 or 3D source 34 and shutter glasses 42) can be more precisely determined. In some embodiments, if there is a low degree of consistency in the latencies, the period of time between latency determinations may be made small, e.g. once a minute; and if there is a higher degree of consistency in the latencies, the period of time between latency determinations may be increased, e.g. once every ten minutes.
[0087] Embodiments of the present invention enable the use of multiple pairs of shutter glasses 42. In such embodiments, a single pair of shutter glasses 42 may be used to determine delay and jitter as discussed above. Next, a simpler protocol, such as a unidirectional or broadcast protocol, may be used by transmission device 37 to communicate this synchronization information to the remaining pairs of shutter glasses. In various embodiments, the delay and jitter information can be stored in transmission device 37, in 3D source 34, or in another consumer electronics device generating the 3D data, either in a volatile or non-volatile manner.
[0088] In other embodiments of the present invention, other methods can be used to determine the synchronization and delay information. In various examples, this data may be determined using bidirectional communications on cable 44, such as the DisplayPort protocol, or the like, as illustrated in FIG. 2C. Communications protocols such as the display data channel (DDC and DDC2) protocols, the PanelLink serial protocol, or similar protocols allow the display to communicate information back to the computer, home theater system, video source, or the like. In various embodiments, this serial protocol can be enhanced to provide the appropriate latency and synchronization characteristics of 3D display 43 back to 3D source 34 and / or transmission device 37. In other embodiments, these protocols can be used to determine the manufacturer, vendor, or other identifying information for 3D display 43, and a table of pre-determined synchronization information can be retrieved, either locally, across a local area network, across a network, or the like. This information may include appropriate delay and synchronization information for respective 3D displays.
[0089] FIG. 3 illustrates a block diagram of a process according to various embodiments of the present invention. More specifically, FIG. 3 illustrates a process for synchronizing shutter glasses to a source of 3D images.
[0090] Initially, a 3D data source provides 3D images, step 300. In various embodiments, the 3D images may be provided in any number of specific formats, such as right and left images: sequentially transmitted, packed vertically or horizontally into a single image and transmitted, combined on a pixel by pixel basis into a single image and transmitted, or the like. In other embodiments, as illustrated in Fig. 2B, the 3D data source may provide specific timing data.
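By way of a non-limiting illustration only, the packed formats mentioned above (double-wide and double-height frames) can be separated into their left and right views as sketched below; the frame is treated as a simple height x width x channels array, and the function name and dimensions are illustrative assumptions.

    import numpy as np

    def split_packed_frame(frame, packing):
        """Split a packed 3D frame into (left, right) views.
        frame   -- array of shape (height, width, channels)
        packing -- "side_by_side" (double wide) or "stacked" (double height)
        """
        h, w = frame.shape[:2]
        if packing == "side_by_side":
            return frame[:, : w // 2], frame[:, w // 2 :]
        if packing == "stacked":
            return frame[: h // 2, :], frame[h // 2 :, :]
        raise ValueError("unsupported packing: %s" % packing)

    # Example: a double-wide 1080p frame yields two 1920 x 1080 views.
    packed = np.zeros((1080, 3840, 3), dtype=np.uint8)
    left, right = split_packed_frame(packed, "side_by_side")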
[0091] Next, in response to the data from the 3D data source, synchronization data, such as an identifier of a timing clock resident on the 3D data source, is determined, step 310. In various embodiments, this may include a packet of data including a source time stamp, or the like. The synchronization data may then be transmitted through radio frequency transmissions to a first pair of shutter glasses, step 320.
[0092] In various embodiments, the shutter glasses receive the source time stamp and synchronize the operation of the right / left shutters to the synchronization data, step 330. The synchronization data can then be maintained within the shutter glasses by an internal clock within such glasses, step 340. As synchronization data is received, the internal clock can be resynchronized. Such embodiments are believed to be advantageous as the glasses need not wait for synchronization data from the 3D data source to be able to switch. Accordingly, synchronization data from the transmission device may be dropped or lost while the shutter glasses continue to operate properly. When synchronization data is reestablished, the synchronization described above may be performed.
[0093] In various embodiments of the present invention, rf communications using the ZigBee radio (IEEE 802.15.4 standard) occur at 2.4 GHz, the same band as most Wi-Fi transmissions. In the case of interference with Wi-Fi transmissions, embodiments of the present shutter glasses are designed to inhibit communications and defer to such Wi-Fi signals. As discussed above, in some embodiments, the shutter glasses will continue to operate autonomously until the interference stops and new synchronization data is received from the transmission device.
[0094] In some embodiments of the present invention, the shutter glasses may transmit data back to the rf transmission device. More specifically, the shutter glasses may transmit the received source time stamp and / or the glasses time stamp back to the transmission device via the same rf communications channel, or the like, step 350. [0095] In FIG. 3, in response to the received source time stamp and / or the glasses time stamp, and the time at which these data are received, the transmission device may determine adjustments to subsequent synchronization data that will be sent to the shutter glasses, step 360. As an example, the transmission device may determine that it should output synchronization data to the shutter glasses even before the synchronization data is determined or received from the 3D data source. As a numeric example, if it is determined that the shutter glasses lag the 3D data source by 100 microseconds, the shutter glasses may trigger their shutters 100 microseconds before the expected arrival of a synchronization pulse.
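By way of a non-limiting illustration only, the pre-triggering in the numeric example above can be sketched as follows; the 100 microsecond lag, the frame timing value, and the function name are illustrative assumptions.

    def next_transmit_time(intended_switch_time_us, measured_lag_us):
        """Send the synchronization data early by the measured lag so that the
        shutters actually switch at the intended instant."""
        return intended_switch_time_us - measured_lag_us

    # If the glasses are observed to lag the 3D data source by 100 microseconds,
    # a shutter switch intended for t = 16667 us is commanded at t = 16567 us.
    print(next_transmit_time(16667, 100))  # 16567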
[0096] In various embodiments of the present invention, this adjustment to synchronization data may be used to drive 3D glasses of other viewers of the 3D image. In other embodiments of the present invention, 3D glasses of other viewers in the room may also have synchronization data adjusted using the process described above. In such embodiments, the transmission device may output the synchronization data at different times for different 3D glasses.
[0097] In other embodiments, other adjustments may be performed by the shutter glasses. For example, based upon received time stamps and the shutter glasses' own internal clock, the shutter glasses may verify that they are in sync. If not, the shutter glasses may adjust the frequency of their own internal clock until a higher degree of synchronization is maintained.
[0098] As seen in FIG. 3, the process may be repeated. In various embodiments, the synchronization process may be performed periodically, with the period dependent upon how well the 3D data source and the shutter glasses clock remain in synchronization - if highly synchronized, the synchronization process may be performed at longer time periods apart (e.g. every 2 minutes) than if these devices continually have synchronization problems (e.g. every 10 seconds). [0099] Various embodiments of the present invention may include shutter glasses or other devices that include multiple physical methods for receiving synchronization information. For example, some embodiments may contain both an infrared and radio receiver; an infrared and visible light receiver; a radio or visible light receiver; a combination of infrared, visible light and radio receivers; or the like. In such embodiments, the shutter glasses or other receiving device may include an executable computer program that instructs a processor to automatically determine which communications channel or channels are available, and automatically use the communications channel having the strongest signal, lowest number of dropped data packets, or the like. [0100] In various embodiments, the combination of a light-based receiver (e.g. IR or visible light) with another synchronization transmission technology (e.g. rf) may be advantageous. More specifically, the information transmitted via light and the synchronization information transmitted via another transmission technology may be combined within shutter glasses 42 to deduce unknown elements of the delay in 3D display 43 and other synchronization information. In various embodiments, the data from the different communication channels are compared to more precisely synchronize 3D display 43 and shutter glasses 42. As merely an example, the two communications channels can be used to verify that a left image displayed on 3D display 43 is going to the left eye and the right image displayed on 3D display 43 is going to the right eye. In such an example, this would prevent the error of a reversal of synchronization information somewhere in the system that results in sending the left image to the right eye and vice versa.
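By way of a non-limiting illustration only, the automatic selection among multiple available receivers described in paragraph [0099] above can be sketched as follows; the scoring rule, data layout, and channel names are assumptions rather than part of the disclosed system.

    def pick_channel(channels):
        """channels: list of dicts such as
        {"name": "rf", "signal_strength": -42, "dropped_packets": 3}.
        Prefer the strongest signal, breaking ties with the fewest dropped packets."""
        usable = [c for c in channels if c["signal_strength"] is not None]
        if not usable:
            return None
        best = max(usable, key=lambda c: (c["signal_strength"], -c["dropped_packets"]))
        return best["name"]

    available = [
        {"name": "ir", "signal_strength": -60, "dropped_packets": 12},
        {"name": "rf", "signal_strength": -42, "dropped_packets": 3},
    ]
    print(pick_channel(available))  # "rf"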
[0101] In various embodiments of the present invention, shutter glasses 42 may be used to provide a variety of new functions. FIG. 4 illustrates typical video output timing where frame one 26, frame two 28, frame three 30 and frame four 32 are output sequentially. In some embodiments, left images (frames) and right images (frames) are alternately output. For example, frame one 26 is left, frame two 28 is right, frame three 30 is left and frame four 32 is right, creating the sequence L, R, L, R of images to the user.
[0102] Various embodiments of the present invention may be applied to 3D displays having display rates on the order of 120 Hz and higher. In embodiments where the refresh rate is 120 Hz, right and left images will be displayed and refreshed at 60 Hz. Accordingly, the viewer should not be able to detect significant flickering; however, the viewer may detect a darkening of the images. As refresh rates for future televisions, projectors, or the like increase, the inventors have determined that the higher refresh rate may enable new features, as described below.
[0103] In various embodiments, depending upon the output frame rate of the 3D display, more than one left image and right image may be output. For example, in various embodiments, multiple viewers may view a 3D display, and different viewers may see different 3D images. For example, a two viewer sequence of output images may be user 1 left, user 1 right, user 2 left, user 2 right, etc. This could be represented as: L1, R1, L2, R2. In such examples, shutter glasses of a first viewer will allow the first viewer to see images L1 and R1, and shutter glasses of a second viewer will allow the second viewer to see images L2 and R2. In other examples, other sequences are contemplated, such as L1, L2, R1, R2, and the like. With respect to refresh rate, for a 3D display having a 240 Hz refresh rate, a viewer will see the respective right and left images at a refresh rate of 60 Hz. As noted above, this frequency should be above the typical sensitivity of the eye; however, viewers may detect a darker image. Such artifacts may be mitigated by increasing the brightness of the images, or the like.
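By way of a non-limiting illustration only, the per-viewer sequencing above can be sketched as a schedule mapping each displayed frame slot to the eye that is open for a given pair of glasses; the frame ordering follows the L1, R1, L2, R2 example, and the data layout is an assumption.

    def shutter_schedule(sequence, viewer):
        """sequence: list of (viewer_id, eye) tuples in display order, e.g.
        [(1, "L"), (1, "R"), (2, "L"), (2, "R")].
        Returns, for the given viewer, which eye is open in each frame slot
        (None means both shutters remain dark)."""
        return [eye if vid == viewer else None for vid, eye in sequence]

    seq = [(1, "L"), (1, "R"), (2, "L"), (2, "R")]
    print(shutter_schedule(seq, 1))  # ['L', 'R', None, None]
    print(shutter_schedule(seq, 2))  # [None, None, 'L', 'R']
    # On a 240 Hz display this four-slot cycle repeats 60 times per second,
    # so each viewer sees each eye refreshed at 60 Hz.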
[0104] Other embodiments may be extended to additional viewers (e.g. three or more viewers). Applications of such embodiments may include computer or console gaming, or the like. As an example, two or more viewers may initially see the same 3D image, and subsequently one or more viewers "break off" to view a different 3D image. For example, three people could be playing a multiplayer game in which all three are traveling together and see the same 3D images. Next, one player breaks away from the other players. Using the additional communications protocols disclosed in various embodiments of the present invention, the player's glasses can be reprogrammed to allow the third person to see a different 3D image. Subsequently, the third person may return to the group and then see the same 3D image. In such an example, a sequence of images output by the 3D display could begin with L0-R0-L0-R0, where 0 indicates everyone in the party. Next, when the third person leaves the party, the 3D display could switch and output images in a sequence such as L1&2, R1&2, L3, R3; L1&2, L3, R1&2, R3; or the like. When the third person returns to the party, the sequence may revert to L0, R0, L0, R0. In various embodiments, switching back and forth may occur with little, if any, visible interruption in the 3D images viewed by the viewer. In various embodiments, the inventors recognize that the brightness of each frame may have to be adjusted to correct for the changes in overall viewing time. [0105] In other embodiments, other sequences of images enable still other types of functionality. For example, one sequence of frames can be sent such that viewers wearing 3D glasses see a stereo display and viewers without glasses see only one side of the image (e.g. left or right). In such an example, a three frame sequence may include: Left, Right, Left-minus-Right. In response, a user using embodiments of the present invention may see a stereoscopic image by viewing the left image in their left eye and the right image in their right eye. That user would be prevented from viewing the Left-minus-Right image. A viewer without the glasses would see, in succession, L, R, and (L-R), which sum to 2L, or only the left image with both eyes. In other embodiments, separate anti-left images, anti-right images, or both may also be sent. With such embodiments, theater-goers can decide whether they care to watch the same movie or feature with or without 3D glasses; game players can play in 3D while viewers watch the same display in 2D.
[0106] In still other embodiments, users not utilizing embodiments of the 3D glasses may view other arbitrary images. As an example, a sequence may be: Left, Right, and Arbitrary-minus-Left-minus-Right = Arbitrary image. In operation, the viewer with 3D glasses may see the left image in the left eye and the right image in the right eye, and may not see the Arbitrary image. Further, the viewer without 3D glasses would see, in succession, L, R, and (A-L-R), which sum to the arbitrary image A; this image may be a non-stereo version of the same program, a blank or solid color screen, or a completely different piece of content such as an advertisement, a copyright warning, or the like.
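By way of a non-limiting illustration only, the Left-minus-Right and Arbitrary-minus-Left-minus-Right frames described in paragraphs [0105] and [0106] can be computed per pixel as sketched below; the sketch assumes 8-bit images and simply clips negative differences, which is only an approximation of whatever compensation a real display pipeline would apply.

    import numpy as np

    def compensation_frame(target, shown_frames):
        """Compute the extra frame such that the time-averaged sum of all frames
        approaches the target image for a viewer without glasses.
        target       -- image the unaided viewer should perceive (e.g. 2*L or A)
        shown_frames -- frames already in the sequence (e.g. [L, R])"""
        total = np.zeros(target.shape, dtype=np.int32)
        for frame in shown_frames:
            total += frame
        return np.clip(target.astype(np.int32) - total, 0, 255).astype(np.uint8)

    # Three-frame sequence L, R, (L - R): unaided viewers integrate to roughly 2*L.
    # With left and right as uint8 arrays of the same shape:
    # extra = compensation_frame(2 * left.astype(np.int32), [left, right])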
[0107] Fig. 5 illustrates additional embodiments of the present invention. More specifically, FIG. 5 illustrates a general purpose consumer device (e.g. mobile phone, personal media player, laptop, or the like) capable of 3D image output. In such embodiments, the synchronization information to the shutter glasses may be provided by the consumer device including embodiments of the rf transmitter described above, or by unused or otherwise available transmitters in the consumer device. Various examples may use infrared, WiFi, Bluetooth, or the like, to provide synchronization signals to shutter glasses according to embodiments of the present invention.
[0108] Fig. 6 illustrates additional embodiments of the present invention wherein existing consumer devices (e.g. mobile phones) may be augmented to better support stereoscopic 3D viewing. In various embodiments, a cradle or dongle which attaches to the mobile device or holds the mobile device may be used. In such examples, the cradle or dongle may incorporate a projection system such that the image may be projected at a larger size than the screen on the mobile device. The cradle or dongle or consumer device may also provide the synchronization signals to the shutter glasses. For example, the cradle or dongle may include a ZigBee radio-type transmitter (IEEE 802.15.4) that transmits the synchronization data to the shutter glasses, or the like. [0109] In other embodiments of the present invention, stereoscopic shutter glasses that are to be used with the consumer device described above can be used for other purposes. For example, if such glasses incorporate a visible light sensor, they can be worn as ordinary sunglasses but make improved automatic decisions about the appropriate level of perceived darkening. This information can be based on computer algorithms; information about the user and the environment that is stored on a mobile device; information retrieved from a computer network via the mobile device; and the like.
[0110] Fig. 7 illustrates yet another embodiment of the present invention. In such embodiments, a user of the consumer device may desire to perform multiple functions at the same time, such as: talking on a Bluetooth headset, viewing stereoscopic 3D content, and wearing sunglasses. Embodiments illustrated in Fig. 7 may include a pair of shutter glasses 57 combined with a pair of sunglasses and a Bluetooth or stereo mobile Bluetooth headset with a left earpiece 58 and a right earpiece 55, or the like.
[0111] Fig. 8 illustrates various embodiments of the present invention. In particular, Fig. 8 illustrates a block diagram of various embodiments of a dongle 400 providing rf transmissions, as described above.
[0112] In Fig. 8, a physical interface 410 is illustrated. In various embodiments, physical interface 410 may be a DVI port, HDMI port, Display Port (DP), USB, VESA 1997.11, or the like, for coupling to a source of 3D data (e.g. computer, DVD / BluRay player, HD display, monitor, etc.). In embodiments illustrated in Fig. 2A or 2C, for example, the 3D data may include 3D image data, whereas in the embodiments illustrated in Fig. 2B, the 3D data may include 3D timing data. In various embodiments, an interface chip or block 420 may provide the electronic interface to physical interface 410. Next, a processing device such as a CPLD (complex programmable logic device) 430 may be used to decode 3D synchronization data from the 3D image data or 3D timing data. [0113] In various embodiments of the present invention, 3D synchronization data 440 is then provided to an rf interface device 450 that references a clock 440. In some embodiments, rf interface device 450 is a TI CC2530 System on a Chip, which includes an 8051 MCU (processor), RAM, Flash memory, and an IEEE 802.15.4 ZigBee RF transceiver. The flash memory is configured to store executable computer code or instructions that directs the processor to perform various functions, as described herein. In various examples, the flash memory includes computer code that directs the processor to transmit the 3D synchronization data to the 3D glasses, to receive timing data back from the 3D glasses, to determine a round-trip communication latency, to adjust 3D synchronization data in response to the round-trip communication latency, and the like, as described above.
[0114] In some embodiments of the present invention, dongle 400 may include an output port 460 driven by an output interface 470. In various embodiments, as illustrated in Fig. 2A, the output port may be a DVI port, HDMI port, Display Port (DP), or the like providing 3D image data to a 3D display (e.g. a display, projector, etc.).
[0115] Fig. 9 illustrates various embodiments of the present invention. In particular, Fig. 9 illustrates a block diagram of a pair of shutter glasses 500 according to various embodiments of the present invention. Shutter glasses 500 are illustrated to include an rf interface device 510 that references a clock 540 and a pair of electronically controlled LCD shutter elements 520 and 530.
[0116] In various embodiments of the present invention, 3D synchronization data 550, typically radio frequency signals, is received by rf interface device 510. In various embodiments, rf interface device 510 is also a TI CC2530 System on a Chip, which includes an 8051 MCU (processor), RAM, Flash memory, and an IEEE 802.15.4 ZigBee RF transceiver. The flash memory is configured to store executable computer code or instructions that directs the processor to perform various functions, as described herein. In various examples, the flash memory includes computer code that directs the processor to receive the 3D synchronization data, to change the states of / drive shutter elements 520 and 530 at the appropriate timing (e.g. L1 and R1 in the sequence L1, L2, R1, R2), to send clock or timing data back to a transmission device via rf communications, and the like.
[0117] In light of the above disclosure, one of ordinary skill in the art would recognize that many variations may be implemented based upon the discussed embodiments. Embodiments described above may be useful for hand-held consumer devices such as cell-phones, personal media players, mobile internet devices, or the like. Other embodiments may also be applied to higher-end devices such as laptop computers, desktop computers, DVRs, BluRay players, gaming consoles, hand-held portable devices, or the like. Other embodiments may take advantage of existing IR transmission devices for IR shutter glasses. More specifically, in such embodiments, an IR to RF conversion portion may be added to receive the IR 3D output instructions and to convert them to RF 3D transmission signals, described above. In some embodiments, an RF receiver is thus used. The RF 3D transmission signals are then transmitted to the RF 3D shutter glasses, described above. Such embodiments can therefore be a simple upgrade to available IR 3D glasses transmitters. [0118] In other embodiments of the present invention, feedback from the shutter glasses to the transmitter device described above with regard to synchronization may be used for additional purposes. One such embodiment may allow the 3D image source (e.g. a cable box, computer, or the like) to take the indication that a pair of shutter glasses is currently synchronized to mean a person is viewing the 3D content, and to provide that data back to a marketing company such as Media Metrics, Nielsen Ratings, or the like. By doing this, such market research companies may determine the number of viewers of specific 3D features, or the like.
[0119] The above detailed description is directed to systems, methods, and computer- readable media for stereoscopic viewing. While the subject matter described herein is presented in the general context of program modules that execute in conjunction with the execution of an application program or an operating system on a 3D source, consumer electronics device, and a pair of stereoscopic glasses, those skilled in the art will recognize that other implementations may be performed in combination with other program modules or devices.
[0120] FIG. 11 illustrates various embodiments of the present invention. More specifically, Fig. 11 illustrates incorporation of more than one image sensor onto a consumer device to provide stereoscopic capture capabilities as described herein.
[0121] In one embodiment, a consumer device 613, such as a mobile telephone, personal media player, mobile internet device, or the like, includes two imaging sensors 616 and 617 coupled to body 615. In various embodiments, imaging sensors 616 and 617 are configured to acquire left and right 2D image pairs at substantially the same time (e.g. within a second or less). In various embodiments, stereoscopic cameras 616 and 617 may be embedded directly into consumer device 613, and imaging sensors 616 and 617 may have a fixed position and orientation with respect to body 615. In other embodiments, imaging sensors 616 and 617 may be movable within body 615 (e.g. along a track or point of rotation) or may be removable from body 615. [0122] In various embodiments, cameras 616 and 617 may be affixed at a known displacement (e.g. offset or location) relative to each other. In other embodiments, the displacement between cameras 616 and 617 may be modified by the user, and the displacement may be determined by consumer device 613. In various embodiments, cameras 616 and 617 may alternately capture left and right 2D images or videos, or simultaneously capture such images, at any speed fast enough to approximate simultaneity.
[0123] The acquisition of such images may be initiated by the user via software and / or hardware portions of consumer device 613. For example, the acquisition may be initiated via depression of a physical switch or the selection of a "soft" button on a display of the consumer device 613. In such embodiments, executable software code stored within a memory in consumer device 613 may instruct one or more processors within consumer device 613 to acquire at least a pair of images from image sensors 616 and 617 and to store such images into a memory.
[0124] In various embodiments, the acquired left and / or right images may be displayed back to the user on a display of consumer device 613. Additionally, the left / right images may be processed by the one or more processors for display as a stereoscopic image pair. In various examples, the images may be combined into a static stereoscopic image, and when a lenticular lens (e.g. prismatic) is disposed on top of the display, the user may simultaneously see the left / right images with their respective left / right eyes. Such a lenticular lens may be provided in various embodiments of the present invention in the form of a removable sheet the user places over a display of the consumer device to view 3D images. In other embodiments, the lens may be part of a removable sleeve the user slips onto the consumer device to view 3D images.
[0125] In other embodiments, the left / right images may be uploaded to another consumer device, such as a laptop, desktop, cloud storage system, television, HD monitor, or the like. In such embodiments, the right / left images may be displayed on a display in a time-interleaved manner and viewed by the viewer.
[0126] In various embodiments, in addition to the left / right image pairs, the relative position of cameras 616 and 617 to each other, the position of cameras 616 and 617 relative to the display, and the optical settings and characteristics of cameras 616 and 617 may also be recorded in the memory and / or referenced. Such parameters or settings may be made available to various processing software (resident upon consumer device 613, or another processing device) to further deduce, capture, or process information from cameras 616 and 617. As merely an example, based upon the displacement between cameras 616 and 617, estimates of distances and other measurements may be performed in three dimensions. As another example, parameters or settings may include camera parameters, e.g. shutter speed, aperture, gain, contrast, and the like, which may be measured from a left camera and applied to the right camera to normalize the captured left / right 2D images.
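By way of a non-limiting illustration only, one of the three-dimensional measurements mentioned above, namely distance to a point, can be estimated from the disparity between the left and right images when the camera displacement (baseline) and focal length are known; the pinhole-stereo relation below is standard, and the numeric values are purely illustrative.

    def depth_from_disparity(focal_length_px, baseline_m, disparity_px):
        """Estimate distance (in meters) to a point seen in both images,
        using the standard pinhole stereo relation Z = f * B / d."""
        if disparity_px <= 0:
            raise ValueError("disparity must be positive")
        return focal_length_px * baseline_m / disparity_px

    # Hypothetical values: 1000-pixel focal length, 6 cm baseline between the
    # two cameras, and a 25-pixel disparity for a matched feature.
    print(depth_from_disparity(1000, 0.06, 25))  # 2.4 (meters)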
[0127] In various embodiments, consumer device 613 may be vertically oriented when acquiring left / right images. In still other embodiments, a consumer device may be horizontally oriented when acquiring left / right image pairs. An example of this is illustrated by consumer device 614 (e.g. mobile phone) in Fig. 11, where cameras 618 and 620 are distributed along the long axis of device 621. As described above, image sensors 618 and 620 may capture right / left images at substantially the same time (or the like) to enable the generation / viewing of stereoscopic images.
[0128] In various embodiments of the present invention, one or more additional image sensors, such as camera 619, may be provided as part of the consumer device. In the embodiment illustrated in Fig. 11, camera 619 may be directed towards the user, while cameras 618 and 620 are directed away from the user. Such embodiments may be provided to capture not only a right / left image pair, but also a reaction of the user. As an example, the user may use consumer device 613 to record a video of a roller coaster ride in "3D" and to contemporaneously record their reactions. In still other embodiments, camera 619 may be installed or rotated such that cameras 618, 619 and 620 are all pointed in approximately the same direction, toward the same plane, line, or the like, towards or away from the user, or the like. In various embodiments, the display of the consumer device may also be directed towards or away from the user. In various embodiments, images and camera parameters captured by cameras 618-620 may also be used as a source of stereoscopic image data, such as for 3D scene reconstruction, or the like.
[0129] In other embodiments of the present invention, the consumer device may include one or more segments which move with respect to each other such that the stereoscopic cameras remain horizontally oriented (e.g. level) while the display or other sections of the consumer device are rotated or manipulated. In various embodiments, the cameras may be manually leveled by the user to be horizontally disposed, and in other embodiments, the cameras may be automatically manipulated by the consumer device via feedback from one or more tilt sensors or accelerometers provided in the consumer device. [0130] Fig. 12 illustrates embodiments of the present invention directed towards supplementing consumer devices having an image sensor with right / left image acquisition capabilities. In particular, FIG. 12 illustrates embodiments where stereoscopic image capture capabilities can be added to an existing consumer device (e.g. mobile phone) by means of an external cradle, dongle, or other device.
[0131] In various embodiments illustrated in Fig. 12, a consumer device 622 includes a body portion 624 coupled to a dongle 626. Dongle 626 may include one or more image sensors, such as cameras 627 and 629. In various embodiments, dongle 626 provides image data, camera parameter data, or the like to consumer device 622 via a physical and / or data communications channel, such as a USB or microUSB connector, wireless (e.g. IR, Bluetooth, Wi-Fi, ZigBee radio (ZigBee Alliance), IEEE Standard 802.15.4, IEEE Standard 802.15.1), docking (e.g. iPod connector), a proprietary connector, or the like. In other embodiments, any other method for physically restraining dongle 626 with respect to consumer device 622 is contemplated; additionally, any other transfer protocol for providing data from dongle 626 to consumer device 622 is also contemplated.
[0132] In embodiments of the present invention, a user may initiate capture of right / left images on dongle 626 via one or more physical buttons on dongle 626 or consumer device 622, or soft buttons on a display of consumer device 622. Similar to the embodiments described above, executable software code operating upon consumer device 622 may direct a processing device within consumer device 622 or dongle 626 to initiate acquisition of images by cameras 627 and 629. It is contemplated that consumer device 622 may send one or more instruction signals to dongle 626 via the same physical and / or data communications channel as described above. Alternatively, other communications methods and mechanisms for instructing dongle 626 are contemplated. [0133] In response to the user or to the consumer device 622, in various embodiments, dongle 626 initiates the capturing of one or more images from image sensors 627 and 629. Additionally, dongle 626 may capture image parameters from one or both of image sensors 627 and 629 to assist in capturing, normalizing, and / or generating stereoscopic images. In various embodiments, such information may include the fixed or relative locations of cameras 627 and 629, optical parameters (e.g. aperture, shutter speed, focal length, ISO, focal point in the images, and the like), level or orientation information from tilt sensor 628, and the like. In various embodiments of the present invention, consumer device 622 may include functionality described above for dongle 626, such as tilt sensor 628, or the like. [0134] In various embodiments, dongle 626 may be capable of using more than one communication protocol and may connect to devices other than consumer device 622. For example, dongle 626 may provide right / left images directly to other users' mobile phones, computers, televisions, or the like, via Bluetooth, Wi-Fi, ZigBee radio, IEEE 802.15.1, IEEE 802.15.4, IR, or the like. In various embodiments, dongle 626 may communicate with such devices either one at a time, in an interleaved manner, simultaneously, or the like.
[0135] Fig. 12 also illustrates additional embodiments of the present invention. More particularly, a consumer device 623 is illustrated including an external device such as cradle 634 physically holding or enveloping the body 631 of consumer device 623. In the embodiments illustrated, cradle 634 includes a single image sensor 632, although in other embodiments, more than one image sensor may be provided. In the embodiments illustrated, cradle 634 is operated such that the image sensor 630 of consumer device 623 operates in coordination with image sensor 632 to capture right / left image pairs.
[0136] In various embodiments, cradle 634 may be physically coupled to consumer device 623, as illustrated, or may be physically coupled in any manner contemplated or described in the embodiments above (e.g. iPod connector, USB). As illustrated in Fig. 12, consumer device 623 is placed into an undersized opening of cradle 634, and thus consumer device 623 and cradle 634 are physically restrained with respect to each other. Further, cradle 634 and device 623 may communicate image data, sensor data, and instructions to and from consumer device 623 in any manner described in the embodiments above (e.g. Bluetooth, Wi-Fi, IR).
[0137] As previously described, additional camera information such as camera parameters and / or information from a tilt sensor 633 may be determined by cradle 634. In various embodiments, cradle 634 may also communicate such information about the optical characteristics and properties of image sensor 632 to consumer device 623. Such data may be used by consumer device 623 to coordinate the actions of image sensors 630 and 632. As an example, in various embodiments of the present invention, camera or lens parameters from image sensor 632 may be used to set the parameters of image sensor 630. As examples of this, a gain setting from image sensor 630 may be used to set a gain setting of image sensor 632, a shutter speed of image sensor 632 may be used to set a shutter speed of image sensor 630, and the like.
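By way of a non-limiting illustration only, the parameter mirroring described above can be sketched as copying exposure-related settings from one image sensor to the other before the pair is captured, so that the two 2D images are comparably exposed; the parameter names and the camera interface below are assumptions, not part of the disclosed system.

    def mirror_capture_settings(reference_cam, follower_cam,
                                keys=("gain", "shutter_speed", "white_balance")):
        """Copy selected capture settings from one camera object to another.
        Both objects are assumed to expose a dict-like .settings attribute."""
        for key in keys:
            if key in reference_cam.settings:
                follower_cam.settings[key] = reference_cam.settings[key]

    class FakeCamera:
        def __init__(self, **settings):
            self.settings = dict(settings)

    built_in = FakeCamera(gain=4, shutter_speed=1 / 60, white_balance="daylight")
    external = FakeCamera(gain=1, shutter_speed=1 / 250)
    mirror_capture_settings(built_in, external)
    print(external.settings)  # now matches the built-in camera's exposure settings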
[0138] In light of the above, it can be seen that in various embodiments of the present invention, by providing a second image sensor to a consumer device that already includes a single image sensor, such a combined system may have some or all of the capabilities of an expensive and dedicated stereoscopic image capture system, as previously discussed.
[0139] Fig. 13 illustrates various embodiments of the present invention. In particular, a consumer device 635 is illustrated including two sections 639 and 642 and at least a pair of image sensors 640 and 641. In Fig. 13, sections 639 and 642 are coupled together by a hinge or other conveyance. In various embodiments, consumer device 635 may be "folded-up", or consumer device 636 may be partially opened, or consumer device 637 may be fully opened, or the like, as shown. As can be seen, depending upon the amount a hinge is opened, the displacement between sensors may vary. For example, for consumer device 636, image sensor 643 is disposed upon the side or end of section 644 and image sensor (e.g. camera) 647 is disposed upon the side or end of section 646. As shown, cameras 643 and 647 are laterally displaced with respect to each other; in another case, image sensor 640 is adjacent to image sensor 641; and in another case, cameras 648 and 652 are far away from each other.
[0140] As can be seen, in various embodiments, the orientation of the two cameras in terms of their distance relative to each other, their rotation relative to each other, the tilt of the entire system and the like, are variable. Because of this, in various embodiments, the displacements between the cameras, camera parameters, image parameters and the like may be recorded. As described in the various embodiments above, such data may be used for many purposes by the consumer device, external device (e.g. desktop computer), or the like, such as determining stereoscopic images, 3D image reconstruction, or the like.
[0141] In various embodiments, an additional image sensor, such as image sensor 45 may also be included and may provide all the benefits of more than two cameras described herein. The additional image sensors may be fixed or rotated such that the three cameras are pointed in the same direction (e.g. toward the same plane, line, or other geometric construction) such that stereoscopic information can be deduced for multiple orientations of the device, in opposite directions, or the like.
[0142] Fig. 14 illustrates additional features of embodiments of the present invention illustrated in Fig. 13. For example, a field of view 663 is shown for camera 664, a field of view 671 is illustrated for camera 670, and the like. [0143] In various embodiments, it can be seen that cameras 659 and 660, which are adjacent in consumer device 657, are "pulled apart" in consumer device 654, and cameras 664 and 670 are then separated and rotated relative to each other. In various embodiments, cameras 664 and 670 may remain in the same plane as they move; however, they may also change plane with respect to each other.
[0144] In various embodiments, as cameras 664 and 670 rotate away from each other, tilt sensors 666 and 668 may be used to determine the tilt of each camera. These measurements may be referenced to determine a separation angle between cameras 664 and 670. Then, using the known geometry of the device, the linear displacement, or the like, between cameras 664 and 670 can be determined. In various embodiments, such information may be deduced by other means, such as installing a single tilt sensor and directly measuring the angle of a hinge 667, by deduction from the camera image data, or the like. [0145] Fig. 15 illustrates an example of the result of various embodiments of the present invention. More particularly, FIG. 15 shows an example of image data 682 and 684 from two image sensors of a consumer device that have been separated by an arbitrary distance. In this example, image data 682 and 684 are rotated and tilted relative to each other as a result of being captured on a consumer device similar to cameras 664 and 670 in Fig. 14. [0146] In various embodiments, images captured by a user are expected to be rectangular in shape and parallel to the ground. Accordingly, rectangles 681 and 683 represent level rectangular image information available from image data 682 and 684. In Fig. 15, lines 685 and 686 illustrate that rectangles 681 and 683 are level. In various embodiments, rectangles 681 and 683 may be used to represent the right / left image pair for generating a stereoscopic 3D image.
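By way of a non-limiting illustration only, the deduction of the linear displacement from the separation angle described in paragraph [0144] can be sketched with a simple chord calculation, assuming each camera sits at a known, equal arm length from the hinge axis; the arm length and angles below are illustrative values only.

    import math

    def camera_baseline(arm_length_m, hinge_angle_deg):
        """Straight-line distance between two cameras mounted at equal arm
        lengths from a hinge, as a function of the hinge opening angle.
        Uses the chord length b = 2 * r * sin(theta / 2)."""
        theta = math.radians(hinge_angle_deg)
        return 2.0 * arm_length_m * math.sin(theta / 2.0)

    # Hypothetical 5 cm arms: folded flat (0 deg) gives a zero baseline,
    # a 60 degree opening gives a 5 cm baseline, and 180 degrees gives 10 cm.
    for angle in (0, 60, 180):
        print(angle, round(camera_baseline(0.05, angle), 4))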
[0147] In various embodiments, a consumer device 678 may display one or both of image data 681 and 683 to the user on a display, as 2D images or as a 3D stereoscopic image, before or while acquiring or storing image and / or video data. In such embodiments, the user can be provided feedback as to how to reorient the image sensors with respect to each other to capture the desired 2D image(s). The inventors have determined that if images 681 and 683 do not have sufficiently overlapping subject matter, or if images 681 and 683 have narrow fields of view, a stereoscopic image formed from images 681 and 683 will not convey a significant 3D effect to the viewer. The feedback from consumer device 678 may be provided in real-time to the user. [0148] In various embodiments, consumer device 678 may provide feedback in the form of tilt sensor feedback, to encourage the user to hold the device such that both cameras are more level, as illustrated by consumer device 655 in Fig. 14. In practice, the inventors have determined that if the cameras are more level with respect to each other, the rectangular size of the region of interest increases. In other examples, face recognition technology can also be used, either to override or to coordinate with the tilt sensor, to increase or maximize the area of a face which is captured by images 681 and 683.
[0149] In various embodiments, the system can encourage the user to reorient the system manually or the system may be able to do so automatically. For example, consumer device 678 may increase the region of interest (e.g. image data 681 and 683) by encouraging the user to manually level the cameras with respect to each other; encouraging the user to manually open or close the hinge completely; encouraging the user to "zoom out," pan upwards; etc. The inventors have determined that automatic zooming out or panning upwards are particularly useful if face recognition technology is also included. As an example, such techniques would be useful to prevent image 681 and 683 from cropping out the eyebrows of the person illustrated in Fig. 15.
[0150] In still other embodiments, stereoscopic 3D image data may be deduced without the need for multiple camera image capture systems, such as those shown in 613, 614, 622, 623, 635, 636 or 637 if the subject is still enough that the user can generate one image such as 682 and then translate or rotate the camera to produce another image such as 684. If the subject is not sufficiently still, multiple images can still be used to deduce the stereoscopic data that is correct for a particular image.
[0151] In various embodiments, a graphical display similar to the one shown in 678 may be displayed to the user, either as a flat (e.g. 2D) or stereoscopic 3D image. The consumer device may then provide the user with information which allows the user to translate or rotate the camera of the consumer device into a new position to take another image. In such embodiments, the multiple images from a single camera may substitute for a set of single images from multiple cameras. In various embodiments, the consumer device determines the above-mentioned horizontal region of interest on the first image and displays information to guide the user in taking the next picture, to increase the 3D overlap or effect.
[0152] In various embodiments, the user may be given written, verbal, or other instructions to take both pictures. Such embodiments can also assist the user in taking the first image offset to one side instead of centered. This allows the second image to be offset to the other side, the result of which is that the subject is centered in the deduced stereoscopic 3D image. An example of such instructions is "Take a picture with the left eye on the viewfinder, then move the camera such that the right eye is on the viewfinder and take another picture." With such embodiments, the feedback that the user has generated both images appropriately can be provided after both images are taken and after the resulting stereoscopic 3D image is determined. In some embodiments, the camera image data is examined to provide the user with graphical feedback to assist the user in capturing the second image. In such examples, the user is presented with a display, audible cue, or other information to help select the second image after the first image has been taken. In some embodiments, facial recognition technology may be useful to encourage the user to translate the entire camera such that stereoscopic image data of an entire face can be ensured, as opposed to a stereoscopic 3D image of a landscape or still life, for example.
[0153] In light of the above disclosure, one of ordinary skill in the art would recognize that many variations may be implemented based upon the discussed embodiments. Embodiments described above may be useful for hand-held consumer devices such as cell-phones, personal media players, mobile internet devices, or the like. Other embodiments may also be applied to higher-end devices such as laptop computers, desktop computers, digital SLR cameras, HD video cameras, and the like. [0154] In various embodiments of the present invention, the dongle described above may be operated to acquire 2D images when semi-permanently affixed to a consumer device. In other embodiments, the dongle may be operated to acquire 2D images apart from the consumer device. Subsequently, the 2D images may be provided to the consumer device using one or more of the transmission protocols described above. In such embodiments, the dongle may be stored semi-permanently affixed to the consumer device.
[0155] The above detailed description is directed to systems, methods, apparatus and computer-readable media for stereoscopic image capture. While the subject matter described herein is presented in the general context of hardware blocks that are embedded in electronic devices or program modules that execute in conjunction with the execution of an application program or an operating system on a computer system, consumer electronics device, or an information processing device, those skilled in the art will recognize that other implementations may be performed in combination with other program modules or devices.
[0156] Further embodiments can be envisioned by one of ordinary skill in the art after reading this disclosure. In other embodiments, combinations or sub-combinations of the above-disclosed invention can be advantageously made. The block diagrams of the architecture and flow charts are grouped for ease of understanding. However, it should be understood that combinations of blocks, additions of new blocks, re-arrangement of blocks, and the like are contemplated in alternative embodiments of the present invention. [0157] The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense. It will, however, be evident that various modifications and changes may be made thereunto without departing from the broader spirit and scope.

Claims

WHAT IS CLAIMED IS:
1. A device for providing 3D synchronization data to a user 3D viewing device comprising: a receiving portion configured to receive 3D image data indicating timing of left and right images from a source of 3D data; a radio frequency transmitter coupled to the receiving portion, wherein the radio frequency transmitter is configured to output 3D synchronization signals to the 3D viewing device in response to the 3D image data.
2. The device of claim 1 wherein the receiving portion is coupled to an output port of the source of 3D data; wherein the device further comprises an output portion configured to provide the 3D image data to a 3D display device; and wherein the output port is selected from a group consisting of: HDMI, DVI, VGA, DisplayPort (DP).
3. The device of claim 1 wherein the receiving portion is coupled to an output synchronization port of the source of 3D data; wherein the receiving portion is configured to determine 3D synchronization signals in response to the 3D image data; and wherein the output synchronization port is selected from a group consisting of: VESA, USB.
4. The device of claim 1 wherein the receiving portion is coupled to an output port of a 3D display device; wherein the receiving portion is configured to determine 3D synchronization signals in response to the 3D image data; and wherein the output port is selected from a group consisting of: VESA 1997.11, USB.
5. The device of claim 1 wherein a protocol for the radio frequency transmitter is selected from a group consisting of: IEEE Standard 802.15.1, Bluetooth, ZigBee radio, WiFi, IEEE 802.15.4.
6. A 3D viewing device for providing 3D images to a user comprising: a radio frequency receiver configured to receive 3D synchronization signals from a transmitting device; and a plurality of LCD shutters including a right LCD shutter and a left LCD shutter coupled to the radio frequency receiver, wherein the right LCD shutter and the left LCD shutter are configured to alternately enter a translucent state in response to the 3D synchronization signals.
7. The 3D viewing device of claim 6 wherein a protocol for the radio frequency receiver is selected from a group consisting of: IEEE Standard 802.15.1, Bluetooth, ZigBee radio, WiFi.
8. The 3D viewing device of claim 6 further comprising a local clock coupled to the radio frequency receiver, wherein the local clock is configured to synchronize with the 3D synchronization signals.
9. A method for transmitting stereoscopic display information, the method comprising: receiving a plurality of 3D video synchronization signals from a source of 3D image data; converting the plurality of 3D video synchronization signals into a plurality of wireless radio signals; and outputting the plurality of wireless radio signals to a pair of shutter glasses associated with a user that are adapted to receive the wireless radio signals, wherein the plurality of wireless radio signals are adapted to change the states for a pair of LCD shutters on the pair of shutter glasses, in response to the wireless radio signals.
10. The method of claim 9 wherein the wireless radio signals are selected from a group consisting of: IEEE Standard 802.15.1, Bluetooth, ZigBee radio, IEEE 802.15.4, WiFi.
11. The method of claim 9 further comprising: attaching a dongle to a port of the source of 3D image data; and receiving the plurality of 3D video synchronization signals through the port; wherein converting the plurality of 3D video synchronization signals into a plurality of wireless radio signals is performed by the dongle; wherein outputting the plurality of wireless radio signals to the pair of shutter glasses is performed by the dongle; and wherein the port is selected from a group consisting of: VESA 1997.11, USB.
12. A method for operating a pair of shutter glasses including a right LCD shutter and a left LCD shutter comprising: receiving synchronization data via radio frequency transmissions from a radio frequency transmitter; determining shutter timings for the right LCD shutter and the left LCD shutter in response to the synchronization data; and applying the shutter timings to the right LCD shutter and the left LCD shutter to enable the viewer to view right-eye images via the right LCD shutter and left-eye images via the left LCD shutter.
13. The method of claim 12 further comprising: determining a local clock time stamp in response to the synchronization data; and transmitting the local clock time stamp to the radio frequency transmitter.
14. The method of claim 12 wherein the radio frequency transmissions are selected from a group consisting of: IEEE Standard 802.15.1, Bluetooth, ZigBee radio, IEEE 802.15.4, WiFi.
15. A consumer device for capturing stereoscopic images comprising: a plurality of image acquisition devices, wherein a first image acquisition device and a second image acquisition device are both approximately directed in a common direction, wherein the first image acquisition device and the second image acquisition device are displaced by a displacement, wherein the first image acquisition device is configured to capture a first image, and wherein the second image acquisition device is configured to capture a second image; a user input device configured to receive an input from a user; a memory configured to store the first image and the second image; a wireless communications portion configured to transmit data to a remote device; and a processor coupled to the first image acquisition device, to the second image acquisition device, to the user input device, and to the wireless communications portion, wherein the processor is configured to approximately contemporaneously direct acquisition of the first image by the first image acquisition device and of the second image by the second image acquisition device in response to the input from the user, wherein the processor is configured to direct storage of the first image, the second image, and the displacement in the memory, and wherein the processor is configured to direct the wireless communications portion to transmit at least a portion of the first image and at least a portion of the second image to a remote device.
16. The consumer device of claim 15 further comprising a removably attached portion, wherein the removably attached portion includes at least the first image acquisition device, and wherein the removably attached portion is configured to communicate the first image to the memory via a communications channel.
17. The consumer device of claim 16 wherein the removably attached portion is selected from a group consisting of: a dongle, a cradle, an attachment, an encasement.
18. The consumer device of claim 15 wherein the processor is also configured to determine at least a tilt of the first image acquisition device with respect to level; and wherein the processor is configured to determine the portion of the first image in response to the tilt of the first image acquisition device.
19. The consumer device of claim 15 wherein the processor is configured to determine the portion of the first image by being configured to perform manipulations on the first image to determine the portion of the first image, wherein the processor is configured to determine the portion of the second image by being configured to perform manipulations on the second image to determine the portion of the second image, wherein the manipulations are selected from a group consisting of: image rotation, image crop, image scale, perspective correction.
20. A method for capturing stereoscopic images, photos or videos on a mobile computing device, wherein the mobile computing device includes at least a first camera and a second camera, and wherein a distance and an orientation between the first and the second cameras are determinable, the method comprising: receiving an initiation signal from a user, while the user points the first and the second cameras in a direction of interest; substantially simultaneously acquiring a first image with the first camera, a second image with the second camera, and camera parameters, in response to the initiation signal; storing the first image, the second image and the camera parameters in a memory; and uploading at least a portion of the first image, at least a portion of the second image, and the camera parameters to a remote server.
21. The method of claim 20 wherein the mobile computing device comprises a mobile telecommunications device; and wherein the first and the second cameras of the computing device are embedded into the mobile telecommunications device.
22. The method of claim 20 further comprising: determining an initial portion of the first image and an initial portion of the second image; determining suggested manipulations of the mobile computing device to increase a size of the initial portion of the first image; and outputting the suggested manipulations of the mobile computing device to the user.
23. The method of claim 22 further comprising: repositioning the first camera relative to the second camera in response to physical manipulations by the user; receiving another initiation signal from the user, while the user points the first and the second cameras in the direction of interest; substantially simultaneously acquiring a third image with the first camera, a fourth image with the second camera and additional camera parameters, in response to the other initiation signal; storing the third image, the fourth image and the additional camera parameters in the memory; and wherein uploading at least a portion of the first image, at least a portion of the second image, and the camera parameters to a remote server comprises uploading at least a portion of the third image, at least a portion of the fourth image, and the additional camera parameters to the remote server.
24. The method of claim 20 further comprising: determining a location of a face in the first image; determining a location of the face in the second image; and determining manipulations of the first camera relative to the second camera such that the location of the face is within the portion of the first image and that the location of the face is within the portion of the second image.
25. The method of claim 24 further comprising: automatically performing the manipulations on the first camera relative to the second camera; receiving another initiation signal from the user, while the user points the first and the second cameras in the direction of interest; substantially simultaneously acquiring a third image with the first camera, a fourth image with the second camera and additional camera parameters, in response to the other initiation signal; storing the third image, the fourth image and the additional camera parameters in the memory; and wherein uploading at least a portion of the first image, at least a portion of the second image, and the camera parameters to a remote server comprises uploading at least a portion of the third image, at least a portion of the fourth image, and the additional camera parameters to the remote server.
EP10739079.1A 2009-02-03 2010-02-03 Method of stereoscopic 3d image capture and viewing Withdrawn EP2394195A4 (en)

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US14965109P 2009-02-03 2009-02-03
US14966609P 2009-02-03 2009-02-03
US18284509P 2009-06-01 2009-06-01
US21806909P 2009-06-18 2009-06-18
US25173909P 2009-10-15 2009-10-15
PCT/US2010/023091 WO2010091113A2 (en) 2009-02-03 2010-02-03 Method of stereoscopic 3d image capture and viewing

Publications (2)

Publication Number Publication Date
EP2394195A2 true EP2394195A2 (en) 2011-12-14
EP2394195A4 EP2394195A4 (en) 2013-05-08

Family

ID=42542635

Family Applications (1)

Application Number Title Priority Date Filing Date
EP10739079.1A Withdrawn EP2394195A4 (en) 2009-02-03 2010-02-03 Method of stereoscopic 3d image capture and viewing

Country Status (2)

Country Link
EP (1) EP2394195A4 (en)
WO (1) WO2010091113A2 (en)

Families Citing this family (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9179136B2 (en) 2009-11-20 2015-11-03 Broadcom Corporation Method and system for synchronizing 3D shutter glasses to a television refresh rate
GB2482662A (en) * 2010-07-28 2012-02-15 Suniel Singh Mudhar 3d video signal format conversion
JP4951102B2 (en) * 2010-09-08 2012-06-13 株式会社東芝 Notification system, shutter glasses, and notification method
KR20120029658A (en) * 2010-09-17 2012-03-27 삼성전자주식회사 3d glasses and 3d display apparatus using ir signal and rf signal
US9185398B2 (en) 2011-09-22 2015-11-10 Google Technology Holdings LLC Method and apparatus for providing three-dimensional content
US9390537B2 (en) 2011-12-09 2016-07-12 Thomson Licensing Disparity setting method and corresponding device
US9648308B2 (en) 2012-03-27 2017-05-09 Koninklijke Philips N.V. Multiple viewer 3D display
CN102881271A (en) * 2012-09-29 2013-01-16 深圳市华星光电技术有限公司 Method and system for driving liquid crystal display device
EP2717581A1 (en) * 2012-10-05 2014-04-09 BlackBerry Limited Methods and devices for generating a stereoscopic image
US9137517B2 (en) 2012-10-05 2015-09-15 Blackberry Limited Methods and devices for generating a stereoscopic image
EP4137927A4 (en) 2020-07-13 2023-10-18 Samsung Electronics Co., Ltd. Electronic device, display device connected to electronic device, and method for operating same

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20010043266A1 (en) * 2000-02-02 2001-11-22 Kerry Robinson Method and apparatus for viewing stereoscopic three- dimensional images
US7508485B2 (en) * 2001-01-23 2009-03-24 Kenneth Martin Jacobs System and method for controlling 3D viewing spectacles
US20070146478A1 (en) * 2005-07-14 2007-06-28 Butler-Smith Bernard J Stereoscopic 3D rig calibration and viewing device

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4967268A (en) * 1989-07-31 1990-10-30 Stereographics Liquid crystal shutter system for stereoscopic and other applications
US20070247477A1 (en) * 2006-04-21 2007-10-25 Lowry Gregory N Method and apparatus for processing, displaying and viewing stereoscopic 3D images
US20080031283A1 (en) * 2006-08-07 2008-02-07 Martin Curran-Gray Time synchronization for network aware devices
US20100194857A1 (en) * 2009-02-03 2010-08-05 Bit Cauldron Corporation Method of stereoscopic 3d viewing using wireless or multiple protocol capable shutter glasses

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2010091113A2 *

Also Published As

Publication number Publication date
EP2394195A4 (en) 2013-05-08
WO2010091113A2 (en) 2010-08-12
WO2010091113A3 (en) 2010-12-02

Similar Documents

Publication Publication Date Title
US20100194857A1 (en) Method of stereoscopic 3d viewing using wireless or multiple protocol capable shutter glasses
EP2394195A2 (en) Method of stereoscopic 3d image capture and viewing
US20140184762A1 (en) Method of stereoscopic synchronization of active shutter glasses
US9179136B2 (en) Method and system for synchronizing 3D shutter glasses to a television refresh rate
US20110090324A1 (en) System and method of displaying three dimensional images using crystal sweep with freeze tag
KR101487182B1 (en) Method and apparatus for making intelligent use of active space in frame packing format
WO2010141514A2 (en) Method of stereoscopic synchronization of active shutter glasses
US20110001805A1 (en) System and method of transmitting and decoding stereoscopic sequence information
US20100194860A1 (en) Method of stereoscopic 3d image capture using a mobile device, cradle or dongle
US20110134231A1 (en) Method And System For Synchronizing Shutter Glasses To A Display Device Refresh Rate
US9438894B2 (en) Method of providing 3D image and 3D display apparatus using the same
WO2010122711A1 (en) 3d image display apparatus, 3d image playback apparatus, and 3d image viewing system
JP2011193460A (en) Method for adjusting 3d-image quality, 3d-display apparatus, 3d-glasses, and system for providing 3d-image
US8896676B2 (en) Method and system for determining transmittance intervals in 3D shutter eyewear based on display panel response time
US20140092218A1 (en) Apparatus and method for stereoscopic video with motion sensors
US20120249540A1 (en) Display system, display device and display assistance device
JP2011139456A (en) 3d glass driving method, and 3d glass and 3d display device using the same
US11102469B2 (en) 3D play system
EP2432239A2 (en) Method for controlling ambient brightness perceived via three-dimensional glasses by adjusting ambient brightness setting, three-dimensional glasses, and video display device thereof
CN103747360A (en) Intelligent television video playing method and equipment
US20130194399A1 (en) Synchronization of shutter signals for multiple 3d displays/devices
US20120019616A1 (en) 3D image capturing and playing device
KR20120059947A (en) 3D glasses and method for controlling 3D glasses thereof
KR101768538B1 (en) Method for adjusting 3-Dimension image quality, 3D display apparatus, 3D glasses and System for providing 3D image
JP2013013089A (en) Three-dimensional display device and three-dimensional display method applied to the same

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20110727

AK Designated contracting states

Kind code of ref document: A2

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO SE SI SK SM TR

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20130410

RIC1 Information provided on ipc code assigned before grant

Ipc: G02B 27/22 20060101AFI20130404BHEP

Ipc: H04N 5/232 20060101ALI20130404BHEP

Ipc: H04N 13/02 20060101ALI20130404BHEP

Ipc: H04N 13/04 20060101ALI20130404BHEP

Ipc: H04N 13/00 20060101ALI20130404BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20130903