US20120140034A1 - Device for displaying 3d content on low frame-rate displays - Google Patents
- Publication number
- US20120140034A1 (U.S. application Ser. No. 13/379,613)
- Authority
- US
- United States
- Prior art keywords
- shuttering
- video
- signal
- video signal
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/22—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
- G02B30/24—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type involving temporal multiplexing, e.g. using sequentially activated left and right shutters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/341—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/398—Synchronisation thereof; Control thereof
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2310/00—Command of the display device
- G09G2310/06—Details of flat display driving waveforms
- G09G2310/061—Details of flat display driving waveforms for resetting or blanking
- PCT Patent Application No. PCT/US2011/027933 filed Mar. 10, 2011, entitled “DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS;”
- PCT Patent Application No. PCT/US2011/027981 filed Mar. 10, 2011, entitled “SHUTTERING THE DISPLAY OF INTER-FRAME TRANSITIONS;”
- PCT Patent Application No. PCT/US2011/032549 filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES.”
- The entire content of each of the foregoing applications is incorporated by reference herein.
- This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.
- Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the human brain perceives the images as being 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image.
- 3D display technology generally incorporates the use of a filtering or shuttering device, such as stereoscopic glasses, which filter displayed image data to the correct eye.
- Filtering devices can comprise passive configurations, meaning that the filtering device filters image data passively (e.g., by color code or by polarization), or active configurations, meaning that the filtering device filters image data actively (e.g., by shuttering or “blanking”).
- Traditional display devices such as computer monitors, television sets, and portable display devices, have been either incapable of producing suitable image data for 3D viewing, or have produced an inferior 3D viewing experience using known devices and processes.
- Viewing 3D content from traditional display devices generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer.
- This is true even of display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.
- Implementations of the present invention provide devices, methods, and computer program products configured to enable the viewing of three-dimensional (3D) content on a broad range of display devices.
- a viewer can view 3D content at display devices not specifically designed for 3D content display, while experiencing a level of quality that can match or even exceed the quality of specialized 3D display devices.
- implementations of the present invention can eliminate the need to purchase a 3D-specific display device by enabling viewers to view 3D content on traditional display devices in a high-quality manner.
- an implementation of a video conversion device can include a video signal input interface device adapted to receive a 3D video signal.
- the video conversion device can also include one or more programmable processing units.
- the processing units convert the received 3D video signal to the output video signal, which is specifically adapted for display on the destination display device.
- the processing units can also generate a shuttering signal, which instructs a stereographic shuttering device to shutter a user's view of the output video signal.
- the video conversion device can also include a shuttering signal transmitter device, which is adapted to transmit the generated shuttering signal to the stereographic shuttering device.
- the video conversion device can also include a video signal output interface device adapted to send an output video signal to a particular destination display device.
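The component flow just described can be sketched as a minimal simulation. All class, method, and instruction names below are illustrative assumptions, not anything specified by the application:

```python
from dataclasses import dataclass, field

@dataclass
class VideoConversionDevice:
    """Toy model of the conversion device: one video input, one video
    output, one shuttering-signal transmitter (names are hypothetical)."""
    target_frame_rate: int                      # destination display's rate
    sent_frames: list = field(default_factory=list)
    shutter_instructions: list = field(default_factory=list)

    def receive(self, frame_3d):
        """Video signal input interface: accept one (left, right) frame pair."""
        left, right = frame_3d
        self._convert_and_send(left, right)

    def _convert_and_send(self, left, right):
        """Processing units: emit an alternating L/R output sequence plus the
        matching shuttering instructions for the stereographic glasses."""
        self.sent_frames.append(("L", left))
        self.shutter_instructions.append("occlude right")
        self.sent_frames.append(("R", right))
        self.shutter_instructions.append("occlude left")

device = VideoConversionDevice(target_frame_rate=60)
device.receive(("left image", "right image"))
```

Each received 3D frame thus yields two output frames and two shuttering instructions, mirroring the input-port/processing-unit/output-port/transmitter roles described above.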
- FIG. 1 illustrates a schematic diagram of a video conversion device, along with schematic representations of internal components of the video conversion device, in accordance with one or more implementations of the invention.
- FIG. 3 illustrates a flow diagram of the shuttering of the display of three-dimensional video content in response to a shuttering signal, in accordance with one or more implementations of the invention.
- FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content, a corresponding blanking signal, and resulting display states, in accordance with one or more implementations of the present invention.
- FIG. 5A illustrates a schematic diagram of an accessory device, along with schematic representations of internal components of the accessory device, in accordance with one or more implementations of the invention.
- FIG. 5B illustrates some exemplary operating environments in which an accessory device can enable the sending of converted three-dimensional content to various display devices, in accordance with one or more implementations of the invention.
- FIG. 6 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention for converting 3D video content for low frame-rate displays.
- Implementations of the present invention provide devices, methods, and computer program products configured to enable the viewing of three-dimensional (3D) content on a broad range of display devices.
- a viewer can view 3D content at display devices not specifically designed for 3D content display, while experiencing a level of quality that can match or even exceed the quality of specialized 3D display devices.
- implementations of the present invention can eliminate the need to purchase a 3D-specific display device by enabling viewers to view 3D content on traditional display devices in a high-quality manner.
- Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and by decreasing a frame overlap interval.
- the frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second).
- Frame overlap interval refers to the period of time that elapses when transitioning between two frames.
- During a frame overlap interval, the display device displays at least a portion of two or more video frames concurrently. Longer frame overlap intervals are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame overlap intervals can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
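These two quantities can be related with simple arithmetic. The 60 Hz rate and 4 ms overlap used below are hypothetical example values, not figures from the application:

```python
# Hypothetical example: a 60 Hz display with a 4 ms frame overlap interval.
frame_rate_hz = 60
frame_period_ms = 1000 / frame_rate_hz    # each frame occupies ~16.7 ms

overlap_interval_ms = 4.0                 # blended transition between frames
overlap_fraction = overlap_interval_ms / frame_period_ms

# In frame-sequential 3D, left and right frames alternate, so each eye
# effectively receives only half of the display's frame-rate.
per_eye_rate_hz = frame_rate_hz / 2
```

At these example numbers roughly a quarter of every frame period is a blended transition, which suggests why long overlap intervals are especially visible when viewing 3D content.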
- One or more implementations of the present invention provide for a video conversion device that converts incoming 3D content to a format adapted to a particular display device, such as a lower frame-rate display device.
- the conversion device can generate an output video signal formatted in a manner understood by a particular destination display device, which takes into account physical characteristics of the destination device (e.g., frame-rate, frame overlap).
- the video conversion device can also compensate for longer overlap intervals exhibited by the display device by generating and transmitting a shuttering signal that shutters (or occludes) a user's view of the display device during frame overlap intervals.
- the video conversion device can facilitate viewing of 3D content on a broad range of display devices, including display devices that have lower frame-rates (and longer frame overlap intervals), while overcoming undesirable effects, such as motion blurring and ghosting.
- One or more implementations also provide for an accessory device (e.g., a Universal Serial Bus (USB) device) that can enable a variety of computing systems to send 3D content to lower frame-rate displays.
- the accessory device can extend the functionality of existing devices (e.g., general purpose computing systems, gaming systems, tablet computers, smart phones, set-top boxes, optical disc players, etc.) to enable the devices to send 3D content to attached or integrated display devices (which can have lower frame-rates).
- one or more implementations also provide methods for configuring computing systems to convert/send 3D content to lower frame-rate display devices.
- FIG. 1 illustrates a cutout diagram of a video conversion device 100 , along with schematic representations of internal components of the video conversion device 100 , in accordance with one or more implementations of the invention. While the illustrated housing is rectangular, the housing can take virtually any form.
- the video conversion device 100 can convert incoming 3D content (e.g., 3D content received from an optical disc player, a set-top box, a gaming device, the Internet, etc.) to a format adapted for an attached display device.
- the video conversion device 100 can function as a standalone device positioned between one or more media devices and a particular destination display device. In one or more implementations, however, other devices (e.g., a media device or a display device) can incorporate functionality of the video conversion device 100 .
- the video conversion device 100 can include a video signal input interface device 102 (video input port) adapted to receive a 3D video signal.
- the video input port 102 can include any number of constituent input ports or interface devices.
- the video input port 102 can include one or more digital video input devices, such as High-Definition Multimedia Interface (HDMI) input(s) 102 a , DisplayPort input(s) 102 b , Digital Visual Interface (DVI) input(s) 102 c , etc.
- the video input port 102 can also include any number of analog video inputs, such as composite, component, or coaxial inputs, to name a few.
- the video conversion device 100 can include any appropriate type and number of input ports, such as a LAN/WAN input 102 d (e.g., RJ-45 or WIFI).
- FIG. 1 illustrates that the video conversion device 100 can include a video signal output interface device 104 (video output port) adapted to send an output video signal to a particular destination display device.
- the video output port 104 can include any number of appropriate constituent output ports or interfaces.
- the video output port 104 can include one or more composite outputs 104 a , one or more component outputs 104 b , and/or one or more HDMI outputs 104 c .
- the video output port 104 can, however, include any other appropriate type and number of constituent output ports (as indicated by the ellipses 104 d ).
- the video input port 102 and the video output port 104 can share constituent input/output ports dynamically configured as an input or an output.
- the video conversion device 100 can include any number of other appropriate external interface devices/ports.
- the video conversion device 100 can include one or more user input device(s) 108 .
- the user input device(s) 108 can include any combination of switches 108 a , buttons 108 b , wireless receivers 108 c (e.g., Infrared, BLUETOOTH, WIFI), wired receivers 108 d (e.g., USB), or others 108 e .
- the video conversion device 100 can also include one or more transmitter devices 110 (e.g., a shuttering signal transmitter device adapted to transmit a shuttering signal).
- the transmitter device 110 can comprise an infrared transmitter device 110 a , BLUETOOTH transmitter device 110 b , WIFI transmitter device 110 c , or another transmitter device 110 d .
- the video conversion device 100 can also include one or more processing units 106 .
- FIG. 1 illustrates a single processing unit 106 that includes a plurality of constituent processing components or modules ( 106 a , 106 b , 106 c , 106 d , 106 e , 106 f ), but the invention is not so limited. Instead, the video conversion device 100 can include any number of processing units, which can each perform a single function, or a variety of functions.
- the video conversion device 100 can, for example, include one or more programmable processing units, such as Field-programmable Gate Arrays (FPGA) or microcontrollers (e.g., PIC, ATmega, etc.).
- the video conversion device 100 can also include one or more dedicated processing units or a combination of software-configurable processing units and dedicated processing units.
- any appropriate communications mechanisms (e.g., on-chip connection, busses, etc.) can couple the processing units together.
- any appropriate communications channel(s) can communicatively couple the processing units 106 with the other components, such as the video input port 102 , the video output port 104 , the user input device(s) 108 , the transmitter 110 , and any other components.
- the video input port 102 can receive 3D content via an input video signal (in either an analog or a digital format), and communicate the received 3D video signal to the processing units 106 .
- the processing units 106 can convert the 3D video signal to an output format adapted to, or suited for, a particular display device (e.g., display device 202 , FIG. 2 ) attached to the video output port 104 .
- This conversion can take into account physical characteristics of the display device, and can comprise using a customized encoding format, frame-rate, frame size, etc. in the output format.
- the processing units 106 can send the converted 3D content to the particular destination display device via an output video signal sent through the video output port 104 .
- the processing units 106 can include a plurality of constituent processing units, can implement a plurality of logical modules or components, or can use a combination of the two.
- the processing units 106 can, for instance, include a decoder 106 a and an encoder 106 b .
- the decoder 106 a can receive the 3D video signal (which may comprise any number of various known 3D formats) and decode it into one or more internal buffers for conversion to the output format.
- the 3D video signal can comprise one or more video frames that encode left-perspective image data and right-perspective image data.
- the decoder 106 a can detect the format used to encode this data, and then decode left-perspective data into one buffer and decode right-perspective data into another buffer.
- the decoder 106 a can encode standard (e.g., VESA) content to standard or non-standard formats.
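As one concrete illustration of this split, assuming the input frames arrive in a side-by-side layout (one of several common 3D encodings; the function name and frame representation are assumptions), the decoder's buffer separation might look like:

```python
def decode_side_by_side(frame):
    """Split one side-by-side 3D frame into separate left- and
    right-perspective buffers; `frame` is a list of pixel rows."""
    left_buffer, right_buffer = [], []
    for row in frame:
        mid = len(row) // 2
        left_buffer.append(row[:mid])    # left-perspective half of the row
        right_buffer.append(row[mid:])   # right-perspective half of the row
    return left_buffer, right_buffer

# Tiny 2x4 example frame: left half carries L-pixels, right half R-pixels.
frame = [["L0", "L1", "R0", "R1"],
         ["L2", "L3", "R2", "R3"]]
left, right = decode_side_by_side(frame)
```

A real decoder would first detect which of the various known 3D formats is in use and dispatch accordingly; this sketch shows only the side-by-side case.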
- the encoder 106 b can encode image data from the buffer(s) into the output format.
- the output format can comprise a sequence of one or more left-perspective video frame(s) alternating with one or more right-perspective video frames, or vice versa.
- the processing units 106 can first pass the one or more video frames containing image data for one eye (e.g., one or more left perspective frames) to the video output port 104 via an output video signal.
- the processing units 106 can subsequently pass one or more video frames containing image data for the other eye (e.g., one or more right perspective frames) to the video output port 104 via the output video signal.
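A sketch of that alternating output sequence, with an illustrative `repeat` parameter (an assumption of this sketch) for sending each eye's frame more than once:

```python
def encode_frame_sequential(left_frames, right_frames, repeat=1):
    """Interleave left and right frames into an alternating output sequence.
    `repeat` resends each eye's frame, e.g. to raise the output frame-rate."""
    output = []
    for left, right in zip(left_frames, right_frames):
        output.extend([("L", left)] * repeat)   # first-eye frame(s)
        output.extend([("R", right)] * repeat)  # then other-eye frame(s)
    return output

seq = encode_frame_sequential(["l0", "l1"], ["r0", "r1"])
```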
- the processing units 106 can employ a digital-to-analog converter (DAC) and/or an analog-to-digital converter (ADC).
- the decoder 106 a and/or the encoder 106 b can include these converters.
- Alternatively, the converters can be one or more separate components (e.g., DAC/ADC 106 c ).
- the processing units 106 can use the DAC to produce an output video signal that is analog (e.g., component or composite).
- Such converters can allow for the conversion and display of 3D content on older devices that may not have the capability of receiving digital content.
- the inverse is also true, and the processing units 106 can convert an analog 3D video signal to an output video signal that is digital using the ADC.
- the processing units 106 can also include a shuttering signal generator component 106 d , which can generate a shuttering (or sync) signal to assist with 3D viewing.
- the video conversion device 100 can transmit the generated shuttering signal to one or more shuttering devices (e.g., stereographic shuttering glasses 204 , FIG. 2 ), using the transmitter 110 , prior to, or concurrently with, sending the output format to the destination display device.
- the shuttering signal can instruct the shuttering device(s) to shutter (or occlude) portions of a user's view of the output video signal when displayed at the destination display device, to provide the illusion of 3D content display.
- the shuttering signal can include one or more inter-frame shuttering instructions, which can enhance 3D content viewing on lower frame-rate devices.
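Putting the two kinds of instruction together, a shuttering signal for an alternating output sequence might be generated as follows. The instruction strings are assumptions of this sketch, not the application's actual signal format:

```python
def generate_shuttering_signal(output_sequence):
    """For each output frame, emit the instruction occluding the non-viewing
    eye; at each eye-to-eye transition, first emit an 'occlude both'
    inter-frame shuttering instruction to cover the frame overlap."""
    signal = []
    previous_eye = None
    for eye, _frame in output_sequence:
        if previous_eye is not None and eye != previous_eye:
            signal.append("occlude both")        # inter-frame shuttering
        signal.append("occlude right" if eye == "L" else "occlude left")
        previous_eye = eye
    return signal

signal = generate_shuttering_signal([("L", 0), ("R", 1), ("L", 2)])
```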
- the processing units 106 can adapt the output video signal for a particular destination display device. This can include any combination of generating customized types of output video frames (e.g., interlaced or progressive) and/or generating customized sizes of output video frames (e.g., 480, 720, or 1080 vertical lines). This can also include generating output video frames at a target frame-rate (e.g., 60 Hz, 120 Hz). When generating output video frames at a target frame-rate, the processing units 106 can send the frames to the video output port 104 at a rate that would cause the display device to receive a target number of frames per second.
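One simple way to hit a target frame-rate is to resend each source frame an integral number of times. This sketch assumes the target rate is a whole multiple of the source rate; real conversion (e.g., 24 fps film to a 60 Hz display) may need more elaborate cadences:

```python
def repeats_per_frame(source_rate_hz, target_rate_hz):
    """How many times to resend each source frame so the display receives
    the target number of frames per second."""
    if target_rate_hz % source_rate_hz != 0:
        raise ValueError("target rate must be a multiple of the source rate")
    return target_rate_hz // source_rate_hz

# e.g., 24 fps source content on a 120 Hz display: send each frame 5 times.
repeats = repeats_per_frame(24, 120)
```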
- the processing units 106 can include a detection module or component 106 e .
- the detection module 106 e can receive physical characteristic information about the destination display device and provide this information to the other modules or components, such as the decoder 106 a and/or the encoder 106 b .
- the detection module 106 e can receive the physical characteristic information from a user via the user input device(s) 108 , or directly from the display device (e.g., via an HDMI connection).
- the physical characteristic information can include any appropriate information, such as frame size and frame-rate capabilities of the display device, an inter-frame overlap interval of the display device, etc.
- receiving physical characteristic information via the user input device(s) 108 can involve receiving specific physical characteristic information about the particular destination display device.
- the user can, for example, use a wireless remote control to enter or select a make and model of the particular destination display device.
- the detection module 106 e can use this information to look up the physical characteristics of the particular destination display device from a local or remote database.
- the user can use buttons or switches on the video conversion device 100 to select particular physical characteristics of the particular destination display device (e.g., frame rate).
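The make/model lookup can be modeled as a simple keyed table. The makes, models, and characteristic values below are entirely hypothetical:

```python
# Illustrative local database; every entry here is invented for the sketch.
DISPLAY_DATABASE = {
    ("AcmeVision", "TV-1000"): {"frame_rate_hz": 60, "overlap_interval_ms": 5.0},
    ("AcmeVision", "TV-2000"): {"frame_rate_hz": 120, "overlap_interval_ms": 2.0},
}

def look_up_characteristics(make, model):
    """Resolve a user-entered make/model to physical characteristics,
    or None when the display is unknown."""
    return DISPLAY_DATABASE.get((make, model))

info = look_up_characteristics("AcmeVision", "TV-1000")
```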
- receiving physical characteristic information via the user input device(s) 108 can also involve inference and/or learning techniques.
- the video conversion device 100 can send configuration information for display at the display device in various different formats, while also sending a corresponding shuttering signal to a shuttering device. The user can then provide appropriate feedback about his or her perception of the displayed configuration information, as viewed through the shuttering device, via buttons, wireless communication, etc. Based on the sent configuration information, and the corresponding feedback received, the video conversion device 100 can infer the physical characteristics of the display device.
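That feedback loop can be modeled as a search over candidate shutter timings, where a callback stands in for the viewer's button-press report of whether ghosting is still visible. Everything here is an illustrative assumption about how such inference might work:

```python
def infer_overlap_interval(candidates_ms, user_sees_ghosting):
    """Try candidate shutter durations from shortest to longest and keep the
    first one for which the viewer reports no ghosting; fall back to the
    most conservative (longest) setting if none satisfies the viewer."""
    for candidate in sorted(candidates_ms):
        if not user_sees_ghosting(candidate):
            return candidate
    return max(candidates_ms)

# Simulated viewer: ghosting disappears once at least 4 ms is shuttered.
chosen = infer_overlap_interval([2.0, 4.0, 8.0], lambda ms: ms < 4.0)
```

Starting from the shortest candidate keeps the shuttered (dark) interval, and hence the brightness loss, as small as the display allows.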
- the media device can comprise a streaming source 206 (e.g., a satellite box, cable box, the Internet), a gaming device (e.g., XBOX 208 , PLAYSTATION 210 ), a player device (e.g., Blu-Ray player 212 , DVD player 214 ) capable of reading media 216 (e.g., optical media), and the like.
- the video conversion device 100 can, itself, comprise one or more media devices.
- the video conversion device 100 can communicate with other devices using any of the hardware components discussed herein above (e.g., the video input port 102 , the video output port 104 , or the transmitter 110 ).
- An appropriate wired (e.g., HDMI, component, composite, coaxial, network) or wireless (BLUETOOTH, Wi-Fi) mechanism can couple the video output port 104 and the display device 202 together.
- an appropriate wired or wireless mechanism can couple the video input port 102 to a media device.
- an appropriate wireless mechanism e.g., BLUETOOTH, infrared, etc.
- the display device 202 can comprise any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., Cathode Ray, Plasma, LCD, LED, OLED).
- the display device 202 can take any of a number of forms, such as a television set, a computer display (e.g., desktop computer monitor, laptop computer display, tablet computer display), a handheld display (e.g., cellular telephone, PDA, handheld gaming device, handheld multimedia device), or any other appropriate form.
- the display device 202 can have a configuration designed specifically to display 3D content.
- the destination display device 202 alternatively can comprise a more traditional display device, such as a lower frame-rate device.
- the display device 202 can include both digital and analog display devices.
- FIG. 3 illustrates a flow diagram of the shuttering of the display of 3D video content in response to a shuttering signal, according to one or more implementations.
- FIG. 3 illustrates three display states 302 , 304 , and 306 .
- the video conversion device 100 sends different portions of the output video signal to the particular destination display device 202 .
- the video conversion device 100 also transmits appropriate shuttering instructions 314 , 316 , 318 , to the shuttering device(s) 204 in a shuttering signal.
- the video conversion device 100 can provide the illusion that two-dimensional images encoded in the output video signal are 3D.
- the video conversion device 100 can transmit one or more left-perspective video frames in an output video signal 324 to the display device 202 , and can also transmit a shuttering instruction 314 (occlude right) to the shuttering device(s) 204 .
- a shuttering component 322 can occlude the viewer's right eye view of the display device 202 .
- the video conversion device 100 can transmit one or more right-perspective video frames in the output signal 324 to the display device 202 , and can also transmit a shuttering instruction 318 (occlude left) to the shuttering device(s) 204 .
- the shuttering component 320 can occlude the viewer's left eye view of the display device 202 .
- the video conversion device 100 can reverse the images and shuttering instructions of states 302 and 306 .
- In state 302 , for example, the video conversion device 100 can alternatively send a right-perspective image to the display device 202 and can send an “occlude left” instruction to the shuttering device(s) 204 .
- In state 306 , the video conversion device 100 can send a left-perspective image to the display device 202 and can send an “occlude right” instruction to the shuttering device(s) 204 .
- the illustrated sequence of images and instructions is not limiting.
- While display states 302 and 306 provide the illusion of 3D content display, one or more implementations introduce a third display state 304 , during which the video conversion device 100 occludes an inter-frame overlap 310 .
- Inter-frame overlap 310 occurs after the video conversion device 100 has fully transmitted image data for one eye (e.g., left-perspective video frames), and has begun to transmit image data for the other eye (e.g., right-perspective video frames).
- portions of the different frames can be “blended,” so that portions of both the left and right perspective images are concurrently displayed.
- Inter-frame shuttering, or the occlusion of both eyes during inter-frame overlap intervals, can enhance the clarity of the perceived 3D image. Inter-frame shuttering can reduce or eliminate the undesirable effects common to 3D content display, such as motion blurring and ghosting.
- inter-frame shuttering techniques, when synchronously combined with the creation of an output video signal adapted to a particular display device, can allow for viewing of 3D content on display devices that may have lower frame-rates and/or longer frame overlap intervals.
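One consequence of inter-frame shuttering is a shorter clear viewing window per frame. With hypothetical values of a 60 Hz output and a 4 ms overlap:

```python
# Hypothetical values: 60 Hz output signal, 4 ms inter-frame overlap.
frame_period_ms = 1000 / 60        # duration of one output frame
overlap_ms = 4.0                   # blended transition shuttered from view

# Both shutters close for the overlap, so each eye's clear viewing window
# per frame shrinks; the ratio is the fraction of time light reaches the
# eye at all (a rough proxy for perceived brightness).
clear_window_ms = frame_period_ms - overlap_ms
shutter_duty_cycle = clear_window_ms / frame_period_ms
```

The trade-off is brightness for clarity: the blended interval is hidden entirely, at the cost of a dimmer perceived image.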
- FIG. 4 illustrates a timing diagram 400 of the transmission of an output video signal to a destination display device 202 , and the transmission of a shuttering signal to shuttering device(s) 204 , in accordance with one or more implementations.
- the video conversion device 100 begins transmitting left-perspective video frame(s) 410 to the destination display device 202 .
- the destination display device 202 displays only the left frame(s) 410 at a time 404 .
- the video conversion device 100 can instruct the shuttering device(s) 204 to occlude the user's right eye with an appropriate shuttering instruction 314 (occlude right).
- the video conversion device 100 can cease transmitting the left frame(s) 410 at a time 406 , and begin transmitting right-perspective video frame(s) 412 .
- the video conversion device 100 can base the timing of the transition between the left and right frames on a target frame-rate of the output video signal, which is adapted to the destination display device 202 .
- the video conversion device 100 can determine a display state 304 from time 406 to a time 408 .
- the display device 202 will display an inter-frame overlap ( 310 , FIG. 3 ) as the display device transitions from displaying the left frame(s) 410 to displaying the right frame(s) 412 .
- the video conversion device 100 can send an inter-frame shuttering instruction 316 (occlude both).
- the inter-frame shuttering instruction 316 can shutter at least a portion of the inter-frame overlap ( 310 , FIG. 3 ) from the user's view.
- the destination display device 202 will have transitioned past the inter-frame overlap 310 and will display only the right frame(s) 412 .
- the video conversion device 100 can send an appropriate shuttering instruction 318 (occlude left). Subsequently, the video conversion device 100 can send other left frame(s), other right frame(s), and so on. These frames can include new image data from the received 3D video signal, or can include the same data sent previously (i.e., to increase the frame-rate of the output signal). Correspondingly, the video conversion device 100 can send corresponding shuttering instructions (as shown).
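The sequence of events in the timing diagram can be sketched as a schedule generator. The times, labels, and function name here are illustrative assumptions:

```python
def build_timing_schedule(n_pairs, frame_period_ms, overlap_ms):
    """Build a (time_ms, video, shutter) event list mirroring the timing
    diagram: left frames with 'occlude right', a blanked transition with
    'occlude both', right frames with 'occlude left', and so on."""
    events, t = [], 0.0
    for _ in range(n_pairs):
        events.append((t, "left frames", "occlude right"))
        t += frame_period_ms
        events.append((t, "transition", "occlude both"))
        t += overlap_ms
        events.append((t, "right frames", "occlude left"))
        t += frame_period_ms
        events.append((t, "transition", "occlude both"))
        t += overlap_ms
    return events

# One left/right pair at a 16 ms frame period with a 4 ms overlap.
schedule = build_timing_schedule(1, 16.0, 4.0)
```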
- FIG. 5A illustrates a cutout diagram of an accessory device 500 , along with schematic representations of internal components of the accessory device 500 , in accordance with one or more implementations.
- the illustrated accessory device 500 is a USB device that would fit in the palm of a user's hand.
- the accessory device 500 can take any number of alternate forms without departing from the disclosure herein.
- the accessory device 500 may also take the form of an IEEE 1394 (FIREWIRE, I.LINK, LYNX) device, an APPLE Dock device, etc.
- the accessory device 500 can include a variety of constituent components.
- the accessory device 500 can include an interface device 502 adapted to communicatively interface with the associated computing system (e.g., a USB interface, an IEEE 1394 interface, an APPLE Dock interface).
- the accessory device 500 can also include a shuttering signal transmission device 504 (transmitter).
- the transmitter 504 can transmit a shuttering signal to stereographic shuttering devices, and can use any appropriate signal type (e.g., Infrared, BLUETOOTH, WIFI).
- the associated computing system can process/convert 3D video content, generate a shuttering signal, and send the generated shuttering signal to one or more shuttering devices via the transmitter 504 on the accessory device 500 .
- the associated computing system can run computer-executable instructions received as part of, or separate from, the accessory device 500 .
- the associated computing system can receive instructions via a storage device provided at the associated computing system (e.g., a CD-ROM, FLASH memory, etc.), via an Internet download, etc.
- the associated computing system can receive instructions from the accessory device 500 .
- the accessory device 500 can include, for example, one or more computerized storage devices 506 storing computer-executable instructions.
- the stored computer-executable instructions can instruct one or more processing units to convert a 3D video signal to a format adapted for display on a particular destination display device.
- the instructions can, for instance, instruct one or more processors at the associated computing system to perform the conversion.
- the instructions can also instruct one or more processing units 508 on the accessory device 500 , to perform, or to help perform, the conversion. In this manner, the accessory device 500 can offload some or all of the computation needed to perform the conversion from the associated computing system.
- the stored computer-executable instructions can also cause one or more processing units to generate a shuttering signal.
- the shuttering signal can include one or more inter-frame shuttering instructions for shuttering an inter-frame transition between first-eye and second-eye content (as discussed in connection with FIGS. 3 and 4 ).
- the stored computer-executable instructions can instruct processing units at the associated computing system or on the accessory device 500 to generate the shuttering signal (in whole or in part).
- the accessory device 500 can instruct the associated computing system to generate the shuttering signal, or can generate the shuttering signal itself.
- the stored computer-executable instructions can also cause the transmitter 504 to send the generated shuttering signal to one or more stereographic shuttering devices 204 .
- the accessory device 500 can include one or more processors or processing units 508 . Similar to the video conversion device 100 , these processing units 508 can comprise any number of processing units, which can each perform a single function, or a variety of functions. The processing units 508 can thus comprise programmable processing units (e.g., FPGAs, microcontrollers), dedicated processing units, or a combination of each. In addition, an appropriate communications channel 510 (e.g., one or more buses) can couple each component of the accessory device 500 . Additionally, similar to the video conversion device 100 , the processing units 508 can implement a series of processing components, such as decoder(s), encoder(s), ADC, DAC, etc.
- FIG. 5B illustrates a few exemplary operating environments in which the accessory device 500 can enable the sending of 3D content to low frame-rate display devices.
- Item 512 illustrates that the accessory device 500 can enable a general-purpose computing system (e.g., a laptop or desktop computer) to convert/send 3D content to an integrated display.
- the accessory device can operate with virtually any operating system, such as MICROSOFT WINDOWS, APPLE MAC OS, LINUX, UNIX, etc.
- the accessory device 500 can enable the general purpose computing system 512 to receive 3D content from any appropriate source (e.g., solid state media, optical media, the Internet), to convert the 3D content to an output video signal adapted to the attached display device, to generate a shuttering signal, and to send the output video signal and the shuttering signal to their respective devices.
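As a rough, hypothetical sketch (none of these names come from the disclosure), the flow the accessory device 500 enables — receive 3D content, convert it for the attached display, generate a shuttering signal, and send each result to its respective device — might be orchestrated as:

```python
def present_3d(content, display, shutter_tx, convert, make_shutter_signal):
    """Hypothetical pipeline: convert received 3D content for the attached
    display, generate the matching shuttering signal, and send each to
    its respective device (the display and the shuttering transmitter)."""
    output_signal = convert(content, display.frame_rate)      # adapt to the display
    shutter_signal = make_shutter_signal(len(output_signal))  # one instruction per frame
    display.send(output_signal)
    shutter_tx.send(shutter_signal)
    return output_signal, shutter_signal
```

The `convert` and `make_shutter_signal` callables stand in for whichever conversion and signal-generation routines run on the computing system or on the accessory device itself.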
- the accessory device 500 can also enable 3D content conversion and/or display with other devices as well, such as a gaming device 514 (e.g., XBOX, PLAYSTATION), a DVD/BLU-RAY player 516 , or a tablet computer 518 .
- the accessory device 500 can include hardware interfaces and/or computer instructions customized to the particular device. It is noted that more specialized devices (e.g., DVD/BLU-RAY players 516 or tablet computers 518 ) may have limited processing or configuration capabilities. Thus, the inclusion of processing units 508 on the accessory device 500 can enable these devices to process/convert 3D content.
- FIG. 6 , for instance, illustrates a flowchart of a computerized method for converting 3D video content for display on low frame-rate displays. The acts of FIG. 6 are described with respect to the schematics, diagrams, devices, and components shown in FIGS. 1-5B .
- a method can comprise an act 602 of converting an input 3D video signal.
- Act 602 can include converting an input 3D video signal to an output video signal which includes an alternating sequence of one or more first video frames that include a first image for viewing by a first eye and one or more second video frames that include a second image for viewing by a second eye.
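A minimal sketch of the output structure described for act 602, assuming the input has already been decoded into (left, right) frame pairs; the function and frame names are illustrative only:

```python
def to_alternating_sequence(stereo_pairs):
    """Flatten (left_frame, right_frame) pairs into the alternating
    first-eye/second-eye frame sequence of the output video signal."""
    output = []
    for left, right in stereo_pairs:
        output.append(left)   # one or more first video frames (first eye)
        output.append(right)  # one or more second video frames (second eye)
    return output
```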
- the video conversion device 100 can receive a 3D video signal and convert the received 3D content to a format adapted for a particular destination display device.
- the accessory device 500 can configure a general or special purpose computing system to convert 3D video content.
- the accessory device can, for example, include computer-executable instructions which cause processors at the computing system, or at the accessory device 500 , to convert 3D content to an output signal adapted for a particular display device.
- adapting the output signal can include determining an optimal frame rate for the display device, which can comprise a low frame-rate display device.
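The disclosure does not specify how the optimal frame rate is determined; purely as an assumed illustration, one simple selection rule might be:

```python
def choose_output_rate(native_rate_hz, supported_rates_hz):
    """Hypothetical selection rule: pick the highest output frame-rate the
    converter supports that does not exceed the display's native rate,
    falling back to the lowest supported rate for very slow displays."""
    candidates = [r for r in supported_rates_hz if r <= native_rate_hz]
    return max(candidates) if candidates else min(supported_rates_hz)
```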
- the illustrated method can also comprise an act 604 of generating an inter-frame shuttering signal.
- Act 604 can include generating an inter-frame shuttering signal configured to instruct a shuttering device to concurrently shutter both the first eye and the second eye during a display of an inter-frame transition.
- the inter-frame transition can comprise a period during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are displayed concurrently.
- the video conversion device 100 or the accessory device 500 can generate, or cause an associated computing system to generate, a shuttering signal that includes a plurality of shuttering instructions.
- the shuttering signal can include one or more inter-frame shuttering instructions ( 316 ) which instruct shuttering device(s) 204 to occlude both of a viewer's eyes during an inter-frame transition.
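Under one assumed timing model (not specified in the disclosure), each frame interval can open with an "occlude both" instruction spanning the inter-frame overlap, followed by a single-eye instruction for the stable portion of the frame; all names and the fixed-period model are assumptions for illustration:

```python
def build_shuttering_signal(num_frames, frame_ms, overlap_ms):
    """Return (start_ms, instruction) events for an alternating-eye signal.
    Each frame interval begins with an 'occlude both' instruction covering
    the inter-frame overlap, then occludes only the non-viewing eye."""
    events = []
    for i in range(num_frames):
        t = i * frame_ms
        events.append((t, "occlude both"))                # inter-frame transition
        viewing_eye = "left" if i % 2 == 0 else "right"   # eye that should see frame i
        shuttered = "right" if viewing_eye == "left" else "left"
        events.append((t + overlap_ms, f"occlude {shuttered}"))
    return events
```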
- FIGS. 1-6 provide a number of components and mechanisms for sending 3D video content to a broad range of display devices.
- One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame-rates and longer frame overlap intervals, or that are not otherwise specifically designed for displaying 3D video content.
- the implementations of the present invention can comprise special purpose or general-purpose computing systems.
- Computing systems may, for example, comprise handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, BLU-RAY players, gaming systems, and video converters.
- the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions, which the processor may execute.
- the memory may take any form and may depend on the nature and form of the computing system.
- a computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory.
- the memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two.
- the term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well.
- the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
- Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below.
- Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures.
- Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system.
- Computer-readable media that store computer-executable instructions are physical storage media.
- Computer-readable media that carry computer-executable instructions are transmission media.
- embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- a “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices.
- a network or another communications connection can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa).
- computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system.
- computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions.
- the computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code.
- the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like.
- the invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks.
- program modules may be located in both local and remote memory storage devices.
Description
- The present application is a U.S. National Stage Application corresponding to PCT Patent Application No. PCT/US2011/031115, filed Apr. 4, 2011, which claims priority to U.S. Provisional Application No. 61/416,708, filed Nov. 23, 2010, entitled “3D VIDEO CONVERTER.” The present application is also a continuation-in-part of: PCT Patent Application No. PCT/US2011/025262, filed Feb. 17, 2011, entitled “BLANKING INTER-FRAME TRANSITIONS OF A 3D SIGNAL;” PCT Patent Application No. PCT/US2011/027175, filed Mar. 4, 2011, entitled “FORMATTING 3D CONTENT FOR LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/027933, filed Mar. 10, 2011, entitled “DISPLAYING 3D CONTENT ON LOW FRAME-RATE DISPLAYS;” PCT Patent Application No. PCT/US2011/027981, filed Mar. 10, 2011, entitled “SHUTTERING THE DISPLAY OF INTER-FRAME TRANSITIONS;” and PCT Patent Application No. PCT/US2011/032549, filed Apr. 14, 2011, entitled “ADAPTIVE 3-D SHUTTERING DEVICES.” The entire content of each of the foregoing applications is incorporated by reference herein.
- 1. The Field of the Invention
- This invention relates to systems, methods, and computer program products related to conversion and presentation of three-dimensional video content.
- 2. Background and Relevant Art
- Three-dimensional (3D) display technology involves presenting two-dimensional images in such a manner that the human brain perceives the images as being 3D. The process typically involves presenting “left” image data to the left eye, and “right” image data to the right eye. When received, the brain perceives this data as a 3D image. 3D display technology generally incorporates the use of a filtering or shuttering device, such as stereoscopic glasses, which filter displayed image data to the correct eye. Filtering devices can comprise passive configurations, meaning that the filtering device filters image data passively (e.g., by color code or by polarization), or active configurations, meaning that the filtering device filters image data actively (e.g., by shuttering or “blanking”).
- Traditional display devices, such as computer monitors, television sets, and portable display devices, have been either incapable of producing suitable image data for 3D viewing, or have produced an inferior 3D viewing experience using known devices and processes. For instance, viewing 3D content from traditional display devices generally results in blurry images and/or images that have “ghosting” effects, both of which may cause dizziness, headache, discomfort, and even nausea in the viewer. This is true even for display devices that incorporate more recent display technologies, such as Liquid Crystal Display (LCD), Plasma, Light Emitting Diode (LED), Organic Light Emitting Diode (OLED), etc.
- Recently, 3D display devices designed specifically for displaying 3D content have become increasingly popular. These 3D display devices are generally used in connection with active filtering devices (e.g., shuttering glasses) to produce 3D image quality not previously available from traditional display devices. These 3D display devices, however, are relatively expensive when compared to traditional display devices.
- As a result, consumers who desire to view 3D content face the purchase of expensive 3D display devices, even when they may already have traditional display devices available. Accordingly, there are a number of considerations to be made regarding the display of 3D content.
- Implementations of the present invention provide devices, methods, and computer program products configured to enable the viewing of three-dimensional (3D) content on a broad range of display devices. When employing one or more implementations of the present invention, a viewer can view 3D content at display devices not specifically designed for 3D content display, while experiencing a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can eliminate the need to purchase a 3D-specific display device by enabling viewers to view 3D content on traditional display devices in a high-quality manner.
- For example, an implementation of a video conversion device can include a video signal input interface device adapted to receive a 3D video signal. The video conversion device can also include one or more programmable processing units. The processing units can convert the received 3D video signal to an output video signal, which is specifically adapted for display on a destination display device. The processing units can also generate a shuttering signal, which instructs a stereographic shuttering device to shutter a user's view of the output video signal. Along these lines, the video conversion device can also include a shuttering signal transmitter device, which is adapted to transmit the generated shuttering signal to the stereographic shuttering device. Additionally, the video conversion device can also include a video signal output interface device adapted to send the output video signal to the destination display device.
- Additionally, an implementation of an accessory device can include an interface device adapted to communicatively interface with a computing system, and a shuttering signal transmission device adapted to transmit a shuttering signal to a stereographic shuttering device. In addition, the accessory device can include one or more computerized storage devices. The storage devices can include executable instructions for converting a 3D video signal to a format adapted for display on a destination display device, generating the shuttering signal, and instructing the shuttering signal transmission device to send the shuttering signal to the stereographic shuttering device. The shuttering signal can include a shuttering instruction which instructs the stereographic shuttering device to shutter an inter-frame transition between
first eye 3D content and second eye 3D content from a user's view. - In addition to the foregoing, one or more computer storage devices can include computer-executable instructions that, when executed by one or more processors of a computer system, cause the computer system to implement a method for configuring the computer system to convert three-dimensional (3D) video content for a low frame-rate display device. The method can involve converting an
input 3D video signal to an output video signal. The output video signal can include an alternating sequence of one or more first video frames that include a first image for viewing by a first eye and one or more second video frames that include a second image for viewing by a second eye. The method can also involve generating an inter-frame shuttering signal configured to instruct a shuttering device to concurrently shutter both the first eye and the second eye during a display of an inter-frame transition. An inter-frame transition can occur when, after sending the output video signal to a display device, a portion of the “first eye” video frames and a portion of the “second eye” video frames will be displayed concurrently at the display device. - This Summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
- Additional features and advantages of exemplary implementations of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of such exemplary implementations. The features and advantages of such implementations may be realized and obtained by means of the instruments and combinations particularly pointed out in the appended claims. These and other features will become more fully apparent from the following description and appended claims, or may be learned by the practice of such exemplary implementations as set forth hereinafter.
- In order to describe the manner in which the above-recited and other advantages and features of the invention can be obtained, a more particular description of the invention briefly described above will be rendered by reference to specific embodiments thereof which are illustrated in the appended drawings. It should be noted that the figures are not drawn to scale, and that elements of similar structure or function are generally represented by like reference numerals for illustrative purposes throughout the figures. Understanding that these drawings depict only typical embodiments of the invention and are not therefore to be considered to be limiting of its scope, the invention will be described and explained with additional specificity and detail through the use of the accompanying drawings in which:
-
FIG. 1 illustrates a schematic diagram of a video conversion device, along with schematic representations of internal components of the video conversion device, in accordance with one or more implementations of the invention; -
FIG. 2 illustrates an operating environment in which a video conversion device may function, in accordance with one or more implementations of the invention; -
FIG. 3 illustrates a flow diagram of the shuttering of the display of three-dimensional video content in response to a shuttering signal, in accordance with one or more implementations of the invention; -
FIG. 4 illustrates a timing diagram which demonstrates the relative timing of transmitted output 3D content, a corresponding blanking signal, and resulting display states in accordance with one or more implementations of the present invention; -
FIG. 5A illustrates a schematic diagram of an accessory device, along with schematic representations of internal components of the accessory device, in accordance with one or more implementations of the invention; -
FIG. 5B illustrates some exemplary operating environments in which an accessory device can enable the sending of converted three-dimensional content to various display devices, in accordance with one or more implementations of the invention; and -
FIG. 6 illustrates a flowchart of a series of acts in a method in accordance with an implementation of the present invention for converting 3D video content for low frame-rate displays. - Implementations of the present invention provide devices, methods, and computer program products configured to enable the viewing of three-dimensional (3D) content on a broad range of display devices. When employing one or more implementations of the present invention, a viewer can view 3D content at display devices not specifically designed for 3D content display, while experiencing a level of quality that can match or even exceed the quality of specialized 3D display devices. Accordingly, implementations of the present invention can eliminate the need to purchase a 3D-specific display device by enabling viewers to view 3D content on traditional display devices in a high-quality manner.
- Specialized 3D display devices attempt to provide an enhanced 3D viewing experience by modifying physical characteristics of the display device, such as by increasing the frame-rate and by decreasing a frame overlap interval. The frame-rate refers to the number of unique video frames the display device can render in a given amount of time (e.g., one second). Frame overlap interval refers to the period of time that elapses when transitioning between two frames. During the frame overlap interval, the display device displays at least a portion of two or more video frames concurrently. Longer frame overlap intervals are perceptible to the human eye, and can lead to a degraded viewing experience. For example, longer frame overlap intervals can cause motion blurring or ghosting. These effects are a particular problem when viewing 3D video content.
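To make these quantities concrete (the numbers here are hypothetical): a 60 Hz display has a frame period of 1000/60 ≈ 16.7 ms, so a 5 ms frame overlap interval means two frames are visible concurrently for roughly 30% of each period:

```python
def overlap_fraction(frame_rate_hz, overlap_ms):
    """Fraction of each frame period during which portions of two frames
    are displayed concurrently (the frame overlap interval)."""
    frame_period_ms = 1000.0 / frame_rate_hz
    return overlap_ms / frame_period_ms
```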
- One or more implementations of the present invention provide for a video conversion device that converts incoming 3D content to a format adapted to a particular display device, such as a lower frame-rate display device. For example, the conversion device can generate an output video signal formatted in a manner understood by a particular destination display device, which takes into account physical characteristics of the destination device (e.g., frame-rate, frame overlap). The video conversion device can also compensate for longer overlap intervals exhibited by the display device by generating and transmitting a shuttering signal that shutters (or occludes) a user's view of the display device during frame overlap intervals. Thus, the video conversion device can facilitate viewing of 3D content on a broad range of display devices, including display devices that have lower frame-rates (and longer frame overlap intervals), while overcoming undesirable effects, such as motion blurring and ghosting.
- One or more implementations also provide for an accessory device (e.g., a Universal Serial Bus (USB) device) that can enable a variety of computing systems to send 3D content to lower frame-rate displays. Thus, the accessory device can extend the functionality of existing devices (e.g., general purpose computing systems, gaming systems, tablet computers, smart phones, set-top boxes, optical disc players, etc.) to enable the devices to send 3D content to attached or integrated display devices (which can have lower frame-rates). Similarly, one or more implementations also provide methods for configuring computing systems to convert/send 3D content to lower frame-rate display devices.
-
FIG. 1 , for example, illustrates a cutout diagram of a video conversion device 100, along with schematic representations of internal components of the video conversion device 100, in accordance with one or more implementations of the invention. While the illustrated housing is rectangular, the housing can take virtually any form. The video conversion device 100 can convert incoming 3D content (e.g., 3D content received from an optical disc player, a set-top box, a gaming device, the Internet, etc.) to a format adapted for an attached display device. The video conversion device 100 can function as a standalone device positioned between one or more media devices and a particular destination display device. In one or more implementations, however, other devices (e.g., a media device or a display device) can incorporate functionality of the video conversion device 100. - The
video conversion device 100 can include a video signal input interface device 102 (video input port) adapted to receive a 3D video signal. As indicated, the video input port 102 can include any number of constituent input ports or interface devices. For instance, the video input port 102 can include one or more digital video input devices, such as High-Definition Multimedia Interface (HDMI) input(s) 102 a, DisplayPort input(s) 102 b, Digital Visual Interface (DVI) input(s) 102 c, etc. While not shown, the video input port 102 can also include any number of analog video inputs, such as composite, component, or coaxial inputs, to name a few. As indicated by the ellipses 102 e, the video conversion device 100 can include any appropriate type and number of input ports, such as a LAN/WAN input 102 d (e.g., RJ-45 or WIFI). - Similarly,
FIG. 1 illustrates that the video conversion device 100 can include a video signal output interface device 104 (video output port) adapted to send an output video signal to a particular destination display device. Like the video input port 102, the video output port 104 can include any number of appropriate constituent output ports or interfaces. In the illustrated case, for instance, the video output port 104 can include one or more composite outputs 104 a, one or more component outputs 104 b, and/or one or more HDMI outputs 104 c. The video output port 104 can, however, include any other appropriate type and number of constituent output ports (as indicated by the ellipses 104 d). In some implementations, the video input port 102 and the video output port 104 can share constituent input/output ports dynamically configured as an input or an output. - The
video conversion device 100 can include any number of other appropriate external interface devices/ports. For instance, the video conversion device 100 can include one or more user input device(s) 108. As indicated, the user input device(s) 108 can include any combination of switches 108 a, buttons 108 b, wireless receivers 108 c (e.g., Infrared, BLUETOOTH, WIFI), wired receivers 108 d (e.g., USB), or others 108 e. The video conversion device 100 can also include one or more transmitter devices 110 (e.g., a shuttering signal transmitter device adapted to transmit a shuttering signal). As shown, the transmitter device 110 can comprise an infrared transmitter device 110 a, a BLUETOOTH transmitter device 110 b, a WIFI transmitter device 110 c, or another transmitter device 110 d. Use of the user input device(s) 108 and the transmitter device 110 is discussed in more detail hereinafter. - The
video conversion device 100 can also include one or more processing units 106. For convenience in description, FIG. 1 illustrates a single processing unit 106 that includes a plurality of constituent processing components or modules (106 a, 106 b, 106 c, 106 d, 106 e, 106 f), but the invention is not so limited. Instead, the video conversion device 100 can include any number of processing units, which can each perform a single function, or a variety of functions. The video conversion device 100 can, for example, include one or more programmable processing units, such as Field-programmable Gate Arrays (FPGA) or microcontrollers (e.g., PIC, ATmega, etc.). In one or more implementations, the video conversion device 100 can also include one or more dedicated processing units or a combination of software-configurable processing units and dedicated processing units. When multiple processing units are used, any appropriate communications mechanisms (e.g., on-chip connection, busses, etc.) can couple the processing units together. - As indicated by the arrows in
FIG. 1 , any appropriate communications channel(s) (e.g., one or more buses) can communicatively couple the processing units 106 with the other components, such as the video input port 102, the video output port 104, the user input device(s) 108, the transmitter 110, and any other components. The video input port 102 can receive 3D content via an input video signal (in either an analog or a digital format), and communicate the received 3D video signal to the processing units 106. The processing units 106 can convert the 3D video signal to an output format adapted to, or suited for, a particular display device (e.g., display device 202 , FIG. 2 ) attached to the video output port 104. This conversion can take into account physical characteristics of the display device, and can comprise using a customized encoding format, frame-rate, frame size, etc. in the output format. After conversion, the processing units 106 can send the converted 3D content to the particular destination display device via an output video signal sent through the video output port 104. - To perform the foregoing conversion, the
processing units 106 can include a plurality of constituent processing units, can implement a plurality of logical modules or components, or can use a combination of the two. The processing units 106 can, for instance, include a decoder 106a and an encoder 106b. The decoder 106a can receive the 3D video signal (which may comprise any number of various known 3D formats) and decode it into one or more internal buffers for conversion to the output format. For example, the 3D video signal can comprise one or more video frames that encode left-perspective image data and right-perspective image data. The decoder 106a can detect the format used to encode this data, and then decode left-perspective data into one buffer and decode right-perspective data into another buffer. The decoder 106a can encode standard (e.g., VESA) content to standard or non-standard formats. - Once at least a portion of the 3D video signal is decoded, the
encoder 106b can encode image data from the buffer(s) into the output format. In one or more implementations, the output format can comprise a sequence of one or more left-perspective video frame(s) alternating with one or more right-perspective video frames, or vice versa. The processing units 106 can first pass the one or more video frames containing image data for one eye (e.g., one or more left-perspective frames) to the video output port 104 via an output video signal. The processing units 106 can subsequently pass one or more video frames containing image data for the other eye (e.g., one or more right-perspective frames) to the video output port 104 via the output video signal. - As part of the conversion process, the
processing units 106 can employ a digital-to-analog converter (DAC) and/or an analog-to-digital converter (ADC). The decoder 106a and/or the encoder 106b can include these converters. Alternatively, the converters can be one or more separate components (e.g., DAC/ADC 106c). When the received 3D video signal is digital (e.g., from an HDMI connection), the processing units 106 can use the DAC to produce an output video signal that is analog (e.g., component or composite). Such converters can allow for the conversion and display of 3D content on older devices that may not have the capability of receiving digital content. The inverse is also true, and the processing units 106 can convert an analog 3D video signal to an output video signal that is digital using the ADC. - The
processing units 106 can also include a shuttering signal generator component 106d, which can generate a shuttering (or sync) signal to assist with 3D viewing. The video conversion device 100 can transmit the generated shuttering signal to one or more shuttering devices (e.g., stereographic shuttering glasses 204, FIG. 2), using the transmitter 110, prior to, or concurrently with, sending the output format to the destination display device. The shuttering signal can instruct the shuttering device(s) to shutter (or occlude) portions of a user's view of the output video signal when displayed at the destination display device, to provide the illusion of 3D content display. As discussed in more detail later, the shuttering signal can include one or more inter-frame shuttering instructions, which can enhance 3D content viewing on lower frame-rate devices. - As mentioned, while encoding (and decoding) the
processing units 106 can adapt the output video signal for a particular destination display device. This can include any combination of generating customized types of output video frames (e.g., interlaced or progressive) and/or generating customized sizes of output video frames (e.g., 480, 720, or 1080 vertical lines). This can also include generating output video frames at a target frame-rate (e.g., 60 Hz, 120 Hz). When generating output video frames at a target frame-rate, the processing units 106 can send the frames to the video output port 104 at a rate that would cause the display device to receive a target number of frames per second. - In order to adapt the output format to the destination display device, the
processing units 106 can include a detection module or component 106e. The detection module 106e can receive physical characteristic information about the destination display device and provide this information to the other modules or components, such as the decoder 106a and/or the encoder 106b. The detection module 106e can receive the physical characteristic information from a user via the user input device(s) 108, or directly from the display device (e.g., via an HDMI connection). The physical characteristic information can include any appropriate information, such as frame size and frame-rate capabilities of the display device, an inter-frame overlap interval of the display device, etc. - In one or more implementations, receiving physical characteristic information via the user input device(s) 108 can involve receiving specific physical characteristic information about the particular destination display device. The user can, for example, use a wireless remote control to enter or select a make and model of the particular destination display device. The
detection module 106e can use this information to look up the physical characteristics of the particular destination display device from a local or remote database. Alternatively, the user can use buttons or switches on the video conversion device 100 to select particular physical characteristics of the particular destination display device (e.g., frame rate). - In one or more implementations, receiving physical characteristic information via the user input device(s) 108 can also involve inference and/or learning techniques. In a configuration mode, for instance, the
video conversion device 100 can send configuration information for display at the display device in various different formats, while also sending a corresponding shuttering signal to a shuttering device. The user can then provide appropriate feedback about his or her perception of the displayed configuration information, as viewed through the shuttering device, via buttons, wireless communication, etc. Based on the sent configuration information, and the corresponding feedback received, the video conversion device 100 can infer the physical characteristics of the display device. - One will appreciate in light of the disclosure herein that the
video conversion device 100 can include any number of additional physical or software-based components or modules (as indicated by the ellipses 106f), or can contain a fewer number of components or modules. Accordingly, the video conversion device 100 can depart from the illustrated form without departing from the scope of this disclosure. As mentioned, other devices may incorporate functionality of the video conversion device. In these instances, the video conversion device may not include some of the illustrated components. For example, if a media device incorporates functionality of the video conversion device 100, the video input port 102 may or may not be included. -
FIG. 2 illustrates an operating environment 200 in which the video conversion device 100 may function in accordance with one or more implementations. As illustrated, the environment 200 can include the video conversion device 100, one or more shuttering devices 204, and a display device 202. The video conversion device 100 can receive a 3D video signal from a media device. The media device can comprise any number of media devices capable of sending 3D content to the video conversion device 100. For example, FIG. 2 illustrates that the media device can comprise a streaming source 206 (e.g., a satellite box, cable box, the Internet), a gaming device (e.g., XBOX 208, PLAYSTATION 210), a player device (e.g., Blu-Ray player 212, DVD player 214) capable of reading media 216 (e.g., optical media), and the like. As indicated herein above, the video conversion device 100 can, itself, comprise one or more media devices. - The
video conversion device 100 can communicate with other devices using any of the hardware components discussed herein above (e.g., the video input port 102, the video output port 104, or the transmitter 110). An appropriate wired (e.g., HDMI, component, composite, coaxial, network) or wireless (BLUETOOTH, Wi-Fi) mechanism can couple the video output port 104 and the display device 202 together. Likewise, an appropriate wired or wireless mechanism can couple the video input port 102 to a media device. Furthermore, an appropriate wireless mechanism (e.g., BLUETOOTH, infrared, etc.) can couple the video conversion device 100 and the shuttering device(s) 204 together. - The
display device 202 can comprise any one of a broad range of display devices that incorporate a variety of display technologies, both current and future (e.g., Cathode Ray, Plasma, LCD, LED, OLED). The display device 202 can take any of a number of forms, such as a television set, a computer display (e.g., desktop computer monitor, laptop computer display, tablet computer display), a handheld display (e.g., cellular telephone, PDA, handheld gaming device, handheld multimedia device), or any other appropriate form. While the display device 202 can have a configuration designed specifically to display 3D content, the destination display device 202 alternatively can comprise a more traditional display device, such as a lower frame-rate device. One will appreciate in light of the disclosure herein that the display device 202 can include both digital and analog display devices. - The shuttering device(s) 204 can comprise any shuttering device configured to interoperate with
video conversion device 100, and to respond to one or more shuttering instructions received via a shuttering signal. In one or more implementations, the shuttering device(s) 204 comprise stereographic shuttering glasses, with lenses that include one or more liquid crystal layers. The liquid crystal layers can have the property of becoming opaque (or substantially opaque) when voltage is applied (or, alternatively, when voltage is removed). The liquid crystal layers can otherwise have the property of being transparent (or substantially transparent) when voltage is removed (or, alternatively, when voltage is applied). The shuttering device(s) 204 can thus apply or remove voltage from the lenses to block the user's view, as instructed by the shuttering signal. - As mentioned herein above, the
video conversion device 100 can generate and send a shuttering signal to one or more shuttering devices 204. FIG. 3 illustrates a flow diagram of the shuttering of the display of 3D video content in response to a shuttering signal, according to one or more implementations. FIG. 3 illustrates three display states 302, 304, and 306. During these display states, the video conversion device 100 sends different portions of the output video signal to the particular destination display device 202. Correspondingly, the video conversion device 100 also transmits appropriate shuttering instructions to the shuttering device(s) 204. - Referring to display
states 302 and 306, the video conversion device 100 can provide the illusion that two-dimensional images encoded in the output video signal are 3D. In state 302, for example, the video conversion device 100 can transmit one or more left-perspective video frames in an output video signal 324 to the display device 202, and can also transmit a shuttering instruction 314 (occlude right) to the shuttering device(s) 204. Thus, when the display device 202 displays a left-perspective image 308, a shuttering component 322 can occlude the viewer's right eye view of the display device 202. Similarly, in state 306, the video conversion device 100 can transmit one or more right-perspective video frames in the output signal 324 to the display device 202, and can also transmit a shuttering instruction 318 (occlude left) to the shuttering device(s) 204. Thus, when the display device 202 displays a right-perspective image 312, the shuttering component 320 can occlude the viewer's left eye view of the display device 202. - Alternatively, the
video conversion device 100 can reverse the images and shuttering instructions of states 302 and 306. In state 302, for example, the video conversion device 100 can alternatively send a right-perspective image to the display device 202 and can send an "occlude left" instruction to the shuttering device(s) 204. Similarly, in state 306, the video conversion device 100 can send a left-perspective image to the display device 202 and can send an "occlude right" instruction to the shuttering device(s) 204. One will appreciate in light of the disclosure herein that the illustrated sequence of images and instructions is not limiting. - While display states 302 and 306 provide the illusion of 3D content display, one or more implementations introduce a
third display state 304, during which the video conversion device 100 occludes an inter-frame overlap 310. Inter-frame overlap 310 occurs after the video conversion device 100 has fully transmitted image data for one eye (e.g., left-perspective video frames), and has begun to transmit image data for the other eye (e.g., right-perspective video frames). During inter-frame overlap, physical limitations of the display device 202 can cause portions of the different frames to "blend," so that portions of both the left and right perspective images are concurrently displayed. The video conversion device 100 can occlude at least a portion of this overlap by transmitting a shuttering instruction 316 (occlude both) to the shuttering device(s) 204, which causes the shuttering device(s) 204 to occlude both eyes concurrently. - Inter-frame shuttering, or the occlusion of both eyes during inter-frame overlap intervals, can enhance the clarity of the perceived 3D image. Inter-frame shuttering can reduce or eliminate the undesirable effects common to 3D content display, such as motion blurring and ghosting. Thus, inter-frame shuttering techniques, when synchronously combined with the creation of an output video signal adapted to a particular display device, can allow for viewing of 3D content on display devices that may have lower frame-rates and/or longer frame overlap intervals.
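As an illustrative sketch (not part of the original disclosure), the mapping from the three display states of FIG. 3 to shuttering instructions 314, 316, and 318 can be expressed as a simple lookup; the state and instruction names here are hypothetical labels chosen for the example:

```python
# Hypothetical sketch of a shuttering signal generator (in the spirit of
# component 106d). All identifiers are illustrative, not from the patent.

OCCLUDE_RIGHT = "occlude_right"  # instruction 314: left-perspective image shown
OCCLUDE_BOTH = "occlude_both"    # instruction 316: inter-frame overlap 310
OCCLUDE_LEFT = "occlude_left"    # instruction 318: right-perspective image shown

STATE_TO_INSTRUCTION = {
    "left_displayed": OCCLUDE_RIGHT,
    "inter_frame_overlap": OCCLUDE_BOTH,
    "right_displayed": OCCLUDE_LEFT,
}

def shuttering_signal(states):
    """Translate a sequence of display states into shuttering instructions."""
    return [STATE_TO_INSTRUCTION[state] for state in states]
```

For one left-to-right cycle, `shuttering_signal(["left_displayed", "inter_frame_overlap", "right_displayed"])` yields `["occlude_right", "occlude_both", "occlude_left"]`, matching the 314/316/318 sequence of states 302, 304, and 306.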
-
FIG. 4 illustrates a timing diagram 400 of the transmission of an output video signal to a destination display device 202, and the transmission of a shuttering signal to shuttering device(s) 204, in accordance with one or more implementations. At a time 402, the video conversion device 100 begins transmitting left-perspective video frame(s) 410 to the destination display device 202. After potentially passing through an inter-frame overlap display state 304 (and sending a corresponding shuttering instruction 316), the destination display device 202 displays only the left frame(s) 410 at a time 404. Thus, beginning at (or near) time 404 the video conversion device 100 can instruct the shuttering device(s) 204 to occlude the user's right eye with an appropriate shuttering instruction 314 (occlude right). - The
video conversion device 100 can cease transmitting the left frame(s) 410 at a time 406, and begin transmitting right-perspective video frame(s) 412. The video conversion device 100 can base the timing of the transition between the left and right frames on a target frame-rate of the output video signal, which is adapted to the destination display device 202. Based on physical characteristic information about the destination display device 202, the video conversion device 100 can determine a display state 304 from time 406 to a time 408. During this period, the display device 202 will display an inter-frame overlap (310, FIG. 3) as the display device transitions between displaying the left frame(s) 410 and displaying the right frame(s) 412. Thus, from time 406 to time 408, the video conversion device 100 can send an inter-frame shuttering instruction 316 (occlude both). As discussed, the inter-frame shuttering instruction 316 can shutter at least a portion of the inter-frame overlap (310, FIG. 3) from the user's view. - Next, during
display state 306, the destination display device 202 will have transitioned past the inter-frame overlap 310 and will display only the right frame(s) 412. The video conversion device 100 can send an appropriate shuttering instruction 318 (occlude left). Subsequently, the video conversion device 100 can send other left frame(s), other right frame(s), and so on. These frames can include new image data from the received 3D video signal, or can include the same data sent previously (i.e., to increase the frame-rate of the output signal). Correspondingly, the video conversion device 100 can send corresponding shuttering instructions (as shown). - One will also appreciate that, while
FIG. 4 illustrates a series of alternating left and right frames (in any order), one or more implementations extend to any sequence of video frames. Furthermore, in some instances, the shuttering signal can instruct the shuttering device(s) 204 to occlude an entire time period. In other instances, however, the shuttering signal can instruct the shuttering device(s) 204 to occlude only a portion of a corresponding time period. The shuttering signal can also instruct the shuttering device(s) 204 to occlude more than a corresponding time period, or can include other shuttering instructions, such as a shuttering instruction that causes the shuttering device(s) 204 to refrain from occluding any portion of the user's view. - One or more implementations also extend to devices adapted for use in configuring a computing system to convert 3D video content for low frame-rate display devices.
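The occlusion timing variants just described — occluding an entire time period, only a portion of it, or more than it — can be sketched as a single window computation. This is a hypothetical illustration under assumed names, not the patent's implementation:

```python
def occlusion_window(period_start, period_end, coverage=1.0):
    """Compute the (start, end) interval during which a shuttering device
    stays opaque within one time period of the output video signal.

    coverage == 1.0  -> occlude the entire period
    0 < coverage < 1 -> occlude only the central portion of the period
    coverage > 1     -> extend the occlusion symmetrically beyond the period
    coverage == 0    -> refrain from occluding (zero-length window)
    """
    duration = period_end - period_start
    center = period_start + duration / 2.0
    half = (duration * coverage) / 2.0
    return (center - half, center + half)
```

For a period running from 0 to 10 time units, `occlusion_window(0.0, 10.0)` covers the whole period, `occlusion_window(0.0, 10.0, 0.5)` covers only its middle half, and `occlusion_window(0.0, 10.0, 2.0)` extends the opaque window beyond both boundaries.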
FIG. 5A, for example, illustrates a cutout diagram of an accessory device 500, along with schematic representations of internal components of the accessory device 500, in accordance with one or more implementations. While the illustrated accessory device 500 is a USB device that would fit in the palm of the hand of a user, the accessory device 500 can take any number of alternate forms without departing from the disclosure herein. For example, the accessory device 500 may also take the form of an IEEE 1394 (FIREWIRE, I.LINK, LYNX) device, an APPLE Dock device, etc. - The
accessory device 500 can include a variety of constituent components. In one or more implementations, the accessory device 500 can include an interface device 502 adapted to communicatively interface with the associated computing system (e.g., a USB interface, an IEEE 1394 interface, an APPLE Dock interface). The accessory device 500 can also include a shuttering signal transmission device 504 (transmitter). Like the transmitter 110 of the video conversion device 100, the transmitter 504 can transmit a shuttering signal to stereographic shuttering devices, and can use any appropriate signal type (e.g., Infrared, BLUETOOTH, WIFI). The associated computing system can process/convert 3D video content, generate a shuttering signal, and send the generated shuttering signal to one or more shuttering devices via the transmitter 504 on the accessory device 500. - The associated computing system can run computer-executable instructions received as part of, or separate from, the
accessory device 500. For example, the associated computing system can receive instructions via a storage device provided at the associated computing system (e.g., a CD-ROM, FLASH memory, etc.), via an Internet download, etc. Alternatively, the associated computing system can receive instructions from the accessory device 500. The accessory device 500 can include, for example, one or more computerized storage devices 506 storing computer-executable instructions. - The stored computer-executable instructions can instruct one or more processing units to convert a 3D video signal to a format adapted for display on a particular destination display device. The instructions can, for instance, instruct one or more processors at the associated computing system to perform the conversion. The instructions can also instruct one or
more processing units 508 on the accessory device 500 to perform, or to help perform, the conversion. In this manner, the accessory device 500 can offload some or all of the computation needed to perform the conversion from the associated computing system. - The stored computer-executable instructions can also cause one or more processing units to generate a shuttering signal. The shuttering signal can include one or more inter-frame shuttering instructions for shuttering an inter-frame transition between first-eye and second-eye content (as discussed in connection with
FIGS. 3 and 4). The stored computer-executable instructions can instruct processing units at the associated computing system or on the accessory device 500 to generate the shuttering signal (in whole or in part). Thus, the accessory device 500 can instruct the associated computing system to generate the shuttering signal, or can generate the shuttering signal itself. The stored computer-executable instructions can also cause the transmitter 504 to send the generated shuttering signal to one or more stereographic shuttering devices 204. - As indicated, the
accessory device 500 can include one or more processors or processing units 508. Similar to the video conversion device 100, these processing units 508 can comprise any number of processing units, which can each perform a single function, or a variety of functions. The processing units 508 can thus comprise programmable processing units (e.g., FPGAs, microcontrollers), dedicated processing units, or a combination of each. In addition, an appropriate communications channel 510 (e.g., one or more buses) can couple each component of the accessory device 500. Additionally, similar to the video conversion device 100, the processing units 508 can implement a series of processing components, such as decoder(s), encoder(s), ADC, DAC, etc. -
FIG. 5B illustrates a few exemplary operating environments in which the accessory device 500 can enable the sending of 3D content to low frame-rate display devices. Item 512, for example, illustrates that the accessory device 500 can enable a general-purpose computing system (e.g., a laptop or desktop computer) to convert/send 3D content to an integrated display. The accessory device can operate with virtually any operating system, such as MICROSOFT WINDOWS, APPLE MAC OS, LINUX, UNIX, etc. In this context, the accessory device 500 can enable the general-purpose computing system 512 to receive 3D content from any appropriate source (e.g., solid state media, optical media, the Internet), to convert the 3D content to an output video signal adapted to the attached display device, to generate a shuttering signal, and to send the output video signal and the shuttering signal to their respective devices. - The
accessory device 500 can also enable 3D content conversion and/or display with other devices as well, such as a gaming device 514 (e.g., XBOX, PLAYSTATION), a DVD/BLU-RAY player 516, or a tablet computer 518. In each environment, the accessory device 500 can include hardware interfaces and/or computer instructions customized to the particular device. It is noted that more specialized devices (e.g., DVD/BLU-RAY players 516 or tablet computers 518) may have limited processing or configuration capabilities. Thus, the inclusion of processing units 508 on the accessory device 500 can enable these devices to process/convert 3D content. - Implementations of the present invention can also be described in terms of flowcharts comprising one or more acts in a method for accomplishing a particular result.
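One hedged sketch of such a method — pairing the conversion of (left, right) frame pairs into an alternating output sequence with a shuttering schedule that covers each inter-frame transition — could look as follows; the function name, frame representation, and instruction strings are all assumptions made for this example:

```python
def convert_and_shutter(frame_pairs):
    """Sketch of the two acts described in this disclosure: convert an input
    3D signal (here, a list of (left, right) frame pairs) into an alternating
    output sequence, and emit a parallel shuttering schedule that occludes
    both eyes during each inter-frame transition."""
    output, schedule = [], []
    for left, right in frame_pairs:
        output.append(("L", left))
        schedule.append("occlude_right")  # left-perspective frame is visible
        schedule.append("occlude_both")   # shutter the left-to-right transition
        output.append(("R", right))
        schedule.append("occlude_left")   # right-perspective frame is visible
        schedule.append("occlude_both")   # shutter the transition to the next frame
    return output, schedule
```

For a single stereo pair, the output alternates one left frame and one right frame, while the schedule interleaves an "occlude both" instruction at every transition, mirroring the 314/316/318 sequencing of FIGS. 3 and 4.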
FIG. 6, for instance, illustrates a flowchart of a computerized method for converting 3D video content for low frame-rate displays. The acts of FIG. 6 are described with respect to the schematics, diagrams, devices, and components shown in FIGS. 1-5B. - As illustrated, a method can comprise an
act 602 of converting an input 3D video signal. Act 602 can include converting an input 3D video signal to an output video signal which includes an alternating sequence of one or more first video frames that include a first image for viewing by a first eye and one or more second video frames that include a second image for viewing by a second eye. For example, the video conversion device 100 can receive a 3D video signal and convert the received 3D content to a format adapted for a particular destination display device. Similarly, the accessory device 500 can configure a general or special purpose computing system to convert 3D video content. The accessory device can, for example, include computer-executable instructions which cause processors at the computing system, or at the accessory device 500, to convert 3D content to an output signal adapted for a particular display device. As disclosed, adapting the output signal can include determining an optimal frame rate for the display device, which can comprise a low frame-rate display device. - The illustrated method can also comprise an
act 604 of generating an inter-frame shuttering signal. Act 604 can include generating an inter-frame shuttering signal configured to instruct a shuttering device to concurrently shutter both the first eye and the second eye during a display of an inter-frame transition. The inter-frame transition can comprise a period during which at least a portion of the one or more first video frames and at least a portion of the one or more second video frames are displayed concurrently. For example, the video conversion device 100 or the accessory device 500 can generate, or cause an associated computing system to generate, a shuttering signal that includes a plurality of shuttering instructions. As illustrated in FIGS. 3 and 4, the shuttering signal can include one or more inter-frame shuttering instructions (316) which instruct shuttering device(s) 204 to occlude both of a viewer's eyes during an inter-frame transition. - Accordingly,
FIGS. 1-6 provide a number of components and mechanisms for sending 3D video content to a broad range of display devices. One or more disclosed implementations allow for viewing of 3D video content on a broad range of display devices, including devices that may have lower frame-rates and longer frame overlap intervals, or that are not otherwise specifically designed for displaying 3D video content. - The implementations of the present invention can comprise special purpose or general-purpose computing systems. Computing systems may, for example, comprise handheld devices, appliances, laptop computers, desktop computers, mainframes, distributed computing systems, or even devices that have not conventionally been considered computing systems, such as DVD players, BLU-RAY players, gaming systems, and video converters. In this description and in the claims, the term “computing system” is defined broadly as including any device or system (or combination thereof) that includes at least one physical and tangible processor, and a physical and tangible memory capable of having thereon computer-executable instructions, which the processor may execute.
- The memory may take any form and may depend on the nature and form of the computing system. A computing system may be distributed over a network environment and may include multiple constituent computing systems. In its most basic configuration, a computing system typically includes at least one processing unit and memory. The memory may be physical system memory, which may be volatile, non-volatile, or some combination of the two. The term “memory” may also be used herein to refer to non-volatile mass storage such as physical storage media. If the computing system is distributed, the processing, memory and/or storage capability may be distributed as well. As used herein, the term “module” or “component” can refer to software objects or routines that execute on the computing system. The different components, modules, engines, and services described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads).
- Implementations of the present invention may comprise or utilize a special purpose or general-purpose computer including computer hardware, such as, for example, one or more processors and system memory, as discussed in greater detail below. Embodiments within the scope of the present invention also include physical and other computer-readable media for carrying or storing computer-executable instructions and/or data structures. Such computer-readable media can be any available media that can be accessed by a general purpose or special purpose computer system. Computer-readable media that store computer-executable instructions are physical storage media. Computer-readable media that carry computer-executable instructions are transmission media. Thus, by way of example, and not limitation, embodiments of the invention can comprise at least two distinctly different kinds of computer-readable media: computer storage media and transmission media.
- Computer storage media includes RAM, ROM, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, or any other medium which can be used to store desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer.
- A “network” is defined as one or more data links that enable the transport of electronic data between computer systems and/or modules and/or other electronic devices. When information is transferred or provided over a network or another communications connection (either hardwired, wireless, or a combination of hardwired or wireless) to a computer, the computer properly views the connection as a transmission medium. Transmission media can include a network and/or data links which can be used to carry desired program code means in the form of computer-executable instructions or data structures and which can be accessed by a general purpose or special purpose computer. Combinations of the above should also be included within the scope of computer-readable media.
- Further, upon reaching various computer system components, program code means in the form of computer-executable instructions or data structures can be transferred automatically from transmission media to computer storage media (or vice versa). For example, computer-executable instructions or data structures received over a network or data link can be buffered in RAM within a network interface module (e.g., a “NIC”), and then eventually transferred to computer system RAM and/or to less volatile computer storage media at a computer system. Thus, it should be understood that computer storage media can be included in computer system components that also (or even primarily) utilize transmission media.
- Computer-executable instructions comprise, for example, instructions and data which, when executed at a processor, cause a general purpose computer, special purpose computer, or special purpose processing device to perform a certain function or group of functions. The computer executable instructions may be, for example, binaries, intermediate format instructions such as assembly language, or even source code. Although the subject matter has been described in language specific to structural features and/or methodological acts, it is to be understood that the subject matter defined in the appended claims is not necessarily limited to the described features or acts described above. Rather, the described features and acts are disclosed as example forms of implementing the claims.
- Those skilled in the art will appreciate that the invention may be practiced in network computing environments with many types of computer system configurations, including personal computers, desktop computers, laptop computers, message processors, hand-held devices, multi-processor systems, microprocessor-based or programmable consumer electronics, network PCs, minicomputers, mainframe computers, mobile telephones, PDAs, pagers, routers, switches, and the like. The invention may also be practiced in distributed system environments where local and remote computer systems, which are linked (either by hardwired data links, wireless data links, or by a combination of hardwired and wireless data links) through a network, both perform tasks. In a distributed system environment, program modules may be located in both local and remote memory storage devices.
- The present invention may be embodied in other specific forms without departing from its spirit or essential characteristics. The described embodiments are to be considered in all respects only as illustrative and not restrictive. The scope of the invention is, therefore, indicated by the appended claims rather than by the foregoing description. All changes which come within the meaning and range of equivalency of the claims are to be embraced within their scope.
Claims (20)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/379,613 US20120140034A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
Applications Claiming Priority (7)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US41670810P | 2010-11-23 | 2010-11-23 | |
PCT/US2011/025262 WO2012071063A1 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3d signal |
PCT/US2011/027175 WO2012071064A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
PCT/US2011/027933 WO2012071066A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
PCT/US2011/027981 WO2012071067A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
US13/379,613 US20120140034A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
PCT/US2011/031115 WO2012071072A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2011/025262 Continuation-In-Part WO2012071063A1 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3d signal |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120140034A1 true US20120140034A1 (en) | 2012-06-07 |
Family
ID=46146151
Family Applications (6)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/377,132 Expired - Fee Related US8553072B2 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3D signal |
US13/378,649 Abandoned US20120140032A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
US13/378,975 Abandoned US20120140051A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
US13/378,981 Abandoned US20120140033A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
US13/379,613 Abandoned US20120140034A1 (en) | 2010-11-23 | 2011-04-04 | Device for displaying 3d content on low frame-rate displays |
US13/379,317 Abandoned US20120147160A1 (en) | 2010-11-23 | 2011-04-14 | Adaptive 3-d shuttering devices |
Family Applications Before (4)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/377,132 Expired - Fee Related US8553072B2 (en) | 2010-11-23 | 2011-02-17 | Blanking inter-frame transitions of a 3D signal |
US13/378,649 Abandoned US20120140032A1 (en) | 2010-11-23 | 2011-03-04 | Formatting 3d content for low frame-rate displays |
US13/378,975 Abandoned US20120140051A1 (en) | 2010-11-23 | 2011-03-10 | Shuttering the display of inter-frame transitions |
US13/378,981 Abandoned US20120140033A1 (en) | 2010-11-23 | 2011-03-10 | Displaying 3d content on low frame-rate displays |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US13/379,317 Abandoned US20120147160A1 (en) | 2010-11-23 | 2011-04-14 | Adaptive 3-d shuttering devices |
Country Status (2)
Country | Link |
---|---|
US (6) | US8553072B2 (en) |
WO (6) | WO2012071063A1 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE102012108685A1 (en) * | 2012-09-17 | 2014-03-20 | Karlheinz Gelhardt | Multi-converter for digital, stereoscopic video signals, has converting unit converting video signal into multiple given formats for digital, stereoscopic video signals and communicating with another converting unit over common data bus |
US20140089398A1 (en) * | 2011-05-27 | 2014-03-27 | Huawei Technologies Co., Ltd. | Media sending method, media receiving method, and client and system |
US9494975B1 (en) * | 2011-03-28 | 2016-11-15 | Amazon Technologies, Inc. | Accessory device identification method |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8947503B2 (en) * | 2009-12-08 | 2015-02-03 | Broadcom Corporation | Method and system for processing 3-D video |
JP5821259B2 (en) * | 2011-04-22 | 2015-11-24 | セイコーエプソン株式会社 | Image display system, image display device, 3D glasses, and image display method |
US8724662B2 (en) * | 2012-06-25 | 2014-05-13 | Johnson & Johnson Vision Care, Inc. | Wireless communication protocol for low power receivers |
GB2508413A (en) * | 2012-11-30 | 2014-06-04 | Nordic Semiconductor Asa | Stereoscopic viewing apparatus and display synchronization |
EP2778747A3 (en) * | 2013-03-15 | 2014-11-26 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens viewing sets for three-dimensional perception of stereoscopic media |
US9873233B2 (en) | 2013-03-15 | 2018-01-23 | Johnson & Johnson Vision Care, Inc. | Ophthalmic lens viewing sets for three-dimensional perception of stereoscopic media |
US10194163B2 (en) | 2014-05-22 | 2019-01-29 | Brain Corporation | Apparatus and methods for real time estimation of differential motion in live video |
US9939253B2 (en) * | 2014-05-22 | 2018-04-10 | Brain Corporation | Apparatus and methods for distance estimation using multiple image sensors |
US9713982B2 (en) | 2014-05-22 | 2017-07-25 | Brain Corporation | Apparatus and methods for robotic operation using video imagery |
US9842551B2 (en) | 2014-06-10 | 2017-12-12 | Apple Inc. | Display driver circuitry with balanced stress |
US10057593B2 (en) * | 2014-07-08 | 2018-08-21 | Brain Corporation | Apparatus and methods for distance estimation using stereo imagery |
US10055850B2 (en) | 2014-09-19 | 2018-08-21 | Brain Corporation | Salient features tracking apparatus and methods using visual initialization |
US10334223B2 (en) * | 2015-01-30 | 2019-06-25 | Qualcomm Incorporated | System and method for multi-view video in wireless devices |
CN104994372B (en) * | 2015-07-03 | 2017-03-29 | 深圳市华星光电技术有限公司 | A kind of 3D display systems |
US10197664B2 (en) | 2015-07-20 | 2019-02-05 | Brain Corporation | Apparatus and methods for detection of objects using broadband signals |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20040218269A1 (en) * | 2002-01-14 | 2004-11-04 | Divelbiss Adam W. | General purpose stereoscopic 3D format conversion system and method |
WO2007126904A2 (en) * | 2006-03-29 | 2007-11-08 | Nvidia Corporation | System, method, and computer program product for controlling stereo glasses shutters |
US20110187840A1 (en) * | 2010-02-04 | 2011-08-04 | Chunghwa Picture Tubes, Ltd. | Three dimensional display and method thereof |
US20120050154A1 (en) * | 2010-08-31 | 2012-03-01 | Adil Jagmag | Method and system for providing 3d user interface in 3d televisions |
Family Cites Families (43)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5821989A (en) * | 1990-06-11 | 1998-10-13 | Vrex, Inc. | Stereoscopic 3-D viewing system and glasses having electrooptical shutters controlled by control signals produced using horizontal pulse detection within the vertical synchronization pulse period of computer generated video signals |
JPH06350978A (en) | 1993-06-11 | 1994-12-22 | Sanyo Electric Co Ltd | Video signal converter |
US6057811A (en) * | 1993-09-28 | 2000-05-02 | Oxmoor Corporation | 3-D glasses for use with multiplexed video images |
JPH0879799A (en) | 1994-09-05 | 1996-03-22 | Sony Corp | Stereoscopic display system, and its synchronizing signal transmitter and synchronizing signal receiver |
US5572250A (en) | 1994-10-20 | 1996-11-05 | Stereographics Corporation | Universal electronic stereoscopic display |
US5610661A (en) | 1995-05-19 | 1997-03-11 | Thomson Multimedia S.A. | Automatic image scanning format converter with seamless switching |
JPH09139957A (en) | 1995-11-14 | 1997-05-27 | Mitsubishi Electric Corp | Graphic display device |
DE69633705T2 (en) * | 1995-11-16 | 2006-02-02 | Ntt Mobile Communications Network Inc. | Method for detecting a digital signal and detector |
US6088052A (en) | 1997-01-08 | 2000-07-11 | Recherches Point Lab Inc. | 3D stereoscopic video display system |
DE19806547C2 (en) | 1997-04-30 | 2001-01-25 | Hewlett Packard Co | System and method for generating stereoscopic display signals from a single computer graphics pipeline |
JPH1169384A (en) | 1997-08-08 | 1999-03-09 | Olympus Optical Co Ltd | Video signal type decision processor |
JP3448467B2 (en) | 1997-09-19 | 2003-09-22 | 三洋電機株式会社 | LCD shutter glasses driving device |
KR100381817B1 (en) | 1999-11-17 | 2003-04-26 | 한국과학기술원 | Generating method of stereographic image using Z-buffer |
DE10016074B4 (en) | 2000-04-01 | 2004-09-30 | Tdv Technologies Corp. | Method and device for generating 3D images |
EP1334623A2 (en) * | 2000-10-12 | 2003-08-13 | Reveo, Inc. | 3d projection system with a digital micromirror device |
US20020070932A1 (en) | 2000-12-10 | 2002-06-13 | Kim Jesse Jaejin | Universal three-dimensional graphics viewer for resource constrained mobile computers |
CA2380105A1 (en) * | 2002-04-09 | 2003-10-09 | Nicholas Routhier | Process and system for encoding and playback of stereoscopic video sequences |
US7511714B1 (en) | 2003-11-10 | 2009-03-31 | Nvidia Corporation | Video format conversion using 3D graphics pipeline of a GPU |
US20090051759A1 (en) * | 2005-05-27 | 2009-02-26 | Adkins Sean M | Equipment and methods for the synchronization of stereoscopic projection displays |
CN101001320A (en) | 2006-01-12 | 2007-07-18 | 万里科技股份有限公司 | System of automatic 3D image generating and automatic image format conversion |
JP2007200116A (en) | 2006-01-27 | 2007-08-09 | Manri Kagi Kofun Yugenkoshi | System for 3d image automatic generation and image format automatic conversion |
US7724211B2 (en) * | 2006-03-29 | 2010-05-25 | Nvidia Corporation | System, method, and computer program product for controlling stereo glasses shutters |
JP2007324830A (en) * | 2006-05-31 | 2007-12-13 | Toshiba Corp | Frame rate converting device, and frame rate converting method |
US8717348B2 (en) * | 2006-12-22 | 2014-05-06 | Texas Instruments Incorporated | System and method for synchronizing a viewing device |
US20090207167A1 (en) * | 2008-02-18 | 2009-08-20 | International Business Machines Corporation | Method and System for Remote Three-Dimensional Stereo Image Display |
JP5338166B2 (en) | 2008-07-16 | 2013-11-13 | ソニー株式会社 | Transmitting apparatus, stereoscopic image data transmitting method, receiving apparatus, and stereoscopic image data receiving method |
US20100026794A1 (en) * | 2008-07-30 | 2010-02-04 | Sin-Min Chang | Method, System and Apparatus for Multiuser Display of Frame-Sequential Images |
JP4606502B2 (en) | 2008-08-07 | 2011-01-05 | 三菱電機株式会社 | Image display apparatus and method |
CA2684513A1 (en) * | 2008-11-17 | 2010-05-17 | X6D Limited | Improved performance 3d glasses |
JP2010139855A (en) | 2008-12-12 | 2010-06-24 | Sharp Corp | Display device, method for controlling display device, and control program |
US8233035B2 (en) * | 2009-01-09 | 2012-07-31 | Eastman Kodak Company | Dual-view stereoscopic display using linear modulator arrays |
TWI408947B (en) * | 2009-02-13 | 2013-09-11 | Mstar Semiconductor Inc | Image adjusting apparatus and image adjusting method |
EP2439934A4 (en) * | 2009-06-05 | 2014-07-02 | Lg Electronics Inc | Image display device and an operating method therefor |
US20110001805A1 (en) * | 2009-06-18 | 2011-01-06 | Bit Cauldron Corporation | System and method of transmitting and decoding stereoscopic sequence information |
JP5273478B2 (en) | 2009-07-07 | 2013-08-28 | ソニー株式会社 | Video display device and video display system |
KR20110040378A (en) * | 2009-10-14 | 2011-04-20 | 삼성전자주식회사 | Image providing method and image providing apparatus, display apparatus and image providing system using the same |
US20110090324A1 (en) * | 2009-10-15 | 2011-04-21 | Bit Cauldron Corporation | System and method of displaying three dimensional images using crystal sweep with freeze tag |
WO2011052918A2 (en) | 2009-10-30 | 2011-05-05 | Samsung Electronics Co., Ltd. | Two-dimensional/three-dimensional image display apparatus and method of driving the same |
KR101659575B1 (en) | 2009-10-30 | 2016-09-26 | 삼성전자주식회사 | Display apparatus for both 2D and 3D image and method of driving the same |
US9179136B2 (en) * | 2009-11-20 | 2015-11-03 | Broadcom Corporation | Method and system for synchronizing 3D shutter glasses to a television refresh rate |
CN101765024A (en) | 2009-12-22 | 2010-06-30 | 周晓民 | 3D stereoscopic video signal converter and technical application thereof |
KR20110102758A (en) * | 2010-03-11 | 2011-09-19 | 삼성전자주식회사 | 3-dimension glasses, rechargeable cradle, 3-dimension display apparatus and system for charging 3-dimension glasses |
US20120050462A1 (en) * | 2010-08-25 | 2012-03-01 | Zhibing Liu | 3d display control through aux channel in video display devices |
- 2011
- 2011-02-17 US US13/377,132 patent/US8553072B2/en not_active Expired - Fee Related
- 2011-02-17 WO PCT/US2011/025262 patent/WO2012071063A1/en active Application Filing
- 2011-03-04 WO PCT/US2011/027175 patent/WO2012071064A1/en active Application Filing
- 2011-03-04 US US13/378,649 patent/US20120140032A1/en not_active Abandoned
- 2011-03-10 US US13/378,975 patent/US20120140051A1/en not_active Abandoned
- 2011-03-10 WO PCT/US2011/027933 patent/WO2012071066A1/en active Application Filing
- 2011-03-10 WO PCT/US2011/027981 patent/WO2012071067A1/en active Application Filing
- 2011-03-10 US US13/378,981 patent/US20120140033A1/en not_active Abandoned
- 2011-04-04 US US13/379,613 patent/US20120140034A1/en not_active Abandoned
- 2011-04-04 WO PCT/US2011/031115 patent/WO2012071072A1/en active Application Filing
- 2011-04-14 US US13/379,317 patent/US20120147160A1/en not_active Abandoned
- 2011-04-14 WO PCT/US2011/032549 patent/WO2012071073A1/en active Application Filing
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9494975B1 (en) * | 2011-03-28 | 2016-11-15 | Amazon Technologies, Inc. | Accessory device identification method |
US20140089398A1 (en) * | 2011-05-27 | 2014-03-27 | Huawei Technologies Co., Ltd. | Media sending method, media receiving method, and client and system |
DE102012108685A1 (en) * | 2012-09-17 | 2014-03-20 | Karlheinz Gelhardt | Multi-converter for digital, stereoscopic video signals, has converting unit converting video signal into multiple given formats for digital, stereoscopic video signals and communicating with another converting unit over common data bus |
DE102012108685B4 (en) * | 2012-09-17 | 2017-02-16 | Karlheinz Gelhardt | Multi-converter for digital, high-resolution, stereoscopic video signals |
Also Published As
Publication number | Publication date |
---|---|
WO2012071073A1 (en) | 2012-05-31 |
US8553072B2 (en) | 2013-10-08 |
US20120147160A1 (en) | 2012-06-14 |
US20120140033A1 (en) | 2012-06-07 |
US20120140051A1 (en) | 2012-06-07 |
WO2012071066A1 (en) | 2012-05-31 |
WO2012071063A1 (en) | 2012-05-31 |
WO2012071064A1 (en) | 2012-05-31 |
WO2012071072A1 (en) | 2012-05-31 |
US20120140031A1 (en) | 2012-06-07 |
US20120140032A1 (en) | 2012-06-07 |
WO2012071067A1 (en) | 2012-05-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20120140034A1 (en) | Device for displaying 3d content on low frame-rate displays | |
US8994795B2 (en) | Method for adjusting 3D image quality, 3D display apparatus, 3D glasses, and system for providing 3D image | |
EP2577981B1 (en) | Method and apparaus for making intelligent use of active space in frame packing format | |
US8624965B2 (en) | 3D glasses driving method and 3D glasses and 3D image providing display apparatus using the same | |
WO2016098411A1 (en) | Video display device, video display system, and video display method | |
CA2928248A1 (en) | Image display device and image display method, image output device and image output method, and image display system | |
JP2011090079A (en) | Display device, display method and computer program | |
US20090002482A1 (en) | Method for displaying three-dimensional (3d) video and video apparatus using the same | |
US20120002279A1 (en) | 2d quality enhancer in polarized 3d systems for 2d-3d co-existence | |
JP5702063B2 (en) | Display device, display method, and computer program | |
US20130016196A1 (en) | Display apparatus and method for displaying 3d image thereof | |
US8687950B2 (en) | Electronic apparatus and display control method | |
US8547418B2 (en) | Method and system for processing and displaying video in three dimensions using a liquid crystal display | |
US20130242044A1 (en) | 2d to 3d video conversion box | |
US20110310222A1 (en) | Image distributing apparatus, display apparatus, and image distributing method thereof | |
US9137522B2 (en) | Device and method for 3-D display control | |
US20130265391A1 (en) | Display device and method for automatically adjusting the brightness of an image according to the image mode | |
JP2014027351A (en) | Image processing device, image processing method, and program | |
CN210807464U (en) | 4K-compatible 3D LED display screen | |
US8736602B2 (en) | Display apparatus and control method thereof | |
US20150156483A1 (en) | Providing a capability to simultaneously view and/or listen to multiple sets of data on one or more endpoint device(s) associated with a data processing device | |
JP2014002308A (en) | Video processing device, video processing method and computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:026078/0787 Effective date: 20110404 |
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:026136/0387 Effective date: 20110128 |
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIMOTHY A.;REEL/FRAME:027421/0960 Effective date: 20110404 |
AS | Assignment |
Owner name: CIRCA3D, LLC, UTAH Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TABOR, TIM;REEL/FRAME:027571/0323 Effective date: 20110128 |
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |