WO2012101395A1 - Dual channel stereoscopic video streaming systems - Google Patents

Dual channel stereoscopic video streaming systems

Info

Publication number
WO2012101395A1
Authority
WO
WIPO (PCT)
Prior art keywords
video
pixels
display
transmission system
stereoscopic
Prior art date
Application number
PCT/GB2012/000042
Other languages
French (fr)
Inventor
Ka-Shun Carrison Tong
Original Assignee
Hospital Authority
Priority date
Filing date
Publication date
Application filed by Hospital Authority filed Critical Hospital Authority
Publication of WO2012101395A1 publication Critical patent/WO2012101395A1/en

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/10Processing, recording or transmission of stereoscopic or multi-view image signals
    • H04N13/194Transmission of image signals
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/20Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
    • G02B30/22Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type
    • G02B30/25Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the stereoscopic type using polarisation techniques
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N13/337Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor

Definitions

  • FIG. 3C The example of Figure 3C is provided with a different set of polarizing filters and plates.
  • a single sheet of linear polarizing filter 56 is laid. (This sheet could be divided into portions each covering one or more pixels, but it is highly efficient to use a single sheet.)
  • portions 57 (here strips) of a half wave retarder plate are laid over the pixels of one of the images, converting for those pixels the light from the polarizer to the other, perpendicular, linear polarization.
  • the next layer is a quarter wave plate 55 over all the pixels, which as in the example of Figure 3B converts the linearly polarized light into left and right circular polarizations.
  • the pixels for the left and right images are arranged in alternate rows. However, they may equally be arranged in columns.
  • the LED displays of these examples provided a superior viewing angle of 30 degrees compared to the LCD TVs which had a
  • Each of the above examples may optionally be provided with a top antireflective acrylic sheet and/or an acrylic sheet over the LEDs to provide a base for the polarizing layers.
  • Each of the above examples has separate channels for the left and right video streams for the transmission over the Internet.
  • Alternative arrangements for providing separate channels of transmission are as follows.
  • One or other (in fact preferably both, as in the example above) of the server 1 and the client 20 is provided with two IP addresses: the left and right streams are transmitted from the server using a respective one of those two IP addresses as the source address, or to the client using a respective one as the destination address. If two IP addresses are used at both ends, each stream is transmitted from a respective IP address of the server to a respective IP address of the client.
  • Figure 7 shows the signal combining process for the examples of Figures 2, 3A, 3B and 3C.
  • the display 7 may be, for example, the LCD 27' or the LED display 50, but could also use other technologies, such as plasma, to generate the pixels.
  • This display 7 has an input for a combined signal 21 comprising pixels of both the left and right images. Its pixels are arranged in rows, and mounted over each row is a strip of polarizing material. The strips of polarizing material for alternate rows have opposite polarizations, for example left circular polarization and right circular polarization (or vertical polarization and horizontal polarization).
  • the combining module 29 operates (i) to take pixels from the left video signal and provide them in rows of the combined video signal 21 that are displayed on rows of the display 7 that have a filter providing a first polarization and (ii) to take pixels from the right video signal and display them on rows having a filter of the second opposite polarization.
  • the viewer 23 is provided with spectacles 24 having a
  • The rows of each left or right frame are, of course, kept in their original order. Further, corresponding rows from the left and right frames are displayed with only a small vertical distance from each other; in this example, where the polarizing filter strips are only one row of pixels high, corresponding rows from the left and right images are displayed next to each other.
  • the method of the combining module 29 in this example is to build up the rows of the combined frame in order by taking rows, in order, alternately from the left and right frames.
  • the combined signal is displayed on the 3D display 7 in the same single combined image area. If the sum of the number of rows in the left image and that in the right image exceeds the number that can be displayed on the display 7 then only a fraction of the rows from the left and right images are used, with the other rows being dropped; preferably the dropped rows are spread across the image so that no large area at the top or bottom is omitted.
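The row-interleaving just described can be sketched as follows. This is an illustrative sketch under one plausible reading of the method, assuming frames are represented as lists of pixel rows of the same height as the display; the patent does not specify an implementation language.

```python
def combine_rows(left, right):
    """Interleave rows of the left and right frames: even rows of the
    combined frame come from the left image, odd rows from the right.
    Taking row r of the chosen eye's frame means the unused rows of
    each source are dropped, spread evenly across the picture rather
    than being lost in one block at the top or bottom."""
    return [left[r] if r % 2 == 0 else right[r] for r in range(len(left))]
```

With two 4-row frames this yields left rows 0 and 2 on the filter strips of one polarization and right rows 1 and 3 on the other, so corresponding left/right rows sit at most one row apart.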
  • a 3D display has vertical strips of polarizing filter one pixel wide with alternate columns of pixels being given opposite polarization. Since pixels of the frames are usually organized in RAM in a raster pattern of one row after the next, the combiner 29 in this case steps through following that pattern taking pixels from the left and right images alternately.
  • pixels in certain columns are dropped; for example if the left and right frames have the same width in pixels as the display 7 then pixels in alternate columns of the left and right frames are dropped.
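For the vertical-strip layout the same idea applies per pixel within each raster row. A sketch under the same assumed list-of-rows frame representation:

```python
def combine_columns(left, right):
    """Vertical one-pixel polarizer strips: even columns of the
    combined frame take pixels from the left image, odd columns from
    the right. Pixels in the unused columns of each source (odd
    columns of the left, even columns of the right) are dropped, as
    described for frames having the same width as the display."""
    return [
        [l_row[c] if c % 2 == 0 else r_row[c] for c in range(len(l_row))]
        for l_row, r_row in zip(left, right)
    ]
```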
  • a further example of how the filters on a 3D monitor may be arranged is to arrange them in a chequerboard pattern with the polarization of the filter changing to the opposite every pixel both in the vertical and horizontal directions.
  • the application program works through the pixels of each row in order taking pixels alternately from the corresponding pixels of the left and right frames. The first pixel of each row is taken alternately from the left and right frames.
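The chequerboard case flips the source eye every pixel in both directions, so the choice for pixel (r, c) depends on the parity of r + c. An illustrative sketch under the same assumptions as above:

```python
def combine_chequerboard(left, right):
    """Chequerboard polarizer layout: pixel (r, c) of the combined
    frame comes from the left image when r + c is even and from the
    right image when r + c is odd, so the first pixel of each row
    alternates between the two eyes, as the text describes."""
    return [
        [left[r][c] if (r + c) % 2 == 0 else right[r][c]
         for c in range(len(left[r]))]
        for r in range(len(left))
    ]
```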
  • the left, right and combined frames are stored in the main RAM of the computer and the application program is executed by the central processing unit of the client computer 20.
  • Video capture and display cards also have RAM and processing units, and in alternative examples the RAM of any of these or the main RAM, and the processors of any of these or the CPU, may be used, alone or in combination.
  • These methods of combining the pixels of the left and right frames can be explicitly coded into the combining module for each type of filter layout. This, however, is not very flexible, since the user of the system may wish to use a different filter layout without having the application program specially coded for the pattern of their particular 3D display. Problems would also occur if the monitor of an installed system were to be upgraded to a different model.
  • a pixel polarization map 30 of the display being used is first derived.
  • An example of such a map is shown in Figure 8 at 30. It comprises a two-dimensional array of values, each corresponding to a pixel of the display 7.
  • display 7 is used to display a frame which is completely white. The display is then viewed through a magnifying glass and one of the left and right filters of the 3D spectacles that are provided to view the display. This will reveal the pattern of pixels in the display for that eye. This information is then used to compile the map. This method is useful if there is no relevant documentation supplied with the 3D display.
  • Figure 8 illustrates a third example of a method of deriving the map.
  • Some 3D monitors, for example, are supplied with a test program that takes two static images provided by the user, a left image and a right image, and interlaces them in an
  • this test program is provided with a completely black image 31 as the left image and a completely white image 32 as the right image to produce an output image. (Other pairs of colours may be used.)
  • the resultant image is not a 3D image (in contrast to the purpose of the test program) but is in fact a map of the filter polarizations for the pixels. Since the left image is all black the pixels having the value (0,0,0) (in 24 bit RGB representation) are those for the left image pixels and those having the value (255,255,255) (in 24 bit RGB representation) are those for the right image pixels. Therefore the resultant image can be saved as it is for later use as the map.
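The black/white test-image trick can be turned into a map programmatically. A sketch, assuming the test program's interlaced output is available as rows of 24-bit RGB triples; the 'L'/'R' labels are an illustrative encoding, whereas the patent simply saves the output image itself as the map:

```python
def derive_polarization_map(interlaced_rgb):
    """Given the test program's output for an all-black left image and
    an all-white right image, label each pixel: (0, 0, 0) means the
    display shows the left eye at that pixel, (255, 255, 255) the
    right eye."""
    return [
        ["L" if px == (0, 0, 0) else "R" for px in row]
        for row in interlaced_rgb
    ]
```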
  • the preferred example of combining module uses the map as follows. As before the decoders provide series of frames of those images.
  • the combining module 29 processes each pair of frames from the left and right series as follows. For each pixel of the combined frame the combining module first looks up the value of that pixel in the map.
  • the left, right and combined frames are of the same height and width in pixels. No special steps are required to drop pixels since for each pixel it only selects one or other of the left and right pixels. If, however, the left and right frames are of a different size to the combined frame, a rule for mapping the pixels between the two is adopted, but the basic method of selecting the pixel from the left or right frame on the basis of the map is unaffected.
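The per-pixel lookup described above can be sketched as follows, assuming a map represented as one 'L'/'R' label per pixel (an illustrative encoding; the patent stores the map as a saved RGB image and tests pixel values directly):

```python
def combine_with_map(left, right, pol_map):
    """Layout-independent combining: consult the polarization map for
    each pixel of the combined frame and copy that pixel from the
    left or the right frame accordingly. No layout-specific code is
    needed, and no separate pixel-dropping step: each combined pixel
    simply selects one source pixel."""
    return [
        [left[r][c] if pol_map[r][c] == "L" else right[r][c]
         for c in range(len(pol_map[r]))]
        for r in range(len(pol_map))
    ]
```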
  • One mapping rule could be that a 2x2 block of four pixels in the combined frame maps to a respective single pixel in the left and right frames, which will result (with the more likely arrangements of polarization filters on the 3D display) in each block of 2x2 pixels in the combined frame showing each corresponding pixel of the left and right frames twice, with of course the correct polarizations for each.
  • each sub-pixel of each pixel of the map is checked individually and in response to each such test the corresponding sub-pixel of the left or right image is copied to the resultant image. Although this is less efficient in terms of the number of tests, the resultant image is the same.
  • the combining module (whether using a pixel polarization map or not) also provides a depth control.
  • This takes the form of a pixel offset value which is used when copying pixels from one of the left and right images to the combined image, so that corresponding pixels in the left and right images are offset from each other horizontally by that number of pixels.
  • the offset can be set to zero or to positive values, for which the pixels for the left image are offset in the display to the left of the corresponding pixels of the right image.
  • Figure 9 illustrates the final positions of the pixels for an offset of 2.
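The offset step might be sketched as follows, combined here with the row-interleaved layout. How the vacated edge pixels are filled is not specified in the text; repeating the edge pixel is an assumed padding policy for illustration only.

```python
def shift_row_left(row, offset):
    """Shift a row of pixels `offset` positions to the left, repeating
    the last pixel to fill the vacated right edge (an assumed
    padding policy)."""
    return row[offset:] + [row[-1]] * offset if offset else row[:]

def combine_rows_with_offset(left, right, offset=0):
    """Row-interleaved combining with depth control: rows taken from
    the left frame are displaced `offset` pixels to the left of the
    corresponding right-frame rows, changing the perceived depth of
    the whole scene."""
    return [
        shift_row_left(left[r], offset) if r % 2 == 0 else right[r][:]
        for r in range(len(left))
    ]
```

With an offset of 2, as in Figure 9, each displayed left-image pixel sits two columns to the left of its right-image counterpart.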
  • Pixels from the left image are denoted L and pixels from the right image R, and the pixel coordinates from those images are given as (row, column).
  • the case is for a monitor having horizontal strips of polarization filter one pixel high.

Abstract

A method and system for transmitting stereoscopic images has a server that transmits the left and right video images over separate channels to a client. The server and/or the client has two IP addresses and the respective channels use respective ones of those IP addresses.

Description

DUAL CHANNEL STEREOSCOPIC VIDEO STREAMING SYSTEMS
Field of the Invention

The present invention involves the real-time streaming of stereoscopic video over the Internet.
Background of the Invention

Television has been evolving gradually from black-and-white to colour to high definition. The next awaited step is the introduction of 3D. In traditional broadcasting this will mean big TV sets. For conferencing and lectures, due to the limited size of existing 3D TV sets, 3D projection is still the mainstream.
Today, there are several types of 3D display technologies, including colour anaglyph, auto-stereoscopy, page flipping, 3D projection, and horizontal or vertical interlacing. In stereoscopic video broadcasting, each 3D display technology has different requirements. Stereo video streaming is one of the techniques for transferring stereo videos over a Local Area Network (LAN) or the Internet. Hence, 3D video display and streaming technologies are closely related.
In the literature there exist a few systems for stereo video streaming. A 3DTV prototype system, with real-time acquisition, transmission and auto-stereoscopic display of dynamic scenes, has been introduced by Mitsubishi Electric Research Laboratories (MERL). Multiple video streams are encoded and sent over a broadband network. The 3D display shows high-resolution stereoscopic colour images for multiple viewpoints without special glasses. This system uses light-field rendering to synthesize views at the correct virtual camera positions. However, this technique is constrained to limited viewing angles and distances, and the auto-stereoscopic technique is not applicable to large-scale displays.
At the Combined Exhibition of Advanced Technologies 2010 (CEATEC 2010), Sony showed the world's largest 3D LED display system, measuring over 70 feet wide by 16 feet tall. This system was used to broadcast the World Cup match between Japan and the Netherlands that summer. The 3D LED display was made up of 345 tiles, each including 9 square display modules. Sony used Cyan/Magenta anaglyphic glasses to avoid the need for active glasses. The disadvantages of existing 3D display technologies are as follows:
Technology                                               Disadvantage
1. Colour anaglyph LCD, LED, 3D projection display       Low colour fidelity
   with Cyan/Magenta anaglyphic glasses
2. Auto-stereoscopy                                      Limited viewing angle
                                                         and distance
3. 3D projection with active shutter 3D glasses          High cost of the glasses
4. 3D projection with passive polarized 3D glasses       Low brightness
5. Cathode Ray Tube (CRT) monitor with active            Limited size
   shutter 3D glasses
6. Horizontal or vertical interlacing LCD with           Limited size
   passive polarized 3D glasses

Brief Summary of the Invention
In this invention, a stereoscopic video streaming system has been developed, composed of a stereoscopic 3D video streaming server, decoders, and a stereoscopic visualization system.
According to the present invention there is provided a
stereoscopic video transmission system comprising:
a server connected to transmit left and right video signals making up a stereoscopic image over an Internet
Protocol network in respective channels; and
a client connected to receive the left and right video images over those channels;
wherein the left and right channels are distinguished by having different source IP addresses for the server and/or by having different destination IP addresses for the client.
The present invention also provides a method of transmitting stereoscopic video from a first location to a second location comprising:
obtaining separate video signals representing left and right video signals of the stereoscopic video;
transmitting the left video signal over an Internet
Protocol network connecting the locations in a first channel; and
transmitting the right video signal over an Internet Protocol network connecting the locations in a second channel; wherein packets of the video data in the first and second channel are denoted as such by having different source IP addresses and/or by having different destination IP addresses.
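One way the two channels might be realised in practice is by binding each stream's socket to a different local IP address owned by the server, so that every outgoing packet carries that address as its source. The following is a minimal sketch using UDP sockets; the addresses and the frame-sending helper are hypothetical, and the description states TCP is preferred (with UDP also possible), so a real implementation would bind a TCP socket in the same way before connecting.

```python
import socket

def make_channel(source_ip):
    """Create a UDP socket bound to one of the server's IP addresses.
    Packets sent through it carry source_ip as their source address,
    which is what distinguishes the left channel from the right."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((source_ip, 0))  # port 0: the OS picks an ephemeral port
    return sock

def send_frame_pair(left_sock, right_sock, client_addr, left_frame, right_frame):
    """Send one encoded left frame and one encoded right frame, each
    over its own channel; the client tells the two streams apart
    purely by the source IP address on the packets it receives."""
    left_sock.sendto(left_frame, client_addr)
    right_sock.sendto(right_frame, client_addr)
```

On a server owning, say, 192.0.2.10 and 192.0.2.11 (hypothetical addresses), `make_channel` would be called once per address, giving the two streams the separate identities the claims describe.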
Further features of the invention are set out in the appended claims.

Description of the Drawings
There will now be described examples of the invention, with reference to the accompanying drawings, of which:
Figure 1 shows a first example of a dual channel stereoscopic video streaming and visualization system, this example having a 3D projection system and a software video decoder;
Figure 2 shows a second example of a dual channel stereoscopic video streaming and visualization system, this example having a 3D LCD system and a hardware video decoder;
Figure 3 shows a third example of a dual channel stereoscopic video streaming and visualization system, this example having a 3D projection system and a hardware video decoder;
Figure 3A shows a stereoscopic video streaming system
including a first example of a 3D LED display;
Figure 3B shows a stereoscopic video streaming system
including a second example of a 3D LED display;
Figure 3C shows a stereoscopic video streaming system
including a third example of a 3D LED display;
Figure 4 shows channels distinguished by two IP addresses at the server;
Figure 5 shows channels distinguished by two IP addresses at the client;
Figure 6 shows channels distinguished by two IP addresses at the server and at the client;
Figure 7 illustrates the interspersing method of the
stereoscopic video system for the example of Figure 2 etc.;
Figure 8 illustrates a third method of generating the pixel polarization map; and Figure 9 shows the pixel offset for depth control.
Figure 1 is a first example of a dual channel stereoscopic video streaming and visualization system, this example having in particular a 3D projection system and a software video decoder .
A stereoscopic 3D video streaming server 1 at a first location is arranged to broadcast or telecast three-dimensional images, captured from a stereoscopic pair of video cameras 3, 4 at that location connected to the server, to a client computer 20 at a second location. The server is installed with two video capturing devices 5, 6 connected to the respective cameras for acquiring the left and right video signals from the cameras. Standard video encoding and streaming software 7 is used to encode the video signal from the left camera and to stream it to the Internet 10 in standard video formats such as Flash, H.264, RM/RMVB, MKV, WMV9 or MPEG 1/2/4. A second instance 8 of the encoding software in the server 1 is used to similarly encode the video signal from the right camera and stream it to the Internet.

The first instance of the streaming software 7 transmits its stream from a first IP address belonging to the server 1, while the second instance transmits its stream from a second IP address, also belonging to the server, different from the first. (Preferably the streams use TCP, but UDP may also be used.) This gives the video streams separate identities during transmission, providing the separate channels marked in the Figures. (The network will, of course, most likely send the packets of the two streams via the same route for most of the time, and that route may be left, as with most Internet traffic, for the network to decide on a packet-by-packet basis. The use of particular routings, quality of service flags or any other kind of traffic management is not excluded.)

On the client side in the example of Figure 1, a computer 20 is provided with two instances 21 and 22 of a software video decoder. Each decoder is connected to receive a respective one of the two video streams by listening on a respective one of two IP addresses belonging to the computer, one address per stream.
Respective video cards 23 and 24 are provided to output the respective left and right video signals in decoded form from the software decoders to respective video projectors 25 and 26. The projectors each have a circularly polarized filter over their output lenses, respectively clockwise and anticlockwise circularly polarized. The projectors project their images, overlaid, onto a 3D projection screen 27, which preserves the polarization state of the reflected light. The screen is viewed by a viewer 40 wearing 3D spectacles 41 comprising clockwise and anticlockwise circularly polarized filters for the respective eyes.
Figure 2 shows another example, which is similar to that of Figure 1. However, in this case hardware decoders 21' and 22' are used as an alternative to the software decoders, and a 3D LCD TV 27' is used as an alternative to the projection system. The hardware decoders are arranged to receive respectively the left and right video streams from the Internet and to produce respective left and right video signals in decoded form. The LCD TV 27' is generally conventional as such, but has an overlay of polarizing material providing each pixel in the display with a polarizing filter, these alternating between pixels between clockwise and anticlockwise circular polarization. The pattern of the clockwise and anticlockwise filters may be horizontal or vertical stripes, or a chequerboard. Again, the viewer 40 wears 3D spectacles 41 with respective clockwise and anticlockwise circular
polarization filters. The 3D LCD TV expects as its input a single normal video signal, but with the pixels of the left and right images interspersed in that signal in a manner corresponding to the pattern of the filters on the LCD display. A left and right combiner 29 is connected to receive the left and right decoded video streams from the decoders and to provide an appropriately combined signal to the 3D TV. This is provided as a software module (also denoted by box 29 in the Figure) in the client computer; its operation is explained below, with respect to Figures 7 to 10.
Figure 3 shows a further example, which is similar to those of Figures 1 and 2, except that it has hardware decoders 21' and 22' like the example of Figure 2 and projectors 25 and 26 like the example of Figure 1. The outputs of the hardware decoders are respectively connected to the video inputs of the two video projectors. Figures 3A, 3B and 3C show some further examples. These all employ the transmission system with separate left and right channels as described above. Each is shown with twin hardware video decoders, but a software decoder in the manner of Figure 1 would also be possible. Also, each is provided with a
combiner 29 in the manner of Figure 2 to drive the display. The combiner may be omitted if the display is constructed to receive left and right video signals in parallel.)
In each of these three examples the display is a 3D LED display 50. In the example of Figure 3A, the LED display 50 may be, for example, a screen of a diagonal size from 32 inches to a hundred feet, and may provide high colour fidelity. The LED display 50 comprises rows 51, 52 of LEDs which provide the light for each pixel of the display, the rows providing pixels of the left and right images alternately. A large number of users 40 are able to view the 3D video
display using low cost passive polarized 3D glasses 41 (with linear polarization filters) at any angle and at long distance, since the screen is provided with alternate strips 53, 54 of perpendicularly oriented polarizing filters, in this example horizontally 53 and vertically 54 aligned, which polarize the light linearly for the left and right images respectively. The combiner 29 provides a combined video signal of those images to the display 50, interspersing the signal for the left and right images in a manner to be described below.
In this case users' spectacles 41 are provided with vertical and horizontal linear polarization filters respectively for each eye so that each eye receives the correct one of the left and right video images.
The example of Figure 3B is similar to that of Figure 3A. The LED display 50 is however provided with a further layer over the linear polarizers, namely a quarter wave retarder plate 55. The fast axis of this is aligned, preferably at 45 degrees to the axes of the polarizers, so as to convert the linearly polarized light emitted from the polarizers into respective ones of left and right handed circular polarizations. In this case the users' spectacles 41 are provided with left and right circular polarization filters respectively for each eye so that each eye receives the correct one of the left and right video images.
The example of Figure 3C is provided with a different set of polarizing filters and plates. First, over all the LED pixel elements, a single sheet of linear polarizing filter 56 is laid. (This sheet could be divided into portions each covering one or more pixels, but it is highly efficient to use a single sheet.) Next, over the pixels for one of the left and right images, are laid portions 57 (here strips) of a half wave retarder plate, which converts the light from the polarizer, for those pixels, to the other, perpendicular, linear polarization. The next layer is a quarter wave plate 55 over all the pixels, which as in the example of Figure 3B converts the linearly polarized light into left and right circular polarizations.
Mounting the half wave plate portions above the quarter wave plate instead would produce the same result of left and right circular polarizations.
How to construct quarter and half wave retarder plates is well known in the art of optics generally.
In each of the examples of Figures 3A, 3B and 3C the pixels for the left and right images are arranged in alternate rows. However they may equally be arranged in columns, a chequerboard pattern or any other interspersed pattern.
The LED displays of these examples provided a superior viewing angle of 30 degrees, compared to the LCD TVs, which had a viewing angle of 20 degrees.
Each of the above examples may optionally be provided with a top antireflective acrylic sheet and/or an acrylic sheet over the LEDs to provide a base for the polarizing layers.
Each of the above examples has separate channels for the left and right video streams for the transmission over the Internet. Alternative arrangements for providing separate channels of transmission are as follows. One or other (in fact preferably both, as in the example above) of the server 1 and the client 20 is provided with two IP addresses, and the left and right streams are transmitted from the server using a respective one of those two IP addresses as the source address, or to the client using a respective one as the destination address. If two IP addresses are used at both ends, each stream is transmitted from a respective IP address of the server to a respective IP address of the client. These methods are shown in Figures 4, 5 and 6 respectively.
Using two IP addresses in any of the ways described above gives the two channels for the left and right video streams a separate identity, so the two streams may use the same port numbers as each other, though equally they could use different ones. It would also be possible to distinguish the two streams by using different port numbers but the same source and destination IP addresses; however, experiments have found that the performance is more stable using separate IP addresses, and that the network tends to favour one of the streams if separate ports are used, so that arrangement is not preferred.
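The use of distinct source IP addresses to give each stream its own identity can be sketched with standard sockets. This is an illustrative sketch only, not the system's actual implementation: the function name is invented, UDP is assumed as the transport, and both sockets bind to the loopback address here purely so the sketch runs anywhere — in the described system each would bind to a different IP address of the server.

```python
import socket

def make_stream_socket(source_ip: str, source_port: int = 0) -> socket.socket:
    """Create a UDP socket bound to a specific local IP address.

    Binding to a particular local address makes the operating system use
    that address as the source IP of every packet sent from the socket,
    which is what gives each video stream its separate identity.
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((source_ip, source_port))
    return sock

# One socket per stream. With two distinct source IPs the two streams
# could share a destination port, since the addresses already
# distinguish the channels.
left_sock = make_stream_socket("127.0.0.1")
right_sock = make_stream_socket("127.0.0.1")
```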
In the above examples streaming over the Internet is described. While this may of course be the public Internet, the streaming may be over any network employing the Internet Protocol, such as a LAN, or indeed over a path comprising two or more such networks that are interconnected. Figure 7 shows the signal combining process for the examples of Figures 2, 3A, 3B and 3C. (Note that the display 7 may be, for example, the LCD 27' or the LED display 50, but it could also use other technologies, such as plasma, to generate the pixels.) This display 7 has an input for a combined signal 21 comprising pixels of both the left and right images, has its pixels arranged in rows, and has mounted over each row a strip of polarizing material. The strips for alternate rows have opposite polarizations, for example left circular polarization and right circular polarization (or vertical polarization and horizontal polarization, or more generally any two opposite polarizations, e.g. as in Figures 3B and 3C). The combining module 29 operates (i) to take pixels from the left video signal and provide them in rows of the combined video signal 21 that are displayed on rows of the display 7 that have a filter providing a first polarization and (ii) to take pixels from the right video signal and display them on rows having a filter of the second, opposite, polarization. As noted above, the viewer 23 is provided with spectacles 24 having a
polarizing filter for the left eye that allows through the pixels having the first polarization and a filter for the right that allows through pixels having the second, opposite, polarization. In this way the left and right eyes see
respectively only the left and right images and so the viewer perceives 25 the view in 3D. The rows and pixels of each left or right frame are, of course, kept in their original order. Further, corresponding rows from the left and right frames are displayed only with a small vertical distance from each other; in this example where the polarizing filter strips are only one row of pixels high corresponding rows from the left and right images are displayed next to each other.
So, in summary, the method of the combining module 29 in this example is to build up the rows of the combined frame in order by taking rows, in order, alternately from the left and right frames. The combined signal is displayed on the 3D display 7 in the same single combined image area. If the sum of the number of rows in the left image and that in the right image exceeds the number that can be displayed on the display 7, then only a fraction of the rows from the left and right images is used, with the other rows being dropped; preferably the dropped rows are spread across the image so that no large area at the top or bottom is omitted. For example, if the left and right images each have 1080 rows and the monitor has 1080 rows, then only every other row from each of the images is used and the others are dropped - the resultant image therefore has 540 + 540 = 1080 rows, as required for the monitor. (See Figure 9 later, which is in accordance with this example: the combined image has only odd numbered rows from the left image and even numbered rows from the right image.)
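The row-alternating method just summarised can be sketched as follows. This is an illustrative sketch rather than the actual combining module: frames are modelled as plain lists of rows and the function name is invented. When each source frame has as many rows as the display, the loop naturally keeps only every other row of each source, as in the 1080-row example above.

```python
def combine_rows(left_frame, right_frame, display_rows):
    """Build the combined frame row by row: even-indexed display rows
    (odd numbered, counting from 1) take the row at the same position
    from the left frame, odd-indexed rows from the right frame.  Rows of
    each source not selected are thereby dropped, spread evenly over the
    image."""
    combined = []
    for r in range(display_rows):
        src = left_frame if r % 2 == 0 else right_frame
        combined.append(src[r])
    return combined
```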
In another example a 3D display has vertical strips of polarizing filter one pixel wide with alternate columns of pixels being given opposite polarization. Since pixels of the frames are usually organized in RAM in a raster pattern of one row after the next, the combiner 29 in this case steps through following that pattern taking pixels from the left and right images alternately.
In this case, if the sum of the number of pixels across the left image and the number across the right image is too great to display then pixels in certain columns are dropped; for example if the left and right frames have the same width in pixels as the display 7 then pixels in alternate columns of the left and right frames are dropped.
A further example of how the filters on a 3D monitor may be arranged is to arrange them in a chequerboard pattern with the polarization of the filter changing to the opposite every pixel both in the vertical and horizontal directions. Here the application program works through the pixels of each row in order taking pixels alternately from the corresponding pixels of the left and right frames. The first pixel of each row is taken alternately from the left and right frames. Note that in the examples above the left, right and combined frames are stored in the main RAM of the computer and the application program is executed by the central processing unit of the client computer 20. Video capture and display cards also have RAM and processing units, and in alternative
examples of the invention the RAM of any of these, or the main RAM, and the processors of any of these, or the CPU, may be used, alone or in combination. These methods of combining the pixels of the left and right frames can be explicitly coded into the combining module for each type of filter layout. This however is not very flexible, since the user of the system may wish to use a display without having the application program specially coded for the pattern of their particular 3D display. Problems would also occur if the monitor of an installed system were to be upgraded to a different one having a different layout. A preferred example of the combining module 29 avoids these inconveniences. In that preferred example a pixel polarization map 30 of the display being used is first derived. An example of such a map is shown in Figure 8 at 30. It comprises a two-dimensional array of values each corresponding to a pixel of the display 7. In a first example of deriving the map the information
necessary to compile it is taken from the documentation
supplied with the 3D display.
In a second example of a method of deriving the map the
display 7 is used to display a frame which is completely white. The display is then viewed through a magnifying glass and one of the left and right filters of the 3D spectacles that are provided to view the display. This will reveal the pattern of pixels in the display for that eye. This information is then used to compile the map. This method is useful if there is no relevant documentation supplied with the 3D display.
Figure 8 illustrates a third example of a method of deriving the map. Some 3D monitors, for example, are supplied with a test program that takes two static images provided by the user, a left image and a right image, and interlaces them in an
appropriate manner for display on the 3D monitor. For the derivation of the map 30 this test program is provided with a completely black image 31 as the left image and a completely white image 32 as the right image to produce an output image. (Other pairs of colours may be used.) The resultant image is not a 3D image (in contrast to the purpose of the test program) but is in fact a map of the filter polarizations for the pixels. Since the left image is all black the pixels having the value (0,0,0) (in 24 bit RGB representation) are those for the left image pixels and those having the value (255,255,255) (in 24 bit RGB representation) are those for the right image pixels. Therefore the resultant image can be saved as it is for later use as the map. This can be in RAM or in long term storage such as a disc drive. (Note that it is not necessary to discover the method by which the test program interlaces the pixels from the left and right images - only the resultant image - i.e. the map 30 - is required. Nor in fact is it necessary to display the test pattern on the 3D monitor.)
(Since images in computer systems are usually stored as RGB values there are in fact three sub-pixel maps 30, as shown in Figure 8, one for each of red, green and blue, but since white and black test images are used they contain the same values. If a different pair of colours is used, again only one of the three sub-pixels need be tested, because although the red, green and blue values will in general be different from each other, each still indicates by itself whether the pixel is for the left or right eye - except, of course, for a colour pair both having the same value in a given sub-pixel: for a pair having the same red value, for example, testing the red value only is not sensible.)
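The conversion of the test program's black/white output into a usable map can be sketched as follows. This is a sketch under the assumption that the output image is held as a 2-D list of (r, g, b) tuples; the function name is invented, and the map is reduced to booleans by checking a single sub-pixel, which as noted above is sufficient when the inputs are all-black and all-white.

```python
def derive_polarization_map(test_output):
    """Convert the test program's interlaced output into a left/right map.

    test_output is a 2-D list of (r, g, b) tuples produced by interlacing
    an all-black left image with an all-white right image.  Checking one
    sub-pixel suffices: 0 means the display pixel belongs to the left
    image, 255 to the right.
    """
    return [[pixel[0] == 0 for pixel in row] for row in test_output]
```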
The preferred example of the combining module uses the map as follows. As before, the decoders provide the series of frames of the left and right images. The combining module 29 processes each pair of frames from the left and right series as follows. For each pixel of the combined frame the combining module first looks up the value of that pixel in the map. If it indicates that the pixel is for the left image (pixel = (0,0,0) in 24 bit RGB representation) then it takes the value of the corresponding pixel from the left frame and copies it to the pixel at the same position in the combined frame; on the other hand, if it indicates that the pixel is for the right image (pixel = (255,255,255) in 24 bit RGB representation) then it takes the value of the corresponding pixel from the right frame and copies it to the corresponding pixel in the combined frame. (Note that since each sub-pixel map contains the same information, the value of the pixel in the map can be checked by checking just one of the sub-pixels, as well as by testing all three of them. Note also that it does not matter in which order the application works through the pixels, but the raster pattern of working across each row in turn is employed in this example.)
In the simplest example of this method the left, right and combined frames are of the same height and width in pixels. No special steps are required to drop pixels since for each pixel the module simply selects one or other of the left and right pixels. If however the left and right frames are of a different size to the combined frame, a rule for mapping the pixels between the two is adopted, but the basic method of selecting the pixel from the left or right frame on the basis of the map is unaffected. For example, if the left and right frames are half height and half width, then the mapping rule could be that a 2x2 block of four pixels in the combined frame maps to a respective single pixel in the left and right frames, which will result (with the more likely arrangements of polarization filters on the 3D display) in each block of 2x2 pixels in the combined frame showing each corresponding pixel of the left and right frames twice, with of course the correct polarizations for each. In an alternative version of the preferred example of the combining module 29, each sub-pixel of each pixel of the map is checked individually and in response to each such test the corresponding sub-pixel of the left or right image is copied to the resultant image. Although this is less efficient in terms of the number of tests, the resultant image is the same.
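The map-driven selection described above can be sketched like this. The sketch is illustrative only: the map is represented as booleans with True meaning "left pixel" (rather than as the raw RGB image), the function name is invented, and the simplest case of equal frame and display sizes is assumed.

```python
def combine_with_map(left, right, pol_map):
    """Build the combined frame by consulting the polarization map.

    pol_map is a 2-D list of booleans (True = this display pixel carries
    the left image); left, right and the map all have the same size.  No
    pixels need to be dropped explicitly: exactly one source pixel is
    chosen per display pixel.
    """
    return [
        [left[y][x] if pol_map[y][x] else right[y][x]
         for x in range(len(pol_map[0]))]
        for y in range(len(pol_map))
    ]
```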
The combining module (whether using a pixel polarization map or not) also provides a depth control. This takes the form of a pixel offset value which is used when copying pixels from one of the left and right images to the combined image, so that corresponding pixels in the left and right images are offset from each other horizontally by that number of pixels. The offset can be set to zero or to positive values, for which the pixels of the left image are offset in the display to the left of the corresponding pixels of the right image.
Figure 9 illustrates the final positions of the pixels for an offset of 2. The notation used in the Figure is L=left, R=right, and the pixel coordinates from those images are given as (row, column). The case shown is for a monitor having horizontal strips of polarization filter one pixel high.
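The depth control can be sketched by extending the row-alternating combine with a horizontal shift of the left-image pixels. This is an illustrative sketch for the Figure 9 case of one-pixel-high horizontal filter strips; the function name and the choice of padding value for the vacated columns are assumptions.

```python
def combine_rows_with_offset(left, right, offset, pad=0):
    """Row-interleaved combine with a horizontal depth offset.

    Display rows carrying the left image take their pixels from the left
    frame shifted `offset` columns to the left; pixels shifted out are
    dropped and the vacated columns are filled with `pad` (black here).
    Rows carrying the right image are copied unshifted.
    """
    width = len(left[0])
    combined = []
    for r in range(len(left)):
        if r % 2 == 0:  # rows covered by the 'left' polarization strips
            combined.append([left[r][c + offset] if c + offset < width else pad
                             for c in range(width)])
        else:
            combined.append(list(right[r]))
    return combined
```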

Claims

CLAIMS:
1. A stereoscopic video transmission system comprising:
a server connected to transmit left and right video signals making up a stereoscopic image over an Internet
Protocol network in respective channels; and
a client connected to receive the left and right video images over those channels;
wherein the left and right channels are distinguished by having different source IP addresses for the server and/or by having different destination IP addresses for the client.
2. A stereoscopic video transmission system as claimed in claim 1 wherein the server comprises left and right encoders connected to receive respectively left and right video signals, to encode them and to transmit them into the respective
channel.
3. A stereoscopic video transmission system as claimed in claim 2 wherein the encoders are respective instances of a software encoder.

4. A stereoscopic video transmission system as claimed in any preceding claim wherein the client comprises left and right decoders respectively connected to receive the video signal from the channels and to decode those signals.
5. A stereoscopic video transmission system as claimed in claim 4 wherein the decoders are respective instances of a software decoder.
6. A stereoscopic video transmission system as claimed in any preceding claim comprising dual projectors respectively connected to receive the left and right video signals from the client.
7. A stereoscopic video transmission system as claimed in any one of claims 1 to 5 comprising a 3D panel display connected to receive the left and right video signals from the client.
8. A stereoscopic video transmission system as claimed in claim 7
comprising a display having pixels for displaying the left video signal interspersed with pixels for displaying the right video signal, and
wherein the client comprises a combiner connected to receive the left and right video signals and being operative to intersperse the pixels of those signals into a signal supplied to the display so that the pixels of the left and right video signals are respectively supplied to left and right pixels of the display.
9. A stereoscopic video transmission system as claimed in any one of claims 1 to 8 including a 3D LED display connected to receive the left and right video signals from the client.
10. A method of transmitting stereoscopic video from a first location to a second location comprising:
obtaining separate video signals representing left and right video signals of the stereoscopic video;
transmitting the left video signal over an Internet
Protocol network connecting the locations in a first channel; and
transmitting the right video signal over an Internet Protocol network connecting the locations in a second channel; wherein packets of the video data in the first and second channel are denoted as such by having different source IP addresses and/or by having different destination IP addresses.
PCT/GB2012/000042 2011-01-24 2012-01-18 Dual channel stereoscopic video streaming systems WO2012101395A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
GB1101212.7A GB2487537A (en) 2011-01-24 2011-01-24 Stereoscopic video transmission over the internet with separate IP channels for the left and right images.
GB1101212.7 2011-01-24

Publications (1)

Publication Number Publication Date
WO2012101395A1 true WO2012101395A1 (en) 2012-08-02

Family

ID=43769541

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/GB2012/000042 WO2012101395A1 (en) 2011-01-24 2012-01-18 Dual channel stereoscopic video streaming systems

Country Status (3)

Country Link
GB (1) GB2487537A (en)
HK (1) HK1154462A2 (en)
WO (1) WO2012101395A1 (en)


Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090096726A1 (en) * 2007-10-15 2009-04-16 Nec Lcd Technologies, Ltd. Display device, terminal device, display panel, and display device driving method
EP2219383A2 (en) * 2009-02-17 2010-08-18 Samsung Electronics Co., Ltd. 2D/3D display system, 2D/3D display apparatus and control method of 2D/3D display apparatus

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050175083A1 (en) * 2000-09-01 2005-08-11 Gutierrez Novelo Manuel R. Stereoscopic video capturing device and dual receiver with viewer for three-dimension display, and method thereof
JP4348337B2 (en) * 2003-07-30 2009-10-21 誠次郎 富田 Light source device for stereoscopic image display device
US7221332B2 (en) * 2003-12-19 2007-05-22 Eastman Kodak Company 3D stereo OLED display
US7522184B2 (en) * 2004-04-03 2009-04-21 Li Sun 2-D and 3-D display
TWI381191B (en) * 2007-12-03 2013-01-01 Au Optronics Corp Three-dimensional display device and fabricating method thereof
DE202008001645U1 (en) * 2008-02-06 2008-05-15 Maibom, Frank 3D polarizer shader module for LED screens
KR100972791B1 (en) * 2008-07-17 2010-07-30 주식회사 파버나인코리아 Organic light emitting diode for stereoscopic display
WO2010104881A2 (en) * 2009-03-10 2010-09-16 Bradley Nelson 3d stereoscopic display system for large format led displays
US20100231700A1 (en) * 2009-03-10 2010-09-16 Lsi Industries, Inc. 3d screen with modular polarized pixels
US20100315486A1 (en) * 2009-06-15 2010-12-16 Electronics And Telecommunication Research Institute Stereoscopic video service providing/receiving method and apparatus in digital broadcasting system
KR100959506B1 (en) * 2010-01-29 2010-05-25 갤럭시아일렉트로닉스(주) 3d light board


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
KIYOUNG LEE: "Software-based realization of secure stereoscopic HD video delivery over IP networks", PROCEEDINGS OF SPIE, vol. 6016, 1 January 2005 (2005-01-01), pages 601604 - 601604-11, XP055022561, ISSN: 0277-786X, DOI: 10.1117/12.630151 *
MILOS LISKA: "Design and Implementation of Capturing, Transmission, and Display of Stereoscopic Video in DV Format", DIPLOMA THESIS, 1 January 2004 (2004-01-01), Brno,, pages 1 - 53, XP055022563, Retrieved from the Internet <URL:http://www.internet2.edu/communities/dvts/dvtsrsrcs/dvts-3d.pdf> [retrieved on 20120321] *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11016579B2 (en) 2006-12-28 2021-05-25 D3D Technologies, Inc. Method and apparatus for 3D viewing of images on a head display unit
US11036311B2 (en) 2006-12-28 2021-06-15 D3D Technologies, Inc. Method and apparatus for 3D viewing of images on a head display unit
US11228753B1 (en) 2006-12-28 2022-01-18 Robert Edwin Douglas Method and apparatus for performing stereoscopic zooming on a head display unit
US11275242B1 (en) 2006-12-28 2022-03-15 Tipping Point Medical Images, Llc Method and apparatus for performing stereoscopic rotation of a volume on a head display unit
US11315307B1 (en) 2006-12-28 2022-04-26 Tipping Point Medical Images, Llc Method and apparatus for performing rotating viewpoints using a head display unit
US11520415B2 (en) 2006-12-28 2022-12-06 D3D Technologies, Inc. Interactive 3D cursor for use in medical imaging

Also Published As

Publication number Publication date
HK1154462A2 (en) 2012-04-20
GB2487537A (en) 2012-08-01
GB201101212D0 (en) 2011-03-09

Similar Documents

Publication Publication Date Title
CN102215364B (en) Format method and the communication interface of the first video data and the second video data
US7760429B2 (en) Multiple mode display device
JP5238429B2 (en) Stereoscopic image capturing apparatus and stereoscopic image capturing system
US20050185711A1 (en) 3D television system and method
US20090116108A1 (en) Lenticular Autostereoscopic Display Device and Method, and Associated Autostereoscopic Image Synthesizing Method
US20080291267A1 (en) Lenticular Autostereoscopic Display Device and Method, and Associated Autostereoscopic Image Synthesising Method
CN101883215A (en) Imaging device
CN103823308A (en) Integrated-imaging double-vision 3D (Three-Dimensional) display device based on polarization gratings
TWI413405B (en) Method and system for displaying 2d and 3d images simultaneously
CN100594737C (en) 3D image display method and system
JP2006098779A (en) Structure of stereo image data, recording method for same, and displaying/reproducing method for same
TWI527434B (en) Method for using a light field camera to generate a three-dimensional image and the light field camera
US8723920B1 (en) Encoding process for multidimensional display
WO2012101395A1 (en) Dual channel stereoscopic video streaming systems
CN108627991A (en) Double vision 3D display device and method based on Lenticular screen
CN102630027B (en) Naked eye 3D display method and apparatus thereof
US20090295909A1 (en) Device and Method for 2D-3D Switchable Autostereoscopic Viewing
TWI772997B (en) Multi-view 3D display screen, multi-view 3D display device
US20130088485A1 (en) Method of storing or transmitting auto-stereoscopic images
Cserkaszky et al. Towards display-independent light-field formats
Lv et al. Polarizer parallax barrier 3D display with high brightness, resolution and low crosstalk
JPH07307959A (en) Stereoscopic video device
Sasaki et al. Color moiré reduction method for thin integral 3d displays
CN108919506A (en) A kind of double vision 3D display device and method
KR20130003966A (en) Film patterned retarder stereoscopic 3d display device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12700872

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12700872

Country of ref document: EP

Kind code of ref document: A1