GB2481094A - 3D display with automatic switching between 2D and 3D display modes - Google Patents

3D display with automatic switching between 2D and 3D display modes

Info

Publication number
GB2481094A
GB2481094A GB1108670.9A GB201108670A
Authority
GB
United Kingdom
Prior art keywords
viewer
viewing device
image
display screen
display
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
GB1108670.9A
Other versions
GB201108670D0 (en)
GB2481094B (en)
Inventor
Ian Bickerstaff
Simon Benson
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment Europe Ltd
Original Assignee
Sony Computer Entertainment Europe Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Computer Entertainment Europe Ltd filed Critical Sony Computer Entertainment Europe Ltd
Priority to GB1108670.9A priority Critical patent/GB2481094B/en
Publication of GB201108670D0 publication Critical patent/GB201108670D0/en
Publication of GB2481094A publication Critical patent/GB2481094A/en
Application granted granted Critical
Publication of GB2481094B publication Critical patent/GB2481094B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 13/00 Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N 13/30 Image reproducers
    • H04N 13/332 Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • H04N 13/334 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using spectral multiplexing
    • H04N 13/337 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using polarisation multiplexing
    • H04N 13/341 Displays for viewing with the aid of special glasses or head-mounted displays [HMD] using temporal multiplexing
    • H04N 13/356 Image reproducers having separate monoscopic and stereoscopic modes
    • H04N 13/361 Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
    • H04N 13/366 Image reproducers using viewer tracking
    • H04N 13/383 Image reproducers using viewer tracking for tracking with gaze detection, i.e. detecting the lines of sight of the viewer's eyes
    • H04N 2213/00 Details of stereoscopic systems
    • H04N 2213/008 Aspects relating to glasses for viewing stereoscopic images

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

A stereoscopic display 10 capable of automatically switching between 2D and 3D image modes in dependence upon the detection of whether 3D glasses 215 are being used to view the display screen. A camera 125 may capture images 300 of the viewer, and image processing, such as detecting the relative position of the 3D spectacles with respect to the viewer, may be used to determine whether the eyewear is being worn by the viewer. A preset degree of hysteresis may be applied when switching between modes. The image brightness may be compressed when switching modes to compensate for the light attenuation of the glasses. The display may communicate bi-directionally with the viewing glasses, with query and confirmation signals sent between the devices. The glasses may comprise pupil detecting means to detect the direction of gaze of each eye and adjust the passage of light through the rest of the lens. In the 3D mode, active shutter or polarised glasses may be used to view the stereo pair of images.

Description

DISPLAY APPARATUS AND METHOD
This is a divisional application from GB 1009250.0.
The present invention relates to a display apparatus and method.
Recently, films for showing in a cinema on a projection screen are increasingly being made in a so-called three-dimensional (3D) format. Such a format may allow a viewer (user) to view a film such that the film appears three dimensional, for example if the viewer is wearing a suitable viewing device such as 3D glasses.
To create an illusion that an image is three-dimensional (3D), two slightly different images may be viewed together so that one of the images is viewed by a user's left eye and the other image is viewed by a user's right eye. Provided that the two images correspond to two slightly different views of the same scene (for example, each image in the pair being as if seen from the user's left eye and right eye respectively), the user's brain will be fooled into thinking that the pair of images is one three-dimensional image when the images are viewed in a suitable manner. An object within the images will appear at an apparent depth from the display which depends upon the offset amount between the position of that object in the left-hand image and its position in the right-hand image.
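As a rough illustration (not taken from the patent), the standard stereoscopic viewing geometry relates this on-screen offset to the apparent depth behind the screen; the eye separation, viewing distance and offset used below are assumed example values.

```python
# Minimal sketch, not from the patent: standard stereoscopic geometry relating the
# on-screen offset (disparity) between the left and right images of an object to
# its apparent depth behind the screen.  All values are illustrative assumptions.

def apparent_depth_behind_screen(offset_m, eye_separation_m=0.065, viewing_distance_m=2.0):
    # By similar triangles: offset / eye_separation = depth / (viewing_distance + depth),
    # which rearranges to the expression below (valid for uncrossed offsets smaller
    # than the eye separation; a zero offset places the object in the screen plane).
    if offset_m >= eye_separation_m:
        raise ValueError("offset must be smaller than the eye separation")
    return offset_m * viewing_distance_m / (eye_separation_m - offset_m)

# Example: a 10 mm uncrossed offset viewed from 2 m appears about 0.36 m behind the screen.
print(round(apparent_depth_behind_screen(0.010), 2))
```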
In order to try and ensure that each eye sees the image that is intended to be viewed by that eye, many techniques are known. In some techniques, each image of the stereo pair can be reproduced in such a way so as to be separable from the other image. For example, the left-hand image could be displayed next to the right-hand image and a suitable viewer such as a stereoscope used to view the images. This method of displaying the images was used in the earliest forms of 3D images. However, this technique is not very practical for use in a cinema where there is likely to be a plurality of viewers, nor is this technique particularly practical for a television because typically only one viewer is able to view the stereo images at a time.
Alternatively, light relating to a left-hand image intended to be viewed by the left eye may be polarised in the vertical direction, whilst light relating to a right-hand image intended to be viewed by the right eye may be polarised in the horizontal direction. The left-hand image and right-hand image are then superimposed on each other. By wearing appropriate glasses in which a polarisation of each lens corresponds with the desired image to be viewed, the correct image for each eye will be viewed by the user. Many other systems of displaying and viewing the images are also known such as circular polarisation, coloured filters (e.g. red/cyan anaglyph), chromadepth, and anachrome; these systems require a user to wear an appropriate pair of glasses.
Other techniques involve alternately displaying the left-hand image and right-hand image. For example, the left-hand and right-hand images can be displayed alternately at a rate which corresponds to a frame rate for each image which is faster than a user's persistence of vision (typically 24 frames per second). In other words, a sequence of frames comprising alternate left-hand (L) and right-hand (R) images (e.g. LRLRLRLRLR) could be displayed at a frame rate of 60 frames per second (i.e. 30 frames per second for each image). The user can then view the images using an appropriate pair of 3D glasses which alternately block the corresponding left-hand image or right-hand image so that the correct image is viewed by the correct eye. In this technique, the 3D glasses may be operable to alternately obscure the right lens and the left lens in synchrony with the displayed images so that the left eye sees only the left images and the right eye sees only the right images. 3D glasses for viewing the alternate left-hand and right-hand images are typically called "shutter glasses" because the lenses act as shutters to alternately blank or obscure the left-hand and right-hand images. However, shutter glasses can be expensive.
As well as the use of a 3D format for films, so called 3D TVs are becoming increasingly popular. Some implementations of 3D TVs require the use of polarised 3D glasses similar to those used for viewing 3D films in a cinema. Other implementations use shutter glasses and alternate display of left-hand and right-hand images in a similar manner to that described above.
Some other implementations of 3D TVs use a horizontal array of vertically aligned longitudinal lenticular lenses on the front of the television display screen to direct each of the left-hand and right-hand images towards the correct eye, thus removing the need for the user to wear 3D glasses in order to view a three-dimensional image. However, the use of lenticular lenses in front of the display screen can lead to a poor 3D experience for the user because the resolution of the image may be reduced and there may only be a few (e.g. one or two) viewing positions where a satisfactory 3D effect may be obtained. Furthermore, if the viewer moves whilst viewing the display, the 3D effect can be lost or the viewer may experience headaches and/or eye discomfort.
Additionally, whilst 3D TV channels are starting to come on air, most TV broadcasts are still broadcast in a two-dimensional (2D) format. In other words, most TV broadcasts are typically viewed so that the images appear two-dimensional. Therefore, some 3D TVs may also be required to display 2D images, for example when the broadcast content only comprises 2D images. However, where both a 2D representation and a 3D representation of a TV programme are broadcast, a user may have to select manually between 2D and 3D display. This can be annoying for the user and may detract from the viewing experience.
Alternatively, the TV may automatically switch to 3D display when 3D content (a 3D representation of a TV programme) is received from a broadcaster. However, the user may then have to fetch their 3D glasses quickly from wherever they keep them in order to view the 3D content. Alternatively, if the viewer does not own a pair of 3D glasses, they may be unable to watch the TV programme.
The present invention seeks to alleviate or mitigate the above problems.
In a first aspect, there is provided a display apparatus comprising: a display screen for displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting means for detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and switching means for causing the display screen to switch between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen.
In a second aspect, there is provided a method for displaying a two-dimensional representation of a source image and a three-dimensional representation of the source image to a viewer, the method comprising: displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and switching between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen.
Advantageously, embodiments of the present invention can allow a display apparatus such as a 3D TV to switch to the appropriate mode depending on whether the viewer is using an appropriate viewing device.
For example, a first TV programme could be broadcast with associated 2D content, and a second programme having associated 2D and 3D content could be broadcast after the first TV programme, i.e. when the first programme finishes. If the detecting means detects that the viewer is wearing a suitable viewing device such as 3D glasses (i.e. using the viewing device to view the display screen), the switching means can cause the device to switch to the second mode so that the viewer can view the 3D content of the second programme. However, if the viewer is not using the viewing device to view the display screen, then the switching means can cause the display screen to operate in the first mode so that the 2D content of the second programme is displayed to the viewer.
This improves a viewing experience for the viewer as well as simplifying operation of the display apparatus because the viewer does not need to select manually between 2D and 3D display modes. Furthermore, the display apparatus is less likely to display a 3D representation of the source image if the viewer is unable to view the 3D representation as a 3D image. This also improves the viewing experience because viewing a 3D representation of a source image without the appropriate viewing device can cause the source image to appear as a meaningless jumble of two images.
A display apparatus and method is disclosed. In the following description, a number of specific details are presented in order to provide a thorough understanding of embodiments of the present invention. It will be apparent, however, to a person skilled in the art that these specific details need not be employed to practise the present invention. Conversely, specific details known to the person skilled in the art are omitted for the purposes of clarity in presenting the embodiments.
Embodiments of the present invention will now be described by way of example with reference to the accompanying drawings, in which:
Figure 1 is a schematic diagram of a 3D TV in accordance with embodiments of the present invention;
Figure 2 is a schematic representation of a viewer viewing a two dimensional representation of an image in a first mode of operation in accordance with embodiments of the present invention;
Figure 3 is a schematic diagram of a viewer viewing a three dimensional representation of an image in a second mode of operation in accordance with embodiments of the present invention;
Figure 4 is a schematic diagram of a 3D TV and a 3D viewing device in accordance with embodiments of the present invention;
Figure 5 is a schematic diagram of a 3D viewing device in accordance with embodiments of the present invention;
Figure 6 is a flow chart of a method of controlling switching between the 2D representation and the 3D representation in accordance with embodiments of the present invention;
Figure 7 is a schematic diagram of a 3D TV and a 3D viewing device in accordance with embodiments of the present invention;
Figure 8 is a flowchart of a method of displaying a two dimensional representation and a three dimensional representation of a source image in accordance with embodiments of the present invention;
Figure 9 is a schematic diagram of a viewing device in accordance with embodiments of the present invention;
Figure 10 is a schematic diagram of a viewing device being used to view a 3D TV in accordance with embodiments of the present invention; and
Figure 11 is a schematic diagram of a viewing device in accordance with embodiments of the present invention.
A 3D television in accordance with embodiments of the present invention will now be described with reference to Figure 1.
Figure 1 shows a 3D television 10 in accordance with embodiments of the present invention. The 3D TV 10 comprises a signal input port 15 which is operable to receive media data from a media data source. For example, the media data could comprise a broadcast signal broadcast by a television station. However, any other suitable media data could be input to the signal input port 15 such as media data read from a DVD by a DVD player. In the description below, a broadcast input signal is referred to. However, the term "broadcast input signal" should be taken to include any form of media data input signal such as media data from a DVD player, media data streamed over the internet, a broadcast signal, and the like.
The 3D TV 10 comprises a processor 100; a display 105 (display screen); an audio output 110; a memory 115; a user input interface 120; a camera 125; and a 3D communication link 130.
The processor 100 is operable to decode the broadcast input signal so that the content of the broadcast input signal can be output on the display 105 and the audio content of the broadcast input signal can be output via the audio output 110.
In embodiments, the broadcast input signal comprises an MPEG data stream, and the processor 100 is operable to decode the MPEG data stream using known techniques so as to generate appropriate output for the display 105 and the audio output 110. In embodiments, the broadcast input signal comprises one or more source images. For example, the source images could be image frames of a video sequence. However, any other suitable source image could be used.
A source image may have an associated 2D representation of the source image and/or a 3D representation of the source image. For example, a source image may only have an associated 2D representation. Alternatively, the source image may have both a 2D representation and a 3D representation associated with the source image, or the source image may only have a 3D representation of the source image associated with it.
Here, a 2D representation of the source image is taken to mean an image which, when viewed by the viewer, appears to the viewer to be two-dimensional. Typically, the 2D representation comprises an image which corresponds to the source image.
A 3D representation of the source image is taken to mean that, when the 3D representation of the source image is viewed by a viewer, the viewer perceives the source image as being three-dimensional. In embodiments, the 3D representation of the source image comprises a left-hand image for viewing by a viewer's left eye, and a right-hand image for viewing by a viewer's right eye so that, when the left-hand image and the right-hand image are viewed together, the viewer perceives the source image to be three dimensional. The term "viewed together" should be understood to mean that the left-hand image and the right-hand image can be viewed simultaneously, alternately, or in any other suitable way such that a user perceives a three-dimensional effect. In other words, the three-dimensional representation can be thought of as comprising a stereo pair of images, in which the stereo pair comprises the left-hand image and the right-hand image. Additionally, it will be appreciated that the 3D representation could be any other representation of the source image such that the source image appears to the viewer to be three-dimensional. For example, the 3D representation could be a holographic representation of the source image.
It will be appreciated that any other suitable broadcast input signal could be decoded by the processor 100. For example, the broadcast input signal could be a digital satellite signal, a digital cable television signal, or a digital broadcast signal transmitted from a terrestrial transmitter. In some embodiments, the signal input port 15 is operable to receive media data from other media sources such as a DVD player, Blu-ray disc player, video cassette recorder, portable media player, and the like, although any suitable media source could be used to input media data to the signal input port 15. Therefore, the term "broadcast input signal" should be taken to mean any suitable input signal including broadcast signals from a broadcast source, other signals from storage media, on-line content delivered via the internet, and the like.
The memory 115 is operable to communicate bi-directionally with the processor 100.
In an embodiment, the memory 115 comprises dynamic random access memory (DRAM), although it will be appreciated that any other suitable memory could be used. In embodiments, the memory 115 stores firmware and/or software for operation of the 3D TV 10, as well as other data associated with the operation of the 3D TV 10. However, it will be appreciated that the memory 115 may store any type of data suitable for operation of the 3D TV 10.
The user input interface 120 provides means by which a user can control the 3D TV 10. In embodiments, the user input interface 120 is operable to receive signals from a remote control handset operated by the user, for example pulsed infra-red control signals emitted by the remote control handset in response to user input. However, it will be appreciated that any other suitable form of user input to the user input interface 120 to control the 3D TV 10 could be used, for example via input buttons physically located on the television. Remote control handsets for controlling televisions are known in the art and so will not be described in more detail herein.
In embodiments, the 3D TV 10 also comprises the camera 125 although in other embodiments, the camera 125 need not be used or be present. The camera 125 is operable to capture images of an environment in which the 3D TV 10 is situated. In embodiments, the camera 125 is used to detect whether a viewer is using a 3D viewing device to view the display 105. This functionality will be described in more detail later below.
The 3D TV 10 also comprises a 3D communication link interface 130 operable to communicate bi-directionally with the processor 100. The 3D communication link interface is also operable to communicate with a 3D viewing device (not shown in Figure 1), for example to synchronise the shuttering of the lenses of a 3D viewing device, such as so-called shutter glasses, with the display of a 3D representation of a source image so that the left-hand and right-hand images are viewed by the correct eyes. This functionality will be described in more detail later below.
Display of a 2D representation and a 3D representation of an image in accordance with embodiments of the present invention will now be described with reference to Figures 2 and 3.
Figure 2 shows a schematic representation of a viewer 200 viewing a 2D representation of a source image on the display 105 in accordance with embodiments of the present invention. In the example shown in Figure 2, the viewer 200 is viewing a two dimensional representation of a source image comprising an image of a car 205. The two dimensional representation of the car 205 will appear to the viewer 200 as if positioned in a plane of the display 105. In other words, the object will appear to the viewer 200 to be positioned at the same distance from the viewer 200 as the plane of the display 105 so that the source image appears two-dimensional.
The situation shown in Figure 2 corresponds to a conventional TV for displaying two dimensional representations of television programmes. However, in order for the viewer 200 to view a 3D representation of the car 205 as a 3D image, an alternative method for displaying the source image should be used. One method of achieving this will now be described with reference to Figure 3.
Figure 3 shows a schematic diagram of a viewer viewing a three dimensional representation of a source image (comprising the car 205) in accordance with embodiments of the present invention.
In order for the viewer 200 to view the car 205 so that it appears as a 3D image, many different techniques may be employed. In an embodiment, the display 105 is operable to display a left-hand image 210a and right-hand image 210b which together form a three dimensional representation of the source image. In embodiments, the viewer 200 may wear a viewing device such as a pair of 3D glasses 215 as shown in Figure 3 so as to view the 3D representation as a three-dimensional image.
The 3D glasses have a left lens 220L and a right lens 220R. The left lens 220L is arranged so that light relating to the left image 210a can be directed to a left eye of the viewer 200. The right lens 220R is arranged so that light relating to the right image 210b can be directed to a right eye of the viewer 200. Accordingly, the viewer 200 should perceive the car 205 as being three-dimensional. In other words, the car 205 should appear to the viewer 200 to be at a distance from the viewer which is different from the distance from the viewer 200 to the display 105. The apparent depth from the display at which the car 205 will appear is dependent upon an offset amount 207 between the left-hand image 210a and the right-hand image 210b which correspond to the car 205. However, it will be appreciated that the offset amount 207 could be such that the car 205 appears at the same distance from the viewer 200 as the display 105.
In an embodiment, the display 105 is operable to display the left image 210a so that light corresponding to the left image 210a is polarised so as to have an anticlockwise circular polarisation. Additionally, the display 105 is operable to display the right image 210b so that light corresponding to the right image 210b is polarised so as to have a circular polarisation in a clockwise direction. The left lens 220L of the viewing device 215 is arranged so as to only allow light having an anticlockwise circular polarisation to be passed through the left lens 220L. The right lens 220R of the viewing device 215 is arranged so as to allow light having a circular polarisation in a clockwise direction through the right lens 220R.
Accordingly, light from the left image 210a will be directed to a left eye of the viewer 200, and light from the right image 210b will be directed to the right eye of the viewer 200.
However, it will be appreciated that any other method of discriminating between the left-hand and right-hand images could be used. For example, the left image 210a could be polarised in a horizontal direction and the right image could be polarised in a vertical direction. The lenses of the viewing device 215 could then be arranged accordingly so that the light is directed to the appropriate eye. However, any other suitable polarisation angle could be used.
It will be understood that, in embodiments, the left-hand image 210a and the right-hand image 210b can be directed to the appropriate eyes by using an arrangement of polarised filters and polarised images as described above. For example, the polarisation could be linear or circular. However, other forms of directing the light from the left image 210a and the right image 210b to the left and the right eye of the viewer 200 respectively, such as anachrome, chromadepth, interference filter technology, or any other suitable method, could be used.
As mentioned above, the processor 100 is operable to process the media data input to the signal input port 15 so as to generate images for displaying on the display 105. In embodiments, the media data comprises image data relating to a sequence of images (also referred to as image frames), for example image frames of a film, television programme, or other time dependent sequence of images. In this case, each image frame can be considered to be a source image.
In order for a viewer to perceive a 3D effect, each image frame comprises a left-hand image for viewing by the left eye of the viewer and a right-hand image for viewing by the right eye of the viewer in a similar manner to that illustrated with respect to Figure 3. When the left-hand image and the right-hand image are viewed together, the viewer's brain should be tricked into thinking that the image frame is three-dimensional. As mentioned above, the term "viewed together" should be understood as meaning that the left-hand image and right-hand image can be viewed simultaneously, that they can be viewed alternately, or that they can be viewed in any other suitable way such that a user perceives a three-dimensional effect.
In an embodiment, for each image frame, the display 105 is operable to display alternately a left-hand image (e.g. the left-hand image 210a) and a right-hand image (e.g. the right-hand image 210b). In embodiments, the processor 100 is operable to cause the display to display the left-hand and right-hand images alternately at a rate which corresponds to a frame rate for each image which is faster than a user's persistence of vision (typically 24 frames per second). For example, the sequence of image frames comprising alternate left-hand (L) and right-hand (R) images (e.g. LRLRLRLRLR) could be displayed at an image rate of 60 images per second (i.e. 30 image frames per second), although any other suitable image rate and/or frame rate could be used.
In these embodiments, the viewing device 215 comprises 3D shutter glasses. In these embodiments, the left lens 220L and the right lens 220R of the viewing device 215 each comprise liquid crystal filters which, on application of an appropriate control signal, become substantially opaque. Here, substantially opaque is taken to mean sufficiently opaque or obscure so as to block or "blank" light from the left-hand or right-hand image such that, when the left lens 220L and right lens 220R are alternately blanked in synchrony with the display of the left-hand or right-hand image, the viewer perceives a 3D effect.
In embodiments, the viewing device 215 comprises a communication unit operable to communicate bi-directionally with the 3D communication link interface 130 of the 3D TV 10.
In embodiments, the processor 100 is operable to control the 3D communication link interface 130 to transmit synchronisation data to the viewing device 215. The communication unit of the viewing device 215 is operable to receive the synchronisation data from the 3D communication link interface 130, and cause the left lens 220L and the right lens 220R to alternately block light from the left-hand and right-hand image respectively in synchronisation with the display. The viewer 200 should then perceive a three-dimensional effect, such as a 3D image of the object 205.
In embodiments, the 3D communications link interface 130 is operable to communicate with the communication unit of the viewing device 215 using infra-red light.
However, any other suitable communication method, wireless or otherwise, could be used.
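As a purely illustrative sketch of the temporal multiplexing described above, the loop below pairs each displayed image with the lens that should be open; the display and glasses objects and their methods are hypothetical placeholders rather than an interface defined by the patent.

```python
# Illustrative only: alternate left/right presentation with the opposite shutter
# lens blanked in synchrony.  The display/glasses objects and their methods are
# hypothetical placeholders; a real system would pace the shutters from the
# synchronisation data sent over the 3D communication link interface 130 rather
# than a software sleep.
import time

IMAGE_RATE = 60.0                  # images per second, i.e. 30 image frames per second
IMAGE_PERIOD = 1.0 / IMAGE_RATE

def present_stereo_sequence(display, glasses, stereo_frames):
    for left_image, right_image in stereo_frames:   # one image frame = one L/R pair
        for eye, image in (("L", left_image), ("R", right_image)):
            glasses.open_only(eye)                  # open this lens, blank the other
            display.show(image)
            time.sleep(IMAGE_PERIOD)
```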
However, as mentioned above, a viewer may not own a viewing device such as 3D glasses for viewing a 3D representation. Alternatively, where a TV programme is broadcast both as 2D content (having a 2D representation) and 3D content (having a 3D representation), a user may have to switch manually between the 2D content and the 3D content. This can impair an overall viewing experience for the user.
Therefore, in embodiments, the 3D TV 10 is operable to detect if a viewing device is being used by the user to view the display screen 105. As mentioned above, the 3D TV 10 can display a 2D representation and a 3D representation of a source image. In some embodiments, the 3D TV 10 is operable to display either the 2D representation or the 3D representation.
However, in other embodiments, the 3D TV 10 is operable to display a combination of the 2D representation and the 3D representation as appropriate.
In embodiments, the 3D TV 10 is operable to display, to the viewer 200, the two- dimensional representation of the source image in a first mode of operation and the three-dimensional representation of the source image in a second mode of operation. The processor is operable to cause the display 105 to switch between the first mode and the second mode in dependence upon the detection of whether the viewing device 215 is currently being used to view the display 105. The detection of whether the viewing device is currently being used to view the display 105 will be described in more detail later below. Accordingly, embodiments of the present invention advantageously allow the 3D TV 10 to switch to the appropriate mode depending on whether the viewer is using an appropriate viewing device.
For example, a first TV programme could be broadcast with associated 2D content, and a second programme having associated 2D and 3D content could be broadcast after the first TV programme, i.e. when the first programme finishes. If the viewer 200 is wearing 3D glasses (i.e. using the viewing device 215 to view the display 105), the processor 100 can cause the 3D TV 10 to switch to the second mode so that the viewer 200 can view the 3D content of the second programme. However, if the viewer 200 is not wearing 3D glasses, that is, they are not using the viewing device 215 to view the display 105, then the processor 100 can cause the 3D TV 10 to operate in the first mode so that the 2D content of the second programme is displayed to the viewer 200. More generally, the processor 100 can act as a switching means for causing the display 105 to switch between the first mode (2D representation) and the second mode (3D representation).
In embodiments, the processor is operable to cause the display 105 to display the three-dimensional representation of the source image in the second mode if it is detected that the viewing device is currently being used to view the display 105.
A method for detecting whether the viewing device 215 is currently being used by the viewer 200 to view the display 105 will now be described in more detail with reference to Figures 4 and 5.
Figure 4 is a schematic diagram of a 3D TV and a 3D viewing device in accordance with embodiments of the present invention. In particular, Figure 4 shows the 3D TV 10, the camera 125, and the viewing device 215. In Figure 4, the camera 125 is arranged so as to capture one or more images of the viewing device 215 as indicated by dashed lines 300. For the sake of clarity, the viewer 200 is not shown in Figure 4, but it is to be assumed for the purposes of the description of Figure 4 that the viewer is wearing the viewing device 215.
In Figure 4, the camera 125 is shown as being external to the 3D TV 10. However, in some embodiments, the camera 125 is housed within the 3D TV 10. It will be appreciated that the camera could communicate with the processor 100 in any suitable way, such as wirelessly, and that the camera 125 could be internal or external with respect to the 3D TV 10.
As mentioned above, the camera 125 is operable to capture one or more images of the real environment. In embodiments, the camera 125 is operable to transmit image data relating to the captured images to the processor 100. In embodiments, the processor 100 is operable to carry out image analysis on the images captured by the camera so as to detect whether the viewing device 215 is being used to view the display 105.
Referring to Figure 5, which shows a schematic diagram of the viewing device 215, the processor 100 is operable to carry out image processing on the image data received from the camera 125 to detect the outline of the viewing device 215 using known techniques. For example, the processor 100 could carry out known template matching techniques to detect the outline of the viewing device 215.
If the processor 100 detects that the images captured by the camera comprise the viewing device 215 as detected by template matching, then, in embodiments, the processor interprets this to mean that the viewing device 215 is currently being used to view the display 105. Accordingly, the processor 100 causes the 3D representation of the source image to be displayed on the display 105.
If the viewing device 215 is not detected in the captured images, then the processor 100 causes the 2D representation of the source image to be displayed, because the viewer 200 is unlikely to be currently using the viewing device 215 to view the display 105.
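By way of a hedged sketch, the template matching mentioned above could be prototyped with OpenCV as follows; the template file name and the match threshold are assumptions for illustration and are not part of the patent.

```python
# Rough sketch using OpenCV template matching to decide whether the outline of the
# viewing device 215 appears in a captured image.  The template image and the
# match threshold are assumed for illustration only.
import cv2

TEMPLATE = cv2.imread("glasses_template.png", cv2.IMREAD_GRAYSCALE)  # assumed asset
MATCH_THRESHOLD = 0.7                                                # assumed value

def viewing_device_present(frame_bgr):
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    result = cv2.matchTemplate(grey, TEMPLATE, cv2.TM_CCOEFF_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = TEMPLATE.shape
    # Return a flag plus the bounding box of the best match (x, y, w, h).
    return max_val >= MATCH_THRESHOLD, (max_loc[0], max_loc[1], w, h)
```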
However, if the detection of whether the viewer is viewing the display is based only on whether the viewing device is present in the captured images, in some situations the processor 100 may cause the display 105 to switch between the first mode and the second mode unintentionally. For example, the viewer 200 may have placed the viewing device 215 on a table within a field of view of the camera but the viewer 200 may not be wearing the viewing device. Therefore, on detection of the presence of the viewing device 215 within the captured images, the processor 100 may cause the display 105 to display the 3D representation even though the viewer 200 is not wearing the viewing device 215. The viewer 200 would then be unlikely to be able to view the display in a satisfactory manner.
Additionally, for example, the viewer 200 may not even be in the same room as the viewing device 215 and the 3D TV 10.
Accordingly, in some embodiments, the processor 100 is operable to detect whether the viewing device 215 is currently being used to view the display 105 in dependence upon a relative position of the viewing device 215 with respect to the viewer 200.
In embodiments, the processor is operable to carry out face recognition on the captured images using known techniques so as to detect a position of the viewer's face. If the position of the viewing device 215 is detected to substantially correspond with the position of the viewer's face, then the processor 100 can cause the display 105 to switch to the second mode so as to display the 3D representation of the source image. Therefore, a likelihood that the display mode is accidentally switched, for example in response to detection of the viewing device 215 in images captured by the camera 125, is reduced. Face detection techniques are known in the art and so will not be described in more detail herein.
In some embodiments, the processor 100 is operable to detect whether the position of the lens of the viewing device corresponds with a position on the viewer's face which matches a position of the viewer's eyes. This further improves detection of whether the viewer 200 is using the viewing device 215 to view the display 105.
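One possible way to combine the face-position check with the detection above is sketched here; the Haar cascade and the "upper half of the face" heuristic for the eye region are assumptions, not the patent's prescribed method.

```python
# Sketch of checking that the detected viewing device overlaps the eye region of a
# detected face.  The Haar cascade and the "upper half of the face" heuristic are
# illustrative assumptions.
import cv2

FACE_CASCADE = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def viewing_device_worn(frame_bgr, device_box):
    """device_box is (x, y, w, h) of the detected viewing device in the frame."""
    grey = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = FACE_CASCADE.detectMultiScale(grey, scaleFactor=1.1, minNeighbors=5)
    dx, dy, dw, dh = device_box
    centre_x, centre_y = dx + dw / 2, dy + dh / 2
    for (fx, fy, fw, fh) in faces:
        # The eyes lie roughly in the upper half of the detected face rectangle.
        if fx <= centre_x <= fx + fw and fy <= centre_y <= fy + fh / 2:
            return True
    return False
```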
In some embodiments, the viewing device 215 comprises one or more markers (as illustrated in Figure 5) for detection by analysis of the captured images by the processor 100.
As shown in Figure 5, the viewing device 215 comprises three triangular markers arranged in corners of the viewing device 215. In particular, the triangular markers comprise a top left triangular marker 305 in the top left corner of the viewing device when viewed from the front (the front being a side facing away from the viewer 200 when the viewing device is worn by the viewer 200), a bottom left triangular marker 310 in the bottom left corner of the viewing device 215 when viewed from the front, and a bottom right triangular marker 315 in the bottom right corner of the viewing device when viewed from the front. The viewing device 215 shown in Figure 5 also comprises a cross-shaped marker 320 near the top middle of the viewing device 215 (positioned close to the bridge of the 3D glasses). However, it will be appreciated that any other suitable number of markers could be used and that the markers could be any suitable shape and/or size.
The markers help distinguish the viewing device 215 from other objects having a similar shape and size, such as a pair of sunglasses, or prescription glasses worn by the viewer to correct for vision defects such as short sight. In embodiments, the processor 100 is operable to analyse the captured images using known techniques so as to detect the presence of the markers. If no markers are detected, then it is unlikely that the viewer is currently using the viewing device 215 to view the display 105. In embodiments, all of the markers should be detected by the processor in order to determine that the viewing device is present in the field of view of the camera 125. However, in other embodiments, only some of the markers need to be detected.
In some situations, the viewer may momentarily turn their head away from the 3D TV 10 whilst watching it in the second mode (with the 3D representation being displayed), for example to see who has just entered the room. However, this may mean that the viewer 200 is not face-on to the camera 125. In other words, the captured images may comprise a profile of the viewer 200, or the captured images may correspond to images of the back of the viewer's head. In this situation, the processor 100 is unlikely to be able to detect the presence of the viewing device 215 in the captured image. Therefore, the processor 100 could cause the display 105 to switch to the first mode to display the 2D representation, even though the viewer 200 is still watching the TV.
To address this problem, in embodiments, the processor 100 is operable to control switching between the first mode (2D representation) and the second mode (3D representation) so that, with respect to the detection of whether the viewing device is currently being used to view the display 105, there is a preset degree of hysteresis when switching between the first mode and the second mode. This will now be described in more detail with reference to Figure 6.
Figure 6 is a flow chart of a method of controlling switching between the 2D representation and the 3D representation in accordance with embodiments of the present invention.
At a step s100, the processor 100 causes the display 105 to display the 2D representation of the source image in the first mode of operation. Then, at a step s105, the processor 100 detects whether the viewing device 215 (3D glasses) is present in the images captured by the camera 125. In some embodiments, the processor 100 is also operable to detect, at the step s105, whether a position of the viewing device corresponds to a position of the viewer's face so as to detect whether the viewing device 215 is being used to view the display 105.
If the viewing device 215 is detected in the captured images, then, at a step s110, the processor 100 detects whether the viewing device 215 is present in the captured images for a time t greater than or equal to a first threshold period of time THR1. In other words, the processor detects whether t >= THR1. Typically, THR1 = 0.5 seconds, although any other suitable period of time could be used.
In some embodiments, THR1 corresponds to a time period of an integer multiple n of the duration of an image frame. This simplifies detection and synchronisation of switching between the first and second modes. For example, for a frame rate of 24 frames per second (fps) (with each frame having a duration of 1/24 ≈ 0.0417 seconds) and for THR1 = 0.5 seconds, then n = 12. However, any other suitable integer number of frames could be used to determine THR1.
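A short worked example of that calculation, as a sketch only, using the figures quoted above:

```python
# Worked example of choosing THR1 as an integer number of frames, using the values
# quoted in the text (24 fps, 0.5 second threshold).
frame_rate = 24.0
frame_duration = 1.0 / frame_rate          # ~0.0417 seconds per frame
n = round(0.5 / frame_duration)            # n = 12 frames
THR1 = n * frame_duration                  # exactly 0.5 seconds
print(n, THR1)                             # 12 0.5
```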
If, at the step s110, the processor detects that the viewing device 215 is present in the captured images for a time greater than or equal to THR1 (i.e. t >= THR1 is TRUE), then, at a step s115, the processor 100 causes the display 105 to display the 3D representation of the source image. Processing then returns to the step s105.
However, if, at the step s110, the processor detects that the viewing device 215 is present in the captured images for a time less than THR1 (i.e. t >= THR1 is FALSE), then the 2D representation is displayed at the step s100. This advantageously reduces the likelihood that a false detection of the viewing device 215 will cause the display 105 to switch inadvertently to the second mode to display the 3D representation.
Returning to the step s105: if, at the step s105, the viewing device (3D glasses) is not detected as being present in the captured image, then, at a step s120, the processor 100 detects whether the viewing device is not present in the captured images for a time t greater than or equal to a second threshold period of time THR2. In other words, the processor detects whether t >= THR2.
In some embodiments, THR2 corresponds to a time period of an integer multiple n of the duration of an image frame. For example, for a frame rate of 24 frames per second (fps) (with each frame having a duration of 1/24 ≈ 0.0417 seconds) and for THR2 = 0.5 seconds, then n = 12. However, any other suitable integer number of frames could be used to determine THR2. In some embodiments, THR1 is the same as THR2 (THR1 = THR2), although in other embodiments the first threshold period of time THR1 is different from the second threshold period of time THR2 (THR1 != THR2).
If, at the step s120, the processor detects that the viewing device 215 is not present in the captured images for a time greater than or equal to THR2 (i.e. t >= THR2 is TRUE), then processing proceeds to the step s100 and the processor 100 causes the display 105 to display the 2D representation of the source image. Processing then passes to the step s105.
However, if, at the step s120, the processor detects that the viewing device 215 is not present in the captured images for a time less than THR2 (i.e. t >= THR2 is FALSE; in other words, detection of the viewing device occurs in a time less than THR2), then the 3D representation is displayed at the step s115.
For example, the viewer might turn their head away from the camera 125 for 0.3 seconds. This would mean that the viewing device would be unlikely to be detected for 0.3 seconds. If THR2 = 0.5 seconds, then the condition t >= THR2 would not be satisfied if a detection of the viewing device occurred after 0.3 seconds, and therefore the 3D representation will be displayed at the step s115.
This advantageously reduces the likelihood that the display 105 will be caused to switch inadvertently from the second mode showing the 3D representation to the first mode showing the 2D representation if the viewing device 215 is not detected for a short period of time, for example when a user turns their head or the processor 100 is unable to detect the viewing device 215 due to the viewing device 215 being obscured from the field of view of the camera 125.
In some embodiments, if, at the step s120, t >= THR2 is FALSE, i.e. detection of the viewing device occurs within a time period less than THR2, then processing passes to the step s110, as indicated by the dashed line 400, rather than the step s115. This further reduces the likelihood of false detection causing undesirable switching, especially when the 3D TV 10 is in the second mode.
In some embodiments, whether the step s110 or the step s115 is implemented immediately after the step s120 is dependent upon whether the 2D representation or the 3D representation is being displayed.
In embodiments, if the 2D representation is being displayed (i.e. step s100, step s105, then step s120), then processing passes directly from the step s120 to the step s110. In other words, if the 2D representation is being displayed and, at the step s120, the viewing device is not detected in the captured images for greater than or equal to the second threshold period of time THR2 (i.e. the viewing device is not detected for a time period less than THR2), then the processor 100 detects whether the viewing device 215 is present in the captured images for a time t greater than or equal to the first threshold period of time THR1 at the step s110.
However, if the 3D representation is being displayed (i.e. step s115, step s105, then step s120), then processing passes from the step s120 to the step s115. In other words, if the 3D representation is being displayed and, at the step s120, the viewing device is not detected in the captured images for greater than or equal to the second threshold period of time THR2 (the viewing device is not detected for a time period less than THR2), then the processor 100 causes the display 105 to display the 3D representation at the step s115.
The first threshold THR1 and the second threshold THR2 can therefore be thought of as a preset degree of hysteresis when switching between the first mode and the second mode.
The first threshold THR1 and the second threshold THR2 can be set within software, can be set by a user using a suitable calibration process or by any other suitable method for defining THR1 and THR2.
Therefore, by controlling the switching between the first mode and the second mode so that, with respect to the detection of whether the viewing device is being used to view the display 105, there is a preset degree of hysteresis when switching between the first mode and second mode, a likelihood that accidental switching occurs is reduced.
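The hysteresis behaviour of Figure 6 can be summarised by a small state machine such as the sketch below; the per-frame update interface is an illustrative assumption, and the thresholds use the example values quoted in the text.

```python
# Illustrative state machine for the hysteresis of Figure 6: switch 2D -> 3D only
# after the glasses have been seen continuously for THR1, and 3D -> 2D only after
# they have been absent continuously for THR2.  Thresholds use the example values
# from the text; the per-frame update interface is an assumption.
THR1 = 0.5   # seconds of continuous detection before switching to 3D
THR2 = 0.5   # seconds of continuous non-detection before switching back to 2D

class ModeSwitcher:
    def __init__(self):
        self.mode = "2D"            # start in the first mode (step s100)
        self.detected_time = 0.0
        self.undetected_time = 0.0

    def update(self, glasses_detected, dt):
        """Call once per captured camera frame; dt is the frame interval in seconds."""
        if glasses_detected:
            self.detected_time += dt
            self.undetected_time = 0.0
            if self.mode == "2D" and self.detected_time >= THR1:
                self.mode = "3D"    # corresponds to step s115
        else:
            self.undetected_time += dt
            self.detected_time = 0.0
            if self.mode == "3D" and self.undetected_time >= THR2:
                self.mode = "2D"    # corresponds to step s100
        return self.mode
```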
When a viewer is using a viewing device such as 3D shutter glasses or 3D polarised glasses to view the 3D representation, the brightness of the image displayed on the display 105 is likely to appear dimmed, because the lenses of the 3D glasses (polarised or shutter type) are likely to reduce the amount of light entering the viewer's eyes.
In the case of 3D polarised glasses, the lenses of the glasses may attenuate some light because of the polarisation of the lenses. In the case of 3D shutter glasses, the alternate blanking of the left and right lens leads to an overall reduction of light passing to the viewer's eyes, because the amount of light reaching each eye, when averaged over a time period greater than the switching rate of the lenses, is halved. Additionally, because shutter glasses typically use liquid crystal technology to achieve the shuttering effect, the lenses are unlikely to be fully transparent and will attenuate some light even when a lens is arranged to allow light to pass into the viewer's eye. This can cause colours and images to appear dull to the viewer when viewing a 3D representation of an object or image.
Therefore, to address this problem, in embodiments, the processor 100 is operable to apply brightness compression to the source image so as to compress the brightness of the source image when switching from the first mode (2D representation) to the second mode (3D representation). In some embodiments, the processor 100 is operable to apply brightness compression to the left-hand image and the right-hand image of the 3D representation directly, rather than to the source image. The processor 100 is operable to apply brightness compression to the brightness levels of each pixel in a similar manner to dynamic range compression techniques carried out for audio signals. Accordingly, compressing the brightness levels of the source image can cause the displayed image to appear brighter to the viewer.
In other words, the processor 100 is operable to compress the dynamic range of luminance values of the pixels of the source image or 3D representation so as to apply brightness compression. Therefore, when the 3D representation is viewed with the viewing device 215, the viewer is less likely to experience any darkening of the image due to wearing the 3D glasses.
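As a sketch of one possible brightness compression (the patent does not prescribe a particular transfer function), a simple gamma-style curve that lifts the darker levels is shown below; the exponent is an assumed value.

```python
# Sketch of one possible brightness (dynamic-range) compression, by analogy with
# audio dynamic range compression.  The gamma-style curve and its exponent are
# assumptions; the patent does not specify a particular transfer function.
import numpy as np

def compress_brightness(image_u8, exponent=0.7):
    """Lift darker pixel levels so the image appears brighter through 3D glasses."""
    normalised = image_u8.astype(np.float32) / 255.0
    compressed = np.power(normalised, exponent)     # exponent < 1 raises low levels
    return np.clip(compressed * 255.0, 0, 255).astype(np.uint8)
```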
As mentioned above, the viewing device 215 is operable to communicate bi-directionally with the 3D TV 10. This functionality will now be described in more detail with reference to Figure 7.
Figure 7 is a schematic diagram of a 3D TV and a 3D viewing device in accordance with embodiments of the present invention. In particular, Figure 7 shows a side view of the 3D TV 10, and the viewing device 215. The viewer is not shown for the sake of clarity in understanding the drawing. However, for the purposes of understanding Figure 7, it is to be assumed that the viewer is wearing the viewing device 215.
As shown in Figure 7, in embodiments the 3D communication link 130 of the 3D TV comprises a TV communication antenna 500 for communicating wirelessly with the viewing device 215. The viewing device 215 comprises a viewing device communication antenna 505 for communicating with the 3D TV 10 via the TV communication antenna 500.
In other words, in embodiments, the 3D TV 10 is operable to communicate bi-directionally with the viewing device 215 via a wireless communication link which comprises the communication unit of the viewing device 215, the 3D communication link interface 130, the antenna 500 and the antenna 505.
In embodiments, the viewing device also comprises a power supply, a viewing device processing unit, a memory, and a control interface such as an on/off switch (not shown).
These features cooperate together using known techniques to provide shutter functionality and control the left lens 220L and the right lens 220R of the viewing device.
In embodiments, the viewing device 215 is operable to transmit, via the communication link to the 3D TV 10, a use confirmation signal which indicates that the viewing device is operational for viewing the display 105. The use confirmation signal may comprise any suitable signal for indicating to the 3D TV 10 that the viewing device 215 is operational for viewing the display 105. The processor 100 is operable to control switching between the first mode and the second mode in dependence upon receipt of the use confirmation signal.
For example, the viewing device 215 may generate and transmit the use confirmation signal to the 3D TV 10 when the viewing device 215 is switched on. The processor 100 could then cause the display 105 to switch from displaying the 2D representation to displaying the 3D representation.
In some embodiments, the processor 100 is operable to cause the 3D communication link interface 130 to transmit a query signal via the communication link to the viewing device 215 to query the viewing device 215 as to whether the viewing device 215 is operational for viewing the display 105. The viewing device is operable to transmit the use confirmation signal to the 3D TV 10 via the communication link in response to receipt of the query signal at the viewing device 215 if the viewing device 215 is operational for viewing the display 105. In other words, the processor 100 can detect whether the viewing device is being used to view the display 105 in dependence upon receipt of the use confirmation signal.
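The query and use-confirmation exchange could look something like the sketch below; the message contents, the transport object and the timeouts are illustrative assumptions rather than a protocol defined by the patent.

```python
# Sketch of the query / use-confirmation exchange between the 3D TV 10 and the
# viewing device 215.  Message values, the comm_link transport object and the
# timeouts are illustrative assumptions.
QUERY = b"QUERY_OPERATIONAL"
USE_CONFIRMATION = b"USE_CONFIRMATION"

def tv_detects_viewing_device(comm_link, timeout_s=0.5):
    """TV side: send a query and treat a confirmation reply as 'glasses in use'."""
    comm_link.send(QUERY)                            # hypothetical transport API
    return comm_link.receive(timeout=timeout_s) == USE_CONFIRMATION

def glasses_handle_query(comm_link, switched_on):
    """Glasses side: reply with the use confirmation when operational."""
    if switched_on and comm_link.receive(timeout=0.1) == QUERY:
        comm_link.send(USE_CONFIRMATION)
```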
A method of controlling switching between the first mode and the second mode will now be described with reference to Figure 8.
Figure 8 is a flowchart of a method of displaying a two dimensional representation and a three dimensional representation of a source image in accordance with embodiments of the present invention.
At a step s200, the processor 100 causes the display 105 to display the 2D representation of the source image in the first mode of operation. Then, at a step s205, the processor 100 detects if the viewing device 215 is being used by the viewer to view the display 105.
If the viewing device 215 is not being used to view the display 105, then processing returns to the step s200. However, if the viewing device 215 is being used to view the display 105, then, at a step s210, the processor 100 causes the display 105 to display the 3D representation in the second mode of operation. Processing then passes to the step s205.
More generally, the method includes switching between the first mode and the second mode in dependence upon the detection at the step s205 of whether the viewing device is being used to view the display 105.
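By way of illustration only, the control flow of Figure 8 can be sketched as the following loop; the display_set_mode and viewing_device_in_use callables are hypothetical stand-ins for the processor 100 driving the display 105 and for whichever detection mechanism (camera analysis or use confirmation signal) is employed.

```python
# Minimal sketch of the mode-switching loop of Figure 8 (callables are placeholders).

from typing import Callable


def run_mode_switching(display_set_mode: Callable[[str], None],
                       viewing_device_in_use: Callable[[], bool],
                       iterations: int = 5) -> None:
    for _ in range(iterations):
        if viewing_device_in_use():    # step s205: is the viewing device being used?
            display_set_mode("3D")     # step s210: second mode of operation
        else:
            display_set_mode("2D")     # step s200: first mode of operation


if __name__ == "__main__":
    states = iter([False, False, True, True, False])
    run_mode_switching(lambda mode: print("mode:", mode), lambda: next(states))
```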
Embodiments of the present invention in which portions of the left and right lens of the viewing device 215 are selectively controlled will now be described with reference to Figures 9, 10, and 11.
Figure 9 shows a schematic diagram of a viewing device in accordance with embodiments of the present invention. In particular, Figure 9 shows the viewing device 215.
The viewing device 215 has the same functionality as in the embodiments described above.
However, in the embodiments described with respect to Figures 9 to 11, the viewing device comprises a pupil detector for detecting, for each eye of the viewer, a position of the display screen with respect to the right pupil of the right eye and the left pupil of the left eye.
Figure 9 schematically illustrates the viewer's right eye 600R and the viewer's left eye 600L. The viewer's right pupil 605R and left pupil 605L are also illustrated schematically in Figure 9. In embodiments, the pupil detector comprises a left light source 610L, a right light source 610R, a left image detector 615L, and a right image detector 615R. The left light source 610L is arranged with respect to the viewing device 215 so that light emitted by the left light source 610L illuminates the viewer's left eye 600L. The right light source 610R is arranged with respect to the viewing device 215 so that light emitted by the right light source 610R illuminates the viewer's right eye 600R.
In the following, detection of the viewer's pupil is described with reference to the viewer's left pupil 605L, the left light source 610L and the left image detector 615L.
However, in embodiments, the equivalent components on the right-hand side (right light source 610R and right image detector 615R) operate in the same manner as the left-hand components (left light source 610L and left image detector 615L). Therefore, the operation of the right-hand components has not been described in detail.
In order to detect the position of the viewer's left pupil 605L with respect to the display 105, the left light source 610L is operable to illuminate the left pupil 605L with light of an appropriate frequency. In embodiments, so as not to dazzle the viewer, the light source 610L is operable to emit light having a wavelength in the infra-red spectrum. In embodiments, the left light source 610L and the right light source 610R each comprise a light emitting diode (LED) operable to emit light in the infra-red spectrum. However, any other suitable light source could be used. Additionally, it will be appreciated that any other suitable frequency of light could be used to illuminate the viewer's eyes 600L and 600R.
The left image detector 615L is arranged with respect to the left light source 610L so that light emitted from the left light source 610L can be reflected from the user's left eye 600L into the left image detector 615L.
In embodiments, the left image detector 615L and the right image detector 615R each comprise a charge coupled device (CCD) for detecting light reflected from the user's eye.
However, it will be appreciated that any suitable imaging device could be used.
As mentioned above, in embodiments, the viewing device 215 comprises a viewing device processing unit. In embodiments, the viewing device processing unit has a reduced instruction set computer (RISC) architecture but it will be appreciated that any other suitable architecture or processing device could be used.
In embodiments, the viewing device processing unit is operable to perform image processing on the images generated by the left image detector 615L and right image detector 615R so as to detect a relative position of the user's pupils with respect to the viewing device 215 using known techniques. Therefore, by analysing the data from the left image detector 615L and right image detector 615R, the viewing device processing unit can detect a portion of the respective lens through which the viewer 200 is looking. However, any suitable apparatus and method for detecting an eye gaze direction of the viewer, such as those commonly found in auto focus systems in single lens reflex (SLR) cameras, may be used. The viewing device 215 is operable to transmit, to the 3D TV 10 via the communication link, pupil position data which relates to the relative position of the viewer's pupils with respect to the lenses of the viewing device 215.
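By way of illustration only, a very simple pupil-position estimate from a single image-detector frame might look like the sketch below. It assumes the pupil appears as the darkest region of the frame under off-axis infra-red illumination and uses a bare intensity threshold and centroid; the embodiments themselves refer only to known eye-gaze detection techniques, so this is not the prescribed method.

```python
# Minimal sketch of estimating a pupil centre from one image-detector frame.

import numpy as np


def estimate_pupil_centre(frame: np.ndarray, percentile: float = 5.0):
    """Return the (row, col) centroid of the darkest pixels, or None if none are found."""
    threshold = np.percentile(frame, percentile)
    mask = frame <= threshold
    if not mask.any():
        return None
    rows, cols = np.nonzero(mask)
    return float(rows.mean()), float(cols.mean())


if __name__ == "__main__":
    # Synthetic 100x120 frame with a dark disc standing in for the pupil.
    y, x = np.mgrid[0:100, 0:120]
    frame = np.full((100, 120), 200.0)
    frame[(y - 40) ** 2 + (x - 70) ** 2 < 15 ** 2] = 20.0
    print(estimate_pupil_centre(frame))  # approximately (40.0, 70.0)
```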
However, image processing techniques can be computationally expensive and hence require a substantial power supply. The viewing device 215 is typically in the form of glasses worn by the viewer, and including a power supply (e.g. a battery) large enough to supply the viewing device processing unit may make the glasses cumbersome and uncomfortable for the viewer 200.
Therefore, in embodiments, the viewing device is operable to transmit the data generated by the left image detector 615L and/or the right image detector 615R to the 3D TV via the communication link so that the processor 100 of the 3D TV 10 can carry out the image processing. This allows the viewing device 215 to be made smaller and lighter because the viewing device 215 does not need to house a large power supply and high performance processor.
The processor 100 of the TV is also operable to detect a position of the viewing device 215 with respect to the 3D TV 10. In embodiments, the processor 100 is operable to carry out image processing techniques on the images captured by the camera 125 so as to detect a relative position of the viewing device 215 with respect to the 3D TV 10. Typically, the processor 100 is operable to detect a position of the viewing device 215 within the images captured by the camera 125 and to correlate the position of the viewing device 215 in the captured images with the physical position of the camera with respect to the display 105. In embodiments, the physical position of the camera 125 may be input by the user via a suitable user interface executed by the 3D TV 10. In other embodiments, the physical position of the camera 125 with respect to the display 105 may be preset within firmware (or software) of the 3D TV 10, especially if the camera 125 is housed within the housing of the 3D TV 10.
Using the pupil position data together with the detected position of the viewing device 215 with respect to the display 105, the processor 100 is operable to detect the position of the display screen with respect to the right pupil 605R of the right eye 600R and the left pupil 605L of the left eye 600L. The viewing device is operable to control the passage of light through the left lens 220L and the right lens 220R so that a first portion of the left lens 220L which lies on a light path from the display 105 to the left pupil 605L directs light relating to the left image 210a to the left eye 600L and a first portion of the right lens 220R which lies on a light path from the display 105 to the right pupil 605R directs light relating to the right image 210b to the right eye 600R. In embodiments, a second portion of the left lens 220L and a second portion of the right lens 220R which do not correspond to the light path from the display 105 to the respective pupil are arranged so that light coming from a light source (e.g. ambient light) other than the display 105 can be directed to both the right eye 600R and the left eye 600L.
Therefore, the viewer 200 is less likely to suffer from eye strain or headaches, because more ambient light reaches their eyes while only those parts of the left lens and right lens which are relevant for viewing the display operate as 3D glasses.
This will now be described with reference to Figure 10.
Figure 10 shows a schematic diagram of a viewing device being used to view a 3D TV in accordance with embodiments of the present invention. In particular, in the embodiment schematically illustrated in Figure 10, the viewer is using the viewing device 215 to view the display 105. In the embodiments of Figure 10, the viewing device 215 is the same as that described above with reference to Figure 9. However, the left light source 610L, the right light source 610R, the left image detector 615L, and the right image detector 615R are not shown in Figure 10 for ease in understanding the drawing.
A first light path 700R from the display 105 to the right pupil 605R and a second light path 700L from the display 105 to the left pupil 605L are illustrated as the diagonally shaded areas between the dashed lines shown in Figure 10. As mentioned above, the viewing device 215 is operable to control the passage of light through the left lens 220L so that a first left portion 705L (indicated by cross-hatched shading in Figure 10) of the left lens 220L which lies on the light path 700L directs the left image to the left eye 600L. The viewing device 215 is also operable to control the passage of light through the right lens 220R so that a first right portion 705R (indicated by cross-hatched shading in Figure 10) of the right lens 220R which lies on the light path 700R directs the right image to the right eye 600R.
A second left portion 710L of the left lens 220L which does not correspond to the light path 700L and a second right portion 710R of the right lens 220R which does not correspond to the light path 700R are arranged so that light coming from a light source other than the display can be directed to both the left eye 600L and the right eye 600R.
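By way of illustration only, the geometry of the first portions 705L and 705R can be sketched as follows: assuming a planar lens and known positions, in a common coordinate frame, of a pupil and of the four display corners, the active portion of that lens is bounded by the intersections of the corner-to-pupil rays with the lens plane. The coordinate frame, the units and the planar-lens approximation are illustrative assumptions only.

```python
# Minimal sketch: find the lens-plane rectangle lying on the light path from the
# display to one pupil (i.e. an approximation of the first portion 705L or 705R).

import numpy as np


def active_lens_region(pupil: np.ndarray, display_corners: np.ndarray, lens_z: float):
    """Return ((x_min, x_max), (y_min, y_max)) of the active portion in the lens plane z = lens_z."""
    hits = []
    for corner in display_corners:
        # Parameter along the ray from the pupil towards this display corner
        # at which the ray crosses the lens plane.
        t = (lens_z - pupil[2]) / (corner[2] - pupil[2])
        hits.append(pupil[:2] + t * (corner[:2] - pupil[:2]))
    hits = np.array(hits)
    return (hits[:, 0].min(), hits[:, 0].max()), (hits[:, 1].min(), hits[:, 1].max())


if __name__ == "__main__":
    pupil = np.array([0.0, 0.0, 0.0])            # pupil at the origin (metres)
    lens_z = 0.02                                 # lens plane 2 cm in front of the pupil
    display = np.array([[-0.5, -0.3, 2.0],        # corners of a screen 2 m away
                        [ 0.5, -0.3, 2.0],
                        [ 0.5,  0.3, 2.0],
                        [-0.5,  0.3, 2.0]])
    print(active_lens_region(pupil, display, lens_z))  # roughly a 1.0 cm x 0.6 cm region
```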
In embodiments, the first left portion 705L and the first right portion 705R are arranged to have functionality similar to that of shutter glasses described above. In other words, the first left portion 705L and the first right portion 705R behave as shutter or polarised glasses in a similar manner to that described above, whilst the second left portion 710L and the second right portion 710R can behave as clear lenses with no shutter or polarisation functionality. In some embodiments, a polarisation direction of the first left portion 705L can be cross polarised with a polarisation direction of the first right portion 705R. However, any other suitable method for directing the light from the left image to the left eye and the light from the right image to the right eye may be used.
This functionality will now be described in more detail with reference to Figure 11.
Figure 11 is a schematic diagram of a viewing device in accordance with embodiments of the present invention. In particular, Figure 11 shows the viewing device 215 as viewed from the position of the viewer along a cross section A-A (shown in Figure 10) of the viewing device 215. In embodiments, both the left lens 220L and the right lens 220R comprise an array of pixels (as shown in Figure 11) which can be selectively controlled by the viewing device processing unit to control the passage of light through each lens.
In embodiments, each pixel comprises a liquid crystal so that each pixel can be controlled in a similar manner to that used to control liquid crystal displays. For example, pixels corresponding to the first left portion 705L could be controlled to have a vertical polarisation direction, and the pixels corresponding to the first right portion 705R could be controlled to have a horizontal polarisation direction so that light from a vertically polarised left image displayed by the display 105 is directed to the left eye and light from a horizontally polarised right image displayed on the display 105 is directed to the right eye. However, any other suitable polarisation direction (e.g. linear or circular polarisation) could be used.
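By way of illustration only, driving such a pixel array could be sketched as below; the lens resolution, the choice of a rectangular active region and the string encoding of each pixel's state are assumptions made purely for illustration.

```python
# Minimal sketch of setting per-pixel states on a lens: pixels inside the active
# region receive a polarisation state ('V' for the left lens, 'H' for the right),
# while all other pixels remain clear.

import numpy as np


def lens_pixel_states(rows: int, cols: int, active_rows: slice, active_cols: slice,
                      polarisation: str) -> np.ndarray:
    states = np.full((rows, cols), "clear", dtype=object)
    states[active_rows, active_cols] = polarisation
    return states


if __name__ == "__main__":
    left = lens_pixel_states(8, 12, slice(2, 6), slice(4, 9), "V")   # first left portion 705L
    right = lens_pixel_states(8, 12, slice(2, 6), slice(4, 9), "H")  # first right portion 705R
    print(left)
```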
Additionally, it will be appreciated that coloured filtering for use with anaglyph images and the like could be applied to the first left portion 705L and the first right portion 705R.
The viewing device is operable to control the second left portion 710L and the second right portion 710R so that these portions are clear and no polarisation is applied to them.
Alternatively, the viewing device processing unit could control the first left portion 705L and the first right portion 705R so that they become opaque in synchrony with the alternate display of the left image and the right image on the display in dependence upon the synchronisation signal. In other words, the first left portion 705L and the first right portion 705R could have similar functionality to shutter glasses, with the second left portion 710L and the second right portion 710R being controlled so as to be clear with no shuttering effect.
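By way of illustration only, the shutter-style alternative can be sketched as follows; the 'left'/'right' frame labels stand in for whatever the synchronisation signal indicates, and the dictionary keys simply mirror the reference numerals used above.

```python
# Minimal sketch of shutter-style control applied to the first portions only:
# during a "left" frame the right lens's active portion is opaque (and vice versa),
# while the second portions of both lenses stay clear throughout.


def shutter_states_for_frame(frame: str) -> dict:
    """Return True (opaque) or False (clear) for each lens portion, given the current frame."""
    return {
        "first_left_portion_705L": frame == "right",   # opaque while the right image is shown
        "first_right_portion_705R": frame == "left",   # opaque while the left image is shown
        "second_left_portion_710L": False,             # always clear
        "second_right_portion_710R": False,            # always clear
    }


if __name__ == "__main__":
    for frame in ["left", "right", "left"]:
        print(frame, shutter_states_for_frame(frame))
```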
In embodiments, the viewing device processing unit and/or the processor 100 are operable to control the position of the first left portion 705L and the first right portion 705R, by generating appropriate control signals sent to the viewing device, so that they lie on the respective light paths from the display 105 to the viewer's pupils. In other words, when the display 105 is viewed through the viewing device 215, the first left portion 705L and the first right portion 705R substantially correspond with the position of the display 105 as seen by the viewer 200. However, light from other light sources, such as ambient light, which does not correspond to the display 105 will not be subject to the 3D viewing technique (e.g. shutter technique, polarisation technique, colour filter technique, etc.), thereby providing a more comfortable viewing experience for the viewer.
It will be appreciated that, while the embodiments above have been described with reference to a 3D TV, they may be applied more generally to any device for displaying images which can be viewed as 3D images. For example, the source image need not be from a video sequence and could be a source image such as that captured by a digital camera. In this case, the 3D representation of the source image could be displayed on a so-called digital photo frame having the functionality of the embodiments described above.
Additionally, the source image could be any suitable source image having a 2D representation and/or a 3D representation. For example a 3D representation of a landscape scene could be captured by a stereo pair of cameras. As another example, the 3D representation could be computer generated from a single source image to generate a stereo pair, or the source image could be computer generated.
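By way of illustration only, computer generation of a stereo pair from a single source image could be sketched as below, by shifting pixels horizontally according to an assumed depth map; the depth map, the disparity scale and the absence of any hole-filling are simplifications for illustration and are not a method taught by the embodiments.

```python
# Minimal sketch of synthesising a stereo pair from a single image plus a depth map.

import numpy as np


def shift_rows(image: np.ndarray, disparity: np.ndarray) -> np.ndarray:
    """Shift each pixel horizontally by its (integer) disparity, leaving holes as 0."""
    out = np.zeros_like(image)
    rows, cols = image.shape
    for r in range(rows):
        for c in range(cols):
            target = c + int(disparity[r, c])
            if 0 <= target < cols:
                out[r, target] = image[r, c]
    return out


def make_stereo_pair(image: np.ndarray, depth: np.ndarray, max_disparity: int = 4):
    disparity = np.round((1.0 - depth) * max_disparity)   # nearer pixels are shifted more
    return shift_rows(image, disparity), shift_rows(image, -disparity)


if __name__ == "__main__":
    img = np.arange(64, dtype=float).reshape(8, 8)
    depth = np.linspace(0.0, 1.0, 64).reshape(8, 8)        # 0 = near, 1 = far
    left, right = make_stereo_pair(img, depth)
    print(left.shape, right.shape)
```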
The various methods set out above may be implemented by adaptation of an existing entertainment device or display apparatus, for example by using a computer program product comprising processor implementable instructions stored on a data carrier such as a floppy disk, optical disk, hard disk, PROM, RAM, flash memory or any combination of these or other storage media, or transmitted via data signals on a network such as an Ethernet, a wireless network, the internet, or any combination of these or other networks, or realised in hardware as an ASIC (application specific integrated circuit) or an FPGA (field programmable gate array) or other configurable circuit suitable for use in adapting the existing equivalent device.
In conclusion, although a variety of embodiments have been described herein, these are provided by way of example only, and many variations and modifications on such embodiments will be apparent to the skilled person and fall within the scope of the present invention, which is defined by the appended claims and their equivalents.

Claims (14)

  1. A display apparatus comprising: a display screen for displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting means for detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and switching means for causing the display screen to switch between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen.
  2. A display apparatus according to claim 1, in which, if the detecting means detects that the viewing device is currently being used to view the display, the switching means is operable to cause the display screen to display the three-dimensional representation of the source image.
  3. A display apparatus according to claim 1 or claim 2, comprising: a camera for capturing one or more images of the viewer; in which the detecting means is operable to carry out image analysis on the images captured by the camera so as to detect whether the viewing device is currently being used to view the display.
  4. A display apparatus according to claim 3, in which the switching means is operable to control the switching between the first mode and the second mode so that, with respect to the detection of whether the viewing device is currently being used to view the display screen, there is a preset degree of hysteresis when switching between the first mode and the second mode.
  5. A display apparatus according to claim 3 or claim 4, in which: the detecting means is operable to detect whether the viewing device is currently being used to view the display in dependence upon a relative position of the viewing device with respect to the viewer.
  6. A display apparatus according to any one of the preceding claims, comprising brightness compressing means for applying brightness compression to the source image to compress the brightness of the source image when switching from the first mode to the second mode.
  7. A display apparatus according to any one of the preceding claims, in which the source image is a video image from a sequence of video images.
  8. A display system for displaying a two-dimensional representation of a source image and a three-dimensional representation of the source image to a viewer, the system comprising: a display apparatus according to any one of the preceding claims; and a viewing device which, when used by a viewer to view the display screen, causes the viewer to perceive the three-dimensional representation of the image as a three-dimensional image.
  9. A system according to claim 8, in which: the display apparatus is operable to communicate bi-directionally with the viewing device via a communication link; the switching means is operable to control the switching between the first mode and the second mode in dependence upon receipt at the display apparatus of a use confirmation signal sent by the viewing device via the communication link which indicates that the viewing device is operational for viewing the display screen.
  10. A system according to claim 9, in which: the display apparatus is operable to send a query signal via the communication link to the viewing device to query the viewing device as to whether the viewing device is operational for viewing the display screen; and the viewing device is operable to transmit the use confirmation signal to the display apparatus via the communication link in response to receipt of the query signal at the viewing device if the viewing device is operational for viewing the display screen.
  11. A system according to any one of claims 8 to 10, in which: the three-dimensional representation comprises a stereo pair of images, the stereo pair comprising a left-hand image for viewing by the viewer's left eye and a right-hand image for viewing by the viewer's right eye; the viewing device is a pair of spectacles having a left lens and a right lens arranged so that the light relating to the left image can be directed to the left eye of the viewer and light relating to the right image can be directed to the right eye of the viewer.
  12. A system according to claim 11, in which: the viewing device comprises pupil detecting means for detecting, for each eye of the viewer, a position of the display screen with respect to the right pupil of the right eye and the left pupil of the left eye; the viewing device is operable to control the passage of light through the left lens and the right lens so that a first portion of the left lens which lies on a light path from the display screen to the left pupil directs the light relating to the left image to the left eye and a first portion of the right lens which lies on a light path from the display screen to the right pupil directs the light relating to the right image to the right eye; and a second portion of the left lens and a second portion of the right lens which do not correspond to the light path from the display screen to the respective pupil are arranged so that light coming from a light source other than the display screen can be directed to both the right and left eyes.
  13. A method for displaying a two-dimensional representation of a source image and a three-dimensional representation of the source image to a viewer, the method comprising: displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and switching between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen.
  14. A computer program for carrying out the method of claim 13.

Amendments to the claims have been filed as follows

CLAIMS

  1. A display system for displaying a two-dimensional representation of a source image and a three-dimensional representation of the source image to a viewer, the system comprising: a display apparatus comprising: a display screen for displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting means for detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and switching means for causing the display screen to switch between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen; and a viewing device which, when used by a viewer to view the display screen, causes the viewer to perceive the three-dimensional representation of the image as a three-dimensional image; in which: the three-dimensional representation comprises a stereo pair of images, the stereo pair comprising a left-hand image for viewing by the viewer's left eye and a right-hand image for viewing by the viewer's right eye; the viewing device is a pair of spectacles having a left lens and a right lens arranged so that the light relating to the left image can be directed to the left eye of the viewer and light relating to the right image can be directed to the right eye of the viewer; the viewing device comprises pupil detecting means for detecting, for each eye of the viewer, a position of the display screen with respect to the right pupil of the right eye and the left pupil of the left eye; the viewing device is operable to control the passage of light through the left lens and the right lens so that a first portion of the left lens which lies on a light path from the display screen to the left pupil directs the light relating to the left image to the left eye and a first portion of the right lens which lies on a light path from the display screen to the right pupil directs the light relating to the right image to the right eye; and a second portion of the left lens and a second portion of the right lens which do not correspond to the light path from the display screen to the respective pupil are arranged so that light coming from a light source other than the display screen can be directed to both the right and left eyes.
  2. A system according to claim 1, in which, if the detecting means detects that the viewing device is currently being used to view the display, the switching means is operable to cause the display screen to display the three-dimensional representation of the source image.
  3. A system according to claim 1 or claim 2, comprising: a camera for capturing one or more images of the viewer; in which the detecting means is operable to carry out image analysis on the images captured by the camera so as to detect whether the viewing device is currently being used to view the display.
  4. A system according to claim 3, in which the switching means is operable to control the switching between the first mode and the second mode so that, with respect to the detection of whether the viewing device is currently being used to view the display screen, there is a preset degree of hysteresis when switching between the first mode and the second mode.
  5. A system according to claim 3 or claim 4, in which: the detecting means is operable to detect whether the viewing device is currently being used to view the display in dependence upon a relative position of the viewing device with respect to the viewer.
  6. A system according to any one of the preceding claims, comprising brightness compressing means for applying brightness compression to the source image to compress the brightness of the source image when switching from the first mode to the second mode.
  7. A system according to any one of the preceding claims, in which the source image is a video image from a sequence of video images.
  8. A system according to claim, in which: the display apparatus is operable to communicate bi-directionally with the viewing device via a communication link; the switching means is operable to control the switching between the first mode and the second mode in dependence upon receipt at the display apparatus of a use confirmation signal sent by the viewing device via the communication link which indicates that the viewing device is operational for viewing the display screen.
  9. A system according to claim 8, in which: the display apparatus is operable to send a query signal via the communication link to the viewing device to query the viewing device as to whether the viewing device is operational for viewing the display screen; and the viewing device is operable to transmit the use confirmation signal to the display apparatus via the communication link in response to receipt of the query signal at the viewing device if the viewing device is operational for viewing the display screen.
  10. A display method for displaying a two-dimensional representation of a source image and a three-dimensional representation of the source image to a viewer using a viewing device which, when used by a viewer to view the display screen, causes the viewer to perceive the three-dimensional representation of the image as a three-dimensional image having a stereo pair of images, the stereo pair comprising a left-hand image for viewing by the viewer's left eye and a right-hand image for viewing by the viewer's right eye, the viewing device comprising a pair of spectacles having a left lens and a right lens arranged so that the light relating to the left image can be directed to the left eye of the viewer and light relating to the right image can be directed to the right eye of the viewer and pupil detecting means for detecting, for each eye of the viewer, a position of the display screen with respect to the right pupil of the right eye and the left pupil of the left eye, the method comprising: displaying, to a viewer, a two-dimensional representation of a source image in a first mode of operation and a three-dimensional representation of the source image in a second mode of operation; detecting if a viewing device, required to be worn by the viewer to view the three-dimensional representation, is currently being used by the viewer to view the display screen; and causing the display screen to switch between the first mode and the second mode in dependence upon the detection of whether the viewing device is currently being used to view the display screen; and controlling the passage of light through the left lens and the right lens so that a first portion of the left lens which lies on a light path from the display screen to the left pupil directs the light relating to the left image to the left eye and a first portion of the right lens which lies on a light path from the display screen to the right pupil directs the light relating to the right image to the right eye, and a second portion of the left lens and a second portion of the right lens which do not correspond to the light path from the display screen to the respective pupil are arranged so that light coming from a light source other than the display screen can be directed to both the right and left eyes.
  11. A computer program for carrying out the method of claim 10.
GB1108670.9A 2010-06-03 2010-06-03 Display apparatus and method Active GB2481094B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB1108670.9A GB2481094B (en) 2010-06-03 2010-06-03 Display apparatus and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB1108670.9A GB2481094B (en) 2010-06-03 2010-06-03 Display apparatus and method

Publications (3)

Publication Number Publication Date
GB201108670D0 GB201108670D0 (en) 2011-07-06
GB2481094A true GB2481094A (en) 2011-12-14
GB2481094B GB2481094B (en) 2012-05-16

Family

ID=44279503

Family Applications (1)

Application Number Title Priority Date Filing Date
GB1108670.9A Active GB2481094B (en) 2010-06-03 2010-06-03 Display apparatus and method

Country Status (1)

Country Link
GB (1) GB2481094B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0193987A (en) * 1987-10-05 1989-04-12 Sanyo Electric Co Ltd Display mode switching control circuit for stereoscopic television system
US5293227A (en) * 1992-07-24 1994-03-08 Tektronix, Inc. Self-synchronizing optical state controller for infrared linked stereoscopic glasses
US20060061652A1 (en) * 2004-09-17 2006-03-23 Seiko Epson Corporation Stereoscopic image display system
US20100085424A1 (en) * 2008-01-29 2010-04-08 Kane Paul J Switchable 2-d/3-d display system
EP2202991A2 (en) * 2008-12-24 2010-06-30 Samsung Electronics Co., Ltd. Stereoscopic image display apparatus and control method thereof

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130194400A1 (en) * 2010-10-08 2013-08-01 Lg Electronics Inc. Three-dimensional glasses, three-dimensional image display apparatus, and method for driving the three-dimensional glasses and the three-dimensional image display apparatus
US9247240B2 (en) * 2010-10-08 2016-01-26 Lg Electronics Inc. Three-dimensional glasses, three-dimensional image display apparatus, and method for driving the three-dimensional glasses and the three-dimensional image display apparatus
CN103227931A (en) * 2012-01-31 2013-07-31 三星电子株式会社 3D glasses, display apparatus and control method thereof
GB2498954A (en) * 2012-01-31 2013-08-07 Samsung Electronics Co Ltd Detecting an object such as 3D glasses in an image
GB2498954B (en) * 2012-01-31 2015-04-15 Samsung Electronics Co Ltd Detecting an object in an image
US9124882B2 (en) 2012-01-31 2015-09-01 Samsung Electronics Co., Ltd. 3D glasses, display apparatus and control method thereof
EP2624574A3 (en) * 2012-01-31 2016-02-17 Samsung Electronics Co., Ltd 3D glasses, display apparatus and control method thereof
US9787977B2 (en) 2012-01-31 2017-10-10 Samsung Electronics Co., Ltd. 3D glasses, display apparatus and control method thereof

Also Published As

Publication number Publication date
GB201108670D0 (en) 2011-07-06
GB2481094B (en) 2012-05-16

Similar Documents

Publication Publication Date Title
US8684531B2 (en) Stereoscopic display device projecting parallax image and adjusting amount of parallax
US8994795B2 (en) Method for adjusting 3D image quality, 3D display apparatus, 3D glasses, and system for providing 3D image
JP5404246B2 (en) 3D image processing apparatus and control method thereof
US8077964B2 (en) Two dimensional/three dimensional digital information acquisition and display device
US9137523B2 (en) Method and apparatus for controlling image display so that viewers selectively view a 2D or a 3D service
JP2012015774A (en) Stereoscopic image processing device and stereoscopic image imaging method
US9955138B2 (en) Information processing apparatus, information processing method and program
US20120133734A1 (en) Information processing apparatus, information processing method and program
KR20110091443A (en) Image display device, image display viewing system and image display method
JP2013532424A (en) Method and apparatus for intelligent use of active space in frame packing format
CN102822848A (en) Device and method for the recognition of glasses for stereoscopic vision, and related method to control the display of a stereoscopic video stream
US9167237B2 (en) Method and apparatus for providing 3-dimensional image
US20130050416A1 (en) Video processing apparatus and video processing method
JP2010081001A (en) 2d compatible 3d display device and 3d viewing device
CN103517056A (en) Detector, detection method and video display apparatus
JPH11164329A (en) Stereoscopic video image display device
CN103167311A (en) Video processing device, video processing method and recording medium
GB2481094A (en) 3D display with automatic switching between 2D and 3D display modes
JP5474530B2 (en) Stereoscopic image display device
GB2480999A (en) 3D display with automatic switching between 2D and 3D display modes
JP2012089906A (en) Display controller
CN104041026B (en) Image take-off equipment, method and program and recording medium thereof
KR101768538B1 (en) Method for adjusting 3-Dimension image quality, 3D display apparatus, 3D glasses and System for providing 3D image
US8830150B2 (en) 3D glasses and a 3D display apparatus
KR101728724B1 (en) Method for displaying image and image display device thereof

Legal Events

Date Code Title Description
732E Amendments to the register in respect of changes of name or changes affecting rights (sect. 32/1977)

Free format text: REGISTERED BETWEEN 20200723 AND 20200729