WO2023224621A1 - Switching display device between 2d and 3d display modes based on eye tracking - Google Patents

Switching display device between 2d and 3d display modes based on eye tracking

Info

Publication number
WO2023224621A1
WO2023224621A1 (PCT/US2022/029969)
Authority
WO
WIPO (PCT)
Prior art keywords
images
display device
display
lock
display mode
Prior art date
Application number
PCT/US2022/029969
Other languages
French (fr)
Inventor
Hsing-Hung Hsieh
Kuan-Ting Wu
Feng Cheng Lin
Original Assignee
Hewlett-Packard Development Company, L.P.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hewlett-Packard Development Company, L.P.
Priority to PCT/US2022/029969
Publication of WO2023224621A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/356Image reproducers having separate monoscopic and stereoscopic modes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/302Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays
    • H04N13/305Image reproducers for viewing without the aid of special glasses, i.e. using autostereoscopic displays using lenticular lenses, e.g. arrangements of cylindrical lenses
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/366Image reproducers using viewer tracking

Definitions

  • Display devices can include standalone monitors that are communicatively connected to computing devices like desktop, laptop, and notebook computers, as well as displays that are integrated within all-in-one (AIO) desktop computers and laptop and notebook computers.
  • Display devices have traditionally been two-dimensional (2D) displays, in that the images they display are 2D renderings of content. More recently, display devices have been developed that are three-dimensional (3D) displays, in that the images they display are 3D renderings of content. For instance, stereoscopy can be employed so that different images are projected to the left and right eyes of a viewer to render 3D images of content.
  • Autostereoscopic display devices employ stereoscopic techniques without the need for special headgear or glasses being worn by the viewer, and are often referred to as “glasses-free 3D” or “glassesless 3D.”
  • FIG. 1 is a diagram of an example system including a display device that is switchable between two-dimensional (2D) and three-dimensional (3D) display modes.
  • FIG. 2 is a diagram of an example non-transitory computer-readable data storage medium storing program code to switch a display device between 2D and 3D display modes based on eye tracking.
  • FIG. 3 is a diagram of an example display device that employs a lenticular lens to operate in a 3D display mode and a liquid crystal layer to switch between a 2D display mode and the 3D display mode.
  • FIGs. 4A and 4B are diagrams of the example display device of FIG. 3 operating in a 2D display mode.
  • FIGs. 5A and 5B are diagrams of the example display device of FIG. 3 operating in a 3D display mode.
  • FIG. 6 is a diagram of an example computing device.
  • FIG. 7 is a flowchart of an example method.
  • display devices such as autostereoscopic display devices can operate in a 3D display mode in which 3D images of content are displayed, specifically by projecting different images to the left and right eyes of a viewer.
  • An example of such a display device is one that employs a lenticular lens to direct different images to the left and right eyes of a viewer.
  • the display device may further be able to operate in a 2D display mode in which 2D images of content are displayed, specifically by projecting the same image to the left and right eyes of a viewer.
  • the computing device communicatively connected to the display device or of which the display device is a part may momentarily lose track of the location of the user’s eyes.
  • the display device may therefore be unable to properly project different images to the user’s left and right eyes. That is, the computing device does not know where the user’s left and right eyes are and thus cannot generate different images that will just be viewed by the left and right eyes.
  • Eye tracking is performed on the images to acquire a lock on a location of eyes within the images that is suitable to view 3D images of content on the display device. If such lock acquisition is unsuccessful, then a command is transmitted to the display device to cause the display device to operate in the 2D display mode. By comparison, if such lock acquisition is successful, then a different command is transmitted to the display device to cause the display device to operate in the 3D display mode.
  • Images captured by the camera device thus can be continually received, and eye tracking continually performed on the images to maintain a location of eyes within the image that is suitable to view 3D images of content on the display device.
  • Whenever there is an interruption in maintenance of the lock (such that the lock has been lost), the display device is operated in the 2D display mode.
  • In response to recovery from any such interruption in lock maintenance (such that the lock has been recovered), the display device is again operated in the 3D display mode.
  • FIG. 1 shows an example system 100.
  • the system includes a display device 102, a camera device 104, and a computing device 105.
  • the display device 102 is switchable between 2D and 3D display modes. In the 3D display mode, the display device 102 projects different images for viewing by the left and right eyes of a user 110 in front of the device 102. In the 2D display mode, the display device 102 projects images that are viewed by both the left and right eyes of the user 110. An example of such a display device 102 is described in detail later in the detailed description.
  • the camera device 104 may be a webcam having a complementary metal-oxide semiconductor (CMOS) image sensor, or another type of image sensor.
  • the camera device 104 may be external to the display device 102, such as by being attached to the top of the device 102 and centered between the left side 103A and the right side 103B of the display device 102.
  • the camera device 104 may instead be integrated within the display device 102, such as within an enclosure of the display device 102, either within a bezel over a display panel of the device 102 or behind the display panel.
  • the computing device 105 may be a computer, such as a laptop, notebook, or desktop computer, or another type of computing device 105, such as a smartphone, a tablet computing device, a set top television box, and so on.
  • the computing device 105 may be integrated with the display device 102 (and/or with the camera device 104), in the case of an all-in-one (AIO) desktop computer, a laptop or notebook computer, a television, and so on.
  • the computing device 105 may instead be external to the display device 102, in which case the display device 102 is an external monitor for the computing device 105, even if the computing device 105 has its own internal display, as in the case of a laptop or notebook computer.
  • the computing device 105 is wirelessly connected or connected in a wired manner to both the display device 102 and the camera device 104.
  • the display device 102 has an area 106 in front of the device 102 in which the display device 102 is capable of projecting, in a 3D display mode, different images for the left and right eyes of the user 110 if the user 110 is located in this area 106.
  • the field of view of the camera device 104, as defined between lines 108A and 108B, may correspond to this area 106. In another implementation, if the field of view of the camera device 104 is wider than the area 106 bounded by the lines 108A and 108B, the area 106 may be configured within the camera device 104 by specification of the lines 108A and 108B. In either case, therefore, the camera device 104 is able to detect whether or not the user 110 is inside the area 106.
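  • As a non-authoritative sketch (the wedge geometry, half-angle value, and function names are assumptions, not taken from this publication), the check of whether a tracked position falls inside the area 106 bounded by lines 108A and 108B can be modeled as an angular test against the camera's field of view:

```python
import math

def inside_viewing_area(x: float, z: float, half_fov_deg: float = 30.0) -> bool:
    # Model area 106 (illustrative assumption) as a wedge of half-angle
    # half_fov_deg opening from the camera device along the z axis.
    if z <= 0:
        # At or behind the display plane: never inside the area.
        return False
    angle_deg = math.degrees(math.atan2(abs(x), z))
    return angle_deg <= half_fov_deg

# A user centered in front of the display is inside the area 106,
# while one far to the side (like locations 118A and 118B) is not.
centered = inside_viewing_area(0.0, 1.0)
off_to_side = inside_viewing_area(2.0, 1.0)
```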
  • the computing device 105 performs eye tracking on images captured by the camera device 104 in front of the display device 102 to obtain a lock on the location of the eyes of the user 110 suitable to view 3D images of content on the display device 102 when operating in the 3D display mode. That a lock has been acquired (or is being maintained) on the location of the eyes of the user 110 means that eye tracking has identified where the eyes are within the captured images. That this location is suitable for the user 110 to view 3D images of content means that the location is within the area 106 in front of the display device 102.
  • the computing device 105 may be able to properly generate 3D images for display on the display device 102. That is, when the lock has been acquired (or is being maintained), the computing device 105 may be able to properly generate 3D images based on the location of the user 110’s eyes within the area 106 so that the images intended for viewing by the left eye are not viewable by the right eye and vice-versa. So long as the lock on the location of the eyes suitable for viewing 3D images of content is maintained, the display device 102 can thus be operated in the 3D display mode.
  • the user 110 may shift in position to the left or right, per arrows 112A and 112B, or towards or away from the display device 102, per arrows 114A and 114B.
  • the user 110 may also rotate his or her head clockwise or counterclockwise, per arrows 116A and 116B.
  • the computing device 105 may be unable to properly generate 3D images for display on the display device 102. This is because the computing device 105 does not know where the user 110’s eyes currently are, and thus may be unable to generate different images for display on the display device 102 that will be viewed by just the left and right eyes.
  • the computing device 105 is unable to generate the images based on the location of the user 110’s eyes, since the lock on this location within the area 106 has been lost.
  • the user 110 may move outside the area 106 in which the display device 102 is able to display 3D images for viewing by the user 110 in the 3D display mode.
  • the user 110 may move to a location 118A towards the left side 103A of the display device 102 that is outside the area 106, or to a location 118B towards the right side 103B that is also outside the area 106.
  • the lock on the location of the user 110’s eyes may still be maintained (i.e., acquired).
  • the lock is on a location of the eyes that is not suitable for viewing 3D images of content on the display device 102.
  • the computing device 105 may therefore still be able to properly generate 3D images for display on the display device 102. This is because the computing device 105 may know where the user 110’s eyes currently are, and therefore may be able to generate different images for viewing by just the left and right eyes. However, the display device 102 is unable to display these 3D images for viewing by the user 110 in the 3D display mode, in that the device 102 is unable to display the images intended for the left eye such that they are viewable just by the left eye and not the right eye, and vice-versa.
  • Ultimately, a lock on the location of the eyes of the user 110 suitable to view 3D images of content on the display device 102 may be lost in three different ways.
  • the user may be inside the area 106, such that the display device 102 is able to display 3D images of content, but the location of the eyes of the user 110 within this area 106 may not be able to be determined due to the user 110 shifting in position or rotating his or her head.
  • the computing device 105 may thus be unable to properly generate the 3D images for display by the display device 102 based on the location of the user 110’s eyes, since the lock on the location has been lost. If the computing device 105 generates the 3D images based on the last known location of the user 110’s eyes, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa, even though the display device 102 is properly displaying the images in the 3D display mode.
  • the user may be outside the area 106, such that the display device 102 is unable to display 3D images of content.
  • a lock on the eyes of the user 110 may not have been lost, though, in that the location of the user 110’s eyes outside the area 106 may still be known.
  • the lock on the eyes that is suitable to view 3D images of content on the display device 102 is considered to have been lost, since the lock that is still being maintained is not on a location within the area 106.
  • the computing device 105 may thus be able to properly generate the 3D images based on the location of the user 110’s eyes, but the display device 102 is unable to properly display the 3D images so that the images for the left eye are viewable by just the left eye and not the right eye, and vice-versa. If the computing device 105 generates the 3D images, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa, because the display device 102 is unable to properly display the images in the 3D display mode.
  • the user may again be outside the area 106, and the lock on the eyes of the user 110 may have also been lost.
  • the display device 102 is thus still unable to properly display 3D images so that the images for the left eye are viewable by just the left eye and not the right eye, and vice-versa.
  • the computing device 105 may also be unable to properly generate the 3D images based on the location of the user 110’s eyes, since the location is unknown. If the computing device 105 generates the 3D images based on the last known location of the user 110’s eyes, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa. This is because the images are not properly generated based on the current location of the eyes, and/or because the display device 102 is unable to properly display the images in the 3D display mode.
  • FIG. 2 shows an example non-transitory computer-readable data storage medium 200 storing program code 202 executable by a processor to perform processing.
  • the medium 200 may be a memory, such as a volatile or a non-volatile semiconductor memory, a solid-state drive (SSD), a hard disk drive (HDD), or another type of computer-readable data storage medium.
  • the processor may be part of the computing device 105. In the case in which the display device 102 is integrated with or is a part of the computing device 105, the processor may thus also be considered a part of the display device 102.
  • the processing includes receiving images captured by the camera device 104 in front of the display device 102 (204).
  • the captured images may be a continuously received series of individual images over time, in JPEG, PNG, or another type of image format, or they may be frames of a continuously received video, in MPEG2, MPEG4, or another type of video format.
  • the processing includes performing eye tracking on the received images to acquire a lock on the user 110’s eyes within the images suitable to view 3D images of content on the display device 102 (206).
  • the eye tracking that is performed may be an algorithm, technique, or machine learning model, such as those described in A.F. Klaib et al., “Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies,” Expert Systems with Applications 166 (2021) 114037.
  • eye tracking can be performed to acquire a lock on eyes within the images suitable to view 3D images of content as follows. First, a human face is identified within the images. Second, two eyes may be identified within the region including the human face in the images. In one implementation, there may be two camera devices 104 that capture a pair of stereo images. In this case, the angle from each eye to each camera device 104 is different. Upon calculating the angle from each eye to each device 104, the location of each eye can thus be determined. A successful lock suitable to view 3D images of content is acquired if the location of each eye is able to be determined and the locations of the eyes are within the area 106.
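  • The two-camera implementation above can be illustrated with a minimal 2D triangulation sketch. The geometry (both camera devices 104 on a shared baseline, with each angle measured from that baseline to the eye) and the function names are assumptions for illustration, not the publication's stated method:

```python
import math

def triangulate_eye(baseline: float, angle_left: float, angle_right: float):
    # Intersect the two viewing rays from cameras sitting at either end of
    # the baseline; both angles are in radians, measured from the baseline.
    t_left, t_right = math.tan(angle_left), math.tan(angle_right)
    depth = baseline * t_left * t_right / (t_left + t_right)
    lateral = depth / t_left  # offset from the left camera along the baseline
    return lateral, depth

# Cameras 2 units apart, eye seen at 45 degrees from each camera:
# the eye sits midway between them, 1 unit in front of the baseline.
eye_x, eye_z = triangulate_eye(2.0, math.radians(45), math.radians(45))
```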
  • the processing includes in response transmitting a command to the display device 102 to operate in the 2D display mode (210).
  • the command may be transmitted in-band with the images transmitted to the display device 102 to be displayed on the device 102, or out-of-band, such as in a separate cable and/or communication channel.
  • the command results in the display device 102 operating in the 2D display mode, in which the images are displayed by the device 102 so that they are viewable (and are intended to be viewed) by both eyes of the user 110.
  • the processing further includes causing graphics hardware of the computing device 105 to render and transmit 2D images of content for display on the display device 102 in the 2D display mode (212).
  • the graphics hardware of the computing device 105 may be a graphics processing unit (GPU) separate from the processor, or central processing unit (CPU), of the computing device 105.
  • a GPU is a specialized, usually highly parallelized, processor designed to render images for display on display devices.
  • the graphics hardware may instead be integrated within the processor itself, which is referred to as integrated graphics.
  • a GPU in general has its own memory apart from that of the CPU, whereas integrated graphics shares the memory of the CPU.
  • the content is rendered as 2D images in that the 2D images are rendered with the intention that they will be viewed by both eyes of the user 110.
  • the processing includes continuing to receive captured images from the camera device 104 and continuing to perform eye tracking on these images (214), until the lock on the location of the eyes of the user 110 within the images suitable for viewing 3D images of content on the display device 102 has been acquired. Until such a lock is acquired (216), the processing includes continuing to cause the graphics hardware to render and transmit 2D images of the content for display by the display device 102 in the 2D display mode (212). Once the lock has been acquired (216), the processing includes in response transmitting a command to the display device 102 to operate in (i.e., switch to operation in) the 3D display mode (218). The processing also proceeds to transmitting this command in response to the lock having been initially acquired (208).
  • the command to operate in the 3D display mode may be transmitted in-band with the images transmitted for display on the display device 102, or out-of-band.
  • the command thus results in the display device 102 operating in the 3D display mode. That is, the images are displayed by the display device 102 so that some images are viewable (and are intended to be viewed) by the left eye of the user 110, and other images are simultaneously viewable (and are intended to be viewed) by the right eye of the user 110.
  • These 3D images are of the same content, but are slightly different versions of that content, so that the content is in effect viewed in 3D.
  • the processing further includes causing the graphics hardware of the computing device 105 to render and transmit these 3D images of content for display on the display device 102 in the 3D display mode (220).
  • the content is rendered as 3D images in that different images are rendered with the intention that they will be viewed by the left eye or the right eye of the user 110.
  • the content that is rendered as 3D images may be the same content that is rendered as 2D images in the 2D display mode.
  • the content may have two video streams, a 2D video stream and a 3D video stream, which correspond to one another.
  • the 2D video stream is rendered and transmitted in the 2D display mode
  • the 3D video stream is rendered and transmitted in the 3D display mode.
  • the 3D images in particular may not be generated based on the location of the user 110’s eyes within the area 106, since the 3D video stream is predefined and may not be modified before passing its frames as the 3D images to the display device 102.
  • the content may be an abstract model of what is to be displayed.
  • the model may thus be rendered as 2D images in the 2D display mode and as 3D images in the 3D display mode.
  • the 3D images in particular may be generated based on the location of the user 110’s eyes within the area 106, since the 3D images are generated in real time.
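  • one hedged way to picture such real-time generation (the interpupillary distance value, the flat coordinate model, and the function names are assumptions, not from this publication) is to place the left- and right-eye render viewpoints on either side of the tracked midpoint of the user 110’s eyes:

```python
IPD_METERS = 0.063  # assumed average interpupillary distance

def stereo_viewpoints(mid_x: float, mid_z: float):
    # Place the left- and right-eye virtual viewpoints half an IPD to
    # either side of the tracked midpoint of the user's eyes.
    left = (mid_x - IPD_METERS / 2.0, mid_z)
    right = (mid_x + IPD_METERS / 2.0, mid_z)
    return left, right

# With the tracked midpoint centered 0.6 units from the display, the two
# render viewpoints straddle it symmetrically.
left_eye, right_eye = stereo_viewpoints(0.0, 0.6)
```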
  • the processing includes continuing to receive captured images from the camera device 104 and continuing to perform eye tracking on these images (222), in order to maintain the lock on the location of the eyes of the user 110 within the images suitable for viewing 3D images of content on the display device 102. While the lock is maintained (i.e., the lock has not been lost, and no lock maintenance failure has occurred) (224), the processing includes continuing to cause the graphics hardware to render and transmit 3D images of the content for display by the display device 102 in the 3D display mode (220).
  • FIG. 3 shows an example display device 102 that can switch between a 2D display mode in which images are intended to be viewed by both of the user 110’s eyes and a 3D display mode in which different images are intended to be viewed by each eye.
  • the display device 102 includes a display panel 302.
  • the display panel may be a liquid crystal display (LCD) panel or an organic light-emitting diode (OLED) panel, for instance, with individually addressable and controllable color pixels.
  • each pixel may be made up of a red sub-pixel, a green sub-pixel, and a blue sub-pixel to realize a full-color display.
  • the display device 102 includes a lenticular lens layer 304 over the display panel 302.
  • a lenticular lens is an array of lenses, designed so that when viewed from slightly different angles, different parts of the display panel 302 are viewed. Therefore, the lenticular lens layer 304 permits different pixels of the display panel 302 to be viewed by the left and right eyes of the user 110, to provide an illusion of depth of the displayed content, and thus 3D.
  • the display device 102 includes a transparent liquid crystal layer 306 over and adjacent to the lenticular lens layer 304 and having a surface topology that is complementary to the surface topology of the lenticular lens layer 304.
  • a power source 316 such as a direct current (DC) power source, can assert a potential between transparent electrode layers 308 and 310 of the display device 102 respectively below and above the liquid crystal layer 306 to operate the liquid crystal layer 306 at different voltages.
  • the display device 102 can also include a bottom glass substrate layer 312 directly above the display panel 302 to protect the panel 302, and a top glass substrate layer 314 above the liquid crystal layer 306 to protect the layers 304, 306, 308, and 310.
  • FIGs. 4A and 4B show example operation of the display device 102 of FIG. 3 in the 2D display mode.
  • the liquid crystal layer 306 is operated at a first voltage (V1), which may be zero volts, causing the liquid crystal layer 306 to have a refractive index matching that of the lenticular lens layer 304.
  • the lenticular lens layer 304 has no optical effect. That is, the liquid crystal layer 306 effectively cancels the lenticular lens layer 304, and light 402 is projected outwards from the display panel 302 as if the lenticular lens layer 304 were not present.
  • the liquid crystal layer 306 is operated at the first voltage in response to receipt of a command for the display device 102 to operate in the 2D display mode. Per FIG. 4B, then, individual pixels 404 of the display panel 302 are viewable by both the left eye and the right eye of the user 110.
  • FIGs. 5A and 5B show example operation of the display device 102 of FIG. 3 in the 3D display mode.
  • the liquid crystal layer 306 is operated at a second voltage (V2), which may be five volts, causing the liquid crystal layer 306 to have a refractive index different than that of the lenticular lens layer 304.
  • the lenticular lens layer 304 has an optical effect. That is, light 502 projected outwards from the display panel 302 is refracted by the lenticular lens layer 304.
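  • Snell's law gives a compact way to see why the voltage-controlled index change toggles the lens's optical effect: with matched indices (as in FIG. 4A) a ray crosses the lens/liquid-crystal interface unbent, while mismatched indices (as in FIG. 5A) bend it. The index values below are illustrative assumptions, not figures from this publication:

```python
import math

def refraction_angle_deg(n1: float, n2: float, incident_deg: float) -> float:
    # Snell's law, n1 * sin(theta1) = n2 * sin(theta2), solved for theta2.
    s = n1 * math.sin(math.radians(incident_deg)) / n2
    return math.degrees(math.asin(s))

# Matched indices (2D display mode): the ray passes through undeviated,
# so the lenticular lens layer has no optical effect.
matched = refraction_angle_deg(1.5, 1.5, 30.0)

# Mismatched indices (3D display mode): the ray is bent at the interface,
# so the lens steers light from different pixels toward different eyes.
mismatched = refraction_angle_deg(1.5, 1.7, 30.0)
```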
  • the liquid crystal layer 306 is operated at the second voltage in response to receipt of a command for the display device 102 to operate in the 3D display mode.
  • individual pixels 404L of the display panel 302 are viewable by just the left eye of the user 110
  • individual pixels 404R are viewable by just the right eye of the user 110.
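  • a simple even/odd column interleave (an illustrative assumption; the actual pixel-to-eye mapping depends on the lens pitch and viewing geometry) sketches how pixels 404L and 404R carry the two per-eye images within one displayed frame:

```python
def interleave_row(left_row, right_row):
    # Build one display row: even columns come from the left-eye image
    # (pixels 404L), odd columns from the right-eye image (pixels 404R).
    return [
        left_row[i] if i % 2 == 0 else right_row[i]
        for i in range(len(left_row))
    ]

row = interleave_row(["L0", "L1", "L2", "L3"], ["R0", "R1", "R2", "R3"])
```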
  • FIG. 6 shows the example computing device 105.
  • the computing device 105 includes an interface 602 that communicatively connects to the display device 102 and to the camera device 104.
  • the interface 602 that communicatively connects to the display device 102 may be the same as or different from the interface that communicatively connects to the camera device 104.
  • the interface 602 may communicatively connect to the camera device 104 through the display device 102 in a daisy-chain manner.
  • the interface 602 may be a wired interface and/or a wireless interface.
  • the computing device 105 includes graphics hardware 604, a processor 606, and memory 608.
  • the graphics hardware 604 may be a GPU separate from the processor 606, or internal graphics of the processor 606.
  • the memory 608 is more generally a non-transitory computer-readable data storage medium.
  • the processor 606 and the memory 608 may be integrated within an application-specific integrated circuit (ASIC) in the case in which the processor 606 is a special-purpose processor.
  • the processor 606 may instead be a general-purpose processor, such as a CPU, in which case the memory 608 is separate from the processor 606.
  • the memory 608 stores program code 610 executable by the processor 606 to perform processing.
  • the processing includes continually receiving, from the camera device 104, captured images in front of the display device 102 (612).
  • the processing includes continually performing eye tracking on the continually received images to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device 102 (614).
  • the processing includes transmitting a command to the display device 102 causing it to switch to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost (616).
  • the processing includes transmitting a command to the display device 102 causing it to switch to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered (618).
  • the processing includes, while lock maintenance is interrupted, causing the graphics hardware 604 to render and transmit the 2D images of the content to the display device 102 to display in the 2D display mode (620).
  • the processing includes, while lock maintenance is not interrupted, causing the graphics hardware 604 to render and transmit the 3D images to the display device 102 to display in the 3D display mode (622).
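  • the processing of parts 612 through 622 can be sketched as a per-frame loop in which a mode-switch command is transmitted only when the lock state changes. This is a hedged outline (the function name and the boolean per-frame lock results are stand-ins for real eye-tracking hardware), not the claimed implementation:

```python
def run_mode_switching(lock_results):
    # For each frame's eye-tracking result, send a switch command only on a
    # lock-state transition (parts 616 and 618), and record the display mode
    # used to render and transmit that frame (parts 620 and 622).
    commands, frame_modes = [], []
    previous = None
    for lock_held in lock_results:
        mode = "3D" if lock_held else "2D"
        if lock_held != previous:
            commands.append(mode)  # transmit a switch command to the display
        previous = lock_held
        frame_modes.append(mode)   # render/transmit images in this mode
    return commands, frame_modes

# Lock held, then lost for two frames, then recovered:
commands, frame_modes = run_mode_switching([True, True, False, False, True])
```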
  • FIG. 7 shows an example method 700.
  • the method 700 includes the camera device 104 continually capturing images in front of a display device 102 switchable between a 2D display mode and a 3D display mode (702).
  • the method 700 includes, as the images are continually captured, the computing device 105 continually performing eye tracking on the images (704).
  • the method 700 includes the computing device 105 switching the display device 102 between the 2D and 3D display modes based on the eye tracking (706).
  • the eye tracking may be performed to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device 102.
  • Switching the display device 102 between the 2D and 3D display modes can thus include switching the display device 102 to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost (708). Switching the display device 102 between the 2D and 3D display modes can thus also include switching the display device to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered (710).
  • the method 700 can include, while lock maintenance is interrupted, the computing device 105 rendering 2D images of content and the display device 102 displaying the 2D images in the 2D display mode (712).
  • the method 700 can include, while lock maintenance is not interrupted, the computing device 105 rendering the 3D images of content and the display device 102 displaying the 3D images in the 3D display mode (714).

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)

Abstract

Images in front of a display device are continually captured. The display device is switchable between a two-dimensional (2D) display mode and a three-dimensional (3D) display mode. As the images are continually captured, eye tracking on the images is continually performed. The display device is switched between the 2D and 3D display modes based on the eye tracking.

Description

SWITCHING DISPLAY DEVICE BETWEEN 2D AND 3D DISPLAY MODES BASED ON EYE TRACKING
BACKGROUND
[0001] Display devices can include standalone monitors that are communicatively connected to computing devices like desktop, laptop, and notebook computers, as well as displays that are integrated within all-in-one (AIO) desktop computers and laptop and notebook computers. Display devices have traditionally been two-dimensional (2D) displays, in that the images they display are 2D renderings of content. More recently, display devices have been developed that are three-dimensional (3D) displays, in that the images they display are 3D renderings of content. For instance, stereoscopy can be employed so that different images are projected to the left and right eyes of a viewer to render 3D images of content. Autostereoscopic display devices employ stereoscopic techniques without the need for special headgear or glasses being worn by the viewer, and are often referred to as “glasses-free 3D” or “glassesless 3D.”
BRIEF DESCRIPTION OF THE DRAWINGS
[0002] FIG. 1 is a diagram of an example system including a display device that is switchable between two-dimensional (2D) and three-dimensional (3D) display modes. [0003] FIG. 2 is a diagram of an example non-transitory computer-readable data storage medium storing program code to switch a display device between 2D and 3D display modes based on eye tracking.
[0004] FIG. 3 is a diagram of an example display device that employs a lenticular lens to operate in a 3D display mode and a liquid crystal layer to switch between a 2D display mode and the 3D display mode.
[0005] FIGs. 4A and 4B are diagrams of the example display device of FIG. 3 operating in a 2D display mode.
[0006] FIGs. 5A and 5B are diagrams of the example display device of FIG. 3 operating in a 3D display mode.
[0007] FIG. 6 is a diagram of an example computing device.
[0008] FIG. 7 is a flowchart of an example method.
DETAILED DESCRIPTION OF THE DRAWINGS
[0009] As noted in the background section, display devices such as autostereoscopic display devices can operate in a 3D display mode in which 3D images of content are displayed, specifically by projecting different images to the left and right eyes of a viewer. An example of such a display device is one that employs a lenticular lens to direct different images to the left and right eyes of a viewer. The display device may further be able to operate in a 2D display mode in which 2D images of content are displayed, specifically by projecting the same image to the left and right eyes of a viewer.
[0010] When a user is operating an autostereoscopic display device in the 3D display mode, there can be times at which the different images intended for the left and right eyes of the user may not in actuality correspondingly be viewed by just the user’s left and right eyes. That is, the left eye of the user may view a portion of an image intended for the user’s right eye, and vice-versa. The result is the user may at least momentarily perceive a double image or shadow image. Besides this effect causing a degraded user experience, the user may also resultantly suffer from headaches if the issue occurs too frequently or for too long in duration.
[0011] For example, when the user turns his head or shifts in position relative to the display, the computing device communicatively connected to the display device or of which the display device is a part may momentarily lose track of the location of the user’s eyes. The display device may therefore be unable to properly project different images to the user’s left and right eyes. That is, the computing device does not know where the user’s left and right eyes are and thus cannot generate different images that will just be viewed by the left and right eyes.
[0012] As another example, if the user shifts in position in front of the display device to the display device’s far left or right sides, it may be impossible for the display device to project different images for viewing by the user’s left and right eyes. As a result, even if the computing device is able to track the location of the user’s eyes, it cannot generate different images that will be viewed by the left and right eyes when displayed by the display device. In both these examples, then, the result is a degraded user experience in viewing content in a 3D display mode of the display device.
[0013] Techniques described herein ameliorate these and other issues associated with a display device that is switchable between 2D and 3D display modes. A camera device captures images in front of the display device. Eye tracking is performed on the images to acquire a lock on a location of eyes within the images that is suitable to view 3D images of content on the display device. If such lock acquisition is unsuccessful, then a command is transmitted to the display device to cause the display device to operate in the 2D display mode. By comparison, if such lock acquisition is successful, then a different command is transmitted to the display device to cause the display device to operate in the 3D display mode.
[0014] Images captured by the camera device thus can be continually received, and eye tracking continually performed on the images to maintain a lock on a location of eyes within the images that is suitable to view 3D images of content on the display device. Whenever there is an interruption in maintenance of the lock (such that the lock has been lost), the display device is operated in the 2D display mode. In response to recovery from any such interruption in lock maintenance (such that the lock has been recovered), the display device is again operated in the 3D display mode.
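The per-frame decision described in paragraph [0014] can be sketched as follows. This is an illustrative sketch only: the `DisplayMode` names and the `camera`, `tracker`, and `display` interfaces are hypothetical stand-ins for the camera device 104, the eye-tracking logic, and the display device 102, and do not appear in this disclosure.

```python
from enum import Enum

class DisplayMode(Enum):
    MODE_2D = "2D"
    MODE_3D = "3D"

def next_mode(eye_location, area_contains):
    """Decide the display mode for one captured frame.

    eye_location is the tracked eye location, or None if eye tracking
    failed for this frame. area_contains is a predicate telling whether
    a location lies inside the area 106. MODE_3D is returned only while
    a lock on a location suitable for viewing 3D content is maintained.
    """
    if eye_location is None:
        return DisplayMode.MODE_2D   # lock lost: eyes not found in the images
    if not area_contains(eye_location):
        return DisplayMode.MODE_2D   # eyes found, but outside the area 106
    return DisplayMode.MODE_3D       # suitable lock: operate in 3D

def run_loop(camera, tracker, display):
    """Continually capture images, track eyes, and switch display modes."""
    current = None
    while True:
        frame = camera.capture()
        mode = next_mode(tracker.locate_eyes(frame), display.area_contains)
        if mode is not current:      # transmit a command only on a mode change
            display.set_mode(mode)
            current = mode
```

Transmitting the mode-change command only on transitions mirrors the described behavior, where a command is sent in response to an interruption in lock maintenance or recovery from one, rather than on every frame.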
[0015] FIG. 1 shows an example system 100. The system includes a display device 102, a camera device 104, and a computing device 105. The display device 102 is switchable between 2D and 3D display modes. In the 3D display mode, the display device 102 projects different images for viewing by the left and right eyes of a user 110 in front of the device 102. In the 2D display mode, the display device 102 projects images that are viewed by both the left and right eyes of the user 110. An example of such a display device 102 is described in detail later in the detailed description.
[0016] The camera device 104 may be a webcam having a complementary metal-oxide semiconductor (CMOS) image sensor, or another type of image sensor. The camera device 104 may be external to the display device 102, such as by being attached to the top of the device 102 and centered between the left side 103A and the right side 103B of the display device 102. The camera device 104 may instead be integrated within the display device 102, such as within an enclosure of the display device 102, either within a bezel over a display panel of the device 102 or behind the display panel.
[0017] The computing device 105 may be a computer, such as a laptop, notebook, or desktop computer, or another type of computing device, such as a smartphone, a tablet computing device, a set top television box, and so on. The computing device 105 may be integrated with the display device 102 (and/or with the camera device 104), in the case of an all-in-one (AIO) desktop computer, a laptop or notebook computer, a television, and so on. The computing device 105 may instead be external to the display device 102, in which case the display device 102 is an external monitor for the computing device 105, even if the computing device 105 has its own internal display as in the case of a laptop or notebook computer. The computing device 105 is wirelessly connected or connected in a wired manner to both the display device 102 and the camera device 104.
[0018] The display device 102 has an area 106 in front of the device 102 in which the display device 102 is capable of projecting, in a 3D display mode, different images for the left and right eyes of the user 110 if the user 110 is located in this area 106. The field of view of the camera device 104, as defined between lines 108A and 108B, may correspond to this area 106. In another implementation, if the field of view of the camera device 104 is wider than the area 106 bounded by the lines 108A and 108B, the area 106 may be configured within the camera device 104 by specification of the lines 108A and 108B. In either case, therefore, the camera device 104 is able to detect when the user 110 is inside the area 106 or not.
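Whether a tracked location falls inside the area 106 bounded by the lines 108A and 108B can be modeled as a simple wedge test, with the camera device 104 at the apex. The half-angle and maximum-depth values below are illustrative assumptions; the disclosure does not specify dimensions for the area 106.

```python
import math

def inside_area(x, z, half_angle_deg=30.0, max_depth=1.5):
    """Check whether a point is inside the 3D-capable area 106.

    The camera device sits at the origin, centered between the display's
    left side 103A and right side 103B, looking down the +z axis. The
    lines 108A and 108B are modeled as the edges of a wedge with the
    given half-angle; x and z are in meters.
    """
    if z <= 0 or z > max_depth:
        return False  # behind the display plane, or too far away
    # Horizontal angle of the point off the camera's forward axis.
    return abs(math.degrees(math.atan2(x, z))) <= half_angle_deg
```

A location for which this test fails corresponds to the positions 118A and 118B in FIG. 1: the eyes may still be tracked there, but the lock is not on a location suitable for viewing 3D content.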
[0019] The computing device 105 performs eye tracking on images captured by the camera device 104 in front of the display device 102 to obtain a lock on the location of the eyes of the user 110 suitable to view 3D images of content on the display device 102 when operating in the 3D display mode. That a lock has been acquired (or is being maintained) on the location of the eyes of the user 110 means that eye tracking has identified where the eyes are within the captured images. That this location is suitable for the user 110 to view 3D images of content means that the location is within the area 106 in front of the display device 102.
[0020] When the lock has been acquired (or is being maintained), the computing device 105 may be able to properly generate 3D images for display on the display device 102. That is, when the lock has been acquired (or is being maintained), the computing device 105 may be able to properly generate 3D images based on the location of the user 110’s eyes within the area 106 so that the images intended for viewing by the left eye are not viewable by the right eye and vice-versa. So long as the lock on the location of the eyes suitable for viewing 3D images of content is maintained, the display device 102 can thus be operated in the 3D display mode.
[0021] However, while still remaining in the area 106, the user 110 may shift in position to the left or right, per arrows 112A and 112B, or towards or away from the display device 102, per arrows 114A and 114B. The user 110 may also rotate his or her head clockwise or counterclockwise, per arrows 116A and 116B. While the user 110 is so shifting in position or rotating his or her head, the lock on the location of the user 110’s eyes may momentarily or temporarily be lost.
[0022] During any such interruption, the computing device 105 may be unable to properly generate 3D images for display on the display device 102. This is because the computing device 105 does not know where the user 110’s eyes currently are, and thus may be unable to generate different images for display on the display device 102 that will be viewed by just the left and right eyes. That is, while the display device 102 is able to display 3D images for viewing by the user 110 in the 3D display mode, the computing device 105 is unable to generate the images based on the location of the user 110’s eyes, since the lock on this location within the area 106 has been lost.
[0023] Furthermore, the user 110 may move outside the area 106 in which the display device 102 is able to display 3D images for viewing by the user 110 in the 3D display mode. The user 110 may move to a location 118A towards the left side 103A of the display device 102 that is outside the area 106, or to a location 118B towards the right side 103B that is also outside the area 106. When the user 110 has so moved outside the area 106, the lock on the location of the user 110’s eyes may still be maintained (i.e., acquired). However, the lock is on a location of the eyes that is not suitable for viewing 3D images of content on the display device 102.
[0024] In this case, the computing device 105 may therefore still be able to properly generate 3D images for display on the display device 102. This is because the computing device 105 may know where the user 110’s eyes currently are, and therefore may be able to generate different images for viewing by just the left and right eyes. However, the display device 102 is unable to display these 3D images for viewing by the user 110 in the 3D display mode, in that the device 102 is unable to display the images intended for the left eye such that they are viewable just by the left eye and not the right eye, and vice-versa.
[0025] Ultimately, a lock on the location of the eyes of the user 110 suitable to view 3D images of content on the display device 102 may be lost in three different ways. First, the user may be inside the area 106, such that the display device 102 is able to display 3D images of content, but the location of the eyes of the user 110 within this area 106 may not be able to be determined due to the user 110 shifting in position or rotating his or her head. The computing device 105 may thus be unable to properly generate the 3D images for display by the display device 102 based on the location of the user 110’s eyes, since the lock on the location has been lost. If the computing device 105 generates the 3D images based on the last known location of the user 110’s eyes, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa, even though the display device 102 is properly displaying the images in the 3D display mode.
[0026] Second, the user may be outside the area 106, such that the display device 102 is unable to display 3D images of content. A lock on the eyes of the user 110 may not have been lost, though, in that the location of the user 110’s eyes outside the area 106 may still be known. However, the lock on the eyes that is suitable to view 3D images of content on the display device 102 is considered to have been lost, since the lock that is still being maintained is not on a location within the area 106. The computing device 105 may thus be able to properly generate the 3D images based on the location of the user 110’s eyes, but the display device 102 is unable to properly display the 3D images so that the images for the left eye are viewable by just the left eye and not the right eye, and vice-versa. If the computing device 105 generates the 3D images, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa, because the display device 102 is unable to properly display the images in the 3D display mode.
[0027] Third, the user may again be outside the area 106, and the lock on the eyes of the user 110 may have also been lost. As in the previous case, the display device 102 is thus still unable to properly display 3D images so that the images for the left eye are viewable by just the left eye and not the right eye, and vice-versa. However, in this case the computing device 105 may also be unable to properly generate the 3D images based on the location of the user 110’s eyes, since the location is unknown. If the computing device 105 generates the 3D images based on the last known location of the user 110’s eyes, the user 110’s right eye may also view a portion of the images intended for the left eye, and vice-versa. This is because the images are not properly generated based on the current location of the eyes, and/or because the display device 102 is unable to properly display the images in the 3D display mode.
[0028] FIG. 2 shows an example non-transitory computer-readable data storage medium 200 storing program code 202 executable by a processor to perform processing. The medium 200 may be a memory, such as a volatile or a non-volatile semiconductor memory, a solid-state drive (SSD), a hard disk drive (HDD), or another type of computer-readable data storage medium. The processor may be part of the computing device 105. In the case in which the display device 102 is integrated with or is a part of the computing device 105, the processor may thus also be considered a part of the display device 102. However, the example is described in relation to the computing device 105 performing the processing.
[0029] The processing includes receiving images captured by the camera device 104 in front of the display device 102 (204). The captured images may be a continuously received series of individual images over time, in JPEG, PNG, or another type of image format, or they may be frames of a continuously received video, in MPEG2, MPEG4, or another type of video format. The processing includes performing eye tracking on the received images to acquire a lock on the user 110’s eyes within the images suitable to view 3D images of content on the display device 102 (206). The eye tracking that is performed may be an algorithm, technique, or machine learning model, such as those described in A.F. Klaib et al., “Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies,” Expert Systems with Applications 166 (2021) 114037.
[0030] More generally, eye tracking can be performed to acquire a lock on eyes within the images suitable to view 3D images of content as follows. First, a human face is identified within the images. Second, two eyes may be identified within the region including the human face in the images. In one implementation, there may be two camera devices 104 that capture a pair of stereo images. In this case, the angle from each eye to each camera device 104 is different. Upon calculating the angle from each eye to each device 104, the location of each eye can thus be determined. A successful lock is acquired suitable to view 3D images of content if the location of each eye is able to be determined, and the locations of the eyes are within the area 106.
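The two-camera angle calculation of paragraph [0030] amounts to intersecting one ray per camera device 104. A minimal sketch, assuming two forward-facing cameras separated by a known baseline, each reporting the horizontal angle from its forward axis to an eye (the baseline value and angle convention are illustrative, not taken from this disclosure):

```python
import math

def triangulate_eye(theta_left_deg, theta_right_deg, baseline_m=0.1):
    """Estimate an eye's (x, z) position from two camera devices.

    The cameras sit at (-baseline/2, 0) and (+baseline/2, 0), both
    facing the +z direction. Each angle is measured from that camera's
    forward axis to the eye, positive toward +x. Returns None if the
    rays do not intersect (no usable location for this eye).
    """
    tl = math.tan(math.radians(theta_left_deg))
    tr = math.tan(math.radians(theta_right_deg))
    if tl == tr:
        return None  # parallel rays: the eye's location cannot be determined
    half = baseline_m / 2.0
    # Ray from the left camera:  x = -half + z * tl
    # Ray from the right camera: x = +half + z * tr
    z = baseline_m / (tl - tr)
    x = -half + z * tl
    return (x, z)
```

With both eye locations determined this way, the lock is deemed suitable for 3D viewing only if each location also lies within the area 106.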
[0031] If such a lock on the location of the eyes of the user 110 (i.e., a lock on the location of the user 110’s eyes within the area 106) has not been acquired (208), then the processing includes in response transmitting a command to the display device 102 to operate in the 2D display mode (210). The command may be transmitted in-band with the images transmitted to the display device 102 to be displayed on the device 102, or out-of-band, such as in a separate cable and/or communication channel. The command results in the display device 102 operating in the 2D display mode, in which the images are displayed by the device 102 so that they are viewable (and are intended to be viewed) by both eyes of the user 110.
[0032] The processing further includes causing graphics hardware of the computing device 105 to render and transmit 2D images of content for display on the display device 102 in the 2D display mode (212). The graphics hardware of the computing device 105 may be a graphics processing unit (GPU) separate from the processor, or central processing unit (CPU), of the computing device 105. A GPU is a specialized, usually highly parallelized, processor designed to render images for display on display devices. The graphics hardware may instead be integrated within the processor itself, which is referred to as integrated graphics. A GPU in general has its own memory apart from that of the CPU, whereas integrated graphics shares the memory of the CPU. The content is rendered as 2D images in that the 2D images are rendered with the intention that they will be viewed by both eyes of the user 110.
[0033] The processing includes continuing to receive captured images from the camera device 104 and continuing to perform eye tracking on these images (214), until the lock on the location of the eyes of the user 110 within the images suitable for viewing 3D images of content on the display device 102 has been acquired. Until such a lock is acquired (216), the processing includes continuing to cause the graphics hardware to render and transmit 2D images of the content for display by the display device 102 in the 2D display mode (212). Once the lock has been acquired (216), the processing includes in response transmitting a command to the display device 102 to operate in (i.e., switch to operation in) the 3D display mode (218). The processing also proceeds to transmitting this command in response to the lock having been initially acquired (208).
[0034] As with the command transmitted to the display device 102 to operate in the 2D display mode, the command to operate in the 3D display mode may be transmitted in-band with the images transmitted for display on the display device 102, or out-of-band. The command thus results in the display device 102 operating in the 3D display mode. That is, the images are displayed by the display device 102 so that some images are viewable (and are intended to be viewed) by the left eye of the user 110, and other images are simultaneously viewable (and are intended to be viewed) by the right eye of the user 110. These 3D images are of the same content, but are slightly different versions of that content, so that the content is in effect viewed in 3D.
[0035] The processing further includes causing the graphics hardware of the computing device 105 to render and transmit these 3D images of content for display on the display device 102 in the 3D display mode (220). The content is rendered as 3D images in that different images are rendered with the intention that they will be viewed by the left eye or the right eye of the user 110. The content that is rendered as 3D images may be the same content that is rendered as 2D images in the 2D display mode.
[0036] For example, in the case of a movie, the content may have two video streams, a 2D video stream and a 3D video stream, which correspond to one another. The 2D video stream is rendered and transmitted in the 2D display mode, and the 3D video stream is rendered and transmitted in the 3D display mode. In this case, the 3D images in particular may not be generated based on the location of the user 110’s eyes within the area 106, since the 3D video stream is predefined and may not be modified before passing its frames as the 3D images to the display device 102.
[0037] As another example, in the case of a computer game, the content may be an abstract model of what is to be displayed. The model may thus be rendered as 2D images in the 2D display mode and as 3D images in the 3D display mode. In this case, the 3D images in particular may be generated based on the location of the user 110’s eyes within the area 106, since the 3D images are generated in real time.
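For real-time rendering as in the computer game example, the tracked eye location can seed two virtual camera positions, one per eye, from which the abstract model is rendered. A sketch under the assumption of a fixed interpupillary distance (the 63 mm figure is a common average, not a value from this disclosure):

```python
def stereo_view_positions(eye_location, ipd_m=0.063):
    """Derive per-eye virtual camera positions for real-time 3D rendering.

    eye_location is the tracked midpoint between the user's eyes in
    display coordinates (x, y, z), in meters. Returns (left, right)
    positions offset horizontally by half the assumed interpupillary
    distance; the renderer would draw the scene model once from each.
    """
    x, y, z = eye_location
    half_ipd = ipd_m / 2.0
    return (x - half_ipd, y, z), (x + half_ipd, y, z)
```

Because the positions follow the tracked eyes frame by frame, this only works while the lock on the eye location is maintained, which is why an interruption forces a switch to the 2D display mode.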
[0038] The processing includes continuing to receive captured images from the camera device 104 and continuing to perform eye tracking on these images (222), in order to maintain the lock on the location of the eyes of the user 110 within the images suitable for viewing 3D images of content on the display device 102. While the lock is maintained (i.e., the lock has not been lost, and no lock maintenance failure has occurred) (224), the processing includes continuing to cause the graphics hardware to render and transmit 3D images of the content for display by the display device 102 in the 3D display mode (220). In response to lock maintenance failure (i.e., if the lock has been lost) (224), the processing includes transmitting the command to the display device 102 to operate in (i.e., switch to operation in) the 2D display mode (210), causing the graphics hardware to render and transmit 2D images (212), and so on, as has been described.
[0039] FIG. 3 shows an example display device 102 that can switch between a 2D display mode in which images are intended to be viewed by both of the user 110’s eyes and a 3D display mode in which different images are intended to be viewed by each eye. The display device 102 includes a display panel 302. The display panel may be a liquid crystal display (LCD) panel or an organic light-emitting diode (OLED) panel, for instance, with individually addressable and controllable color pixels. For example, each pixel may be made up of a red sub-pixel, a green sub-pixel, and a blue sub-pixel to realize a full-color display.
[0040] The display device 102 includes a lenticular lens layer 304 over the display panel 302. A lenticular lens is an array of lenses, designed so that when viewed from slightly different angles, different parts of the display panel 302 are viewed. Therefore, the lenticular lens layer 304 permits different pixels of the display panel 302 to be viewed by the left and right eyes of the user 110, to provide an illusion of depth of the displayed content, and thus 3D. The display device 102 includes a transparent liquid crystal layer 306 over and adjacent to the lenticular lens layer 304 and having a surface topology that is complementary to the surface topology of the lenticular lens layer 304. That is, while the lenticular lens layer 304 is made up of a series of convex lenses, the liquid crystal layer 306 conforms to these convex lenses in a correspondingly concave manner.
[0041] A power source 316, such as a direct current (DC) power source, can assert a potential between transparent electrode layers 308 and 310 of the display device 102 respectively below and above the liquid crystal layer 306 to operate the liquid crystal layer 306 at different voltages. Operating the liquid crystal layer 306 at a first voltage, such as zero volts, causes the liquid crystal layer 306 to have a refractive index matching that of the lenticular lens layer 304. Operating the liquid crystal layer 306 at a second voltage, such as five volts, causes the liquid crystal layer 306 to have a refractive index different than that of the lenticular lens layer 304. Finally, the display device 102 can also include a bottom glass substrate layer 312 directly above the display panel 302 to protect the panel 302, and a top glass substrate layer 314 above the liquid crystal layer 306 to protect the layers 304, 306, 308, and 310.
[0042] FIGs. 4A and 4B show example operation of the display device 102 of FIG. 3 in the 2D display mode. Per FIG. 4A, the liquid crystal layer 306 is operated at a first voltage (V1), which may be zero volts, causing the liquid crystal layer 306 to have a refractive index matching that of the lenticular lens layer 304. This means that the lenticular lens layer 304 has no optical effect. That is, the liquid crystal layer 306 effectively cancels the lenticular lens layer 304, and light 402 is projected outwards from the display panel 302 as if the lenticular lens layer 304 were not present. The liquid crystal layer 306 is operated at the first voltage in response to receipt of a command for the display device 102 to operate in the 2D display mode. Per FIG. 4B, then, individual pixels 404 of the display panel 302 are viewable by both the left eye and the right eye of the user 110.
[0043] FIGs. 5A and 5B, by comparison, show example operation of the display device 102 of FIG. 3 in the 3D display mode. Per FIG. 5A, the liquid crystal layer 306 is operated at a second voltage (V2), which may be five volts, causing the liquid crystal layer 306 to have a refractive index different than that of the lenticular lens layer 304. As a result, the lenticular lens layer 304 has an optical effect. That is, light 502 projected outwards from the display panel 302 is refracted by the lenticular lens layer 304. The liquid crystal layer 306 is operated at the second voltage in response to receipt of a command for the display device 102 to operate in the 3D display mode. Per FIG. 5B, then, individual pixels 404L of the display panel 302 are viewable by just the left eye of the user 110, and individual pixels 404R are viewable by just the right eye of the user 110.
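The voltage-controlled switching of FIGs. 4A through 5B reduces to an index-matching test: refraction at the boundary between the lenticular lens layer 304 and the liquid crystal layer 306 occurs only when their refractive indices differ. The specific index values and the two-level voltage response below are illustrative assumptions, not figures from this disclosure:

```python
def display_mode_for_voltage(voltage_v, n_lens=1.50):
    """Model how the liquid crystal layer 306 gates the lenticular lens layer 304.

    At the first voltage (V1 = 0 V) the liquid crystal's assumed index
    matches the lens, so light passes as if the lens were absent (2D
    display mode). At any other voltage (e.g. V2 = 5 V) the indices
    differ, the lens refracts light toward separate per-eye pixels, and
    the device operates in the 3D display mode.
    """
    n_lc = n_lens if voltage_v == 0.0 else 1.70  # assumed switched LC index
    has_optical_effect = abs(n_lc - n_lens) > 1e-9
    return "3D" if has_optical_effect else "2D"
```

In the described device the voltage is asserted by the power source 316 across the electrode layers 308 and 310 in response to the mode-change command, rather than computed in software as sketched here.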
[0044] FIG. 6 shows the example computing device 105. The computing device 105 includes an interface 602 to communicatively connect to the display device 102 and to the camera device 104. The interface 602 that communicatively connects to the display device 102 may be the same as or different from the interface that communicatively connects to the camera device 104. The interface 602 may communicatively connect to the camera device 104 through the display device 102 in a daisy-chain manner. The interface 602 may be a wired interface and/or a wireless interface.
[0045] The computing device 105 includes graphics hardware 604, a processor 606, and memory 608. The graphics hardware 604 may be a GPU separate from the processor 606, or integrated graphics of the processor 606. The memory 608 is more generally a non-transitory computer-readable data storage medium. The processor 606 and the memory 608 may be integrated within an application-specific integrated circuit (ASIC) in the case in which the processor 606 is a special-purpose processor. The processor 606 may instead be a general-purpose processor, such as a CPU, in which case the memory 608 is separate from the processor 606.
[0046] The memory 608 stores program code 610 executable by the processor 606 to perform processing. The processing includes continually receiving, from the camera device 104, captured images in front of the display device 102 (612). The processing includes continually performing eye tracking on the continually received images to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device 102 (614). The processing includes transmitting a command to the display device 102 causing it to switch to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost (616).
[0047] The processing includes transmitting a command to the display device 102 causing it to switch to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered (618). The processing includes, while lock maintenance is interrupted, causing the graphics hardware 604 to render and transmit the 2D images of the content to the display device 102 to display in the 2D display mode (620). The processing includes, while lock maintenance is not interrupted, causing the graphics hardware 604 to render and transmit the 3D images to the display device 102 to display in the 3D display mode (622).
[0048] FIG. 7 shows an example method 700. The method 700 includes the camera device 104 continually capturing images in front of a display device 102 switchable between a 2D display mode and a 3D display mode (702). The method 700 includes, as the images are continually captured, the computing device 105 continually performing eye tracking on the images (704). The method 700 includes the computing device 105 switching the display device 102 between the 2D and 3D display modes based on the eye tracking (706). The eye tracking may be performed to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device 102.
[0049] Switching the display device 102 between the 2D and 3D display modes can thus include switching the display device 102 to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost (708). Switching the display device 102 between the 2D and 3D display modes can thus also include switching the display device to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered (710). The method 700 can include, while lock maintenance is interrupted, the computing device 105 rendering 2D images of content and the display device 102 displaying the 2D images in the 2D display mode (712). The method 700 can include, while lock maintenance is not interrupted, the computing device 105 rendering the 3D images of content and the display device 102 displaying the 3D images in the 3D display mode (714).
[0050] Techniques have been described for switching a display device 102 between 2D and 3D display modes based on eye tracking. Therefore, while the user 110 is turning his or her head or is shifting in position in front of the display device 102, the user 110 is less likely to experience a degraded user experience because the display device 102 will temporarily operate in the 2D display mode. That is, the user 110 will be less likely to view images intended for the left eye with the right eye, and vice-versa. Once a lock on the location of the user 110’s eyes in front of the display device 102 that is suitable for viewing 3D images has been recovered, the display device 102 can return to operating in the 3D display mode.

Claims

We claim:
1. A non-transitory computer-readable data storage medium storing program code executable by a processor of a computing device to perform processing comprising: receiving, from a camera device, images captured by the camera device in front of a display device switchable between a two-dimensional (2D) display mode and a three-dimensional (3D) display mode; performing eye tracking on the images to acquire a lock on a location of eyes within the images suitable to view 3D images of content on the display device; in response to unsuccessful lock acquisition, transmitting a command to the display device causing the display device to operate in the 2D display mode; and in response to successful lock acquisition, transmitting a command to the display device causing the display device to operate in the 3D display mode.
2. The non-transitory computer-readable data storage medium of claim 1, wherein the processing further comprises: in response to unsuccessful lock acquisition, causing graphics hardware of the computing device to render 2D images of the content and transmit the rendered 2D images of the content to the display device to display in the 2D display mode.
3. The non-transitory computer-readable data storage medium of claim 2, wherein the processing further comprises: in response to successful lock acquisition, causing the graphics hardware to render the 3D images of the content and transmit the rendered 3D images of the content to the display device to display in the 3D display mode.
4. The non-transitory computer-readable data storage medium of claim 1, wherein the processing further comprises, in response to unsuccessful lock acquisition:
continuing to receive the images from the camera device and continuing to perform eye tracking on the images until the lock on the location of eyes within the images has been acquired; and
once the lock on the location of eyes within the images has been acquired, transmitting a command to the display device causing the display device to switch from the 2D display mode to the 3D display mode.
5. The non-transitory computer-readable data storage medium of claim 4, wherein the processing further comprises, once the lock on the location of eyes within the images has been acquired:
causing graphics hardware of the computing device to render the 3D images of the content and transmit the rendered 3D images of the content to the display device to display in the 3D display mode.
6. The non-transitory computer-readable data storage medium of claim 1, wherein the processing further comprises, in response to successful lock acquisition:
continuing to receive the images from the camera device and continuing to perform eye tracking on the images to maintain the lock on the location of eyes within the images; and
in response to lock maintenance failure on the location of eyes within the images, transmitting a command to the display device causing the display device to switch from the 3D display mode to the 2D display mode.
7. The non-transitory computer-readable data storage medium of claim 6, wherein the processing further comprises, in response to lock maintenance failure:
causing graphics hardware of the computing device to render 2D images of the content and transmit the rendered 2D images of the content to the display device to display in the 2D display mode.
8. A computing device comprising:
an interface to communicatively connect to a display device switchable between a two-dimensional (2D) display mode and a three-dimensional (3D) display mode and to a camera device to capture images in front of the display device;
a processor; and
a memory storing program code executable by the processor to:
continually receive the captured images from the camera device;
continually perform eye tracking on the continually received images to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device;
transmit a command to the display device causing the display device to switch to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost; and
transmit a command to the display device causing the display device to switch to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered.
9. The computing device of claim 8, further comprising:
graphics hardware to render 2D images of the content,
wherein the program code is executable by the processor to further:
while lock maintenance is interrupted, cause the graphics hardware to render and transmit the 2D images of the content to the display device to display in the 2D display mode.
10. The computing device of claim 9, wherein the graphics hardware is further to render the 3D images of the content, and wherein the program code is executable by the processor to further:
while lock maintenance is not interrupted, cause the graphics hardware to render and transmit the 3D images to the display device to display in the 3D display mode.
11. The computing device of claim 8, wherein the display device is part of the computing device.
12. The computing device of claim 11, wherein the display device comprises:
a display panel;
a lenticular lens layer over the display panel; and
a liquid crystal layer over and adjacent to the lenticular lens layer and having a surface topology that is complementary to a surface topology of the lenticular lens layer,
wherein the liquid crystal layer is operable at a first voltage causing the liquid crystal layer to have a refractive index matching a refractive index of the lenticular lens layer, resulting in operation of the display device in the 2D display mode, and
wherein the liquid crystal layer is operable at a second voltage causing the liquid crystal layer to have a different refractive index than the refractive index of the lenticular lens layer, resulting in operation of the display device in the 3D display mode.
13. A method comprising:
continually capturing images in front of a display device switchable between a two-dimensional (2D) display mode and a three-dimensional (3D) display mode;
as the images are continually captured, continually performing eye tracking on the images; and
switching the display device between the 2D and 3D display modes based on the eye tracking.
14. The method of claim 13, wherein the eye tracking is performed to maintain a lock on a location of eyes within the images suitable to view 3D images of content on the display device, and switching the display device between the 2D and 3D display modes comprises:
switching the display device to the 2D display mode in response to any interruption in lock maintenance such that the lock has been lost; and
switching the display device to the 3D display mode in response to recovery from any interruption in lock maintenance such that the lock has been recovered.
15. The method of claim 14, further comprising:
while lock maintenance is interrupted, rendering 2D images of content and displaying the 2D images on the display device in the 2D display mode; and
while lock maintenance is not interrupted, rendering the 3D images of content and displaying the 3D images on the display device in the 3D display mode.
PCT/US2022/029969 2022-05-19 2022-05-19 Switching display device between 2d and 3d display modes based on eye tracking WO2023224621A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/US2022/029969 WO2023224621A1 (en) 2022-05-19 2022-05-19 Switching display device between 2d and 3d display modes based on eye tracking


Publications (1)

Publication Number Publication Date
WO2023224621A1 true WO2023224621A1 (en) 2023-11-23

Family

ID=82100657

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/029969 WO2023224621A1 (en) 2022-05-19 2022-05-19 Switching display device between 2d and 3d display modes based on eye tracking

Country Status (1)

Country Link
WO (1) WO2023224621A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0935154A2 (en) * 1998-02-09 1999-08-11 Kabushiki Kaisha Toshiba Stereoscopic image display apparatus
EP2993902A1 (en) * 2014-09-02 2016-03-09 SuperD Co. Ltd. 2d/3d switchable stereoscopic display device
EP3385779A1 (en) * 2017-04-05 2018-10-10 Koninklijke Philips N.V. Multi-view display device and method
EP3961353A1 (en) * 2020-08-24 2022-03-02 Samsung Electronics Co., Ltd. Method and apparatus for controlling head-up display based on eye tracking status


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
A.F. KLAIB: "Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies", EXPERT SYSTEMS WITH APPLICATIONS, vol. 166, 2021, pages 114037

Similar Documents

Publication Publication Date Title
TWI489147B (en) Three-dimensional image display apparatus and three-dimensional image processing method
US9716877B2 (en) 3D display device using barrier and driving method thereof
US9355455B2 (en) Image data processing method and stereoscopic image display using the same
US20130286168A1 (en) 3d display device and method
US8514275B2 (en) Three-dimensional (3D) display method and system
US8749622B2 (en) Method and system for displaying 3D images
US20100026794A1 (en) Method, System and Apparatus for Multiuser Display of Frame-Sequential Images
US9549174B1 (en) Head tracked stereoscopic display system that uses light field type data
US8564647B2 (en) Color management of autostereoscopic 3D displays
EP3100135A1 (en) Camera included in display
US10694173B2 (en) Multiview image display apparatus and control method thereof
US8723920B1 (en) Encoding process for multidimensional display
CN103096109B (en) Multiple view automatic stereoscopic displayer and display method
US9261710B2 (en) 2D quality enhancer in polarized 3D systems for 2D-3D co-existence
TW201125355A (en) Method and system for displaying 2D and 3D images simultaneously
US9167237B2 (en) Method and apparatus for providing 3-dimensional image
KR102175813B1 (en) Three dimensional image display device and method of processing image
TWI432013B (en) 3d image display method and image timing control unit
US20140071237A1 (en) Image processing device and method thereof, and program
US20160014400A1 (en) Multiview image display apparatus and multiview image display method thereof
US8847945B2 (en) Stereoscopic display device and display method thereof
US10495893B2 (en) Hardware system for inputting 3D image in flat panel
CN114503014A (en) Multi-view stereoscopic display using lens-based steerable backlight
JP2019154008A (en) Stereoscopic image display device, method for displaying liquid crystal display, and program for liquid crystal display
WO2023224621A1 (en) Switching display device between 2d and 3d display modes based on eye tracking

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22731364

Country of ref document: EP

Kind code of ref document: A1