US20170064291A1 - Display apparatus, head-mounted display apparatus, image display method, and image display system - Google Patents
- Publication number
- US20170064291A1
- Authority
- US
- United States
- Prior art keywords
- infrared
- display apparatus
- pixel
- sub
- light
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/26—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type
- G02B30/27—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes of the autostereoscopic type involving lenticular arrays
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/20—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images by providing first and second parallax images to an observer's left and right eyes
- G02B30/34—Stereoscopes providing a stereoscopic pair of separated images corresponding to parallactically displaced views of the same object, e.g. 3D slide viewers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/14—Digital output to display device ; Cooperation and interconnection of the display device with other functional units
- G06F3/147—Digital output to display device ; Cooperation and interconnection of the display device with other functional units using display panels
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/001—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background
- G09G3/003—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes using specific devices not provided for in groups G09G3/02 - G09G3/36, e.g. using an intermediate record carrier such as a film slide; Projection systems; Display of non-alphanumerical information, solely or in combination with alphanumerical information, e.g. digital display on projected diapositive as background to produce spatial visual effects
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/2003—Display of colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/22—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
- G09G3/30—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
- G09G3/32—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
- G09G3/3208—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED] organic, e.g. using organic light-emitting diodes [OLED]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/204—Image signal generators using stereoscopic image cameras
- H04N13/25—Image signal generators using stereoscopic image cameras using two or more image sensors with different characteristics other than in their location or field of view, e.g. having different resolutions or colour pickup characteristics; using image signals from one sensor to control the characteristics of another sensor
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/271—Image signal generators wherein the generated image signals comprise depth maps or disparity maps
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/324—Colour aspects
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/30—Transforming light or analogous information into electric information
- H04N5/33—Transforming infrared radiation
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/141—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light conveying information used for selecting or modulating the light emitting or modulating element
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/04—Exchange of auxiliary data, i.e. other than image data, between monitor and graphics controller
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2370/00—Aspects of data communication
- G09G2370/22—Detection of presence or absence of input display information or of connection or disconnection of a corresponding information source
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/3406—Control of illumination source
- G09G3/3413—Details of control of colour illumination sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/361—Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
Definitions
- One or more exemplary embodiments relate to a display apparatus, a head-mounted display apparatus, an image display method, and an image display system, and more particularly, to a display apparatus, a head-mounted display apparatus, an image display method, and an image display system of displaying a three-dimensional (“3D”) image by emitting or using infrared rays having depth information.
- a head-mounted display apparatus typically refers to a display apparatus configured to be mounted on a user's head in the form of glasses or a helmet.
- images are displayed in front of the eyes of the user so that the user may recognize the images.
- the head-mounted display apparatus may display the images using self-generated light and/or light incident from an external source.
- a light-emitting diode is a semiconductor device, in particular, a p-n junction diode that converts energy, which is generated by a recombination of holes and electrons, into light energy.
- when a voltage is applied to the p-n junction diode in a forward direction, holes and electrons are injected, and a recombination of the holes and the electrons generates energy.
- Inorganic LEDs emit light using inorganic compounds.
- the inorganic LEDs may include red, yellow, blue, white, ultraviolet and infrared LEDs.
- the inorganic LEDs are widely used in backlights of liquid crystal display (“LCD”) devices, lighting devices, and electronic displays, for example.
- organic LEDs emit light using organic compounds, and are widely used in small to large electronic devices, e.g., mobile phones and large screen display devices.
- a display apparatus such as a television (“TV”) may use a polarizer, a lens array, or a shutter to display a 3D image.
- the method above has limited viewpoints and cannot simultaneously display two-dimensional (“2D”) images and 3D images.
- One or more exemplary embodiments include an image display method, a display apparatus, and a head-mounted display apparatus for providing a continuous wide viewing angle for a 3D display apparatus and generating augmented reality by connecting a 3D display apparatus with the head-mounted display apparatus.
- a display apparatus includes a first pixel, and a second pixel.
- each of the first and second pixels includes a first sub-pixel which emits light having a first color, a second sub-pixel which emits light having a second color different from the first color, a third sub-pixel which emits light having a third color different from the first and second colors, and an infrared sub-pixel which emits infrared light.
- the infrared light emitted from the infrared sub-pixel in the first pixel and the infrared light emitted from the infrared sub-pixel in the second pixel have different intensities from each other.
- the first color, the second color and the third color may be red, green and blue, respectively.
- the infrared light emitted from the infrared sub-pixel in the first pixel and the infrared light emitted from the infrared sub-pixel in the second pixel may have substantially the same frequency as each other.
- the display apparatus may further include a plurality of pixels including the first and second pixels, and a controller which controls intensities of infrared light emitted by an infrared sub-pixel in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels.
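The controller described above drives each pixel's infrared sub-pixel at an intensity derived from that pixel's depth, so that depth differences between pixels appear as intensity differences in the emitted infrared light. A minimal sketch follows; the linear mapping, the 0-255 drive range, and the function name are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the controller's depth-to-infrared mapping.
# The 0-255 drive range and linear scaling are illustrative assumptions.

def depth_to_ir_levels(depth_map, max_depth):
    """Convert per-pixel depth values into IR sub-pixel drive levels (0-255).

    A larger depth difference between two pixels yields a larger difference
    in the intensity of the infrared light they emit.
    """
    return [
        [round(255 * d / max_depth) for d in row]
        for row in depth_map
    ]

# Two adjacent pixels at different depths get different IR drive levels.
levels = depth_to_ir_levels([[10, 200]], max_depth=200)
```

Because the visible sub-pixels are unaffected, a viewer without an infrared receiver still sees an ordinary 2D image, while the infrared channel carries the depth map.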
- the infrared sub-pixel may include an infrared driver circuit, and an infrared inorganic light-emitting diode (“LED”) electrically connected to and driven by the infrared driver circuit.
- the display apparatus may further include a first electrode electrically connected to the infrared driver circuit and contacting an end of the infrared inorganic LED, and a second electrode facing the first electrode and contacting another end of the infrared LED.
- the second electrode may be commonly disposed in the first sub-pixel, the second sub-pixel, the third sub-pixel and the infrared sub-pixel.
- the infrared sub-pixel may further include a light spreading layer which spreads infrared light emitted by the infrared inorganic LED.
- each of the first to third sub-pixels may include an organic light emitting diode (“OLED”).
- each of the first to third sub-pixels may include an inorganic LED.
- a head-mounted display apparatus includes a camera which receives visible light emitted by an object and converts the visible light into an electric signal, an infrared sensor which receives infrared light emitted by the object, a signal processor which generates 3D rendering data based on color data obtained by the camera and depth data obtained by the infrared sensor, and a display unit which receives the 3D rendering data from the signal processor and displays an image corresponding to the 3D rendering data.
- the signal processor may include a data matching unit which matches the color data and the depth data based on a location of the object that emits the visible light and the infrared light.
- the head-mounted display apparatus may further include an optical device located on a path of light emitted by the display unit and which focuses the light on a predetermined area.
- the head-mounted display apparatus may further include a frame which accommodates the camera, the infrared sensor, the signal processor, and the display unit.
- the frame may be shaped to be mounted on a head of a user.
- the head-mounted display apparatus may further include a lens unit accommodated in the frame and located between the object and the user.
- the lens unit may include a transmittance adjusting unit which adjusts a transmittance of light incident from the object.
- the transmittance adjusting unit may include a liquid crystal.
- an image display method using a head-mounted display apparatus includes receiving visible light and infrared light from a display apparatus, extracting color data and depth data from the visible light and the infrared light, generating 3D rendering data based on the color data and the depth data, and displaying an image corresponding to the 3D rendering data on the head-mounted display apparatus.
- the display apparatus may include a plurality of pixels, and each of the plurality of pixels may include a visible light sub-pixel which emits the visible light and an infrared sub-pixel which emits the infrared light.
- the display apparatus may further include a controller which controls intensities of infrared light emitted by the infrared sub-pixel in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels.
- the infrared sub-pixel may include an infrared driver circuit, and an infrared inorganic light-emitting diode (“LED”) electrically connected to and driven by the infrared driver circuit.
- the method may further include, before the generating the 3D rendering data, matching the color data and the depth data based on respective locations of the plurality of pixels in the display apparatus which emits the visible light and the infrared light.
- an image display system includes a display apparatus including a plurality of pixels emitting visible light and infrared light, and a head-mounted display apparatus configured to receive the visible light and the infrared light from the display apparatus and display an image.
- the head-mounted display apparatus includes a camera which receives the visible light emitted by the display apparatus and converts the visible light into an electric signal, an infrared sensor which receives the infrared light emitted by the display apparatus, a signal processor which generates three-dimensional rendering data based on color data obtained by the camera and depth data obtained by the infrared sensor, and a display unit which receives the three-dimensional rendering data from the signal processor and displays an image corresponding to the three-dimensional rendering data.
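The signal-processor stage described above, which pairs the camera's color data with the infrared sensor's depth data by pixel location before rendering, can be sketched in a minimal form. The function and record names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the signal processor: match color data (from the
# camera) with depth data (from the infrared sensor) by pixel location,
# then emit per-pixel 3D rendering records.

def generate_3d_rendering_data(color_map, depth_map):
    """Pair each color sample with the depth sample at the same location."""
    assert len(color_map) == len(depth_map), "maps must cover the same pixels"
    rendered = []
    for y, (color_row, depth_row) in enumerate(zip(color_map, depth_map)):
        for x, (rgb, z) in enumerate(zip(color_row, depth_row)):
            # One record per pixel: 2D location, depth, and color.
            rendered.append({"x": x, "y": y, "z": z, "rgb": rgb})
    return rendered

# A single red pixel at depth 42 becomes one 3D rendering record.
data = generate_3d_rendering_data([[(255, 0, 0)]], [[42]])
```

The matching step assumes the camera and infrared sensor are registered to the same pixel grid; in practice the data matching unit would align the two sensors' coordinate systems first.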
- FIG. 1 is a schematic block diagram of a display apparatus, according to an exemplary embodiment
- FIG. 2 is a plan view of two adjacent pixels in the display apparatus of FIG. 1 ;
- FIG. 3 is a cross-sectional view taken along line III-III of a pixel of FIG. 2 ;
- FIG. 4 is a schematic cross-sectional view of a display apparatus according to another exemplary embodiment
- FIG. 5 is a schematic perspective view of a head-mounted display apparatus, according to an exemplary embodiment
- FIG. 6 is a schematic conceptual view of some components in the head-mounted display apparatus of FIG. 5 ;
- FIG. 7 is a schematic cross-sectional view of an exemplary embodiment of a lens unit in the head-mounted display apparatus of FIG. 5 ;
- FIG. 8 is a flowchart of an image display method, according to an exemplary embodiment.
- FIG. 9 is a conceptual view of a system for providing the image display method of FIG. 8 , according to an exemplary embodiment.
- “first,” “second,” “third,” etc. may be used herein to describe various elements, components, regions, layers and/or sections, but these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, “a first element,” “component,” “region,” “layer” or “section” discussed below could be termed a second element, component, region, layer or section without departing from the teachings herein.
- relative terms such as “lower” or “bottom” and “upper” or “top” may be used herein to describe one element's relationship to another element as illustrated in the Figures. It will be understood that relative terms are intended to encompass different orientations of the device in addition to the orientation depicted in the Figures. For example, if the device in one of the figures is turned over, elements described as being on the “lower” side of other elements would then be oriented on “upper” sides of the other elements. The exemplary term “lower” can, therefore, encompass both an orientation of “lower” and “upper,” depending on the particular orientation of the figure.
- “About” or “approximately” as used herein is inclusive of the stated value and means within an acceptable range of deviation for the particular value as determined by one of ordinary skill in the art, considering the measurement in question and the error associated with measurement of the particular quantity (i.e., the limitations of the measurement system). For example, “about” can mean within one or more standard deviations, or within ±30%, ±20%, ±10%, or ±5% of the stated value.
- Exemplary embodiments are described herein with reference to cross section illustrations that are schematic illustrations of idealized embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments described herein should not be construed as limited to the particular shapes of regions as illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, a region illustrated or described as flat may, typically, have rough and/or nonlinear features. Moreover, sharp angles that are illustrated may be rounded. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the precise shape of a region and are not intended to limit the scope of the present claims.
- FIG. 1 is a schematic block diagram of a display apparatus 100 , according to an exemplary embodiment.
- FIG. 2 is a plan view of two adjacent pixels in the display apparatus 100 of FIG. 1 .
- FIG. 3 is a cross-sectional view taken along line III-III of the pixel of FIG. 2 .
- an exemplary embodiment of the display apparatus 100 may include a display panel 10 , a driver 20 , and a processor 30 .
- the display panel 10 may include a plurality of pixels, including a first pixel P 1 and a second pixel P 2 .
- the driver 20 may include a scan driver and a data driver that respectively apply scan signals and data signals to scan lines and data lines, which are connected to the plurality of pixels.
- the driver 20 is connected with the processor 30 , and may receive information from the processor 30 , for example, information on timing for applying the scan signals and the data signals to the plurality of pixels and amplitude of signals.
- FIG. 2 schematically illustrates a structure of the first and second pixels P 1 and P 2 from among the plurality of pixels.
- the first and second pixels P 1 and P 2 are adjacent to each other, but exemplary embodiments are not limited thereto.
- other pixels may be provided between the first and second pixels P 1 and P 2 .
- Each of the first and second pixels P 1 and P 2 may include a first sub-pixel SP 1 that emits light having a first color, a second sub-pixel SP 2 that emits light having a second color that is different from the first color, a third sub-pixel SP 3 that emits light having a third color that is different from the first and second colors, and an infrared sub-pixel IR that emits infrared light.
- the infrared light emitted from the infrared sub-pixel IR in the first pixel P 1 and the infrared light emitted from the infrared sub-pixel IR in the second pixel P 2 have different intensities from each other.
- since the infrared light is provided so that depth information is included in light with a predetermined color emitted from the first and second pixels P 1 and P 2 , a difference between intensities of infrared light emitted from the first and second pixels P 1 and P 2 may correspond to a depth difference between the first and second pixels P 1 and P 2 .
- the term ‘depth’ refers to a distance from an arbitrary point. Depth information of an image may indicate 3D information of the image. That is, the display apparatus 100 includes the plurality of pixels including the first and second pixels P 1 and P 2 , and may display images by using the plurality of pixels. Each of the plurality of pixels may be turned on or off. When the pixels are turned on, light having a predetermined color may be emitted by using a combination of first to third sub-pixels SP 1 to SP 3 that emit visible light in different colors. A display apparatus including only pixels that include the first to third sub-pixels SP 1 to SP 3 may only display a two-dimensional (“2D”) image.
- the display apparatus 100 includes the infrared sub-pixel IR that may include depth information in each of the plurality of pixels, and may obtain depth information of an image displayed by the display apparatus 100 from a combination of infrared light emitted from each of the plurality of pixels.
- the display apparatus 100 may obtain 3D information of an image displayed by the display apparatus 100 from a combination of the depth information and 2D image.
- the first color, the second color and the third color may be red, green and blue, respectively, but not being limited thereto.
- the first to third colors may be other colors that show white light when combined, e.g., other primary colors.
- the infrared sub-pixels IR of the first and second pixels P 1 and P 2 may emit infrared light having substantially the same frequency as each other, but exemplary embodiments are not limited thereto.
- the infrared sub-pixels IR of the first and second pixels P 1 and P 2 may emit infrared light having different frequencies from each other, respectively.
- the processor 30 may include a controller 31 that controls intensity of infrared light emitted by the infrared sub-pixel IR in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels in the display apparatus 100 .
- the controller 31 may control the intensity of infrared light by controlling the amplitude of data signals applied to the infrared sub-pixel IR or by controlling the time during which the scan signals are applied to the infrared sub-pixel IR.
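The two control schemes mentioned here, varying the data-signal amplitude versus varying how long the sub-pixel is driven during a frame, can be contrasted in a small sketch. The parameter names and unit conventions are assumptions for illustration only.

```python
# Illustrative contrast of the two IR-intensity control schemes:
# amplitude modulation vs. emission-time (duty-cycle) modulation.

def ir_drive(target, mode, frame_time=1.0, max_amplitude=1.0):
    """Return (amplitude, on_time) producing a target relative intensity.

    target is the desired time-averaged intensity in [0, 1].
    """
    if mode == "amplitude":
        # Vary the data-signal amplitude; emit for the whole frame.
        return (max_amplitude * target, frame_time)
    # Otherwise vary how long the scan signal keeps the sub-pixel emitting,
    # at a fixed amplitude (duty-cycle control).
    return (max_amplitude, frame_time * target)

# Either scheme yields the same time-averaged intensity (0.5 here):
a_amp, t_amp = ir_drive(0.5, "amplitude")
a_pwm, t_pwm = ir_drive(0.5, "duty")
```

Both knobs reach the same average output; which one a driver circuit uses depends on the LED's linearity and the panel's scan architecture, a design trade-off the patent leaves open.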
- the processor 30 may further include a calculator and a register.
- the processor 30 may process signals by using the controller 31 so that the display apparatus 100 may display a 2D image including 3D information.
- a user in front of the display apparatus 100 may not see infrared light without an additional device, and may only see the 2D image displayed by the display apparatus 100 .
- with an additional device for receiving and processing infrared light emitted by the display apparatus 100 , the user may see a 3D image displayed by the display apparatus 100 , which will be described later in detail.
- FIG. 2 illustrates an exemplary embodiment in which sub-pixels in the first and second pixels P 1 and P 2 are arranged in the form of a 2×2 matrix
- exemplary embodiments are not limited thereto, and the sub-pixels may be arranged in various ways.
- FIG. 3 is a cross-sectional view of the third sub-pixel SP 3 and the infrared sub-pixel IR of the first pixel P 1 of FIG. 2 , according to an exemplary embodiment.
- some elements in the third sub-pixel SP 3 and the infrared sub-pixel IR will be described in detail with reference to FIG. 3 .
- a buffer layer 111 is on a substrate 110 .
- a driver circuit including a transistor T IR and a capacitor (not shown), and an infrared inorganic light-emitting diode (“LED”) LED IR that is connected to and driven by the driver circuit are disposed in an area of the buffer layer 111 corresponding to the infrared sub-pixel IR.
- the substrate 110 may include glass or plastic.
- the buffer layer 111 may effectively prevent impurities from penetrating to the driver circuit from the substrate 110 , and may planarize a surface of the substrate 110 .
- the buffer layer 111 may have a single layer structure or a multi-layer structure including a layer of an inorganic material such as silicon nitride (SiN x ) and/or silicon oxide (SiO x ).
- the transistor T IR may include an active layer 112 , a gate electrode 114 , a source electrode 116 S and a drain electrode 116 D.
- the active layer 112 may have a source area and a drain area that are conductive, and a channel area between the source and drain areas.
- the gate electrode 114 may be disposed on the active layer 112 but insulated from the active layer 112 .
- the source electrode 116 S and the drain electrode 116 D may be electrically connected with the source area and the drain area of the active layer 112 , respectively. At least one of the source electrode 116 S and the drain electrode 116 D may be omitted.
- a first insulating layer 113 may be disposed between the active layer 112 and the gate electrode 114 .
- a second insulating layer 115 may be disposed on the first insulating layer 113 and cover the gate electrode 114 .
- the first insulating layer 113 and the second insulating layer 115 may have a single layer structure or a multilayer structure including a layer of an inorganic material such as silicon nitride (SiN x ) and/or silicon oxide (SiO x ).
- a third insulating layer 117 may be disposed on the second insulating layer 115 and cover the source electrode 116 S and the drain electrode 116 D.
- the third insulating layer 117 may include an organic material and/or an inorganic material.
- although FIG. 3 illustrates that the gate electrode 114 of the transistor T IR is disposed above the active layer 112 , exemplary embodiments are not limited thereto.
- the gate electrode 114 may be disposed under the active layer 112 .
- a bank 170 may be disposed on the third insulating layer 117 and define a sub-pixel area.
- the bank 170 may include a concave area 170 a that accommodates the infrared inorganic LED LED IR .
- a height of the bank 170 may be determined based on a height of the infrared inorganic LED LED IR and a viewing angle.
- a size (e.g., a width) of the concave area 170 a may be determined based on resolution of the display apparatus 100 .
- although FIG. 2 illustrates that the concave area 170 a is square-shaped, exemplary embodiments are not limited thereto.
- the concave area 170 a may have various shapes, for example, a polygonal, rectangular, circular, oval, or triangular shape.
- a first electrode 120 a may be disposed on a side surface and a bottom surface of the concave area 170 a and at least a portion of an upper surface of the bank 170 .
- the first electrode 120 a may be electrically connected to the source electrode 116 S or the drain electrode 116 D of the transistor T IR via a hole H formed in the third insulating layer 117 .
- the bank 170 may function as a light blocking unit and include a material with low light transmittance.
- the bank 170 may effectively prevent light from being emitted through a side surface of the infrared inorganic LED LED IR , and thus effectively prevent interference with light emitted from adjacent sub-pixels.
- the bank 170 may increase a bright room contrast ratio (“BRCR”) of the display apparatus 100 by absorbing and blocking light incident from an external source outside the display apparatus 100 .
- the bank 170 may include a semi-transparent material, an optical reflective material, or a light spreading material.
- the infrared inorganic LED LED IR may be disposed in the concave area 170 a of the bank 170 .
- the infrared inorganic LED LED IR may be, but is not limited to, a micro LED having a size of about 1 micrometer ( μm) to about 100 μm.
- a single piece or a plurality of the infrared inorganic LED LED IR may be picked up by a transfer device from a wafer, transferred to the substrate 110 , and then, accommodated in the concave area 170 a.
- the infrared inorganic LED LED IR may emit infrared light with a wavelength of about 700 nanometers (nm) to about 1 mm. Infrared light may not be visible to the user's eyes.
- the infrared inorganic LED LED IR may include a p-n junction diode 140 a, a first contact electrode 130 a and a second contact electrode 150 a.
- the first contact electrode 130 a and/or the second contact electrode 150 a may have a single-layer structure or a multi-layer structure including at least one of metal, conductive oxide, and conductive polymer.
- the first contact electrode 130 a and the second contact electrode 150 a may selectively include a reflective layer, for example, a layer of silver.
- the first contact electrode 130 a may be electrically connected to the first electrode 120 a.
- the second contact electrode 150 a may be electrically connected to a second electrode 160 .
- the p-n junction diode 140 a may include a p-doping layer 141 a, an n-doping layer 142 a, and an intermediate layer 143 a between the p-doping layer 141 a and the n-doping layer 142 a.
- the intermediate layer 143 a is an area that emits light as excitons, generated by a recombination of electrons and holes, transition from a higher energy level to a lower energy level.
- the intermediate layer 143 a includes a semiconductor material and may have a single quantum well or a multi quantum well structure.
- the first electrode 120 a may include a reflective electrode
- the second electrode 160 may include a transparent or semi-transparent electrode.
- the second electrode 160 may be commonly disposed in the plurality of pixels in the display apparatus 100 as a common electrode.
- a passivation layer 180 may surround at least a portion of the infrared inorganic LED LED IR in the concave area 170 a, and may cover the bank 170 .
- the passivation layer 180 may have a predetermined height such that an upper portion of the infrared inorganic LED LED IR , for example, the second contact electrode 150 a, is not covered. Therefore, the second contact electrode 150 a may not be covered by, but exposed through, the passivation layer 180 .
- the exposed second contact electrode 150 a may be electrically connected to the second electrode 160 .
- an exemplary embodiment of the display apparatus 100 may further include a light spreading layer (not shown) that spreads infrared light.
- the light spreading layer may be disposed on a path of infrared light emitted by the infrared inorganic LED LED IR .
- the light spreading layer may be arranged in various locations and shapes.
- the light spreading layer may allow the infrared inorganic LED LED IR to uniformly emit infrared light from a front surface of the infrared sub-pixel IR to the outside, and increase an angle range, e.g., a viewing angle, of infrared light emitted by the display apparatus 100 .
- a driver circuit including a transistor T SP3 and a capacitor, and an inorganic LED LED SP3 that is electrically connected to the driver circuit and driven by the driver circuit, may be disposed in an area corresponding to the third sub-pixel SP 3 .
- the third sub-pixel SP 3 may have substantially the same structure as the infrared sub-pixel IR described above, except for a difference in wavelengths of light emitted by the inorganic LED.
- the third sub-pixel SP 3 may include a first electrode 120 b electrically connected to the transistor T SP3 , and the inorganic LED LED SP3 on the first electrode 120 b.
- the inorganic LED LED SP3 may include a first contact electrode 130 b that is electrically connected to the first electrode 120 b, a p-n junction diode 140 b including a p-doping layer 141 b, an n-doping layer 142 b and an intermediate layer 143 b on the first contact electrode 130 b, and a second contact electrode 150 b on the p-n junction diode 140 b and electrically connected to the second electrode 160 .
- the first and second sub-pixels SP 1 and SP 2 of FIG. 2 may have the same structure as the third sub-pixel SP 3 , except for a difference in colors of light emitted by the inorganic LED.
- the first to third sub-pixels SP 1 to SP 3 and the infrared sub-pixel IR may be disposed or transferred by an identical LED transfer device.
- the size of the display apparatus 100 may be easily reduced by including a small inorganic LED.
- the infrared sub-pixel IR is included in each of the plurality of pixels of the display apparatus 100 , such that both a 2D image and depth information corresponding to each of the plurality of pixels may be displayed.
- FIG. 4 is a schematic cross-sectional view of a display apparatus 200 according to another exemplary embodiment.
- an exemplary embodiment of the display apparatus 200 may include a plurality of pixels, including a visible light sub-pixel SP that emits visible light and an infrared sub-pixel IR that emits infrared light.
- the display apparatus 200 includes a substrate 210 , and a buffer layer 211 on the substrate 210 .
- a driver circuit including at least one transistor T IR and at least one capacitor (not shown), and an infrared inorganic LED LED IR that is connected to and driven by the driver circuit are disposed in an area of the buffer layer 211 corresponding to the infrared sub-pixel IR.
- the transistor T IR may include an active layer 212 , a gate electrode 214 , a source electrode 216 S, and a drain electrode 216 D.
- a first insulating layer 213 may be disposed between the active layer 212 and the gate electrode 214 .
- a second insulating layer 215 may be disposed on the first insulating layer 213 and cover the gate electrode 214 .
- a third insulating layer 217 may be disposed on the second insulating layer 215 and cover the source electrode 216 S and the drain electrode 216 D.
- a bank 270 may be disposed on the third insulating layer 217 and define a sub-pixel area.
- the bank 270 may include a concave area 270 a that accommodates the infrared inorganic LED LED IR .
- a first electrode 220 a is disposed on the third insulating layer 217 .
- the first electrode 220 a may be electrically connected with the transistor T IR via a hole H formed in the third insulating layer 217 . Both ends of the first electrode 220 a may be covered by the bank 270 .
- the infrared inorganic LED LED IR may be disposed in the concave area 270 a of the bank 270 .
- the infrared inorganic LED LED IR may be a micro LED with a size (e.g., a length or width) of about 1 ⁇ m to about 100 ⁇ m that emits infrared light with a wavelength in a range of about 700 nm to about 1 mm.
- the infrared inorganic LED LED IR may include a p-n junction diode 240 a, a first contact electrode 230 a and a second contact electrode 250 a.
- the p-n junction diode 240 a may include a p-doping layer 241 a, an n-doping layer 242 a, and an intermediate layer 243 a between the p-doping layer 241 a and the n-doping layer 242 a.
- the first electrode 220 a may include a reflective electrode, and a second electrode 260 may include a transparent or semi-transparent electrode.
- the second electrode 260 may be commonly disposed in the plurality of pixels in the display apparatus 200 as a common electrode.
- a passivation layer 280 may surround at least a portion of the infrared inorganic LED LED IR in the concave area 270 a, and may cover the bank 270 and the infrared inorganic LED LED IR .
- the passivation layer 280 may have a predetermined height such that the second contact electrode 250 a of the infrared inorganic LED LED IR is not covered. Therefore, the second contact electrode 250 a may not be covered by, but exposed through, the passivation layer 280 .
- the exposed second contact electrode 250 a may be electrically connected to the second electrode 260 .
- a driver circuit including a transistor T SP and a capacitor, and an organic LED OLED SP that is electrically connected to the driver circuit and driven by the driver circuit, may be disposed in an area corresponding to the visible light sub-pixel SP.
- the visible light sub-pixel SP may include the organic LED OLED SP that includes a first electrode 220 b electrically connected to the transistor T SP , a second electrode 260 facing the first electrode 220 b, and an organic emission layer 240 b between the first electrode 220 b and the second electrode 260 .
- the display apparatus 200 may include the visible light sub-pixel SP that includes the organic LED OLED SP that is appropriate for a large screen display apparatus and has fast response speed, and the infrared sub-pixel IR that includes the inorganic LED LED IR that generates infrared light.
- the display apparatus 200 may not only display a 2D image but also depth information corresponding to each of the plurality of pixels by including the infrared sub-pixel IR in each of the plurality of pixels.
- FIG. 5 is a schematic perspective view of a head-mounted display apparatus 300 , according to an exemplary embodiment.
- FIG. 6 is a schematic conceptual view of some components in the head-mounted display apparatus 300 of FIG. 5 .
- an exemplary embodiment of the head-mounted display apparatus 300 may include a camera 310 that receives visible light emitted by an object and converts the visible light into an electric signal, an infrared sensor 320 that receives infrared light emitted by the object, a signal processor 330 that generates 3D rendering data based on color data obtained by the camera 310 and depth data obtained by the infrared sensor 320 , and a display unit 340 that receives 3D rendering data from the signal processor 330 and displays an image corresponding to the 3D rendering data.
- the camera 310 may include an image sensor (not shown) such as a charge coupled device (“CCD”) or a complementary metal-oxide semiconductor (“CMOS”), and an optical system (not shown) that focuses light incident from the object.
- An infrared ray block filter and/or an ultraviolet ray block filter may be disposed in front of the image sensor.
- the infrared sensor 320 may also include an image sensor (not shown) such as a CCD or a CMOS.
- the infrared sensor 320 may further include a band-pass filter that passes infrared rays of a certain frequency range and/or a block filter that blocks light having a wavelength range equal to or shorter than that of visible light rays.
- the camera 310 may obtain the color data of the object, e.g., color data of a 2D image of the object.
- the depth data of the object may be obtained by the infrared sensor 320 that receives infrared light including the depth information.
- the object may be the display apparatus 100 of FIG. 2 in which each of the plurality of pixels include the infrared sub-pixel IR.
- the signal processor 330 may generate 3D rendering data based on the color data obtained by the camera 310 and the depth data obtained by the infrared sensor 320 .
- 3D rendering refers to a process of generating a 3D image by using a 2D image based on shadows, colors and density thereof, or a process of adding a 3D effect to a 2D image by changing shadows or density.
- the signal processor 330 of the head-mounted display apparatus 300 may generate the 3D rendering data by combining the 2D image obtained by the camera 310 with the depth information obtained by the infrared sensor 320 and thus changing shadows or density of the 2D image.
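The rendering idea described above, combining a 2D image with depth information by changing its shading or density, can be sketched in a few lines. This is a hedged illustration, not the patent's actual rendering algorithm: the function name `render_3d_effect`, the grayscale image representation, and the blend factor `k` are all assumptions.

```python
# Illustrative sketch: add a depth cue to a 2D image by rescaling each
# pixel's brightness ("density") according to its matched depth value.
def render_3d_effect(color, depth, k=0.5):
    """color: 2D list of grayscale values 0..255;
    depth: 2D list of values 0.0 (far) .. 1.0 (near).
    Returns a shaded image in which farther pixels appear darker."""
    out = []
    for crow, drow in zip(color, depth):
        out.append([
            min(255, round(c * ((1 - k) + k * d)))  # attenuate far pixels
            for c, d in zip(crow, drow)
        ])
    return out

# The near pixel keeps full brightness; the far pixel is attenuated by (1 - k).
shaded = render_3d_effect([[200, 200]], [[1.0, 0.0]])
```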
- the signal processor 330 may include a data matching unit 331 that matches the color data and the depth data based on a location of the object that emits visible light and infrared light.
- the object may be the display apparatus 100 of FIG. 2 .
- the data matching unit 331 may match a value of a pixel from the color data to a value of a pixel in the depth data corresponding to the pixel in the color data.
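The pixel-wise matching described above can be sketched as resampling the depth data onto the color-data grid, so that every color pixel is paired with exactly one depth value. This is only a minimal stand-in for the geometric alignment the data matching unit 331 would perform; the nearest-neighbor scheme and the function name are illustrative assumptions.

```python
# Hypothetical sketch: match a low-resolution depth map (from the infrared
# sensor) onto the color image grid (from the camera) by nearest-neighbor
# resampling, assuming both frame the same display apparatus.
def match_depth_to_color(depth, color_h, color_w):
    """Resample the depth map onto a color_h x color_w grid so each color
    pixel has one corresponding depth value."""
    dh, dw = len(depth), len(depth[0])
    return [
        [depth[y * dh // color_h][x * dw // color_w] for x in range(color_w)]
        for y in range(color_h)
    ]

# A 2x2 depth map matched onto a 4x4 color grid
aligned = match_depth_to_color([[0.1, 0.9], [0.4, 0.6]], 4, 4)
```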
- the display unit 340 may be a small display device that may be mounted on the head-mounted display apparatus 300 , for example, an organic light-emitting display or a liquid crystal display (“LCD”) device.
- the 3D rendering data generated by the signal processor 330 may be input to the display unit 340 .
- the display unit 340 may display an image that corresponds to the 3D rendering data.
- the image may be a 3D image, in particular, a 2D image with a 3D effect.
- optical devices R 1 and R 2 , which change the path of light, and an optical device 350 that converges light to a predetermined area may be disposed on a path of light emitted by the display unit 340 .
- the predetermined area may be a crystalline lens 41 of an eye 40 of the user. Light converged to the crystalline lens 41 may pass through the lens 41 and reach a retina 42 of the eye 40 of the user.
- the shortest focal length of the eye of a person may be about 20 centimeters (cm) or more. According to an exemplary embodiment, even when a distance between the eye 40 of the user and the display unit 340 is smaller than the shortest focal length, the shortest focal length may be provided by the optical device 350 between the display unit 340 and the eye 40 . In such an embodiment, the shortest focal length is effectively provided, such that the user may clearly and easily recognize the image displayed by the display unit 340 .
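The point above can be made concrete with the thin-lens equation, 1/f = 1/d_o + 1/d_i (with d_i negative for a virtual image): a converging optical device lets a display placed inside its focal length appear as a virtual image farther away than the eye's near point. The numbers below are illustrative assumptions, not values from the patent.

```python
# Worked thin-lens example: a display closer than the eye's ~20 cm near
# point is imaged as a virtual image beyond that distance.
def virtual_image_distance(d_object_cm, focal_cm):
    """Distance of the virtual image formed when the display sits inside
    the focal length of the converging optical device."""
    d_i = 1.0 / (1.0 / focal_cm - 1.0 / d_object_cm)
    return abs(d_i)  # d_i is negative for a virtual image

# Display 5 cm from the optical device, focal length 6 cm:
# the virtual image appears 30 cm away, beyond the ~20 cm near point.
d = virtual_image_distance(5.0, 6.0)
```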
- FIGS. 5 and 6 illustrate an exemplary embodiment, where the display unit 340 is located beside the eye 40 , rather than in front of the eye 40 , and the optical devices R 1 and R 2 change the path of light emitted by the display unit 340 toward a direction of the eye 40 .
- exemplary embodiments are not limited thereto.
- the display unit 340 may be located in front of the eye 40 , and the optical devices R 1 and R 2 may be omitted.
- the display unit 340 may be a transparent display by which the user may not only see the image displayed by the display unit 340 , but also see the external background.
- the head-mounted display apparatus 300 may include a frame 360 that accommodates the camera 310 , the infrared sensor 320 , the signal processor 330 and the display unit 340 .
- the frame 360 may be shaped such that the frame 360 may be disposed or mounted on the head of the user.
- the frame 360 may include a lens unit 370 that is disposed between the object and the user.
- the lens unit 370 may include a transparent or semi-transparent lens to generate augmented reality.
- the user may not only see the image displayed by the display unit 340 in the head-mounted display apparatus 300 , but also the background image passing through the lens unit 370 .
- the lens unit 370 may be configured as an opaque lens to generate virtual reality.
- the user wearing the head-mounted display apparatus 300 may only see the image displayed by the display unit 340 .
- FIG. 7 is a schematic cross-sectional view of an exemplary embodiment of the lens unit 370 in the head-mounted display apparatus 300 of FIG. 5 .
- the lens unit 370 of the head-mounted display apparatus 300 of FIG. 5 may be a lens unit 470 shown in FIG. 7 .
- the lens unit 470 may be disposed between the object and the user, be accommodated in the frame 360 of FIG. 5 , and include a transmittance adjusting unit for adjusting transmittance of light incident from the object.
- the transmittance adjusting unit may include a liquid crystal 473 (e.g., a liquid crystal layer or liquid crystal molecules).
- the lens unit 470 may include a first polarizer 471 , a first substrate 472 , the liquid crystal 473 , a second substrate 474 , and a second polarizer 475 .
- the transmittance of the lens unit 470 may be adjusted by controlling an arrangement direction of the liquid crystal 473 by applying an electric field to the liquid crystal 473 .
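As a hedged illustration of this transmittance control: for an idealized liquid-crystal cell between polarizers, transmittance follows a cos² law (Malus's law) of the effective angle between the light's polarization and the analyzer, and driving the cell electrically changes that angle. Real cells (twisted nematic, etc.) behave more intricately; the sketch below only shows the two limiting modes.

```python
import math

def lens_transmittance(theta_deg):
    """Fractional transmittance for effective angle theta between the
    polarization direction and the second polarizer (Malus's law)."""
    return math.cos(math.radians(theta_deg)) ** 2

# theta = 0: transparent lens unit (augmented-reality mode);
# theta = 90: opaque lens unit (virtual-reality mode)
ar_mode = lens_transmittance(0)
vr_mode = lens_transmittance(90)
```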
- the head-mounted display apparatus 300 may selectively display augmented reality in which the image displayed by the display unit 340 of FIG. 5 and the external background are visible, or virtual reality in which the external background is not visible.
- FIG. 7 illustrates an exemplary embodiment of the transmittance adjusting unit including the liquid crystal 473
- exemplary embodiments are not limited thereto.
- the transmittance adjusting unit may have various structures; for example, a light blocking unit may be selectively disposed in front of a transparent lens to transmit or block light incident from an external background.
- FIG. 8 is a flowchart of an image display method, according to an exemplary embodiment.
- FIG. 9 is a conceptual view of a system for providing the image display method of FIG. 8 , according to an exemplary embodiment.
- an exemplary embodiment of the image display method of the head-mounted display apparatus 300 may include receiving visible light and infrared light emitted from the display apparatus 100 or 200 (S 110 ), extracting color data and depth data respectively from visible light and infrared light (S 120 ), generating 3D rendering data based on the color data and the depth data (S 140 ), and displaying an image corresponding to the 3D rendering data (S 150 ).
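The steps S 110 to S 150 above can be sketched as a simple pipeline. The step functions below are placeholders standing in for the camera, infrared sensor, signal processor, and display unit; their names and data representations are assumptions for illustration only.

```python
# Hypothetical end-to-end sketch of the image display method S110-S150.
def receive_light(display):                      # S110: receive VL and IRL
    return display["visible"], display["infrared"]

def extract_data(visible, infrared):             # S120: color and depth data
    return visible, infrared

def match(color, depth):                         # S130: match pixel-wise
    return list(zip(color, depth))

def render_3d(matched):                          # S140: 3D rendering data
    return [("pixel", c, d) for c, d in matched]

def display_image(rendered):                     # S150: display the image
    return rendered

frames = {"visible": [10, 20], "infrared": [0.1, 0.9]}
color, depth = extract_data(*receive_light(frames))
image = display_image(render_3d(match(color, depth)))
```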
- the display apparatus 100 or 200 may be the display apparatus 100 of FIGS. 1 to 3 or the display apparatus 200 of FIG. 4
- the head-mounted display apparatus 300 may be the head-mounted display apparatus 300 of FIG. 5
- exemplary embodiments are not limited thereto.
- the display apparatus and the head-mounted display apparatus may be modified in various ways.
- Each of the display apparatuses 100 and 200 includes a plurality of pixels.
- Each of the plurality of pixels may include the visible light sub-pixels SP 1 , SP 2 , SP 3 , and SP that emit visible light and the infrared sub-pixel IR that emits infrared light.
- the display apparatus 100 or 200 may include the controller 31 of FIG. 1 that controls intensity of infrared light emitted by the infrared sub-pixel IR in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels.
- the image display method may further include matching the color data and the depth data based on respective locations of the plurality of pixels in the display apparatus 100 or 200 that emits visible light and infrared light (S 130 ).
- the head-mounted display apparatus 300 of FIG. 5 may include the camera 310 , the infrared sensor 320 , the signal processor 330 and the display unit 340 .
- the camera 310 and the infrared sensor 320 may perform the receiving of visible light and infrared light emitted from the display apparatus 100 or 200 (S 110 ) and the extracting of the color data and the depth data respectively from visible light and infrared light (S 120 ).
- the signal processor 330 may perform the generating of the 3D rendering data based on the color data and the depth data (S 140 ).
- the display unit 340 may perform the displaying of the image corresponding to the 3D rendering data (S 150 ).
- the matching of the color data and the depth data may be performed by the data matching unit 331 in the signal processor 330 of the head-mounted display apparatus 300 .
- the image displayed by the display apparatus 100 or 200 may include visible light VL that is visible to the user and infrared light IRL that is invisible to the user.
- the image may be emitted not only in a normal direction to a main plane on which the image of the display apparatus 100 or 200 is displayed, but also within an angle range with respect to the normal direction.
- the angle range may be equal to or greater than about ⁇ 60°.
- some users U 1 to U 4 may be located within an angle range equal to or greater than about 120° with respect to the display apparatus 100 .
- the users U 1 to U 4 may simultaneously watch the image displayed by the display apparatus 100 or 200 .
- Light emitted from the display apparatus 100 or 200 may be simultaneously observed by the users U 1 , U 2 , U 3 and U 4 that are located within a predetermined angle range in front of the display apparatus 100 or 200 .
- the user U 3 that is not wearing the head-mounted display apparatus 300 may not see the infrared light IRL but only see the visible light VL. That is, the user U 3 may only see the 2D image on the display apparatus 100 or 200 .
- the users U 1 , U 2 , and U 4 that are wearing the head-mounted display apparatus 300 may see a 3D image, e.g., a 2D image with a 3D effect, according to the above-described image display method.
- the users U 1 , U 2 , and U 4 may be located in front of the display apparatus 100 or 200 , and simultaneously see the 3D image displayed by the display apparatus 100 or 200 .
- the user U 3 that is not wearing the head-mounted display apparatus 300 sees only the 2D image. Therefore, 2D and 3D images may be simultaneously displayed without changing a mode of the display apparatus 100 or 200 .
- the display apparatuses 100 and 200 and the head-mounted display apparatus 300 may simultaneously display the 2D and 3D images, and may provide a continuous wide viewing angle for the 3D image.
- the head-mounted display apparatus 300 and the image display method may easily generate augmented reality by connecting with the display apparatus 100 or 200 .
Abstract
Description
- This application claims priority to Korean Patent Application No. 10-2015-0123014, filed on Aug. 31, 2015, and all the benefits accruing therefrom under 35 U.S.C. §119, the content of which in its entirety is herein incorporated by reference.
- 1. Field
- One or more exemplary embodiments relate to a display apparatus, a head-mounted display apparatus, an image display method, and an image display system, and more particularly, to a display apparatus, a head-mounted display apparatus, an image display method, and an image display system for displaying a three-dimensional (“3D”) image by emitting or using infrared rays having depth information.
- 2. Description of the Related Art
- A head-mounted display apparatus typically refers to a display apparatus configured to be mounted on a user's head in the form of glasses or a helmet. In such a head-mounted display apparatus, images are displayed in front of the eyes of the user so that the user may recognize the images. The head-mounted display apparatus may display the images using self-generated light and/or light incident from an external source.
- A light-emitting diode (“LED”) is a semiconductor device, in particular, a p-n junction diode that converts energy, which is generated by a recombination of holes and electrons, into light energy. When a voltage is applied to the p-n junction diode in a forward direction, holes and electrons are injected, and a recombination of the holes and the electrons generates energy.
- Inorganic LEDs emit light using inorganic compounds. The inorganic LEDs may include red, yellow, blue, white, ultraviolet and infrared LEDs. The inorganic LEDs are widely used in backlight of a liquid crystal display (“LCD”) device, lighting devices, or electronic displays, for example. Also, organic LEDs emit light using organic compounds, and are widely used in small to large electronic devices, e.g., mobile phones and large screen display devices.
- Due to the increasing demand of three-dimensional (“3D”) display apparatuses, various 3D image display methods are being studied. For example, a display apparatus, such as a television (“TV”), may project different images on left and right eyes of a user by using a polarizer, a lens array, or a shutter to display a 3D image. However, the method above has limited viewpoints and cannot simultaneously display two-dimensional (“2D”) images and 3D images.
- One or more exemplary embodiments include an image display method, a display apparatus, and a head-mounted display apparatus for providing a continuous wide viewing angle for a 3D display apparatus and generating augmented reality by connecting a 3D display apparatus with the head-mounted display apparatus.
- According to one or more exemplary embodiments, a display apparatus includes a first pixel, and a second pixel. In such an embodiment, each of the first and second pixels includes a first sub-pixel which emits light having a first color, a second sub-pixel which emits light having a second color different from the first color, a third sub-pixel which emits light having a third color different from the first and second colors, and an infrared sub-pixel which emits infrared light. In such an embodiment, the infrared light emitted from the infrared sub-pixel in the first pixel and the infrared light emitted from the infrared sub-pixel in the second pixel have different intensities from each other.
- In an exemplary embodiment, the first color, the second color and the third color may be red, green and blue, respectively.
- In an exemplary embodiment, the infrared light emitted from the infrared sub-pixel in the first pixel and the infrared light emitted from the infrared sub-pixel in the second pixel may have substantially the same frequency as each other.
- In an exemplary embodiment, the display apparatus may further include a plurality of pixels including the first and second pixels, and a controller which controls intensities of infrared light emitted by an infrared sub-pixel in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels.
- In an exemplary embodiment, the infrared sub-pixel may include an infrared driver circuit, and an infrared inorganic light-emitting diode (“LED”) electrically connected to and driven by the infrared driver circuit.
- In an exemplary embodiment, the display apparatus may further include a first electrode electrically connected to the infrared driver circuit and contacting an end of the infrared inorganic LED, and a second electrode facing the first electrode and contacting another end of the infrared inorganic LED. In such an embodiment, the second electrode may be commonly disposed in the first sub-pixel, the second sub-pixel, the third sub-pixel and the infrared sub-pixel.
- In an exemplary embodiment, the infrared sub-pixel may further include a light spreading layer which spreads infrared light emitted by the infrared inorganic LED.
- In an exemplary embodiment, each of the first to third sub-pixels may include an organic light emitting diode (“OLED”).
- In an exemplary embodiment, each of the first to third sub-pixels may include an inorganic LED.
- According to one or more exemplary embodiments, a head-mounted display apparatus includes a camera which receives visible light emitted by an object and converts the visible light into an electric signal, an infrared sensor which receives infrared light emitted by the object, a signal processor which generates 3D rendering data based on color data obtained by the camera and depth data obtained by the infrared sensor, and a display unit which receives the 3D rendering data from the signal processor and displays an image corresponding to the 3D rendering data.
- In an exemplary embodiment, the signal processor may include a data matching unit which matches the color data and the depth data based on a location of the object that emits the visible light and the infrared light.
- In an exemplary embodiment, the head-mounted display apparatus may further include an optical device located on a path of light emitted by the display unit and which focuses the light on a predetermined area.
- In an exemplary embodiment, the head-mounted display apparatus may further include a frame which accommodates the camera, the infrared sensor, the signal processor, and the display unit. In such an embodiment, the frame may be shaped to be mounted on a head of a user.
- In an exemplary embodiment, the head-mounted display apparatus may further include a lens unit accommodated in the frame and located between the object and the user. In such an embodiment, the lens unit may include a transmittance adjusting unit which adjusts a transmittance of light incident from the object.
- In an exemplary embodiment, the transmittance adjusting unit may include a liquid crystal.
- According to one or more exemplary embodiments, an image display method using a head-mounted display apparatus includes receiving visible light and infrared light from a display apparatus, extracting color data and depth data from the visible light and the infrared light, generating 3D rendering data based on the color data and the depth data, and displaying an image corresponding to the 3D rendering data on the head-mounted display apparatus.
- In an exemplary embodiment, the display apparatus may include a plurality of pixels, and each of the plurality of pixels may include a visible light sub-pixel which emits the visible light and an infrared sub-pixel which emits the infrared light.
- In an exemplary embodiment, the display apparatus may further include a controller which controls intensities of infrared light emitted by the infrared sub-pixel in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels.
- In an exemplary embodiment, the infrared sub-pixel may include an infrared driver circuit, and an infrared inorganic light-emitting diode (“LED”) electrically connected to and driven by the infrared driver circuit.
- In an exemplary embodiment, the method may further include, before the generating the 3D rendering data, matching the color data and the depth data based on respective locations of the plurality of pixels in the display apparatus which emits the visible light and the infrared light.
- According to one or more exemplary embodiments, an image display system includes a display apparatus including a plurality of pixels emitting visible light and infrared light and a head-mounted display apparatus configured to receive the visible light and the infrared light from the display apparatus and display an image. In such an embodiment, the head-mounted display apparatus includes a camera which receives the visible light emitted by the display apparatus and converts the visible light into an electric signal, an infrared sensor which receives the infrared light emitted by the display apparatus, a signal processor which generates three-dimensional rendering data based on color data obtained by the camera and depth data obtained by the infrared sensor, and a display unit which receives the three-dimensional rendering data from the signal processor and displays an image corresponding to the three-dimensional rendering data.
- These and/or other features will become apparent and more readily appreciated from the following description of the exemplary embodiments, taken in conjunction with the accompanying drawings, in which:
-
FIG. 1 is a schematic block diagram of a display apparatus, according to an exemplary embodiment; -
FIG. 2 is a plan view of two adjacent pixels in the display apparatus of FIG. 1; -
FIG. 3 is a cross-sectional view taken along line III-III of the pixel of FIG. 2; -
FIG. 4 is a schematic cross-sectional view of a display apparatus according to another exemplary embodiment; -
FIG. 5 is a schematic perspective view of a head-mounted display apparatus, according to an exemplary embodiment; -
FIG. 6 is a schematic conceptual view of some components in the head-mounted display apparatus of FIG. 5; -
FIG. 7 is a schematic cross-sectional view of an exemplary embodiment of a lens unit in the head-mounted display apparatus of FIG. 5; -
FIG. 8 is a flowchart of an image display method, according to an exemplary embodiment; and -
FIG. 9 is a conceptual view of a system for providing the image display method of FIG. 8, according to an exemplary embodiment. - The invention now will be described more fully hereinafter with reference to the accompanying drawings, in which various embodiments are shown. This invention may, however, be embodied in many different forms, and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Like reference numerals refer to like elements throughout.
- It will be understood that when an element is referred to as being “on” another element, it can be directly on the other element or intervening elements may be present therebetween. In contrast, when an element is referred to as being “directly on” another element, there are no intervening elements present.
- It will be understood that, although the terms “first,” “second,” “third” etc. may be used herein to describe various elements, components, regions, layers and/or sections, these elements, components, regions, layers and/or sections should not be limited by these terms. These terms are only used to distinguish one element, component, region, layer or section from another element, component, region, layer or section. Thus, “a first element,” “component,” “region,” “layer” or “section” discussed below could be termed a second element, component, region, layer or section without departing from the teachings herein.
- The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. As used herein, the singular forms "a," "an," and "the" are intended to include the plural forms, including "at least one," unless the content clearly indicates otherwise. "Or" means "and/or." As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items. It will be further understood that the terms "comprises" and/or "comprising," or "includes" and/or "including" when used in this specification, specify the presence of stated features, regions, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, regions, integers, steps, operations, elements, components, and/or groups thereof.
- Furthermore, relative terms, such as "lower" or "bottom" and "upper" or "top," may be used herein to describe one element's relationship to another element as illustrated in the Figures. It will be understood that relative terms are intended to encompass different orientations of the device in addition to the orientation depicted in the Figures. For example, if the device in one of the figures is turned over, elements described as being on the "lower" side of other elements would then be oriented on "upper" sides of the other elements. The exemplary term "lower" can, therefore, encompass both an orientation of "lower" and "upper," depending on the particular orientation of the figure. Similarly, if the device in one of the figures is turned over, elements described as "below" or "beneath" other elements would then be oriented "above" the other elements. The exemplary terms "below" or "beneath" can, therefore, encompass both an orientation of above and below.
- “About” or “approximately” as used herein is inclusive of the stated value and means within an acceptable range of deviation for the particular value as determined by one of ordinary skill in the art, considering the measurement in question and the error associated with measurement of the particular quantity (i.e., the limitations of the measurement system). For example, “about” can mean within one or more standard deviations, or within ±30%, 20%, 10%, 5% of the stated value.
- Unless otherwise defined, all terms (including technical and scientific terms) used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this disclosure belongs. It will be further understood that terms, such as those defined in commonly used dictionaries, should be interpreted as having a meaning that is consistent with their meaning in the context of the relevant art and the present disclosure, and will not be interpreted in an idealized or overly formal sense unless expressly so defined herein.
- Exemplary embodiments are described herein with reference to cross section illustrations that are schematic illustrations of idealized embodiments. As such, variations from the shapes of the illustrations as a result, for example, of manufacturing techniques and/or tolerances, are to be expected. Thus, embodiments described herein should not be construed as limited to the particular shapes of regions as illustrated herein but are to include deviations in shapes that result, for example, from manufacturing. For example, a region illustrated or described as flat may, typically, have rough and/or nonlinear features. Moreover, sharp angles that are illustrated may be rounded. Thus, the regions illustrated in the figures are schematic in nature and their shapes are not intended to illustrate the precise shape of a region and are not intended to limit the scope of the present claims.
- Hereinafter, exemplary embodiments of the invention will be described with reference to the drawings.
-
FIG. 1 is a schematic block diagram of a display apparatus 100, according to an exemplary embodiment. FIG. 2 is a plan view of two adjacent pixels in the display apparatus 100 of FIG. 1. FIG. 3 is a cross-sectional view taken along line III-III of the pixel of FIG. 2. - Referring to
FIGS. 1 and 2, an exemplary embodiment of the display apparatus 100 may include a display panel 10, a driver 20, and a processor 30. - The
display panel 10 may include a plurality of pixels, including a first pixel P1 and a second pixel P2. The driver 20 may include a scan driver and a data driver that respectively apply scan signals and data signals to scan lines and data lines, which are connected to the plurality of pixels. The driver 20 is connected with the processor 30, and may receive information from the processor 30, for example, information on the timing and amplitude of the scan signals and the data signals applied to the plurality of pixels. -
FIG. 2 schematically illustrates a structure of the first and second pixels P1 and P2 from among the plurality of pixels. In an exemplary embodiment, as shown in FIG. 2, the first and second pixels P1 and P2 are adjacent to each other, but exemplary embodiments are not limited thereto. In an exemplary embodiment, other pixels may be provided between the first and second pixels P1 and P2. - Each of the first and second pixels P1 and P2 may include a first sub-pixel SP1 that emits light having a first color, a second sub-pixel SP2 that emits light having a second color that is different from the first color, a third sub-pixel SP3 that emits light having a third color that is different from the first and second colors, and an infrared sub-pixel IR that emits infrared light. The infrared sub-pixel IR in the first pixel P1 and the infrared sub-pixel IR in the second pixel P2 may emit infrared light of different intensities.
- The infrared light is provided so that depth information is included in the light with a predetermined color emitted from the first and second pixels P1 and P2; a difference between the intensities of infrared light emitted from the first and second pixels P1 and P2 may correspond to a depth difference between the first and second pixels P1 and P2.
- Herein, the term ‘depth’ refers to a distance from an arbitrary point. Depth information of an image may indicate 3D information of the image. That is, the
display apparatus 100 includes the plurality of pixels including the first and second pixels P1 and P2, and may display images by using the plurality of pixels. Each of the plurality of pixels may be turned on or off. When the pixels are turned on, light having a predetermined color may be emitted by using a combination of first to third sub-pixels SP1 to SP3 that emit visible light in different colors. A display apparatus including only pixels that include the first to third sub-pixels SP1 to SP3 may only display a two-dimensional (“2D”) image. - In an exemplary embodiment, the
display apparatus 100 includes the infrared sub-pixel IR that may include depth information in each of the plurality of pixels, and depth information of an image displayed by the display apparatus 100 may be obtained from a combination of the infrared light emitted from each of the plurality of pixels. Thus, 3D information of an image displayed by the display apparatus 100 may be obtained from a combination of the depth information and the 2D image.
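The encoding described above — each pixel's infrared sub-pixel carrying a depth value alongside the visible 2D image — can be sketched as follows. This is a minimal illustration, assuming a linear mapping onto 8-bit drive levels; the function name, the value ranges, and the list-of-lists frame layout are assumptions for illustration, not taken from the disclosure.

```python
# Hypothetical sketch: encode a per-pixel depth map as infrared sub-pixel
# drive levels, so the IR frame carries depth information alongside the
# visible 2D image. The linear 8-bit mapping is an illustrative assumption.

def encode_depth_as_ir(depth_map, depth_min, depth_max, levels=255):
    """Map each pixel's depth to an IR drive level in [0, levels]."""
    span = depth_max - depth_min
    ir_frame = []
    for row in depth_map:
        ir_row = []
        for depth in row:
            d = min(max(depth, depth_min), depth_max)  # clamp to valid range
            ir_row.append(round((d - depth_min) / span * levels))
        ir_frame.append(ir_row)
    return ir_frame

# A 2x2 depth map (arbitrary units): a larger depth maps to a stronger
# IR drive level, so intensity differences track depth differences.
depth = [[0.5, 1.0],
         [1.5, 2.0]]
ir = encode_depth_as_ir(depth, depth_min=0.5, depth_max=2.0)
```

The direction of the mapping (nearer = dimmer) is arbitrary here; the disclosure only requires that intensity differences correspond to depth differences.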
- According to an exemplary embodiment, the infrared sub-pixels IR of the first and second pixels P1 and P2 may emit infrared light having substantially the same frequency as each other, but exemplary embodiments are not limited thereto. In an alternative exemplary embodiment, the infrared sub-pixels IR of the first and second pixels P1 and P2 may emit infrared light having different frequencies from each other, respectively.
- The
processor 30 may include a controller 31 that controls intensity of infrared light emitted by the infrared sub-pixel IR in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels in the display apparatus 100. The controller 31 may control the intensity of infrared light by controlling the amplitude of data signals applied to the infrared sub-pixel IR or by controlling the time during which the scan signals are applied to the infrared sub-pixel IR. Although not illustrated, the processor 30 may further include a calculator and a register. - The
processor 30 may process signals by using the controller 31 so that the display apparatus 100 may display a 2D image including 3D information. A user in front of the display apparatus 100 may not see infrared light without an additional device, and may only see the 2D image displayed by the display apparatus 100. When an additional device for receiving and processing infrared light emitted by the display apparatus 100 is used, the user may see a 3D image displayed by the display apparatus 100, which will be described later in detail. - Although
FIG. 2 illustrates an exemplary embodiment in which sub-pixels in the first and second pixels P1 and P2 are arranged in the form of a 2×2 matrix, exemplary embodiments are not limited thereto, and the sub-pixels may be arranged in various ways. -
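The controller 31 described above has two knobs for setting infrared intensity: the amplitude of the data signal applied to the infrared sub-pixel, and the time for which the sub-pixel is driven within a frame. A hedged sketch of both, assuming simple linear models and an arbitrary 60 Hz frame time (all names and numbers are illustrative, not from the disclosure):

```python
# Illustrative sketch of the two intensity-control mechanisms attributed to
# the controller 31: data-signal amplitude, or drive (emission) time within
# a frame. The linear models and constants below are assumptions.

FRAME_TIME_US = 16667  # assumed ~60 Hz frame period, in microseconds

def drive_by_amplitude(target_intensity, max_intensity, max_amplitude_v):
    """Scale the data-signal amplitude linearly with the target IR intensity."""
    return max_amplitude_v * target_intensity / max_intensity

def drive_by_emission_time(target_intensity, max_intensity):
    """Keep amplitude fixed and vary how long the sub-pixel emits per frame."""
    return FRAME_TIME_US * target_intensity / max_intensity

# Two pixels whose depth difference should appear as an IR intensity
# difference (direction of the mapping is an assumption):
near_px = drive_by_amplitude(64, 255, 5.0)   # lower target intensity
far_px = drive_by_amplitude(255, 255, 5.0)   # full target intensity
```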
FIG. 3 is a cross-sectional view of the third sub-pixel SP3 and the infrared sub-pixel IR of the first pixel P1 of FIG. 2, according to an exemplary embodiment. Hereinafter, some elements in the third sub-pixel SP3 and the infrared sub-pixel IR will be described in detail with reference to FIG. 3. - In the first pixel P1, a
buffer layer 111 is on a substrate 110. A driver circuit including a transistor TIR and a capacitor (not shown), and an infrared inorganic light-emitting diode ("LED") LEDIR that is connected to and driven by the driver circuit are disposed in an area of the buffer layer 111 corresponding to the infrared sub-pixel IR. - The
substrate 110 may include glass or plastic. The buffer layer 111 may effectively prevent impurities from penetrating to the driver circuit from the substrate 110, and may planarize a surface of the substrate 110. The buffer layer 111 may have a single layer structure or a multi-layer structure including a layer of an inorganic material such as silicon nitride (SiNx) and/or silicon oxide (SiOx). - The transistor TIR may include an
active layer 112, a gate electrode 114, a source electrode 116S and a drain electrode 116D. The active layer 112 may have a source area and a drain area that are conductive, and a channel area between the source and drain areas. The gate electrode 114 may be disposed on the active layer 112 but insulated from the active layer 112. The source electrode 116S and the drain electrode 116D may be electrically connected with the source area and the drain area of the active layer 112, respectively. At least one of the source electrode 116S and the drain electrode 116D may be omitted. - A first insulating
layer 113 may be disposed between the active layer 112 and the gate electrode 114. A second insulating layer 115 may be disposed on the first insulating layer 113 and cover the gate electrode 114. The first insulating layer 113 and the second insulating layer 115 may have a single layer structure or a multilayer structure including a layer of an inorganic material such as silicon nitride (SiNx) and/or silicon oxide (SiOx). - A third insulating
layer 117 may be disposed on the second insulating layer 115 and cover the source electrode 116S and the drain electrode 116D. The third insulating layer 117 may include an organic material and/or an inorganic material. - Although
FIG. 3 illustrates that the gate electrode 114 of the transistor TIR is disposed above the active layer 112, exemplary embodiments are not limited thereto. The gate electrode 114 may be disposed under the active layer 112. - A
bank 170 may be disposed on the third insulating layer 117 and define a sub-pixel area. The bank 170 may include a concave area 170a that accommodates the infrared inorganic LED LEDIR. A height of the bank 170 may be determined based on a height of the infrared inorganic LED LEDIR and a viewing angle. A size (e.g., a width) of the concave area 170a may be determined based on resolution of the display apparatus 100. Although FIG. 2 illustrates that the concave area 170a is square-shaped, exemplary embodiments are not limited thereto. Alternatively, the concave area 170a may have various shapes, for example, a polygonal, rectangular, circular, oval, or triangular shape. - The
first electrode 120a may be disposed on a side surface and a bottom surface of the concave area 170a and at least a portion of an upper surface of the bank 170. The first electrode 120a may be electrically connected to the source electrode 116S or the drain electrode 116D of the transistor TIR via a hole H formed in the third insulating layer 117. - According to an exemplary embodiment, the
bank 170 may function as a light blocking unit and include a material with low light transmittance. The bank 170 may effectively prevent light from being emitted through a side surface of the infrared inorganic LED LEDIR, and thus effectively prevent interference with light emitted from adjacent sub-pixels. In such an embodiment, the bank 170 may increase a bright room contrast ratio ("BRCR") of the display apparatus 100 by absorbing and blocking light incident from an external source outside the display apparatus 100. However, exemplary embodiments are not limited thereto. The bank 170 may include a semi-transparent material, an optical reflective material, or a light spreading material. - The infrared inorganic LED LEDIR may be disposed in the
concave area 170a of the bank 170. According to an exemplary embodiment, the infrared inorganic LED LEDIR may be, but is not limited to, a micro LED having a size of about 1 micrometer (μm) to about 100 μm. A single piece or a plurality of the infrared inorganic LED LEDIR may be picked up by a transfer device from a wafer, transferred to the substrate 110, and then, accommodated in the concave area 170a. The infrared inorganic LED LEDIR may emit infrared light with a wavelength of about 700 nanometers (nm) to about 1 mm. Infrared light may not be visible to the user's eyes. - The infrared inorganic LED LEDIR may include a
p-n junction diode 140a, a first contact electrode 130a and a second contact electrode 150a. The first contact electrode 130a and/or the second contact electrode 150a may have a single-layer structure or a multi-layer structure including at least one of metal, conductive oxide, and conductive polymer. The first contact electrode 130a and the second contact electrode 150a may selectively include a reflective layer, for example, a layer of silver. The first contact electrode 130a may be electrically connected to the first electrode 120a. The second contact electrode 150a may be electrically connected to a second electrode 160. The p-n junction diode 140a may include a p-doping layer 141a, an n-doping layer 142a, and an intermediate layer 143a between the p-doping layer 141a and the n-doping layer 142a. The intermediate layer 143a is an area that emits light as excitons generated by a recombination of electrons and holes transition from a higher energy level to a lower energy level. The intermediate layer 143a includes a semiconductor material and may have a single quantum well or a multi quantum well structure. - The
first electrode 120a may include a reflective electrode, and the second electrode 160 may include a transparent or semi-transparent electrode. The second electrode 160 may be commonly disposed in the plurality of pixels in the display apparatus 100 as a common electrode. - A
passivation layer 180 may surround at least a portion of the infrared inorganic LED LEDIR in the concave area 170a, and may cover the bank 170. The passivation layer 180 may have a predetermined height such that an upper portion of the infrared inorganic LED LEDIR, for example, the second contact electrode 150a, is not covered. Therefore, the second contact electrode 150a may not be covered by, but exposed through, the passivation layer 180. The exposed second contact electrode 150a may be electrically connected to the second electrode 160. - Although not illustrated, an exemplary embodiment of the
display apparatus 100 may further include a light spreading layer (not shown) that spreads infrared light. The light spreading layer may be disposed on a path of infrared light emitted by the infrared inorganic LED LEDIR. The light spreading layer may be arranged in various locations and shapes. The light spreading layer may allow the infrared inorganic LED LEDIR to uniformly emit infrared light from a front surface of the infrared sub-pixel IR to the outside, and increase an angle range, e.g., a viewing angle, of infrared light emitted by the display apparatus 100. - In an area on the
buffer layer 111 corresponding to the third sub-pixel SP3, a driver circuit including a transistor TSP3 and a capacitor, and an inorganic LED LEDSP3 that is electrically connected to the driver circuit and driven by the driver circuit are disposed. - The third sub-pixel SP3 may have substantially the same structure as the infrared sub-pixel IR described above, except for a difference in wavelengths of light emitted by the inorganic LED. The third sub-pixel SP3 may include a
first electrode 120b electrically connected to the transistor TSP3, and the inorganic LED LEDSP3 on the first electrode 120b. The inorganic LED LEDSP3 may include a first contact electrode 130b that is electrically connected to the first electrode 120b, a p-n junction diode 140b including a p-doping layer 141b, an n-doping layer 142b and an intermediate layer 143b on the first contact electrode 130b, and a second contact electrode 150b on the p-n junction diode 140b and electrically connected to the second electrode 160. - In such an embodiment, the first and second sub-pixels SP1 and SP2 of
FIG. 2 may have the same structure as the third sub-pixel SP3, except for a difference in colors of light emitted by the inorganic LED. - According to an exemplary embodiment, the first to third sub-pixels SP1 to SP3 and the infrared sub-pixel IR may be disposed or transferred by an identical LED transfer device. The size of the
display apparatus 100 may be easily reduced by including a small inorganic LED. - In such an embodiment, the infrared sub-pixel IR is included in each of the plurality of pixels of the
display apparatus 100, such that both a 2D image and depth information corresponding to each of the plurality of pixels may be displayed. -
FIG. 4 is a schematic cross-sectional view of a display apparatus 200 according to another exemplary embodiment. - Referring to
FIG. 4, an exemplary embodiment of the display apparatus 200 may include a plurality of pixels, including a visible light sub-pixel SP that emits visible light and an infrared sub-pixel IR that emits infrared light. - In such an embodiment, the
display apparatus 200 includes a substrate 210, and a buffer layer 211 on the substrate 210. A driver circuit including at least one transistor TIR and at least one capacitor (not shown), and an infrared inorganic LED LEDIR that is connected to and driven by the driver circuit are disposed in an area of the buffer layer 211 corresponding to the infrared sub-pixel IR. - The transistor TIR may include an
active layer 212, a gate electrode 214, a source electrode 216S, and a drain electrode 216D. A first insulating layer 213 may be disposed between the active layer 212 and the gate electrode 214. A second insulating layer 215 may be disposed on the first insulating layer 213 and cover the gate electrode 214. - A third insulating
layer 217 may be disposed on the second insulating layer 215 and cover the source electrode 216S and the drain electrode 216D. A bank 270 may be disposed on the third insulating layer 217 and define a sub-pixel area. The bank 270 may include a concave area 270a that accommodates the infrared inorganic LED LEDIR. - A
first electrode 220a is disposed on the third insulating layer 217. The first electrode 220a may be electrically connected with the transistor TIR via a hole H formed in the third insulating layer 217. Both ends of the first electrode 220a may be covered by the bank 270. - The infrared inorganic LED LEDIR may be disposed in the
concave area 270a of the bank 270. The infrared inorganic LED LEDIR may be a micro LED with a size (e.g., a length or width) of about 1 μm to about 100 μm that emits infrared light with a wavelength in a range of about 700 nm to about 1 mm. - The infrared inorganic LED LEDIR may include a
p-n junction diode 240a, a first contact electrode 230a and a second contact electrode 250a. The p-n junction diode 240a may include a p-doping layer 241a, an n-doping layer 242a, and an intermediate layer 243a between the p-doping layer 241a and the n-doping layer 242a. - The
first electrode 220a may include a reflective electrode, and a second electrode 260 may include a transparent or semi-transparent electrode. The second electrode 260 may be commonly disposed in the plurality of pixels in the display apparatus 200 as a common electrode. - A
passivation layer 280 may surround at least a portion of the infrared inorganic LED LEDIR in the concave area 270a, and may cover the bank 270 and the infrared inorganic LED LEDIR. The passivation layer 280 may have a predetermined height such that the second contact electrode 250a of the infrared inorganic LED LEDIR is not covered. Therefore, the second contact electrode 250a may not be covered by but exposed through the passivation layer 280. The exposed second contact electrode 250a may be electrically connected to the second electrode 260. - In an area on the
buffer layer 211 corresponding to the visible light sub-pixel SP, a driver circuit including a transistor TSP and a capacitor, and an organic LED OLEDSP that is electrically connected to the driver circuit and driven by the driver circuit are disposed. - The visible light sub-pixel SP may include the organic LED OLEDSP that includes a
first electrode 220b electrically connected to the transistor TSP, a second electrode 260 facing the first electrode 220b, and an organic emission layer 240b between the first electrode 220b and the second electrode 260. - According to an exemplary embodiment, the
display apparatus 200 may include the visible light sub-pixel SP that includes the organic LED OLEDSP, which is appropriate for a large screen display apparatus and has a fast response speed, and the infrared sub-pixel IR that includes the inorganic LED LEDIR, which generates infrared light. As in an exemplary embodiment of the display apparatus 100 described above with reference to FIG. 3, the display apparatus 200 may display not only a 2D image but also depth information corresponding to each of the plurality of pixels by including the infrared sub-pixel IR in each of the plurality of pixels. -
FIG. 5 is a schematic perspective view of a head-mounted display apparatus 300, according to an exemplary embodiment. FIG. 6 is a schematic conceptual view of some components in the head-mounted display apparatus 300 of FIG. 5. - Referring to
FIGS. 5 and 6, an exemplary embodiment of the head-mounted display apparatus 300 may include a camera 310 that receives visible light emitted by an object and converts the visible light into an electric signal, an infrared sensor 320 that receives infrared light emitted by the object, a signal processor 330 that generates 3D rendering data based on color data obtained by the camera 310 and depth data obtained by the infrared sensor 320, and a display unit 340 that receives 3D rendering data from the signal processor 330 and displays an image corresponding to the 3D rendering data. - The
camera 310 may include an image sensor (not shown) such as a charge coupled device (“CCD”) or a complementary metal-oxide semiconductor (“CMOS”), and an optical system (not shown) that focuses light incident from the object. An infrared ray block filter and/or an ultraviolet ray block filter may be disposed in front of the image sensor. - The
infrared sensor 320 may also include an image sensor (not shown) such as a CCD or a CMOS. A band-pass filter that passes infrared rays of a certain frequency range and/or a block filter that blocks light having a wavelength range lower than that of visible light rays may be disposed in front of the image sensor. - The
camera 310 may obtain the color data of the object, e.g., color data of a 2D image of the object. The depth data of the object may be obtained by the infrared sensor 320 that receives infrared light including the depth information. The object may be the display apparatus 100 of FIG. 2 in which each of the plurality of pixels includes the infrared sub-pixel IR. - The
signal processor 330 may generate 3D rendering data based on the color data obtained by the camera 310 and the depth data obtained by the infrared sensor 320. Herein, '3D rendering' refers to a process of generating a 3D image by using a 2D image based on shadows, colors and density thereof, or a process of adding a 3D effect to a 2D image by changing shadows or density. - According to an exemplary embodiment, the
signal processor 330 of the head-mounted display apparatus 300 may generate the 3D rendering data by combining the 2D image obtained by the camera 310 with the depth information obtained by the infrared sensor 320 and thus changing shadows or density of the 2D image. The signal processor 330 may include a data matching unit 331 that matches the color data and the depth data based on a location of the object that emits visible light and infrared light. According to an exemplary embodiment, the object may be the display apparatus 100 of FIG. 2. The data matching unit 331 may match a value of a pixel from the color data to a value of a pixel in the depth data corresponding to the pixel in the color data. - The
display unit 340 may be a small display device that may be mounted on the head-mounted display apparatus 300, for example, an organic light-emitting display or a liquid crystal display ("LCD") device. - The 3D rendering data generated by the
signal processor 330 may be input to the display unit 340. The display unit 340 may display an image that corresponds to the 3D rendering data. The image may be a 3D image, in particular, a 2D image with a 3D effect. - In such an embodiment, as shown in
FIG. 6, optical devices R1 and R2, which change the path of light, and an optical device 350, which converges light to a predetermined area, may be disposed on a path of light emitted by the display unit 340. The predetermined area may be a crystalline lens 41 of an eye 40 of the user. Light converged to the crystalline lens 41 may pass through the lens 41 and reach a retina 42 of the eye 40 of the user. - The shortest focal length of the eye of a person may be about 20 centimeters (cm) or more. According to an exemplary embodiment, even when a distance between the
eye 40 of the user and the display unit 340 is smaller than the shortest focal length, the shortest focal length may be provided by the optical device 350 between the display unit 340 and the eye 40. In such an embodiment, the shortest focal length is effectively provided, such that the user may clearly and easily recognize the image displayed by the display unit 340. -
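The focal-length point above can be made concrete with the thin-lens equation: when the display sits inside the focal length of a converging lens such as the optical device 350, the lens forms a virtual image farther away than the eye's roughly 20 cm near point. The specific 3 cm and 3.5 cm values below are assumed for illustration, not taken from the disclosure.

```python
# Thin-lens sketch (assumed values): a display only a few centimeters from
# the eye is imaged, by a converging lens, to a virtual image the eye can
# actually focus on.

def image_distance_cm(object_distance_cm, focal_length_cm):
    """Thin-lens equation 1/f = 1/do + 1/di; a negative result means a
    virtual image on the same side as the object."""
    return 1.0 / (1.0 / focal_length_cm - 1.0 / object_distance_cm)

# Display 3 cm from a lens of 3.5 cm focal length: the object is inside the
# focal length, so the image is virtual and farther from the lens.
di = image_distance_cm(object_distance_cm=3.0, focal_length_cm=3.5)
virtual_image_cm = -di  # distance of the virtual image from the lens
```

With these assumed numbers the virtual image sits 21 cm away, just beyond the ~20 cm near point mentioned in the text.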
FIGS. 5 and 6 illustrate an exemplary embodiment, where the display unit 340 is located beside the eye 40, rather than in front of the eye 40, and the optical devices R1 and R2 change the path of light emitted by the display unit 340 toward a direction of the eye 40. However, exemplary embodiments are not limited thereto. In an alternative exemplary embodiment, the display unit 340 may be located in front of the eye 40, and the optical devices R1 and R2 may be omitted. According to another exemplary embodiment, the display unit 340 may be a transparent display by which the user may not only see the image displayed by the display unit 340, but also see the external background. - According to an exemplary embodiment, the head-mounted
display apparatus 300 may include a frame 360 that accommodates the camera 310, the infrared sensor 320, the signal processor 330 and the display unit 340. The frame 360 may be shaped such that the frame 360 may be disposed or mounted on the head of the user. In such an embodiment, the frame 360 may include a lens unit 370 that is disposed between the object and the user. - In an exemplary embodiment, the
lens unit 370 may include a transparent or semi-transparent lens to generate augmented reality. In such an embodiment, the user may not only see the image displayed by the display unit 340 in the head-mounted display apparatus 300, but also see the background image passing through the lens unit 370. - However, exemplary embodiments are not limited thereto. In an alternative exemplary embodiment, the
lens unit 370 may be configured as an opaque lens to generate virtual reality. In such an embodiment, the user wearing the head-mounted display apparatus 300 may only see the image displayed by the display unit 340. -
FIG. 7 is a schematic cross-sectional view of an exemplary embodiment of the lens unit 370 in the head-mounted display apparatus 300 of FIG. 5. - In an exemplary embodiment, the
lens unit 370 of the head-mounted display apparatus 300 of FIG. 5 may be a lens unit 470 shown in FIG. 7. - The
lens unit 470 may be disposed between the object and the user, be accommodated in the frame 360 of FIG. 5, and include a transmittance adjusting unit for adjusting transmittance of light incident from the object. - According to an exemplary embodiment, the transmittance adjusting unit may include a liquid crystal 473 (e.g., a liquid crystal layer or liquid crystal molecules). In such an embodiment, the
lens unit 470 may include a first polarizer 471, a first substrate 472, the liquid crystal 473, a second substrate 474, and a second polarizer 475. The transmittance of the lens unit 470 may be adjusted by controlling an arrangement direction of the liquid crystal 473 by applying an electric field to the liquid crystal 473. - In such an embodiment, the head-mounted
display apparatus 300 may selectively display augmented reality, in which the image displayed by the display unit 340 of FIG. 5 and the external background are visible, or virtual reality, in which the external background is not visible. - Although
FIG. 7 illustrates an exemplary embodiment of the transmittance adjusting unit including the liquid crystal 473, exemplary embodiments are not limited thereto. Alternatively, the transmittance adjusting unit may have various structures; for example, a light blocking unit may or may not be disposed in front of a transparent lens to transmit or block light incident from an external background. -
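The transmittance adjustment of the lens unit 470 can be sketched with a strongly simplified polarizer model. This is an assumption-laden illustration, not the patent's implementation: it treats the liquid crystal 473 as an ideal polarization rotator between two parallel polarizers and applies Malus's law; real LC cells, and whether the cell is normally transparent or normally opaque, depend on details the patent does not fix.

```python
import math

def transmittance(lc_rotation_deg):
    """Idealized lens unit 470: two parallel polarizers with a liquid-crystal
    layer that rotates the polarization by lc_rotation_deg under an applied
    field. Malus's law gives T = cos^2(rotation); losses are ignored."""
    return math.cos(math.radians(lc_rotation_deg)) ** 2

# Field drives the LC between no rotation (transparent -> augmented reality)
# and a 90-degree rotation (opaque -> virtual reality).
print(transmittance(0.0))              # 1.0
print(round(transmittance(90.0), 6))   # 0.0
```

Intermediate rotation angles would yield intermediate transmittance, which is why a voltage-controlled liquid crystal can serve as a continuously adjustable transmittance adjusting unit.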
FIG. 8 is a flowchart of an image display method, according to an exemplary embodiment. FIG. 9 is a conceptual view of a system for providing the image display method of FIG. 8, according to an exemplary embodiment. - Referring to
FIGS. 8 and 9, an exemplary embodiment of the image display method of the head-mounted display apparatus 300 may include receiving visible light and infrared light emitted from the display apparatus 100 or 200 (S110), extracting color data and depth data respectively from visible light and infrared light (S120), generating 3D rendering data based on the color data and the depth data (S140), and displaying an image corresponding to the 3D rendering data (S150). - According to an exemplary embodiment, the
display apparatus may be the display apparatus 100 of FIGS. 1 to 3 or the display apparatus 200 of FIG. 4, and the head-mounted display apparatus may be the head-mounted display apparatus 300 of FIG. 5. However, exemplary embodiments are not limited thereto. The display apparatus and the head-mounted display apparatus may be modified in various ways. - Each of the
display apparatuses 100 and 200 may include a controller (e.g., the controller 31 of FIG. 1) that controls intensity of infrared light emitted by the infrared sub-pixel IR in each of the plurality of pixels, based on data of a depth difference between the plurality of pixels. - Such an embodiment of the
display apparatus is substantially the same as the embodiments described above with reference to FIGS. 1 to 4, and any repetitive detailed description thereof will be omitted. - According to an exemplary embodiment, before the generating of the 3D rendering data (S140), the image display method may further include matching the color data and the depth data based on respective locations of the plurality of pixels in the
display apparatus 100 or 200 (S130). - The head-mounted
display apparatus 300 of FIG. 5 may include the camera 310, the infrared sensor 320, the signal processor 330 and the display unit 340. The camera 310 and the infrared sensor 320 may perform the receiving of visible light and infrared light emitted from the display apparatus 100 or 200 (S110) and the extracting of the color data and the depth data respectively from visible light and infrared light (S120). The signal processor 330 may perform the generating of the 3D rendering data based on the color data and the depth data (S140). The display unit 340 may perform the displaying of the image corresponding to the 3D rendering data (S150). - In such an embodiment, the matching of the color data and the depth data (S130) may be performed by the
data matching unit 331 in the signal processor 330 of the head-mounted display apparatus 300. - Referring to
FIG. 9, the image displayed by the display apparatus 100 or 200 may be watched by a plurality of users at a wide viewing angle. As shown in FIG. 9, users U1 to U4 may be located at an angle range equal to or greater than about 120° with respect to the display apparatus 100. The users U1 to U4 may simultaneously watch the image displayed by the display apparatus 100 or 200. - Light emitted from the
display apparatus 100 or 200 may include visible light VL and infrared light IRL. The user U3 who is not wearing the head-mounted display apparatus 300 may not see the infrared light IRL but only see the visible light VL. That is, the user U3 may only see the 2D image on the display apparatus 100 or 200. - However, the users U1, U2, and U4 that are wearing the head-mounted
display apparatus 300 may see a 3D image, e.g., a 2D image with a 3D effect, according to the above-described image display method. - The users U1, U2, and U4 may be located in front of the
display apparatus 100 or 200 or at a side thereof, and may see the 3D image at a continuous wide viewing angle, while the user U3 that is not wearing the head-mounted display apparatus 300 sees only the 2D image. Therefore, 2D and 3D images may be simultaneously displayed without changing a mode of the display apparatus 100 or 200. - According to the exemplary embodiments described herein, the
display apparatuses 100 and 200 and the head-mounted display apparatus 300 may simultaneously display the 2D and 3D images, and may provide a continuous wide viewing angle for the 3D image. - In such embodiments, the head-mounted
display apparatus 300 and the image display method may easily generate augmented reality by connecting with the display apparatus 100 or 200. - While one or more exemplary embodiments have been described with reference to the figures, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope as defined by the following claims.
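The image display method of FIGS. 8 and 9 (steps S110 to S150, with the optional matching step S130) can be sketched end to end as follows. This is a hypothetical illustration under stated assumptions: the function names, the dictionary-keyed pixel layout, and the linear "brighter infrared means nearer" encoding of the infrared sub-pixel intensity are all illustrative choices, not taken from the patent.

```python
# Hypothetical sketch of steps S110-S150 of the image display method.
# Data layout, names, and the IR-to-depth mapping are illustrative only.

def extract_color_and_depth(visible_by_pixel, infrared_by_pixel):
    """S110/S120: color data comes from visible light; depth data is
    recovered from infrared intensity (assumed: brighter IR = nearer)."""
    color = dict(visible_by_pixel)
    depth = {p: 1.0 - ir / 255.0 for p, ir in infrared_by_pixel.items()}
    return color, depth

def match_color_and_depth(color, depth):
    """S130: match color and depth by pixel location, as the data matching
    unit 331 might, keeping only pixels seen by both sensors."""
    shared = color.keys() & depth.keys()
    return {p: (color[p], depth[p]) for p in sorted(shared)}

def generate_3d_rendering_data(matched):
    """S140: pair each pixel's color with its depth into rendering data
    (a point-cloud-like list the display unit 340 could consume, S150)."""
    return [(p, c, d) for p, (c, d) in matched.items()]

visible = {(0, 0): (255, 0, 0), (0, 1): (0, 255, 0)}   # camera 310 output
infrared = {(0, 0): 255, (0, 1): 0}                    # infrared sensor 320
color, depth = extract_color_and_depth(visible, infrared)
rendering = generate_3d_rendering_data(match_color_and_depth(color, depth))
print(rendering)  # [((0, 0), (255, 0, 0), 0.0), ((0, 1), (0, 255, 0), 1.0)]
```

Keying both sensor outputs by pixel location makes the S130 matching step a simple join, which mirrors the description that matching is based on the respective locations of the pixels in the display apparatus.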
Claims (21)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/209,243 US11582440B2 (en) | 2015-08-31 | 2021-03-23 | Display apparatus, head-mounted display apparatus, image display method, and image display system |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150123014A KR102497281B1 (en) | 2015-08-31 | 2015-08-31 | Display apparatus, head mounted display apparatus, and image display method |
KR10-2015-0123014 | 2015-08-31 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/209,243 Continuation US11582440B2 (en) | 2015-08-31 | 2021-03-23 | Display apparatus, head-mounted display apparatus, image display method, and image display system |
Publications (1)
Publication Number | Publication Date |
---|---|
US20170064291A1 true US20170064291A1 (en) | 2017-03-02 |
Family
ID=58096446
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US15/139,766 Abandoned US20170064291A1 (en) | 2015-08-31 | 2016-04-27 | Display apparatus, head-mounted display apparatus, image display method, and image display system |
US17/209,243 Active US11582440B2 (en) | 2015-08-31 | 2021-03-23 | Display apparatus, head-mounted display apparatus, image display method, and image display system |
Family Applications After (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/209,243 Active US11582440B2 (en) | 2015-08-31 | 2021-03-23 | Display apparatus, head-mounted display apparatus, image display method, and image display system |
Country Status (3)
Country | Link |
---|---|
US (2) | US20170064291A1 (en) |
KR (1) | KR102497281B1 (en) |
CN (1) | CN106483658B (en) |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180190672A1 (en) * | 2017-01-03 | 2018-07-05 | Innolux Corporation | Display device |
CN107085304B (en) * | 2017-04-10 | 2020-12-04 | 北京维信诺光电技术有限公司 | Near-to-eye display device |
KR102334953B1 (en) * | 2017-06-12 | 2021-12-02 | 엘지디스플레이 주식회사 | Display Device And Method For Driving Of The Same |
CN109425991A (en) * | 2017-06-28 | 2019-03-05 | 上海与德科技有限公司 | A kind of display screen and display device |
CN109143599A (en) * | 2017-06-28 | 2019-01-04 | 上海与德科技有限公司 | A kind of display screen and display device |
KR102481946B1 (en) * | 2017-07-17 | 2022-12-29 | 서울반도체 주식회사 | Display apparatus |
CN109300966A (en) * | 2018-10-31 | 2019-02-01 | 京东方科技集团股份有限公司 | Display panel and preparation method thereof and display device |
US11716863B2 (en) * | 2020-05-11 | 2023-08-01 | Universal Display Corporation | Hybrid display architecture |
CN113376899A (en) * | 2021-06-25 | 2021-09-10 | 安徽熙泰智能科技有限公司 | Virtual reality glasses with adjustable luminousness |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6653765B1 (en) * | 2000-04-17 | 2003-11-25 | General Electric Company | Uniform angular light distribution from LEDs |
US7560679B1 (en) * | 2005-05-10 | 2009-07-14 | Siimpel, Inc. | 3D camera |
US20100020209A1 (en) * | 2008-07-25 | 2010-01-28 | Samsung Electronics Co., Ltd. | Imaging method and apparatus |
US20110175981A1 (en) * | 2010-01-19 | 2011-07-21 | Chun-Hung Lai | 3d color image sensor |
US20110291116A1 (en) * | 2010-05-28 | 2011-12-01 | Samsung Mobile Display Co., Ltd. | Organic light emitting diode display and method for manufacturing the same |
US20150304638A1 (en) * | 2012-11-23 | 2015-10-22 | Lg Electronics Inc. | Method and apparatus for obtaining 3d image |
US20150364107A1 (en) * | 2014-06-17 | 2015-12-17 | LuxVue Technology Corporation | Interactive display panel with ir diodes |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5467104A (en) | 1992-10-22 | 1995-11-14 | Board Of Regents Of The University Of Washington | Virtual retinal display |
JP2002318652A (en) * | 2001-04-20 | 2002-10-31 | Foundation For Nara Institute Of Science & Technology | Virtual input device and its program |
KR100834344B1 (en) | 2001-12-29 | 2008-06-02 | 엘지디스플레이 주식회사 | an active matrix organic electroluminescence display and a manufacturing method of the same |
JP2003317931A (en) * | 2002-04-26 | 2003-11-07 | Canon Inc | El element and display using thereof |
JP4645822B2 (en) * | 2005-04-19 | 2011-03-09 | ソニー株式会社 | Image display device and object detection method |
US7532181B2 (en) * | 2005-07-20 | 2009-05-12 | Eastman Kodak Company | Visible and invisible image display |
WO2010020067A1 (en) * | 2008-08-19 | 2010-02-25 | Lattice Power (Jiangxi) Corporation | Semiconductor light-emitting device with passivation layer |
US8482767B2 (en) * | 2009-02-11 | 2013-07-09 | Infoprint Solutions Company Llc | Print job submission with sleep mechanism |
JP5508393B2 (en) | 2010-03-25 | 2014-05-28 | パナソニック株式会社 | Organic EL display device, video display system, and video display method |
US9690099B2 (en) | 2010-12-17 | 2017-06-27 | Microsoft Technology Licensing, Llc | Optimized focal area for augmented reality displays |
KR20120084216A (en) | 2011-01-19 | 2012-07-27 | 삼성전자주식회사 | Method of 3d image signal processing for removing pixel noise of depth information and 3d image processor of the same |
US8937663B2 (en) | 2011-04-01 | 2015-01-20 | Microsoft Corporation | Camera and sensor augmented reality techniques |
US20150077312A1 (en) | 2011-05-13 | 2015-03-19 | Google Inc. | Near-to-eye display having adaptive optics |
KR101521676B1 (en) | 2011-09-20 | 2015-05-19 | 엘지디스플레이 주식회사 | Organic light emitting diode display and method for manufacturing the same |
KR101180096B1 (en) | 2012-05-02 | 2012-09-05 | (주)지엘디테크 | Apparatus for emitting LED light using RGB and IR LED |
KR101430404B1 (en) | 2012-08-30 | 2014-08-14 | 엘지디스플레이 주식회사 | Lensticular array, manufacturing method thereof and three-dimensional image display device using the same |
KR20140092055A (en) | 2013-01-15 | 2014-07-23 | 엘지디스플레이 주식회사 | Stereoscopic image display device and driving method thereof |
KR102003521B1 (en) | 2013-03-26 | 2019-07-24 | 엘지디스플레이 주식회사 | Stereoscopic 3d display device and method of fabricating the same |
TWI534993B (en) * | 2013-09-25 | 2016-05-21 | 友達光電股份有限公司 | Pixel structure of inorganic light emitting diode |
KR102156343B1 (en) | 2013-12-31 | 2020-09-15 | 엘지디스플레이 주식회사 | Non- Glasses 3D Display Device |
US9355599B2 (en) | 2014-03-06 | 2016-05-31 | 3M Innovative Properties Company | Augmented information display |
- 2015-08-31: KR KR1020150123014A patent/KR102497281B1 — active, IP Right Grant
- 2016-04-27: US US15/139,766 patent/US20170064291A1 — not active, Abandoned
- 2016-08-02: CN CN201610625671.6A patent/CN106483658B — active
- 2021-03-23: US US17/209,243 patent/US11582440B2 — active
Cited By (37)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20180158847A1 (en) * | 2015-06-16 | 2018-06-07 | Au Optronics Corporation | Light emitting diode display |
US20170294451A1 (en) * | 2016-04-12 | 2017-10-12 | Samsung Display Co., Ltd. | Display device |
US10373985B2 (en) * | 2016-04-12 | 2019-08-06 | Samsung Display Co., Ltd. | Display device using micro light emitting diode |
US10347179B2 (en) * | 2016-08-23 | 2019-07-09 | Samsung Display Co., Ltd. | Display device and driving method thereof |
US10720103B2 (en) * | 2016-08-23 | 2020-07-21 | Samsung Display Co., Ltd. | Display device and driving method thereof |
US20190333448A1 (en) * | 2016-08-23 | 2019-10-31 | Samsung Display Co., Ltd. | Display device and driving method thereof |
US10522062B2 (en) * | 2016-10-13 | 2019-12-31 | Industrial Technology Research Institute | Three-dimensional display module |
US11380738B2 (en) | 2017-04-13 | 2022-07-05 | Hong Kong Beida Jade Bird Display Limited | LED-OLED hybrid self-emissive display |
TWI774751B (en) * | 2017-04-13 | 2022-08-21 | 中國大陸商上海顯耀顯示科技有限公司 | Led-oled hybrid self-emissive display |
US11311188B2 (en) * | 2017-07-13 | 2022-04-26 | Micro Medical Devices, Inc. | Visual and mental testing using virtual reality hardware |
US20190139464A1 (en) * | 2017-11-07 | 2019-05-09 | Macroblock, Inc. | Dual light source system and method of generating dual images using the same |
EP3480803A1 (en) * | 2017-11-07 | 2019-05-08 | Macroblock, Inc. | Dual light source system and method of generating dual images using the same |
JP2019086768A (en) * | 2017-11-07 | 2019-06-06 | 聚積科技股▲ふん▼有限公司 | Display-purpose dual display light source device, and dual display image generation method |
WO2019108109A1 (en) * | 2017-11-28 | 2019-06-06 | Fingerprint Cards Ab | Biometric imaging system and method for controlling the system |
US11200408B2 (en) | 2017-11-28 | 2021-12-14 | Fingerprint Cards Anacatum Ip Ab | Biometric imaging system and method for controlling the system |
US10134709B1 (en) | 2017-12-21 | 2018-11-20 | Industrial Technology Research Institute | Substrateless light emitting diode (LED) package for size shrinking and increased resolution of display device |
US11916051B2 (en) * | 2018-05-31 | 2024-02-27 | Japan Display Inc. | Inorganic light emitting display device with inorganic film |
US20210082893A1 (en) * | 2018-05-31 | 2021-03-18 | Japan Display Inc. | Display device and array substrate |
CN112204761A (en) * | 2018-05-31 | 2021-01-08 | 株式会社日本显示器 | Display device |
US20210082892A1 (en) * | 2018-05-31 | 2021-03-18 | Japan Display Inc. | Display device |
US11586042B2 (en) | 2018-08-20 | 2023-02-21 | Samsung Display Co., Ltd. | Optical device |
US10902627B2 (en) * | 2018-11-30 | 2021-01-26 | Hins Sas | Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm |
US20200175712A1 (en) * | 2018-11-30 | 2020-06-04 | Hins Sas | Head mounted device for virtual or augmented reality combining reliable gesture recognition with motion tracking algorithm |
WO2020127817A1 (en) * | 2018-12-21 | 2020-06-25 | Osram Opto Semiconductors Gmbh | Optoelectronic component, display device, image system, and method for operating an image system |
US20220149024A1 (en) * | 2018-12-28 | 2022-05-12 | Honor Device Co., Ltd. | Display, electronic device, and display fabrication method |
US10892257B2 (en) | 2019-01-21 | 2021-01-12 | Innolux Corporation | Foldable display device |
US11233954B1 (en) * | 2019-01-24 | 2022-01-25 | Rockwell Collins, Inc. | Stereo infrared imaging for head mounted devices |
US20220013578A1 (en) * | 2019-03-28 | 2022-01-13 | Japan Display Inc. | Display device |
TWI740431B (en) * | 2019-03-28 | 2021-09-21 | 日商日本顯示器股份有限公司 | Display device |
JP2020166058A (en) * | 2019-03-28 | 2020-10-08 | 株式会社ジャパンディスプレイ | Display device |
WO2020196789A1 (en) * | 2019-03-28 | 2020-10-01 | 株式会社ジャパンディスプレイ | Display device |
JP7320970B2 (en) | 2019-03-28 | 2023-08-04 | 株式会社ジャパンディスプレイ | Display device |
CN113632159A (en) * | 2019-03-28 | 2021-11-09 | 株式会社日本显示器 | Display device |
CN114127831A (en) * | 2019-08-08 | 2022-03-01 | 株式会社日本显示器 | Display device |
US20220157229A1 (en) * | 2019-08-08 | 2022-05-19 | Japan Display Inc. | Display device |
WO2021024609A1 (en) * | 2019-08-08 | 2021-02-11 | 株式会社ジャパンディスプレイ | Display device |
US11817039B2 (en) * | 2019-08-08 | 2023-11-14 | Japan Display Inc. | Display device |
Also Published As
Publication number | Publication date |
---|---|
US11582440B2 (en) | 2023-02-14 |
US20210211641A1 (en) | 2021-07-08 |
CN106483658A (en) | 2017-03-08 |
CN106483658B (en) | 2021-10-22 |
KR102497281B1 (en) | 2023-02-08 |
KR20170026935A (en) | 2017-03-09 |
Legal Events
Date | Code | Title | Description
---|---|---|---
| AS | Assignment | Owner name: SAMSUNG DISPLAY CO., LTD., KOREA, REPUBLIC OF; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: DO, YUNSEON; KWON, JAEJOONG; CHO, CHIO; REEL/FRAME: 038394/0446; Effective date: 20160415
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | ADVISORY ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | FINAL REJECTION MAILED
| STPP | Information on status: patent application and granting procedure in general | ADVISORY ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | AWAITING RESPONSE FOR INFORMALITY, FEE DEFICIENCY OR CRF ACTION
| STPP | Information on status: patent application and granting procedure in general | NON FINAL ACTION MAILED
| STPP | Information on status: patent application and granting procedure in general | ADVISORY ACTION MAILED
| STCB | Information on status: application discontinuation | ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION