WO2022050132A1 - Image display device and electronic apparatus - Google Patents

Image display device and electronic apparatus

Info

Publication number
WO2022050132A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
display device
transmission window
light emitting
image display
Prior art date
Application number
PCT/JP2021/031006
Other languages
English (en)
Japanese (ja)
Inventor
誠一郎 甚田
Original Assignee
Sony Semiconductor Solutions Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Semiconductor Solutions Corporation
Priority to DE112021004550.4T (DE112021004550T5)
Priority to KR1020237005230A (KR20230061348A)
Priority to US18/042,388 (US20230329036A1)
Publication of WO2022050132A1

Classifications

    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/10OLED displays
    • H10K59/12Active-matrix OLED [AMOLED] displays
    • H10K59/121Active-matrix OLED [AMOLED] displays characterised by the geometry or disposition of pixel elements
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/80Constructional details
    • H10K59/875Arrangements for extracting light from the devices
    • H10K59/879Arrangements for extracting light from the devices comprising refractive means, e.g. lenses
    • HELECTRICITY
    • H01ELECTRIC ELEMENTS
    • H01LSEMICONDUCTOR DEVICES NOT COVERED BY CLASS H10
    • H01L27/00Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate
    • H01L27/02Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components specially adapted for rectifying, oscillating, amplifying or switching and having potential barriers; including integrated passive circuit elements having potential barriers
    • H01L27/12Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components specially adapted for rectifying, oscillating, amplifying or switching and having potential barriers; including integrated passive circuit elements having potential barriers the substrate being other than a semiconductor body, e.g. an insulating body
    • H01L27/1214Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components specially adapted for rectifying, oscillating, amplifying or switching and having potential barriers; including integrated passive circuit elements having potential barriers the substrate being other than a semiconductor body, e.g. an insulating body comprising a plurality of TFTs formed on a non-semiconducting substrate, e.g. driving circuits for AMLCDs
    • H01L27/1248Devices consisting of a plurality of semiconductor or other solid-state components formed in or on a common substrate including semiconductor components specially adapted for rectifying, oscillating, amplifying or switching and having potential barriers; including integrated passive circuit elements having potential barriers the substrate being other than a semiconductor body, e.g. an insulating body comprising a plurality of TFTs formed on a non-semiconducting substrate, e.g. driving circuits for AMLCDs with a particular composition or shape of the interlayer dielectric specially adapted to the circuit arrangement
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B3/00Simple or compound lenses
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B33/00Electroluminescent light sources
    • H05B33/02Details
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B33/00Electroluminescent light sources
    • H05B33/12Light sources with substantially two-dimensional radiating surfaces
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B33/00Electroluminescent light sources
    • H05B33/12Light sources with substantially two-dimensional radiating surfaces
    • H05B33/22Light sources with substantially two-dimensional radiating surfaces characterised by the chemical or physical composition or the arrangement of auxiliary dielectric or reflective layers
    • HELECTRICITY
    • H05ELECTRIC TECHNIQUES NOT OTHERWISE PROVIDED FOR
    • H05BELECTRIC HEATING; ELECTRIC LIGHT SOURCES NOT OTHERWISE PROVIDED FOR; CIRCUIT ARRANGEMENTS FOR ELECTRIC LIGHT SOURCES, IN GENERAL
    • H05B33/00Electroluminescent light sources
    • H05B33/12Light sources with substantially two-dimensional radiating surfaces
    • H05B33/26Light sources with substantially two-dimensional radiating surfaces characterised by the composition or arrangement of the conductive material used as an electrode
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/10OLED displays
    • H10K59/12Active-matrix OLED [AMOLED] displays
    • H10K59/121Active-matrix OLED [AMOLED] displays characterised by the geometry or disposition of pixel elements
    • H10K59/1213Active-matrix OLED [AMOLED] displays characterised by the geometry or disposition of pixel elements the pixel elements being TFTs
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/10OLED displays
    • H10K59/12Active-matrix OLED [AMOLED] displays
    • H10K59/131Interconnections, e.g. wiring lines or terminals
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/60OLEDs integrated with inorganic light-sensitive elements, e.g. with inorganic solar cells or inorganic photodiodes
    • H10K59/65OLEDs integrated with inorganic image sensors
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/80Constructional details
    • H10K59/805Electrodes
    • H10K59/8051Anodes
    • HELECTRICITY
    • H10SEMICONDUCTOR DEVICES; ELECTRIC SOLID-STATE DEVICES NOT OTHERWISE PROVIDED FOR
    • H10KORGANIC ELECTRIC SOLID-STATE DEVICES
    • H10K59/00Integrated devices, or assemblies of multiple devices, comprising at least one organic light-emitting element covered by group H10K50/00
    • H10K59/80Constructional details
    • H10K59/805Electrodes
    • H10K59/8052Cathodes

Definitions

  • This disclosure relates to image display devices and electronic devices.
  • Recent electronic devices such as smartphones, mobile phones, and PCs (Personal Computers) are equipped with various sensors such as cameras on the frame (bezel) of the display panel.
  • The number of installed sensors is increasing; in addition to cameras, there are sensors for face recognition, infrared sensors, motion detection sensors, and the like.
  • A technique has therefore been proposed in which an image sensor module is placed directly under the display panel and subject light passing through the display panel is captured by the image sensor module. In order to arrange the image sensor module directly under the display panel, it is necessary to make the display panel transparent (see Patent Document 1).
  • However, opaque members such as pixel circuits and wiring patterns are arranged in each pixel of the display panel, and insulating layers with low transmittance are also present. Therefore, when the image sensor module is placed directly under the display panel, light incident on the display panel is irregularly reflected, refracted, and diffracted within the panel, and the light produced by these reflections, refractions, and diffractions (hereinafter referred to as diffracted light) enters the image sensor module. If an image is captured while such diffracted light is being generated, the image quality of the subject image deteriorates.
  • The present disclosure therefore provides an image display device and an electronic device capable of suppressing the generation of diffracted light.
  • Two or more pixels whose non-light emitting regions have transmission windows of different shapes may be provided.
  • The non-light emitting region may be arranged at a position overlapping a light receiving device that receives light incident through the image display device, when viewed in plan from the display surface side of the image display device.
  • The pixel circuit connected to the first self-luminous element may be arranged in the first light emitting region.
  • The non-light emitting region may have a plurality of the transmission windows arranged apart from each other within one pixel.
  • The transmission window may be arranged so as to straddle two or more pixels.
  • An optical member that is arranged on the light incident side of the transmission window and refracts the incident light to guide it to the transmission window may be provided.
  • The optical member may have a first optical system that refracts the incident light toward the optical axis and a second optical system that collimates the light refracted by the first optical system, and the transmission window may transmit the light collimated by the second optical system.
  • Alternatively, a first optical member that is arranged on the light incident side of the transmission window and refracts the incident light to guide it to the transmission window, and a second optical member that is arranged on the light emitting side of the transmission window and collimates the light emitted from the transmission window to guide it to the light receiving device, may be provided.
  • A first pixel region including some of the plurality of pixels and a second pixel region including at least some of the plurality of pixels other than those in the first pixel region may be provided.
  • The pixels in the first pixel region have the first self-luminous element, the first light emitting region, and the non-light emitting region.
  • The pixels in the second pixel region may have a second self-luminous element and a second light emitting region that is made to emit light by the second self-luminous element and has a larger area than the first light emitting region.
  • The first pixel region may be provided at a plurality of locations in the pixel display region, spaced apart from each other.
  • Two or more pixels having transmission windows of different shapes may be provided so that the diffracted light produced by light passing through the respective transmission windows has different shapes.
  • The first self-luminous element may have a lower electrode layer, a display layer arranged on the lower electrode layer, an upper electrode layer arranged on the display layer, and a wiring layer arranged below the lower electrode layer and electrically connected to the lower electrode layer via a contact extending from the lower electrode layer in the stacking direction.
  • The shape of the transmission window when viewed in plan from the display surface side of the plurality of pixels may be defined by the end portion of the lower electrode layer.
  • Alternatively, the shape of the transmission window when viewed in plan from the display surface side of the plurality of pixels may be defined by the end portion of the wiring layer.
  • The wiring layer may have a plurality of laminated metal layers, and the shape of the transmission window when viewed in plan from the display surface side of the plurality of pixels may be defined by the end portion of at least one of the metal layers.
  • The metal layer that defines the shape of the transmission window when viewed in plan from the display surface side of the plurality of pixels may be an electrode of a capacitor in the pixel circuit.
  • The entire first light emitting region may be covered with the lower electrode layer except for the region of the transmission window.
  • According to another aspect of the present disclosure, there is provided an electronic device comprising an image display device having a plurality of pixels arranged two-dimensionally and a light receiving device that receives light incident through the image display device.
  • The image display device has a first pixel region including some of the plurality of pixels.
  • The pixels in the first pixel region each have a first self-luminous element, a first light emitting region that is made to emit light by the first self-luminous element, and a non-light emitting region having a transmission window of a predetermined shape that transmits visible light.
  • At least a part of the first pixel region is arranged so as to overlap the light receiving device when viewed in plan from the display surface side of the image display device.
  • The light receiving device may receive light through the non-light emitting region.
  • The light receiving device may include at least one of an image sensor that photoelectrically converts light incident through the non-light emitting region, a distance measuring sensor that receives light incident through the non-light emitting region and measures a distance, and a temperature sensor that measures temperature based on light incident through the non-light emitting region.
  • A cross-sectional view of the image sensor module (FIG. 4); a diagram schematically illustrating the optical configuration of the image sensor module (FIG. 5); and a diagram illustrating the optical path along which light from a subject forms an image on the image sensor (FIG. 6).
  • A cross-sectional view showing an example of the laminated structure of the display layer (FIG. 10).
  • A cross-sectional view showing a first example of the cross-sectional structure of the first pixel region (FIG. 16).
  • A plan layout view according to a first modification of FIG. 14 (FIG. 19) and a cross-sectional view taken along line A-A of FIG. 19 (FIG. 20).
  • A plan layout view according to a second modification of FIG. 14 (FIG. 21) and a cross-sectional view taken along line A-A of FIG. 21 (FIG. 22).
  • A plan layout view according to a third modification of FIG. 14 (FIG. 23) and a cross-sectional view taken along line A-A of FIG. 23 (FIG. 24).
  • A circuit diagram showing a second example of the detailed circuit configuration of the pixel circuit (FIG. 26).
  • A plan layout view according to a fourth modification of FIG. 14 (FIG. 27) and a cross-sectional view taken along line A-A of FIG. 27 (FIG. 28).
  • A diagram showing an example in which the transmission window is rectangular (FIG. 29A), and a diagram showing the diffracted light generated when parallel light is incident on the transmission window of FIG. 29A (FIG. 29B).
  • A diagram showing an example in which the transmission window is circular (FIG. 30A), and a diagram showing the diffracted light generated when parallel light is incident on the transmission window of FIG. 30A (FIG. 30B).
  • A cross-sectional view showing an example in which a microlens is arranged on the light incident side of the first pixel region (FIG. 36), and a diagram showing with arrows the traveling direction of light incident on the first pixel region in the absence of the microlens (FIG. 37A).
  • FIG. 1 is a plan view and a cross-sectional view of an electronic device 50 provided with an image display device 1 according to the first embodiment of the present disclosure.
  • the image display device 1 according to the present embodiment includes a display panel 2.
  • flexible printed circuit boards (FPCs) 3 are connected to the display panel 2.
  • the display panel 2 is, for example, a glass substrate or a transparent film in which a plurality of layers are laminated, and a plurality of pixels are arranged vertically and horizontally on the display surface 2z.
  • a chip (COF: Chip On Film) 4 incorporating at least a part of the drive circuit of the display panel 2 is mounted on the FPC 3.
  • the drive circuit may be laminated on the display panel 2 as COG (Chip On Glass).
  • In the image display device 1, various sensors 5 that receive light through the display panel 2 can be arranged directly under the display panel 2.
  • The configuration including the image display device 1 and the sensor 5 is referred to as an electronic device 50.
  • The type of the sensor 5 provided in the electronic device 50 is not particularly limited; examples include an image sensor that photoelectrically converts light incident through the display panel 2, a distance measurement sensor that projects light through the display panel 2 and receives the light reflected by an object through the display panel 2 to measure the distance to the object, and a temperature sensor that measures temperature based on light incident through the display panel 2.
  • The sensor 5 arranged directly below the display panel 2 has at least the function of a light receiving device that receives light.
  • The sensor 5 may also have the function of a light emitting device that emits light through the display panel 2.
  • FIG. 1 shows an example of a specific location of the sensor 5 arranged directly under the display panel 2 with a broken line.
  • the sensor 5 is arranged, for example, on the back surface side above the center of the display panel 2.
  • the location of the sensor 5 in FIG. 1 is an example, and the location of the sensor 5 is arbitrary.
  • FIG. 1 shows an example in which the sensor 5 is arranged at one place of the display panel 2, the sensor 5 may be arranged at a plurality of places as shown in FIG. 2A or FIG. 2B.
  • FIG. 2A shows an example in which two sensors 5 are arranged side by side on the back surface side above the center of the display panel 2.
  • FIG. 2B shows an example in which the sensors 5 are arranged at the four corners of the display panel 2. The reason for arranging the sensors 5 at the four corners of the display panel 2 as shown in FIG. 2B is as follows. Since the pixel region of the display panel 2 that overlaps the sensor 5 is designed to increase transmittance, its display quality may differ slightly from that of the surrounding pixel region; placing the sensors 5 at the corners makes such a difference less noticeable.
  • the types of the plurality of sensors 5 may be the same or different.
  • For example, a plurality of image sensor modules 9 having different focal lengths may be arranged, or different types of sensors 5, such as an image pickup sensor 5 and a ToF (Time of Flight) sensor 5, may be arranged.
  • FIG. 3 is a diagram schematically showing the structure of the pixel 7 in the first pixel region 6 and the structure of the pixel 7 in the second pixel region 8.
  • The pixel 7 in the first pixel region 6 has a first self-luminous element 6a, a first light emitting region 6b, and a non-light emitting region 6c.
  • The first light emitting region 6b is a region that is made to emit light by the first self-luminous element 6a.
  • The non-light emitting region 6c does not emit light from the first self-luminous element 6a, but has a transmission window 6d of a predetermined shape that transmits visible light.
  • The pixel 7 in the second pixel region 8 has a second self-luminous element 8a and a second light emitting region 8b.
  • The second light emitting region 8b is made to emit light by the second self-luminous element 8a and has a larger area than the first light emitting region 6b.
  • Typical examples of the first self-luminous element 6a and the second self-luminous element 8a are organic EL (Electroluminescence) elements (hereinafter also referred to as OLEDs: Organic Light Emitting Diodes). Since a self-luminous element requires no backlight, at least a part of it can be made transmissive. In the following, an example using an OLED as the self-luminous element is mainly described.
  • Alternatively, instead of using different pixel structures for the pixel region that overlaps the sensor 5 and the pixel region that does not, all pixels 7 in the display panel 2 may have the same structure.
  • That is, all the pixels 7 may be configured with the first light emitting region 6b and the non-light emitting region 6c of FIG. 3 so that the sensor 5 can be arranged at an arbitrary position with respect to the display panel 2.
  • FIG. 4 is a cross-sectional view of the image sensor module 9.
  • The image sensor module 9 includes an image sensor 9b mounted on a support substrate 9a, an IR (Infrared Ray) cut filter 9c, a lens unit 9d, a coil 9e, a magnet 9f, and a spring 9g.
  • the lens unit 9d has one or more lenses. The lens unit 9d is movable in the optical axis direction according to the direction of the current flowing through the coil 9e.
  • the internal configuration of the image sensor module 9 is not limited to that shown in FIG.
  • FIG. 5 is a diagram schematically explaining the optical configuration of the image sensor module 9.
  • the light from the subject 10 is refracted by the lens unit 9d and imaged on the image sensor 9b.
  • the display panel 2 is arranged between the subject 10 and the lens unit 9d. When the light from the subject 10 passes through the display panel 2, it is important to suppress absorption, reflection, and diffraction on the display panel 2.
  • FIG. 6 is a diagram illustrating an optical path until the light from the subject 10 forms an image on the image sensor 9b.
  • Each pixel 7 of the display panel 2 and each pixel of the image sensor 9b are schematically represented by rectangular cells. As shown, each pixel 7 of the display panel 2 is much larger than each pixel of the image sensor 9b.
  • Light from a specific position of the subject 10 passes through the transmission window 6d of the display panel 2, is refracted by the lens unit 9d of the image sensor module 9, and is imaged by the specific pixel 7 on the image sensor 9b. In this way, the light from the subject 10 passes through the plurality of transmission windows 6d provided in the plurality of pixels 7 in the first pixel region 6 of the display panel 2 and is incident on the image sensor module 9.
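  • For orientation, the imaging relation in FIG. 6 can be illustrated with the standard thin-lens equation 1/f = 1/d_o + 1/d_i; the focal length and distances in the sketch below are assumed example values, not figures taken from this disclosure.

```python
# Illustrative thin-lens calculation for the geometry of FIG. 6 (all numbers assumed).
def thin_lens_image_distance(f_mm: float, object_distance_mm: float) -> float:
    """Solve 1/f = 1/d_o + 1/d_i for the image distance d_i."""
    return 1.0 / (1.0 / f_mm - 1.0 / object_distance_mm)

f_mm = 4.0      # assumed focal length of the lens unit 9d [mm]
d_o = 500.0     # assumed distance from the subject 10 to the lens unit [mm]
d_i = thin_lens_image_distance(f_mm, d_o)
magnification = d_i / d_o

print(f"image distance ~ {d_i:.2f} mm, magnification ~ {magnification:.4f}")
# A 100 mm feature of the subject 10 maps to roughly 100 * magnification mm on the image sensor 9b.
```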
  • FIG. 7 is a circuit diagram showing a basic configuration of a pixel circuit 12 including an OLED 5.
  • the pixel circuit 12 of FIG. 7 includes a drive transistor Q1, a sampling transistor Q2, and a pixel capacitance Cs in addition to the OLED 5.
  • the sampling transistor Q2 is connected between the signal line Sig and the gate of the drive transistor Q1.
  • a scanning line Gate is connected to the gate of the sampling transistor Q2.
  • the pixel capacitance Cs is connected between the gate of the drive transistor Q1 and the anode electrode of the OLED 5.
  • the drive transistor Q1 is connected between the power supply voltage node Vccp and the anode of the OLED 5.
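  • As a non-authoritative behavioral sketch of this 2-transistor/1-capacitor arrangement, the snippet below models the sampling transistor Q2 as a switch that writes the signal-line voltage onto the pixel capacitance Cs, and the drive transistor Q1 as a square-law current source feeding the OLED; the parameter values are illustrative assumptions, not device data from the disclosure.

```python
# Behavioral sketch of the pixel circuit of FIG. 7 (2T1C), with assumed device parameters.
K = 1e-4      # assumed transconductance parameter of drive transistor Q1 [A/V^2]
VTH = 1.0     # assumed threshold voltage of Q1 [V]

def sample_pixel(v_sig: float, gate_on: bool, v_stored: float) -> float:
    """Sampling transistor Q2: while the scanning line Gate is on, the signal-line
    voltage Sig is written onto the pixel capacitance Cs; otherwise Cs holds its value."""
    return v_sig if gate_on else v_stored

def oled_drive_current(v_gs: float) -> float:
    """Drive transistor Q1 in saturation sources current from Vccp into the OLED anode."""
    v_ov = max(v_gs - VTH, 0.0)
    return 0.5 * K * v_ov ** 2

v_cs = 0.0
v_cs = sample_pixel(v_sig=3.0, gate_on=True, v_stored=v_cs)   # write a gray level
i_oled = oled_drive_current(v_cs)                              # current held after Gate turns off
print(f"stored gate voltage = {v_cs} V, OLED current ~ {i_oled*1e6:.1f} uA")
```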
  • FIG. 8 is a plan layout diagram of the pixels 7 in the second pixel region 8 in which the sensor 5 is not directly arranged.
  • the pixel 7 in the second pixel region 8 has a general pixel configuration.
  • Each pixel 7 has a plurality of color pixels 7 (for example, three color pixels 7 of RGB).
  • FIG. 8 shows a planar layout of a total of four color pixels 7, two color pixels 7 in the horizontal direction and two color pixels 7 in the vertical direction.
  • Each color pixel 7 has a second light emitting region 8b.
  • the second light emitting region 8b extends over almost the entire area of the color pixel 7.
  • a pixel circuit 12 having a second self-luminous element 8a (OLED5) is arranged in the second light emitting region 8b.
  • The two columns on the left side of FIG. 8 show the planar layout below the anode electrode 12a, and the two columns on the right side of FIG. 8 show the planar layout of the anode electrode 12a and the display layer 2a arranged on the anode electrode 12a.
  • The anode electrode 12a and the display layer 2a are arranged over almost the entire area of the color pixel 7, and the entire area of the color pixel 7 is the second light emitting region 8b that emits light.
  • The pixel circuit 12 of the color pixel 7 is arranged in the upper half of the color pixel 7. Further, on the upper end side of the color pixel 7, a wiring pattern for the power supply voltage Vccp and a wiring pattern for the scanning line are arranged in the horizontal direction X. Further, a wiring pattern for the signal line Sig is arranged along the boundary of the color pixel 7 in the vertical direction Y.
  • FIG. 9 is a cross-sectional view of the pixel 7 (color pixel 7) in the second pixel region 8 in which the sensor 5 is not arranged directly below.
  • FIG. 9 shows a cross-sectional structure in the A-A line direction of FIG. 8, and more specifically, shows a cross-sectional structure around the drive transistor Q1 in the pixel circuit 12.
  • The cross-sectional views in the drawings attached to this specification, including FIG. 9, emphasize the characteristic layer structure, and the ratio of vertical to horizontal lengths does not necessarily match that of the plan layout.
  • the upper surface of FIG. 9 is the display surface side of the display panel 2, and the bottom surface of FIG. 9 is the side on which the sensor 5 is arranged.
  • From the bottom surface side to the top surface side (light emitting side) of FIG. 9, the first transparent substrate 31, the first insulating layer 32, the first wiring layer (gate electrode) 33, the second insulating layer 34, the second wiring layer 35, the third insulating layer 36, the anode electrode layer 38, the fourth insulating layer 37, the display layer 2a, the cathode electrode layer 39, the fifth insulating layer 40, and the second transparent substrate 41 are laminated in this order.
  • the first transparent substrate 31 and the second transparent substrate 41 are preferably formed of, for example, quartz glass or a transparent film having excellent visible light transmittance.
  • either one of the first transparent substrate 31 and the second transparent substrate 41 may be formed of quartz glass and the other may be formed of a transparent film.
  • Alternatively, a colored film having a low transmittance, for example a polyimide film, may be used.
  • at least one of the first transparent substrate 31 and the second transparent substrate 41 may be formed of a transparent film.
  • a first wiring layer (M1) 33 for connecting each circuit element in the pixel circuit 12 is arranged on the first transparent substrate 31.
  • a first insulating layer 32 is arranged on the first transparent substrate 31 so as to cover the first wiring layer 33.
  • the first insulating layer 32 is, for example, a laminated structure of a silicon nitride layer and a silicon oxide layer having excellent visible light transparency.
  • a semiconductor layer 42 forming a channel region of each transistor in the pixel circuit 12 is arranged on the first insulating layer 32.
  • FIG. 9 schematically shows a cross-sectional structure of a drive transistor Q1 having a gate formed in the first wiring layer 33, a source and a drain formed in the second wiring layer 35, and a channel region formed in the semiconductor layer 42.
  • other transistors are also arranged in these layers 33, 35, 42 and are connected to the first wiring layer 33 by contacts (not shown).
  • a second insulating layer 34 is arranged on the first insulating layer 32 so as to cover a transistor or the like.
  • the second insulating layer 34 is, for example, a laminated structure of a silicon oxide layer, a silicon nitride layer, and a silicon oxide layer having excellent visible light transparency.
  • A trench 34a is formed in a part of the second insulating layer 34, and by filling the trench 34a with the contact member 35a, the second wiring layer (M2) 35 connected to the source, drain, etc. of each transistor is formed.
  • FIG. 9 shows the second wiring layer 35 that connects the drive transistor Q1 and the anode electrode 12a of the OLED 5, but second wiring layers 35 connected to other circuit elements are also arranged in the same layer.
  • a third wiring layer (not shown in FIG. 9) may be provided between the second wiring layer 35 and the anode electrode 12a.
  • the third wiring layer can be used as wiring in the pixel circuit, or may be used for connection with the anode electrode 12a.
  • a third insulating layer 36 for covering the second wiring layer 35 and flattening the surface is arranged on the second insulating layer 34.
  • the third insulating layer 36 is made of a resin material such as acrylic resin.
  • the film thickness of the third insulating layer 36 is larger than the film thickness of the first to second insulating layers 32 and 34.
  • A trench 36a is formed in a part of the upper surface of the third insulating layer 36, and the trench 36a is filled with a contact member 36b that makes electrical contact with the second wiring layer 35; the anode electrode layer 38 is formed by extending from the contact member 36b onto the upper surface of the third insulating layer 36.
  • the anode electrode layer 38 has a laminated structure and includes a metal material layer.
  • the metal material layer generally has a low visible light transmittance and functions as a reflective layer that reflects light.
  • As a specific metal material, for example, AlNd or Ag can be used.
  • Since the lowermost layer of the anode electrode layer 38 is the portion in contact with the trench 36a and is easily broken there, at least the corner portion of the trench 36a may be formed of a metal material such as AlNd.
  • the uppermost layer of the anode electrode layer 38 is formed of a transparent conductive layer such as ITO (Indium Tin Oxide).
  • The anode electrode layer 38 may have, for example, an ITO / Ag / ITO laminated structure. Ag is inherently opaque, but its visible light transmittance improves when the film is made thin. Since a thin Ag film is mechanically weak, a laminated structure with ITO on both sides allows it to function as a transparent conductive layer.
  • a fourth insulating layer 37 is arranged on the third insulating layer 36 so as to cover the anode electrode layer 38.
  • the fourth insulating layer 37 is also made of a resin material such as acrylic resin, like the third insulating layer 36.
  • the fourth insulating layer 37 is patterned according to the arrangement location of the OLED 5, and a recess 37a is formed.
  • the display layer 2a is arranged so as to include the bottom surface and the side surface of the recess 37a of the fourth insulating layer 37.
  • the display layer 2a has a laminated structure as shown in FIG. 10, for example.
  • The display layer 2a shown in FIG. 10 is a laminated structure in which an anode 2b, a hole injection layer 2c, a hole transport layer 2d, a light emitting layer 2e, an electron transport layer 2f, an electron injection layer 2g, and a cathode 2h are stacked in this order from the anode electrode layer 38 side.
  • the anode 2b is also referred to as an anode electrode 12a.
  • the hole injection layer 2c is a layer into which holes are injected from the anode electrode 12a.
  • the hole transport layer 2d is a layer that efficiently transports holes to the light emitting layer 2e.
  • the light emitting layer 2e recombines holes and electrons to generate excitons, and emits light when the excitons return to the ground state.
  • the cathode 2h is also called a cathode electrode.
  • the electron injection layer 2g is a layer into which electrons from the cathode 2h are injected.
  • the electron transport layer 2f is a layer that efficiently transports electrons to the light emitting layer 2e.
  • the light emitting layer 2e contains an organic substance.
  • a cathode electrode layer 39 is arranged on the display layer 2a shown in FIG.
  • the cathode electrode layer 39 is formed of a transparent conductive layer like the anode electrode layer 38.
  • The transparent conductive layer of the anode electrode layer 38 is formed of, for example, ITO / Ag / ITO, and the transparent electrode layer of the cathode electrode layer 39 is likewise formed of a transparent conductive material.
  • a fifth insulating layer 40 is arranged on the cathode electrode layer 39.
  • the fifth insulating layer 40 is formed of an insulating material that flattens the upper surface and has excellent moisture resistance.
  • a second transparent substrate 41 is arranged on the fifth insulating layer 40.
  • In this way, the anode electrode layer 38, which functions as a reflective film, is arranged over almost the entire area of the color pixel 7, so visible light cannot pass through.
  • FIG. 11 is a plan layout view of pixels 7 in the first pixel area 6 in which the sensor 5 is arranged directly below.
  • One pixel 7 has a plurality of color pixels 7 (for example, three color pixels 7 of RGB).
  • FIG. 11 shows a planar layout of a total of four color pixels 7, two color pixels 7 in the horizontal direction and two color pixels 7 in the vertical direction.
  • Each color pixel 7 has a first light emitting region 6b and a non-light emitting region 6c.
  • the first light emitting region 6b is a region including a pixel circuit 12 having a first self-luminous element 6a (OLED5) and being emitted by the OLED 5.
  • the non-light emitting region 6c is a region through which visible light is transmitted.
  • the non-light emitting region 6c cannot emit the light from the OLED 5, but can transmit the incident visible light. Therefore, by arranging the sensor 5 directly below the non-light emitting region 6c, the sensor 5 can receive visible light.
  • FIG. 12 is a cross-sectional view of the pixel 7 in the first pixel region 6 in which the sensor 5 is arranged directly below.
  • FIG. 12 shows the cross-sectional structure of FIG. 11 in the AA line direction, and shows the cross-sectional structure from the first light emitting region 6b to the non-light emitting region 6c.
  • In the non-light emitting region 6c, the third insulating layer 36, the fourth insulating layer 37, the anode electrode layer 38, the display layer 2a, and the cathode electrode layer 39 are removed. Therefore, light incident on the non-light emitting region 6c from above (the display surface side) in FIG. 12 is emitted from the bottom surface (back surface side) and enters the sensor 5 without being absorbed or reflected in the non-light emitting region 6c.
  • However, a part of the light incident on the first pixel region 6 strikes not only the non-light emitting region 6c but also the first light emitting region 6b, where it is diffracted and generates diffracted light.
  • FIG. 13 is a diagram illustrating a diffraction phenomenon that generates diffracted light.
  • Parallel light such as sunlight or highly directional light is diffracted at the boundary between the non-light emitting region 6c and the first light emitting region 6b to generate high-order diffracted light including the primary diffracted light.
  • The 0th-order diffracted light travels in the optical axis direction of the incident light and has the highest intensity among the diffracted light; in short, the 0th-order diffracted light is the subject light that is to be captured. The higher the order of the diffracted light, the farther it travels from the 0th-order diffracted light and the weaker its intensity becomes.
  • higher-order diffracted light including the first-order diffracted light is collectively called diffracted light.
  • the diffracted light is light that does not originally exist in the subject light, and is unnecessary light for photographing the subject 10.
  • The brightest bright spot is the 0th-order light, and the higher-order diffracted light spreads from the 0th-order diffracted light in a cross shape.
  • When the subject light is white light, the diffraction angle differs for each of the wavelength components contained in the white light, so iridescent diffracted light f is generated (see the illustrative calculation below).
  • The shape of the diffracted light appearing in the captured image is, for example, a cross shape, but the shape of the diffracted light f is determined by the shape of the portion of the non-light emitting region 6c through which the light passes, as described later. If the shape of the transmitting portion is known, the shape of the diffracted light can be estimated by simulation from the diffraction principle.
  • In the planar layout of FIG. 11, light transmission regions exist not only in the non-light emitting region 6c but also in wiring gaps and around the first light emitting region 6b. When irregularly shaped light transmission regions exist at multiple locations within the pixel 7 in this way, the incident light is diffracted in a complicated manner and the shape of the diffracted light f also becomes complicated.
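  • The wavelength dependence behind the iridescent diffracted light can be illustrated with the single-slit relation sin(theta) = lambda / d for the first intensity minimum; the angular scale of the whole pattern grows in proportion to lambda / d. The window width below is an assumed example value, not a dimension from the disclosure.

```python
import math

# Angular position of the first single-slit minimum, sin(theta) = wavelength / d,
# for an assumed effective width d of the transmission window 6d.
d = 5e-6  # assumed effective window width [m]
for name, wavelength in [("blue", 450e-9), ("green", 550e-9), ("red", 650e-9)]:
    theta = math.degrees(math.asin(wavelength / d))
    print(f"{name}: first minimum at ~ {theta:.2f} deg from the 0th-order direction")
# The angle grows with wavelength, so the diffraction lobes of white subject light
# separate by color, which is why the diffracted light f appears iridescent around
# the 0th-order spot.
```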
  • FIG. 14 is a planar layout diagram of the image display device 1 according to an embodiment that solves the problem that may occur in the planar layout of FIG. 11.
  • In FIG. 14, the anode electrode 12a is arranged over the entire first light emitting region 6b in the first pixel region 6 so that light does not pass through it, and a transmission window 6d having a predetermined shape is provided in the non-light emitting region 6c; subject light passes only through the inside of the transmission window 6d.
  • FIG. 14 shows an example in which the periphery of the transmission window 6d in the non-light emitting region 6c is covered with the anode electrode 12a, but as will be described later, the member defining the shape of the transmission window 6d is not necessarily the anode electrode 12a.
  • In FIG. 14, the planar shape of the transmission window 6d is rectangular.
  • The planar shape of the transmission window 6d is preferably as simple as possible. The simpler the shape, the simpler the directions in which the diffracted light f is generated, and the diffracted-light shape can be obtained in advance by simulation (a sketch of such a simulation follows).
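  • One hedged way to obtain the diffracted-light shape in advance is the Fraunhofer (far-field) approximation, in which the intensity pattern is proportional to the squared magnitude of the 2-D Fourier transform of the aperture. The sketch below uses NumPy and an assumed rectangular window; it illustrates the principle and is not the specific simulation method of this disclosure.

```python
import numpy as np

# Far-field (Fraunhofer) sketch: |FFT2(aperture)|^2 approximates the diffracted-light
# shape produced by a given transmission-window shape.
N = 512
aperture = np.zeros((N, N))
aperture[N//2 - 8 : N//2 + 8, N//2 - 24 : N//2 + 24] = 1.0   # assumed rectangular window 6d

pattern = np.abs(np.fft.fftshift(np.fft.fft2(aperture))) ** 2
pattern /= pattern.max()

# A rectangular window concentrates energy along two orthogonal axes (the cross-shaped
# diffracted light f of FIGS. 29A/29B); a circular aperture instead gives concentric
# rings as in FIGS. 30A/30B.
print("0th-order peak at:", np.unravel_index(int(pattern.argmax()), pattern.shape))
```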
  • the planar layout of FIG. 15 is more desirable than that of FIG.
  • the shape of the transmission window 6d in the non-light emitting region 6c in the first pixel region 6 in which the sensor 5 is arranged directly below can be defined by any of a plurality of members.
  • FIG. 16 is a cross-sectional view showing a first example of the cross-sectional structure of the first pixel region 6.
  • FIG. 16 shows an example in which the shape of the transmission window 6d in the non-light emitting region 6c is defined by the anode electrode 12a (anode electrode layer 38).
  • the end portion of the anode electrode layer 38 is formed in a rectangular shape as shown in FIG. 14 when viewed in a plan view from the display surface side.
  • the shape of the transmission window 6d is defined by the end portion of the anode electrode layer 38.
  • In FIG. 16, the third insulating layer 36 and the fourth insulating layer 37 inside the transmission window 6d are left as they are. When the materials of the third insulating layer 36 and the fourth insulating layer 37 are colored resin layers, the visible light transmittance may decrease, but since at least part of the visible light is still transmitted, the third insulating layer 36 and the fourth insulating layer 37 inside the transmission window 6d may be left in place.
  • FIG. 17 is a cross-sectional view showing a second example of the cross-sectional structure of the first pixel region 6.
  • In FIG. 17, the shape of the transmission window 6d is defined by the end portion of the anode electrode layer 38, as in FIG. 16. FIG. 17 differs from FIG. 16 in that the fourth insulating layer 37 is removed inside the transmission window 6d. Since the fourth insulating layer 37 does not exist inside the transmission window 6d, absorption and reflection of light passing through the fourth insulating layer 37 can be suppressed, the amount of light incident on the sensor 5 can be increased, and the light receiving sensitivity of the sensor 5 becomes higher.
  • FIG. 18 is a cross-sectional view showing a third example of the cross-sectional structure of the first pixel region 6.
  • In FIG. 18, the shape of the transmission window 6d is defined by the end portion of the anode electrode layer 38, as in FIGS. 16 and 17.
  • FIG. 18 differs from FIGS. 16 and 17 in that the third insulating layer 36 and the fourth insulating layer 37 are removed inside the transmission window 6d. Since the third insulating layer 36 and the fourth insulating layer 37 do not exist inside the transmission window 6d, the amount of light incident on the sensor 5 can be increased compared with FIG. 17, and the light receiving sensitivity of the sensor 5 can be further increased.
  • In FIG. 18, the end portion of the third insulating layer 36 arranged under the anode electrode layer 38 is provided at substantially the same position as the end portion of the anode electrode layer 38. However, the end of the third insulating layer 36 may protrude toward the transmission window 6d beyond the end of the anode electrode layer 38. In that case, it becomes ambiguous whether the shape of the transmission window 6d is defined by the end of the anode electrode layer 38 or by the end of the third insulating layer 36, and the way the diffracted light f is generated may change depending on how the end of the third insulating layer 36 protrudes.
  • FIG. 19 is a plan layout view according to a first modification of FIG. 14, and FIG. 20 is a cross-sectional view taken along line A-A of FIG. 19. FIG. 20 shows a fourth example of the cross-sectional structure of the first pixel region 6.
  • the shape of the transmission window 6d is defined by the end portion of the second wiring layer (M2) 35 arranged below the third insulating layer 36.
  • The second wiring layer (M2) 35 is formed in a rectangular shape when viewed in plan from the display surface side. Since the second wiring layer (M2) 35 is made of a metal material such as aluminum that does not transmit visible light, the light incident on the first pixel region 6 passes through the inside of the transmission window 6d and enters the sensor 5.
  • Since the second wiring layer (M2) 35 is arranged on the transmission window 6d side of the third insulating layer 36, the shape of the transmission window 6d can be defined by the second wiring layer (M2) 35 even if there are manufacturing variations.
  • FIG. 21 is a plan layout view according to the second modification of FIG. 14, and FIG. 22 is a sectional view taken along line AA of FIG. 21.
  • the shape of the transmission window 6d is defined by the end portion of the first wiring layer (M1) 33 arranged below the third insulating layer 36.
  • The first wiring layer (M1) 33 is formed in a rectangular shape when viewed in plan from the display surface side. Since the first wiring layer (M1) 33 is made of a metal material such as aluminum that does not transmit visible light, the light incident on the first pixel region 6 passes through the inside of the transmission window 6d and enters the sensor 5.
  • FIGS. 19 to 22 show examples in which the shape of the transmission window 6d is defined by the end portion of a wiring layer. In such cases, a capacitor may be formed by the wiring layer that defines the shape of the transmission window 6d. As a result, it is not necessary to form a capacitor separately, and the cross-sectional structure of the image display device 1 can be simplified.
  • FIG. 23 is a plan layout view according to the third modification of FIG. 14, and FIG. 24 is a sectional view taken along line AA of FIG. 23.
  • the shape of the transmission window 6d is defined by the first wiring layer (M1) 33.
  • a metal layer 44 is arranged so as to sandwich the first insulating layer 32 directly above the first wiring layer (M1) 33 provided to define the shape of the transmission window 6d to form a capacitor 43.
  • the capacitor 43 can be used as a capacitor provided in the pixel circuit 12.
  • the capacitor 43 shown in FIG. 24 can be used, for example, as the pixel capacitance Cs in the pixel circuit 12 of FIG. 7.
  • FIG. 25 is a circuit diagram showing a first example of a detailed circuit configuration of the pixel circuit 12.
  • the pixel circuit 12 of FIG. 25 has three transistors Q3 to Q5 in addition to the drive transistor Q1 and the sampling transistor Q2 shown in FIG. 7.
  • the drain of the transistor Q3 is connected to the gate of the drive transistor Q1, the source of the transistor Q3 is set to the voltage V1, and the gate signal Gate1 is input to the gate of the transistor Q3.
  • the drain of the transistor Q4 is connected to the anode electrode 12a of the OLED 5, the source of the transistor Q4 is set to the voltage V2, and the gate signal Gate2 is input to the gate of the transistor Q4.
  • Transistors Q1 to Q4 are N-type transistors, while transistors Q5 are P-type transistors.
  • The source of the transistor Q5 is set to the power supply voltage Vccp, the drain of the transistor Q5 is connected to the drain of the drive transistor Q1, and the gate signal Gate3 is input to the gate of the transistor Q5.
  • FIG. 26 is a circuit diagram showing a second example of a detailed circuit configuration of the pixel circuit 12.
  • Each of the transistors Q1a to Q5a in the pixel circuit 12 of FIG. 26 has the opposite conductivity type to the corresponding transistor Q1 to Q5 in the pixel circuit 12 of FIG. 25.
  • Apart from the reversal of the transistor conductivity types, the pixel circuit 12 of FIG. 26 also differs in part from the pixel circuit 12 of FIG. 25.
  • FIGS. 25 and 26 are merely examples of the pixel circuit 12, and various changes in the circuit configuration are conceivable.
  • The capacitor 43 formed by the first wiring layer (M1) 33 of FIG. 24 and the metal layer immediately above it can be used as the capacitor Cs in the pixel circuit 12 of FIG. 25 or FIG. 26.
  • FIGS. 19 to 26 show examples in which a wiring layer constituting part of the pixel circuit 12 is used to define the shape of the transmission window 6d; however, a metal layer separate from the wiring layers of the pixel circuit 12 may instead be provided to define the shape of the transmission window 6d.
  • FIG. 27 is a plan layout view according to the fourth modification of FIG. 14, and FIG. 28 is a sectional view taken along line AA of FIG. 27.
  • the shape of the transmission window 6d is defined by the end portion of the third metal layer (M3) 45.
  • The third metal layer (M3) 45 that defines the shape of the transmission window 6d may form part of the wiring of the pixel circuit 12, or it may be a layer newly provided to define the shape of the transmission window 6d.
  • The patterns provided to define the opening shapes in FIGS. 19, 21, and 27 are drawn as electrically floating, but since a floating pattern is easily affected by electrical factors such as potential coupling, it is recommended to connect it to some potential. For example, in terms of FIG. 25, a fixed DC potential (Vccp, Vcath, V1, V2) is the first recommendation, the anode potential is the second recommendation, and other wirings and nodes are the third recommendation.
  • Due to wiring constraints of the pixel circuit 12, the first wiring layer (M1) 33 and the second wiring layer (M2) 35 used as wiring of the pixel circuit 12 may not be able to be placed so as to give the transmission window 6d its ideal shape. Therefore, in FIG. 27, a third wiring layer (M3) 45 is newly provided, and the end portion of the third wiring layer (M3) 45 is arranged at a position where the shape of the transmission window 6d becomes ideal. Thereby, the transmission window 6d can be given an ideal shape without changing the first wiring layer (M1) 33 and the second wiring layer (M2) 35.
  • FIG. 29A shows an example in which the transmission window 6d is rectangular
  • FIG. 29B shows an example of diffracted light f generated when parallel light is incident on the transmission window 6d in FIG. 29A.
  • When the transmission window 6d is rectangular, cross-shaped diffracted light f is generated.
  • FIG. 30A shows an example in which the transmission window 6d is circular
  • FIG. 30B shows an example of diffracted light f generated when parallel light is incident on the transmission window 6d in FIG. 30A.
  • In this case, the diffracted light f is generated concentrically. The higher the order of the diffracted light f, the larger its diameter and the weaker its intensity.
  • the number of transmission windows 6d in the non-light emitting region 6c is not necessarily limited to one.
  • a plurality of transmission windows 6d may be provided in the non-light emitting region 6c.
  • FIG. 31A is a diagram showing an example in which a plurality of circular transmission windows 6d are provided in the non-light emitting region 6c.
  • FIG. 31B shows an example of the diffracted light f generated when parallel light is incident on each transmission window 6d of FIG. 31A.
  • When a plurality of transmission windows 6d are provided, the light intensity at the central portion of the diffracted light f is weakened, and the diffracted light f is generated concentrically.
  • Although FIG. 31A shows an example in which a plurality of circular transmission windows 6d are provided, a plurality of transmission windows 6d having a shape other than circular may be provided. In that case, the shape of the diffracted light f differs from that in FIG. 31B.
  • As shown in FIGS. 29A and 29B, when the transmission window 6d is rectangular, cross-shaped diffracted light f is generated. A method can therefore be considered in which a plurality of transmission windows 6d whose rectangles are oriented in different directions are provided and the diffracted light f generated by the plurality of transmission windows 6d is combined so as to remove the diffracted light f.
  • FIG. 32 is a diagram showing a first example of removal of diffracted light f.
  • In FIG. 32, two image sensor modules 9 are arranged directly under the display panel 2, and transmission windows 6d with different rectangular orientations are arranged in the non-light emitting regions 6c of the pixels 7 in the two first pixel regions 6 located directly above the respective image sensor modules 9.
  • In the non-light emitting region 6c of the first pixel region 6 located directly above the left image sensor module 9, a rectangular transmission window 6d is arranged substantially parallel to the boundary line of the pixel 7. In the non-light emitting region 6c of the first pixel region 6 located directly above the right image sensor module 9, a rectangular transmission window 6d is arranged in a direction inclined by 45 degrees with respect to the boundary line of the pixel 7.
  • As a result, the cross shape of the diffracted light f1 incident on the left image sensor module 9 and the cross shape of the diffracted light f2 incident on the right image sensor module 9 differ from each other by 45 degrees. More specifically, the diffracted light f2 is not generated in the directions in which the diffracted light f1 is generated, and the diffracted light f1 is not generated in the directions in which the diffracted light f2 is generated. Therefore, by combining the captured image g1 of the diffracted light f1 from the left image sensor module 9 and the captured image g2 of the diffracted light f2 from the right image sensor module 9, the diffracted light f other than the light spot of the 0th-order diffracted light at the center position can be removed, as shown in the composite image g3 of FIG. 32.
  • In FIG. 32, transmission windows 6d of the same size and shape are arranged at different angles so that the diffracted light f is generated in different directions, and the captured images are combined to cancel the diffracted light f (see the sketch of this combining step below).
  • However, the sizes and shapes of the plurality of transmission windows 6d whose images are combined do not necessarily have to be the same.
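  • A minimal sketch of this combining step, assuming two aligned captures of the same scene whose diffraction streaks lie in different directions: taking the per-pixel minimum suppresses a streak that appears in only one capture while keeping the 0th-order subject light that appears in both. The toy arrays and the minimum rule are illustrative assumptions, not the synthesis procedure specified in the disclosure.

```python
import numpy as np

def combine_captures(g1: np.ndarray, g2: np.ndarray) -> np.ndarray:
    """Per-pixel minimum of two aligned captures: a diffraction streak present in
    only one of g1/g2 is replaced by the darker (streak-free) value of the other,
    while the 0th-order subject light, present in both, is preserved."""
    return np.minimum(g1, g2)

# Toy example: a bright 0th-order spot plus a horizontal streak in g1 and a
# vertical streak in g2 (stand-ins for the cross-shaped f1 and the rotated f2).
g1 = np.zeros((101, 101)); g2 = np.zeros((101, 101))
g1[50, 50] = g2[50, 50] = 1.0        # 0th-order light, common to both captures
g1[50, :] += 0.3                      # streak of diffracted light f1
g2[:, 50] += 0.3                      # streak of diffracted light f2 (different direction)

g3 = combine_captures(g1, g2)
print("streak level after combining:", g3[50, 10], g3[10, 50])  # ~0 away from the center
print("0th-order level preserved:", g3[50, 50])
```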
  • FIG. 33 is a diagram showing a second example of removal of diffracted light f.
  • In FIG. 33, two image sensor modules 9 are arranged directly under the display panel 2, and transmission windows 6d having different shapes and orientations are arranged in the two first pixel regions 6 located directly above the image sensor modules 9.
  • A transmission window 6d is provided in almost the entire non-light emitting region 6c of the first pixel region 6 located directly above the left image sensor module 9. Since the non-light emitting region 6c is rectangular, the shape of this transmission window 6d is also rectangular.
  • In the non-light emitting region 6c of the first pixel region 6 located directly above the right image sensor module 9, a transmission window 6d that is smaller than that of the left first pixel region 6 is provided in a direction inclined by 45 degrees with respect to the boundary line of the pixel 7.
  • Although the size of the transmission window 6d differs between the left and right sides and the inclination direction also differs, the shapes of the diffracted light f3 and f4 are almost the same as those of the diffracted light f1 and f2 in FIG. 32 (captured images g4 and g5). Therefore, as in FIG. 32, by combining the images of the diffracted light f captured by the respective image sensors 9b, the diffracted light f other than the light spot of the 0th-order diffracted light can be removed, as shown in the combined image g6.
  • The examples described so far provide one or more transmission windows 6d per pixel 7 (or per color pixel 7), but one or more transmission windows 6d may instead be provided for a unit of a plurality of pixels 7 (or a plurality of color pixels 7).
  • FIG. 34 is a diagram showing an example in which one transmission window 6d is provided so as to straddle three pixels 7 (or three color pixels 7).
  • In this example, the shape of the transmission window 6d is defined by the end portion of the second wiring layer (M2) 35.
  • FIG. 35 is a diagram showing a third example of removal of diffracted light f.
  • In the third example, two image sensor modules 9 are arranged directly under the display panel 2, and transmission windows 6d having different shapes and orientations are arranged in the non-light emitting region 6c of each pixel 7 in the two first pixel regions 6 located directly above the image sensor modules 9.
  • In the non-light emitting region 6c of the first pixel region 6 located directly above the image sensor module 9 on the left side, a rectangular transmission window 6d is provided with a size straddling three pixels 7 (or three color pixels 7).
  • In the non-light emitting region 6c of the first pixel region 6 located directly above the image sensor module 9 on the right side, three transmission windows 6d, each smaller than the one in the first pixel region 6 on the left side and inclined by 45 degrees with respect to the boundary line of the pixel 7, are provided so as to straddle three pixels 7 (or three color pixels 7).
  • The resulting diffracted light f5 and f6 is substantially the same as the diffracted light f1 and f2 of FIG. 32 (captured images g7 and g8), so the captured images can again be combined to remove the diffracted light f other than the 0th-order light spot.
  • As described above, the transmission window 6d having a predetermined shape is provided in the non-light emitting region 6c of the first pixel region 6, so that the generation direction of the diffracted light f can be predicted in advance.
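  • As a source-external aside, the predictability of the flare direction follows from standard Fraunhofer diffraction: for a rectangular transmission window of width $a$ along one edge direction, the diffracted intensity spreads along that direction with minima at

$$\sin\theta_m = \frac{m\lambda}{a}, \qquad m = \pm 1, \pm 2, \ldots$$

and when such windows repeat with the pixel pitch $p$, discrete diffraction orders appear at $\sin\theta_m = m\lambda/p$. Because the spread is tied to the window edges, rotating the window by 45 degrees rotates the cross-shaped flare by 45 degrees, which is the property exploited in FIGS. 32, 33, and 35.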
  • On the other hand, because only the light that passes through the transmission window 6d reaches the sensor 5, the amount of light received by the sensor 5 is limited, and there is a concern that the detection sensitivity of the sensor 5 may decrease. It is therefore desirable to take measures to concentrate as much of the light incident on the first pixel region 6 as possible onto the transmission window 6d.
  • FIG. 36 is a cross-sectional view showing an example in which the microlens (optical system) 20 is arranged on the light incident side of the first pixel region 6.
  • the microlens 20 is arranged on the second transparent substrate 41 of the display panel 2, or is formed by processing the second transparent substrate 41.
  • the microlens 20 can be formed by arranging a resist on a transparent resin material having excellent visible light transmittance and performing wet etching or dry etching.
  • FIG. 37A is a diagram showing, with arrow lines, the traveling direction of the light incident on the first pixel region 6 when the microlens 20 is not provided, and FIG. 37B is a diagram showing, with arrow lines, the traveling direction of the light when the microlens 20 of FIG. 36 is provided. Without the microlens 20, light incident on the opaque members in the first pixel region 6 cannot pass through the transmission window 6d, so the amount of light transmitted through the transmission window 6d is reduced. When the microlens 20 is provided, on the other hand, parallel light incident on the microlens 20 is refracted toward the focal point of the microlens 20. Therefore, by optimizing the curvature of the microlens 20 and adjusting the focal position, the amount of light transmitted through the transmission window 6d can be increased.
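  • As a rough, source-external guide to what optimizing the curvature means: for a plano-convex microlens of radius of curvature $R$ and refractive index $n$ used in air, the paraxial focal length is approximately

$$f \approx \frac{R}{n - 1}$$

so assumed values of $R = 30\ \mu\text{m}$ and $n = 1.5$ give $f \approx 60\ \mu\text{m}$. In the actual layer stack the surrounding transparent media would modify this value, but the curvature remains the parameter that places the focal region on the transmission window 6d.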
  • FIG. 38 is a diagram showing, with arrow lines, the traveling direction of the light refracted by the microlens 20. As shown in the figure, because the microlens 20 refracts the light, part of the light that passes through the transmission window 6d reaches a location away from the light receiving surface of the sensor 5, so there is a risk that the light incident on the microlens 20 cannot be used effectively.
  • To address this, as shown in FIG. 39, a plurality of microlenses 20a and 20b having different convex directions may be arranged on the light incident side of the first pixel region 6.
  • In this case, the light refracted by the first microlens 20a is converted by the second microlens 20b into parallel light having a small beam diameter and is made incident on the transmission window 6d.
  • By setting the curvature of the second microlens 20b according to the size of the transmission window 6d, parallel light can be made incident over the entire area of the transmission window 6d, and light with little image distortion can be received by the sensor 5.
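  • One way to quantify the reduced beam diameter, under the source-external assumption that the two microlenses act as positive thin lenses sharing a common focal point (a Keplerian beam compressor), is

$$\frac{D_{\text{out}}}{D_{\text{in}}} = \frac{f_{2}}{f_{1}}$$

where $f_1$ and $f_2$ are the focal lengths of the first and second microlenses and $D_{\text{in}}$, $D_{\text{out}}$ are the input and output beam diameters. Choosing $f_2$ (that is, the curvature of the second microlens 20b) so that $D_{\text{out}}$ matches the width of the transmission window 6d is one concrete reading of the statement above.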
  • the two microlenses 20a and 20b shown in FIG. 39 can be formed by, for example, laminating transparent resin layers, one of which is processed by wet etching, and the other of which is processed by dry etching.
  • Alternatively, as shown in FIG. 40, a microlens (first optical system) 20a may be provided on the light incident side of the first pixel region 6, and another microlens (second optical system) 20b may be arranged on the light emitting side of the first pixel region 6.
  • the direction of the convex shape is opposite between the microlens 20a on the light incident side and the microlens 20b on the light emitting side.
  • In this arrangement, the light incident on the first microlens 20a is refracted and passes through the transmission window 6d, and is then converted into parallel light by the second microlens 20b and made incident on the sensor 5.
  • In this case, for example, a first transparent resin layer is processed by wet etching or dry etching to form the second microlens 20b, the respective layers above it are then formed, and a second transparent resin layer is then processed by wet etching or dry etching to form the first microlens 20a.
  • FIG. 41 is a diagram showing the traveling direction of light passing through the two microlenses 20a and 20b of FIG. 40 by arrow lines.
  • As shown in FIG. 41, the light refracted by the first microlens 20a passes through the transmission window 6d and is then converted into parallel light by the second microlens 20b and made incident on the sensor 5.
  • As a result, the light incident on the first microlens 20a can be made incident on the sensor 5 without loss, and the light receiving sensitivity of the sensor 5 can be improved.
  • As described above, in the first embodiment, the non-light emitting region 6c is provided in the first pixel region 6 located directly above the sensor 5 arranged on the back surface side of the display panel 2, and a transmission window 6d having a predetermined shape is provided in the non-light emitting region 6c.
  • the light incident on the first pixel region 6 passes through the transmission window 6d and is incident on the sensor 5.
  • Diffracted light f is generated when light is transmitted through the transmission window 6d.
  • However, since the transmission window 6d has a predetermined shape, the generation direction of the diffracted light f can be estimated in advance, and the influence of the diffracted light f can be removed from the light receiving signal of the sensor 5.
  • In particular, the diffracted light f imprinted on the image data captured by the image sensor module 9 can be removed by image processing, because the generation direction of the diffracted light f can be estimated in advance.
  • Since the transmission window 6d in the non-light emitting region 6c can be defined by the end portion of the anode electrode 12a or the end portion of the wiring layer, a transmission window 6d having a desired shape and size can be formed relatively easily. Furthermore, since a plurality of transmission windows 6d having different shapes can be formed in the non-light emitting regions 6c of the first pixel region 6, it is also possible to cancel the influence of the diffracted light f by combining the images of the diffracted light f generated by the transmission windows 6d having different shapes.
  • In addition, when the microlens 20 is arranged on the light incident side of the first pixel region 6, the light incident on the first pixel region 6 is refracted by the microlens 20 and guided to the transmission window 6d of the non-light emitting region 6c, which makes it possible to increase the amount of light transmitted through the transmission window 6d. Furthermore, by providing a plurality of microlenses 20 along the incident direction of the light, the light transmitted through the transmission window 6d can be guided to the light receiving surface of the sensor 5, the amount of light received by the sensor 5 can be increased, and the light receiving sensitivity of the sensor 5 can be improved.
  • FIG. 42 is a plan view when the electronic device 50 of the first embodiment is applied to a capsule endoscope.
  • The capsule endoscope 50 of FIG. 42 includes, for example, within a housing 51 having hemispherical surfaces at both ends and a cylindrical central portion, a camera (ultra-compact camera) 52 for capturing an image inside a body cavity, a memory 53 for recording the image data captured by the camera 52, a wireless transmitter 55 with an antenna 54, a CPU (Central Processing Unit) 56, and a coil (magnetic-force/current conversion coil) 57.
  • the CPU 56 controls the shooting by the camera 52 and the data storage operation in the memory 53, and also controls the data transmission from the memory 53 to the data receiving device (not shown) outside the housing 51 by the wireless transmitter 55.
  • the coil 57 supplies electric power to the camera 52, the memory 53, the wireless transmitter 55, the antenna 54, and the light source 52b described later.
  • the housing 51 is provided with a magnetic (reed) switch 58 for detecting when the capsule endoscope 50 is set in the data receiving device.
  • When the reed switch 58 detects that the capsule endoscope 50 has been set in the data receiving device and data transmission has become possible, the CPU 56 supplies electric power from the coil 57 to the wireless transmitter 55.
  • the camera 52 has, for example, an image pickup element 52a including an objective optical system for capturing an image in the body cavity, and a plurality of light sources 52b for illuminating the inside of the body cavity.
  • The camera 52 is configured, for example, by a CMOS (Complementary Metal Oxide Semiconductor) sensor or a CCD (Charge Coupled Device) equipped with LEDs (Light Emitting Diodes) as the light sources 52b.
  • The display unit 3 in the electronic device 50 of the first embodiment is a concept that includes a light emitting body such as the light sources 52b of FIG. 42.
  • The capsule endoscope 50 of FIG. 42 has, for example, two light sources 52b, and these light sources 52b can be configured by a display panel having a plurality of light source units or by an LED module having a plurality of LEDs. In this case, by arranging the image pickup unit of the camera 52 below the display panel or the LED module, restrictions on the layout of the camera 52 are reduced, and a smaller capsule endoscope 50 can be realized.
  • FIG. 43 is a rear view when the electronic device 50 of the first embodiment is applied to the digital single-lens reflex camera 60.
  • A digital single-lens reflex camera 60 or a compact camera is provided with a display unit 3 that displays a preview screen on the back surface opposite to the lens.
  • the camera modules 4 and 5 may be arranged on the side opposite to the display surface of the display unit 3 so that the photographer's face image can be displayed on the display surface of the display unit 3.
  • Since the camera modules 4 and 5 can be arranged in a region overlapping the display unit 3, it is not necessary to provide the camera modules 4 and 5 in the frame portion of the display unit 3, and the size of the display unit 3 can be made as large as possible.
  • FIG. 44A is a plan view showing an example in which the electronic device 50 of the first embodiment is applied to a head-mounted display (hereinafter, HMD) 61.
  • The HMD 61 of FIG. 44A is used for VR (Virtual Reality), AR (Augmented Reality), MR (Mixed Reality), SR (Substitutional Reality), and the like.
  • A current HMD has a camera 62 mounted on its outer surface, so that the wearer of the HMD can visually recognize an image of the surroundings, but the people around the wearer cannot see the expression of the wearer's eyes and face.
  • In contrast, in the HMD 61 of FIG. 44A, the display surface of the display unit 3 is provided on the outer surface of the HMD 61, and the camera modules 4 and 5 are provided on the side opposite to the display surface of the display unit 3.
  • As a result, the facial expression of the wearer captured by the camera modules 4 and 5 can be displayed on the display surface of the display unit 3, and the people around the wearer can grasp the wearer's facial expression and eye movements in real time.
  • As described above, the electronic device 50 according to the first embodiment can be used for various purposes, and its utility value can be enhanced.
  • The present technology can also have the following configurations.
    (1) An image display device comprising a plurality of pixels arranged two-dimensionally, wherein at least some of the plurality of pixels have: a first self-luminous element; a first light emitting region that emits light from the first self-luminous element; and a non-light emitting region having a transmission window of a predetermined shape that transmits visible light.
    (2) The image display device according to (1), comprising two or more pixels having the non-light emitting regions whose transmission windows differ in shape from each other.
    (3) The image display device according to (1) or (2), wherein the non-light emitting region is arranged at a position overlapping a light receiving device that receives light incident through the image display device, when viewed in a plan view from the display surface side of the image display device.
  • (8) The image display device according to any one of (1) to (7), comprising an optical member that is arranged on the light incident side of the transmission window and refracts the incident light to guide it to the transmission window.
  • (9) The image display device according to (8), wherein the optical member has: a first optical system that refracts the incident light toward the optical axis direction; and a second optical system that parallelizes the light refracted by the first optical system.
  • (10) The image display device according to (9), wherein the transmission window transmits the light parallelized by the second optical system.
  • A first pixel region including some of the plurality of pixels, and a second pixel region including at least some of the pixels other than the pixels in the first pixel region among the plurality of pixels, are provided; the pixels in the first pixel region have the first self-luminous element, the first light emitting region, and the non-light emitting region, and the pixels in the second pixel region have a second self-luminous element and
  • The first self-luminous element has: a lower electrode layer; a display layer arranged on the lower electrode layer; an upper electrode layer arranged on the display layer; and a wiring layer that is arranged below the lower electrode layer and is electrically connected to the lower electrode layer via a contact extending from the lower electrode layer in the stacking direction.
  • The image display device according to any one of (1) to (13), wherein the shape of the transmission window when viewed in a plan view from the display surface side of the plurality of pixels is defined by an end portion of the wiring layer.
  • (16) The image display device, wherein the wiring layer has a plurality of laminated metal layers, and the shape of the transmission window when viewed in a plan view from the display surface side of the plurality of pixels is defined by the end portion of at least one metal layer of the plurality of metal layers.
  • the metal layer defining the shape of the transmission window when viewed in a plan view from the display surface side of the plurality of pixels is an electrode of a capacitor in a pixel circuit.
  • the image display device according to any one of (14) to (18), wherein the entire area of the first light emitting region is covered with the lower electrode layer except for the region of the transmission window.
  • An electronic device comprising: an image display device having a plurality of pixels arranged two-dimensionally; and a light receiving device that receives light incident through the image display device, wherein the image display device has a first pixel region including some of the plurality of pixels, the pixels in the first pixel region have the first self-luminous element, the first light emitting region that emits light from the first self-luminous element, and the non-light emitting region having a transmission window of a predetermined shape that transmits visible light, and the light receiving device includes an image sensor that photoelectrically converts light incident through the non-light emitting region, a distance measuring sensor that receives light incident through the non-light emitting region and measures a distance, and the non-light emitting
  • 1 image display device, 2 display panel, 2a display layer, 5 sensor, 6 first pixel region, 6a first self-luminous element, 6b first light emitting region, 6c non-light emitting region, 6d transmission window, 7 pixel, 8 second pixel region, 8a second self-luminous element, 8b second light emitting region, 9 image sensor module, 9a support substrate, 9b image sensor, 9c cut filter, 9d lens unit, 9e coil, 9f magnet, 9g spring, 10 subject, 11 specific pixel, 12 pixel circuit, 12a anode electrode, 31 first transparent substrate, 32 first insulating layer, 33 first wiring layer, 34 second insulating layer, 35 second wiring layer, 36 third insulating layer, 36a trench, 37 fourth insulating layer, 38 anode electrode layer, 39 cathode electrode layer, 40 fifth insulating layer, 41 second transparent substrate, 42 semiconductor layer, 43 capacitor, 44 metal layer, 45 third metal layer

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Chemical & Material Sciences (AREA)
  • Inorganic Chemistry (AREA)
  • Geometry (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Sustainable Development (AREA)
  • General Physics & Mathematics (AREA)
  • Power Engineering (AREA)
  • Optics & Photonics (AREA)
  • Devices For Indicating Variable Information By Combining Individual Elements (AREA)
  • Computer Hardware Design (AREA)
  • Condensed Matter Physics & Semiconductors (AREA)
  • Electroluminescent Light Sources (AREA)
  • Thin Film Transistor (AREA)
  • Solid State Image Pick-Up Elements (AREA)

Abstract

The problem addressed by the present invention is to provide an image display device capable of suppressing the generation of diffracted light, and an electronic apparatus. To this end, the invention relates to an image display device comprising a plurality of pixels arranged two-dimensionally, at least some of the plurality of pixels having: a first self-luminous element; a first light emitting region in which light is emitted by the first self-luminous element; and a non-light emitting region having a transmission window that transmits visible light and has a prescribed shape.
PCT/JP2021/031006 2020-09-03 2021-08-24 Dispositif d'affichage d'image et appareil électronique WO2022050132A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112021004550.4T DE112021004550T5 (de) 2020-09-03 2021-08-24 Bildanzeigevorrichtung und elektronische einrichtung
KR1020237005230A KR20230061348A (ko) 2020-09-03 2021-08-24 화상 표시 장치 및 전자 기기
US18/042,388 US20230329036A1 (en) 2020-09-03 2021-08-24 Image display device and electronic device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-148536 2020-09-03
JP2020148536A JP2023156540A (ja) 2020-09-03 2020-09-03 画像表示装置及び電子機器

Publications (1)

Publication Number Publication Date
WO2022050132A1 true WO2022050132A1 (fr) 2022-03-10

Family

ID=80490868

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/031006 WO2022050132A1 (fr) 2020-09-03 2021-08-24 Dispositif d'affichage d'image et appareil électronique

Country Status (5)

Country Link
US (1) US20230329036A1 (fr)
JP (1) JP2023156540A (fr)
KR (1) KR20230061348A (fr)
DE (1) DE112021004550T5 (fr)
WO (1) WO2022050132A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2023112780A1 (fr) * 2021-12-13 2023-06-22 ソニーセミコンダクタソリューションズ株式会社 Dispositif d'affichage d'image et appareil électronique
WO2024030450A1 (fr) * 2022-08-01 2024-02-08 Applied Materials, Inc. Caméra sans encadrement et trou de capteur

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005010407A (ja) * 2003-06-18 2005-01-13 Canon Inc 撮像装置付き表示装置
JP2011118330A (ja) * 2009-11-02 2011-06-16 Sony Corp 撮像装置付き画像表示装置
WO2018168231A1 (fr) * 2017-03-14 2018-09-20 富士フイルム株式会社 Filtre bloquant l'infrarouge proche, procédé de production de filtre bloquant l'infrarouge proche, élément d'imagerie à semi-conducteurs, module de caméra et dispositif d'affichage d'image
WO2020021399A1 (fr) * 2018-07-27 2020-01-30 株式会社半導体エネルギー研究所 Dispositif d'affichage, module d'affichage et appareil électronique
US20200111827A1 (en) * 2018-10-08 2020-04-09 Samsung Electronics Co., Ltd. Semiconductor device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101084198B1 (ko) 2010-02-24 2011-11-17 삼성모바일디스플레이주식회사 유기 발광 표시 장치


Also Published As

Publication number Publication date
KR20230061348A (ko) 2023-05-08
US20230329036A1 (en) 2023-10-12
DE112021004550T5 (de) 2023-06-22
JP2023156540A (ja) 2023-10-25

Similar Documents

Publication Publication Date Title
US20180012069A1 (en) Fingerprint sensor, fingerprint sensor package, and fingerprint sensing system using light sources of display panel
JP7007268B2 (ja) 表示パネルおよびその製造方法、並びに表示装置
US8502756B2 (en) Image display device with imaging unit
JP5644928B2 (ja) 撮像装置付き画像表示装置
WO2021244251A1 (fr) Substrat d'affichage et dispositif d'affichage
WO2021244266A1 (fr) Substrat d'affichage et dispositif d'affichage
WO2022050132A1 (fr) Dispositif d'affichage d'image et appareil électronique
TWI680397B (zh) 感測板及具有感測板的顯示器
US8937363B2 (en) Solid-state imaging device and electronic apparatus
JP2023531340A (ja) 表示基板及び表示装置
US20210014393A1 (en) Light emitting device, exposure system, imaging display device, imaging device, electronic device, and lighting device
US20230012412A1 (en) Display substrate and method of manufacturing the same, and display device
CN111834398B (zh) 显示模组和显示装置
JP2011150583A (ja) 撮像装置付き画像表示装置
WO2022049906A1 (fr) Dispositif d'affichage d'image et dispositif électronique
WO2023195351A1 (fr) Appareil d'affichage et dispositif électronique
WO2021111955A1 (fr) Dispositif électronique
JP2023181857A (ja) 発光装置、表示装置、光電変換装置、および、電子機器
CN115379610A (zh) 光学装置
KR20230144679A (ko) 표시 장치
JP2023165556A (ja) 発光装置、発光装置の製造方法、表示装置、光電変換装置、電子機器、照明装置、移動体、および、ウェアラブルデバイス
CN117135975A (zh) 显示面板及显示装置
JP2023161989A (ja) 半導体装置、発光装置、表示装置、光電変換装置、電子機器、照明装置、移動体、および、ウェアラブルデバイス
CN114374750A (zh) 电子设备
KR20240055949A (ko) 표시 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21864190

Country of ref document: EP

Kind code of ref document: A1

122 Ep: pct application non-entry in european phase

Ref document number: 21864190

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP