GB2489657A - A display device and sensor arrangement - Google Patents
- Publication number
- GB2489657A (application GB1022139.8 / GB201022139A)
- Authority
- GB
- United Kingdom
- Prior art keywords
- sensor
- display
- light
- transparent
- display device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/3406—Control of illumination source
- G09G3/342—Control of illumination source using several illumination sources separately controlled corresponding to different display panel areas, e.g. along one dimension such as lines
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/133—Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
- G02F1/1333—Constructional arrangements; Manufacturing methods
- G02F1/1335—Structural association of cells with optical devices, e.g. polarisers or reflectors
- G02F1/133524—Light-guides, e.g. fibre-optic bundles, louvered or jalousie light-guides
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F1/00—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics
- G02F1/01—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour
- G02F1/13—Devices or arrangements for the control of the intensity, colour, phase, polarisation or direction of light arriving from an independent light source, e.g. switching, gating or modulating; Non-linear optics for the control of the intensity, phase, polarisation or colour based on liquid crystals, e.g. single liquid crystal display cells
- G02F1/133—Constructional arrangements; Operation of liquid crystal cells; Circuit arrangements
- G02F1/1333—Constructional arrangements; Manufacturing methods
- G02F1/1335—Structural association of cells with optical devices, e.g. polarisers or reflectors
- G02F1/1336—Illuminating devices
- G02F1/133602—Direct backlight
- G02F1/133603—Direct backlight with LEDs
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/22—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources
- G09G3/30—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels
- G09G3/32—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED]
- G09G3/3208—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters using controlled light sources using electroluminescent panels semiconductive, e.g. using light-emitting diodes [LED] organic, e.g. using organic light-emitting diodes [OLED]
-
- G—PHYSICS
- G02—OPTICS
- G02F—OPTICAL DEVICES OR ARRANGEMENTS FOR THE CONTROL OF LIGHT BY MODIFICATION OF THE OPTICAL PROPERTIES OF THE MEDIA OF THE ELEMENTS INVOLVED THEREIN; NON-LINEAR OPTICS; FREQUENCY-CHANGING OF LIGHT; OPTICAL LOGIC ELEMENTS; OPTICAL ANALOGUE/DIGITAL CONVERTERS
- G02F2201/00—Constructional arrangements not provided for in groups G02F1/00 - G02F7/00
- G02F2201/58—Arrangements comprising a monitoring photodetector
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0233—Improving the luminance or brightness uniformity across the screen
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0242—Compensation of deficiencies in the appearance of colours
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/029—Improving the quality of display appearance by monitoring one or more pixels in the display panel, e.g. by monitoring a fixed reference pixel
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/144—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2360/00—Aspects of the architecture of display systems
- G09G2360/14—Detecting light within display terminals, e.g. using a single or a plurality of photosensors
- G09G2360/145—Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light originating from the display screen
Abstract
A display device having at least one at least partially transparent sensor 8 for detecting a property of light such as the intensity, colour and/or colour point of light emitted from at least one display area 5 of a display device into the viewing angle of said display device. The invention also relates to the many possible uses of such a display device.
Description
DISPLAY DEVICE AND USE THEREOF
FIELD OF THE INVENTION
The invention relates to a display device having at least one sensor for detecting a property such as the intensity, colour and/or colour point of light emitted from at least one display area of a display device into the viewing angle of said display device.
The invention also relates to the use of such a display device.
BACKGROUND OF THE INVENTION
In modern medical facilities, radiology plays a crucial role in the diagnostic process.
Because of this, high-quality medical imaging using display devices like liquid crystal display devices (LCD devices) is more important than ever before. To this end, display devices are typically provided with a sensor and a controller device coupled thereto.
One type of sensor is coupled to a backlight device of the LCD device, for instance a backlight comprising light emitting diodes (LEDs). It aims at stabilizing the output of the backlight device, which inherently varies as a consequence of the use of LEDs therein.
WO2008/050262 discloses one example of such a sensor for an LED-based backlight. The backlight device is herein provided with a transparent outcoupling plate overlying its surface from which light is emitted. Structures, such as prismatic grooves, are defined in the outcoupling plate so as to guide light to a side face, where the sensor is located. Particularly, the outcoupling plate is designed so as to achieve light spreading in addition to the light guiding to a side face. This provides an improved uniformity of the light output of the backlight device. However, a stabilization of merely the backlight is insufficient for obtaining a high-quality display system, such as can for instance be applied for medical imaging applications. Moreover, when considering such an outcoupling plate in front of a display, light spreading is not desired.
EP1274066B1 discloses a display device wherein the sensing is applied in front of the display. Use is made herein of a light guide, for instance a waveguide or fibre, to guide a portion of the light output to a sensor outside the viewing angle of the display. Light from a display area comprising a plurality of pixels is inserted into the light guide, for instance at one end of the fibre or into a continuous waveguide. Therewith, the area on the display blocked for light transmission is limited. Particularly, as disclosed in EP1274066, light rays travelling under a large angle to the axis of the light guide can be made to exit the structure, while ambient light cannot enter the light guide. By means of this small acceptance angle, it is avoided that ambient light enters the photodiode sensor, without a need for shielding.
However, it is desired to further improve such a sensor system, i.e. sensor and light guide. One implementation shown in EP1274066 is that an end of a fibre is parallel to the output surface of the display and the fibre is bent. This is, however, not the most practical implementation.
Another such solution with a waveguide in front of a display is disclosed in WO2004/023443. The waveguide particularly includes a material of relatively higher refractive index surrounded by a material of relatively lower refractive index. A sensor is present at one edge of the waveguide. Alternatively, the waveguide may extend in four directions and the sensors may be present on four edges. This solution is intended (see example 3) for calibration measurements of a 10x10 passive matrix OLED display, wherein each pixel is turned on sequentially.
However, it is an object of the present invention to provide a sensor system that can be used for real-time measurements, e.g. while the display is in use. The solution of WO2004/023443 does not seem fit for this purpose. This solution is sensitive to receiving light from the ambient, such that the overall signal-to-noise ratio will be rather low.
It is therefore an object of the invention to provide a display device with a sensor suitable for real-time sensing (e.g. while the display is in use), with a high signal-to-noise ratio and without disturbance of an image emitted by the display.
SUMMARY OF THE INVENTION
According to a first aspect of the invention, a display device is provided that comprises at least one display area provided with a plurality of pixels. For each display area, an at least partially transparent sensor is present for detecting a property of light emitted from the said display area into a viewing angle of the display device. The sensor is located in a front section of said display device in front of said display area.
Surprisingly good results have been obtained with at least partially transparent sensors located in front of the display area and within the viewing angle. The expected disturbance of the display image turns out to be at least substantially absent. Due to the direct incoupling of the light into the sensor, a proper transmission to the sensor is achieved without a coupling member. Such a transparent sensor is suitably applied to an inner face of a cover member.
The transparent cover member may be used as a substrate in the manufacturing of the sensor. Particularly a glass or similar inorganic substrate has sufficient thermal stability to withstand the operating temperatures of vapour deposition, which is a preferred way of depositing the layers constituting the sensor. Specific examples include chemical vapour deposition (CVD) and variants thereof such as metal-organic chemical vapour deposition (MOCVD), as well as thermal vapour deposition.
However, polymeric substrates may be used alternatively, particularly when using low temperature deposition techniques such as printing and coating. Assembly is not excluded as a manufacturing technique.
In a suitable embodiment hereof, the device further comprises at least partially transparent electrical conductors for conducting a measurement signal from said sensor within said viewing angle for transmission to a controller. Substantially transparent conductor materials such as indium tin oxide and the polymer poly(3,4-ethylenedioxythiophene) poly(styrenesulfonate), typically referred to as PEDOT:PSS, are known per se. In one most suitable embodiment, the sensor is provided with transparent electrodes that are defined in one layer with the said conductors. This reduces the number of layers, which inherently lead to additional resistance and to interfaces that might slightly disturb the display image.
Preferably, the sensor comprises an organic photoconductor. Such organic materials have been a subject of advanced research over the past decades. Organic photoconductors may be embodied as single layers, as bilayers and as multilayer structures. They may be advantageously applied within the present display device.
Particularly, the presence on the inner face of the cover member allows the organic materials to be present in a closed and controllable atmosphere, e.g. in a space between the cover member and the display. A getter may for instance be present to reduce the negative impact of humidity. Furthermore, vacuum conditions or a predefined atmosphere (for instance pure nitrogen) may be applied in said space upon assembly of the cover member to the display.
A sensor comprising an organic photoconductor suitably further comprises a first and a second electrode, which advantageously are located adjacent to each other.
The location adjacent to each other, preferably defined within one layer, allows a design with finger-shaped electrodes that are mutually interdigitated. Herewith, any charge generated in the photoconductor is suitably transmitted to the electrodes.
Preferably the number of fingers per electrode is larger than 50, more preferably larger than 100, for instance in the range of 250-2000.
One preferred type of photosensor is one wherein the organic photoconductor is a bilayer structure with an exciton generation layer and a charge transport layer, said charge transport layer being in contact with a first and a second electrode. Such a bilayer structure is for instance known from Applied Physics Letters 93 (2008), 63305, which is included herein by reference. Alternatively, a bilayer device that uses a quantum-dot exciton generation layer and an organic charge transport layer can be used. The quantum dots can for example be colloidal CdSe (cadmium selenide) quantum dots. The organic charge transport layer can consist of Spiro-TPD.
Alternatively, use may be made of thinned silicon photodiodes. When thinning silicon to a thickness in the micrometer range, it becomes, at least partially, optically transparent. Stability of such devices may be obtained by encapsulation of the devices with a polymeric material such as polyimide. The overall thickness of the encapsulated devices is typically in the order of 3-30 microns. The technology is for instance known from R. Dekker et al., 'A 10 µm thick RF-ID tag for chip-in-paper applications', IEEE BCTM Proceedings 2005, 18-21, which is included herein by reference. The photodiodes made in this technology may be assembled to the cover member by means of an adhesive. An electrically conductive adhesive may be applied.
Alternatively, transfer may be arranged with capacitive coupling by appropriate positioning of electrodes.
The display defined in the at least one display area of the display device may be of conventional technology, such as a liquid crystal device (LCD) with a backlight, for instance based on light emitting diodes (LEDs), or an electroluminescent device such as an organic light-emitting diode (OLED) display. The display device suitably further comprises an electronic driving system and a controller receiving optical measurement signals generated in the at least one sensor and controlling the electronic driving system on the basis of the received optical measurement signals.
According to a second aspect of the invention, a display device is provided that comprises at least one display area with a plurality of pixels. For each display area, at least one sensor and an at least partially transparent optical coupling device are provided. The at least one sensor is designed for detecting a property of light emitted from the said display area into a viewing angle of the display device. The sensor is located outside or at least partially outside the viewing angle. The at least partially transparent optical coupling device is located in a front section of said display device. It comprises a light guide member for guiding at least one part of the light emitted from the said display area to the corresponding sensor. The coupling device further comprises an incoupling member for coupling the light into the light guide member.
It is an advantage of the present invention to detect a property such as the intensity or the colour of light emitted by at least one display area of a display device into the viewing angle of said display device without constraining the view on said display device. The use of the incoupling member solves the apparent contradiction between a waveguide parallel to the front surface that does not disturb a display image, and a signal-to-noise ratio sufficiently high to allow real-time measurements. An additional advantage is that any scattering that may occur at or in the incoupling member is limited to a small number of locations over the front surface of the display image.
Preferably, the light guide member runs in a plane which is parallel to a front surface of the display device. The incoupling member is suitably an incoupling member for laterally coupling the light into the light guide member of the coupling device. The result is a substantially planar incoupling member. This has the advantage of minimum disturbance of displayed images. Furthermore, the coupling device may be embedded in a layer or plate. It may be assembled to a cover member, i.e. the front glass plate, of the display after its manufacturing, for instance by insert or transfer moulding.
Alternatively, the cover member is used as a substrate for definition of the coupling member.
In one implementation, a plurality of light guide members is arranged as individual light guide members or as part of a light guide member bundle. It is suitable that the light guide member is provided with a circular or rectangular cross-sectional shape when viewed perpendicular to the front surface and perpendicular to a main extension of the light guide member. A light guide with such a cross-section may be made adequately, and moreover limits scattering of radiation. In one suitable embodiment, such a light guide member is located between a first and a second display area. This further reduces the risk of scattering. Such a location between a first and a second display area may particularly be used if the light guide member is defined in or on a cover member. Such a cover member is typically a transparent substrate, for instance of glass or polymer material.
In any of the above embodiments the sensor or the sensors of the sensor system is/are located at a front edge of the display device.
The incoupling member of this embodiment may be present on top of the light guide member or effectively inside the light guide member. One example of such a location inside the light guide is that the incoupling member and the light guide member have a co-planar ground plane. The incoupling member may then extend above the light guide member or remain below a top face of the light guide member or be coplanar with such top face. Furthermore, the incoupling member may have an interface with the light guide member or may be integral with such light guide member. In one particular embodiment, the or each incoupling member is cone-shaped.
The incoupling member herein has a tip and a ground plane. The ground plane preferably has circular or oval shape. The tip is preferably facing towards the display area.
The incoupling member may be formed as a laterally prominent incoupling member. Most preferably, it is delimited by two laterally coaxially aligned cones, said cones having a mutual apex and different apex angles. The difference between the apex angles, Δα = α1 - α2, is smaller than twice the critical angle θc for total internal reflection (TIR): Δα < 2θc. Especially, the or each incoupling member fades seamlessly into the guide member of the coupling device. The or each incoupling member and the or each guide member are suitably formed integrally.
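To illustrate the design rule Δα < 2θc, a minimal sketch is given below. The refractive indices and apex angles are purely assumed values for illustration; the patent itself does not specify them.

```python
import math

def critical_angle_deg(n_core: float, n_clad: float) -> float:
    """Critical angle for total internal reflection at a core/cladding interface, in degrees."""
    return math.degrees(math.asin(n_clad / n_core))

def cone_rule_satisfied(alpha1_deg: float, alpha2_deg: float,
                        n_core: float, n_clad: float) -> bool:
    """Check the stated rule: the apex-angle difference of the two coaxial cones
    must stay below twice the TIR critical angle of the light guide member."""
    delta_alpha = alpha1_deg - alpha2_deg
    return delta_alpha < 2.0 * critical_angle_deg(n_core, n_clad)

# Assumed example values (higher-index core on lower-index cladding):
print(cone_rule_satisfied(alpha1_deg=120.0, alpha2_deg=70.0, n_core=1.59, n_clad=1.49))
```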
In an alternative embodiment, the or each incoupling member is a diffraction grating. The diffraction grating allows radiation of a limited set of wavelengths to be transmitted through the light guide member. Different wavelengths (e.g. different colours) may be incoupled with gratings having mutually different grating periods. The range of wavelengths is preferably chosen so as to represent the intensity of the light most adequately.
In a further embodiment hereof, both the cone-shaped incoupling member and the diffraction grating are present as incoupling members. These two different incoupling members may be coupled to one common light guide member or to separate light guide members, one for each, typically leading to different sensors.
By using a first and a second incoupling member of different types on one common light guide member, light extraction, at least of certain wavelengths, may be increased, thus further enhancing the signal-to-noise ratio. Additionally, because of the different operation of the incoupling members, the sensor may detect more specific variations.
By using a first and a second incoupling member of different types in combination with a first and a second light guide member respectively, the different types of incoupling members may be applied for different types of measurements. For instance, one type, such as the cone-shaped incoupling member, may be applied for luminance measurements, whereas the diffraction grating or the phosphor discussed below may be applied for color measurements. Alternatively, one type, such as the cone-shaped incoupling member, may be used for a relative measurement, whereas another type, such as the diffraction grating, is used for an absolute measurement. In this embodiment, the one incoupling member (plus light guide member and sensor) may be coupled to a larger set of pixels than the other one. One is for instance coupled to a display area comprising a set of pixels, the other one is coupled to a group of display areas.
In a further embodiment, the incoupling member further comprises a transformer for transforming a wavelength of light emitted from the display area into a sensing wavelength. The transformer is for instance based on a phosphor. Such phosphor is suitably locally applied on top of the light guiding member. The phosphor may alternatively be incorporated into a material of the light guiding member. It could furthermore be applied on top of another incoupling member (e.g. on top of or in a diffraction grating or a cone-shaped member or another incoupling member).
The sensing wavelength is suitably a wavelength in the infrared range. This range has the advantage that light of the sensing wavelength is no longer visible.
Incoupling into and transport through the light guide member is thus not visible. In other words, any scattering of light is made invisible, and therewith disturbance of the emitted image of the display is prevented. Such scattering could for instance occur directly after the transformation of the wavelength of the light, i.e. upon re-emission of the light from the phosphor. The sensing wavelength is most suitably a wavelength in the near infrared range, for instance between 0.7 and 1.0 micrometers, and particularly between 0.8 and 0.9 micrometers. Such a wavelength can be suitably detected with commercially available photodetectors, for instance based on silicon.
A suitable phosphor for such a transformation is for instance a manganese-activated zinc sulphide phosphor. Such a phosphor may emit luminescence in the 3 micron region, where the manganese concentration is greater than 2%. Optical absorption measurements hereof show a maximum at 0.80 microns with sub-bands at 0.74 microns and 0.84 microns, the luminescence being excited by radiation in this region. Also other rare earth doped zinc sulfide phosphors can be used for infrared (IR) emission. Examples are ZnS:ErF3 and ZnS:NdF3 thin film phosphors, such as disclosed in J. Appl. Phys. 94 (2003), 3147, which is incorporated herein by reference.
Another example is ZnS:TmxAgy, with x between 100 and 1000 ppm and y between 10 and 100 ppm, as disclosed in US4499005.
The display device suitably further comprises an electronic driving system and a controller receiving optical measurement signals generated in the at least one sensor and controlling the electronic driving system on the basis of the received optical measurement signals.
The display defined in the at least one display area of the display device may be of conventional technology, such as a liquid crystal device (LCD) with a backlight, for instance based on light emitting diodes (LEDs), or an electroluminescent device such as an organic light-emitting diode (OLED) display.
Instead of being an alternative to the aforementioned transparent sensor solution, the present sensor solution of coupling member and sensor may be applied in addition to such a sensor solution. The combination enhances the sensing, and the different types of sensor solutions each have their benefits. The one sensor solution may herein be coupled to a larger set of pixels than another sensor solution.
While the foregoing description refers to the presence of at least one display area with a corresponding sensor solution, the number of display areas with a sensor is preferably larger than one, for instance two, four, eight or any plurality. It is preferable that each display area of the display is provided with a sensor solution, but that is not essential. For instance, merely one display area within a group of display areas could be provided with a sensor solution.
In a further aspect according to the invention, use of the said display devices for sensing a light property while displaying an image is provided.
Most suitably, the real-time detection is carried out for the luminance. The detection of color (chrominance) aspects may be carried out in a calibration mode, e.g. when the display is not in a display mode. However, it is not excluded that chrominance detection may also be carried out in real time, in the display mode. It is suitable to do the measurements relative to a reference. The reference can be chosen as a test image or a start-up image of the display.
For appropriate real-time sensing while the display of images is ongoing, further processing on sensed values is suitably carried out. Therein, an image displayed in a display area is used for treatment of the corresponding sensed value or sensed values.
Aspects of the image that are taken into account are particularly its light properties, and more preferably the light properties emitted by the individual pixels or an average thereof. One example of such a light property is the luminance value, which will be used hereinafter for clarity. The same or similar processing may be done for other light properties such as chrominance, color variations and color balance. An algorithm may define an average of those luminance values per display area, based on the digital driving levels provided to the display. When comparing to the real light emission of a pixel or a group of pixels, instead of to the light theoretically emitted at the display area in use, it turns out to be more difficult to measure the actual non-idealities of the emitted light later on. In defining the average, it may be taken into account that the light is emitted with a specific intensity over a range of angles. More specifically, a luminance profile may be taken into account. The calculated average can be considered to be an ideally emitted luminance.
The luminance value sensed by the sensor is then compared to said ideally emitted luminance. Suitably, the luminance value sensed by the sensor is, prior to said comparison or subsequent to said comparison, compared to a reference. The two steps of comparison with the emitted luminance value and a reference value provide a sensing result.
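A minimal sketch of this two-step comparison is given below. The function names, the use of a simple mean over the drive levels and the normalisation against the reference are assumptions for illustration; the patent leaves the exact form of the comparisons open.

```python
import numpy as np

def ideally_emitted_luminance(drive_levels: np.ndarray, profile_weight: float = 1.0) -> float:
    """Average luminance a display area is expected to emit, derived from the
    digital driving levels of its pixels; the weight may account for the
    angular luminance profile."""
    return profile_weight * float(np.mean(drive_levels))

def sensing_result(sensed_luminance: float, drive_levels: np.ndarray,
                   reference: float) -> float:
    """Two-step comparison: relate the sensed value to the ideally emitted
    luminance, then calibrate against a reference (e.g. taken from a test
    image or a start-up image)."""
    ideal = ideally_emitted_luminance(drive_levels)
    setting_independent = sensed_luminance / ideal if ideal else 0.0
    return setting_independent / reference if reference else 0.0
```

With this normalisation, a sensing result of 1.0 would mean the sensed emission matches both the specified settings and the reference.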
In one embodiment, such a sensing result is compared by a controller to a lower and/or an upper threshold value. If the sensing result is outside the accepted range of values, it is to be reviewed or corrected. One possibility for review is that one or more subsequent sensing results for the display area are calculated and compared by the controller. If more than a critical number of sensing values for one display area are outside the accepted range, then the luminance setting for the display area is to be corrected so as to bring it within the accepted range. A critical number is for instance 7 out of 10, e.g. if 8, 9 or 10 of the sensing values are outside the accepted range, the controller takes action. Else, if the number of sensing values outside the accepted range is above a monitoring value but not higher than the critical value, then the controller may decide to continue monitoring.
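The review logic described above can be sketched as follows. The window size and critical number (7 out of 10) follow the example in the text; the monitoring threshold of 3 and the class interface are assumptions for illustration.

```python
from collections import deque

class AreaMonitor:
    """Controller-side review of consecutive sensing results for one display area."""

    def __init__(self, lower: float, upper: float,
                 window: int = 10, critical: int = 7, monitoring: int = 3):
        self.lower, self.upper = lower, upper
        self.results = deque(maxlen=window)   # last N sensing results
        self.critical = critical              # more than 7 of 10 misses -> correct setting
        self.monitoring = monitoring          # assumed lower threshold -> keep monitoring

    def update(self, sensing_result: float) -> str:
        self.results.append(sensing_result)
        misses = sum(1 for r in self.results if not (self.lower <= r <= self.upper))
        if misses > self.critical:
            return "correct luminance setting"
        if misses > self.monitoring:
            return "continue monitoring"
        return "ok"
```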
In order to balance processing effort, the controller may decide not to review all sensing results continuously, but to do this one after the other. Furthermore, this comparison process may be scheduled with a relatively low priority, such that it is only carried out when the processor is idle.
In another embodiment, such a sensing result is stored in a memory. At the end of a monitoring period, such a set of sensing results may be evaluated. One suitable evaluation is to find out whether the sensed values of the luminance are systematically above or below the value that, according to the settings specified by the controller, had been emitted. If such a systematic difference exists, the settings specified by the controller may be adapted accordingly. In order to increase the robustness of the set of sensing results, certain sensing results may be left out of the set, such as for instance an upper and a lower value. Additionally, it may be that only values corresponding to a certain display setting are looked at. For instance, only sensing values corresponding to a high luminance setting are looked at. This may be suitable to verify whether the display behaves at high luminance settings similarly to its behaviour at other settings, for instance low luminance settings. Alternatively, the sensed values of certain luminance settings may be evaluated, as these values are most reliable for reviewing luminance settings. Instead of high and low values, one may think of the luminance when emitting a predominantly green image versus the luminance when emitting a predominantly yellow image. Furthermore, one may think of the luminance for a setting in which substantially all pixels of a display area under monitoring have similar or the same luminance values versus a setting in which said pixels have mutually different luminance values.
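As a sketch of such an end-of-period evaluation, the snippet below trims the extreme values and reports a systematic offset. It assumes sensing results normalised so that 1.0 means the sensed value equals the specified value, consistent with the earlier sketch rather than something the patent prescribes.

```python
def systematic_offset(sensing_results, trim=1):
    """Robust end-of-period evaluation: drop the 'trim' lowest and highest sensing
    results, then return the mean deviation from 1.0 (sensed equals specified)."""
    kept = sorted(sensing_results)[trim:len(sensing_results) - trim]
    if not kept:
        return 0.0
    return sum(r - 1.0 for r in kept) / len(kept)

# A positive offset suggests the display systematically emits more than specified,
# so the controller could lower the corresponding luminance settings, and vice versa.
```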
Additional calculations can be based on said set of sensed values. For instance, instead of merely determining a difference between sensed value and setting, the derivative may be reviewed. This can then be used to see whether the difference increases or decreases. Again, the timescale of determining such derivative may be smaller or larger, preferably larger, than that of the absolute difference. It is not excluded that average values are used for determining the derivative over time.
In another use, sets of sensed values for different display areas are compared to each other. In this manner, the homogeneity of the display emittance (e.g. luminance) can be calculated.
It will be understood by the skilled reader that use is made of storage of display settings and sensed values for the said processing and calculations. An efficient storage protocol may further be implemented by the skilled person. It is repeated that the above explanation is given for the example of luminance, but that it may be equally applied to other light properties.
As specified above, the initially sensed value is suitably compared to a reference value for calibration purposes. The calibration will typically be carried out per display area. In the case of using a display with a backlight, the calibration typically involves switching the backlight on and off, for a display area and suitably one or more surrounding display areas. In the case of using a display without a backlight, the calibration typically involves switching the display off, within a display area and suitably surrounding display areas. The calibration is for instance carried out for a first time upon start-up of the display. It may subsequently be repeated for display areas.
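A minimal sketch of such a per-area calibration step is given below. The display and sensor interfaces (set_backlight, read) are hypothetical and merely illustrate the sequence of darkening an area and its neighbours, taking a reading, and restoring the backlight.

```python
def calibrate_area(display, sensor, area_id, neighbour_ids):
    """Per-area calibration sketch: darken the area and its surrounding areas,
    record a dark reference from the sensor, then restore the backlight."""
    for a in (area_id, *neighbour_ids):
        display.set_backlight(a, on=False)   # hypothetical driver call
    dark_reference = sensor.read(area_id)    # hypothetical sensor call
    for a in (area_id, *neighbour_ids):
        display.set_backlight(a, on=True)
    return dark_reference
```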
Moments for such calibration during real-time use which do not disturb a viewer include for instance short transition periods between a first block and a second block of images. In case of consumer displays, such a transition period is for instance the announcement of a new and regular program, such as the daily news. In case of professional displays, such as displays for medical use, such transition periods are for instance periods between reviewing a first medical image (X-ray, MRI and the like) and a second medical image. The controller will know or may determine such a transition period.
As discussed hereinabove, different types of incoupling members may be applied for sensing different light properties. Typical light properties that may be sensed separately include a luminance profile and a chrominance strength.
While the above method has been expressed in the claims as a use of the above-mentioned sensor solutions, it is to be understood that the method is also applicable to any other sensor for a display that may be used for real-time measurements. It is more generally a method of evaluating at least one value determined by a sensor, comprising the steps of:
- providing a sensing result by (i) calculating a setting-independent sensor value by comparison of the value determined by the sensor for a specified display area with the (average) display settings for said display area corresponding to the moment in time on which the sensor determination is based, and (ii) calibrating said value determined by the sensor or said setting-independent sensor value by comparison to a reference value;
- evaluating the sensing result and/or evaluating a set of sensing results for defining a display evaluation parameter; and
- if the display evaluation parameter is outside an accepted range, modifying the display settings and/or continuing monitoring of said display area.
The average display settings as used herein are more preferably the ideally emitted luminance as discussed above.
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter.
BRIEF DESCRIPTION OF THE DRAWINGS
Fig. 1 is a schematic illustration of a display device with a sensor system according to a first embodiment of the invention;
Fig. 2 shows the coupling device of the sensor system illustrated in Fig. 1;
Fig. 3 shows a vertical sectional view of a sensor system for use in the display device according to a third embodiment of the invention;
Fig. 4 shows a horizontal sectional view of a display device with a sensor system according to a fourth embodiment of the invention; and
Fig. 5 shows a side view of a display device with a sensor system according to a second embodiment of the invention.
DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
The present invention will be described with respect to particular embodiments and with reference to certain drawings, but the invention is not limited thereto, being limited only by the claims. The drawings described are only schematic and are non-limiting. In the drawings, the size of some of the elements may be exaggerated and not drawn to scale for illustrative purposes.
Furthermore, the terms first, second, third and the like in the description and in the claims, are used for distinguishing between similar elements and not necessarily for describing a sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other sequences than described or illustrated herein.
Moreover, the terms top, bottom, over, under and the like in the description and the claims are used for descriptive purposes and not necessarily for describing relative positions. It is to be understood that the terms so used are interchangeable under appropriate circumstances and that the embodiments of the invention described herein are capable of operation in other orientations than described or illustrated herein.
It is to be noticed that the term "comprising", used in the claims, should not be interpreted as being restricted to the means listed thereafter; it does not exclude other elements or steps. Thus, the scope of the expression "a device comprising means A and B" should not be limited to devices consisting only of components A and B. It means that with respect to the present invention, the only relevant components of the device are A and B. In the claims the indefinite article "a" or "an" does not exclude a plurality. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage. Any reference signs in the claims should not be construed as limiting the scope.
Similarly, it is to be noticed that the term "coupled", also used in the claims, should not be interpreted as being restricted to direct connections only. Thus, the scope of the expression "a device A coupled to a device B" should not be limited to devices or systems wherein an output of device A is directly connected to an input of device B. It means that there exists a path between an output of A and an input of B which may be a path including other devices or means.
It is furthermore observed that the term "at least partially transparent" as used throughout the present application refers to an object that may be partially transparent for all wavelengths, fully transparent for all wavelengths, fully transparent for a range of wavelengths or partially transparent for a range of wavelengths. Typically, it refers to optical transparency, e.g. transparency for visible light. Partially transparent is herein understood as the property that the intensity and/or resolution of an image shown through the partially transparent member is reduced due to the said partially transparent member. Partially transparent refers particularly to a reduction of impinging light intensity of at most 50%, more preferably at most 25%, more preferably at most 10%, or even at most 5%. Typically the members are chosen so as to be substantially transparent, i.e. with a reduction of at most 10%.
The term 'light guide' is used herein for reference to any structure that may guide light in a predefined direction. One preferred embodiment hereof is a waveguide, e.g. a light guide with a structure optimized for guiding light. Typically, such a structure is provided with surfaces that adequately reflect the light without substantial diffraction and/or scattering. Such surfaces may include an angle of substantially 90 or 180 degrees with respect to each other. Another embodiment is for instance an optical fiber. Prismatic structures are deemed less beneficial as they tend to scatter any emitted light and therewith lead to visible disturbance of the emitted image.
Moreover, the term 'display' is used herein for reference to the functional display. In the case of a liquid crystalline display, as an example, this is the layer stack provided with active matrix or passive matrix addressing. The functional display is subdivided into display areas. An image may be displayed in one or more of the display areas. The term 'display device' is used herein to refer to the complete apparatus, including sensors, light guide members and incoupling members. Suitably, the display device further comprises a controller, driving system and any other electronic circuitry needed for appropriate operation of the display device.
Fig. 1 shows a display device 1 formed as a liquid crystal display device (LCD device) 2. Alternatively the display device is formed as a plasma display device or any other kind of display device emitting light. The display 3 of the display device 1 is divided into a number of groups 4 of display areas 5, wherein each display area 5 comprises a plurality of pixels. The display 3 of this example comprises eight groups 4 of display areas 5; each group 4 comprises in this example ten display areas 5. Each of the display areas 5 is adapted for emitting light into a viewing angle of the display device to display an image to a viewer in front of the display device 1.
Fig. 1 further shows a sensor system 6 with a sensor array 7 comprising, e.g. eight groups 8 of sensors 9. Each of said groups 8 comprises, e.g. ten sensors 9 (individual sensors 9 are shown in Figs. 3, 4 and 5) and corresponds to one of the groups 4 of display areas 5. Each of the sensors 9 corresponds to one corresponding display area 5. The sensor system 6 further comprises coupling devices 10 for coupling a display area 5 with the corresponding sensors 9. Each coupling device 10 comprises a light guide member 12 and an incoupling member 13 for coupling the light into the light guide member 12, as shown in Fig. 2. The incoupling member 13 shown in Fig. 2 is cone-shaped, with a tip and a ground plane. It is to be understood that the tip of the incoupling member 13 is facing the display area 5. Light emitted from the display area and arriving at the incoupling member 13 is then refracted at the surface of the incoupling member 13. The incoupling member 13 is formed, in one embodiment, as a laterally prominent incoupling member 14, which is delimited by two laterally coaxially aligned cones 15, 16, said cones 15, 16 having a mutual apex 17 and different apex angles α1, α2. The diameter d of the cones 15, 16 delimiting the incoupling member 13 is equal or almost equal to the width of the light guide member 12. Said light was originally emitted (arrow 18) from the display area 5 into the viewing angle of the display device 1. The direction of this originally emitted light is perpendicular to the alignment of a longitudinal axis 19 of the light guide member 12. All light guide members 12 run parallel in a common plane 20 to the sensor array 7 at one edge 21 of the display device 1. Said edge 21 and the sensor array 7 are outside the viewing angle of the display device 1.
Alternatively, use may be made of a diffraction grating as an incoupling member 13.
Herein, the grating is provided with a spacing, also known as the period of the grating. The spacing is in the order of the wavelength of the coupled light, particularly between 500 nm and 2 µm. In a further embodiment, a phosphor is used. The size of the phosphor could be smaller than the wavelength of the light to be detected.
The light guide members 12 alternatively can be connected to one single sensor 9. All individual display areas 5 can be detected by a time sequential detection mode.
The light guide members 12 are for instance formed as transparent or almost transparent optical fibres 22 (or microscopic light conductors) absorbing just a small part of the light emitted by the specific display areas 5 of the display device 1. The optical fibres 22 should be so small that a viewer does not notice them but large enough to carry a measurable amount of light. The light reduction due to the light guide members and the incoupling structures is about 5% for any display area 5. Waveguides may be applied instead of optical fibres, as discussed hereinafter.
Most of the display devices 1 are constructed with a front transparent plate such as a glass plate 23 serving as a transparent medium 24 in a front section 25 of the display device 1. Other display devices 1 can be made rugged with other transparent media 24 in the front section 25. Suitably, the light guide member 12 is formed as a layer onto a transparent substrate such as glass. A material suitable for forming the light guide member 12 is for instance PMMA (polymethylmethacrylate). Such a material is for instance commercially available from Rohm&Haas under the tradename LightlinkTM, with product numbers XF-5202A Waveguide Clad and XP-6701A Waveguide Core. Suitably, a waveguide has a thickness in the order of 2-10 micrometer and a width in the order of micrometers to millimeters. Typically, the waveguide comprises a core layer that is defined between one or more cladding layers.
The core layer is for instance sandwiched between a first and a second cladding layer.
The core layer is effectively carrying the light to the sensors. The interfaces between the core layer and the cladding layers define surfaces of the waveguide at which reflection takes place so as to guide the light in the desired direction. The incoupling member 13 is suitably defined so as to redirect light into the core layer of the waveguide.
Alternatively, parallel coupling devices 10 formed as fibres 22 with a higher refractive index are buried into the medium 24, especially the front glass plate 23.
Above each area 5 the coupling device 10 is constructed on a predefined guide member 12 so light from that area 5 can be transported to the edge 21 of the display device. At the edge 21 the sensor array 7 captures light of each display area 5 on the display device 1. This array 7 would of course require the same pitch as the fibres 22 in the plane 20. While fibres are mentioned herein as an example, another light guide member, such as a waveguide, could be applied alternatively.
In Fig. 1 the coupling devices 10 are displayed with different lengths. In reality, full length coupling devices 10 may be present. The incoupling member 13 is therein present at the destination area 5 for coupling in the light (originally emitted from the corresponding display area 5 into the viewing angle of the display device 1) into the light guide member 12 of the coupling device 10. The light is afterwards coupled from an end section of the light guide member 12 into the corresponding sensor 9 of the sensor array at the edge 21 of the display device 1. The sensors 9 preferably only measure light coming from the coupling devices 10.
In addition, the difference between a property of light in the coupling device 10 and that in the surrounding front glass plate 23 is measured. This combination of measuring methods leads to the highest accuracy. The property can be intensity or colour, for example.
In one method, each coupling device 10 carries light that is representative for light coming out of a pre-determined area 5 of the display device 1. Setting the display 3 full white or using a white dot jumping from one area to another area 5 gives exact measurements of the light output in each area 5.
However, by this method it is not possible to perform continuous measurements without the viewer noticing it. In this case the relevant output light property, e.g. colour or luminance, should be calculated depending on the image information, radiation pattern of a pixel and position of a pixel with respect to the coupling device 11. Image information determines the value of the relevant property of light, e.g. how much light is coming out of a specific area 5 (for example a pixel of the display 3) or its colour.
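A minimal sketch of such a calculation is given below, assuming a Lambertian-like radiation pattern and an inverse-square coupling weight; these modelling choices, the function name and all numerical values are illustrative assumptions only and not taken from the application.

```python
import math

# Illustrative estimate of the light expected at a coupling device from the
# currently displayed image content; the radiation and coupling models are
# assumptions made for this sketch.
def expected_coupled_light(pixel_values, pixel_positions, coupler_xy, height=0.5):
    """Sum the assumed contributions of all pixels of a display area at the coupler."""
    total = 0.0
    for value, (x, y) in zip(pixel_values, pixel_positions):
        dx, dy = coupler_xy[0] - x, coupler_xy[1] - y
        r = math.sqrt(dx * dx + dy * dy + height * height)
        cos_theta = height / r                      # assumed Lambertian-like emission
        total += value * cos_theta / (r * r)        # assumed geometric coupling weight
    return total

# Example: three pixels with drive levels 0.2, 0.8 and 1.0 around a coupler at (1, 1).
print(expected_coupled_light([0.2, 0.8, 1.0],
                             [(0.0, 0.0), (1.0, 0.0), (2.0, 2.0)],
                             (1.0, 1.0)))
```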
Consider the example of optical fibers 22 shaped like a beam, i.e. with a rectangular cross-section, in the plane-parallel front glass plate 23, especially a plate 23 made of fused silica. To guide the light through the fibers 22, the light must be travelling in one of the conductive modes. It is difficult for light coming from outside the fibers 22 or from outside the plate 23 to be coupled into one of these conductive modes.
To get into a conductive mode, a local alteration of the fiber 22 is needed. Such a local alteration may be obtained in different manners, but in this case there are more important requirements than just getting light inside the fiber 22.
For accurate measuring it is important that only light from a specific direction (directed from the corresponding display area 5 into the viewing angle of the display device) enters into the corresponding coupling device 10 (fiber 22). Light from outside the display device 1 ('noisy' light) will not interfere with the measurement.
Additionally, it is important that upon insertion into the light guide member, f.i. a fiber or waveguide, the displayed image is hardly, not substantially or not at all disturbed.
According to the invention, use is made of an incoupling member 13 for coupling light into the light guiding member. The incoupling member 13 is a structure with limited dimensions applied locally at a location corresponding to a display area.
The incoupling member 13 has a surface area that is typically much smaller than that of the display area, for instance at most 1% of the display area, more preferably at most 0.1% of the display area. Suitably, the incoupling member is designed so as to be laterally prominent, i.e. it leads light into a lateral direction.
Additionally, the incoupling member may be designed to be optically transparent in at least a portion of its surface area for at least a portion of light falling upon it. In this manner the portion of the image corresponding to the location of the incoupling member is still transmitted to a viewer. If for instance at least 30% of the incoming light in the area of the incoupling member is transmitted, the human eye will automatically correct the slight difference in intensity between the image in the area of the incoupling member and neighbouring areas. As a result, it will not be visible. It is observed for clarity that such partial transparency of the incoupling member is highly preferred, but not deemed essential. Alternatively, the incoupling member may be positioned at a location corresponding to a minor portion of the display area. Such minor portion is for instance in an edge region of the display area, or in an area between a first and a second adjacent pixel. This is particularly feasible if the incoupling member is relatively small, e.g. for instance at most 0.1% of the display area.
In a further embodiment, the incoupling member is provided with a ground plane that is circular, oval or is provided with rounded edges. The ground plane of the incoupling member is typically the portion located at the side of the viewer. Hence, it is most visible. By using a ground plane without sharp edges or corners, this visibility is reduced and any scattering on such sharp edges is prevented.
A perfect separation may be difficult to achieve, but with the sensor system 6 comprising the coupling device 10 shown in Fig. 2 a very good signal-to-noise-ratio (SNR) can be achieved.
Fig. 5 shows a side view of a sensor system 6 according to a second embodiment of the invention. The sensor system of this embodiment comprises transparent sensors 33 which are arranged in a matrix with rows and columns. The sensor 33 is realized as a stack comprising two groups 34, 35 of parallel bands 36 in two different layers 37, 38 on a substrate 39, preferably the front glass plate 23. An interlayer 40 is placed between the bands 36 of the different groups 34, 35. The bands (columns) of the first group 34 run perpendicular to the bands (rows) of the second group 35. The sensor system 6 divides the display area into different zones, each with its own optical sensor 9 connected by transparent electrodes.
Suitable materials for the transparent electrodes are for instance ITO (Indium Tin Oxide) and poly(3,4-ethylenedioxythiophene) polystyrene sulfonate (known in the art as PEDOT:PSS). This sensor array 7 can be attached to the front glass or laminated on the front glass plate 23 of the display device 2, for instance an LCD.
The interlayer 40 is preferably an organic photoconductor, and may be a monolayer, a bilayer, or a multiple layer structure. Most suitably, the interlayer 40 comprises an exciton generation layer (EGL) and a charge transport layer (CTL). Alternatively, instead of using organic layers to generate charges and guide them to the electrodes, hybrid structures using a mix of organic and inorganic materials can be used.
The charge transport layer (CTL) is in contact with a first and a second transparent electrode, between which electrodes a voltage difference may be applied.
The thickness of the CTL is for instance in the range of 25 to 100 nm, f.i. 50 nm.
The EGL may have a thickness in the order of 5 to 50 nm, for instance 20 nm. The material for the EGL is suitably a material known for use as an optically absorptive material in solar cells. It is for instance a perylene derivative. One specific example is 3,4,9,10-perylenetetracarboxylic bisbenzimidazole (PTCBI). The material for the CTL is typically a p-type organic semiconductor material. Various examples are known in the art of organic transistors and hole transport materials for use in organic light emitting diodes. Examples include pentacene, poly-3-hexylthiophene (P3HT), 2-methoxy-5-(2'-ethylhexyloxy)-1,4-phenylene vinylene (MEH-PPV), and N,N'-bis(3-methylphenyl)-N,N'-diphenyl-1,1'-biphenyl-4,4'-diamine (TPD). Mixtures of small molecules and polymeric semiconductors in different blends could be used alternatively. The materials for the CTL and the EGL are preferably chosen such that the energy levels of the orbitals (HOMO, LUMO) are appropriately matched, so that excitons dissociate at the interface of both layers. A charge storage layer (CSL) may be present between the CTL and the EGL in one embodiment. Various materials may be used as charge storage layer, for instance based on a low molecular weight organic material and a binder. Such materials are for instance known from US 6,617,604, the contents of which are included herein by reference.
In accordance with the invention, use is made of at least partially transparent electrode materials. This is for instance ITO. Alternatively, a transparent conductor such as ITO or PEDOT:PSS may be combined with a metal layer sufficiently thin to be at least partially transparent. Suitable metals are for instance Au, Mo and Cr. A suitable thickness of such a thin metal layer is particularly in the order of nanometers, for instance less than 2 nm. It was not expected by the inventors that, when ITO is used instead of gold, the structure would work so well as to be usable for the monitoring of luminance in a high-end display.
Instead of using a bilayer structure, a monolayer structure can also be used.
This configuration has also been tested in the referenced paper, with only a CTL. Again, in the paper, the electrodes are Au, whereas we made an embodiment with ITO electrodes, such that a (semi)transparent sensor can be created. Also, we created embodiments with other organic layers, such as PTCDA, with ITO electrodes. The organic photoconductor may be a patterned layer or may be a single sheet covering the entire display. In the latter case, each of the display areas 5 will have its own set of electrodes but they will share a common organic photosensitive layer (simple or multiple). The added advantage of a single sheet covering the entire display is that any possible color-specific absorption by the organic layer will be uniform across the display. In the case where several islands of organic material are separated on the display, non-uniformity in luminance and/or color is more difficult to compensate.
In one further implementation, the electrodes are provided with finger-shaped extensions. The extensions of the first and second electrode preferably form an interdigitated pattern. The number of fingers may be anything between 2 and 5000, more preferably between 250 and 2500, suitably between 500 and 1000. The surface area of a single transparent sensor may be in the order of square micrometers but is preferably in the order of square millimeters, for instance between 1 and 1000 square millimeters. One suitable shape is for instance a 1500 x 10 micrometer size, but a size of for instance 4 x 6 micrometers is not excluded either.
In connection with said further implementation, it is most suitable to build up the sensor on a substrate with said electrodes. The interlayer 40 therein overlies or underlies said electrodes. In other words, while Fig. 5 shows a design comprising a first and a second electrode layer (columns and bands), a single electrode layer may be sufficient. It however is observed that a sensor of a first and a second electrode with the interlayer may, on a higher level, be arranged in a matrix for appropriate addressing and read out, as known to the skilled person. Most suitably, the interlayer is deposited after provision of the electrodes. The substrate may be provided with a planarization layer.
Optionally, a transistor may be provided at the output of the photosensor, particularly for amplification of the signal for transmission over the conductors to a controller. Most suitably, use is made of an organic transistor. Electrodes may be defined in the same electrode material as those of the photodetector. Alternatively, particularly with a suitable, hidden location of the transistor, use may be made of gold electrodes. An organic field effect transistor device structure with a bottom gate structure, a pentacene semiconductor and a parylene dielectric is suitably applied. Vias cut into the parylene allow the photoconductor access to the interdigitated electrode structure.
The interlayer 40 may be patterned to be limited to one display area 5, a group of display areas 5, or alternatively certain pixels within the display area 5. Alternatively, the interlayer is substantially unpatterned. Any color specific absorption by the transparent sensor will then be uniform across the display.
Alternatively, the interlayer 40 may comprise nanoparticles or microparticles, either organic or inorganic and dissolved or dispersed in an organic layer. A further alternative is an interlayer 40 comprising a combination of different organic materials. As the organic photosensitive particles often exhibit a strongly wavelength-dependent absorption coefficient, such a configuration can result in a less colored transmission spectrum. It may further be used to improve detection over the whole visible spectrum, or to improve the detection of a specific wavelength range. Suitably, more than one transparent sensor may be present in a display area 5.
Additional sensors may be used for improvement of the measurement, but also to provide different colour-specific measurements. Additionally, by covering substantially the full front surface with transparent sensors, any reduction in intensity of the emitted light due to absorption in the at least partially transparent sensor will be less visible or even invisible.
By constructing the sensor 9 as shown in Fig. 5, the sensor surface of the transparent sensor 30 is automatically divided in different zones. A specific zone corresponds to a specific display area 5, preferably a pixel, and can be addressed by placing the electric field across its columns and rows. The current that flows in the circuit at that given time is representative for the photonic current going through that zone.
This sensor system 6 cannot distinguish the direction of the incident light. Therefore the current going through the transparent sensor 30 can be caused either by a pixel of the display area 5 or by external light. Therefore reference measurements with an inactive backlight device are suitably performed.
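The following sketch illustrates, under assumed function and class names, how a single zone of the matrix could be read out while a reference measurement with an inactive backlight is subtracted to reject external light; the numerical values are placeholders.

```python
# Hedged sketch of zone read-out with a backlight-off reference measurement.
def read_zone(measure_current, row, column, backlight):
    """Return the display-only photocurrent for one zone of the sensor matrix."""
    backlight.off()
    ambient_only = measure_current(row, column)   # external light only
    backlight.on()
    total = measure_current(row, column)          # display light plus external light
    return total - ambient_only


class FakeBacklight:
    """Stand-in for the backlight driver, used only to make the example run."""
    def __init__(self):
        self.lit = True
    def on(self):
        self.lit = True
    def off(self):
        self.lit = False


backlight = FakeBacklight()

def fake_measure(row, column):
    ambient = 0.02                                  # assumed ambient contribution
    display = 0.15 if backlight.lit else 0.0        # assumed display contribution
    return ambient + display

print(read_zone(fake_measure, row=3, column=7, backlight=backlight))
```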
Suitably, the transparent sensor is present in a front section between the front glass and the display. The front glass provides protection from external humidity (e.g. water spilled on the front glass, the use of cleaning materials, etc.). Also, it provides protection from potential external damage to the sensor. In order to minimize the negative impact of any humidity present in said cavity between the front glass and the display, encapsulation of the sensor is preferred.
Fig. 3 shows another embodiment of the invention relating to a sensor system 6 for rear detection. Fig. 3 is a simplified representation of an optical stack of the display 3 comprising (from left to right) a diffuser, several collimator foils, a dual brightness enhancement film (DBEF) and an LED display element in the front section 25 of a display device 1. At the backside 26 of the display 3 (left side) the sensor 9 of the sensor system 6 is added to measure all the light in the display area 5. A backlight device 27 is located between the sensor 9 and the stack of the display 3. The sensor 9 is countersunk in a housing element (not shown) so that only light close to the normal, perpendicular to the front surface 28, is detected.
The sensor system 6 shown in Fig. 3 can be used for performing an advantageous method for detecting a property of the light, e.g. the intensity or colour of the light emitted from at least one display area 5 of a liquid crystal display device 2 (LCD device) into the viewing angle of said display device 2, wherein said LCD device 2 comprises a backlight device 27 for lighting the display 3 formed as a liquid crystal display member of the display device 2, the method comprising the steps of: switching off the backlight device 27, detecting the light emitted by at least one chosen display area 5, and switching on the backlight device 27.
There are three possible ways to perform the detection of the light emitted by the at least one chosen display area 5: A very ambitious method is the use of the time-of-flight principle and measuring only the photons that reflect at the polarizer-air interface (not shown) at the front section 25 of the display device 1. A second method uses an optical device 10 formed as a mirror 28 in front of the display 3 to achieve the same result but with a higher luminance to measure. A third method consists of estimating the escaped energy out of the backlight cavity of the backlight device 27.
Fig. 4 shows a horizontal sectional view of a display device 1 with a sensor system 6 according to a fourth embodiment of the invention. The present embodiment is a scanning sensor system. The sensor system 6 is realized as a solid state scanning sensor system localized in the front section 25 of the display device 1. The display device 1 is in this example a liquid crystal display, but that is not essential. This embodiment effectively provides an incoupling member. The substrate or structures created therein (waveguides, fibers) may be used as light guide members.
In accordance with this embodiment of the invention, the solid state scanning sensor system is a switchable mirror. Therewith, light may be redirected into a direction towards a sensor. The solid state scanning system in this manner integrates both the incoupling member and the light guide member. In one suitable embodiment, the solid state scanning sensor system is based on a perovskite crystalline or polycrystalline material, and particularly the family of materials based on lead zirconate titanate. Typical examples of such materials include lead zirconate titanate (PZT), lanthanum-doped lead zirconate titanate (PLZT), lead titanate (PT), barium titanate (BaTiO3) and barium strontium titanate (BaSrTiO3). Such materials may be further doped with rare earth materials and may be provided by chemical vapour deposition, by sol-gel technology and as particles to be sintered. Many variations hereof are known from the fields of capacitors, actuators and microactuators (MEMS).
In one example, use was made of PLZT. An additional layer 29 can be added to the front glass plate 23 and may be an optical device 10 of the sensor system 6. This layer is a conductive transparent layer such as a tin oxide, e.g. preferably an ITO layer 29 (ITO: Indium Tin Oxide), that is divided into line electrodes by at least one transparent isolating layer 30. The isolating layer 30 is only a few microns (µm) thick and placed under an angle β. The isolating layer 30 is any suitable transparent insulating layer, of which a PLZT layer (PLZT: lanthanum-doped lead zirconate titanate) is one example.
The insulating layer preferably has a refractive index similar to that of the conductive layer, or at least of an area of the conductive layer surrounding the insulating layer, e.g. 5% or less difference in refractive index. For example, a PLZT layer has almost the same refractive index as the ITO layer 29. The isolating layer 30 forms an electro-optically switchable mirror 31 for deflecting at least one part of the light emitted from the display area 5 to the corresponding sensor 9 and is driven by a voltage. The insulating layer can be an assembly of at least one ITO sub-layer and at least one glass or PMMA sub-layer.
In one further example, a four-layered structure was manufactured. Starting from a substrate, f.i. a Corning glass substrate, a first transparent electrode layer was provided. This was for instance ITO in a thickness of 30 nm. Thereon, a perovskite layer was grown, in this example by CVD technology. The layer thickness was approximately 1 micrometer. The deposition of the perovskite layer may be optimized with nucleation layers, as well as by the deposition of several subsequent layers that do not need to have the same composition. A further electrode layer was provided on top of the perovskite layer, for instance in a thickness of 100 nm. In one suitable example, this electrode layer was patterned in fingered shapes. More than one electrode may be defined in this electrode layer. Subsequently, a polymer was deposited. The polymer was added to mask the ITO finger pattern. When a voltage is applied to this structure between the bottom electrode and the fingers on top of the PZT, the refractive index of the PZT under each of the fingers will change. This change in refractive index will result in the appearance of a diffraction pattern. The finger pattern of the top electrode is preferably chosen so that a diffraction pattern with the same period diffracts light into a direction that undergoes total internal reflection at the next interface of the glass with air. The light is thereafter guided to sensors through the glass substrate.
Therewith it is achieved that all diffraction orders higher than zero are coupled into the glass and remain in the glass. Optionally, specific light guiding structures, e.g. waveguides, may be applied in or directly on the substrate.
While it will be appreciated that the use of ITO is here highly advantageous, it is observed that this embodiment of the invention is not limited to the use of ITO electrodes. Other transparent materials may be used as well. Moreover, partially transparent materials may be applied, particularly for the finger shaped electrode pattern. Furthermore, it is not excluded that an alternative electrode pattern is designed with which the perovskite layer may be switched so as to enable diffraction into the substrate or another light guide member.
The solid state scanning sensor system has no moving parts and is advantageous when it comes to durability. Another benefit is that the solid state scanning sensor system can be made quite thin and doesn't create dust when functioning.
An alternative solution can be the use of a reflecting surface or mirror 28 that scans (passes over) the display 3, thereby reflecting light in the direction of the sensor array 7. Other optical devices may be used that are able to deflect, reflect, bend, scatter, or diffract the light towards the sensor or sensors.
The sensor array 7 can be a photodiode array 32 without or with filters to measure the intensity or colour of the light. Capturing and optionally storing the measured light as a function of the mirror position results in an accurate light property map, e.g. a colour or luminance map, of the output emitted by the display 3. A comparable result can be achieved by passing the detector array 9 itself over the different display areas 5.
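Purely as an illustration, the following sketch shows how such a light property map could be assembled by stepping the mirror over the display areas; the mirror and sensor interfaces and the readings are assumed placeholders.

```python
# Hedged sketch of building a luminance map with a scanning mirror.
def build_luminance_map(positions, move_mirror_to, read_sensor):
    """Return {position: reading} for every scanned mirror position."""
    luminance_map = {}
    for position in positions:
        move_mirror_to(position)        # deflect light from this area towards the sensor
        luminance_map[position] = read_sensor()
    return luminance_map


# Example with 2 x 3 display areas and assumed readings (cd/m2-like values).
areas = [(row, col) for row in range(2) for col in range(3)]
readings = iter([210.0, 205.5, 198.2, 202.7, 207.9, 200.4])
print(build_luminance_map(areas, move_mirror_to=lambda p: None,
                          read_sensor=lambda: next(readings)))
```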
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims. Examples of such variations are given below.
The first technological aspect is to improve the image quality. More specifically, the focus in this technological aspect is to reduce the visibility of the sensor as much as possible. Three ideas are presented.
One idea is to counteract possible colouring by the software or the hardware used in the display. Currently, the design of the sensor results in a visible colour shift of the whitepoint of the light emitted by a display. This effect could be counteracted by reducing the thickness of the exciton generation layer (this layer causes the colouring due to an absorption curve in the visible spectrum that is not entirely uniform).
However, there are other solutions to counteract this, without altering the design of the sensor. One could do a precorrection on the level of the display, such that the eventual perceived light, after it passed through the sensor, is correct again. The precorrection at the level of the display can be either done in hardware or software.
The software solution is a precorrection of the video content that will be shown on the display. In other words, the colour point of the image to be displayed is altered deliberately to compensate for the colour shift introduced by the sensor.
The hardware solution can have various options. For example, one could use slightly different colour filters (when using an LCD) to compensate for the colorshift.
Alternatively, a light source with a different colorpoint could be used (for example, using a different combination of phosphors, or using other phosphors). One could also use coloured foils/foils with an applied colour pattern in the optical stack. A pattern could e.g. be added to a Dual Brightness Enhancement Film (DBEF), diffuser or backlight reflective foil. The applied pattern could be added for example by a printing procedure.
The second technological aspect concerns the stability of the output signal measured by the sensor. Because the selected technology uses organic materials, stability is an essential aspect.
Another idea is to apply a certain driving voltage/current. The amplitude of the signal applied to the sensor will alter the behaviour of the generated signal. We measured a significant difference in contrast between the light and dark measured signal (i.e. the measured signal at the highest and lowest driving level), depending on the signal applied to the sensor.
Another solution to protect organic materials from external contamination is encapsulation. Methods known from OLED devices described by A. Badano et al, High-fidelity medical imaging displays, p. 79-81 SPIE (2004), can be used here as well.
The third technological aspect concerns improvements of the sensor itself.
A major distinction when comparing the transparent sensor technology to existing technologies is that it is not required that the sensors are integrated into a component of the display. For example, existing solutions can be sensors integrated in the pixel structure of a panel. Contrary to this, the transparent sensor described earlier can be produced e.g. on the front glass of the display. It is a rather (technologically) straightforward extrapolation to extend this technology to a sensor that can be clipped onto the surface of various types of light sources, or various display types, without imposing a necessary redesign of the sensor (e.g. the number of sensors, the size of the sensors, ...).
Using hybrid systems with particles such as quantum dots, instead of using organic layers, to generate charges and guide them to the electrodes is advantageous. Hybrid structures using a mix of organic and inorganic materials can be used.
Even more exotic structures, such as a bilayer device that uses a quantum-dot exciton generation layer and an organic charge transport layer, could be used. The quantum dots can be for example colloidal CdSe (cadmium selenide) quantum dots. The organic charge transport layer can consist of Spiro-TPD. These materials are also used and described by John et al, Organic lateral heterojunction devices for vapour-phase chemical detection, MIT PhD.
As a consequence of the design of the sensors, there is a lot of freedom in the selection of the parameters. Here we discuss some parameters that have some freedom, and describe the limiting factor:
a) Size of the sensors
Depending on the application of the sensor, the sensor can require a different size. A larger sensing area implies longer fingers/more fingers, such that the amplitude of the measured signal will increase. However, larger sensors also require a larger percentage of the active area of the display, which limits the number of sensors. In addition, using longer fingers or using more fingers increases the chance that an error would occur in the processing, mainly in the lithography process. Sensors cannot be made infinitely small either, otherwise the measured signal could become too weak.
Depending on the performance of the readout electronics, the signal can still be detected sufficiently accurately, or not.
b) Number of sensors
Depending on the application(s) of the sensor, the required spatial resolution of the measurements can be high or low. This can be realized easily by altering the photolithography step. The limiting factor of the spatial resolution is the size of the sensors. As described above, sensors cannot be made too small or too large. In addition, the electronics become more complex when using more sensors. This requires multiplexers with more channels or a plurality of multiplexers. Also, the design of the lithography becomes more complex. All sensors need two conducting tracks leading to them. This can result in a very complex design of the tracks.
Another idea would be to implement the invention in a display device which allows colour measurements. Measuring colour sequentially can be one option. By using a panchromatic exciton generation layer, the sensor can be made sensitive to the entire visual spectrum. However, measuring the luminance of the different colour components and measuring the chroma requires a calibration. Indeed, depending on the spectrum, other measured values could be obtained. Therefore, the sensor should be matched to the spectrum of the display.
Measuring colour non-sequentially can be another option. A non-sequential measurement would be even more valuable, as this would provide information in real time instead of interrupting for a measurement. A calibration to the spectrum is needed, since the absorption spectrum of the exciton generation layer does not correspond to the absorption spectra of XYZ colour filters.
In addition a stack of 3 sensors could be used to obtain colour measurements. Each sensor of the stack should use an exciton generation layer with a different absorption spectrum. However, two ITO layers are needed for each sensor, which results in some luminance loss and added colouring.
Instead of stacking the sensors, the sensors could also be placed next to each other and be spatially separated. However, since the exciton generation layers have different absorption spectra, colour differences could be perceived over the display area. This could be overcome by adding specific colorants to the different sensors such that each sensor is perceived with a similar hue and the same luminance.
Additionally, the use of a touch screen could also be implemented. When one of the sensors is touched, the external light is blocked. The measured light is then a combination of the light transmitted through the sensor and the light reflected off one's finger.
The alignment of the touch sensor to the screen can be done easily by applying patterns. The rejection of changing display content is also possible: we can use the touch sensor while the display content is changing, since we know from the calibration algorithms what should be measured, and compare that to what is actually measured.
For some applications, it could be useful to highlight the sensor visually, when it is operating. This can for example be useful to indicate when the sensor is performing offline measurements.
Furthermore, a laser could be used to highlight the individual sensors if it is directed accordingly. Similar to the lasers, LEDs could also be used. In addition, UV light can be used to light up the sensor. Some organic materials exhibit photoluminescent behaviour. An example of such a material is N,N'-bis(3-methylphenyl)-N,N'-bis(phenyl)-benzidine (TPD), which can be used as a charge transport layer in an embodiment of the sensors. When UV light impinges on TPD, blue light is re-emitted, which renders the sensor visible.
Calibration algorithms together with the transparent sensor are the most valuable use case for (medical) displays. The sensor is used to measure a property of the light emitted by the display, in order to check if or ensure that it still operates within tolerable parameters.
As described above, the sensor can be used for colour measurements. In the non-sequential mode, specific colour patterns could be applied to the display that allow quantifying the luminance non-uniformity of a certain colour component of the display.
Using these measurements, the non-uniformity of the whitepoint can also be measured. For instance, the sensors are calibrated to the display spectrum and the contributions of the different primaries are known. However crosstalk can also influence the measurement.
DICOM compliance is one of the essential characteristics of medical displays. It is essential that DICOM compliance is maintained throughout the lifetime of the display and, in addition, that the DICOM compliance of the display is verified at multiple positions. Currently, technologies such as the Barco IGuard check if the display remains DICOM compliant over time. However, these technologies only measure near the border of the active area of the display. The new sensor could overcome this limitation and is able to measure at different positions on the active area of the display. DICOM compliance could be checked e.g. by measuring 64 uniform patterns spread equally over the dynamic range of the display. If the measured values are within a 10% deviation of the ideal DICOM curve, the display is considered to be DICOM compliant.
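A minimal sketch of such a compliance check is given below; the 10% tolerance and the use of 64 patterns follow the text above, whereas the ideal-curve values in the example are assumed inputs that would in practice come from the DICOM GSDF.

```python
# Sketch of the DICOM compliance check described above.
def is_dicom_compliant(measured_luminances, ideal_luminances, tolerance=0.10):
    """True if every measured value is within the tolerance of the ideal curve."""
    if len(measured_luminances) != len(ideal_luminances):
        raise ValueError("one measurement per test pattern is expected")
    return all(abs(measured - ideal) <= tolerance * ideal
               for measured, ideal in zip(measured_luminances, ideal_luminances))


# Example with 3 of the 64 patterns (values are illustrative only).
print(is_dicom_compliant([0.52, 30.4, 398.0], [0.50, 31.0, 400.0]))
```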
In addition, instead of only determining whether the display is still compliant with the DICOM standards, we could redo the entire DICOM calibration of the display. In practice, this can be done by altering the look-up table (LUT) that is applied to the incoming image to obtain a DICOM calibrated image. To obtain this LUT, the native behaviour of the display could be measured (without the initial DICOM calibration); the resulting values can then be used in combination with the ideal DICOM curve to obtain the required LUT.
In order to use the new sensor, it has to be calibrated using a reference sensor. To calibrate the luminance measurements, measurements should be performed at different luminance values by the new sensor and the reference sensor. Since the new sensor does not include a V(lambda) filter, the measurements should be made using the display for which the new sensor is to be used. The different luminance values can be obtained by depicting uniform images of certain driving levels on the display. Based on the two obtained sets of measurements for the new sensor and the reference sensor, a LUT can be created that is to be applied to all the measurements of the new sensor.
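By way of illustration, the sketch below builds such a LUT from paired readings of the new sensor and the reference sensor, and applies it to later readings by linear interpolation; all names and numbers are assumptions made for the example.

```python
# Hedged sketch of creating and applying a calibration LUT for the new sensor.
def build_calibration_lut(new_sensor_readings, reference_readings):
    """Return a sorted list of (new_sensor_value, reference_value) pairs."""
    return sorted(zip(new_sensor_readings, reference_readings))


def apply_lut(lut, value):
    """Convert a new-sensor reading to a calibrated value by linear interpolation."""
    xs, ys = zip(*lut)
    if value <= xs[0]:
        return ys[0]
    if value >= xs[-1]:
        return ys[-1]
    for (x0, y0), (x1, y1) in zip(lut, lut[1:]):
        if x0 <= value <= x1:
            return y0 + (y1 - y0) * (value - x0) / (x1 - x0)


# Example: three driving levels measured by both sensors (illustrative values).
lut = build_calibration_lut([0.1, 0.4, 0.9], [2.0, 110.0, 420.0])
print(apply_lut(lut, 0.65))   # interpolated, calibrated luminance
```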
In addition, calibration is possible in real-time by using the known pixel content and the viewing angle. Instead of using specific patterns offline such that the display can't be used for a brief period of time, the pixel content that is displayed at that time could be used to calibrate the display in real-time.
Furthermore, one can display additional patches if certain data is missing during normal use. It can be that not all the required patterns for the desired calibration are shown during normal use. In that case, additional patterns should be displayed in order to measure all required data.
Moreover, ambient light measurement can be used to provide additional useful information. In many applications, the new sensor's ability to measure ambient light can be used to advantage. Some calibration algorithms using this ability are elaborated here.
For all the applications where ambient light is measured, a methodology should be thought of that allows calibrating the new sensor itself to measure the ambient light.
Indeed, when measuring the external light only, the measured light is a combination of the light impinging directly onto the sensor, as well as external light that has been reflected onto the display surface. The amount of reflected light can be determined e.g. by using an integrating sphere.
To measure whether ambient light conditions are according to standards for radiology, the ambient light which has an influence on the perceived contrast of a display has to be measured and isolated. It is known that the (perceived) contrast has a direct impact on diagnostic performance. Therefore, standards have been created that ensure a satisfactory performance. The satisfactory performance is illustrated in Fig. 6. The luminance ratio (L_white + L_ambient) / (L_black + L_ambient) should be higher than 250, which ensures satisfactory ambient light conditions. L is the measured luminance and the subscript specifies the conditions under which the luminance was measured.
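This criterion can be transcribed directly into a small check; the threshold of 250 comes from the text above, while the example luminance values are assumptions.

```python
# Direct transcription of the luminance ratio criterion into a check.
def ambient_conditions_ok(l_white, l_black, l_ambient, threshold=250.0):
    """True if (L_white + L_ambient) / (L_black + L_ambient) exceeds the threshold."""
    return (l_white + l_ambient) / (l_black + l_ambient) > threshold


# Example values in cd/m2 (illustrative only).
print(ambient_conditions_ok(l_white=500.0, l_black=0.5, l_ambient=1.0))
```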
Typical systems currently used for medical displays split the measurements using two sensors. One sensor is used to measure the ambient light without the influence of the light emitted by the display, while the other sensor measures the light emitted by the display while shielding it from the ambient light. The combination of these sensors results in the required data to verify the compliance. However, the disadvantage is that only a measurement at a single location is used for both measurements. The sensor measuring the ambient light is typically placed on the bezel of the display, while the sensor measuring the light emitted by the display is often placed near the border of the active area of the screen. These limitations can be overcome by using the transparent sensor. It is no longer restricted to measurements at a specific area. This could be interesting when the light deviates significantly from diffuse ambient light. The ambient light and display measurements are then both performed by a single sensor, namely the new sensor.
Note however that the new sensor does not include a V(lambda) filter, and it should be calibrated to the spectrum of the external light as well. This could be done by using an external light sensor to measure some sort of "average luminance" and equating that value to the average of the new sensors over the screen. The new sensors could still be used to detect any local maxima or minima in external light to ensure compliance over the entire screen.
It is important to remark that the measurements of the ambient light do not include angular dependencies. Therefore, the exact profile seen by the viewer cannot be measured in general. However, medical displays typically have a broad viewing angle, such that an excess of ambient light under any angle could degrade the performance.
In specific market segments, medical displays are used in environments which have a relatively high ambient light level. In these applications, it could be interesting to adapt the backlight to the amount of measured ambient light. A single sensor integrated in the bezel can be inadequate if the display is not lit uniformly. If any local maxima are detected, the backlight level can be increased to ensure an optimal performance. In addition, this adaptation can also be done for reflective LCDs.
Instead of implementing a global backlight adjustment based on the detected impinging ambient light, a local correction can be performed e.g. when using a LED backlight.
This could ensure a constant perceived contrast over the display area.
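As an illustrative sketch only, such a local correction for a zoned LED backlight could raise the drive level of zones where more ambient light is measured; the linear gain model and all values below are assumptions and not taken from the application.

```python
# Hedged sketch of a per-zone backlight correction from ambient light readings.
def local_backlight_levels(ambient_per_zone, base_level=0.6, gain=0.002, max_level=1.0):
    """Return a backlight drive level per zone, higher where more ambient light is measured."""
    return [min(max_level, base_level + gain * ambient) for ambient in ambient_per_zone]


# Example with three zones and illustrative ambient readings.
print(local_backlight_levels([20.0, 150.0, 45.0]))
```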
Furthermore, a specific measurement protocol can be used to obtain stable measurements. When using certain organic materials and certain sensor parameters (size, number of fingers, distance between the fingers, ...) it can be that the sensor does not yield stable, reliable measurements. Therefore, it could be useful to apply a certain measurement protocol to improve the stability. For example, instead of measuring only once with the sensor, we could average out 10 consecutive measurements, separated by milliseconds in time, to obtain more reliable results. Many similar possibilities can be thought of, depending on the exact sensor parameters.
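A sketch of such a protocol is given below: ten consecutive readings separated by a few milliseconds are averaged. The sensor read-out function is an assumed placeholder.

```python
import random
import time

# Sketch of the averaging measurement protocol mentioned above.
def stable_measurement(read_sensor, repeats=10, interval_s=0.005):
    """Average several consecutive sensor readings taken a few milliseconds apart."""
    readings = []
    for _ in range(repeats):
        readings.append(read_sensor())
        time.sleep(interval_s)
    return sum(readings) / len(readings)


# Example with a noisy fake sensor around 200 cd/m2 (illustrative only).
print(stable_measurement(lambda: 200.0 + random.uniform(-2.0, 2.0)))
```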
Calibration of the new transparent sensor itself for colour is also possible. Similar measurements can be taken for the luminance of the different colour components of the display. Note that the new sensor does not include a filter with colour matching functions, demanding a calibration against a reference sensor for a display with a specific spectrum. It is possible that the transparent sensor itself degrades over time when using it over extended periods of time. Therefore, it could be interesting to do a calibration of the new sensor itself in the field.
A recalibration for external light can be done by a reference light source which preferably has a spectrum similar to the spectra used typically in the environment of the displays. The source can be used in combination with various neutral density (ND) filters, to obtain measurements at different light levels.
An external luminance sensor and the new transparent sensor can be used to measure the luminance emitted by the display at certain driving levels. Based on these measurements, a new LUT can be created that can be used to determine the actual values from the measured values.
In addition, the described solution can be used in order to facilitate the use of such devices in advertisements. Advertisement is a business where various images from different companies can be displayed on various screens at different locations. This business also involves significant financial value. Therefore, it is vital for customers to be sure that their advertisements are displayed on the display. The new sensor is suitable for this purpose. For example, a low resolution image of the content depicted on the display can be captured. This could be done for example using a matrix of 10 by 10 sensors. Alternatively, every customer could design his own logo, and a detailed image of the logo can be captured by a matrix of the new sensor.
The backlight can be adjusted depending on the ambient light measured by the new sensor. When using an LED backlight, the LEDs could degrade differently over time, resulting in non-uniformities over the area of the display. In the most extreme case, a LED can fail catastrophically resulting in a very visible spatial non-uniformity. These degradations can be measured by the new sensor at the level of the backlight (e.g. one sensor per LED) or at the final stage (after the panel, on the front glass).
Aside from being a technology with many applications in combination with a display, the technology can be useful in many other fields. One of these fields is next generation windows. Depending on the amount of light impinging on a window (e.g. installed in an office or at home), shutters can be used to (partially) block the light. For example, electric curtains can be activated if the measured value exceeds a certain threshold value. Alternatively, smart glass can be used in combination with the sensors. Smart glass typically functions over the entire area of the glass, but variants can be created that block the light only locally.
Similar to the previous idea, a (local) dimming of the windows of transportation means such as cars, buses or trains can be created. In addition, people can be alerted when dangerous radiation is detected outside. When carefully selecting the organic layers, a sensor can be created that is sensitive to a specific wavelength range. As a consequence, sensors which are sensitive to certain harmful radiation can be created.
For example, when an excess of UV radiation is measured outside, people can be warned of the danger. Similar to the latter, the organic materials can be selected such that the sensor senses optimal conditions to "brown".
Another technology area where the new sensor could be used is for 3D displays.
Volumetric 3D display: calibrate each layer if stacked layers are used. 3D displays can be made for example using stacked semi-transparent light modulating/emitting layers, such as semi-transparent OLEDs as described in US 6,720,961 or LC layers as described in US 5,745,197. Every layer could be accompanied by one or multiple copies of the new sensor.
Also, the use of rubbed Left and Right sensors to calibrate a stereoscopic display with two polarizations is possible. By rubbing the sensor material, one could allow only linearly polarized light in one direction to be transmitted through the sensor. The light emitted by most LCDs is linearly polarized in one direction, and ambient light is typically unpolarized. Therefore, the sensor can be placed in front of the screen such that the transmitted polarization corresponds to the polarization emitted by the display. Currently, many different 3D projection technologies are available. Figure 7 illustrates the possible principles for 3D projection using two projectors (left figure) or a single projector (right figure). The lower figure illustrates a mirror configuration using two projectors.
In all these solutions, the new transparent sensor can be used. When using two projectors, two matrices of new sensors can be used inside the projectors to perform measurements. When using a single projector, two matrices of the new sensor can be used as well; by placing them in the optical path of the two light bundles before they are merged, both images that compose the stereo image can be measured. Calibration algorithms as described earlier can then be used on both. In addition, it is important to check if the luminance is equal for both images as well. A possible internal structure of a 3D projector using a single projector is presented by L. Bogaert et al, Projection display for the generation of two orthogonal polarized images using liquid crystal on silicon panels and light emitting diodes, Optical Society of America (2008).
The new sensor could become a valuable instrument in the production flow, and specifically in an efficient production flow. First, the camera that scans a screen is replaced. Currently, the absolute value of the luminance over the screen is measured by positioning a camera such as a Minolta CA-210 perpendicular to the screen and translating it over the area of the screen. By using this technique, the viewing angle dependency of the display is not an issue. While this is a valuable measurement, it can require some time in the production process. It could be replaced by an array of sensors at the desired positions of the display area. A matrix of calibrated sensors of the new sensor type could be clipped onto the display and the absolute value of the luminance of the display can be measured promptly in a single step.
Currently, a very high-end camera is used to measure the luminance profile of the display, which is to be corrected using the Points Per Unit (PPU) or Per Zone Uniformity (PZU) algorithm. Later on, the obtained profile is verified using the Minolta luminance meter as described in the previous idea. Both steps could be combined using a matrix of the new sensor. There will be a trade-off between complexity and detail of measurement when designing the matrix.
Large LED walls consist of a stitching of multiple smaller panels, which require having the same luminance within a certain tolerable limit; otherwise an inconsistent image is produced. The new transparent sensor could also be used here. Instead of a visual inspection, a large panel with a limited number of new sensors could be used to measure the output of a single panel. Accordingly, panels can be sorted by luminance output or the signal driving of the panels can be adapted such that a uniform image of a LED wall is obtained.
In addition, the sensor could be used for ambient light measurements, and the environment could be adapted accordingly by (locally) obstructing the light. In this idea, we take this to the next level. Instead of only using the new sensor to measure the ambient light conditions, additional light sources could be used if necessary to obtain the desired ambient light. This could be realized when the sensor is integrated on the display, but also when the sensor is integrated in smart windows. Imagine that the sun is setting; this would require additional light in the room. This idea would allow additional light sources to be activated when decreasing light is measured.
Furthermore, the new sensor can be made on a larger scale, potentially up to the entire active area of a display. This could improve on the current sensors that only measure at the border of the active area of the display. Instead of limiting the sensor to a very small area at the border of the screen and extrapolating the rest, an average over the entire screen could be measured, which is a more reliable metric. In addition, this would increase the amplitude of the signal, which allows a more stable measurement.
Similar to other technologies, the new sensor can also be used to perform measurements on projectors. The new sensor can be placed at various positions throughout the optical path of the projector. The semi-transparent technology of the new sensor can also lead to a set of new measurement devices.
As described above, the new sensor can be used for luminance measurements. In addition, the sensor is semi-transparent, which allows a see-through sensor to be created. Different types of practical realization could be thought of, such as a telescope configuration (i.e. a single tube, to be used with a single eye) or a sensor to be used with both eyes, resembling binoculars. The semi-transparent technology of the new sensor can also lead to a novel type of power meter. The sensor could be used as a semi-transparent optical power meter. However, we should once again remark that it has to be calibrated in accordance with the spectrum of the source.
The rubbing principle could be used to measure only one polarization of the incoming light. Multiple layers can be used to measure two perpendicular polarization directions.
Two polarization components can be measured, e.g. by using a polarization-dependent beam-splitter to divide the beam into its polarized components. Both components can then be measured using the new sensor. Afterwards, the beams can be recombined to obtain a single bundle again.
The new sensor can additionally be used as a sensor that is able to measure the ambient light. The present invention provides a sensor that is able to measure the position of a light beam directed somewhere onto the screen. Also, measuring the position of a laser beam without interrupting it is an option. When using a matrix of new sensors placed close to each other, the position of a laser beam can be detected at a specific spatial position. The sensor that measures the highest peak corresponds most closely to the actual position of the beam.
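A minimal sketch of this peak-based detection is given below; the names and the optional centre-of-gravity refinement are assumptions added for illustration and do not form part of the described sensor itself.

```python
import numpy as np

def beam_position(readings: np.ndarray, pitch_mm: float):
    """Estimate the position of a laser beam on a matrix of transparent sensors.

    readings -- 2D array of sensor values, one per sensor in the matrix.
    pitch_mm -- centre-to-centre spacing of the sensors.

    The coarse position is the sensor with the highest reading; a
    centre-of-gravity over all readings refines it to sub-pitch accuracy
    (assuming the total signal is non-zero).
    """
    row, col = np.unravel_index(np.argmax(readings), readings.shape)
    coarse = (col * pitch_mm, row * pitch_mm)
    rows, cols = np.indices(readings.shape)
    total = readings.sum()
    refined = ((cols * readings).sum() / total * pitch_mm,
               (rows * readings).sum() / total * pitch_mm)
    return coarse, refined
```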
Another application of the new transparent sensor can be military training: detecting the position of a laser sight without obstructing the background. In military training, the position of a laser sight can be monitored in a similar fashion as described in the previous idea.
This can e.g. be used to validate the accuracy of the shots of the trainee, without the loss of any resources.
Also, recording the position of laser pointers/collimated light bundle during presentations is possible. Traditional blackboards are slowly being replaced by more modern, electronic equivalents. The new sensor could also introduce added value in this market. Laser pointers or other collimated light sources can be detected spatially using a matrix of the new sensor. Of course, the sensor can be valuable for other types of presentations as well.
Charge coupled device (CCD) image sensors are very common nowadays in many applications. A well-known application is digital photography. In digital photography, a CCD sensor is at the heart of the device, since it is the element that captures the image. This element is non-transparent. It could be replaced by a matrix of the new sensor. This could be beneficial, e.g. for digital reflex cameras. The basic optics of a single-lens reflex camera are presented in figure 8. From this figure, it is clear that the optics could be significantly reduced if the CCD sensor is replaced by a semi-transparent sensor. Instead of using a separate path leading to the eyepiece (8) and the sensor (4), both paths could be merged into a single path. In addition to reducing the optics (some lenses are still needed after the sensor), the cost and size of the device could also be reduced. Semi-transparent OLED displays are known. The new sensor could also be used in combination with this technology, to create a semi-transparent display including all the functionalities described before. In addition, the technology could potentially result in various new camera technologies.
Recording of the images seen by a person, by integration on contact lenses or glasses, is also possible. Cameras can also be transformed into significantly different shapes. For example, instead of using a conventional camera type, the camera can be directly integrated on a viewer's head, similar to glasses, but using more complex optics. The image needs to be focussed on the sensors and corrected again to ensure that the observer is still able to interpret the image. In addition, a new type of 3D camera, which measures directly what the eyes see, can be implemented. The basic principle of a 3D camera is presented in Fig. 9 and can be found in the article of J.-Ch. Barrière et al., Development of highly transparent Fluorescent Optical Sensor for Transverse Positioning of Multiple Elements with respect to a reference Laser Beam, CEA Saclay, France. The optics can be simplified and the image captured can be observed easily without the need for a lot of additional optics.
Measurement of the profile (cross-section) of a bundle of rays or a laser beam at multiple positions in space is also a possibility. Some principles are known to measure the position of a laser beam using a semi-transparent position sensor. Using a matrix of the new sensor, a measurement can be made of both the position and the profile of the bundle or laser beam. When combining multiple layers of new sensor matrices, a multi-layer profile of the bundle can be detected to eventually generate a 3D profile.
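A sketch of how such a multi-layer measurement could be assembled into a 3D profile is given below, assuming every layer is read out as a 2D array of equal shape; the function name and the spacing parameter are hypothetical.

```python
import numpy as np

def beam_profile_3d(layer_readings, layer_spacing_mm: float):
    """Stack 2D beam-profile measurements from several semi-transparent sensor
    matrices placed along the beam into a single 3D profile.

    layer_readings   -- sequence of 2D arrays, one per sensor layer, ordered
                        along the propagation direction and of equal shape.
    layer_spacing_mm -- distance between successive layers.
    """
    profile = np.stack([np.asarray(layer, dtype=float) for layer in layer_readings], axis=0)
    z_positions = np.arange(profile.shape[0]) * layer_spacing_mm
    return profile, z_positions
```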
Efficiency checks can be built into implementations comprising the new transparent sensor. Using a matrix of the new sensor, the content on the screen can be monitored.
This principle can be used for many applications, for example monitoring whether what employees are actually doing on their PC is relevant to their job or not.
Another area of interest where the described sensor technology can be used is in avionics. In avionics it is very important that potentially hazardous UV radiation is filtered out to avoid any damage. Therefore, windscreens are designed to filter out UV radiation while transmitting visible light, so that visibility is not degraded. The new sensor could also have added value for this application. The light perceived by the pilot or passengers could still reach a hazardous UV level, or could simply be too bright to be comfortable. Therefore, the new sensor can be used in these cases as well. As described before, the new sensor can be optimized to detect UV or visible light. A system could be integrated into the aircraft that automatically applies an optical filter when a certain threshold value is measured.
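A threshold check of this kind could look like the assumed sketch below; the function name, units and limits are illustrative only and would in practice be defined by the relevant aviation requirements.

```python
def windscreen_filter_command(uv_reading: float, uv_limit: float,
                              luminance: float, luminance_limit: float) -> dict:
    """Decide whether the windscreen filter should be engaged, based on the
    UV and visible-light levels measured by the integrated sensor."""
    return {
        "apply_uv_filter": uv_reading > uv_limit,
        "apply_dimming": luminance > luminance_limit,
    }
```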
Similar to the previous idea, the new sensor can be applied to a spacecraft. For spacecraft, the requirements are even more critical, since there are no longer any atmospheric layers present that absorb (part of) the hazardous radiation.
The new sensor can also be used to monitor light sources automatically, thus simplifying maintenance. The new sensors can be integrated in the vicinity of the light source, e.g. on the housing. By detecting catastrophic failures at the end of life, maintenance staff can be alerted when replacements are required.
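One simple way to express such an end-of-life check is sketched below; the failure fraction and the function name are assumptions, and the reference value is taken to be a reading stored when the source was installed.

```python
def lamp_needs_replacement(current_reading: float,
                           reference_reading: float,
                           failure_fraction: float = 0.5) -> bool:
    """Flag a monitored light source for maintenance.

    current_reading   -- latest sensor value measured near the light source.
    reference_reading -- value recorded when the source was installed.
    failure_fraction  -- fraction of the initial output below which the source
                         is considered failed; a sudden drop towards zero is a
                         catastrophic failure.
    """
    return current_reading < failure_fraction * reference_reading
```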
The new sensor can also be used for user-interactive electronic devices. For example, keyboards can be replaced by matrices of the new sensor. A light source can be used in the device, and based on the change of measured light resulting from an interaction, an action can be registered. In addition to the variety of applications in specific domains, some applications can be thought of that do not belong to a specific category. When an object casts a shadow over a matrix of the new sensors, the shadow can be captured. In addition, image processing can be applied to obtain a binary black/white image. Moreover, the size of an object on the display can be measured while using the display (in real time). If a non-transparent object of a certain size is put on top of a display which has a matrix of new sensors, the size of the object can be measured.
This can be done e.g. by applying a uniform white pattern to the display, then measuring it before and after placing the object in front of the display. When both measurements are equal, there is no object present at that sensor of the matrix of sensors, while an object is present if the measurements are different.
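A minimal sketch of this before/after comparison is given below, assuming the sensor matrix is read out as a 2D array and that each sensor covers a known area; the threshold value and all names are hypothetical.

```python
import numpy as np

def object_footprint(before: np.ndarray, after: np.ndarray,
                     sensor_area_mm2: float, threshold: float = 0.1):
    """Estimate the footprint of a non-transparent object placed on the display.

    before, after   -- 2D sensor readings with a uniform white pattern shown,
                       taken without and with the object on the display.
    sensor_area_mm2 -- area covered by one sensor of the matrix.
    threshold       -- relative signal drop above which a sensor is considered
                       covered by the object.
    """
    relative_drop = (before - after) / np.maximum(before, 1e-9)
    covered = relative_drop > threshold          # binary black/white image
    return covered, float(covered.sum()) * sensor_area_mm2
```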
A (desktop) scanner uses a light source directed at a specific region of the object to be scanned and a linear CCD array to capture the light reflected by the object. By translating the light source and CCD array, the entire image is scanned. An alternative technology can be created using an array of the new sensor. The array of new sensors can be placed on top of the object to be scanned, while a uniform light source is used underneath the array. Based on the measurements, a scan can be made. By using different sources with different spectra, or applying colour filters to the source, a colour scan can be made.
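The colour-scan variant could, under the assumption that three scans are taken under red, green and blue illumination (or through corresponding colour filters), be combined as in the sketch below; the normalisation and names are illustrative only.

```python
import numpy as np

def colour_scan(scan_red, scan_green, scan_blue):
    """Combine three scans taken with the sensor array under red, green and
    blue illumination into a single RGB image (values in [0, 1])."""
    channels = []
    for scan in (scan_red, scan_green, scan_blue):
        scan = np.asarray(scan, dtype=float)
        channels.append(scan / max(float(scan.max()), 1e-9))  # per-channel normalisation
    return np.stack(channels, axis=-1)  # H x W x 3
```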
Claims (2)
- 1. A display device substantially as described herein.
- 2. Use of a device of claim 1.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1022139.8A GB2489657A (en) | 2010-12-31 | 2010-12-31 | A display device and sensor arrangement |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
GB1022139.8A GB2489657A (en) | 2010-12-31 | 2010-12-31 | A display device and sensor arrangement |
Publications (2)
Publication Number | Publication Date |
---|---|
GB201022139D0 GB201022139D0 (en) | 2011-02-02 |
GB2489657A true GB2489657A (en) | 2012-10-10 |
Family
ID=43599142
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
GB1022139.8A Withdrawn GB2489657A (en) | 2010-12-31 | 2010-12-31 | A display device and sensor arrangement |
Country Status (1)
Country | Link |
---|---|
GB (1) | GB2489657A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9454265B2 (en) | 2013-09-23 | 2016-09-27 | Qualcomm Incorporated | Integration of a light collection light-guide with a field sequential color display |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1274066A1 (en) * | 2001-07-03 | 2003-01-08 | Barco N.V. | Method and system for real time correction of an image |
WO2008050262A1 (en) * | 2006-10-23 | 2008-05-02 | Koninklijke Philips Electronics N.V. | Backlight system |
EP2159783A1 (en) * | 2008-09-01 | 2010-03-03 | Barco N.V. | Method and system for compensating ageing effects in light emitting diode display devices |
US20100096996A1 (en) * | 2008-10-20 | 2010-04-22 | Industrial Technology Research Institute | Light source detection and control system |
GB2466846A (en) * | 2009-01-13 | 2010-07-14 | Barco Nv | Sensor system and method for detecting a property of light emitted from at least one display area of a display device |
Also Published As
Publication number | Publication date |
---|---|
GB201022139D0 (en) | 2011-02-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9671643B2 (en) | Display device and use thereof | |
US9336749B2 (en) | Display device and means to measure and isolate the ambient light | |
GB2486921A (en) | Compensating for age effects in active matrix displays | |
US20130278578A1 (en) | Display device and means to improve luminance uniformity | |
KR100464114B1 (en) | Displaying device and displaying method and manufacturing method of the device | |
CN109327576B (en) | Electronic device, control method thereof and control device thereof | |
KR20090122127A (en) | Electro-optical device and electronic apparatus | |
KR20180106992A (en) | Organic light-emitting diode display device | |
US20110073876A1 (en) | Light-emitting device and display | |
US20100246212A1 (en) | Backlight and displaying/imaging apparatus | |
CN111200672B (en) | Electronic device, control method thereof and control device thereof | |
US20220121301A1 (en) | Display device | |
KR20120139122A (en) | Liquid micro shutter display device | |
US20020000772A1 (en) | Image display apparatus | |
JP2011028058A (en) | Display device and electronic apparatus | |
KR20130008096A (en) | Optical film for display device and display device having the same | |
WO2012089847A2 (en) | Stability and visibility of a display device comprising an at least transparent sensor used for real-time measurements | |
CN109994523B (en) | Luminous display panel | |
GB2489657A (en) | A display device and sensor arrangement | |
KR20170041330A (en) | Display device | |
JP2011107454A (en) | Display device | |
CN114725174A (en) | Display device, control method, and electronic apparatus | |
TW202004272A (en) | Display panel and color array substrate | |
JP5239293B2 (en) | Liquid crystal device and electronic device | |
US10989859B2 (en) | Backlight unit and display apparatus including the same |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WAP | Application withdrawn, taken to be withdrawn or refused ** after publication under section 16(1) |