CN116601532A - Improved display panel grounding
- Publication number
- CN116601532A (application CN202180083682.5A)
- Authority
- CN
- China
- Prior art keywords
- display device
- metal bridge
- display
- color filter
- polarizer
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Devices For Indicating Variable Information By Combining Individual Elements (AREA)
Abstract
A display device improves the ground connection of a polarizer (250B) by using a metal bridge (340) that couples the polarizer to the ground (320) of the display device. The display device includes: a backlight unit (BLU) for providing light for displaying an image; a plurality of pixels for modulating light provided by the BLU; a polarizer (250B) for filtering light provided by the BLU; and a metal bridge (340). The metal bridge (340) is disposed in a non-display area surrounding a display area (310) of the display device.
Description
Cross Reference to Related Applications
The present application claims the benefit of U.S. provisional application No. 63/124,584, filed on 11/12/2020, which is incorporated by reference in its entirety.
Technical Field
The present disclosure relates generally to display devices and, more particularly, to improving grounding of polarizers in liquid crystal displays.
Background
Proper grounding of electronic components provides various benefits, such as protecting the components and users from electrostatic discharge (ESD). Grounding an electronic component reduces the amount of charge or static electricity that it carries. Static electricity can accumulate in dielectric or insulating materials (e.g., polymers or plastics). This problem is particularly important in body-worn electronics (e.g., head-mounted displays (HMDs)), where accumulated charge can discharge when the user's body contacts the device, causing discomfort.
Disclosure of Invention
The present invention aims to overcome at least some of the disadvantages of the prior art. According to the present invention, a display device, a color filter for a display device and a head mounted display according to the appended claims are disclosed.
In one aspect, the present invention is directed to a display device that uses a metal bridge for coupling a polarizer to ground to improve the ground connection of the polarizer. The display device includes: a backlight unit (BLU) for providing light for displaying an image, a plurality of pixels for modulating the light provided by the BLU, a polarizer for filtering the light provided by the BLU, and a metal bridge for coupling the polarizer to a ground of the display device. The metal bridge is disposed in a non-display area surrounding a display area of the display device.
In an embodiment of the display device according to the invention, the display device comprises:
a backlight unit (BLU) for providing light for displaying an image;
a plurality of pixels for modulating light provided by the BLU, the plurality of pixels being disposed in a display area of the display device;
a polarizer configured to filter light provided by the BLU based on the modulation performed by the plurality of pixels; and
a metal bridge coupled to the polarizer, the metal bridge disposed in a non-display area surrounding a display area of the display device, wherein the metal bridge does not overlap the display area of the display device.
In an embodiment of the display device according to the invention, the display device may further comprise: a color filter, wherein a metal bridge is disposed between the color filter and the polarizer. Further, a first surface of the metal bridge may be adhered to a first portion of the color filter, a second surface of the metal bridge opposite to the first surface may be adhered to a first portion of the polarizer, and a second portion of the polarizer may be adhered to a second portion of the color filter.
In an embodiment of the display device according to the invention, the display device may further comprise: a connector coupled to the metal bridge for connecting the metal bridge to a ground of the display device. Further, the connector may be a silver (Ag) point connector.
In an embodiment of the display device according to the invention, the metal bridge may further comprise: a first triangular portion corresponding to a first corner of the polarizer. In addition, the metal bridge may further include: a second triangular portion corresponding to a second corner of the polarizer; and a connecting bar connecting the first triangular portion to the second triangular portion.
In an embodiment of the display device according to the invention, the metal bridge may further comprise: a first triangular portion corresponding to a first corner of the polarizer. In addition, the metal bridge may further include: a first extension bar extending in a first direction from a first corner of the first triangular portion; and a second extension bar extending from a second corner of the first triangular portion in a second direction perpendicular to the first direction.
In an embodiment of the display device according to the invention, the display device may further comprise: an optically clear adhesive (optically clear adhesive, OCA) disposed on the polarizer for adhering the polarizer to the metal bridge. Furthermore, the OCA may be electrically conductive.
In an embodiment of the display device according to the invention, the metal bridge may be made of a non-transparent material.
In an embodiment of the display device according to the invention, the metal bridge may be patterned so as not to cover the display area of the display device.
In one aspect, the present invention is directed to a color filter for a display device, the color filter comprising:
a color filter glass having a display region and a non-display region surrounding the display region; and
a metal bridge disposed on the non-display region of the color filter glass, wherein the metal bridge does not overlap the display region of the color filter glass.
In an embodiment of the color filter according to the present invention, the metal bridge may include: a first triangular portion corresponding to a first corner of the color filter glass. In addition, the metal bridge may further include:
a second triangular portion corresponding to a second corner of the color filter glass; and
a connecting strip connecting the first triangular portion to the second triangular portion.
In an embodiment of the color filter according to the invention, the metal bridge may further comprise:
a first extension bar extending in a first direction from a first corner of the first triangular portion; and
a second extension bar extending from a second corner of the first triangular portion in a second direction perpendicular to the first direction.
In an embodiment of the color filter according to the present invention, the color filter may further include: a connector coupled to the metal bridge for connecting the metal bridge to a ground of the display device. Further, the connector may be a silver (Ag) point connector.
In an embodiment of the color filter according to the invention, the metal bridge is made of a non-transparent material.
In one aspect, the invention is also directed to a head mounted display comprising a display device, the display device being a display device as described above, or the display device comprising:
a backlight unit (BLU) for providing light for displaying an image;
a plurality of pixels for modulating light provided by the BLU, the plurality of pixels being disposed in a display area of the display device;
a polarizer configured to filter light provided by the BLU based on the modulation performed by the plurality of pixels; and
a metal bridge coupled to the polarizer, the metal bridge disposed in a non-display area surrounding a display area of the display device, wherein the metal bridge does not overlap the display area of the display device.
Drawings
Fig. 1A is a perspective view of a headset implemented as an eyewear device in accordance with one or more embodiments.
FIG. 1B is a perspective view of a head mounted device implemented as a head mounted display in accordance with one or more embodiments.
Fig. 1C is a cross-section of a front rigid body of the head mounted display shown in fig. 1B.
FIG. 2A illustrates a block diagram of an electronic display environment in accordance with one or more embodiments.
Fig. 2B illustrates a perspective view of various elements of a display device in accordance with one or more embodiments.
FIG. 2C illustrates an example display device having a two-dimensional array of lighting elements or LC-based pixels in accordance with one or more embodiments.
FIG. 3A illustrates a perspective view of a color filter and a front polarizer in accordance with one or more embodiments.
FIG. 3B illustrates a perspective view of a color filter and front polarizer with improved ground connection in accordance with one or more embodiments.
FIG. 3C illustrates a perspective view of a color filter in accordance with one or more embodiments.
FIG. 4 illustrates a front view of a color filter and polarizer stack in accordance with one or more embodiments.
FIG. 5A illustrates a side view of a color filter and polarizer stack along the cross-section A-A' shown in FIG. 4 in accordance with one or more embodiments.
FIG. 5B illustrates a side view of a color filter and polarizer stack along the cross-section B-B' shown in FIG. 4 in accordance with one or more embodiments.
FIG. 5C illustrates a side view of a color filter and polarizer stack along the cross-section C-C' shown in FIG. 4 in accordance with one or more embodiments.
FIG. 5D illustrates a side view of a color filter and polarizer stack along the cross-section D-D' shown in FIG. 4 in accordance with one or more embodiments.
Fig. 6A illustrates a first example design of a metal bridge disposed on a color filter in accordance with one or more embodiments.
FIG. 6B illustrates a second example design of a metal bridge disposed on a color filter in accordance with one or more embodiments.
Fig. 6C illustrates a third example design of a metal bridge disposed on a color filter in accordance with one or more embodiments.
FIG. 7 is a system including a head mounted device in accordance with one or more embodiments.
The figures depict various embodiments for purposes of illustration only. Those skilled in the art will readily recognize from the following discussion that alternative embodiments of the structures and methods illustrated herein may be employed without departing from the principles described herein.
Detailed Description
Embodiments of the invention may include or be implemented in conjunction with an artificial reality system. Artificial reality is a form of reality that has been adjusted in some manner before presentation to a user, and may include, for example, virtual reality (VR), augmented reality (AR), mixed reality (MR), hybrid reality, or some combination and/or derivative thereof. The artificial reality content may include entirely generated content or generated content combined with captured (e.g., real-world) content. The artificial reality content may include video, audio, haptic feedback, or some combination thereof, any of which may be presented in a single channel or in multiple channels (e.g., stereoscopic video that produces a three-dimensional effect for a viewer). Further, in some embodiments, the artificial reality may also be associated with an application, product, accessory, service, or some combination thereof used to create content in the artificial reality and/or otherwise used in the artificial reality. The artificial reality system that provides the artificial reality content may be implemented on a variety of platforms, including a wearable device (e.g., a head-mounted device) connected to a host computer system, a standalone wearable device (e.g., a head-mounted device), a mobile device or computing system, or any other hardware platform capable of providing artificial reality content to one or more viewers.
Fig. 1A is a perspective view of a head-mounted device 100 implemented as an eyewear device in accordance with one or more embodiments. In some embodiments, the eyewear device is a near-eye display (NED). In general, the head-mounted device 100 may be worn on the face of a user such that content (e.g., media content) is presented using a display assembly and/or an audio system. However, the head-mounted device 100 may also be used such that media content is presented to the user in a different manner. Examples of media content presented by the head-mounted device 100 include one or more images, video, audio, or some combination thereof. The head-mounted device 100 includes a frame and may include, among other components, a display assembly including one or more display elements 120, a depth camera assembly (DCA), an audio system, and a position sensor 190. Although fig. 1A shows various components of the head-mounted device 100 located at example positions on the head-mounted device 100, these components may be located elsewhere on the head-mounted device 100, on a peripheral device paired with the head-mounted device 100, or some combination thereof. Similarly, the head-mounted device 100 may have more or fewer components than those shown in fig. 1A.
The frame 110 holds the other components of the head-mounted device 100. The frame 110 includes a front portion that holds the one or more display elements 120, and end pieces (e.g., temples) that attach to the head of the user. The front portion of the frame 110 rests on top of the bridge of the user's nose. The length of the end pieces may be adjustable (e.g., adjustable temple length) to fit different users. The end pieces may also include a portion that curls behind the user's ear (e.g., a temple tip or ear piece).
One or more display elements 120 provide light to a user wearing the headset 100. As shown, the head mounted device includes a display element 120 for each eye of the user. In some embodiments, the display element 120 generates image light that is provided to an eyebox (eyebox) of the head-mounted device 100. The eyebox is the position in space occupied by the user's eyes when wearing the headset 100. For example, the display element 120 may be a waveguide display. The waveguide display includes a light source (e.g., a two-dimensional source, one or more line sources, one or more point sources, etc.), and one or more waveguides. Light from the light source is in-coupled into the one or more waveguides, which output the light in such a way that there is pupil replication in the eyebox of the head-mounted device 100. The in-coupling of light and/or the out-coupling of light from one or more waveguides may be accomplished using one or more diffraction gratings. In some embodiments, the waveguide display includes a scanning element (e.g., waveguide, mirror, etc.) that scans light from the light source as it is in-coupled into the one or more waveguides. Note that in some embodiments, one or both of the two display elements 120 are opaque and do not transmit light from a localized area around the headset 100. The local area is an area around the head-mounted device 100. For example, the local area may be a room in which a user wearing the head-mounted device 100 is located, or the user wearing the head-mounted device 100 may be outdoors, and the local area is an outdoor area. In this context, the headset 100 generates VR content. Alternatively, in some embodiments, one or both of the two display elements 120 are at least partially transparent, such that light from a localized region may be combined with light from one or more display elements to generate AR content and/or MR content.
In some embodiments, the display element 120 does not generate image light, but rather is a lens that transmits light from the local area to the eyebox. For example, one or both of the two display elements 120 may be a non-prescription lens or a prescription lens (e.g., single vision, bifocal, trifocal, or progressive lenses) for helping to correct a user's vision defects. In some embodiments, the display element 120 may be polarized and/or tinted to protect the user's eyes from the sun.
In some embodiments, the display element 120 may include an additional optics block (not shown). The optics block may include one or more optical elements (e.g., lenses, Fresnel lenses, etc.) that direct light from the display element 120 to the eyebox. The optics block may, for example, correct aberrations in some or all of the image content, magnify some or all of the image, or some combination thereof.
The DCA determines depth information for a portion of the local area around the head-mounted device 100. The DCA includes one or more imaging devices 130 and a DCA controller (shown in fig. 1A), and the DCA may also include an illuminator 140. In some embodiments, the illuminator 140 illuminates a portion of the local area with light. The light may be, for example, infrared (IR) structured light (e.g., a dot pattern, bars, etc.), an IR flash for time-of-flight, etc. In some embodiments, the one or more imaging devices 130 capture images of the portion of the local area that includes the light from the illuminator 140. Fig. 1A shows a single illuminator 140 and two imaging devices 130. In an alternative embodiment, there are at least two imaging devices 130 and no illuminator 140.
The DCA controller computes depth information for the portion of the local area using the captured images and one or more depth determination techniques. The depth determination technique may be, for example, direct time-of-flight (ToF) depth sensing, indirect ToF depth sensing, structured light, passive stereo analysis, active stereo analysis (using texture added to the scene by light from the illuminator 140), some other technique for determining the depth of a scene, or some combination thereof.
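As a point of reference for the direct ToF technique mentioned above (a general textbook relation, not a limitation of this disclosure), the depth of a surface point follows from the round-trip travel time of a light pulse emitted by the illuminator 140:

```latex
d = \frac{c \, \Delta t}{2}
```

where d is the distance to the point, c is the speed of light, and Δt is the measured round-trip time; indirect ToF instead estimates Δt from the phase shift of a modulated illumination signal.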
The DCA may include an eye tracking unit that determines eye tracking information. The eye tracking information may include information about the position and orientation of one or both eyes (within their respective eyeboxes). The eye tracking unit may include one or more cameras. The eye tracking unit estimates the angular orientation of one or both eyes based on images of the one or both eyes captured by the one or more cameras. In some embodiments, the eye tracking unit may also include one or more illuminators that illuminate the one or both eyes with an illumination pattern (e.g., structured light, flash, etc.). The eye tracking unit may use the illumination pattern in the captured images to determine the eye tracking information. The head-mounted device 100 may prompt the user to opt in to allow operation of the eye tracking unit. For example, by opting in, the head-mounted device 100 may detect and store images of the user or the user's eye tracking information.
The audio system provides audio content. The audio system includes a transducer array, and an audio controller 150. However, in other embodiments, the audio system may include different components and/or additional components. Similarly, in some cases, the functionality described with respect to these components of the audio system may be distributed among multiple components in a different manner than described herein. For example, some or all of the controller's multiple functions may be performed by a remote server.
The transducer array presents sound to the user. The transducer array includes a plurality of transducers. The transducer may be a speaker 160 or a tissue transducer 170 (e.g., a bone conduction transducer or a cartilage conduction transducer). Although the speaker 160 is shown as being external to the frame 110, the speaker 160 may be enclosed in the frame 110. In some embodiments, instead of separate speakers for each ear, the headset 100 includes a speaker array that includes multiple speakers integrated into the frame 110 to improve the directionality of the presented audio content. The tissue transducer 170 is coupled to the head of the user and directly vibrates the tissue (e.g., bone or cartilage) of the user to generate sound. The number and/or location of the transducers may be different from that shown in fig. 1A.
The sensor array detects sound within a localized area of the headset 100. The sensor array includes a plurality of acoustic sensors 180. The acoustic sensor 180 collects sounds emitted from one or more sound sources in a local area (e.g., room). Each acoustic sensor is configured to detect sound and convert the detected sound into an electronic format (analog or digital). The acoustic sensor 180 may be an acoustic wave sensor, a microphone, a sound transducer, or similar sensor adapted to detect sound.
In some embodiments, one or more acoustic sensors 180 may be placed in the ear canal of each ear (e.g., acting as a binaural microphone). In some embodiments, the acoustic sensor 180 may be placed on an exterior surface of the head-mounted device 100, may be placed on an interior surface of the head-mounted device 100, may be separate from the head-mounted device 100 (e.g., part of some other device), or some combination thereof. The number and/or location of acoustic sensors 180 may be different from the number and/or location of acoustic sensors shown in fig. 1A. For example, the number of acoustic detection locations may be increased to increase the amount of audio information collected and to increase the sensitivity and/or accuracy of the information. These acoustic detection locations may be oriented such that the microphone is able to detect sound in a wide range of directions around a user wearing the headset 100.
The audio controller 150 processes information from the sensor array describing the sound detected by the sensor array. The audio controller 150 may include a processor and a computer readable storage medium. The audio controller 150 may be configured to generate a direction of arrival (direction of arrival, DOA) estimate, generate an acoustic transfer function (e.g., an array transfer function and/or a head related transfer function), track the location of a sound source, form a beam in the direction of the sound source, classify the sound source, generate a sound filter for the speaker 160, or some combination thereof.
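As an illustrative aside, one elementary way to estimate a direction of arrival from a pair of acoustic sensors is to cross-correlate their signals and convert the resulting time delay into an angle. The sketch below assumes a far-field source and two microphones with known spacing; the function name `estimate_doa` and its parameters are hypothetical and far simpler than the array-transfer-function methods the audio controller 150 may actually use.

```python
import numpy as np

def estimate_doa(sig_a, sig_b, mic_spacing_m, sample_rate_hz, speed_of_sound=343.0):
    """Estimate direction of arrival (degrees) for a far-field source from the
    time delay between two microphone signals (1-D arrays), via cross-correlation."""
    corr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(corr) - (len(sig_b) - 1)         # delay in samples
    tau = lag / sample_rate_hz                       # delay in seconds
    # Far-field geometry: tau = (mic_spacing / c) * sin(theta)
    sin_theta = np.clip(tau * speed_of_sound / mic_spacing_m, -1.0, 1.0)
    return np.degrees(np.arcsin(sin_theta))
```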
The position sensor 190 generates one or more measurement signals in response to movement of the headset 100. The position sensor 190 may be located on a portion of the frame 110 of the headset 100. The position sensor 190 may include an inertial measurement unit (inertial measurement unit, IMU). Examples of the position sensor 190 include: one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, one type of sensor for error correction of the IMU, or some combination thereof. The position sensor 190 may be located external to the IMU, internal to the IMU, or some combination thereof.
In some embodiments, the head-mounted device 100 may provide simultaneous localization and mapping (SLAM) for the position of the head-mounted device 100, as well as updating of a model of the local area. For example, the head-mounted device 100 may include a passive camera assembly (PCA) that generates color image data. The PCA may include one or more RGB cameras that capture images of some or all of the local area. In some embodiments, some or all of the imaging devices 130 of the DCA may also function as the PCA. The images captured by the PCA and the depth information determined by the DCA may be used to determine parameters of the local area, generate a model of the local area, update a model of the local area, or some combination thereof. Further, the position sensor 190 tracks the positioning (e.g., position and pose) of the head-mounted device 100 within the room. Additional details regarding the various components of the head-mounted device 100 are discussed below in connection with fig. 7.
Fig. 1B is a perspective view of a head-mounted device 105 implemented as an HMD in accordance with one or more embodiments. In embodiments describing an AR system and/or an MR system, a portion of a front face of the HMD is at least partially transparent in the visible light band (about 380 nm to 750 nm), and a portion of the HMD between the front face of the HMD and an eye of the user is at least partially transparent (e.g., a partially transparent electronic display). The HMD includes a front rigid body 115 and a band 175. The head-mounted device 105 includes many of the same components described above with respect to fig. 1A, modified to integrate with the HMD form factor. For example, the HMD includes a display assembly, a DCA, an audio system, and the position sensor 190. Fig. 1B shows the illuminator 140, a plurality of speakers 160, a plurality of imaging devices 130, a plurality of acoustic sensors 180, and the position sensor 190. The speakers 160 may be located in various positions (e.g., coupled to the band 175 (as shown), or coupled to the front rigid body 115), or may be configured to be inserted within the ear canal of the user.
Fig. 1C is a cross-section of the front rigid body 115 of the head-mounted display shown in fig. 1B. As shown in fig. 1C, the front rigid body 115 includes an optical block 118 that provides altered image light to an exit pupil 190. The exit pupil 190 is the location in the front rigid body 115 where the user's eye 195 is positioned. For illustration purposes, fig. 1C shows a cross-section associated with a single eye 195, but another optical block, separate from the optical block 118, provides altered image light to the other eye of the user.
The optical block 118 includes a display element 120 and an optical block 125. The display element 120 emits image light toward the optical block 125. The optical block 125 magnifies the image light and, in some embodiments, also corrects for one or more additional optical errors (e.g., distortion, astigmatism, etc.). The optical block 125 directs the image light to the exit pupil 190 for presentation to the user.
System architecture
FIG. 2A illustrates a block diagram of an electronic display environment 200 in accordance with one or more embodiments. The electronic display environment 200 includes an application processor 210 and a display device 220. In some embodiments, electronic display environment 200 additionally includes power circuit 270 for providing power to application processor 210 and display device 220. In some embodiments, power circuit 270 receives power from battery 280. In other embodiments, power circuit 270 receives power from an electrical outlet.
The application processor 210 generates display data that is used to control the display device to display a desired image. The display data includes a plurality of pixel data, each pixel data controlling one pixel of the display device to emit light having a corresponding intensity. In some embodiments, each pixel data includes sub-pixel data corresponding to different colors (e.g., red, green, and blue). Further, in some embodiments, the application processor 210 generates display data for a plurality of display frames to display video.
The display device 220 includes a display driving integrated circuit (display driver integrated circuit, DDIC) 230, an active layer 240, a Liquid Crystal (LC) layer 260, a backlight unit (BLU) 265, a polarizer 250, and a color filter 255. The display device 220 may include additional elements, such as one or more additional sensors. The display device 220 may be part of the HMD 100 in fig. 1A or 1B. That is, the display device 220 may be an embodiment of the display element 120 in fig. 1A or 1C. Fig. 2B illustrates a perspective view of various elements of a display device 220 in accordance with one or more embodiments.
The DDIC 230 receives display signals from the application processor 210 and generates control signals for controlling the individual pixels 245 in the active layer 240 and for controlling the BLU 265. For example, the DDIC 230 generates signals for programming each of the plurality of pixels 245 in the active layer 240 according to the image signal received from the application processor 210. In addition, the DDIC 230 generates one or more signals for driving the BLU 265.
The active layer 240 includes a set of pixels 245 organized in rows and columns. For example, the active layer 240 includes N pixels (P11 to P1N) in the first row, N pixels (P21 to P2N) in the second row, N pixels (P31 to P3N) in the third row, and so on. Each pixel includes a plurality of sub-pixels, each sub-pixel corresponding to a different color. For example, each pixel includes a red sub-pixel, a green sub-pixel, and a blue sub-pixel. Further, each pixel may include a white sub-pixel. Each sub-pixel includes a thin-film transistor (TFT) for controlling liquid crystal in the LC layer 260. For example, the TFT of each sub-pixel is used to control an electric field in a specific region of the LC layer 260 to control the crystal orientation of the liquid crystal in that region.
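For illustration only, the row/column organization of the active layer described above can be modeled with a simple data structure. The names below (`SubPixel`, `Pixel`, `build_active_layer`) are hypothetical and not part of this disclosure; they merely sketch a grid of pixels P11..P1N, P21..P2N, and so on, each holding red, green, and blue sub-pixel drive levels that would be written through the corresponding TFTs.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class SubPixel:
    color: str   # "R", "G", "B" (optionally "W")
    level: int   # drive level written through the sub-pixel's TFT, e.g. 0-255

@dataclass
class Pixel:
    subpixels: List[SubPixel]

def build_active_layer(rows: int, cols: int) -> List[List[Pixel]]:
    """Hypothetical model of the active layer: a rows x cols grid of pixels,
    each containing RGB sub-pixels initialized to a zero drive level."""
    return [
        [Pixel([SubPixel(c, 0) for c in ("R", "G", "B")]) for _ in range(cols)]
        for _ in range(rows)
    ]
```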
The LC layer 260 includes liquid crystals, which have properties between those of conventional liquids and solid crystals. Specifically, liquid crystals have molecules that can be oriented in a crystal-like manner. The crystal orientation of the molecules of the liquid crystal may be controlled or altered by applying an electric field across the liquid crystal. The liquid crystal may be controlled in different ways by applying differently configured electric fields. Schemes for controlling liquid crystals include twisted nematic (TN), in-plane switching (IPS), plane-to-line switching (PLS), fringe field switching (FFS), vertical alignment (VA), and the like.
Each pixel 245 is controlled to provide a light output corresponding to a display signal received from the application processor 210. For example, in the case of an LCD panel, the active layer 240 includes an array of liquid crystal cells having a controllable polarization state, where the controllable polarization state can be modified to control the amount of light that can pass through the cell.
The BLU 265 includes a light source that is turned on for a predetermined period of time to generate light that may pass through each liquid crystal cell to produce the image displayed by the display device. The light source of the BLU 265 emits light toward the array of liquid crystal cells in the active layer 240, and the array of liquid crystal cells controls the amount and location of light passing through the active layer 240. In some embodiments, the BLU 265 includes a plurality of segmented backlight units, each of which provides a light source for a specific area or zone of the active layer 240.
The polarizer 250 filters the light output by the BLU 265 based on the polarization of the light. The polarizer 250 may include a rear polarizer 250A and a front polarizer 250B. The rear polarizer 250A filters the light output by the BLU 265 to provide polarized light to the LC layer 260. The front polarizer 250B filters the light output by the LC layer 260. Since the light provided to the LC layer 260 is polarized by the rear polarizer 250A, the LC layer controls how much light the front polarizer 250B filters by adjusting the polarization of the light output by the rear polarizer 250A.
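For intuition about how the rear polarizer, LC layer, and front polarizer combine (an idealized textbook relation, not a limitation of this disclosure), the light intensity passed by the front polarizer 250B approximately follows Malus's law:

```latex
I_{\text{out}} = I_{\text{in}} \cos^{2}\theta
```

where I_in is the intensity of the polarized light leaving the LC layer 260 and θ is the angle between that light's polarization (as rotated by the liquid crystal under the pixel's electric field) and the transmission axis of the front polarizer 250B; driving the liquid crystal to change θ sets the pixel's gray level.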
The color filter 255 filters light output by the LC layer 260 based on color. For example, the BLU 265 generates white light, and the color filter 255 filters the white light to output red, green, or blue light. The color filter 255 may include a grid of red, green, and blue filters. In some embodiments, the various elements of display device 220 are arranged in a different order. For example, a color filter may be placed between the BLU 265 and the rear polarizer 250A, may be placed between the rear polarizer 250A and the LC layer 260, or may be placed after the front polarizer 250B.
Fig. 2C illustrates an example display device 220 having a two-dimensional array of lighting elements or LC-based pixels 245 in accordance with one or more embodiments. In one embodiment, the display device 220 may display multiple frames of video content based on global illumination, in which all pixels 245 emit image light simultaneously for each frame. In an alternative embodiment, the display device 220 may display video content based on segmented illumination, in which all pixels 245 in each segment of the display device 220 emit image light simultaneously for each frame of video content. For example, as shown in fig. 2C, each segment of the display device 220 may include at least one row of pixels 245. In the illustrative example in which each illumination segment of the display device 220 includes one row of pixels 245, the segmented illumination may be referred to as rolling illumination. For rolling illumination, all pixels 245 in the first row of the display device 220 emit image light simultaneously at a first time instant; all pixels 245 in the second row of the display device 220 emit image light simultaneously at a second time instant consecutive to the first time instant; all pixels 245 in the third row of the display device 220 emit image light simultaneously at a third time instant consecutive to the second time instant; and so on. Other sequential row and segment illumination orders of the display device 220 are also supported in this disclosure. In another embodiment, the display device 220 may display video content based on controllable illumination, in which all pixels 245 in a portion of controllable size (not shown in fig. 2C) of the display device 220 emit image light simultaneously for each frame of video content. The controllable portion of the display device 220 may be rectangular, square, or have some other suitable shape. In some embodiments, the size of the controllable portion of the display device 220 may be a dynamic function of the frame number.
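The rolling-illumination timing described above can be sketched as a simple per-row schedule. This is an illustrative sketch only; `drive_row` is a hypothetical callback standing in for whatever signals the DDIC 230 actually uses to make all pixels 245 in a row emit at the same time instant.

```python
import time

def rolling_illumination(num_rows: int, row_time_s: float, drive_row):
    """Illustrative rolling (per-row segmented) illumination schedule:
    row 0 emits at t0, row 1 at t0 + row_time_s, and so on.
    drive_row(r) is a hypothetical callback that lights all pixels in row r."""
    for r in range(num_rows):
        drive_row(r)             # all pixels in row r emit simultaneously
        time.sleep(row_time_s)   # next row follows at the next time instant
```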
Display panel grounding
Fig. 3A illustrates a perspective view of a color filter 255 and a front polarizer 250B in accordance with one or more embodiments. The display device 220 has a display area 310 and a non-display area 315 surrounding the display area 310. The display area 310 corresponds to the portion of the display device 220 that allows a portion of the light generated by the BLU 265 to exit in order to display an image. The non-display area 315 surrounds the display area and corresponds to the portion of the display device 220 that blocks the light generated by the BLU 265 from exiting.
The front polarizer 250B is disposed on the color filter 255. The color filter 255 may be made using a glass substrate, and the polarizer 250B may be made of a polymer material. The polarizer 250B may be attached to the color filter 255 using an optically clear adhesive (OCA). To protect the polarizer 250B (e.g., to protect the polarizer from electrostatic discharge), the polarizer 250B is connected to ground. Specifically, the polarizer 250B is connected to ground through a connector 320. In some embodiments, the connector 320 is formed using a conductive paste (e.g., a silver conductive paste). For example, the connector 320 may be formed as a silver dot (Ag dot) by applying a silver conductive paste and curing the paste to form a conductive dot. Furthermore, in some embodiments, the OCA used to attach the polarizer 250B to the color filter 255 is designed to be conductive or to have low resistivity.
However, the connection between the polarizer 250B and the connector 320 (e.g., the connection between the OCA of the polarizer 250B and the connector 320) may degrade over time. For example, the connection between the connector 320 and the polarizer 250B may degrade due to the difference in thermal expansion between the polarizer 250B and the connector 320. In another example, the silver paste used to make the connector 320 may degrade over time, resulting in a decrease in electrical conductivity after exposure to a high-temperature or high-humidity environment. In another example, the silver paste used to make the connector 320 may interact with the OCA used to attach the polarizer 250B to the color filter 255. This interaction may cause delamination of the polarizer 250B, resulting in damage to the display device 220.
To improve the connection between the polarizer 250B and the connector 320, a layer of indium tin oxide (ITO) may be deposited on the color filter 255. ITO is a transparent material having relatively high conductivity. However, the ITO layer can degrade the optical performance of the display device 220. For example, the ITO layer may increase the reflectivity of the display panel, or may refract some of the light emitted by the display panel. Since the ITO layer is disposed on both the display region 310 and the non-display region 315 of the display device, the ITO layer may degrade the image quality of the display device.
Fig. 3B illustrates a perspective view of a color filter 255 and a front polarizer 250B with improved ground connection in accordance with one or more embodiments. In the embodiment of fig. 3B, color filter 255 includes a metal bridge 340. The metal bridge 340 is disposed on the non-display region 315 outside the display region 310.
The metal bridge 340 may be made of any conductive material. Since the metal bridge is disposed outside the display area 310, the metal bridge 340 does not have to be transparent. In this regard, a non-transparent conductive material (e.g., copper, aluminum, or other metal) may be used to fabricate the metal bridge 340. In addition, since the metal bridge 340 improves the connection between the polarizer 250B and the connector 320, the ITO layer may be removed, thereby improving the image quality of the display device.
Fig. 3C illustrates a perspective view of a color filter 255 in accordance with one or more embodiments. The color filter 255 includes a color filter glass 350 and a metal bridge 340. The color filter glass has a display area 365 overlapping the display area 310 of the display device 220, and a non-display area 360 surrounding the display area 365. The metal bridge 340 is disposed on the non-display region 360 of the color filter glass 350. Specifically, the metal bridge 340 is designed so as to be disposed outside the display region 365 of the color filter glass 350. That is, the metal bridge 340 does not overlap the display region 365 of the color filter glass 350. In this regard, the metal bridge 340 does not interfere with the optical properties of the color filter glass within the display region 365.
In some embodiments, metal bridge 340 is laminated on top of color filter glass 350. In other embodiments, metal bridge 340 is deposited on top of color filter glass 350. In addition, the metal bridge 340 may be patterned before being applied to the color filter glass 350. Alternatively, the metal bridge 340 may be patterned (e.g., etched) after a thin metal layer has been deposited on top of the color filter glass 350 to expose the display area of the color filter.
FIG. 4 illustrates a front view of a color filter and polarizer stack in accordance with one or more embodiments. FIG. 5A illustrates a side view of a color filter and polarizer stack along the cross-section A-A' shown in FIG. 4 in accordance with one or more embodiments. FIG. 5B illustrates a side view of a color filter and polarizer stack along the B-B' cross-section shown in FIG. 4 in accordance with one or more embodiments. FIG. 5C illustrates a side view of a color filter and polarizer stack along the C-C' cross-section shown in FIG. 4 in accordance with one or more embodiments. FIG. 5D illustrates a side view of a color filter and polarizer stack along the D-D' cross-section shown in FIG. 4 in accordance with one or more embodiments.
The polarizer 250 includes a polarizing layer 510 and an OCA layer 520. The color filter 255 includes the color filter glass 350, the metal bridge 340, and the connector 320. The connector 320 electrically connects the metal bridge 340 to the ground (not shown) of an external board or circuit. The OCA layer 520 is an adhesive layer that attaches the polarizer 250 to the color filter 255 when brought into contact with the color filter 255. Further, when the polarizer 250 is attached to the color filter 255, a portion of the OCA layer comes into contact with the metal bridge 340. In this regard, when the polarizer 250 is attached to the color filter 255, the OCA layer is electrically connected to the metal bridge 340. Since the metal bridge 340 is electrically connected to ground (via the connector 320), the OCA is also electrically coupled to ground (via the connector 320 and the metal bridge 340). In addition, since the metal bridge 340 is on the non-display region 360 of the color filter glass 350, the metal bridge does not affect the optical performance of the color filter glass 350.
Fig. 6A illustrates a first example design of a metal bridge 340 disposed on a color filter 255A in accordance with one or more embodiments. The design shown in fig. 6A has a triangular portion 610. The triangular portion 610 is located at a corner of the color filter 255A. The location of the triangular portion 610 may be selected based on the location of the connector 320 or the ground connection of an external board or circuit.
Fig. 6B illustrates a second example design of a metal bridge 340B disposed on a color filter 255B in accordance with one or more embodiments. The design shown in fig. 6B has two triangular portions 610 connected to each other by a strip 630. The first triangular portion 610A is disposed on a first quadrant of the non-display area 315, and the second triangular portion 610B is disposed on a second quadrant of the non-display area 315. In the design of fig. 6B, one or both triangular portions may be connected to the ground of an external board or circuit by one or more connectors 320. The design of fig. 6B increases the area of the metal bridge 340, thereby increasing the area over which the polarizer 250 is attached to the metal bridge 340.
Fig. 6C illustrates a third example design of a metal bridge 340C disposed on a color filter 255B in accordance with one or more embodiments. The design shown in fig. 6C includes a triangular portion 610 coupled to two strips 630. The first strip 630A extends in a first direction (e.g., across the width of the display device), and the second strip 630B extends in a second direction (e.g., across the height of the display device).
In some embodiments, other geometries may be used. For example, a right triangle with a curved hypotenuse may be used instead of the geometries shown in figs. 6A-6C. In some embodiments, the triangle has a concave hypotenuse.
In some embodiments, the metal bridge is used for a display having a non-rectangular display area. For example, the display area may have a hexagonal, octagonal, or circular display area. This type of display device may be used for head mounted displays used in Augmented Reality (AR) or Virtual Reality (VR) applications. In some embodiments, the geometry of the metal bridge is designed based on the geometry of the display area to increase the area of the metal bridge.
System environment
Fig. 7 is a system 700 that includes a headset 705 in accordance with one or more embodiments. In some embodiments, the headset 705 may be the headset 100 of fig. 1A or the headset 105 of fig. 1B. The system 700 may operate in an artificial reality environment (e.g., a virtual reality environment, an augmented reality environment, a mixed reality environment, or some combination thereof). The system 700 shown by fig. 7 includes a head-mounted device 705, an input/output (I/O) interface 710 coupled to a console 715, a network 720, and a mapping server 725. Although fig. 7 illustrates an example system 700 including one head mounted device 705 and one I/O interface 710, in other embodiments, any number of these components may be included in system 700. For example, there may be multiple head mounted devices, each having an associated I/O interface 710, with each head mounted device and I/O interface 710 in communication with console 715. In alternative configurations, different components and/or additional components may be included in system 700. Furthermore, in some embodiments, the functionality described in connection with one or more of the components illustrated in fig. 7 may be distributed among the components in a different manner than described in connection with fig. 7. For example, some or all of the functionality of the console 715 may be provided by the head mounted device 705.
The head mounted device 705 includes a display component 730, an optical block 735, one or more position sensors 740, and a DCA 745. Some embodiments of the headset 705 have components that are different from those described in connection with fig. 7. Moreover, in other embodiments, the functionality provided by the various components described in connection with fig. 7 may be distributed among multiple components of the headset 705 in different ways, or may be embodied in separate components remote from the headset 705.
The display component 730 displays content to the user based on data received from the console 715. The display component 730 displays the content using one or more display elements (e.g., the display element 120). A display element may be, for example, an electronic display. In various embodiments, the display component 730 includes a single display element or multiple display elements (e.g., a display for each eye of the user). Examples of electronic displays include: a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, an active-matrix organic light-emitting diode (AMOLED) display, a waveguide display, some other display, or some combination thereof. Note that in some embodiments, the display element 120 may also include some or all of the functionality of the optical block 735.
The optical block 735 may amplify the image light received from the electronic display, correct an optical error associated with the image light, and present the corrected image light to one or both eyepieces of the head-mounted device 705. In various embodiments, the optical block 735 includes one or more optical elements. Example optical elements included in the optical block 735 include: an aperture, a fresnel lens, a convex lens, a concave lens, a filter, a reflective surface, or any other suitable optical element that affects image light. Further, the optical block 735 may include a combination of different optical elements. In some embodiments, one or more of the plurality of optical elements in the optical block 735 can have one or more coatings, such as a partially reflective coating or an anti-reflective coating.
The magnification and focusing of the image light by the optical block 735 allows the electronic display to be physically smaller, weigh less, and consume less power than larger displays. Further, the magnification may increase the field of view of the content presented by the electronic display. For example, the displayed content may be presented using almost all (e.g., about 110 degrees diagonal), and in some cases all, of the user's field of view. Furthermore, in some embodiments, the amount of magnification may be adjusted by adding or removing optical elements.
In some embodiments, the optical block 735 may be designed to correct one or more types of optical errors. Examples of optical errors include barrel distortion or pincushion distortion, longitudinal chromatic aberration, or lateral chromatic aberration. Other types of optical errors may further include spherical aberration, chromatic aberration, errors due to lens curvature, astigmatism, or any other type of optical error. In some embodiments, the content provided to the electronic display for display is pre-distorted, and the optical block 735 corrects the distortion when it receives image light generated based on that content from the electronic display.
The position sensor 740 is an electronic device that generates data indicative of the position of the headset 705. The position sensor 740 generates one or more measurement signals in response to movement of the headset 705. The position sensor 190 is an embodiment of the position sensor 740. Examples of the position sensor 740 include: one or more IMUs, one or more accelerometers, one or more gyroscopes, one or more magnetometers, another suitable type of sensor that detects motion, or some combination thereof. The position sensor 740 may include a plurality of accelerometers for measuring translational motion (forward/backward, up/down, left/right), and a plurality of gyroscopes for measuring rotational motion (e.g., pitch, yaw, roll). In some embodiments, the IMU rapidly samples the measurement signals and calculates an estimated position of the headset 705 from the sampled data. For example, the IMU integrates a plurality of measurement signals received from the accelerometer over time to estimate a velocity vector, and integrates the velocity vector over time to determine an estimated location of a reference point on the headset 705. The reference point is a point that may be used to describe the location of the headset 705. However, while the reference point may generally be defined as a point in space, in practice the reference point is defined as a point within the headset 705.
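As an illustrative sketch of the double integration described above (a simplified model that ignores the gravity compensation, bias estimation, and drift correction a real IMU pipeline would apply), acceleration samples are integrated once to estimate a velocity vector and again to estimate the position of the reference point:

```python
import numpy as np

def integrate_imu(accel_samples, dt, v0=np.zeros(3), p0=np.zeros(3)):
    """Illustrative dead reckoning: integrate 3-axis accelerometer samples
    (m/s^2) over time step dt to estimate a velocity vector, then integrate
    the velocity to estimate the position of the headset's reference point."""
    v, p = v0.astype(float).copy(), p0.astype(float).copy()
    for a in accel_samples:
        v += np.asarray(a, dtype=float) * dt   # velocity estimate
        p += v * dt                            # position estimate
    return v, p
```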
The DCA 745 generates depth information for a portion of the local area. The DCA includes one or more imaging devices and a DCA controller. The DCA 745 may also include an illuminator. The operation and structure of the DCA 745 are described above with respect to fig. 1A.
The audio system 750 provides audio content to a user of the head-mounted device 705. The audio system 750 is substantially the same as the audio system 200 described above. The audio system 750 may include one or more acoustic sensors, one or more transducers, and an audio controller. The audio system 750 may provide spatialized audio content to the user. In some embodiments, the audio system 750 may request a plurality of acoustic parameters from the mapping server 725 through the network 720. The acoustic parameters describe one or more acoustic properties (e.g., room impulse response, reverberation time, reverberation level, etc.) of the local area. The audio system 750 may provide, for example, information from the DCA 745 describing at least a portion of the local area, and/or location information for the head-mounted device 705 from the position sensor 740. The audio system 750 may use one or more of the acoustic parameters received from the mapping server 725 to generate one or more sound filters, and use the sound filters to provide the audio content to the user.
The I/O interface 710 is a device that allows a user to send action requests to the console 715 and receive responses from the console 715. An action request is a request to perform a particular action. For example, the action request may be an instruction to start or end the capture of image data or video data, or an instruction to perform a particular action within an application. The I/O interface 710 may include one or more input devices. Example input devices include: a keyboard, a mouse, a game controller, or any other suitable device for receiving an action request and transmitting the action request to the console 715. An action request received by the I/O interface 710 is transmitted to the console 715, and the console 715 performs an action corresponding to the action request. In some embodiments, the I/O interface 710 includes an IMU that collects calibration data indicating an estimated position of the I/O interface 710 relative to an initial position of the I/O interface 710. In some embodiments, the I/O interface 710 may provide haptic feedback to the user in accordance with instructions received from the console 715. For example, haptic feedback is provided when an action request is received, or the console 715 transmits instructions to the I/O interface 710 when the console 715 performs an action, causing the I/O interface 710 to generate haptic feedback.
The console 715 provides content to the head-mounted device 705 for processing in accordance with information received from one or more of: the DCA 745, the head-mounted device 705, and the I/O interface 710. In the example shown in fig. 7, the console 715 includes an application store 755, a tracking module 760, and an engine 765. Some embodiments of the console 715 have different modules or components than those described in connection with fig. 7. Similarly, the functions described further below may be distributed among the components of the console 715 in a different manner than described in connection with fig. 7. In some embodiments, the functionality discussed herein with respect to the console 715 may be implemented in the head-mounted device 705 or a remote system.
The application store 755 stores one or more applications for execution by the console 715. An application is a group of instructions that, when executed by a processor, generates content for presentation to the user. Content generated by an application may be responsive to input received from the user through movement of the head-mounted device 705 or the I/O interface 710. Examples of applications include: gaming applications, conferencing applications, video playback applications, or other suitable applications.
The tracking module 760 tracks movement of the head-mounted device 705 or the I/O interface 710 using information from the DCA 745, the one or more position sensors 740, or some combination thereof. For example, the tracking module 760 determines the position of a reference point of the head-mounted device 705 in a mapping of the local area based on information from the head-mounted device 705. The tracking module 760 may also determine the position of an object or a virtual object. Further, in some embodiments, the tracking module 760 may use portions of data indicating the position of the head-mounted device 705 from the position sensor 740, as well as representations of the local area from the DCA 745, to predict a future position of the head-mounted device 705. The tracking module 760 provides the estimated or predicted future position of the head-mounted device 705 or the I/O interface 710 to the engine 765.
The engine 765 executes applications and receives position information, acceleration information, velocity information, predicted future positions, or some combination thereof, of the head-mounted device 705 from the tracking module 760. Based on the received information, the engine 765 determines content to provide to the head-mounted device 705 for presentation to the user. For example, if the received information indicates that the user has looked to the left, the engine 765 generates content for the head-mounted device 705 that mirrors the user's movement in a virtual local area or in a local area augmented with additional content. In addition, the engine 765 performs an action within an application executing on the console 715 in response to an action request received from the I/O interface 710, and provides feedback to the user that the action has been performed. The feedback provided may be visual or audible feedback through the head-mounted device 705, or haptic feedback through the I/O interface 710.
The network 720 couples the head-mounted device 705 and/or the console 715 to the mapping server 725. The network 720 may include any combination of local area and/or wide area networks using wireless communication systems and/or wired communication systems. For example, the network 720 may include the Internet and a mobile telephone network. In one embodiment, the network 720 uses standard communication techniques and/or protocols. Thus, the network 720 may include links using technologies such as Ethernet, 802.11, worldwide interoperability for microwave access (WiMAX), 2G/3G/4G mobile communication protocols, digital subscriber line (DSL), asynchronous transfer mode (ATM), InfiniBand, PCI Express Advanced Switching, and the like. Similarly, the networking protocols used on the network 720 may include multiprotocol label switching (MPLS), transmission control protocol/Internet protocol (TCP/IP), User Datagram Protocol (UDP), hypertext transfer protocol (HTTP), simple mail transfer protocol (SMTP), file transfer protocol (FTP), and the like. The data exchanged over the network 720 may be represented using techniques and/or formats including binary forms of image data (e.g., Portable Network Graphics (PNG)), hypertext markup language (HTML), extensible markup language (XML), and the like. In addition, all or some of the links may be encrypted using conventional encryption techniques such as secure sockets layer (SSL), transport layer security (TLS), virtual private networks (VPNs), Internet Protocol security (IPsec), and the like.
The mapping server 725 may include a database that stores a virtual model describing a plurality of spaces, wherein one location in the virtual model corresponds to a current configuration of the local area of the headset 705. The mapping server 725 receives, from the headset 705 over the network 720, information describing at least a portion of the local area and/or location information for the local area. The user may adjust privacy settings to allow or prevent the headset 705 from sending information to the mapping server 725. The mapping server 725 determines a location in the virtual model associated with the local area of the headset 705 based on the received information and/or location information. The mapping server 725 determines (e.g., retrieves) one or more acoustic parameters associated with the local area, based in part on the determined location in the virtual model and any acoustic parameters associated with the determined location. The mapping server 725 may send the location of the local area, as well as any values of the acoustic parameters associated with the local area, to the headset 705.
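A minimal sketch of this lookup is shown below. It assumes the virtual model exposes a list of candidate locations, each with a similarity score against the received local-area description; both the interface and the scoring are assumptions made for illustration, not part of the disclosure.

```python
def lookup_acoustic_parameters(virtual_model, local_area_info, location_info=None):
    """Match the headset's local area to a location in the virtual model and
    return that location together with its stored acoustic parameters."""
    best_location, best_score = None, float("-inf")
    for candidate in virtual_model.locations():
        score = candidate.similarity(local_area_info, location_info)
        if score > best_score:
            best_location, best_score = candidate, score
    if best_location is None:          # no candidate locations in the model
        return None, None
    # e.g., reverberation time and absorption values stored for that location
    return best_location, best_location.acoustic_parameters()
```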
One or more components of the system 700 may include a privacy module that stores one or more privacy settings for user data elements. The user data elements describe the user or the headset 705. For example, a user data element may describe a physical characteristic of the user, an action performed by the user, a location of the user of the headset 705, a location of the headset 705, a head-related transfer function (HRTF) of the user, and so forth. The privacy settings (or "access settings") for a user data element may be stored in any suitable manner, such as in association with the user data element, in an index on an authorization server, in another suitable manner, or in any suitable combination thereof.
A privacy setting for a user data element specifies how the user data element (or particular information associated with the user data element) may be accessed, stored, or otherwise used (e.g., viewed, shared, modified, copied, executed, displayed, or identified). In some embodiments, the privacy settings for a user data element may specify a "blacklist" of entities that may not access certain information associated with the user data element. The privacy settings associated with a user data element may specify any suitable granularity of allowed or denied access. For example, some entities may have permission to see that a particular user data element exists, some entities may have permission to view the content of the particular user data element, and some entities may have permission to modify the particular user data element. The privacy settings may also allow the user to grant other entities access to, or storage of, user data elements for a limited period of time.
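One way to hold such a setting in memory is sketched below; the field names (`blocked_entities`, `viewers`, `editors`, `expires_at`) are hypothetical and chosen only to mirror the blacklist, per-entity granularity, and time limit described above.

```python
from dataclasses import dataclass, field
from typing import Optional
import time

@dataclass
class PrivacySetting:
    """Illustrative privacy setting attached to a single user data element."""
    blocked_entities: set = field(default_factory=set)   # the "blacklist"
    viewers: set = field(default_factory=set)             # may view the element's existence/content
    editors: set = field(default_factory=set)             # may additionally modify the element
    expires_at: Optional[float] = None                    # optional end of the access window (epoch seconds)

    def allows(self, entity: str, action: str) -> bool:
        """Return True if `entity` may perform `action` ('view' or 'modify')."""
        if entity in self.blocked_entities:
            return False
        if self.expires_at is not None and time.time() > self.expires_at:
            return False
        if action == "view":
            return entity in self.viewers or entity in self.editors
        if action == "modify":
            return entity in self.editors
        return False
```

For example, a setting with `viewers={"headset_b"}` and `expires_at` set one hour ahead grants "view" access to that entity until the hour elapses, while every entity on `blocked_entities` is refused regardless of the other fields.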
The privacy settings may allow the user to specify one or more geographic locations from which the user data elements may be accessed. Access to, or denial of access to, a user data element may depend on the geographic location of the entity attempting to access it. For example, a user may specify that a user data element is accessible to an entity only while the user is in a particular location; if the user leaves that location, the user data element is no longer accessible to the entity. As another example, a user may specify that a user data element is accessible only to entities within a threshold distance from the user (e.g., another user of a headset within the same local area as the user). If the user subsequently changes location, an entity that had access to the user data element may lose access, while a new set of entities may gain access as they come within the threshold distance of the user.
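The threshold-distance condition in the second example reduces to a simple geometric check, sketched below with hypothetical (x, y, z) positions expressed in metres; the function name and signature are assumptions for illustration.

```python
import math

def entity_may_access(user_position, entity_position, threshold_m):
    """Grant location-based access only while the requesting entity is within
    threshold_m metres of the user; re-evaluated whenever either side moves."""
    return math.dist(user_position, entity_position) <= threshold_m
```

For example, `entity_may_access((0, 0, 0), (1.0, 2.0, 2.0), 5.0)` is True because the separation is 3 m, and it becomes False once the entity moves beyond the 5 m threshold.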
The system 700 may include one or more authorization/privacy servers for enforcing the privacy settings. A request from an entity for a particular user data element may identify the entity associated with the request, and the user data element is sent to the entity only if the authorization server determines, based on the privacy settings associated with the user data element, that the entity is authorized to access it. If the requesting entity is not authorized to access the user data element, the authorization server may prevent the requested user data element from being retrieved or from being sent to the entity. Although this disclosure describes enforcing privacy settings in a particular manner, this disclosure contemplates enforcing privacy settings in any suitable manner.
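Putting the pieces together, enforcement at an authorization server might look like the sketch below. The request fields and server methods are assumptions, and `PrivacySetting.allows` refers to the illustrative structure introduced earlier.

```python
def handle_request(auth_server, request):
    """Return the requested user data element only if the privacy settings
    authorize the requesting entity; otherwise block retrieval and delivery."""
    setting = auth_server.settings_for(request.data_element_id)
    if setting is not None and setting.allows(request.entity, request.action):
        return auth_server.fetch(request.data_element_id)
    return None  # the element is neither retrieved nor sent to the entity
```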
The foregoing description of the embodiments has been presented for purposes of illustration; it is not intended to be exhaustive or to limit the patent rights to the precise form disclosed. Those skilled in the relevant art will appreciate that many modifications and variations are possible in light of the above disclosure.
The language used in the specification has been principally selected for readability and instructional purposes, and may not have been selected to delineate or circumscribe the patent rights. Accordingly, it is intended that the scope of the patent claims not be limited by this detailed description, but rather by any claims that issue on the application based thereon. Accordingly, the disclosure of the embodiments is intended to be illustrative, but not limiting, of the scope of the patent claims set forth below.
Claims (20)
1. A display device, comprising:
a backlight unit (BLU) for providing light for displaying an image;
a plurality of pixels for modulating the light provided by the BLU, the plurality of pixels being disposed in a display area of the display device;
a polarizer configured to filter the light provided by the BLU based on the modulation performed by the plurality of pixels; and
a metal bridge coupled to the polarizer, the metal bridge disposed in a non-display area surrounding the display area of the display device, wherein the metal bridge does not overlap the display area of the display device.
2. The display device of claim 1, further comprising:
a color filter, wherein the metal bridge is disposed between the color filter and the polarizer.
3. The display device of claim 2, wherein a first surface of the metal bridge is adhered to a first portion of the color filter, wherein a second surface of the metal bridge opposite the first surface is adhered to a first portion of the polarizer, and wherein a second portion of the polarizer is adhered to a second portion of the color filter.
4. The display device of claim 1, further comprising:
a connector coupled to the metal bridge, the connector for connecting the metal bridge to a ground of the display device.
5. The display device of claim 4, wherein the connector is a silver (Ag) dot connector.
6. The display device of claim 1, wherein the metal bridge comprises:
a first triangular portion corresponding to a first corner of the polarizer.
7. The display device of claim 6, wherein the metal bridge further comprises:
a second triangular portion corresponding to a second corner of the polarizer; and
a connecting bar connecting the first triangular portion to the second triangular portion.
8. The display device of claim 6, wherein the metal bridge further comprises:
a first extension bar extending in a first direction from a first corner of the first triangular portion; and
a second extension bar extending from a second corner of the first triangular portion in a second direction perpendicular to the first direction.
9. The display device of claim 1, further comprising:
an Optically Clear Adhesive (OCA) disposed on the polarizer, the OCA for adhering the polarizer to the metal bridge.
10. The display device of claim 9, wherein the OCA is electrically conductive.
11. The display device of claim 1, wherein the metal bridge is made of a non-transparent material.
12. The display device of claim 1, wherein the metal bridge is patterned so as not to cover the display area of the display device.
13. A color filter for a display device, comprising:
a color filter glass having a display region and a non-display region surrounding the display region; and
a metal bridge disposed on the non-display region of the color filter glass, wherein the metal bridge does not overlap the display region of the color filter glass.
14. The color filter of claim 13, wherein the metal bridge comprises: a first triangular portion corresponding to a first corner of the color filter glass.
15. The color filter of claim 14, wherein the metal bridge further comprises:
a second triangular portion corresponding to a second corner of the color filter glass; and
a connecting bar connecting the first triangular portion to the second triangular portion.
16. The color filter of claim 13, wherein the metal bridge further comprises:
a first extension bar extending in a first direction from a first corner of the first triangular portion; and
a second extension bar extending from a second corner of the first triangular portion in a second direction perpendicular to the first direction.
17. The color filter of claim 13, further comprising:
a connector coupled to the metal bridge, the connector for connecting the metal bridge to a ground of the display device.
18. The color filter of claim 17, wherein the connector is a silver (Ag) dot connector.
19. The color filter of claim 13, wherein the metal bridge is made of a non-transparent material.
20. A head mounted display, comprising:
a display device according to any one of claims 1 to 12, or a display device comprising:
a backlight unit (BLU) for providing light for displaying an image;
a plurality of pixels for modulating the light provided by the BLU, the plurality of pixels being disposed in a display area of the display device;
a polarizer configured to filter the light provided by the BLU based on the modulation performed by the plurality of pixels; and
a metal bridge coupled to the polarizer, the metal bridge disposed in a non-display area surrounding the display area of the display device, wherein the metal bridge does not overlap the display area of the display device.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US63/124,584 | 2020-12-11 | ||
US17/513,729 US20220187655A1 (en) | 2020-12-11 | 2021-10-28 | Display panel grounding |
US17/513,729 | 2021-10-28 | ||
PCT/US2021/062894 WO2022125954A1 (en) | 2020-12-11 | 2021-12-10 | Improved display panel grounding |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116601532A (en) | 2023-08-15
Family
ID=87612118
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202180083682.5A Pending CN116601532A (en) | 2020-12-11 | 2021-12-10 | Improved display panel grounding |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116601532A (en) |
Similar Documents
Publication | Title | Publication Date |
---|---|---|
US10674141B1 (en) | Apparatuses, systems, and methods for determining interpupillary distances of head-mounted displays | |
EP4179273B1 (en) | Vcsel arrays for generation of linear structured light features | |
US20220187655A1 (en) | Display panel grounding | |
US11195291B1 (en) | Dynamic illumination control for depth determination | |
US11436987B1 (en) | Adaptive backlight activation for low-persistence liquid crystal displays | |
US20230085063A1 (en) | Vcsel chip for generation of linear structured light patterns and flood illumination | |
CN116601532A (en) | Improved display panel grounding | |
KR20230113758A (en) | Improved display panel grounding | |
US11573445B2 (en) | Off-axis pixel design for liquid crystal display devices | |
US12051384B2 (en) | Foveated display and driving scheme | |
EP4312065A1 (en) | Lens barrel with an integrated tunable lens | |
US20240151882A1 (en) | Integrated optical assembly including a tunable lens element | |
US20230305306A1 (en) | Backlight unit for near eye displays with corner placement of light emitting diodes | |
US20230194869A1 (en) | Backlight compensation for brightness drop off | |
US12078810B2 (en) | Offsetting image light aberration due to waveguide movement in display assemblies | |
CN117452591A (en) | Lens barrel with integrated tunable lens | |
US20240355245A1 (en) | Driving multiple illuminators using a single controller receiving multiple trigger signals | |
US11706565B2 (en) | Audio source amplification with speaker protection features and internal voltage and current sensing | |
WO2023039288A1 (en) | Vcsel chip for generation of linear structured light patterns and flood illumination | |
US11852843B1 (en) | Diffractive optical elements (DOEs) for high tolerance of structured light | |
US20220413324A1 (en) | Compact imaging optics using liquid crystal (lc) for dynamic glare reduction and sharpness enhancement | |
WO2023183469A1 (en) | Backlight unit for near eye displays with corner placement of light emitting diodes | |
EP4235220A1 (en) | Indirect time of flight sensor with parallel pixel architecture | |
EP4307028A1 (en) | Optical assembly with micro light emitting diode (led) as eye-tracking near infrared (nir) illumination source | |
WO2022272156A1 (en) | Offsetting image light aberration due to waveguide movement in display assemblies |
Legal Events
Code | Title | Date | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||