US12211316B2 - Field sequence color display driving method and display device - Google Patents
- Publication number
- US12211316B2 (application US18/458,155)
- Authority
- US
- United States
- Prior art keywords
- eye
- angle
- estimated
- movement distance
- determining
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/36—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals
- G09G3/3607—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source using liquid crystals for displaying colours or for displaying grey scales with a specific pixel layout, e.g. using sub-pixels
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/193—Preprocessing; Feature extraction
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/197—Matching; Classification
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/2003—Display of colours
Definitions
- the present disclosure relates to a display technology, and more particularly, to a field sequence color display driving method and a display device.
- the field sequence color (FSC) display technology mainly uses time-sequential color switching and superposition to achieve color display. For example, when any pixel on the screen switches between the R field sequence, the G field sequence, and the B field sequence at a frequency of 240 Hz (as shown in the left, center, and right images in FIG. 1 ), the overall display effect can be considered equivalent to a 60 Hz full-color display.
- RGB color is no longer provided by color filters (CF) on the LCD. Instead, it can be directly provided by the color backlight, which will eliminate the loss of light efficiency caused by CFs, improve energy utilization, help improve brightness and reduce power consumption.
- CF: color filter
- PPI: pixels per inch (display resolution)
- this technology also has a unique issue: the perceived color separation shown in FIG. 2 . That is, when the human eye moves relative to the screen while an image is displayed (for example, when the eye rotates horizontally), separated red and yellow fringes appear at the edges of a white target.
- the principle of color separation is shown in FIG. 3 .
- the X axis represents the horizontal position, and the Y axis represents the time. If the human eye rotates under certain circumstances (that is, the observation point moves), the RGB components of the same image point, scanned in a certain time order within one frame (1 frame time), will fall on different positions on the retina. The human eye therefore observes color separation in the image, which is called the color separation phenomenon.
- One objective of an embodiment of the present disclosure is to provide a field sequence color display driving method and a display device, to alleviate the color separation phenomenon.
- a field sequence color (FSC) display driving method includes: matching an eye movement amplitude with a preset amplitude; obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude; obtaining a target pixel coordinate corresponding to pixel data to be displayed based on the eye state data; and displaying the pixel data to be displayed according to the target pixel coordinate.
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius, wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation, and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
- the estimated eye movement distance is S′
- the eye angle is d
- a horizontal component of the estimated eye movement distance is S′x
- a vertical component of the estimated eye movement distance is S′y
- the step of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises: configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle; determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel.
- the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y;
- X = L * [tan(αx + βx) − tan(αx)]
- Y = L * [tan(αy + βy) − tan(αy)].
- the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
- a display device includes a processor and a memory storing program instructions executable by the processor to execute operations.
- the operations include: matching an eye movement amplitude with a preset amplitude; obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude; obtaining a target pixel coordinate corresponding to pixel data to be displayed based on the eye state data; and displaying the pixel data to be displayed according to the target pixel coordinate.
- the operation of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius, wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation, and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
- the estimated eye movement distance is S′
- the eye angle is d
- a horizontal component of the estimated eye movement distance is S′x
- a vertical component of the estimated eye movement distance is S′y
- the operation of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises: configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle; determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
- the operation of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises: configuring the vertical eye rotation angle as βy; and determining βy by the chord length formula from the eye rotation radius and the vertical component S′y of the estimated eye movement distance.
- the operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel.
- the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
- the operation of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y;
- X = L * [tan(αx + βx) − tan(αx)]
- Y = L * [tan(αy + βy) − tan(αy)].
- the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
- the field sequence color display driving method and the display device in the present disclosure obtain the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtain the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then display the pixel data to be displayed according to the target pixel coordinate.
- this can improve the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
- FIG. 1 is a diagram of a conventional field sequence color display.
- FIG. 2 shows a comparison before and after color separation in the conventional field sequence color display.
- FIG. 3 shows the principle of color separation in the field sequence color display.
- FIG. 4 is a flow chart of a field sequence color display driving method according to an embodiment of the present disclosure.
- FIG. 5 is a flow chart of a field sequence color display driving method according to another embodiment of the present disclosure.
- FIG. 6 is a flow chart of calculating the eye movement amplitude in FIG. 5 .
- FIG. 7 is a diagram of the eye state data shown in FIG. 5 .
- FIG. 8 is a flow chart of calculating the target pixel coordinate according to an embodiment of the present disclosure.
- FIG. 9 is a diagram of the eye initial angle and the eye rotation angle according to an embodiment of the present disclosure.
- FIG. 10 illustrates a block diagram of a display device according to another embodiment of the present disclosure.
- the terms “first” and “second” are for illustrative purposes only and are not to be construed as indicating or imposing a relative importance or implicitly indicating the number of technical features indicated. Thus, a feature limited by “first” or “second” may expressly or implicitly include at least one of the features.
- when it is described that an element is “connected” to another element, the element may be “directly connected” to the other element, or “electrically connected” to the other element through a third element.
- a field sequence color (FSC) display driving method is disclosed according to an embodiment of the present disclosure. Please refer to FIGS. 4 to 9 .
- the FSC display driving method comprises the following steps:
- the field sequence color display driving method in the present disclosure obtains the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtains the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then displays the pixel data to be displayed according to the target pixel coordinate.
- this can improve the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
- the eye state data include the eye movement distance S, the movement time t, and the eye angle d.
- the eye movement distance S and the eye angle d are shown in FIG. 7 .
- the eye angle d is an angle between the line of sight emitted by the eye and the horizontal direction (x-axis).
- the FSC display driving method can obtain the eye state data by reading the corresponding device/module parameters in the system.
- the eye state data can be provided by an external system or can be calculated by a software after an internal system reads data from a corresponding sensor.
- the eye movement distance S decomposes into an x-direction movement amplitude Wx and a y-direction movement amplitude Wy along the X-axis and Y-axis of the screen coordinates.
- the x-direction movement amplitude Wx = S*cos d.
- the y-direction movement amplitude Wy = S*sin d.
- the eye movement amplitude W = MAX(Wx, Wy). That is, the greater of Wx and Wy is taken as the eye movement amplitude.
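The decomposition above can be sketched in code. This is only an illustration of the arithmetic; the function name, units, and the degree-based angle input are assumptions, not from the patent:

```python
import math

def eye_movement_amplitude(S: float, d_deg: float) -> float:
    """Decompose the eye movement distance S along the screen axes and
    take the larger component as the eye movement amplitude W.

    S     -- eye movement distance (arbitrary length unit)
    d_deg -- eye angle d between the line of sight and the horizontal axis, in degrees
    """
    d = math.radians(d_deg)
    Wx = S * math.cos(d)   # x-direction movement amplitude
    Wy = S * math.sin(d)   # y-direction movement amplitude
    return max(Wx, Wy)     # W = MAX(Wx, Wy)
```

The amplitude W returned here is what gets matched against the preset amplitude in the following step.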
- whether the eye movement amplitude W reaches the preset amplitude is determined according to the step shown in FIG. 5 (that is, the eye movement amplitude is matched with the preset amplitude).
- when the eye movement amplitude W is less than the preset amplitude, the eye movement is small and the observed color separation phenomenon is not obvious.
- in that case, the pixel data to be displayed in this frame can be displayed directly; there is no need to calculate the target pixel coordinate corresponding to the pixel data to be displayed, nor to display the pixel data according to that coordinate.
- when the eye movement amplitude W is greater than or equal to the preset amplitude, the eye movement is large and the observed color separation phenomenon is more obvious.
- in that case, the target pixel coordinate corresponding to the pixel data to be displayed is calculated, and the pixel data to be displayed is displayed according to the target pixel coordinate. This effectively alleviates the color separation issue.
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
- the eye rotation radius is an approximate distance between a pupil and the center of eye rotation.
- the above distance is approximate because the eye does not rotate about its exact geometric center. The eye rotation radius is therefore the approximate distance between the pupil and the true center of rotation of the eye.
- the estimated eye movement distance can be calculated by simulating the eye movement speed based on an estimated acceleration model.
- the estimated eye movement distance is S′
- the eye angle is d
- a horizontal component of the estimated eye movement distance is S′x
- a vertical component of the estimated eye movement distance is S′y.
- the estimated eye movement distance S′ can be decomposed into the horizontal component S′x along the X-axis and the vertical component S′y along the Y-axis.
- the eye rotation angle is decomposed in order to finally obtain the eye movement distances along the X-axis and Y-axis, which are used as the target pixel coordinate.
- the eye is treated as a sphere, and the lateral eye rotation angle is obtained by the chord length formula.
- r is the radius of the sphere
- S′*cos d is the lateral component of the movement and is regarded as a chord length; it encodes both the movement direction and the movement distance of the eye.
- the algorithm of this embodiment can more accurately estimate the lateral eye movement angle.
- the algorithm of this embodiment can more accurately estimate the vertical eye movement angle.
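The patent invokes the chord length formula but its explicit expression is not reproduced in this text. Assuming the standard relation that a chord of length c on a sphere of radius r subtends a central angle of 2*arcsin(c/(2r)), the two rotation angles can be sketched as follows (function name and argument order are illustrative):

```python
import math

def eye_rotation_angles(S_est, d, r):
    """Chord-length sketch of the lateral and vertical eye rotation angles.

    S_est -- estimated eye movement distance S'
    d     -- eye angle (radians) between the line of sight and the horizontal
    r     -- eye rotation radius (pupil to approximate center of rotation)

    Treats each movement component as a chord of a sphere of radius r;
    a chord of length c subtends a central angle 2*arcsin(c / (2r)).
    """
    Sx = S_est * math.cos(d)               # horizontal component S'x
    Sy = S_est * math.sin(d)               # vertical component S'y
    beta_x = 2 * math.asin(Sx / (2 * r))   # lateral eye rotation angle
    beta_y = 2 * math.asin(Sy / (2 * r))   # vertical eye rotation angle
    return beta_x, beta_y
```

For small movements the chord and arc nearly coincide, so beta_x approaches S′x / r, which matches the intuition that a short pupil displacement over radius r gives a small rotation angle.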
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle.
- the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel.
- the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
- the eye initial angle, the lateral eye initial angle and the vertical eye initial angle may be obtained directly from an external device or a sensing system, or by a calculation similar to that of the eye rotation angle, the lateral eye rotation angle and the vertical eye rotation angle. Further details are omitted for simplicity.
- the step of obtaining the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y. Then,
- X = L * [tan(αx + βx) − tan(αx)]
- Y = L * [tan(αy + βy) − tan(αy)].
- the eye movement distances in the two directions are taken as the abscissa and ordinate of the target pixel coordinate. This allows the eye movement to match the position shift of the displayed content, thereby alleviating or eliminating the color separation issue.
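The coordinate mapping X = L*[tan(αx + βx) − tan(αx)] (and likewise for Y) is a small-angle screen projection and can be sketched directly; the function name and the radian-based inputs are illustrative assumptions:

```python
import math

def target_pixel_offset(alpha_x, alpha_y, beta_x, beta_y, L):
    """Project the eye angles onto the screen plane at viewing distance L.

    alpha_x, alpha_y -- lateral / vertical eye initial angles (radians)
    beta_x,  beta_y  -- lateral / vertical eye rotation angles (radians)
    L                -- viewing distance (same length unit as the result)

    X = L * [tan(alpha_x + beta_x) - tan(alpha_x)]
    Y = L * [tan(alpha_y + beta_y) - tan(alpha_y)]
    """
    X = L * (math.tan(alpha_x + beta_x) - math.tan(alpha_x))
    Y = L * (math.tan(alpha_y + beta_y) - math.tan(alpha_y))
    return X, Y
```

With a zero initial angle this reduces to X = L*tan(βx), i.e. the on-screen distance swept by a rotation of βx viewed from distance L, which is the geometric intuition behind the formula.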
- the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and obtaining the eye state data when the eye movement amplitude is greater than or equal to the preset amplitude.
- the eye state data is obtained only when the eye movement amplitude is greater than or equal to the preset amplitude. This can reduce the required operations for obtaining the eye state data, thereby reducing the working load of the sensor or the sensing system and reducing the power consumption. Alternatively, the calculation amount of the FSC display driving method can be reduced. This could improve the efficiency of the FSC display driving method.
- the field sequence color display driving method and the display device in the present disclosure obtain the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtain the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then display the pixel data to be displayed according to the target pixel coordinate.
- this can improve the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Ophthalmology & Optometry (AREA)
- General Health & Medical Sciences (AREA)
- Health & Medical Sciences (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Computer Hardware Design (AREA)
- Optics & Photonics (AREA)
- Chemical & Material Sciences (AREA)
- Crystallography & Structural Chemistry (AREA)
- Control Of Indicators Other Than Cathode Ray Tubes (AREA)
- Eye Examination Apparatus (AREA)
Abstract
Description
- Step S10: matching an eye movement amplitude with a preset amplitude.
- Step S20: obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude.
- Step S30: obtaining a target pixel coordinate corresponding to pixel data to be displayed based on the eye state data.
- Step S40: displaying the pixel data to be displayed according to the target pixel coordinate.
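Steps S10 through S40 can be sketched as the following control flow. All callables and names here are illustrative stand-ins for the sensing and display subsystems, not interfaces defined by the patent:

```python
def fsc_drive_frame(W, preset_amplitude, read_eye_state, compute_target_coord, display):
    """Control-flow sketch of steps S10-S40 for one frame.

    W                    -- eye movement amplitude for this frame
    preset_amplitude     -- threshold below which no compensation is applied
    read_eye_state       -- callable returning eye state data
    compute_target_coord -- callable mapping eye state data to a target pixel coordinate
    display              -- callable that displays the pixel data at a coordinate
                            (None means display directly, without compensation)
    """
    # S10: match the eye movement amplitude with the preset amplitude
    if W < preset_amplitude:
        # small movement: color separation is not obvious, display directly
        display(None)
        return None
    # S20: obtain the eye state data only once the threshold is reached
    state = read_eye_state()
    # S30: obtain the target pixel coordinate from the eye state data
    coord = compute_target_coord(state)
    # S40: display the pixel data according to the target pixel coordinate
    display(coord)
    return coord
```

Reading the eye state only after the threshold check mirrors the document's point that skipping the sensing and coordinate calculation for small movements reduces sensor load and power consumption.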
Claims (16)
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202310099729.8 | 2023-01-31 | ||
| CN202310099729.8A CN117524123B (en) | 2023-01-31 | 2023-01-31 | Field color sequence display driving method and display device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20240257558A1 (en) | 2024-08-01 |
| US12211316B2 (en) | 2025-01-28 |
Family
ID=89763215
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US18/458,155 Active US12211316B2 (en) | 2023-01-31 | 2023-08-30 | Field sequence color display driving method and display device |
Country Status (2)
| Country | Link |
|---|---|
| US (1) | US12211316B2 (en) |
| CN (1) | CN117524123B (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160085302A1 (en) * | 2014-05-09 | 2016-03-24 | Eyefluence, Inc. | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
| US20210373657A1 (en) * | 2020-05-26 | 2021-12-02 | Sony Interactive Entertainment Inc. | Gaze tracking apparatus and systems |
| US20220334636A1 (en) * | 2021-04-19 | 2022-10-20 | Varjo Technologies Oy | Display apparatuses and methods for calibration of gaze-tracking |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101589329B (en) * | 2007-11-21 | 2011-10-12 | 松下电器产业株式会社 | display device |
| US8970495B1 (en) * | 2012-03-09 | 2015-03-03 | Google Inc. | Image stabilization for color-sequential displays |
| WO2016115049A2 (en) * | 2015-01-13 | 2016-07-21 | Magic Leap, Inc. | Improved color sequential display |
| CN114935971B (en) * | 2021-02-05 | 2024-08-20 | 京东方科技集团股份有限公司 | Display device and display driving method |
| CN113971834A (en) * | 2021-10-23 | 2022-01-25 | 郑州大学 | Eye tracking method and system based on virtual reality |
| CN114360043B (en) * | 2022-03-18 | 2022-06-17 | 南昌虚拟现实研究院股份有限公司 | Model parameter calibration method, sight tracking method, device, medium and equipment |
-
2023
- 2023-01-31 CN CN202310099729.8A patent/CN117524123B/en active Active
- 2023-08-30 US US18/458,155 patent/US12211316B2/en active Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20160085302A1 (en) * | 2014-05-09 | 2016-03-24 | Eyefluence, Inc. | Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects |
| US20210373657A1 (en) * | 2020-05-26 | 2021-12-02 | Sony Interactive Entertainment Inc. | Gaze tracking apparatus and systems |
| US20220334636A1 (en) * | 2021-04-19 | 2022-10-20 | Varjo Technologies Oy | Display apparatuses and methods for calibration of gaze-tracking |
Also Published As
| Publication number | Publication date |
|---|---|
| CN117524123A (en) | 2024-02-06 |
| US20240257558A1 (en) | 2024-08-01 |
| CN117524123B (en) | 2025-12-05 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US8830221B2 (en) | Image privacy protecting method | |
| US8384619B2 (en) | Graphic meter display | |
| CN107068102B (en) | Image processing method, image processing device and display device | |
| US20210225329A1 (en) | Image display method, display system and computer-readable storage medium | |
| JP4507936B2 (en) | Image display device and electronic apparatus | |
| KR102794876B1 (en) | Dynamic panel masking | |
| JP6281985B2 (en) | Transparent display device | |
| CN107277419A (en) | A kind of display device and its display methods | |
| US12107091B2 (en) | Display device and display system | |
| WO2021208646A1 (en) | Display method, system and device of vehicle a pillar display assembly, and storage medium | |
| WO2024244852A9 (en) | Liquid crystal display device, image display method, and electronic device | |
| US12211316B2 (en) | Field sequence color display driving method and display device | |
| US20180302613A1 (en) | Method and apparatus for controlling naked eye stereoscopic display and display device | |
| US20190122619A1 (en) | Liquid crystal display device | |
| US12300161B2 (en) | Optical crosstalk compensation for foveated display | |
| WO2024244853A9 (en) | Liquid crystal display apparatus, image display method, and electronic device | |
| WO2012012955A1 (en) | Liquid crystal display and pixel unit thereof | |
| US12536971B2 (en) | Display device and display system | |
| CN106647062A (en) | Pixel structure, liquid crystal panel and stereoscopic display | |
| JP7223567B2 (en) | liquid crystal display | |
| US11132967B2 (en) | Liquid crystal display device having superposed display panels | |
| KR101686093B1 (en) | Viewing Angle Image Control Liquid Crystal Display Device and Driving Method for the Same | |
| JP2022046378A (en) | Display device and display system | |
| EP4243005A1 (en) | Display device and method of driving the same | |
| EP4390630A1 (en) | Method and device for naked eye 3d displaying vehicle instrument |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| AS | Assignment |
Owner name: WUHAN CHINA STAR OPTOELECTRONICS TECHNOLOGY CO., LTD., CHINA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MA, CHANGWEN;ZHA, GUOWEI;REEL/FRAME:064744/0026 Effective date: 20230823 |
|
| FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED |
|
| STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
|
| STCF | Information on status: patent grant |
Free format text: PATENTED CASE |