US12211316B2 - Field sequence color display driving method and display device - Google Patents


Info

Publication number
US12211316B2
Authority
US
United States
Prior art keywords
eye
angle
estimated
movement distance
determining
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
US18/458,155
Other versions
US20240257558A1 (en)
Inventor
Changwen MA
Guowei Zha
Current Assignee
Wuhan China Star Optoelectronics Technology Co Ltd
Original Assignee
Wuhan China Star Optoelectronics Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Wuhan China Star Optoelectronics Technology Co., Ltd.
Assigned to Wuhan China Star Optoelectronics Technology Co., Ltd. Assignors: MA, Changwen; ZHA, Guowei
Publication of US20240257558A1
Application granted
Publication of US12211316B2
Legal status: Active

Classifications

    • G — PHYSICS
    • G02 — OPTICS
    • G02B — Optical elements, systems or apparatus
    • G02B27/0093 — Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G06 — COMPUTING OR CALCULATING; COUNTING
    • G06V — Image or video recognition or understanding
    • G06V40/18 — Eye characteristics, e.g. of the iris
    • G06V40/19 — Sensors therefor
    • G06V40/193 — Preprocessing; Feature extraction
    • G06V40/197 — Matching; Classification
    • G09 — EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G — Arrangements or circuits for control of indicating devices using static means to present variable information
    • G09G3/20 — Control arrangements or circuits for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix
    • G09G3/2003 — Display of colours
    • G09G3/36 — Control of light from an independent source using liquid crystals
    • G09G3/3607 — Displaying colours or grey scales with a specific pixel layout, e.g. using sub-pixels

Definitions

  • FIG. 1 is a diagram of a conventional field sequence color display.
  • FIG. 2 shows a comparison before and after color separation in the conventional field sequence color display.
  • FIG. 3 shows the principle of color separation in the field sequence color display.
  • FIG. 4 is a flow chart of a field sequence color display driving method according to an embodiment of the present disclosure.
  • FIG. 5 is a flow chart of a field sequence color display driving method according to another embodiment of the present disclosure.
  • FIG. 6 is a flow chart of calculating the eye movement amplitude in FIG. 5 .
  • FIG. 7 is a diagram of the eye state data shown in FIG. 5 .
  • FIG. 8 is a flow chart of calculating the target pixel coordinate according to an embodiment of the present disclosure.
  • FIG. 9 is a diagram of the eye initial angle and the eye rotation angle according to an embodiment of the present disclosure.
  • FIG. 10 illustrates a block diagram of a display device according to another embodiment of the present disclosure.
  • The terms “first” and “second” are for illustrative purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature limited by “first” or “second” may expressly or implicitly include at least one such feature.
  • When it is described that an element is “connected” to another element, the element may be “directly connected” to the other element, or “electrically connected” to the other element through a third element.
  • a field sequence color (FSC) display driving method is disclosed according to an embodiment of the present disclosure. Please refer to FIGS. 4 to 9 .
  • the FSC display driving method comprises the following steps:
  • the field sequence color display driving method in the present disclosure obtains the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtains the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then displays the pixel data to be displayed according to the target pixel coordinate.
  • this can improve the matching degree between the display content and the eye movement, thereby improving the color separation phenomenon.
  • the eye state data include the eye movement distance S, the movement time t, and the eye angle d.
  • the eye movement distance S and the eye angle d are shown in FIG. 7 .
  • the eye angle d is an angle between the line of sight emitted by the eye and the horizontal direction (x-axis).
  • the FSC display driving method can obtain the eye state data by reading the corresponding device/module parameters in the system.
  • the eye state data can be provided by an external system or can be calculated by a software after an internal system reads data from a corresponding sensor.
  • the eye movement distance S corresponds to an x-direction movement amplitude Wx and a y-direction movement amplitude Wy along the X-axis and Y-axis of the screen coordinates.
  • the x-direction movement amplitude Wx = S*cos d.
  • the y-direction movement amplitude Wy = S*sin d.
  • the eye movement amplitude W = MAX(Wx, Wy). That is, the greater of the values Wx and Wy is taken as the eye movement amplitude.
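  • The amplitude computation above can be sketched in a few lines of Python. This is an illustrative sketch, not code from the patent; the function name and units are assumptions.

```python
import math

def eye_movement_amplitude(S, d):
    """Decompose the eye movement distance S along the screen axes
    using the eye angle d (radians), then take the larger component
    as the eye movement amplitude W = MAX(Wx, Wy)."""
    Wx = S * math.cos(d)   # x-direction movement amplitude
    Wy = S * math.sin(d)   # y-direction movement amplitude
    return max(Wx, Wy)
```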
  • whether the eye movement amplitude W reaches the preset amplitude is determined according to the step shown in FIG. 5 (that is, matching the eye movement amplitude with the preset amplitude).
  • when the eye movement amplitude W is less than the preset amplitude, the eye movement amplitude W is small and the observed color separation phenomenon is not obvious.
  • in this case, the pixel data to be displayed in this frame can be displayed directly; there is no need to calculate the target pixel coordinate corresponding to the pixel data to be displayed or to display the pixel data according to the target pixel coordinate.
  • when the eye movement amplitude W is greater than or equal to the preset amplitude, the eye movement amplitude W is large and the observed color separation phenomenon is more obvious.
  • in this case, the target pixel coordinate corresponding to the pixel data to be displayed is calculated, and the pixel data to be displayed is displayed according to the target pixel coordinate. This can effectively alleviate the color separation issue.
  • the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
  • the eye rotation radius is an approximate distance between a pupil and the center of eye rotation.
  • the above approximate distance is not the true radius of the eye because the eye does not rotate with its exact center as the rotation point. So the eye rotation radius is the approximate distance between the pupil and the true center of rotation of the eye.
  • the estimated eye movement distance can be calculated by simulating the eye movement speed based on the estimated acceleration model:
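  • The extracted text omits the model's formula. As a purely illustrative stand-in (an assumption, not the patent's model), a constant-acceleration estimate over a sampling interval t could look like:

```python
def estimated_movement_distance(v0, a, t):
    # Hypothetical constant-acceleration sketch: S' = v0*t + 0.5*a*t^2.
    # v0: last measured eye speed, a: estimated acceleration, t: interval.
    return v0 * t + 0.5 * a * t * t
```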
  • the estimated eye movement distance is S′
  • the eye angle is d
  • a horizontal component of the estimated eye movement distance is S′x
  • a vertical component of the estimated eye movement distance is S′y.
  • the estimated eye movement distance S′ could be decomposed into the horizontal component S′x of the estimated eye movement in the X-axis and the vertical component S′y of the estimated eye movement distance in the Y-axis.
  • the eye rotation angle is decomposed in order to finally obtain the eye movement distance in the X-axis and Y-axis, and to use the movement distance as the target pixel coordinate.
  • the eye is treated as a sphere, and the lateral eye rotation angle is obtained by the chord length formula.
  • r is the radius of the sphere
  • S′*cos d is the vector distance and also regarded as a chord length, which includes the movement direction and the movement distance of the eye.
  • the algorithm of this embodiment can more accurately estimate the lateral eye movement angle.
  • the algorithm of this embodiment can more accurately estimate the vertical eye movement angle.
  • the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle.
  • the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel.
  • the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
  • the eye initial angle, the lateral eye initial angle and the vertical eye initial angle may be directly obtained by an external device or a sensing system, or may be obtained by a similar calculation method of the eye rotation angle, the lateral eye rotation angle and the vertical eye rotation angle. Further details are omitted for simplicity.
  • the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y. Then,
  • X = L * [tan(αx + θx) - tan(αx)]
  • Y = L * [tan(αy + θy) - tan(αy)].
  • the eye movement distances in the two different directions are taken as the abscissa and the ordinate of the target pixel coordinate. This allows the eye movement speed to match the position movement of the display screen, thereby alleviating or eliminating the color separation issue.
  • the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and obtaining the eye state data when the eye movement amplitude is greater than or equal to the preset amplitude.
  • the eye state data is obtained only when the eye movement amplitude is greater than or equal to the preset amplitude. This can reduce the required operations for obtaining the eye state data, thereby reducing the working load of the sensor or the sensing system and reducing the power consumption. Alternatively, the calculation amount of the FSC display driving method can be reduced. This could improve the efficiency of the FSC display driving method.
  • the field sequence color display driving method and the display device in the present disclosure obtain the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtain the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then display the pixel data to be displayed according to the target pixel coordinate.
  • this can improve the matching degree between the display content and the eye movement, thereby improving the color separation phenomenon.


Abstract

A field sequence color (FSC) display driving method includes matching an eye movement amplitude with a preset amplitude, obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude, and obtaining a target pixel coordinate corresponding to pixel data to be displayed based on the eye state data. The pixel data is displayed according to the target pixel coordinate.

Description

CROSS REFERENCE TO RELATED APPLICATION
This application claims priority to Chinese Application No. 202310099729.8, filed on Jan. 31, 2023. The entire disclosure of the above application is incorporated herein by reference.
FIELD OF THE DISCLOSURE
The present disclosure relates to a display technology, and more particularly, to a field sequence color display driving method and a display device.
BACKGROUND
Different from the conventional Liquid Crystal Display (LCD) technology that uses red, green and blue (RGB) three-color sub-pixel space superposition to achieve color display, the field sequence color (FSC) display technology mainly uses time frequency color switching and superposition to achieve color display. For example, when any pixel on the screen switches between the R field sequence, the G field sequence, and the B field sequence at a frequency of 240 Hz (as shown in the left, center, and right images in FIG. 1), the overall display effect can be considered equivalent to a 60 Hz full-color display.
The advantage of this technology is that RGB color is no longer provided by color filters (CF) on the LCD. Instead, it can be directly provided by the color backlight, which will eliminate the loss of light efficiency caused by CFs, improve energy utilization, help improve brightness and reduce power consumption. At the same time, since RGB three sub-pixels are no longer required compared with traditional LCD display technology, the display resolution (PPI) is increased by 3 times under the same sub-pixel conditions.
However, this technology also has its own unique issue, such as the visual perception of color separation problems shown in FIG. 2 . That is, when the human eye moves relative to the screen when the screen is displaying an image (that is, when the eye rotates horizontally), the edges of the white target will appear red and yellow separated colors.
The principle of color separation is shown in FIG. 3 . The X axis represents the horizontal position, and the Y axis represents the time. If the human eye rotates under certain circumstances (it means that there is a movement of observation point), within one frame (1 frame time), the RGB component of the same point in an image scanned in a certain time order will fall in different positions on the retina, so the human eye will observe the color separation of the image, which is called the color separation phenomenon.
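To see why these offsets are visible, consider a rough back-of-the-envelope estimate. All numbers below (gaze sweep speed, viewing distance) are illustrative assumptions, not values from the disclosure:

```python
import math

# Sub-fields switch at 240 Hz (FIG. 1); assume the gaze sweeps at
# 30 deg/s and the viewer sits 500 mm from the screen.
FIELD_RATE_HZ = 240
omega = math.radians(30)   # assumed angular sweep speed (rad/s)
L_mm = 500.0               # assumed viewing distance (mm)

dt = 1.0 / FIELD_RATE_HZ                 # time between consecutive sub-fields
offset_mm = L_mm * math.tan(omega * dt)  # screen-plane shift per sub-field
```

A shift on the order of a millimetre between the R, G, and B sub-fields of the same pixel is enough to produce the coloured fringes shown in FIG. 2.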
SUMMARY
One objective of an embodiment of the present disclosure is to provide a field sequence color display driving method and a display device, to alleviate the color separation phenomenon.
According to an embodiment of the present disclosure, a field sequence color (FSC) display driving method is disclosed. The FSC display driving method includes: matching an eye movement amplitude with a preset amplitude; obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude; obtaining a target pixel coordinate of the target corresponding to a pixel data to be displayed based on the eye state data; and displaying the pixel data to be displayed according to the target pixel coordinate.
In some embodiments of the present disclosure, the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation; and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
In some embodiments of the present disclosure, the estimated eye movement distance is S′, the eye angle is d, a horizontal component of the estimated eye movement distance is S′x, and a vertical component of the estimated eye movement distance is S′y; and the step of determining the eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius comprises: determining the horizontal component of the estimated eye movement distance S′x=S′*cos d; and determining the vertical component of the estimated eye movement distance S′y=S′*sin d.
In some embodiments of the present disclosure, the step of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises: configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle; determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
In some embodiments of the present disclosure, the step of determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance comprises: configuring the eye rotation radius as r, the eye rotation angle as θ, and the lateral eye rotation angle as θx; and determining the lateral eye rotation angle according to S′*cos d=2r*sin θx.
In some embodiments of the present disclosure, the step of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises: configuring the vertical eye rotation angle as θy; and determining the vertical eye rotation angle according to S′*sin d=2r*sin θy.
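Solving the two chord relations above for the rotation angles gives θx = arcsin(S′·cos d / (2r)) and θy = arcsin(S′·sin d / (2r)). A minimal Python sketch of that step (function name is an assumption; angles in radians):

```python
import math

def eye_rotation_angles(S_est, d, r):
    """Solve S'*cos d = 2r*sin(theta_x) and S'*sin d = 2r*sin(theta_y)
    for the lateral and vertical eye rotation angles (radians).
    Requires |S'*cos d| <= 2r and |S'*sin d| <= 2r."""
    theta_x = math.asin(S_est * math.cos(d) / (2.0 * r))
    theta_y = math.asin(S_est * math.sin(d) / (2.0 * r))
    return theta_x, theta_y
```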
In some embodiments of the present disclosure, the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel. The eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
In some embodiments of the present disclosure, the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y; where
X = L*[tan(αx + θx) - tan(αx)], and Y = L*[tan(αy + θy) - tan(αy)].
In some embodiments of the present disclosure, the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
According to another embodiment of the present disclosure, a display device is disclosed. The display device includes a processor and a memory storing program instructions executable by the processor to execute operations. The operations include: matching an eye movement amplitude with a preset amplitude; obtaining eye state data according to the matching result of the eye movement amplitude and the preset amplitude; obtaining a target pixel coordinate of the target corresponding to a pixel data to be displayed based on the eye state data; and displaying the pixel data to be displayed according to the target pixel coordinate.
In some embodiments of the present disclosure, the operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation; and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
In some embodiments of the present disclosure, the estimated eye movement distance is S′, the eye angle is d, a horizontal component of the estimated eye movement distance is S′x, and a vertical component of the estimated eye movement distance is S′y; and the operation of determining the eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius comprises: determining the horizontal component of the estimated eye movement distance S′x=S′*cos d; and determining the vertical component of the estimated eye movement distance S′y=S′*sin d.
In some embodiments of the present disclosure, the operation of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises: configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle; determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
In some embodiments of the present disclosure, the operation of determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance comprises: configuring the eye rotation radius as r, the eye rotation angle as θ, and the lateral eye rotation angle as θx; and determining the lateral eye rotation angle according to S′*cos d = 2r*sin θx.
In some embodiments of the present disclosure, the operation of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises: configuring the vertical eye rotation angle as θy; and determining the vertical eye rotation angle according to S′*sin d = 2r*sin θy.
In some embodiments of the present disclosure, the operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel. The eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
In some embodiments of the present disclosure, the operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y; where
X = L*[tan(αx + θx) - tan(αx)], and Y = L*[tan(αy + θy) - tan(αy)].
In some embodiments of the present disclosure, the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
The field sequence color display driving method and the display device in the present disclosure obtain the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtain the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then display the pixel data to be displayed according to the target pixel coordinate. Displaying the pixel data according to a target pixel coordinate derived from the eye state data improves the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
BRIEF DESCRIPTION OF THE DRAWINGS
To more clearly illustrate the technical solutions in the embodiments of the present disclosure, the following briefly introduces the drawings required in the description of the embodiments. Obviously, the drawings described below show only some embodiments of the present disclosure; those skilled in the art may obtain other drawings from these drawings without creative effort.
FIG. 1 is a diagram of a conventional field sequence color display.
FIG. 2 shows a comparison before and after color separation in the conventional field sequence color display.
FIG. 3 shows the principle of color separation in the field sequence color display.
FIG. 4 is a flow chart of a field sequence color display driving method according to an embodiment of the present disclosure.
FIG. 5 is a flow chart of a field sequence color display driving method according to another embodiment of the present disclosure.
FIG. 6 is a flow chart of calculating the eye movement amplitude in FIG. 5 .
FIG. 7 is a diagram of the eye state data shown in FIG. 5 .
FIG. 8 is a flow chart of calculating the target pixel coordinate according to an embodiment of the present disclosure.
FIG. 9 is a diagram of the eye initial angle and the eye rotation angle according to an embodiment of the present disclosure.
FIG. 10 illustrates a block diagram of a display device according to another embodiment of the present disclosure.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
To help a person skilled in the art better understand the solutions of the present disclosure, the following clearly and completely describes the technical solutions in the embodiments of the present disclosure with reference to the accompanying drawings. Apparently, the described embodiments are a part rather than all of the embodiments of the present disclosure. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative efforts shall fall within the protection scope of the present disclosure.
The terms “first” and “second” are for illustrative purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature limited by “first” or “second” may expressly or implicitly include at least one of such features.
In the description of the present disclosure, the meaning of “plural” is two or more, unless otherwise specifically defined.
Throughout the specification and claims, when it is described that an element is “connected” to another element, the element may be “directly connected” to the other element, or “electrically connected” to the other element through a third element.
Furthermore, the term “comprising” will be understood as meaning the inclusion of elements but not the exclusion of any other elements, unless explicitly described to the contrary.
The following disclosure provides many different embodiments or examples to implement different structures of the present disclosure. To simplify the disclosure, the components and settings of specific examples are described below. They are for example purposes only and are not intended to limit this application. Further, the present disclosure may repeat reference numbers and/or reference letters in different examples; such repetition is for simplification and clarity, and does not by itself indicate the relationship between the various embodiments and/or settings discussed. Further, the present disclosure provides examples of various specific processes and materials, but those of ordinary skill in the art may be aware of the application of other processes and/or the use of other materials. It should be noted that the order of description of the following embodiments is not a qualification of the preferred order of the embodiments.
In view of the aforementioned color separation issue, a field sequence color (FSC) display driving method is disclosed according to an embodiment of the present disclosure. Please refer to FIGS. 4 to 9 . As shown in FIG. 4 , the FSC display driving method comprises the following steps:
    • Step S10: matching an eye movement amplitude with a preset amplitude.
    • Step S20: obtaining an eye state data according to the matching result of the eye movement amplitude and the preset amplitude.
    • Step S30: obtaining a target pixel coordinate of the target corresponding to a pixel data to be displayed based on the eyeball state data.
    • Step S40: displaying the pixel data to be displayed according to the target pixel coordinate.
The field sequence color display driving method in the present disclosure obtains the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtains the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then displays the pixel data to be displayed according to the target pixel coordinate. Displaying the pixel data according to a target pixel coordinate derived from the eye state data improves the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
Before matching the eye movement amplitude with the preset amplitude, the method may further include obtaining the eye state data shown in FIG. 5 and then calculating the eye movement amplitude.
The eye state data include the eye movement distance S, the movement time t, and the eye angle d. The eye movement distance S and the eye angle d are shown in FIG. 7 . The eye angle d is an angle between the line of sight emitted by the eye and the horizontal direction (x-axis).
The FSC display driving method can obtain the eye state data by reading the corresponding device/module parameters in the system. Alternatively, the eye state data can be provided by an external system or can be calculated by a software after an internal system reads data from a corresponding sensor.
As shown in FIG. 6 and FIG. 7 , the eye movement distance S corresponds to the x-direction movement amplitude Wx and the y-direction movement amplitude Wy along the X-axis and Y-axis of the screen coordinates. Here, the x-direction movement amplitude Wx = S*cos d, and the y-direction movement amplitude Wy = S*sin d. The eye movement amplitude W = MAX(Wx, Wy); that is, the greater of the values Wx and Wy is taken as the eye movement amplitude.
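The decomposition above can be sketched in a few lines of Python (a minimal illustration only; the function name `eye_movement_amplitude` and the use of radians for the eye angle are our assumptions, not part of the disclosure):

```python
import math

def eye_movement_amplitude(S, d):
    """Decompose the eye movement distance S (at eye angle d, in radians,
    measured from the horizontal x-axis) into screen-axis amplitudes and
    take the larger one as the eye movement amplitude W."""
    Wx = S * math.cos(d)   # x-direction movement amplitude
    Wy = S * math.sin(d)   # y-direction movement amplitude
    return max(Wx, Wy)     # W = MAX(Wx, Wy)
```

The returned W would then be compared against the preset amplitude to decide whether coordinate correction is needed.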
Then, whether the eye movement amplitude W reaches the preset amplitude is determined according to the step shown in FIG. 5 (that is, to match the eye movement amplitude with the preset amplitude). When the eye movement amplitude W is less than the preset amplitude, this indicates that the eye movement amplitude W is small and the observed color separation phenomenon is not obvious. At this time, the pixel data to be displayed in this frame can be directly displayed, and there is no need to calculate the target pixel coordinate corresponding to the pixel data to be displayed and displaying the pixel data to be displayed according to the target pixel coordinate. When the eye movement amplitude W is greater than or equal to the preset amplitude, this indicates that the eye movement amplitude W is large and the observed color separation phenomenon is more obvious. In this case, the target pixel coordinate corresponding to the pixel data to be displayed is calculated, and the pixel data to be displayed is displayed according to the target pixel coordinate. This can effectively improve this color separation issue.
The step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; and determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius. Here, the eye rotation radius is an approximate distance between a pupil and the center of eye rotation.
The above approximate distance is not the true radius of the eye because the eye does not rotate with its exact center as the rotation point. So the eye rotation radius is the approximate distance between the pupil and the true center of rotation of the eye.
The estimated eye movement distance can be calculated by simulating the eye movement speed based on the estimated acceleration model:
As shown in FIG. 8 , first, the eye movement acceleration a = dv/dt is calculated. Then, based on the eye movement acceleration a, the eye movement speed v = ∫a dt is estimated, and the estimated eye movement distance S′ = ∫v dt is calculated.
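Under the stated acceleration model, the double integration can be approximated numerically from sampled acceleration values. The sketch below is an assumption for illustration (the function name, forward-Euler scheme and uniform time step are not taken from the disclosure):

```python
def estimate_distance(accels, dt, v0=0.0):
    """Numerically integrate sampled eye movement acceleration a (one
    value per uniform time step dt, starting from speed v0) twice:
    first into speed v = integral of a dt, then into the estimated
    eye movement distance S' = integral of v dt (forward Euler)."""
    v = v0
    s = 0.0
    for a in accels:
        v += a * dt   # v  = ∫ a dt
        s += v * dt   # S' = ∫ v dt
    return s
```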
The eye rotation radius can be preset data or predictive data, or it can be obtained by an external device. Compared with directly obtaining the eye movement angle through the external device, the algorithm of this embodiment can improve the accuracy of the eye movement angle and reduce the required sensors.
Here, the estimated eye movement distance is S′, the eye angle is d, a horizontal component of the estimated eye movement distance is S′x, and a vertical component of the estimated eye movement distance is S′y. The step of determining the eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius comprises: determining the horizontal component of the estimated eye movement distance S′x=S′*cos d; and determining the vertical component of the estimated eye movement distance S′y=S′*sin d.
As shown in FIG. 8 , the estimated eye movement distance S′ could be decomposed into the horizontal component S′x of the estimated eye movement in the X-axis and the vertical component S′y of the estimated eye movement distance in the Y-axis.
The step of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises: configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle; determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
In this embodiment, the eye rotation angle is decomposed in order to finally obtain the eye movement distance in the X-axis and Y-axis and uses the movement distance as the target pixel coordinate.
The step of determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance comprises: configuring the eye rotation radius as r, the eye rotation angle as θ, and the lateral eye rotation angle as θx; and determining the lateral eye rotation angle according to S′*cos d = 2r*sin θx.
In this embodiment, the eye is treated as a sphere, and the lateral eye rotation angle is obtained by the chord length formula. Here, r is the radius of the sphere, S′*cos d is the vector distance and also regarded as a chord length, which includes the movement direction and the movement distance of the eye.
Furthermore, compared with utilizing an external device or sensor to directly obtain the lateral eye movement angle, the algorithm of this embodiment can more accurately estimate the lateral eye movement angle.
The step of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises: configuring the vertical eye rotation angle as θy; and determining the vertical eye rotation angle according to S′*sin d=2r*sin θy.
In this embodiment, the eye is treated as a sphere, and the vertical eye rotation angle is obtained by the chord length formula. Here, r is the radius of the sphere, S′*sin d is another vector distance and also regarded as another chord length, which also includes the movement direction and the movement distance of the eye.
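Both chord-length relations can be solved for the rotation angles with an arcsine. A hypothetical sketch (the function name and radian convention are assumptions; the chord must not exceed the diameter 2r for the arcsine to be defined):

```python
import math

def eye_rotation_angles(S_prime, d, r):
    """Solve the chord-length relations S'*cos d = 2r*sin(theta_x) and
    S'*sin d = 2r*sin(theta_y) for the lateral and vertical eye rotation
    angles, treating the eye as a sphere of radius r (angles in radians)."""
    theta_x = math.asin((S_prime * math.cos(d)) / (2 * r))
    theta_y = math.asin((S_prime * math.sin(d)) / (2 * r))
    return theta_x, theta_y
```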
Furthermore, compared with utilizing an external device or sensor to directly obtain the vertical eye movement angle, the algorithm of this embodiment can more accurately estimate the vertical eye movement angle.
The step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: configuring the eye state data to include an eye initial angle. Here, the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel. The eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
In this embodiment, the eye initial angle, the lateral eye initial angle and the vertical eye initial angle may be directly obtained by an external device or a sensing system, or may be obtained by a similar calculation method of the eye rotation angle, the lateral eye rotation angle and the vertical eye rotation angle. Further details are omitted for simplicity.
The step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises: setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y. Then,
X = L*[tan(αx + θx) - tan(αx)], and Y = L*[tan(αy + θy) - tan(αy)].
In this embodiment, the eye movement distances in two different directions are taken as the abscissa and ordinate of the target pixel coordinate. This allows the eye movement speed to match the position movement of the display screen, thereby alleviating or eliminating the color separation issue.
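The coordinate formulas above translate directly into code. The sketch below is illustrative only (function name and radian angles are our assumptions):

```python
import math

def target_pixel_offset(alpha_x, alpha_y, theta_x, theta_y, L):
    """Compute the target pixel coordinate offset from the eye initial
    angles (alpha_x, alpha_y), the eye rotation angles (theta_x, theta_y)
    and the viewing distance L, per X = L*[tan(ax+tx) - tan(ax)] and
    Y = L*[tan(ay+ty) - tan(ay)] (all angles in radians)."""
    X = L * (math.tan(alpha_x + theta_x) - math.tan(alpha_x))
    Y = L * (math.tan(alpha_y + theta_y) - math.tan(alpha_y))
    return X, Y
```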
The step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises: comparing the eye movement amplitude with the preset amplitude; and obtaining the eye state data when the eye movement amplitude is greater than or equal to the preset amplitude.
According to an embodiment, the eye state data is obtained only when the eye movement amplitude is greater than or equal to the preset amplitude. This reduces the operations required to obtain the eye state data, thereby reducing the working load of the sensor or the sensing system and reducing power consumption. It also reduces the calculation amount of the FSC display driving method, thereby improving its efficiency.
Please refer to FIG. 10 illustrating a block diagram of a display device according to another embodiment of the present disclosure. According to another embodiment of the present disclosure, a display device 100 includes a processor 101, a memory 102, and a display 103. The memory 102 stores program instructions executable by the processor 101 to execute operations as provided in the FSC display driving method in the above-mentioned embodiments. The display 103 displays the pixel data to be displayed according to the target pixel coordinate.
The display device in the present disclosure likewise obtains the eye state data according to the matching result of the eye movement amplitude and the preset amplitude, obtains the target pixel coordinate corresponding to the pixel data to be displayed based on the eye state data, and then displays the pixel data to be displayed according to the target pixel coordinate. Displaying the pixel data according to a target pixel coordinate derived from the eye state data improves the matching degree between the display content and the eye movement, thereby alleviating the color separation phenomenon.
It should be noted that the above display device may be an Augmented Reality (AR) device, a Virtual Reality (VR) device, or a Head-Up Display (HUD) device. The above field sequence color display driving method can be integrated into AR devices, VR devices or HUD devices.
In the foregoing embodiments, the descriptions of each embodiment have their own emphases, and for parts not described in detail in a certain embodiment, reference may be made to relevant descriptions of other embodiments.
The above are embodiments of the present disclosure, which do not limit the scope of the present disclosure. Any modifications, equivalent replacements or improvements within the spirit and principles of the embodiments described above shall be covered by the protection scope of the disclosure.

Claims (16)

What is claimed is:
1. A field sequence color (FSC) display driving method, comprising:
matching an eye movement amplitude with a preset amplitude;
obtaining an eye state data according to the matching result of the eye movement amplitude and the preset amplitude;
obtaining a target pixel coordinate of the target corresponding to a pixel data to be displayed based on the eye state data; and
displaying the pixel data to be displayed according to the target pixel coordinate,
wherein the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation; and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and
determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
2. The FSC display driving method of claim 1, wherein the estimated eye movement distance is S′, the eye angle is d, a horizontal component of the estimated eye movement distance is S′x, and a vertical component of the estimated eye movement distance is S′y; and the step of determining the eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius comprises:
determining the horizontal component of the estimated eye movement distance
S′x = S′*cos d; and
determining the vertical component of the estimated eye movement distance
S′y = S′*sin d.
3. The FSC display driving method of claim 2, wherein the step of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises:
configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle;
determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and
determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
4. The FSC display driving method of claim 3, wherein the step of determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance comprises:
configuring the eye rotation radius as r, the eye rotation angle as θ, and the lateral eye rotation angle as θx; and
determining the lateral eye rotation angle according to S′*cos d = 2r*sin θx.
5. The FSC display driving method of claim 4, wherein the step of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises:
configuring the vertical eye rotation angle as θy; and
determining the vertical eye rotation angle according to S′*sin d = 2r*sin θy.
6. The FSC display driving method of claim 5, wherein the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel;
wherein the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
7. The FSC display driving method of claim 6, wherein the step of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y;
where X = L*[tan(αx + θx) - tan(αx)], and Y = L*[tan(αy + θy) - tan(αy)].
8. The FSC display driving method of claim 1, wherein the step of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises:
comparing the eye movement amplitude with the preset amplitude; and
when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
9. A display device, comprising:
a memory, storing program instructions;
a processor, configured to execute the program instructions to perform operations comprising:
matching an eye movement amplitude with a preset amplitude;
obtaining an eye state data according to the matching result of the eye movement amplitude and the preset amplitude; and
obtaining a target pixel coordinate of the target corresponding to a pixel data to be displayed based on the eye state data;
a display, for displaying the pixel data to be displayed according to the target pixel coordinate,
wherein an operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
constructing the eye state data to include an estimated eye movement distance, an eye angle and an eye rotation radius; wherein the eye rotation radius is an approximate distance between a pupil and the center of eye rotation; and the eye angle is an angle between a line of sight emitted by an eye and a horizontal direction; and
determining an eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius.
10. The display device of claim 9, wherein the estimated eye movement distance is S′, the eye angle is d, a horizontal component of the estimated eye movement distance is S′x, and a vertical component of the estimated eye movement distance is S′y; and the step of determining the eye rotation angle according to the estimated eye movement distance, the eye angle and the eye rotation radius comprises:
determining the horizontal component of the estimated eye movement distance
S′x = S′*cos d; and
determining the vertical component of the estimated eye movement distance
S′y = S′*sin d.
11. The display device of claim 10, wherein an operation of determining the eye rotation angle based on the estimated eye movement distance, the eye angle and the eye rotation radius further comprises:
configuring the eye rotation angle to include a lateral eye rotation angle and a vertical eye rotation angle;
determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance; and
determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance.
12. The display device of claim 11, wherein an operation of determining the lateral eye rotation angle according to the eye rotation radius and the horizontal component of the estimated eye movement distance comprises:
configuring the eye rotation radius as r, the eye rotation angle as θ, and the lateral eye rotation angle as θx; and
determining the lateral eye rotation angle according to S′*cos d = 2r*sin θx.
13. The display device of claim 10, wherein an operation of determining the vertical eye rotation angle according to the eye rotation radius and the vertical component of the estimated eye movement distance comprises:
configuring the vertical eye rotation angle as θy; and
determining the vertical eye rotation angle according to S′*sin d = 2r*sin θy.
14. The display device of claim 13, wherein an operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
configuring the eye state data to include an eye initial angle, wherein the eye initial angle is an angle between the line of sight of the pupil and a direct line of sight of the eye facing the display panel;
wherein the eye initial angle comprises a lateral eye initial angle and a vertical eye initial angle.
15. The display device of claim 14, wherein an operation of obtaining the target pixel coordinate of the target corresponding to the pixel data to be displayed based on the eye state data comprises:
setting the lateral eye initial angle as αx, the vertical eye initial angle as αy, a viewing distance as L, an abscissa of the target pixel coordinate as X, and an ordinate of the target pixel coordinate as Y;
where X=L*[tan(αx+θx)−tan(αx)], and Y=L*[tan(αy+θy)−tan(αy)].
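A minimal sketch of the claim-15 coordinate computation (illustrative only; the function name and units are assumptions, and angles are taken in radians) might look like:

```python
import math

def target_pixel_offset(alpha_x, alpha_y, theta_x, theta_y, viewing_distance):
    """Target pixel coordinate per claim 15:
        X = L * [tan(alpha_x + theta_x) - tan(alpha_x)]
        Y = L * [tan(alpha_y + theta_y) - tan(alpha_y)]
    alpha_x, alpha_y: lateral/vertical eye initial angles (radians)
    theta_x, theta_y: lateral/vertical eye rotation angles (radians)
    viewing_distance: L, distance from the eye to the display panel
    """
    X = viewing_distance * (math.tan(alpha_x + theta_x) - math.tan(alpha_x))
    Y = viewing_distance * (math.tan(alpha_y + theta_y) - math.tan(alpha_y))
    return X, Y
```

For an eye initially looking straight at the panel (αx = αy = 0) that rotates laterally by θx, this reduces to X = L*tan(θx) and Y = 0, i.e. the on-screen displacement of the gaze point.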
16. The display device of claim 9, wherein an operation of obtaining the eye state data according to the matching result of the eye movement amplitude and the preset amplitude comprises:
comparing the eye movement amplitude with the preset amplitude; and
when the eye movement amplitude is greater than or equal to the preset amplitude, obtaining the eye state data.
US18/458,155 2023-01-31 2023-08-30 Field sequence color display driving method and display device Active US12211316B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202310099729.8 2023-01-31
CN202310099729.8A CN117524123B (en) 2023-01-31 2023-01-31 Field color sequence display driving method and display device

Publications (2)

Publication Number Publication Date
US20240257558A1 US20240257558A1 (en) 2024-08-01
US12211316B2 true US12211316B2 (en) 2025-01-28

Family

ID=89763215

Family Applications (1)

Application Number Title Priority Date Filing Date
US18/458,155 Active US12211316B2 (en) 2023-01-31 2023-08-30 Field sequence color display driving method and display device

Country Status (2)

Country Link
US (1) US12211316B2 (en)
CN (1) CN117524123B (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160085302A1 (en) * 2014-05-09 2016-03-24 Eyefluence, Inc. Systems and methods for biomechanically-based eye signals for interacting with real and virtual objects
US20210373657A1 (en) * 2020-05-26 2021-12-02 Sony Interactive Entertainment Inc. Gaze tracking apparatus and systems
US20220334636A1 (en) * 2021-04-19 2022-10-20 Varjo Technologies Oy Display apparatuses and methods for calibration of gaze-tracking

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101589329B (en) * 2007-11-21 2011-10-12 Panasonic Corporation Display device
US8970495B1 (en) * 2012-03-09 2015-03-03 Google Inc. Image stabilization for color-sequential displays
WO2016115049A2 (en) * 2015-01-13 2016-07-21 Magic Leap, Inc. Improved color sequential display
CN114935971B (en) * 2021-02-05 2024-08-20 京东方科技集团股份有限公司 Display device and display driving method
CN113971834A (en) * 2021-10-23 2022-01-25 郑州大学 Eye tracking method and system based on virtual reality
CN114360043B (en) * 2022-03-18 2022-06-17 南昌虚拟现实研究院股份有限公司 Model parameter calibration method, sight tracking method, device, medium and equipment


Also Published As

Publication number Publication date
CN117524123A (en) 2024-02-06
US20240257558A1 (en) 2024-08-01
CN117524123B (en) 2025-12-05

Similar Documents

Publication Publication Date Title
US8830221B2 (en) Image privacy protecting method
US8384619B2 (en) Graphic meter display
CN107068102B (en) Image processing method, image processing device and display device
US20210225329A1 (en) Image display method, display system and computer-readable storage medium
JP4507936B2 (en) Image display device and electronic apparatus
KR102794876B1 (en) Dynamic panel masking
JP6281985B2 (en) Transparent display device
CN107277419A (en) A kind of display device and its display methods
US12107091B2 (en) Display device and display system
WO2021208646A1 (en) Display method, system and device of vehicle A-pillar display assembly, and storage medium
WO2024244852A9 (en) Liquid crystal display device, image display method, and electronic device
US12211316B2 (en) Field sequence color display driving method and display device
US20180302613A1 (en) Method and apparatus for controlling naked eye stereoscopic display and display device
US20190122619A1 (en) Liquid crystal display device
US12300161B2 (en) Optical crosstalk compensation for foveated display
WO2024244853A9 (en) Liquid crystal display apparatus, image display method, and electronic device
WO2012012955A1 (en) Liquid crystal display and pixel unit thereof
US12536971B2 (en) Display device and display system
CN106647062A (en) Pixel structure, liquid crystal panel and stereoscopic display
JP7223567B2 (en) liquid crystal display
US11132967B2 (en) Liquid crystal display device having superposed display panels
KR101686093B1 (en) Viewing Angle Image Control Liquid Crystal Display Device and Driving Method for the Same
JP2022046378A (en) Display device and display system
EP4243005A1 (en) Display device and method of driving the same
EP4390630A1 (en) Method and device for naked eye 3d displaying vehicle instrument

Legal Events

Date Code Title Description
AS Assignment

Owner name: WUHAN CHINA STAR OPTOELECTRONICS TECHNOLOGY CO., LTD., CHINA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MA, CHANGWEN;ZHA, GUOWEI;REEL/FRAME:064744/0026

Effective date: 20230823

FEPP Fee payment procedure

Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STPP Information on status: patent application and granting procedure in general

Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER

STPP Information on status: patent application and granting procedure in general

Free format text: NOTICE OF ALLOWANCE MAILED -- APPLICATION RECEIVED IN OFFICE OF PUBLICATIONS

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT RECEIVED

STPP Information on status: patent application and granting procedure in general

Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED

STCF Information on status: patent grant

Free format text: PATENTED CASE