CN115547230B - Video data display processing method and device, micro display screen and storage medium
- Publication number
- CN115547230B (application CN202211469976.4A)
- Authority
- CN
- China
- Prior art keywords
- pixel
- gray scale
- error
- display screen
- error mapping
- Legal status: Active
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/10—Processing, recording or transmission of stereoscopic or multi-view image signals
- H04N13/106—Processing image signals
- H04N13/128—Adjusting depth or disparity
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/0271—Adjustment of the gradation levels within the range of the gradation scale, e.g. by redistribution or clipping
Abstract
The application provides a video data display processing method and device, a micro display screen and a storage medium, and relates to the technical field of micro display. When a micro display screen supporting only a low gray scale depth needs to display a video source with a high gray scale depth, a gray scale precision regulation and control strategy corresponding to the display type of the micro display screen is used to perform gray scale error mapping processing on the high-gray-scale-depth video source data, yielding video data consistent with the low gray scale depth supported by the micro display screen. That is, the detail error between the gray scale depth of the high-gray-scale-depth video source and the low gray scale depth supported by the micro display screen is reasonably mapped or error-transferred to make up for the lost detail, so that the high-gray-scale-depth video source data can be mapped onto the low-gray-scale-depth micro display screen for display. The micro display screen with a low gray scale depth can therefore display a video source with a high gray scale depth without any additional increase in the power consumption or size of the micro display screen.
Description
Technical Field
The application relates to the technical field of micro display, in particular to a video data display processing method and device, a micro display screen and a storage medium.
Background
Micro-display, which is one of the indispensable key technologies in the fields of Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR), largely determines the color fidelity of a picture, the fineness of the picture and the uniformity of picture transitions, and therefore has a great influence on the user's viewing experience.
At present, video sources are developing toward higher gray scale depths, such as Dolby Vision standard movies with a 10-bit gray scale depth, which can improve the viewing experience of users.
However, in the field of micro display, micro display screens supporting a high gray scale depth are not common. If a micro display screen is required to play a video source with a high gray scale depth, the gray scale depth of the micro display screen needs to be increased, which greatly increases the size and power consumption of the display screen, is detrimental to its battery life and heat dissipation, and also increases its cost.
Disclosure of Invention
An object of the present application is to provide a method and an apparatus for processing video data display, a micro display screen, and a storage medium, so as to solve the above-mentioned problems in the prior art.
In order to achieve the above purpose, the technical solutions adopted in the embodiments of the present application are as follows:
in a first aspect, an embodiment of the present application provides a method for displaying and processing video data, where the method includes:
acquiring video source data to be displayed;
determining a target gray scale precision regulation strategy according to the display type of the micro display screen;
performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulation and control strategy to obtain processed video source data, wherein the gray scale depth of the processed video source data is the same as the gray scale depth supported by the micro display screen;
and displaying the processed video source data to the micro display screen.
Optionally, the determining a target gray scale precision regulation and control strategy according to the display type of the micro display screen includes:
and if the display type of the micro display screen is spatial color, determining that the target gray scale precision regulation strategy is a gray scale error mapping strategy between corresponding color sub-pixels on adjacent resolution pixels.
Optionally, the performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulation and control strategy to obtain processed video source data includes:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: a pixel structure, a number of error mapped pixels, a position of each error mapped pixel, and a threshold;
acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel;
determining whether an absolute value of a difference between a pixel value of the current pixel and a pixel value of each of the error mapped pixels is greater than the threshold;
if not, determining the weight value of each error mapping pixel according to the pixel value of the current pixel and the absolute value of the difference value between each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
Optionally, the configuration parameters include: the pixel structure is a 田-shaped (field-shaped) pixel structure, the number of error mapping pixels is 3, and the positions of the error mapping pixels are right, below and below-right;
the obtaining the pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel includes:
acquiring a corresponding color component of a right pixel of the current pixel, a corresponding color component of a lower pixel of the current pixel, and a corresponding color component of a lower right pixel of the current pixel, wherein the color components include: a red component, a green component, a blue component, and a W component.
Optionally, the determining a target gray scale precision regulation and control strategy according to the display type of the micro display screen includes:
and if the display type of the micro display screen is time sequence color, determining the target gray scale precision regulation strategy as a gray scale error mapping strategy of adjacent physical pixels.
Optionally, the performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulation and control strategy to obtain processed video source data includes:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the number of error mapping pixels, the position of each error mapping pixel and the weight value of each error mapping pixel;
acquiring a pixel value of each error mapping pixel according to the number of the error mapping pixels and the position of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
Optionally, the displaying the processed video source data on the micro display screen includes:
if the display type of the micro display screen is spatial color, displaying the processed video source data on the micro display screen by using the pixel structure supported by the spatial color, wherein the pixel structure supported by the spatial color comprises: a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure;
and if the display type of the micro display screen is time sequence color, displaying the processed video source data on the micro display screen by using time division multiplexing physical pixels.
In a second aspect, an embodiment of the present application further provides a video data display processing apparatus, where the apparatus includes:
the acquisition module is used for acquiring video source data to be displayed;
the determining module is used for determining a target gray scale precision regulating and controlling strategy according to the display type of the micro display screen;
the processing module is used for performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulating and controlling strategy to obtain processed video source data, wherein the gray scale depth of the processed video source data is the same as the gray scale depth supported by the micro display screen;
and the display module is used for displaying the processed video source data to the micro display screen.
Optionally, the determining module is further configured to:
and if the display type of the micro display screen is spatial color, determining that the target gray scale precision regulation strategy is a gray scale error mapping strategy between corresponding color sub-pixels on adjacent resolution pixels.
Optionally, the processing module is further configured to:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: a pixel structure, a number of error mapped pixels, a position of each error mapped pixel, and a threshold;
acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel;
determining whether an absolute value of a difference between a pixel value of the current pixel and a pixel value of each of the error mapped pixels is greater than the threshold;
if not, determining the weight value of each error mapping pixel according to the pixel value of the current pixel and the absolute value of the difference value between each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
Optionally, the configuration parameters include: the pixel structure is a 田-shaped (field-shaped) pixel structure, the number of error mapping pixels is 3, and the positions of the error mapping pixels are right, below and below-right;
the processing module is further configured to:
acquiring a corresponding color component of a right pixel of the current pixel, a corresponding color component of a lower pixel of the current pixel, and a corresponding color component of a lower right pixel of the current pixel, wherein the color components include: a red component, a green component, a blue component, and a W component.
Optionally, the determining module is further configured to:
and if the display type of the micro display screen is time sequence color, determining that the target gray scale precision regulation strategy is a gray scale error mapping strategy of adjacent physical pixels.
Optionally, the processing module is further configured to:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the number of error mapping pixels, the position of each error mapping pixel and the weight value of each error mapping pixel;
acquiring a pixel value of each error mapping pixel according to the number of the error mapping pixels and the position of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
Optionally, the display module is further configured to:
if the display type of the micro display screen is a spatial color, displaying the processed video source data on the micro display screen by using a pixel structure supported by the micro display screen, wherein the pixel structure supported by the micro display screen comprises: the pixel structure comprises a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure;
and if the display type of the micro display screen is time sequence color, displaying the processed video source data on the micro display screen by using time division multiplexing physical pixels.
In a third aspect, an embodiment of the present application further provides a micro display screen, including: the driving chip and the micro display chip are in communication through a bus, the driving chip or the micro display chip stores executable machine readable instructions, and when the micro display screen runs, the driving chip or the micro display chip executes the machine readable instructions to execute the steps of the method provided by the first aspect.
In a fourth aspect, embodiments of the present application further provide a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a driving chip or a micro display chip, the computer program performs the steps of the method as provided in the first aspect.
The beneficial effect of this application is:
the embodiment of the application provides a video data display processing method and device, a micro display screen and a storage medium, wherein when a micro display screen supporting low gray scale depth needs to display a video source with high gray scale depth, a gray scale precision regulation and control strategy corresponding to a display type of the micro display screen can be used for performing gray scale error mapping processing on the video source data with high gray scale depth to obtain video data consistent with the low gray scale depth supported by the micro display screen, namely, a detail error between the gray scale depth of the video source with high gray scale depth and the low gray scale depth supported by the micro display screen is reasonably mapped or error transmitted, so that the purpose of making up the loss of details is achieved, and the purpose of mapping the video source data with high gray scale depth to the micro display screen with low gray scale depth for displaying is achieved. Therefore, the purpose of displaying the video source with high gray scale depth by using the micro display screen with low gray scale depth can be achieved without additionally increasing the power consumption and the size of the micro display screen, so that higher gray scale depth visual perception is obtained, the display effect superior to that of the micro display screen with low gray scale depth is obtained, the micro display screen is particularly suitable for application scenes with higher requirements on the power consumption and the size of the micro display screen and higher requirements on the display gray scale depth, and the problem that the micro display screen with low gray scale depth in the prior art cannot display the video source with high gray scale depth is solved.
Drawings
To more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed in the embodiments will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a schematic structural diagram of a micro display screen provided in an embodiment of the present application;
fig. 2 is a schematic structural diagram of a driving chip provided in an embodiment of the present application;
fig. 3 is a schematic flowchart of a video data display processing method according to an embodiment of the present application;
fig. 4 is a schematic flowchart of another video data display processing method according to an embodiment of the present application;
fig. 5 is a mapping relationship diagram of the sub-pixel R in the 田-shaped (field-shaped) pixel structure according to an embodiment of the present application;
fig. 6 is a mapping relationship diagram of the sub-pixel G in the 田-shaped pixel structure according to an embodiment of the present application;
fig. 7 is a mapping relationship diagram of the sub-pixel B in the 田-shaped pixel structure according to an embodiment of the present application;
fig. 8 is a mapping relationship diagram of the sub-pixel W in the 田-shaped pixel structure according to an embodiment of the present application;
fig. 9 is a complete mapping relationship diagram of the 田-shaped pixel structure according to an embodiment of the present application;
fig. 10 is a schematic flowchart of another video data display processing method according to an embodiment of the present application;
fig. 11 is a schematic diagram illustrating the time-sequential color variation of one frame of a 1920 × 1080 resolution video source according to an embodiment of the present application;
fig. 12 is a schematic structural diagram of a video data display processing apparatus according to an embodiment of the present application.
An icon: 100-micro display screen; 101-a driver chip; 102-micro display chip; 201-a memory; 202-a processor; 1201-an obtaining module; 1202-a determination module; 1203-a processing module; 1204-display module.
Detailed Description
In order to make the purpose, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it should be understood that the drawings in the present application are for illustrative and descriptive purposes only and are not used to limit the scope of protection of the present application. Further, it should be understood that the schematic drawings are not drawn to scale. The flowcharts used in this application illustrate operations implemented according to some embodiments of the present application. It should be understood that the operations of the flow diagrams may be performed out of order, and steps without logical context may be performed in reverse order or simultaneously. One skilled in the art, under the guidance of this application, may add one or more other operations to, or remove one or more operations from, the flowchart.
In addition, the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. The components of the embodiments of the present application, generally described and illustrated in the figures herein, can be arranged and designed in a wide variety of different configurations. Thus, the following detailed description of the embodiments of the present application, as presented in the figures, is not intended to limit the scope of the claimed application, but is merely representative of selected embodiments of the application. All other embodiments, which can be derived by a person skilled in the art from the embodiments of the present application without making any creative effort, shall fall within the protection scope of the present application.
It should be noted that in the embodiments of the present application, the term "comprising" is used to indicate the presence of the features stated hereinafter, but does not exclude the addition of further features.
First, before specifically explaining the technical solution provided in the present application, a brief description is given to the structure of the micro display screen related to the present application.
Referring to fig. 1, the micro display screen 100 includes: the display device comprises a driving chip 101 and a micro display chip 102, wherein the driving chip 101 and the micro display chip 102 are connected through bus communication.
The micro display screen 100 may be a Liquid Crystal on Silicon (LCoS) display, an Organic Light-Emitting Diode (OLED) display, a Micro Light-Emitting Diode (Micro LED) display, or a Quantum Dot Light-Emitting Diode (Q-LED) display. It may be a display device installed on the lens of AR, VR or MR glasses, and is particularly suitable for micro display AR/VR/MR applications, where the requirement for gray scale depth is high.
As shown in fig. 1, the driver chip 101 is configured to obtain and process the video source data to be played, and the driver chip 101 may support, but is not limited to, the RGB888, MIPI, HDMI and VGA interfaces.
Optionally, the driver chip 101 and the micro display chip 102 support, but are not limited to, video sources with 1920 × 1080 and 1280 × 720 resolutions, and support, but are not limited to, video sources with a 10-bit gray scale depth.
In this embodiment, the driver chip 101 and the micro display chip 102 exchange information through a bus. The video data display processing method provided by the present application may be integrated in the driver chip 101: the driver chip 101 executes the method to obtain processed video data and sends the processed video data to the micro display chip 102, and the micro display chip 102 displays it. Alternatively, the method may be integrated in the micro display chip 102, in which case the micro display chip 102 executes it. That is, executable machine readable instructions may be stored on either the driver chip 101 or the micro display chip 102, and when the micro display screen 100 runs, the driver chip 101 or the micro display chip 102 executes the machine readable instructions to perform the video data display processing method provided in the following embodiments, which improves the flexibility with which the method can be deployed.
It will be appreciated that the configuration shown in FIG. 1 is merely illustrative and that micro-display 100 may also include more or fewer components than shown in FIG. 1 or have a different configuration than shown in FIG. 1. The components shown in fig. 1 may be implemented in hardware, software, or a combination thereof.
The structure of the driving chip 101 or the micro display chip 102 in fig. 1 for performing the video data display processing method provided in the present application is briefly explained as follows.
In this embodiment, the driving chip 101 and the micro display chip 102 may have the same structure, and the driving chip 101 is taken as an example for description.
As shown in fig. 2, the driving chip 101 includes a memory 201 and a processor 202. Wherein, the memory 201 and the processor 202 are electrically connected to each other directly or indirectly to realize data transmission or interaction. For example, the components may be electrically connected to each other via one or more communication buses or signal lines.
The memory 201 stores software function modules in the form of software or firmware, and the processor 202 executes various function applications and data processing by running the software programs and modules stored in the memory 201, that is, implements the video data display processing method in the embodiments of the present application.
The Memory 201 may be, but is not limited to, a Random Access Memory (RAM), a Read Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), and the like. The memory 201 is used for storing a program, and the processor 202 executes the program after receiving an execution instruction.
The processor 202 may be an integrated circuit chip having signal processing capabilities. The Processor 202 may be a general-purpose Processor including a Central Processing Unit (CPU), a Network Processor (NP), and the like.
The following will explain the implementation principle of the steps of the video data display processing method provided by the present application and the beneficial effects produced by the steps through a plurality of specific embodiments.
Fig. 3 is a schematic flowchart of a video data display processing method according to an embodiment of the present application; alternatively, the main body of the method may be the driving chip 101 or the micro display chip 102 shown in fig. 1.
It should be understood that in other embodiments, the order of some steps in the video data display processing method may be interchanged according to actual needs, or some steps may be omitted or deleted. As shown in fig. 3, the method includes:
s301, video source data to be displayed are obtained.
And the gray scale depth of the video source data to be displayed is greater than or equal to the gray scale depth of the original video source supported by the micro display screen.
It should be understood that a point on a display screen that is visible to the naked eye, i.e. a pixel, is composed of the three primary colors red, green and blue (RGB), and the light source behind it can present different brightness levels for each primary color. The gray scale is the set of gradation levels of different brightness from the darkest to the brightest; that is, the levels through which each primary color passes as it darkens continuously from the pure color (for example, pure red) to black form the gray scale of that color and are represented by a number. This is the most common principle of color storage, and the more intermediate levels there are, the finer the displayed picture can be. For example, an 8-bit video source may also be called a 2^8 = 256 gray level video source, and a 10-bit video source may also be called a 2^10 = 1024 gray level video source.
For example, if the gray scale depth of each color component of the original video source supported by the micro display screen is d and the gray scale depth of each color component of the video source to be displayed is m, the error between the two gray scale depths is e = m - d, where d, m and e are positive integers and d is less than or equal to m. For example, if the gray scale depth of the original video source supported and displayed by the micro display screen is 8 bits and the gray scale depth of the video source to be displayed is 10 bits, the error between the two gray scale depths is 2 bits.
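As a quick numeric check of the relation above (a minimal sketch; the values are simply the 8-bit/10-bit example just given):

```python
# Error between the gray scale depth m of the video source to be displayed
# and the gray scale depth d supported by the micro display screen.
d, m = 8, 10
e = m - d                    # 2-bit error between the two gray scale depths
levels_display = 2 ** d      # 256 gray levels supported by the screen
levels_source = 2 ** m       # 1024 gray levels in the 10-bit video source
print(e, levels_display, levels_source)   # 2 256 1024
```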
Generally, when the gray scale depth of the original video source supported by the micro display screen is inconsistent with that of the video source to be displayed, a micro display screen with a low gray scale depth cannot display a video source with a high gray scale depth. In this embodiment, it is proposed that the high-gray-scale-depth video source to be displayed may be subjected to gray scale mapping processing to obtain video data supported by the low-gray-scale-depth micro display screen. The details will be described in the following steps.
Optionally, if the interface format of the acquired video source data to be displayed is not supported by the micro display screen, interface format conversion needs to be performed on the video source data to convert it into a video format supported by the micro display screen, for example converting HDMI data into the RGB format.
S302, determining a target gray scale precision regulation and control strategy according to the display type of the micro display screen.
The display type of the micro display screen is the display mode the micro display screen uses for the video source data to be displayed. Illustratively, the display types of the micro display screen include: spatial color and time-sequential color.
It should be understood that, the display types of the micro display screens are different, and the gray scale precision regulation and control strategies adopted by the micro display screens are also different. Therefore, in this embodiment, the gray scale precision adjustment and control strategy corresponding to the video source data to be displayed can be further determined by combining the display type of the micro display screen.
And S303, performing gray scale error mapping processing on the video source data to be displayed by using a target gray scale precision regulation and control strategy to obtain the processed video source data.
And S304, displaying the processed video source data on a micro display screen.
And the gray scale depth of the processed video source data is the same as the gray scale depth supported by the micro display screen.
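The overall flow S301-S304 can be sketched as follows. This is only an illustrative outline: the two strategy functions are placeholders for the spatial-color and time-sequential-color gray scale error mapping described later, and the frame and display callables are assumptions rather than interfaces defined by the application.

```python
def spatial_color_error_mapping(frame):
    # Placeholder: gray scale error mapping between corresponding color
    # sub-pixels on adjacent resolution pixels (detailed in Figs. 5-9).
    return frame

def sequential_color_error_mapping(frame):
    # Placeholder: gray scale error mapping between adjacent physical pixels.
    return frame

def process_and_display(frames, display_type, show):
    # S302: determine the target gray scale precision regulation strategy
    # according to the display type of the micro display screen.
    strategy = (spatial_color_error_mapping if display_type == "spatial"
                else sequential_color_error_mapping)
    for frame in frames:              # S301: video source data to be displayed
        processed = strategy(frame)   # S303: gray scale error mapping processing
        show(processed)               # S304: display on the micro display screen

# Tiny usage example with a 1x1 "frame" of RGB components and print as the display.
process_and_display([[[1023, 512, 7]]], "spatial", print)
```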
In this embodiment, if the display type of the micro display screen is spatial color, the gray scale precision regulation and control strategy corresponding to spatial color is used to perform gray scale error mapping processing on the video source data to be displayed; that is, the detail error between the gray scale depth of the video source to be displayed and the gray scale depth supportable by the micro display screen is reasonably mapped or error-transferred to make up for the lost detail, and the high-gray-scale-depth video source data is mapped onto the low-gray-scale-depth micro display screen for display. Therefore, the video source with a high gray scale depth can be displayed on the micro display screen with a low gray scale depth without additionally increasing the power consumption or size of the micro display screen, giving a visual perception of a higher gray scale depth and a display effect better than what the low-gray-scale-depth micro display screen achieves on its own. This is particularly suitable for application scenarios with strict requirements on the power consumption and size of the micro display screen as well as on the displayed gray scale depth, and solves the problem in the prior art that a micro display screen with a low gray scale depth cannot display a video source with a high gray scale depth.
In summary, in the present disclosure, when a micro display screen supporting only a low gray scale depth needs to display a video source with a high gray scale depth, the gray scale precision regulation and control strategy corresponding to the display type of the micro display screen may be used to perform gray scale error mapping processing on the high-gray-scale-depth video source data; that is, the detail error between the gray scale depth of the high-gray-scale-depth video source and the low gray scale depth supportable by the micro display screen is reasonably mapped or error-transferred to make up for the lost detail, yielding video data consistent with the low gray scale depth supported by the micro display screen, so that the high-gray-scale-depth video source data can be mapped onto the low-gray-scale-depth micro display screen for display. The video source with a high gray scale depth can therefore be displayed without additionally increasing the power consumption or size of the micro display screen, obtaining a higher perceived gray scale depth and a display effect superior to that of the low-gray-scale-depth micro display screen itself. This is particularly suitable for application scenarios with strict requirements on power consumption, size and displayed gray scale depth, and solves the problem that a low-gray-scale-depth micro display screen in the prior art cannot display a high-gray-scale-depth video source.
How to determine the gray scale precision control strategy will be explained in detail by the following embodiments.
First, the case where the display type of the micro display screen is spatial color is described as follows:
Optionally, the step S302 of determining the target gray scale precision regulation and control strategy according to the display type of the micro display screen includes:
In this embodiment, if the display type of the micro display screen is spatial color, the target gray scale precision regulation and control strategy may be determined as a gray scale error mapping strategy between corresponding color sub-pixels on adjacent resolution pixels.
The following embodiment specifically explains how to use the target grayscale accuracy control strategy in step S303 to perform grayscale error mapping on video source data to be displayed, so as to obtain processed video source data.
Optionally, referring to fig. 4, the video source data to be displayed is traversed, and for a traversed current pixel, the following steps are respectively performed:
s401, obtaining preset configuration parameters and a pixel value of a current pixel.
Wherein, the configuration parameters comprise: the pixel structure, the number of error mapping pixels, the position of each error mapping pixel, and the threshold.
S402, acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel.
Illustratively, the spatial color-supported pixel structure includes: a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure.
The threshold Q is the critical value used to determine whether two pixels are edge pixels.
For example, when performing gray scale error mapping, the selected pixels are pixels of adjacent resolution, the selected pixel structure is the 田-shaped pixel structure, the current pixel is P0(x, y), the number of error mapping pixels is 3, and the positions of the error mapping pixels are the corresponding color components of the three pixels to the right, below and below-right, i.e. P1(x, y+1), P2(x+1, y) and P3(x+1, y+1). The pixel values of these three error mapping pixels can then be read separately. For example, the pixel value of the current pixel P0(x, y) is denoted V0, and the pixel value of the error mapping pixel Pt is denoted Vt, where t = 1, 2, 3.
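A minimal sketch of step S402 for this configuration is shown below; the 2-D list layout and (row, column) indexing are assumptions made for illustration only.

```python
def error_mapping_values(plane, x, y):
    """plane: 2-D list holding one color component; (x, y) is the current pixel P0."""
    p1 = plane[x][y + 1]        # right pixel P1(x, y+1)
    p2 = plane[x + 1][y]        # lower pixel P2(x+1, y)
    p3 = plane[x + 1][y + 1]    # lower-right pixel P3(x+1, y+1)
    return p1, p2, p3

plane = [[4, 8, 15], [16, 23, 42], [108, 3, 7]]
print(error_mapping_values(plane, 0, 0))    # (8, 16, 23)
```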
S403, determining whether the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapping pixel is greater than the threshold.
S404, if not, determining the weight value of each error mapping pixel according to the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapping pixel.
Alternatively, the weight value of each error mapping pixel may also be a pre-configured value, for example weight values of 3/8, 3/8 and 2/8 for P1(x, y+1), P2(x+1, y) and P3(x+1, y+1), respectively. In this embodiment, in order to improve the accuracy of the gray scale error mapping processing, an automatic weight-ratio updating method is adopted.
S405, determining a display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
In this embodiment, to determine whether there is an edge between the current pixel and its error mapping pixels, the pixel value of the current pixel is compared with the pixel value of each error mapping pixel. If the absolute value of the difference between the pixel value of the current pixel and the pixel value of an error mapping pixel is less than or equal to the threshold Q, i.e. Dt = |V0 - Vt| ≤ Q, it can be determined that there is a continuous gray scale transition between the current pixel and each of these pixels, and the positions and number of the error mapping pixels remain unchanged. The weight value of each error mapping pixel can therefore be determined according to the absolute value of the difference between its pixel value and that of the current pixel.
Specifically, the absolute differences between the pixel value of the current pixel and the pixel values of the error mapping pixels are summed, i.e. D1 + D2 + D3; the differences are then sorted from largest to smallest (assuming, for example, that D1 is greater than D2, which is greater than D3), and the weight value of each error mapping pixel is determined from its difference relative to this sum.
Therefore, the display pixel value of the current pixel can be calculated by combining the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel. The display pixel value of the current pixel is the pixel value finally displayed for that pixel on the micro display screen.
It should be noted that after traversing and processing each pixel point in the video source data to be displayed to obtain the display pixel value of each pixel point, the display pixel value of each pixel point may be cached according to a certain pixel structure to obtain the processed video source data.
In another case, if the absolute value of the difference between the pixel value of the current pixel and the pixel value of one of the error mapping pixels is greater than the threshold Q, i.e. Dt = |V0 - Vt| > Q, it is determined that an edge exists between the current pixel and that error mapping pixel, and the positions and number of the error mapping pixels need to be updated to exclude the error mapping pixel lying across the edge. Assume that the error mapping pixel P1(x, y+1) on the right is excluded from the positions: the absolute differences of the remaining error mapping pixels are then summed and sorted from largest to smallest. Assuming D3 is greater than D2, the weight value of the lower-right error mapping pixel P3(x+1, y+1) and the weight value of the lower error mapping pixel P2(x+1, y) are determined from their differences relative to this sum.
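The per-pixel procedure of steps S401-S405 can be sketched as below. This sketch uses the pre-configured weights 3/8, 3/8 and 2/8 mentioned above rather than the automatic weight update, and it assumes that the display value is a weighted combination of the current pixel and the retained error mapping pixels; the exact combination and the adaptive weight expressions follow the equations of the original embodiment, so both should be read as assumptions here.

```python
def display_value_for_pixel(frame, x, y, threshold_q, weights=(3/8, 3/8, 2/8)):
    # frame: 2-D list of values for one color component; (x, y) is the current pixel P0.
    p0 = frame[x][y]
    # S402: pixel values of the error mapping pixels (right, below, below-right).
    neighbours = (frame[x][y + 1], frame[x + 1][y], frame[x + 1][y + 1])
    # S403: absolute differences compared against the threshold Q (edge detection).
    diffs = [abs(p0 - p) for p in neighbours]
    kept = [(p, w) for p, d, w in zip(neighbours, diffs, weights) if d <= threshold_q]
    # S404 (adaptive variant, not implemented here): re-derive the weights of the
    # kept pixels from their sorted absolute differences instead of fixed values.
    # S405: assumed combination - weighted average of P0 and the kept pixels.
    total_weight = 1 + sum(w for _, w in kept)
    return (p0 + sum(p * w for p, w in kept)) / total_weight

frame = [[100, 104], [96, 180]]
print(display_value_for_pixel(frame, 0, 0, threshold_q=16))   # 180 lies across an edge
```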
Optionally, the configuration parameters include: the pixel structure is the 田-shaped (field-shaped) pixel structure, the number of error mapping pixels is 3, and the positions of the error mapping pixels are right, below and below-right;
in the step S402, obtaining the pixel value of each error mapping pixel according to the pixel structure, the number of error mapping pixels, and the position of each error mapping pixel includes:
acquiring a corresponding color component of a right pixel of the current pixel, a corresponding color component of a lower pixel of the current pixel, and a corresponding color component of a lower right pixel of the current pixel, wherein the color components include: a red component, a green component, a blue component, and a W component.
In the present embodiment, the description takes as an example a video source to be displayed with a 1920 × 1080 resolution and a 10-bit gray scale depth, denoted IN10, using the 田-shaped pixel structure and displayed on a spatial-color micro display screen with an 8-bit gray scale depth, whose output is denoted OUT8.
In the 田-shaped pixel structure, each resolution pixel of the spatial color screen comprises four sub-pixels: a red component R, a green component G, a blue component B and a W component. The W sub-pixel is calculated from the three sub-pixels R, G and B, i.e. W = n1·R + n2·G + n3·B, where n1 + n2 + n3 = 1.
In the present embodiment, the corresponding red color component, green component, blue component, and W component of the pixel on the right of the current pixel, the corresponding red color component, green component, blue component, and W component of the pixel below the current pixel, and the corresponding red color component, green component, blue component, and W component of the pixel below and to the right of the current pixel may be acquired.
The mapping error ΔR is the lower 2 bits of the red component of IN10, the mapping error ΔG is the lower 2 bits of the green component of IN10, the mapping error ΔB is the lower 2 bits of the blue component of IN10, and the mapping error ΔW is the lower 2 bits of the W component of IN10.
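The quantities just defined can be illustrated with a short sketch: the W component computed from R, G and B, and the 2-bit mapping errors taken from the low bits of each 10-bit component. The coefficient values 0.3/0.4/0.3 are purely illustrative assumptions; the text only requires that n1 + n2 + n3 = 1.

```python
def w_component(r, g, b, n1=0.3, n2=0.4, n3=0.3):
    # W = n1*R + n2*G + n3*B with n1 + n2 + n3 = 1 (coefficients assumed here).
    return int(n1 * r + n2 * g + n3 * b)

def mapping_error(value_10bit):
    # The mapping error is the lower 2 bits of a 10-bit component of IN10.
    return value_10bit & 0x3

r, g, b = 1023, 512, 7          # example 10-bit components of IN10
w = w_component(r, g, b)
print(mapping_error(r), mapping_error(g), mapping_error(b), mapping_error(w))
```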
Referring to fig. 5, which is a mapping relationship diagram of the sub-pixel R in the 田-shaped pixel structure, Rij denotes a red-component sub-pixel in a frame of the picture, where i is the row number (ranging from 0 to 1079) and j is the column number (ranging from 0 to 1919).
In fig. 5, a 田-shaped block of pixels with 3 rows and 4 columns is taken as an example. The error mapping pixels of R00 are the three pixels R01, R10 and R11: 3ΔR00/8 is applied at R01, 3ΔR00/8 at R10, and 2ΔR00/8 at R11 (see mapping circle No. 1 in fig. 5).
The error mapping pixels of R01 are R02, R11 and R12: 3ΔR01/8 is applied at R02, 3ΔR01/8 at R11, and 2ΔR01/8 at R12 (see mapping circle No. 2 in fig. 5).
The error mapping pixels of R02 are R03, R12 and R13: 3ΔR02/8 is applied at R03, 3ΔR02/8 at R12, and 2ΔR02/8 at R13.
The error mapping pixel of R03 is R13: 3ΔR03/8 is applied at R13 (see mapping circle No. 3 in fig. 5); note that the last column of pixels is handled differently from the other columns.
The error mapping pixels of R10 are R11, R20 and R21: 3ΔR10/8 is applied at R11, 3ΔR10/8 at R20, and 2ΔR10/8 at R21 (see mapping circle No. 4 in fig. 5).
The error mapping pixels of R11 are R12, R21 and R22: 3ΔR11/8 is applied at R12, 3ΔR11/8 at R21, and 2ΔR11/8 at R22 (see mapping circle No. 5 in fig. 5).
The error mapping pixels of R12 are R13, R22 and R23: 3ΔR12/8 is applied at R13, 3ΔR12/8 at R22, and 2ΔR12/8 at R23.
The error mapping pixel of R13 is R23: 3ΔR13/8 is applied at R23 (see mapping circle No. 6 in fig. 5); note that the last column of pixels is handled differently from the other columns.
The error mapping pixel of R20 is R21: 3ΔR20/8 is applied at R21 (see mapping circle No. 4 in fig. 5).
The error mapping pixel of R21 is R22: 3ΔR21/8 is applied at R22 (see mapping circle No. 5 in fig. 5).
The error mapping pixel of R22 is R23: 3ΔR22/8 is applied at R23; R23 has no error mapping pixel (see mapping circle No. 6 in fig. 5); note that the last row of pixels is handled differently from the other rows.
Referring to fig. 6, which is a mapping relationship diagram of the sub-pixel G in the 田-shaped pixel structure: the error mapping pixels of G00 are G01, G10 and G11; 3ΔG00/8 is applied at G01, 3ΔG00/8 at G10, and 2ΔG00/8 at G11 (see mapping circle No. 1 in fig. 6).
The error mapping pixels of G01 are G02, G11 and G12: 3ΔG01/8 is applied at G02, 3ΔG01/8 at G11, and 2ΔG01/8 at G12 (see mapping circle No. 2 in fig. 6).
The error mapping pixels of G02 are G03, G12 and G13: 3ΔG02/8 is applied at G03, 3ΔG02/8 at G12, and 2ΔG02/8 at G13.
The error mapping pixel of G03 is G13: 3ΔG03/8 is applied at G13 (see mapping circle No. 3 in fig. 6).
The error mapping pixels of G10 are G11, G20 and G21: 3ΔG10/8 is applied at G11, 3ΔG10/8 at G20, and 2ΔG10/8 at G21 (see mapping circle No. 4 in fig. 6).
The error mapping pixels of G11 are G12, G21 and G22: 3ΔG11/8 is applied at G12, 3ΔG11/8 at G21, and 2ΔG11/8 at G22 (see mapping circle No. 5 in fig. 6).
The error mapping pixels of G12 are G13, G22 and G23: 3ΔG12/8 is applied at G13, 3ΔG12/8 at G22, and 2ΔG12/8 at G23. The error mapping pixel of G13 is G23: 3ΔG13/8 is applied at G23 (see mapping circle No. 6 in fig. 6).
The error mapping pixel of G20 is G21: 3ΔG20/8 is applied at G21 (see mapping circle No. 4 in fig. 6).
The error mapping pixel of G21 is G22: 3ΔG21/8 is applied at G22 (see mapping circle No. 5 in fig. 6).
The error mapping pixel of G22 is G23: 3ΔG22/8 is applied at G23; G23 has no error mapping pixel (see mapping circle No. 6 in fig. 6).
Fig. 7 is a mapping relationship diagram of the sub-pixel B in the 田-shaped pixel structure.
The error mapping pixels of B00 are B01, B10 and B11: 3ΔB00/8 is applied at B01, 3ΔB00/8 at B10, and 2ΔB00/8 at B11 (see mapping circle No. 1 in fig. 7).
The error mapping pixels of B01 are B02, B11 and B12: 3ΔB01/8 is applied at B02, 3ΔB01/8 at B11, and 2ΔB01/8 at B12 (see mapping circle No. 2 in fig. 7).
The error mapping pixels of B02 are B03, B12 and B13: 3ΔB02/8 is applied at B03, 3ΔB02/8 at B12, and 2ΔB02/8 at B13. The error mapping pixel of B03 is B13: 3ΔB03/8 is applied at B13 (see mapping circle No. 3 in fig. 7).
The error mapping pixels of B10 are B11, B20 and B21: 3ΔB10/8 is applied at B11, 3ΔB10/8 at B20, and 2ΔB10/8 at B21 (see mapping circle No. 4 in fig. 7).
The error mapping pixels of B11 are B12, B21 and B22: 3ΔB11/8 is applied at B12, 3ΔB11/8 at B21, and 2ΔB11/8 at B22 (see mapping circle No. 5 in fig. 7).
The error mapping pixels of B12 are B13, B22 and B23: 3ΔB12/8 is applied at B13, 3ΔB12/8 at B22, and 2ΔB12/8 at B23. The error mapping pixel of B13 is B23: 3ΔB13/8 is applied at B23 (see mapping circle No. 6 in fig. 7).
The error mapping pixel of B20 is B21: 3ΔB20/8 is applied at B21 (see mapping circle No. 4 in fig. 7).
The error mapping pixel of B21 is B22: 3ΔB21/8 is applied at B22 (see mapping circle No. 5 in fig. 7).
The error mapping pixel of B22 is B23: 3ΔB22/8 is applied at B23; B23 has no error mapping pixel (see mapping circle No. 6 in fig. 7).
Fig. 8 is a mapping relationship diagram of the sub-pixel W in the 田-shaped pixel structure.
The error mapping pixels of W00 are W01, W10 and W11: 3ΔW00/8 is applied at W01, 3ΔW00/8 at W10, and 2ΔW00/8 at W11 (see mapping circle No. 1 in fig. 8).
The error mapping pixels of W01 are W02, W11 and W12: 3ΔW01/8 is applied at W02, 3ΔW01/8 at W11, and 2ΔW01/8 at W12 (see mapping circle No. 2 in fig. 8).
The error mapping pixels of W02 are W03, W12 and W13: 3ΔW02/8 is applied at W03, 3ΔW02/8 at W12, and 2ΔW02/8 at W13. The error mapping pixel of W03 is W13: 3ΔW03/8 is applied at W13 (see mapping circle No. 3 in fig. 8).
The error mapping pixels of W10 are W11, W20 and W21: 3ΔW10/8 is applied at W11, 3ΔW10/8 at W20, and 2ΔW10/8 at W21 (see mapping circle No. 4 in fig. 8).
The error mapping pixels of W11 are W12, W21 and W22: 3ΔW11/8 is applied at W12, 3ΔW11/8 at W21, and 2ΔW11/8 at W22 (see mapping circle No. 5 in fig. 8).
The error mapping pixels of W12 are W13, W22 and W23: 3ΔW12/8 is applied at W13, 3ΔW12/8 at W22, and 2ΔW12/8 at W23. The error mapping pixel of W13 is W23: 3ΔW13/8 is applied at W23 (see mapping circle No. 6 in fig. 8).
The error mapping pixel of W20 is W21: 3ΔW20/8 is applied at W21 (see mapping circle No. 4 in fig. 8).
The error mapping pixel of W21 is W22: 3ΔW21/8 is applied at W22 (see mapping circle No. 5 in fig. 8).
The error mapping pixel of W22 is W23: 3ΔW22/8 is applied at W23; W23 has no error mapping pixel (see mapping circle No. 6 in fig. 8).
referring to fig. 9, a complete relationship diagram for mapping the field-word line pixel structure is shown.
Wherein, in RGBW order, P00 is displayed as [R00, G00, B00, W00];
P01 is displayed as [R01 + 3ΔR00/8, G01 + 3ΔG00/8, B01 + 3ΔB00/8, W01 + 3ΔW00/8];
P02 is displayed as [R02 + 3ΔR01/8, G02 + 3ΔG01/8, B02 + 3ΔB01/8, W02 + 3ΔW01/8];
P03 is displayed as [R03 + 3ΔR02/8, G03 + 3ΔG02/8, B03 + 3ΔB02/8, W03 + 3ΔW02/8];
P10 is displayed as [R10 + 3ΔR00/8, G10 + 3ΔG00/8, B10 + 3ΔB00/8, W10 + 3ΔW00/8];
P11 is displayed as [R11 + 2ΔR00/8 + 3ΔR01/8 + 3ΔR10/8, G11 + 2ΔG00/8 + 3ΔG01/8 + 3ΔG10/8, B11 + 2ΔB00/8 + 3ΔB01/8 + 3ΔB10/8, W11 + 2ΔW00/8 + 3ΔW01/8 + 3ΔW10/8];
P12 is displayed as [R12 + 2ΔR01/8 + 3ΔR02/8 + 3ΔR11/8, G12 + 2ΔG01/8 + 3ΔG02/8 + 3ΔG11/8, B12 + 2ΔB01/8 + 3ΔB02/8 + 3ΔB11/8, W12 + 2ΔW01/8 + 3ΔW02/8 + 3ΔW11/8];
P13 is displayed as [R13 + 2ΔR02/8 + 3ΔR03/8 + 3ΔR12/8, G13 + 2ΔG02/8 + 3ΔG03/8 + 3ΔG12/8, B13 + 2ΔB02/8 + 3ΔB03/8 + 3ΔB12/8, W13 + 2ΔW02/8 + 3ΔW03/8 + 3ΔW12/8];
P20 is displayed as [R20 + 3ΔR10/8, G20 + 3ΔG10/8, B20 + 3ΔB10/8, W20 + 3ΔW10/8];
P21 is displayed as [R21 + 2ΔR10/8 + 3ΔR11/8 + 3ΔR20/8, G21 + 2ΔG10/8 + 3ΔG11/8 + 3ΔG20/8, B21 + 2ΔB10/8 + 3ΔB11/8 + 3ΔB20/8, W21 + 2ΔW10/8 + 3ΔW11/8 + 3ΔW20/8];
P22 is displayed as [R22 + 2ΔR11/8 + 3ΔR12/8 + 3ΔR21/8, G22 + 2ΔG11/8 + 3ΔG12/8 + 3ΔG21/8, B22 + 2ΔB11/8 + 3ΔB12/8 + 3ΔB21/8, W22 + 2ΔW11/8 + 3ΔW12/8 + 3ΔW21/8];
P23 is displayed as [R23 + 2ΔR12/8 + 3ΔR13/8 + 3ΔR22/8, G23 + 2ΔG12/8 + 3ΔG13/8 + 3ΔG22/8, B23 + 2ΔB12/8 + 3ΔB13/8 + 3ΔB22/8, W23 + 2ΔW12/8 + 3ΔW13/8 + 3ΔW22/8].
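The complete mapping above lends itself to a compact implementation. The sketch below is a minimal illustration, not the patent's literal implementation: it assumes Δ is the quantization residual left when a 10-bit source value is truncated to the 8-bit display depth, and that mapped errors accumulate as the traversal moves forward; it applies the 3/8 (right), 3/8 (below), 2/8 (lower-right) weights independently to one sub-pixel channel of the field-shaped structure.

```python
import numpy as np

# (d_row, d_col, weight): right, below, lower-right neighbors of the current sub-pixel.
WEIGHTS = ((0, 1, 3 / 8), (1, 0, 3 / 8), (1, 1, 2 / 8))

def map_channel(channel, src_bits=10, dst_bits=8):
    """Return the displayed values of one sub-pixel channel (R, G, B or W)."""
    step = 1 << (src_bits - dst_bits)
    acc = channel.astype(np.float64)        # source values plus errors mapped in so far
    out = np.zeros_like(acc)
    rows, cols = acc.shape
    for r in range(rows):
        for c in range(cols):
            display = min(np.floor(acc[r, c] / step), (1 << dst_bits) - 1)
            delta = acc[r, c] - display * step      # gray scale error of this sub-pixel
            out[r, c] = display
            for dr, dc, w in WEIGHTS:               # map 3/8, 3/8, 2/8 of the error forward
                if r + dr < rows and c + dc < cols:
                    acc[r + dr, c + dc] += w * delta
    return out

# Example: one 3x4 channel matching the P00..P23 layout above (10-bit values).
demo = np.array([[515, 601, 702, 803], [300, 401, 502, 603], [700, 801, 902, 1003]], float)
print(map_channel(demo))
```

Running the same function on the R, G, B and W channels of a frame reproduces relationships of the form listed for P00 through P23, for example P11 accumulating 2Δ/8 from its diagonal neighbor and 3Δ/8 from each of the pixels above it and to its left.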
Secondly, the case in which the display type of the micro display screen is time sequence color is described as follows:
Optionally, step S302 of determining a target gray scale precision regulation and control manner according to the display type of the micro display screen includes:
In this embodiment, if the display type of the micro display screen is time sequence color, the target gray scale precision regulation and control strategy can be determined as the gray scale error mapping strategy of adjacent physical pixels.
The following embodiment specifically explains how step S303 uses the target gray scale precision regulation and control strategy to perform gray scale error mapping processing on the video source data to be displayed, so as to obtain the processed video source data.
Alternatively, referring to fig. 10, the step S303 includes:
traversing video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
S1001, acquiring preset configuration parameters and a pixel value of the current pixel.
Wherein, the configuration parameters include: the number of error mapped pixels, the position of each error mapped pixel, and the weight value of each error mapped pixel. Illustratively, the number of error mapped pixels is 3, the positions of the error mapped pixels are right, below and lower right of the current pixel, and the weight values of the right, below and lower-right error mapped pixels may be 3/8, 3/8 and 2/8, respectively; that is, the sum of the weight values of the error mapped pixels is 1.
In another implementation manner, the weight value of each error mapped pixel may also be updated automatically, that is, the weight value of each error mapped pixel is determined according to the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapped pixel; the specific calculation process is as in step S404 of the above embodiment and is not repeated here.
S1002, acquiring a pixel value of each error mapping pixel according to the number of the error mapping pixels and the position of each error mapping pixel.
And S1003, determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
In the present embodiment, for example, the current pixel is P0(x, y) and the error mapped pixels are the 3 pixels to its right, below it and to its lower right, i.e. P1(x, y+1), P2(x+1, y) and P3(x+1, y+1). That is, the pixel values of the 3 error mapped pixels can be obtained according to the positions of the error mapped pixels, and then the display pixel value of the current pixel can be determined according to the pixel value of the current pixel, the pixel value of each error mapped pixel and the weight value of each error mapped pixel.
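As a concrete illustration of steps S1001 to S1003, the following sketch processes one traversed pixel with the preset configuration (3 error mapped pixels at the right, below and lower-right positions, weights 3/8, 3/8 and 2/8). It assumes the gray values are held in a working buffer that already contains any error mapped in from earlier pixels, and that Δ is the residual lost when truncating to the display depth; the names CONFIG and process_pixel are illustrative only.

```python
# Preset configuration parameters (S1001). The container layout is hypothetical.
CONFIG = {
    "offsets": [(0, 1), (1, 0), (1, 1)],   # right P1(x, y+1), below P2(x+1, y), lower-right P3(x+1, y+1)
    "weights": [3 / 8, 3 / 8, 2 / 8],      # the weight values sum to 1
}

def process_pixel(buf, x, y, shift=2):
    """One traversal step for P0(x, y): display value plus forward error mapping.

    buf is a mutable 2-D working copy of the frame (floats); shift is the
    difference between the source and display gray scale depths (10 - 8 = 2).
    """
    display = int(buf[x][y]) >> shift            # value actually shown at the display depth
    delta = buf[x][y] - (display << shift)       # gray scale error of the current pixel
    rows, cols = len(buf), len(buf[0])
    # S1002/S1003: locate each error mapped pixel and add the weighted error onto it.
    for (dx, dy), w in zip(CONFIG["offsets"], CONFIG["weights"]):
        if x + dx < rows and y + dy < cols:
            buf[x + dx][y + dy] += w * delta
    return display
```

Traversing the frame in row-major order and collecting the returned display values line by line yields processed data of the form walked through in the FIG. 11 example below.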
For example, a video source with 1920 × 1080 resolution and 10-bit gray scale depth is displayed on a time sequence color display screen with 8-bit gray scale depth.
Referring to FIG. 11, 9 rows and 16 columns are illustrated. The error mapped pixels of P00 are P01, P10 and P11: 3ΔP00/8 is embodied on P01, 3ΔP00/8 on P10, and 2ΔP00/8 on P11;
the error mapped pixels of P01 are P02, P11 and P12: 3ΔP01/8 is embodied on P02, 3ΔP01/8 on P11, and 2ΔP01/8 on P12;
the error mapped pixels of P02 are P03, P12 and P13: 3ΔP02/8 is embodied on P03, 3ΔP02/8 on P12, and 2ΔP02/8 on P13; by analogy, the error mapped pixel of P0,15 is P1,15, and 3ΔP0,15/8 is embodied on P1,15;
the error mapped pixels of P10 are P11, P20 and P21: 3ΔP10/8 is embodied on P11, 3ΔP10/8 on P20, and 2ΔP10/8 on P21;
the error mapped pixels of P11 are P12, P21 and P22: 3ΔP11/8 is embodied on P12, 3ΔP11/8 on P21, and 2ΔP11/8 on P22;
the error mapped pixels of P12 are P13, P22 and P23: 3ΔP12/8 is embodied on P13, 3ΔP12/8 on P22, and 2ΔP12/8 on P23; by analogy, the error mapped pixel of P1,15 is P2,15, and 3ΔP1,15/8 is embodied on P2,15; and so on;
the error mapped pixel of P80 is P81, and 3ΔP80/8 is embodied on P81; the error mapped pixel of P81 is P82, and 3ΔP81/8 is embodied on P82;
the error mapped pixel of P82 is P83, and 3ΔP82/8 is embodied on P83; P8,15 has no error mapped pixel.
Denoting the pixel value at each pixel by V, the finally displayed pixel values are: P00 is displayed as V00, P01 as V01 + 3ΔV00/8, P02 as V02 + 3ΔV01/8, P03 as V03 + 3ΔV02/8, and by analogy P0,15 as V0,15 + 3ΔV0,14/8;
P10 is displayed as V10 + 3ΔV00/8, P11 as V11 + 2ΔV00/8 + 3ΔV01/8 + 3ΔV10/8, P12 as V12 + 2ΔV01/8 + 3ΔV02/8 + 3ΔV11/8, and by analogy P1,15 as V1,15 + 2ΔV0,14/8 + 3ΔV0,15/8 + 3ΔV1,14/8, and so on;
P80 is displayed as V80 + 3ΔV70/8, P81 as V81 + 2ΔV70/8 + 3ΔV71/8 + 3ΔV80/8, P82 as V82 + 2ΔV71/8 + 3ΔV72/8 + 3ΔV81/8, and by analogy P8,15 as V8,15 + 2ΔV7,14/8 + 3ΔV7,15/8 + 3ΔV8,14/8.
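As a quick numerical illustration of the weighted accumulation above (the values are hypothetical, and Δ is assumed to be the residual of a 10-bit value below the 8-bit quantization step, a definition this example does not fix):

```python
# Hypothetical 10-bit values for the first pixels of FIG. 11.
V00, V01, V10, V11 = 515, 601, 702, 803
d = lambda v: v % 4                     # assumed residual below the 8-bit step (Δ)
P00 = V00                               # displayed as V00
P01 = V01 + 3 * d(V00) / 8              # V01 + 3ΔV00/8  -> 602.125
P11 = V11 + 2 * d(V00) / 8 + 3 * d(V01) / 8 + 3 * d(V10) / 8   # -> 804.875
print(P00, P01, P11)
```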
In this embodiment, after the display pixel value of each pixel point on one frame of video image is obtained, the display pixel value of each pixel point may be cached according to a line arrangement manner, so as to obtain processed video source data.
Optionally, step S304 of displaying the processed video source data on the micro display screen includes:
if the display type of the micro display screen is spatial color, displaying the processed video source data on the micro display screen by using a pixel structure supported by the spatial color, wherein the pixel structure supported by the spatial color comprises: a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure;
and if the display type of the micro display screen is time sequence color, displaying the processed video source data on the micro display screen by using the time division multiplexing physical pixels.
In this embodiment, if the display type of the micro display screen is spatial color, the processed video source data is output to the micro display screen for display by using an output mode supported by spatial color, such as the field-shaped pixel structure, the strip-shaped pixel structure, the delta-shaped pixel structure or the L-shaped pixel structure; if the display type of the micro display screen is time sequence color, the processed video source data is output to the micro display screen for display in a time division multiplexing physical pixel output mode. In this way, without additionally increasing the display frame rate or the clock frequency of the micro display screen, pixels at different positions and in different quantities can be multiplexed according to a certain rule, which improves the display gray scale depth of the whole micro display screen, reduces the power consumption of the micro display chip and reduces its area.
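A rough sketch of this output step follows; the driver routines are hypothetical placeholders standing in for the hand-off to the driving chip, and only the branching by display type is taken from the description above.

```python
from enum import Enum

class DisplayType(Enum):
    SPATIAL_COLOR = "spatial color"
    TIME_SEQUENCE_COLOR = "time sequence color"

# Hypothetical placeholder drivers; a real system would hand the data to the driving chip.
def drive_spatial_pixels(frame, structure):
    print(f"driving {structure} pixel structure, {len(frame)} rows")

def drive_time_multiplexed_field(frame, field):
    print(f"driving time-division multiplexed {field} field, {len(frame)} rows")

def output_frame(frame, display_type, structure="field-shaped"):
    """Route the processed video source data to the output mode of the display type."""
    if display_type is DisplayType.SPATIAL_COLOR:
        # Spatial color: all sub-pixels of the chosen pixel structure are driven together.
        drive_spatial_pixels(frame, structure)
    else:
        # Time sequence color: the same physical pixels are reused for each color field in turn.
        for field in ("R", "G", "B", "W"):
            drive_time_multiplexed_field(frame, field)

# Example call with a dummy 2-row frame.
output_frame([[0, 1], [2, 3]], DisplayType.TIME_SEQUENCE_COLOR)
```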
Based on the same inventive concept, the embodiment of the present application further provides a video data display processing apparatus corresponding to the video data display processing method, and since the principle of the apparatus in the embodiment of the present application for solving the problem is similar to that of the video data display processing method in the embodiment of the present application, the implementation of the apparatus may refer to the implementation of the method, and repeated details are omitted.
Referring to fig. 12, an embodiment of the present application further provides a video data display processing apparatus, including:
an obtaining module 1201, configured to obtain video source data to be displayed;
the determining module 1202 is configured to determine a target grayscale precision regulating and controlling strategy according to a display type of the micro display screen;
a processing module 1203, configured to perform gray scale error mapping processing on video source data to be displayed by using a target gray scale precision regulation and control strategy, so as to obtain processed video source data, where a gray scale depth of the processed video source data is the same as a gray scale depth supported by a micro display screen;
and a display module 1204, configured to display the processed video source data on a micro display screen.
Optionally, the determining module 1202 is further configured to:
and if the display type of the micro display screen is space color, determining that the target gray scale precision regulating strategy is a gray scale error mapping strategy between corresponding color sub-pixels on the pixels with the adjacent resolution ratios.
Optionally, the processing module 1203 is further configured to:
traversing video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the pixel structure, the number of error mapping pixels, the position of each error mapping pixel and a threshold;
acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel;
determining whether an absolute value of a difference between a pixel value of the current pixel and a pixel value of each error mapped pixel is greater than a threshold;
if not, determining the weight value of each error mapping pixel according to the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
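The spatial-color flow handled by the processing module can be sketched as follows. This is a minimal sketch rather than the patent's implementation: the threshold test and the exclusion of error mapping pixels lying across an edge follow the steps listed above, while the inverse-difference weighting is only an assumption standing in for the formula of step S404.

```python
def process_subpixel(buf, r, c, offsets, threshold, shift=2):
    """Sketch of the edge-aware error mapping step for one sub-pixel (spatial color).

    buf holds the working gray values of one color channel (errors already mapped in),
    offsets are the configured error mapping pixel positions (e.g. right, below,
    lower-right), threshold is the preset edge threshold, and shift is the source
    gray scale depth minus the display depth. The inverse-difference weight formula
    below is an assumption; the exact computation is given by step S404.
    """
    rows, cols = len(buf), len(buf[0])
    display = int(buf[r][c]) >> shift            # value shown at the display depth
    delta = buf[r][c] - (display << shift)       # gray scale error of the current sub-pixel
    # Drop error mapping pixels that lie across an edge (difference above the threshold).
    kept = [(r + dr, c + dc) for dr, dc in offsets
            if r + dr < rows and c + dc < cols
            and abs(buf[r][c] - buf[r + dr][c + dc]) <= threshold]
    if kept:
        inv = [1.0 / (abs(buf[r][c] - buf[nr][nc]) + 1.0) for nr, nc in kept]
        weights = [v / sum(inv) for v in inv]    # assumed: closer values receive more error
        for (nr, nc), w in zip(kept, weights):
            buf[nr][nc] += w * delta             # map the weighted error onto the neighbor
    return display
```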
Optionally, the configuration parameters include: the pixel structure is a Chinese character 'tian' shaped pixel structure, the number of error mapping pixels is 3, and the positions of the error mapping pixels are right, below and right-below;
the processing module 1203 is further configured to:
acquiring a corresponding color component of a right pixel of the current pixel, a corresponding color component of a lower pixel of the current pixel, and a corresponding color component of a lower right pixel of the current pixel, wherein the color components include: a red component, a green component, a blue component, and a W component.
Optionally, the determining module 1202 is further configured to:
and if the display type of the micro display screen is time sequence color, determining that the target gray scale precision regulating strategy is a gray scale error mapping strategy of the adjacent physical pixels.
Optionally, the processing module 1203 is further configured to:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the number of error mapping pixels, the position of each error mapping pixel and the weight value of each error mapping pixel;
acquiring a pixel value of each error mapping pixel according to the number of the error mapping pixels and the position of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
Optionally, the display module 1204 is further configured to:
if the display type of the micro display screen is spatial color, displaying the processed video source data on the micro display screen by using a pixel structure supported by the spatial color, wherein the pixel structure supported by the spatial color comprises: the pixel structure comprises a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure;
and if the display type of the micro display screen is time sequence color, displaying the processed video source data on the micro display screen by using the time division multiplexing physical pixels.
The above-mentioned apparatus is used for executing the method provided by the foregoing embodiment, and the implementation principle and technical effect are similar, which are not described herein again.
These modules may be one or more integrated circuits configured to implement the above methods, for example: one or more Application Specific Integrated Circuits (ASICs), one or more digital signal processors (DSPs), or one or more Field Programmable Gate Arrays (FPGAs), among others. As another example, when one of the above modules is implemented in the form of program code scheduled by a processing element, the processing element may be a general-purpose processor, such as a Central Processing Unit (CPU) or another processor capable of calling program code. As another example, these modules may be integrated together and implemented in the form of a system-on-a-chip (SOC).
Optionally, the present application also provides a program product, such as a computer readable storage medium, comprising a program which, when executed by a processor, carries out the above method embodiments.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one type of logical functional division, and other divisions may be realized in practice, for example, multiple units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
The integrated unit implemented in the form of a software functional unit may be stored in a computer readable storage medium. The software functional unit is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, or a network device) or a processor to perform part of the steps of the methods according to the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
Claims (8)
1. A method for video data display processing, the method comprising:
acquiring video source data to be displayed;
determining a target gray scale precision regulation strategy according to the display type of the micro display screen;
performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulation and control strategy to obtain processed video source data, wherein the gray scale depth of the processed video source data is the same as the gray scale depth supported by the micro display screen;
displaying the processed video source data on the micro display screen;
wherein the determining of the target gray scale precision regulation strategy according to the display type of the micro display screen comprises:
if the display type of the micro display screen is spatial color, determining the target gray scale precision regulation strategy as a gray scale error mapping strategy between corresponding color sub-pixels on adjacent resolution pixels;
the using the target gray scale precision regulation and control strategy to perform gray scale error mapping processing on the video source data to be displayed to obtain processed video source data includes:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the pixel structure, the number of error mapping pixels, the position of each error mapping pixel and a threshold;
acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel;
determining whether an absolute value of a difference between a pixel value of the current pixel and a pixel value of each of the error mapped pixels is greater than the threshold;
if so, determining that an edge exists between the current pixel and the error mapping pixel, and updating the position and the number of the error mapping pixel to eliminate the error mapping pixel of the original edge;
if not, determining the weight value of each error mapping pixel according to the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
2. The method of claim 1, wherein the configuration parameters comprise: the pixel structure is a Chinese character 'tian' shaped pixel structure, the number of error mapping pixels is 3, and the positions of the error mapping pixels are right, below and below right;
the obtaining the pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel includes:
acquiring a corresponding color component of a right pixel of the current pixel, a corresponding color component of a lower pixel of the current pixel, and a corresponding color component of a lower right pixel of the current pixel, wherein the color components include: a red component, a green component, a blue component, and a W component.
3. The method according to claim 1, wherein the determining of the target gray scale precision regulation strategy according to the display type of the micro display screen comprises:
and if the display type of the micro display screen is time sequence color, determining the target gray scale precision regulation strategy as a gray scale error mapping strategy of adjacent physical pixels.
4. The method according to claim 3, wherein the performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision adjustment and control strategy to obtain processed video source data comprises:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: the number of error mapping pixels, the position of each error mapping pixel and the weight value of each error mapping pixel;
acquiring a pixel value of each error mapping pixel according to the number of the error mapping pixels and the position of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
5. The method of claim 1, wherein said displaying said processed video source data onto said microdisplay comprises:
if the display type of the micro display screen is spatial color, displaying the processed video source data on the micro display screen by using the pixel structure supported by the spatial color, wherein the pixel structure supported by the spatial color comprises: the pixel structure comprises a field-shaped pixel structure, a strip-shaped pixel structure, a delta-shaped pixel structure and an L-shaped pixel structure;
and if the display type of the micro display screen is time sequence color, displaying the processed video source data on the micro display screen by using time division multiplexing physical pixels.
6. A video data display processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for acquiring video source data to be displayed;
the determining module is used for determining a target gray scale precision regulating and controlling strategy according to the display type of the micro display screen;
the processing module is used for performing gray scale error mapping processing on the video source data to be displayed by using the target gray scale precision regulating and controlling strategy to obtain processed video source data, wherein the gray scale depth of the processed video source data is the same as the gray scale depth supported by the micro display screen;
the display module is used for displaying the processed video source data on the micro display screen;
the determining module is used for determining the target gray scale precision regulating strategy as a gray scale error mapping strategy between corresponding color sub-pixels on adjacent resolution pixels if the display type of the micro display screen is spatial color;
the processing module is specifically configured to:
traversing the video source data to be displayed, and respectively executing the following steps aiming at the traversed current pixel:
acquiring preset configuration parameters and a pixel value of a current pixel, wherein the configuration parameters comprise: a pixel structure, a number of error mapped pixels, a position of each error mapped pixel, and a threshold;
acquiring a pixel value of each error mapping pixel according to the pixel structure, the number of the error mapping pixels and the position of each error mapping pixel;
determining whether an absolute value of a difference between a pixel value of the current pixel and a pixel value of each of the error mapped pixels is greater than the threshold;
if so, determining that an edge exists between the current pixel and the error mapping pixel, and updating the position and the number of the error mapping pixel to eliminate the error mapping pixel of the original edge;
if not, determining the weight value of each error mapping pixel according to the absolute value of the difference between the pixel value of the current pixel and the pixel value of each error mapping pixel;
and determining the display pixel value of the current pixel according to the pixel value of the current pixel, the pixel value of each error mapping pixel and the weight value of each error mapping pixel.
7. A micro display screen, comprising: a driving chip and a micro display chip, wherein the driving chip and the micro display chip communicate through a bus; the driving chip or the micro display chip stores executable machine readable instructions, and when the micro display screen runs, the driving chip or the micro display chip executes the machine readable instructions to perform the steps of the method according to any one of claims 1 to 5.
8. A computer-readable storage medium, in which a computer program is stored which, when executed by a driving chip or a micro display chip, carries out the steps of the method according to any one of claims 1 to 5.