CN116434722A - Picture display method and device - Google Patents

Picture display method and device

Info

Publication number
CN116434722A
CN116434722A (Application No. CN202310429517.1A)
Authority
CN
China
Prior art keywords
picture
pixels
brightness
picture frame
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310429517.1A
Other languages
Chinese (zh)
Inventor
赵醒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Eswin Computing Technology Co Ltd
Original Assignee
Beijing Eswin Computing Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Eswin Computing Technology Co Ltd filed Critical Beijing Eswin Computing Technology Co Ltd
Priority to CN202310429517.1A priority Critical patent/CN116434722A/en
Publication of CN116434722A publication Critical patent/CN116434722A/en
Pending legal-status Critical Current


Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G 5/10 - Intensity circuits
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/2007 - Display of intermediate tones
    • G09G 3/2018 - Display of intermediate tones by time modulation using two or more time intervals
    • G09G 3/2022 - Display of intermediate tones by time modulation using two or more time intervals using sub-frames
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 3/00 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G 3/20 - Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • G09G 3/2007 - Display of intermediate tones
    • G09G 3/2074 - Display of intermediate tones using sub-pixels
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 - Control of display operating conditions
    • G09G 2320/06 - Adjustment of display parameters
    • G09G 2320/0626 - Adjustment of display parameters for control of overall brightness

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides a picture display method and a device thereof. The method comprises the following steps: obtaining a histogram of picture frame i according to the brightness information of the pixels in picture frame i at time T+1, obtaining a brightness cumulative value of picture frame i based on the histogram at time T+2, and outputting the target RGB of the pixels in picture frame i+2 based on the brightness cumulative value and the brightness information of the pixels in picture frame i+2 at time T+3; obtaining a histogram of picture frame i+1 based on the brightness information of the pixels in picture frame i+1 at time T+2, obtaining a brightness cumulative value of picture frame i+1 based on the histogram at time T+3, and outputting the target RGB of the pixels in picture frame i+3 based on the brightness cumulative value and the brightness information of the pixels in picture frame i+3 at time T+4; and obtaining a histogram of picture frame i+2 at time T+3, obtaining a brightness cumulative value of picture frame i+2 based on the histogram at time T+4, and outputting the target RGB of the pixels in picture frame i+4 based on the brightness cumulative value and the brightness information of the pixels in picture frame i+4 at time T+5.

Description

Picture display method and device
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a method and an apparatus for displaying images.
Background
With the development of display technology, many display devices have appeared. Contrast is one of the factors that affect the picture quality of a display. Low picture contrast tends to result in a poor picture display effect, and how to increase picture contrast to improve image quality has become a research hotspot.
Disclosure of Invention
The embodiment of the application provides a picture display method and a device thereof.
An embodiment of a first aspect of the present application provides a method for displaying a picture, including:
acquiring a histogram of a picture frame i according to brightness information of pixels in the picture frame i at a time T+1, obtaining a brightness cumulative value of the picture frame i based on the histogram of the picture frame i at a time T+2, and outputting target RGB of the pixels in the picture frame i+2 based on the brightness cumulative value of the picture frame i and the brightness information of the pixels in the picture frame i+2 at a time T+3, wherein i and T are natural numbers meeting a set rule from 1;
acquiring a histogram of the picture frame i+1 based on the brightness information of the pixels in the picture frame i+1 at the time T+2, acquiring a brightness cumulative value of the picture frame i+1 based on the histogram of the picture frame i+1 at the time T+3, and outputting a target RGB of the pixels in the picture frame i+3 based on the brightness cumulative value of the picture frame i+1 and the brightness information of the pixels in the picture frame i+3 at the time T+4;
acquiring a histogram of the picture frame i+2 based on the brightness information of the pixels in the picture frame i+2 at the time T+3, acquiring a brightness cumulative value of the picture frame i+2 based on the histogram of the picture frame i+2 at the time T+4, and outputting target RGB of the pixels in the picture frame i+4 based on the brightness cumulative value of the picture frame i+2 and the brightness information of the pixels in the picture frame i+4 at the time T+5.
An embodiment of a second aspect of the present application provides a picture display device, including:
a first processing module, configured to obtain, at a time t+1, a histogram of a picture frame i according to luminance information of pixels in the picture frame i, obtain, at a time t+2, a luminance integrated value of the picture frame i based on the histogram of the picture frame i, and output, at a time t+3, a target RGB of pixels in the picture frame i+2 based on the luminance integrated value of the picture frame i and luminance information of pixels in the picture frame i+2, where i and T are natural numbers satisfying a set rule from 1;
a second processing module, configured to obtain, at a time T+2, a histogram of the picture frame i+1 based on luminance information of pixels in the picture frame i+1, obtain, at a time T+3, a luminance integrated value of the picture frame i+1 based on the histogram of the picture frame i+1, and output, at a time T+4, a target RGB of pixels in the picture frame i+3 based on the luminance integrated value of the picture frame i+1 and luminance information of pixels in the picture frame i+3;
a third processing module, configured to obtain, at a time T+3, a histogram of the picture frame i+2 based on luminance information of pixels in the picture frame i+2, obtain, at a time T+4, a luminance integrated value of the picture frame i+2 based on the histogram of the picture frame i+2, and output, at a time T+5, a target RGB of pixels in the picture frame i+4 based on the luminance integrated value of the picture frame i+2 and luminance information of pixels in the picture frame i+4.
An embodiment of a third aspect of the present application provides an electronic device, including the picture display device provided by the embodiment of the second aspect of the present application.
An embodiment of a fourth aspect of the present application proposes an electronic device, including: a processor; a memory for storing the processor-executable instructions; wherein the processor is configured to execute the instructions to implement the method set forth in the embodiment of the first aspect of the present application.
Embodiments of a fifth aspect of the present application provide a non-transitory computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform the method provided by the embodiments of the first aspect of the present application.
Embodiments of a sixth aspect of the present application propose a computer program product comprising a computer program which, when executed by a processor in a communication device, implements the method proposed by the embodiments of the first aspect of the present application.
The technical solutions provided by the embodiments of the present application bring at least the following beneficial effects: luminance remapping can be performed, based on the luminance cumulative value of the current picture frame, on the luminance information of the pixels in the picture frame one frame apart, to obtain the target RGB of the pixels in that picture frame. In the process of determining the target RGB, because the luminance cumulative value of the current picture frame is referenced, the luminance information of the pixels in the picture frame one frame apart is adjusted in real time, so that the local contrast of the picture can be enhanced and the display effect of continuous picture data is more consistent.
Additional aspects and advantages of the application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the application.
Drawings
The foregoing and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings, in which:
fig. 1 is a flowchart of a method for displaying a picture according to an embodiment of the present application;
fig. 2 is a flowchart of another method for displaying a picture according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating a positional relationship between a pixel and an adjacent pixel according to an embodiment of the present disclosure;
FIG. 4 is a schematic diagram of neighborhood mean calculation according to an embodiment of the present disclosure;
FIG. 5 is a flowchart illustrating another method for displaying images according to an embodiment of the present disclosure;
FIG. 6 is a flowchart illustrating another method for displaying images according to an embodiment of the present disclosure;
fig. 7 is a flowchart of another method for displaying a picture according to an embodiment of the present disclosure;
fig. 8 is a flowchart of another method for displaying a picture according to an embodiment of the present disclosure;
fig. 9 is a schematic structural diagram of a picture display device according to an embodiment of the present disclosure;
fig. 10 is a schematic structural diagram of a picture display device according to an embodiment of the present disclosure;
FIG. 11 is a schematic diagram of a method for displaying a frame according to an embodiment of the present disclosure;
fig. 12 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 13 is a schematic structural diagram of another electronic device according to an embodiment of the present application;
fig. 14 is a schematic structural diagram of a display chip according to an embodiment of the present application.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all implementations consistent with the embodiments of the present application. Rather, they are merely examples of apparatus and methods consistent with aspects of embodiments of the present application as detailed in the accompanying claims.
The terminology used in the embodiments of the application is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the application. As used in this application in the examples and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in embodiments of the present application to describe various information, these information should not be limited to these terms. These terms are only used to distinguish one type of information from another. For example, the first information may also be referred to as second information, and similarly, the second information may also be referred to as first information, without departing from the scope of embodiments of the present application. The word "if" as used herein may be interpreted as "when", "upon", or "in response to determining", depending on the context.
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the like or similar elements throughout. The embodiments described below by referring to the drawings are exemplary and intended for the purpose of explaining the present application and are not to be construed as limiting the present application.
It should be noted that, the method for displaying a picture provided in any one of the embodiments of the present application may be performed alone or in combination with possible implementation methods in other embodiments, and may also be performed in combination with any one of the technical solutions of the related art.
The following describes a screen display method and apparatus according to an embodiment of the present application with reference to the drawings.
Fig. 1 is a flowchart of a method for displaying a picture according to an embodiment of the present application. As shown in fig. 1, the method includes, but is not limited to, the steps of:
s101, acquiring a histogram of the picture frame i according to the brightness information of the pixels in the picture frame i at the time T+1, obtaining a brightness cumulative value of the picture frame i based on the histogram of the picture frame i at the time T+2, and outputting target RGB of the pixels in the picture frame i+2 based on the brightness cumulative value of the picture frame i and the brightness information of the pixels in the picture frame i+2 at the time T+3.
Where i and T are natural numbers satisfying a set rule starting from 1.
The method for displaying a picture provided in the embodiments of the present application is generally used on a display device, for example, a mobile phone screen, a computer monitor, a television, a smart screen, and so on.
It should be noted that, the method for displaying a picture provided in the embodiment of the present application may include a plurality of processing stages, where each processing stage corresponds to a different processing device or module, and each processing device or module processes different picture frames at different times, so as to implement continuous processing of continuously input picture frames.
In order to ensure that the display device can output images in real time, in the embodiment of the present application, continuous processing of continuously input picture frames is required. In some implementations, the values of i and T may be 1, 4, 7, 10, 13, ..., (3n-2), where n is a natural number greater than or equal to 1.
It will be appreciated that the luminance information of a pixel in picture frame i may be obtained at time T, wherein the luminance information of a pixel may include the luminance component of the pixel and the average luminance of the pixel. The luminance component of the pixel may characterize the actual luminance of the pixel. In some implementations, a neighboring pixel that is adjacent to the pixel around the pixel may be determined, and further, an average luminance of the pixel is determined based on the luminance component of the pixel and the luminance component of the neighboring pixel.
After the brightness information of the pixels in the picture frame i is obtained at the time T, statistics can be continuously performed on the occurrence frequency of the brightness information of the pixels in the picture frame i at the time t+1 so as to obtain a histogram of the picture frame i.
After the histogram of the picture frame i is obtained at the time T+1, the histogram of the picture frame i is continuously adjusted at the time T+2, and the brightness cumulative value of the picture frame i is obtained based on the adjusted histogram.
It can be understood that, in the embodiment of the present application, the luminance information of the pixel in the picture frame i+1 may be obtained at time t+1, and the luminance information of the pixel in the picture frame i+2 may be obtained at time t+2.
In this embodiment of the present application, after the luminance cumulative value of the picture frame i is obtained at the time T+2, the luminance information of the pixels in the picture frame i+2 may be fused at the time T+3 based on the luminance cumulative value of the picture frame i, and the target RGB of the pixels in the picture frame i+2 may be output at that time. In some implementations, the luminance information of a pixel in the picture frame i+2 is fused at the time T+3 based on the luminance cumulative value of the picture frame i to obtain a target luminance component of the pixel in the picture frame i+2; further, the blue color-difference component and the red color-difference component of the pixel are obtained, and the target RGB of the pixel is finally obtained based on the target luminance component, the blue color-difference component, and the red color-difference component of the pixel. It can be understood that, at the time T+3, based on the luminance cumulative value of the picture frame i, the pixels in the picture frame i+2 are fused with luminance information one by one, and thus the target RGB of each pixel in the picture frame i+2 can be obtained one by one and displayed on the display device.
S102, acquiring a histogram of the picture frame i+1 based on the brightness information of the pixels in the picture frame i+1 at the time T+2, acquiring a brightness cumulative value of the picture frame i+1 based on the histogram of the picture frame i+1 at the time T+3, and outputting target RGB of the pixels in the picture frame i+3 based on the brightness cumulative value of the picture frame i+1 and the brightness information of the pixels in the picture frame i+3 at the time T+4.
After the acquisition of the brightness information of the pixels in the picture frame i is completed at the time T, the brightness information of the pixels in the picture frame i+1 can be continuously acquired at the time t+1. After the brightness information of the pixels in the picture frame i+1 is obtained at the time t+1, statistics may be continuously performed on the occurrence frequency of the brightness information of the pixels in the picture frame i+1 at the time t+2, so as to obtain a histogram of the picture frame i+1.
After the histogram of the picture frame i+1 is obtained at the time T+2, continuously performing histogram adjustment on the histogram of the picture frame i+1 at the time T+3, and obtaining the brightness cumulative value of the picture frame i+1 based on the adjusted histogram.
It can be understood that, in the embodiment of the present application, the luminance information of the pixel in the picture frame i+1 may be obtained at time t+1, the luminance information of the pixel in the picture frame i+2 may be obtained at time t+2, and the luminance information of the pixel in the picture frame i+3 may be obtained at time t+3.
In this embodiment of the present application, after the luminance integrated value of the picture frame i+1 is obtained at the time t+3, the luminance information of the pixel in the picture frame i+3 may be fused continuously based on the luminance integrated value of the picture frame i+1 at the time t+4, and the target RGB of the pixel in the picture frame i+3 is output at the time. In some implementations, the luminance information of the pixel in the picture frame i+3 is fused at time t+4 based on the luminance integrated value of the picture frame i+1 to obtain a target luminance component of the pixel in the picture frame i+3, and further, a blue color difference component and a red color difference component of the pixel are obtained, and based on the target luminance component, the blue color difference component and the red color difference component of the pixel, the target RGB of the pixel is finally obtained.
S103, acquiring a histogram of the picture frame i+2 based on the brightness information of the pixels in the picture frame i+2 at the time T+3, acquiring a brightness cumulative value of the picture frame i+2 based on the histogram of the picture frame i+2 at the time T+4, and outputting target RGB of the pixels in the picture frame i+4 based on the brightness cumulative value of the picture frame i+2 and the brightness information of the pixels in the picture frame i+4 at the time T+5.
After the acquisition of the brightness information of the pixels in the picture frame i+1 is completed at the time T+1, the brightness information of the pixels in the picture frame i+2 can be continuously acquired at the time T+2. After the brightness information of the pixels in the picture frame i+2 is obtained at the time T+2, the statistics of the occurrence frequency of the brightness information of the pixels in the picture frame i+2 can be continuously performed at the time T+3 so as to obtain the histogram of the picture frame i+2.
After the histogram of the picture frame i+2 is obtained at the time T+3, the histogram of the picture frame i+2 is continuously adjusted at the time T+4, and the brightness cumulative value of the picture frame i+2 is obtained based on the adjusted histogram.
It can be understood that, in the embodiment of the present application, the luminance information of the pixel in the picture frame i+1 may be obtained at time t+1, the luminance information of the pixel in the picture frame i+2 may be obtained at time t+2, the luminance information of the pixel in the picture frame i+3 may be obtained at time t+3, and the luminance information of the pixel in the picture frame i+4 may be obtained at time t+4.
In this embodiment of the present application, after the luminance cumulative value of the picture frame i+2 is obtained at the time T+4, the luminance information of the pixels in the picture frame i+4 may be fused continuously based on the luminance cumulative value of the picture frame i+2 at the time T+5, and the target RGB of the pixels in the picture frame i+4 is output at that time. In some implementations, the luminance information of a pixel in the picture frame i+4 is fused at the time T+5 based on the luminance cumulative value of the picture frame i+2 to obtain a target luminance component of the pixel in the picture frame i+4; further, the blue color-difference component and the red color-difference component of the pixel are obtained, and the target RGB of the pixel is finally obtained based on the target luminance component, the blue color-difference component, and the red color-difference component of the pixel.
In the histogram statistics process of any frame in the embodiments of the present application, the luminance components of all pixels in any frame need to be counted, and after counting all pixels, the histogram of any frame is obtained. In addition, in the histogram statistics process of any picture frame, the brightness component counted before needs to be cached, so that the statistical result of the brightness component before can be prevented from being lost, and then the occurrence number counted by the same brightness component in the cache can be updated based on the brightness component of the current pixel, so as to obtain the final histogram.
For example, when i=1 and T=1, the luminance information of the pixels in the picture frame 1 may be obtained at time T=1, and the occurrence frequency of the luminance information of the pixels in the picture frame 1 may be counted at time T=2 to obtain the histogram of the picture frame 1. Histogram adjustment is performed on the histogram of the picture frame 1 at time T=3, and the luminance cumulative value of the picture frame 1 is obtained based on the adjusted histogram. The luminance information of the pixels in the picture frame 3 is fused based on the luminance cumulative value of the picture frame 1 at time T=4, and the target RGB of the pixels in the picture frame 3 is output at that time.
In the embodiment of the present application, the picture frame 2 is input at time T=2, the luminance information of the pixels in the picture frame 2 is obtained, and the occurrence frequency of the luminance information of the pixels in the picture frame 2 is counted at time T=3 to obtain the histogram of the picture frame 2. Histogram adjustment is performed on the histogram of the picture frame 2 at time T=4, and the luminance cumulative value of the picture frame 2 is obtained based on the adjusted histogram. The luminance information of the pixels in the picture frame 4 is fused based on the luminance cumulative value of the picture frame 2 at time T=5, and the target RGB of the pixels in the picture frame 4 is output at that time.
In the embodiment of the present application, the picture frame 3 is input at time T=3, the luminance information of the pixels in the picture frame 3 is obtained, and the occurrence frequency of the luminance information of the pixels in the picture frame 3 is counted at time T=4 to obtain the histogram of the picture frame 3. Histogram adjustment is performed on the histogram of the picture frame 3 at time T=5, and the luminance cumulative value of the picture frame 3 is obtained based on the adjusted histogram. The luminance information of the pixels in the picture frame 5 is fused based on the luminance cumulative value of the picture frame 3 at time T=6, and the target RGB of the pixels in the picture frame 5 is output at that time.
The picture display method provided by the embodiments of the present application includes three processing paths. Each processing path processes the input picture frames according to the same processing flow, and a certain time-domain offset exists between the three processing paths, so that confusion and overlapping of picture data can be avoided, and continuous display of pictures on the display device can be ensured through the timing interleaving strategy of the three processing paths.
It can be understood that, in order to avoid confusion and overwriting of pixels, when any picture frame is processed, the picture frame needs to be kept in an enabled state. That is, when the first pixel in the picture frame is received, a valid enable signal of the picture frame is generated based on the frame identifier of the pixel, where the valid enable signal is used to indicate that the picture frame is in a state in which the relevant processing operations can be performed. In this embodiment, each processing module may perform the relevant processing operations on the pixels in the picture frame under the control of the valid enable signal of the picture frame. The valid enable signal of the picture frame can be ended only after the processing flow of all pixels in the picture frame is completed.
In the three processing paths of the embodiment of the present application, the first processing path receives the first frame, the fourth frame, the seventh frame, ..., the (3n-2)-th frame, and Y1_en may be used to indicate that the picture frame input to the first processing path is in an enabled state;
the second processing path receives the second frame, the fifth frame, the eighth frame, ..., the (3n-1)-th frame, and Y2_en may be used to indicate that the picture frame input to the second processing path is in an enabled state;
the third processing path receives the third frame, the sixth frame, the ninth frame, ..., the 3n-th frame, and Y3_en may be used to indicate that the picture frame input to the third processing path is in an enabled state.
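For illustration only, the following is a minimal C sketch of the staggered timing of the three processing paths described above, assuming frame k enters path ((k-1) mod 3) + 1 and that each processing stage occupies one time slot; all function and variable names are illustrative assumptions and not taken from the patent.

```c
/* Minimal sketch of the three-path timing: frame k has its luminance read at
 * T = k, its histogram built one slot later, its cumulative value one slot
 * after that, and its result applied to frame k + 2 one slot after that. */
#include <stdio.h>

int main(void) {
    for (int k = 1; k <= 6; k++) {          /* frame index, 1-based          */
        int path     = (k - 1) % 3 + 1;     /* processing path 1..3          */
        int t_luma   = k;                   /* luminance info acquired       */
        int t_hist   = k + 1;               /* histogram statistics          */
        int t_cumsum = k + 2;               /* luminance cumulative value    */
        int t_output = k + 3;               /* target RGB of frame k + 2     */
        printf("frame %d: path %d, luma@T=%d, hist@T=%d, cum@T=%d, "
               "output frame %d @T=%d\n",
               k, path, t_luma, t_hist, t_cumsum, k + 2, t_output);
    }
    return 0;
}
```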
According to the picture display method provided by the embodiments of the present application, histogram adjustment can be performed on a picture frame, and luminance remapping can be performed on the luminance information of the pixels in the picture frame one frame apart based on the luminance cumulative value obtained from the adjusted histogram, so as to obtain new luminance components of the pixels in that picture frame. Furthermore, each processing path in the embodiments of the present application processes the input picture frames according to the same processing flow, and a certain time-domain offset exists between the processing paths, so that confusion and overlapping of picture data can be avoided, and continuous display of pictures on the display device can be ensured through the timing interleaving strategy of the processing paths.
On the basis of the above embodiments, fig. 2 is a schematic flow chart of another method for displaying a picture according to an embodiment of the present application. As shown in fig. 2, the method explains the process of acquiring luminance information of pixels in any one picture frame:
s201, for the pixel P in any frame xy Converting from RGB domain to YCbCr domain to obtain pixel P xy YCbCr information of (a).
S202, based on pixel P xy YC of (a) b C r Information, obtain pixel P xy Is included in the luminance component of the image.
Wherein, the pixel P xy And represents the pixel of the x-th row and the y-th column, wherein x is an integer greater than or equal to 2, and y is an integer greater than or equal to 1.
In some implementations, the RGB information of the pixel P_xy in any picture frame is acquired, together with a first color-gamut mapping matrix for converting from the RGB domain to the YCbCr domain. Further, a matrix operation is performed on the first color-gamut mapping matrix and the RGB information of the pixel P_xy to obtain the YCbCr information of the pixel P_xy, where the YCbCr information of the pixel P_xy may include the luminance component, the blue color-difference component, and the red color-difference component of the pixel P_xy.
Illustratively, a first color-gamut mapping matrix is set:
[first color-gamut mapping matrix]
After the RGB information of the pixel P_xy in any picture frame is acquired, the following formula may be used for the color gamut conversion:
[Y, Cb, Cr]^T = first color-gamut mapping matrix × [R, G, B]^T
where Y represents the luminance component, Cb represents the blue color-difference component, and Cr represents the red color-difference component.
To reduce computational difficulty and cost, floating-point multiplication operations may be converted into shift and add operations. Optionally, after the color gamut conversion is performed on the RGB information of the pixel P_xy, a shift operation may be performed on the matrix operation result of any one of the luminance component, the blue color-difference component, and the red color-difference component in the YCbCr information to obtain a shift operation result of that component; further, the shift operation result is truncated from the high bits to the low bits according to a set number of bits to obtain the target operation result of that component.
Illustratively, after the color gamut conversion is performed on an input 8-bit picture frame according to the above formula, the luminance component of a pixel in the picture frame may be expressed as 1024*Y = 306R + 601G + 116B. Further, the shift operation may be performed on the matrix operation result to obtain the final target operation result:
1024*Y = 306R + 601G + 116B = (R<<8 + R<<5 + R<<4 + R<<1) + (G<<9 + G<<6 + G<<4 + G<<3 + G) + (B<<6 + B<<5 + B<<4 + B<<2);
after the shift operation is completed, the operation result may be shifted right by 10 bits (from high to low) to obtain the target operation result of the luminance component Y of the pixel. Similarly, the blue color-difference component Cb and the red color-difference component Cr are processed in the same way as the luminance component Y.
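As an illustration of the shift-and-add decomposition above, the following C sketch computes an 8-bit luminance component with the weights 306, 601, and 116 followed by a 10-bit right shift. The function name and interface are assumptions, and the Cb/Cr rows of the matrix are not reproduced here.

```c
#include <stdint.h>

/* Fixed-point luminance from 8-bit RGB using only shifts and adds
 * (306*R + 601*G + 116*B, then divide by 1024 via a 10-bit right shift). */
static uint8_t luma_from_rgb(uint8_t r, uint8_t g, uint8_t b)
{
    uint32_t R = r, G = g, B = b;
    uint32_t y1024 =
        (R << 8) + (R << 5) + (R << 4) + (R << 1)        /* 306 * R */
      + (G << 9) + (G << 6) + (G << 4) + (G << 3) + G    /* 601 * G */
      + (B << 6) + (B << 5) + (B << 4) + (B << 2);       /* 116 * B */
    return (uint8_t)(y1024 >> 10);                       /* truncate to 8 bits */
}
```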
S203, acquire the adjacent pixels of the pixel P_xy, and obtain the average luminance component of P_xy based on the luminance component of the pixel P_xy and the luminance component of each adjacent pixel.
In some implementations, the neighborhood size of the pixel P_xy may be determined by a shift matrix of size M×M, and the adjacent pixels of the pixel P_xy within that neighborhood may be determined, where M is an integer greater than or equal to 1. For example, the 8 adjacent pixels of the pixel P_xy may be acquired through a shift matrix of size 3×3, and the pixel P_xy is located at the center of the neighborhood surrounded by the 8 adjacent pixels; the positional relationship between the pixel P_xy and its adjacent pixels can be as shown in fig. 3.
In some implementations, in the process of obtaining the luminance information of the pixels on the (x+1)-th row one by one, if the current pixel on the (x+1)-th row is the last adjacent pixel of the pixel P_xy, the luminance information of the adjacent pixels of the pixel P_xy belonging to the (x-1)-th row is obtained from the first buffer queue, and the luminance information of the adjacent pixels of the pixel P_xy belonging to the x-th row is obtained from the second buffer queue. Further, the luminance component of the pixel P_xy and the luminance component of each adjacent pixel are averaged to obtain the luminance average of the pixel P_xy. It will be appreciated that, as shown in fig. 3, the last adjacent pixel of the pixel P_xy is the pixel at the lower right corner in the figure.
In the course of acquiring the average luminance component of the pixel P_xy, the luminance component of each adjacent pixel is required, that is, the luminance components of the adjacent pixels of the pixel P_xy on the (x-1)-th row, on the x-th row, and on the (x+1)-th row. In the embodiment of the present application, the average luminance component of the pixel P_xy needs to be calculated in the process of acquiring the luminance information of the pixels on the (x+1)-th row; that is, the average luminance component of the pixel P_xy is delayed relative to the luminance component of the pixel P_xy. In order to realize this delayed calculation for the pixel P_xy, optionally, two buffer queues, a first buffer queue and a second buffer queue, are preset, where the first buffer queue and the second buffer queue may be used to buffer the luminance information of the pixels on the (x-1)-th row and the luminance information of the pixels on the x-th row, respectively.
In the process of acquiring the luminance information of the pixels on the (x+1)-th row, since the first buffer queue and the second buffer queue respectively store the luminance information of the pixels on the (x-1)-th row and on the x-th row, the luminance information of the pixels on the (x+1)-th row can be cached in the memory pool while waiting for one of the buffer queues to be released.
When the average luminance component of the last pixel on the x-th row is obtained, the luminance information of the pixels on the (x-1)-th row in the first buffer queue is cleared, and the luminance information of the pixels on the (x+1)-th row cached in the memory is cached in the first buffer queue. Optionally, the luminance information of the pixels on the x-th row cached in the second buffer queue may instead be moved into the first buffer queue, and the luminance information of the pixels on the (x+1)-th row cached in the memory may be cached in the second buffer queue.
As shown in fig. 4, two first-in first-out (First Input First Output, FIFO) buffer queues, FIFO1 and FIFO2, are provided, and FIFO1 and FIFO2 are respectively connected to the average value operation unit. The luminance information of the pixels on the x-th row can be buffered in FIFO1, and the luminance information of the pixels on the (x-1)-th row can be buffered in FIFO2. When the luminance component of the (y+1)-th pixel P_(x+1,y+1) on the (x+1)-th row (the last adjacent pixel of the pixel P_xy) is acquired, the average value operation unit may read the buffer queues FIFO1 and FIFO2: the luminance information of the adjacent pixels of the pixel P_xy on the x-th row can be read from FIFO1, the luminance information of the adjacent pixels of the pixel P_xy on the (x-1)-th row can be read from FIFO2, and the luminance information of the adjacent pixels of the pixel P_xy on the (x+1)-th row is read from the memory; together with the luminance information of the pixel P_xy itself, the average luminance component of the pixel P_xy is then determined.
It should be noted that, in the present embodiment, for a pixel P_xy located on an edge row or an edge column, some of its adjacent pixels are missing. In this case, the existing adjacent pixels and the missing adjacent pixels of the pixel P_xy are determined, the luminance information of the existing adjacent pixels is acquired, and the luminance information of the missing adjacent pixels is completed. Further, the average luminance component of P_xy is obtained based on the luminance component of the pixel P_xy and the luminance component of each adjacent pixel.
Optionally, the luminance information of a missing adjacent pixel may be completed based on a set value, or based on the luminance information of the existing adjacent pixels. For example, the mean value of the luminance information of the existing adjacent pixels may be obtained, and the missing adjacent pixel may be completed with this mean value; for another example, the luminance information of the existing adjacent pixels may be weighted, and the missing adjacent pixel may be completed with the weighted luminance information.
It will be appreciated that the edge rows include the first row of picture frames and the last row of picture frames, and the edge columns include the first column of picture frames and the last column of picture frames.
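The following simplified C sketch illustrates the 3×3 neighborhood average described above. It operates on whole rows held in memory rather than on hardware FIFO line buffers, and it completes missing edge neighbors with the mean of the existing adjacent pixels, which is one of the completion options mentioned above; the function name, parameters, and memory layout are assumptions.

```c
#include <stdint.h>
#include <stddef.h>

/* Average luminance of pixel (x, y) over its 3x3 neighbourhood, with missing
 * edge neighbours filled in by the mean of the neighbours that do exist. */
static uint8_t neighbourhood_mean(const uint8_t *luma, int width, int height,
                                  int x, int y)          /* x = row, y = col */
{
    uint32_t sum = 0;
    int count = 0;
    for (int dx = -1; dx <= 1; dx++) {
        for (int dy = -1; dy <= 1; dy++) {
            if (dx == 0 && dy == 0) continue;             /* neighbours only */
            int nx = x + dx, ny = y + dy;
            if (nx < 0 || nx >= height || ny < 0 || ny >= width) continue;
            sum += luma[(size_t)nx * width + ny];
            count++;
        }
    }
    uint32_t neighbour_mean = count ? sum / count : 0;    /* completion value */
    uint32_t total = luma[(size_t)x * width + y]          /* the pixel itself */
                   + sum + (uint32_t)(8 - count) * neighbour_mean;
    return (uint8_t)(total / 9);                          /* 3x3 average */
}
```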
The picture display method provided by the embodiment of the application can perform color gamut conversion on the RGB information of the pixels in the current picture frame to output the YCbCr information of the pixels, so as to obtain the luminance component and the average luminance component of each pixel, providing data for subsequent processing stages.
Contrast is one of the factors that affects the image quality of the display. Image quality with low contrast can be globally improved based on global contrast enhancement, but global contrast enhancement leads to poor local effects of image quality. In order to solve the above technical problem, after the luminance component of the pixel in the frame is obtained, the frame may be divided, and the method for displaying a picture provided in the embodiments of the present application is performed by using the picture block as a unit, so as to enhance the local contrast, and better improve the image quality.
It will be appreciated that the processing times of a picture block in the subsequent processing stages are the same as those of the picture frame to which it belongs. That is, the processing times of the picture blocks of the picture frame i in the subsequent processing stages are the same as those of the picture frame i in the above-described embodiment; the processing times of the picture blocks of the picture frame i+1 in the subsequent processing stages are the same as those of the picture frame i+1 in the above-described embodiment; and the processing times of the picture blocks of the picture frame i+2 in the subsequent processing stages are the same as those of the picture frame i+2 in the above-described embodiment.
For ease of illustration, specific processing with any frame of a picture may include, but is not limited to, the following steps, as shown in fig. 5.
S501, after acquiring luminance information of pixels in a picture frame, the picture frame is divided.
After the luminance information of the pixels in the picture frame is acquired, the picture frame may be divided, for example, into matrix picture blocks having the same size and not overlapping each other, in order to further enhance the contrast of the detail region of the image while avoiding the noise of the background and the flat region from being excessively amplified.
S502, for any picture block in different picture blocks, acquiring a histogram of the picture block based on brightness information of pixels in the picture block.
S503, obtaining the luminance cumulative value of the picture block based on the histogram of the picture block.
S504, outputting the target RGB of the pixels in the picture block based on the luminance cumulative value of the picture block and the luminance information of the pixels in the same picture block in the fusion picture frame, wherein the fusion picture frame and the picture frame to which the picture block belongs are separated by one frame.
It should be noted that the processing times of a picture block in the subsequent processing stages are the same as those of the picture frame to which it belongs. Illustratively, when any picture frame is the picture frame i, the luminance information of the pixels in the picture frame i is obtained at time T, the picture frame i is divided at time T+1, the histogram of each picture block is obtained at time T+1, and the respective luminance cumulative values of the picture blocks are obtained based on their histograms at time T+2. In this embodiment, the fusion picture frame of the picture frame i is the picture frame i+2; further, the target RGB of the pixels in a picture block of the picture frame i+2 is output based on the luminance cumulative value of the corresponding picture block in the picture frame i and the luminance information of the pixels in the same picture block in the picture frame i+2. It should be noted that each picture frame may be divided into N picture blocks, and the luminance cumulative value of the n-th picture block in the picture frame i is fused with the luminance information of the pixels in the n-th picture block in the picture frame i+2 to obtain the target RGB of the pixels of the n-th picture block in the picture frame i+2.
When any picture frame is the picture frame i+1, the luminance information of the pixels in the picture frame i+1 is obtained at time T+1, the picture frame i+1 is divided at time T+2, the histogram of each picture block is obtained at time T+2, and the respective luminance cumulative values of the picture blocks are obtained based on their histograms at time T+3. In this embodiment, the fusion picture frame of the picture frame i+1 is the picture frame i+3; further, the target RGB of the pixels in a picture block of the picture frame i+3 is output based on the luminance cumulative value of the corresponding picture block in the picture frame i+1 and the luminance information of the pixels in the same picture block in the picture frame i+3.
When any picture frame is the picture frame i+2, the luminance information of the pixels in the picture frame i+2 is obtained at time T+2, the picture frame i+2 is divided at time T+3, the histogram of each picture block is obtained at time T+3, and the respective luminance cumulative values of the picture blocks are obtained based on their histograms at time T+4. In this embodiment, the fusion picture frame of the picture frame i+2 is the picture frame i+4; further, the target RGB of the pixels in a picture block of the picture frame i+4 is output based on the luminance cumulative value of the corresponding picture block in the picture frame i+2 and the luminance information of the pixels in the same picture block in the picture frame i+4.
After the picture frame is segmented, the picture block to which a pixel belongs may be determined based on the coordinates of the currently input pixel, and a first enable signal of the picture block may be generated. In this embodiment of the present application, the first enable signal may be used to indicate that the picture block is in a state in which the related processing operations can be performed. Optionally, a row counter and a column counter are used to count the rows and columns of the picture frame; the results of the two counters determine the coordinates of the pixel within the whole picture frame, the picture block to which the pixel coordinates belong is determined according to the input block-size parameter, and the first enable signal of the corresponding picture block is output, so that the subsequent processing modules can associate their processing operations with the first enable signals of the respective picture blocks.
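As a sketch of the block-index derivation described above, the following C fragment determines the picture block to which a pixel belongs from the row/column counter values and a configurable block size. The structure, flat block-numbering scheme, and names are assumptions rather than the patent's implementation.

```c
#include <stdint.h>

/* Block-partition configuration driven by input parameters. */
typedef struct {
    uint16_t block_w;          /* block width in pixels (input parameter)  */
    uint16_t block_h;          /* block height in pixels (input parameter) */
    uint16_t blocks_per_row;   /* number of blocks per row of the frame    */
} block_cfg_t;

/* Returns the index of the picture block that pixel (row, col) belongs to;
 * asserting the first enable signal of that block then amounts to comparing
 * the returned index with the previously active one. */
static uint32_t block_index(const block_cfg_t *cfg, uint32_t row, uint32_t col)
{
    uint32_t block_row = row / cfg->block_h;   /* from the row counter    */
    uint32_t block_col = col / cfg->block_w;   /* from the column counter */
    return block_row * cfg->blocks_per_row + block_col;
}
```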
It will be appreciated that while processing a picture block, it is also necessary to keep the picture frame to which the picture block belongs in an enabled state, that is, when the first pixel in the picture frame to which the picture block belongs is received, a second enable signal for the picture frame is generated based on the frame identification of the pixel, where the second enable signal is used to indicate that the picture frame is in a state in which the relevant processing operation can be performed. On the premise that the picture frame is in an enabled state, the image block in the enabled state can be subjected to related processing operation.
In the embodiment of the application, after the brightness component of the pixel in the picture frame is obtained, the picture frame can be subjected to picture segmentation, the picture block is taken as a unit for subsequent processing operation, and local contrast can be enhanced through picture segmentation, such as the contrast of a picture detail area, and meanwhile, the problem that noise in a background and a flat area is excessively amplified can be avoided, so that the image quality can be better improved.
On the basis of the above embodiments, fig. 6 is a schematic flow chart of another method for displaying a picture according to an embodiment of the present application. As shown in fig. 6, the acquisition process of the histogram of any piece of picture data to be processed (a picture frame or a picture block) is explained as follows:
s601, gating the first memory and the second memory.
To solve the problem of inaccurate computation when reading and writing the same luminance component before and after, two memories, such as a first memory and a second memory, may be used to perform ping-pong operations of odd/even columns of pixels in the statistical process. In the process of performing histogram statistics based on luminance components, 256 kinds of luminance components need to be distinguished and data is continuously written and read, so the whole histogram statistics process needs to be provided with a memory capable of performing read-write operations simultaneously, and alternatively, the first memory may be a random access memory (Random Access Memory, RAM).
Optionally, the first memory and the second memory are gated by using a parity counter, so that ping-pong statistics of the pixels in the odd columns and the pixels in the even columns can be achieved, and time is reserved for writing the newly counted results back.
S602, counting the brightness components of the pixels in the odd columns in the picture data to be processed through the first memory, and counting the brightness components of the pixels in the even columns in the picture data to be processed through the second memory.
In some implementations, a read address corresponding to each luminance component in the picture data to be processed is determined, and the storage space indicated by the read address is used to store the number of occurrences of the corresponding luminance component. The pixels in the picture data to be processed are traversed; when the luminance component of the traversed current pixel is counted, the accumulated number of occurrences of that luminance component is acquired based on the read address corresponding to it, updated, and written back to the storage space corresponding to the read address, until all pixels in the picture data to be processed have been traversed.
That is, while the entire picture data to be processed is traversed, a luminance distribution map is built for the YCbCr information to be processed, with the Y component as the abscissa and the number of occurrences of each luminance component as the ordinate. The number of occurrences of each luminance component is initialized to 0, and a read address is allocated to each luminance component, where the storage space indicated by the read address can store the counted number of occurrences of that luminance component.
Traversal starts from the first pixel: based on the read address corresponding to the luminance component of the traversed current pixel, the current accumulated number of occurrences is read from the storage location, 1 is added to the read value, and the result is rewritten into the RAM according to the read address corresponding to that luminance component. Through this flow, statistics of the occurrence frequency of each luminance component can be realized, and the histogram of the picture data to be processed is finally obtained.
S603, adding the statistical result of the pixels in the odd columns and the statistical result of the pixels in the even columns to obtain a histogram of the picture data to be processed.
In the histogram statistics process of any frame in the embodiments of the present application, the luminance components of all pixels in any frame need to be counted, and after counting all pixels, the histogram of any frame is obtained. In addition, in the histogram statistics process of any picture frame, the brightness component counted before needs to be cached, so that the statistical result of the brightness component before can be prevented from being lost, and then the occurrence number counted by the same brightness component in the cache can be updated based on the brightness component of the current pixel, so as to obtain the final histogram.
It can be understood that after the luminance cumulative value of the picture data to be processed is obtained based on the histogram of the picture data to be processed, the first memory and the second memory can be cleared for the histogram statistics of the next picture data to be processed; alternatively, the histogram of the currently processed picture data may be cleared when the next picture data to be processed arrives.
In the embodiment of the application, after the brightness component of the pixel in the picture frame is obtained, histogram statistics can be performed based on the brightness component, and in the histogram statistics process, ping-pong operation can be performed on the pixels in the odd columns and the pixels in the even columns through the first memory and the second memory, so that the problem of inaccurate calculation when the same brightness component is read and written before and after can be solved, and the time can be reserved for performing the re-writing operation of the statistics result.
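The following C sketch models the ping-pong histogram statistics described above, using two arrays of 256 counters in place of the two RAMs and merging the odd-column and even-column results at the end; all names and the software modelling of the hardware read-modify-write are assumptions.

```c
#include <stdint.h>
#include <stddef.h>

/* One counter per 8-bit luminance component. */
enum { LUMA_LEVELS = 256 };

/* Column parity (1-based) selects which "memory" is updated; the two partial
 * statistics are added at the end to form the histogram. */
static void histogram_pingpong(const uint8_t *luma, int width, int height,
                               uint32_t hist[LUMA_LEVELS])
{
    uint32_t ram_odd[LUMA_LEVELS]  = {0};   /* first memory: odd columns   */
    uint32_t ram_even[LUMA_LEVELS] = {0};   /* second memory: even columns */

    for (int row = 0; row < height; row++) {
        for (int col = 0; col < width; col++) {
            uint8_t y = luma[(size_t)row * width + col]; /* read address = y */
            if ((col + 1) % 2 == 1)
                ram_odd[y]++;               /* read, add 1, write back      */
            else
                ram_even[y]++;
        }
    }
    for (int l = 0; l < LUMA_LEVELS; l++)   /* merge the two statistics     */
        hist[l] = ram_odd[l] + ram_even[l];
}
```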
Fig. 7 is a flowchart of another method for displaying images according to the embodiment of the present application. As shown in fig. 7, the determination process of the luminance cumulative value of any piece of picture data to be processed (a picture frame or a picture block) is explained as follows:
s701, obtaining the total brightness average value and the maximum brightness component of the picture data to be processed based on the brightness components of the pixels in the picture data to be processed.
The brightness components of all pixels in the picture data to be processed are averaged to obtain the total brightness average value of the picture data to be processed, and the total brightness average value can be determined by adopting the following formula:
α = ΣY / A; where α represents the total luminance average value, Y represents the luminance component of a pixel, and A represents the total number of pixels included in the picture data to be processed.
Further, the luminance components of all pixels in the picture data to be processed are sorted or compared, and the maximum luminance component of the picture data to be processed is determined therefrom.
S702, obtaining an adjustment threshold of the histogram according to the total brightness average value and the maximum brightness component.
When the total luminance average value and the maximum luminance component are obtained, the adjustment threshold of the histogram is obtained from them according to a set mapping rule. The adjustment threshold of the histogram can be determined by the following formula:
δ = g(α, L_max); where α represents the total luminance average value, δ represents the adjustment threshold of the histogram, L_max represents the maximum luminance component of the picture data to be processed, and g(·) represents the mapping rule between δ and (α, L_max).
For example, a ratio of the total luminance average and the maximum luminance component may be obtained, and the ratio may be determined as the adjustment threshold of the histogram.
S703, adjusting the histogram of the picture data to be processed based on the adjustment threshold value to obtain an adjusted histogram of the picture data to be processed.
After the adjustment threshold of the histogram is obtained, the histogram of the picture data to be processed is adjusted based on the adjustment threshold to obtain the adjusted histogram of the picture data to be processed.
The adjusted histogram may be determined, for example, using the following formula:
N'_l = h(N_l, δ); where l ∈ [0, L_max], N_l represents the number of occurrences of the luminance component l in the histogram, and N'_l represents the number of occurrences of the luminance component l in the adjusted histogram.
S704, performing accumulation operation on the adjusted histograms to obtain brightness accumulation values of the picture data to be processed.
After the adjusted histogram is obtained, the adjusted histogram can be subjected to accumulation operation to obtain a brightness cumulative value of the picture data to be processed, and the brightness cumulative value can be determined by adopting the following formula:
S_l = S_{l-1} + N'_l, where l ∈ [0, L_max]; S_l represents the luminance cumulative value of the luminance component l, and S_{l-1} represents the luminance cumulative value of the luminance component l-1.
In this embodiment, after the adjusted histogram of the picture frame is obtained, a luminance accumulation operation may be performed on it to obtain the luminance cumulative value, which then supplies data to the subsequent processing stage, so that the contrast of the picture frame is enhanced and the image quality is improved.
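To make steps S701 to S704 concrete, the following Python sketch computes the total luminance average, the maximum luminance component, a ratio-based adjustment threshold, an adjusted histogram, and the running luminance cumulative values; the ratio threshold and the clipping form of h() are illustrative assumptions consistent with the examples above, not the only possible mapping rules.

```python
import numpy as np

def luminance_cumulative_values(hist, luma):
    """Steps S701-S704 for one piece of picture data to be processed.

    hist: occurrence count N_l for each luminance component l.
    luma: flat array of the luminance components of all pixels in the data.
    """
    total_pixels = luma.size                     # A
    alpha = luma.sum() / total_pixels            # S701: total luminance average
    l_max = int(luma.max())                      # S701: maximum luminance component

    # S702: adjustment threshold delta = g(alpha, L_max); here the ratio example.
    delta = alpha / l_max if l_max > 0 else 0.0

    # S703: adjusted histogram N'_l = h(N_l, delta); here h() clips each bin to a
    # delta-scaled cap (an assumed form of the adjustment).
    cap = delta * total_pixels
    hist_adj = np.minimum(hist[: l_max + 1].astype(np.float64), cap)

    # S704: running accumulation S_l = S_{l-1} + N'_l for l in [0, L_max].
    s = np.cumsum(hist_adj)
    return alpha, l_max, delta, hist_adj, s
```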
Fig. 8 is a flowchart of another picture display method according to an embodiment of the present application. As shown in fig. 8, the target RGB output process of the pixels is explained for any piece of picture data to be processed, which may be a picture frame or a picture block:
S801, determining the fusion picture data corresponding to the picture data to be processed, wherein the picture frame to which the fusion picture data belongs is separated from the picture frame to which the picture data to be processed belongs by one frame.
It can be understood that if the frame data to be processed is a frame i, the fusion frame data is a frame i+2; if the picture data to be processed is the picture block in the picture frame i, the fusion picture data is the same picture block in the picture frame i+2.
If the picture data to be processed is a picture frame i+1, fusing the picture data into a picture frame i+3; if the picture data to be processed is the picture block in the picture frame i+1, the fusion picture data is the same picture block in the picture frame i+3.
If the picture data to be processed is a picture frame i+2, fusing the picture data into a picture frame i+4; if the picture data to be processed is the picture block in the picture frame i+2, the fusion picture data is the same picture block in the picture frame i+4.
S802, determining a target brightness component of the pixel in the picture data to be processed based on the brightness component of the pixel in the fused picture data and the brightness summation value of the picture data to be processed.
In some implementations, the luminance information of each pixel in the picture frame to which the picture data to be processed belongs is obtained, and the maximum luminance component of that picture frame is determined from it. After the luminance cumulative values of the picture data to be processed have been computed, the luminance components of the pixels in the continuously input fusion picture data can be fused one by one based on these cumulative values, so as to output the target luminance components of the pixels in the picture data to be processed.
Alternatively, the read address corresponding to the luminance component may be determined based on the luminance component of the pixel in the currently input fused picture data, and the stored target luminance integrated value is read from the storage space indicated by the read address, and further, the luminance component mapping parameter of the pixel in the picture data to be processed is obtained based on the target luminance integrated value, the maximum luminance component, and the total number of pixels included in the picture data to be processed.
For example, the luminance component mapping parameter H_j of the j-th pixel may be determined using the following formula:
H_j = L_max * S_j / A; where H_j represents the luminance component mapping parameter of the j-th pixel, L_max represents the maximum luminance component of the picture frame to which the picture data to be processed belongs, A represents the total number of pixels included in the picture data to be processed, and S_j represents the target luminance cumulative value corresponding to the luminance component of the j-th pixel.
Further, after the luminance component mapping parameter of a pixel is obtained, the average luminance component of the picture frame to which the fused picture data belongs is determined, and luminance mapping is performed on that average luminance component based on the luminance component mapping parameter of the pixel, so as to obtain the target luminance component of the pixel in the picture data to be processed.
For example, the target luminance component Y_j' of the j-th pixel may be obtained by applying the luminance component mapping parameter H_j to the average luminance component of the picture frame to which the fused picture data belongs; here Y_j' denotes the target luminance component of the j-th pixel.
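A minimal sketch of this luminance mapping step follows; the mapping parameter H_j matches the formula above, while the final blend of H_j with the average luminance component of the fused picture frame is only an assumed weighted-average form, since the combination is not spelled out here.

```python
def target_luminance(y_fused, s_table, l_max, total_pixels, avg_luma_fused, weight=0.5):
    """Map one pixel of the fused picture data to its target luminance component.

    y_fused: luminance component of the current pixel in the fused picture data
             (used as the read address into the cumulative-value table).
    s_table: luminance cumulative values S_l of the picture data to be processed.
    l_max:   maximum luminance component of the frame the data belongs to.
    total_pixels: total number of pixels A in the picture data to be processed.
    avg_luma_fused: average luminance component of the fused picture frame.
    weight: assumed blending weight between the mapped value and the average.
    """
    s_j = s_table[min(int(y_fused), len(s_table) - 1)]   # target cumulative value at the read address
    h_j = l_max * s_j / total_pixels                      # H_j = L_max * S_j / A
    # Assumed combination: weighted average of the mapped value and the frame average.
    return weight * h_j + (1.0 - weight) * avg_luma_fused
```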
S803, obtaining and outputting target RGB information of the pixel in the picture data to be processed based on the target luminance component, the blue color difference component, and the red color difference component of the pixel in the picture data to be processed.
In some implementations, a conversion from the RGB domain to the YCbCr domain is performed on the pixels in the picture data to be processed, from which the blue color difference component and the red color difference component of each pixel can be obtained. Specifically, the RGB information of the pixels in the picture data to be processed and a first color gamut mapping matrix can be obtained, and a matrix operation is performed on the first color gamut mapping matrix and the RGB information of a pixel to obtain the YCbCr information of the pixel, where the YCbCr information includes the luminance component, the blue color difference component, and the red color difference component of the pixel. It will be appreciated that this YCbCr information can already be obtained during the color gamut conversion performed when extracting the luminance information, so the luminance component, the blue color difference component, and the red color difference component of the pixel can be read directly from the buffer.

Further, the target YCbCr information of the pixel in the picture data to be processed is obtained based on the target luminance component, the blue color difference component, and the red color difference component of the pixel.

When the target luminance component of the pixel in the picture data to be processed has been obtained, the blue color difference component and the red color difference component can be taken from the pixel's YCbCr information, and the target luminance component, the blue color difference component, and the red color difference component are then combined to obtain the target YCbCr information of the pixel.

After the target YCbCr information of the pixel is obtained, it can be inversely converted from the YCbCr domain into the RGB domain in order to be presented on a display device, yielding the target RGB information of the pixel in the picture data to be processed, which is then output. In some implementations, a second color gamut mapping matrix for the inverse conversion from the YCbCr domain into the RGB domain can be obtained, and a matrix operation is performed on the second color gamut mapping matrix and the target YCbCr information of the pixel to obtain the target RGB information of the pixel.
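The following Python sketch illustrates the round trip just described, recombining the target luminance with the cached Cb/Cr and applying the inverse color gamut mapping; the full-range BT.601 matrices used here are an assumption, since the embodiment does not specify which first and second color gamut mapping matrices are employed.

```python
import numpy as np

# Assumed first color gamut mapping matrix (full-range BT.601 RGB -> YCbCr).
RGB_TO_YCBCR = np.array([[ 0.299,     0.587,     0.114],
                         [-0.168736, -0.331264,  0.5],
                         [ 0.5,      -0.418688, -0.081312]])
# Assumed second color gamut mapping matrix (its inverse, YCbCr -> RGB).
YCBCR_TO_RGB = np.linalg.inv(RGB_TO_YCBCR)

def to_ycbcr(rgb):
    """Matrix operation with the first color gamut mapping matrix: returns (Y, Cb, Cr)."""
    y, cb, cr = RGB_TO_YCBCR @ np.asarray(rgb, dtype=np.float64)
    return y, cb + 128.0, cr + 128.0

def to_target_rgb(target_y, cb, cr):
    """Combine the target luminance with the cached color difference components and
    inversely convert the target YCbCr information back into the RGB domain."""
    ycbcr = np.array([target_y, cb - 128.0, cr - 128.0])
    return np.clip(YCBCR_TO_RGB @ ycbcr, 0, 255)
```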
According to the picture display method provided by the embodiments of the present application, luminance remapping can be performed on the luminance information of the pixels in the picture frame that is one frame apart, based on the luminance cumulative value of the current picture frame, so as to obtain the target luminance components of the pixels. In determining the target luminance component, referring to the luminance cumulative value can enhance the local contrast of the picture, and referring to the real-time luminance information of the pixels in the picture frame one frame apart makes the display effect of the current picture frame more consistent with that of the subsequent picture data.
Fig. 9 is a schematic structural diagram of a screen display device according to an embodiment of the present application. As shown in fig. 9, the screen display device 900 includes: a first processing module 910, a second processing module 920, and a third processing module 930;
The first processing module 910 is configured to obtain, at time t+1, a histogram of the picture frame i according to luminance information of pixels in the picture frame i, obtain, at time t+2, a luminance integrated value of the picture frame i based on the histogram of the picture frame i, and output, at time t+3, a target RGB of the pixels in the picture frame i+2 based on the luminance integrated value of the picture frame i and the luminance information of the pixels in the picture frame i+2, where i and T are natural numbers satisfying a set rule from 1;
a second processing module 920, configured to obtain a histogram of the picture frame i+1 based on the luminance information of the pixels in the picture frame i+1 at a time t+2, obtain a luminance integrated value of the picture frame i+1 based on the histogram of the picture frame i+1 at a time t+3, and output a target RGB of the pixels in the picture frame i+3 based on the luminance integrated value of the picture frame i+1 and the luminance information of the pixels in the picture frame i+3 at a time t+4;
the third processing module 930 is configured to obtain a histogram of the picture frame i+2 based on the luminance information of the pixels in the picture frame i+2 at time t+3, obtain a luminance integrated value of the picture frame i+2 based on the histogram of the picture frame i+2 at time t+4, and output the target RGB of the pixels in the picture frame i+4 based on the luminance integrated value of the picture frame i+2 and the luminance information of the pixels in the picture frame i+4 at time t+5.
In some possible implementations, the screen display device further includes: the luminance extraction module 940.
A brightness extraction module 940, configured to obtain brightness information of a pixel in the frame i at a time T; at the time T+1, acquiring brightness information of pixels in a picture frame i+1; at the time T+2, acquiring brightness information of pixels in a picture frame i+2; wherein the luminance information includes a luminance component and an average luminance component.
According to the picture display method provided by the embodiments of the present application, luminance remapping can be performed on the luminance information of the pixels in the picture frame that is one frame apart, based on the luminance cumulative value of the current picture frame, so as to obtain the target luminance components of the pixels. In determining the target luminance component, referring to the luminance cumulative value can enhance the local contrast of the picture, and referring to the neighborhood average values of the pixels in the picture frame one frame apart makes the display effect of consecutive picture frames more consistent.
Fig. 10 is a schematic diagram of another structure of a display device according to an embodiment of the present application. As shown in fig. 10, the screen display device 1000 includes: a first processing module 110, a second processing module 120, a third processing module 130, and a brightness extraction module 140;
The first processing module 110 is configured to obtain, at time t+1, a histogram of the picture frame i according to luminance information of pixels in the picture frame i, obtain, at time t+2, a luminance integrated value of the picture frame i based on the histogram of the picture frame i, and output, at time t+3, a target RGB of the pixels in the picture frame i+2 based on the luminance integrated value of the picture frame i and the luminance information of the pixels in the picture frame i+2, where i and T are natural numbers satisfying a set rule from 1;
a second processing module 120, configured to obtain a histogram of the picture frame i+1 based on the luminance information of the pixels in the picture frame i+1 at a time t+2, obtain a luminance integrated value of the picture frame i+1 based on the histogram of the picture frame i+1 at a time t+3, and output a target RGB of the pixels in the picture frame i+3 based on the luminance integrated value of the picture frame i+1 and the luminance information of the pixels in the picture frame i+3 at a time t+4;
the third processing module 130 is configured to obtain a histogram of the picture frame i+2 based on the luminance information of the pixels in the picture frame i+2 at time t+3, obtain a luminance integrated value of the picture frame i+2 based on the histogram of the picture frame i+2 at time t+4, and output a target RGB of the pixels in the picture frame i+4 based on the luminance integrated value of the picture frame i+2 and the luminance information of the pixels in the picture frame i+4 at time t+5.
In some possible implementations, the screen display device further includes: the brightness extraction module 140.
A brightness extraction module 140, configured to obtain brightness information of a pixel in a frame i of the picture at a time T; at the time T+1, acquiring brightness information of pixels in a picture frame i+1; at the time T+2, acquiring brightness information of pixels in a picture frame i+2; wherein the luminance information includes a luminance component and an average luminance component.
In some possible implementations, the luminance extraction module 140 includes: a color gamut conversion unit 141 and a neighborhood mean value calculation unit 142.
Wherein, the color gamut conversion unit 141 is used for converting the pixel P_xy in any picture frame from the RGB domain to the YCbCr domain to obtain the YCbCr information of the pixel P_xy, and obtaining the luminance component of the pixel P_xy based on its YCbCr information; where the pixel P_xy represents the pixel in the x-th row and the y-th column, x is an integer greater than or equal to 2, and y is an integer greater than or equal to 1;

the neighborhood mean value calculation unit 142 is used for acquiring each adjacent pixel of the pixel P_xy, and obtaining the average luminance component of P_xy based on the luminance component of the pixel P_xy and the luminance components of its adjacent pixels.
In some possible implementations, the neighborhood mean calculation unit 142 is further configured to:
in the process of acquiring the luminance information of the pixels on the (x+1)-th row one by one, if the current pixel on the (x+1)-th row is an adjacent pixel of the pixel P_xy, obtaining from the first buffer queue the luminance information of the adjacent pixels of P_xy belonging to the (x-1)-th row, and obtaining from the second buffer queue the luminance information of the adjacent pixels of P_xy belonging to the x-th row;

averaging the luminance component of the pixel P_xy with the luminance components of its adjacent pixels to obtain the average luminance component of the pixel P_xy.
In some possible implementations, the neighborhood mean calculation unit 142 is further configured to:
caching brightness information of pixels on the x+1th row;
when the average brightness component of the last pixel on the x line is obtained, the brightness information of the pixels on the x-1 line in the first cache queue is cleared, and the brightness information of the pixels on the x+1 line is cached in the first cache queue; or, the brightness information of the pixels on the x line cached in the second cache queue is cached in the first cache queue, and the brightness information of the pixels on the x+1 line is cached in the first cache queue.
In some possible implementations, the neighborhood mean calculation unit 142 is further configured to:
for a pixel P_xy located on an edge row or an edge column, determining the existing adjacent pixels and the missing adjacent pixels of P_xy;

acquiring the luminance information of the existing adjacent pixels, and completing the luminance information of the missing adjacent pixels;

obtaining the average luminance component of P_xy based on the luminance component of the pixel P_xy and the luminance components of its adjacent pixels; a sketch of this neighborhood mean calculation is given below.
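The following Python sketch of the neighborhood mean calculation uses two row buffers for rows x-1 and x while row x+1 streams in, and replicates edge pixels to complete missing neighbors; the 3x3 neighborhood and the edge-replication policy are assumptions, as the embodiment only states that missing neighbors are completed.

```python
import numpy as np

def neighborhood_mean(luma):
    """Average luminance component of every pixel over its 3x3 neighborhood.

    luma: 2-D array of luminance components for one picture frame.
    Edge rows/columns are completed by replicating the nearest existing pixels
    (an assumed completion policy). Row buffers mimic the two cache queues:
    prev_row plays the first queue (row x-1), curr_row the second (row x).
    """
    padded = np.pad(luma, 1, mode="edge")          # complete missing neighbors
    rows, cols = luma.shape
    avg = np.empty_like(luma, dtype=np.float64)

    prev_row = padded[0]                            # first cache queue  (row x-1)
    curr_row = padded[1]                            # second cache queue (row x)
    for x in range(rows):
        next_row = padded[x + 2]                    # row x+1 streaming in
        for y in range(cols):
            window = (prev_row[y:y + 3].sum()
                      + curr_row[y:y + 3].sum()
                      + next_row[y:y + 3].sum())
            avg[x, y] = window / 9.0
        # Rotate the queues once the last pixel of row x is done.
        prev_row, curr_row = curr_row, next_row
    return avg
```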
In some possible implementations, each of the first processing module 110, the second processing module 120, and the third processing module 130 is further configured to:
after brightness information of pixels in the picture frame is obtained, picture segmentation is carried out on the picture frame;
for any one of different picture blocks, acquiring a histogram of the picture block based on brightness information of pixels in the picture block, acquiring a brightness cumulative value of the picture block based on the histogram of the picture block, and outputting target RGB of the pixels in the picture block based on the brightness cumulative value of the picture block and the brightness information of the pixels in the same picture block in a fusion picture frame;
the processing time of the picture block in the different subsequent processing stages is the same as that of the picture frame to which the picture block belongs, and the fusion picture frame is separated from the picture frame to which the picture block belongs by one frame.
In some possible implementations, each of the first processing module 110, the second processing module 120, and the third processing module 130 includes: a histogram statistics unit, a histogram processing unit and a brightness mapping unit;
the histogram statistics unit is used for acquiring a histogram of the picture data to be processed based on brightness information of pixels in the picture data to be processed; wherein, the picture data to be processed is a picture frame or a picture block in the picture frame;
the histogram processing unit is used for obtaining the brightness cumulative value of the picture data to be processed based on the histogram of the picture data to be processed;
and the brightness mapping unit is used for fusing the brightness information of the pixels in the fused picture frame based on the brightness summation value of the picture data to be processed so as to output the target RGB of the pixels in the picture data to be processed, and the fused picture frame is separated from the picture frame to which the picture data to be processed belongs by one frame.
In some possible implementations, the histogram statistics unit is further configured to:
gating the first memory and the second memory;
counting the brightness components of the pixels in odd columns in the picture data to be processed through the first memory, and counting the brightness components of the pixels in even columns in the picture data to be processed through the second memory;
And adding the statistical result of the odd-numbered columns of pixels and the statistical result of the even-numbered columns of pixels to obtain the histogram of the picture data to be processed.
In some possible implementations, the histogram statistics unit is further configured to:
and after obtaining the brightness summation value of the picture data to be processed based on the histogram of the picture data to be processed, resetting the first memory and the second memory.
In some possible implementations, the histogram statistics unit is further configured to:
determining a reading address corresponding to each brightness component in the picture data to be processed, wherein a storage space indicated by the reading address is used for storing the occurrence times of the corresponding brightness component;
traversing the pixels in the picture data to be processed; each time the luminance component of the traversed current pixel is counted, the accumulated number of occurrences of that luminance component is obtained based on the read address of the luminance component of the current pixel, and the updated accumulated number is written back to the storage space corresponding to the read address, until all pixels in the picture data to be processed have been traversed.
In some possible implementations, the histogram processing unit is further configured to:
Obtaining the total brightness average value and the maximum brightness component of the picture data to be processed based on the brightness components of the pixels in the picture data to be processed;
obtaining an adjustment threshold of the histogram according to the total brightness average value and the maximum brightness component;
adjusting the histogram of the picture data to be processed based on the adjustment threshold value to obtain an adjusted histogram of the picture data to be processed;
and accumulating the adjusted histogram of the picture data to be processed to obtain the brightness accumulated value of the picture data to be processed.
In some possible implementations, the luminance mapping unit is further configured to:
determining fusion picture data corresponding to the picture data to be processed, wherein a frame interval is reserved between a picture frame to which the fusion picture data belongs and a picture frame to which the picture data to be processed belongs;
and determining a target brightness component of the pixel in the picture data to be processed based on the brightness component of the pixel in the fusion picture data and the brightness summation value of the picture data to be processed.
In some possible implementations, the luminance mapping unit is further configured to:
determining the maximum brightness component of the picture frame to which the picture data to be processed belongs based on the brightness information of the pixels in the picture frame to which the picture data to be processed belongs;
Determining a reading address corresponding to the brightness component of the current pixel in the fusion picture data, and acquiring a target brightness cumulative value stored in a storage space indicated by the reading address;
obtaining a brightness component mapping parameter of pixels in the picture data to be processed based on the target brightness cumulative value, the maximum brightness component and the total number of pixels included in the picture data to be processed;
and obtaining the target brightness component according to the brightness component mapping parameter and the average brightness component of the picture frame to which the fusion picture data belong.
As shown in fig. 10, the first processing module 110 includes therein a histogram statistic unit 111, a histogram processing unit 112, and a luminance mapping unit 113; the second processing module 120 includes a histogram statistic unit 121, a histogram processing unit 122, and a luminance mapping unit 123; the third processing module 130 includes a histogram statistic unit 131, a histogram processing unit 132, and a luminance mapping unit 133.
In some possible implementations, the screen display device in the embodiments of the present application further includes: and an output control module 160.
And an output control module 160, configured to obtain and output the target RGB information of the pixels in the picture data to be processed based on the target luminance component, the blue color difference component, and the red color difference component of the pixels in the picture data to be processed.
In some possible implementations, the output control module 160 includes:
an output control unit 161 for outputting a target luminance component of a pixel in the picture data to be processed;
a color gamut reverse conversion unit 162, configured to obtain the target YCbCr information of the pixel in the picture data to be processed based on the target luminance component, the blue color difference component, and the red color difference component of the pixel; and to inversely convert the target YCbCr information of the pixel from the YCbCr domain into the RGB domain, so as to obtain the target RGB information of the pixel in the picture data to be processed and output it.
In some possible implementations, the gamut conversion unit 141 is further configured to:
acquiring RGB information of pixels and a first color gamut mapping matrix;
performing a matrix operation on the first color gamut mapping matrix and the RGB information of the pixel to obtain the YCbCr information of the pixel, where the YCbCr information includes the luminance component, the blue color difference component, and the red color difference component of the pixel.
In some possible implementations, the color gamut conversion unit 141 is further configured to: for any one of the luminance component, the blue color difference component, and the red color difference component, perform a shift operation on the matrix operation result of that component to obtain the shift operation result of that component;

and truncate the shift operation result to a set number of bits, counted from the low order toward the high order, to obtain the target operation result of that component; a fixed-point sketch of this operation is given below.
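The shift-and-truncate step can be pictured with the fixed-point sketch below, where the color gamut mapping coefficients are pre-scaled by 2^10 and the product is shifted back down and masked to a set bit width; the 10-bit scaling, the BT.601-style coefficients, and the 8-bit truncation are all assumptions used only for illustration.

```python
# Assumed fixed-point first color gamut mapping coefficients: BT.601-style values
# pre-scaled by 2**10 and rounded to integers (306 + 601 + 117 = 1024).
SHIFT = 10
Y_COEF = (306, 601, 117)          # ~ (0.299, 0.587, 0.114) * 1024

def luma_fixed_point(r, g, b, out_bits=8):
    """Luminance component via integer matrix operation, shift, and truncation."""
    acc = Y_COEF[0] * r + Y_COEF[1] * g + Y_COEF[2] * b   # matrix operation result
    shifted = acc >> SHIFT                                 # shift operation result
    mask = (1 << out_bits) - 1
    return shifted & mask          # keep the set number of low-order bits
```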
In some possible implementations, the screen display device further includes: the input control module 150.
In some possible implementations, the input control module 150 is configured to determine, based on the coordinates of a pixel, the picture block to which the pixel belongs, so as to generate a first enable signal of the picture block, where the first enable signal is used to indicate that the picture block is in a state in which the relevant processing operation can be performed.
In some possible implementations, the input control module 150 is further configured to, when receiving a pixel in any one of the frame, generate a second enable signal for the any one of the frame based on the frame identifier of the pixel, where the second enable signal is used to indicate that the any one of the frame is in a state where the relevant processing operation can be performed.
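To illustrate how the input control module might derive the two enable signals, here is a small Python sketch that maps pixel coordinates to a picture block index and raises a per-frame enable from the frame identifier; the block grid size, the frame dimensions, and the boolean representation of the signals are assumptions.

```python
def block_enable(x, y, block_rows=4, block_cols=4, frame_h=1080, frame_w=1920):
    """First enable signal: index of the picture block a pixel belongs to.

    The frame is assumed to be split into a block_rows x block_cols grid; the
    returned index marks which picture block is in a processable state.
    """
    block_x = min(x * block_rows // frame_h, block_rows - 1)
    block_y = min(y * block_cols // frame_w, block_cols - 1)
    return block_x * block_cols + block_y

def frame_enable(pixel_frame_id, enabled_frame_id):
    """Second enable signal: True when the received pixel's frame identifier
    matches the picture frame currently allowed to perform processing."""
    return pixel_frame_id == enabled_frame_id
```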
According to the picture display method provided by the embodiments of the present application, luminance remapping can be performed on the luminance information of the pixels in the picture frame that is one frame apart, based on the luminance cumulative value of the current picture frame, so as to obtain the target luminance components of the pixels. In determining the target luminance component, referring to the luminance cumulative value can enhance the local contrast of the picture, and referring to the real-time luminance information of the pixels in the picture frame one frame apart makes the display effect of the current picture frame more consistent with that of the subsequent picture data. Moreover, after the luminance components of the pixels in a picture frame are obtained, the picture frame can be divided into picture blocks and the subsequent processing operations performed per block; picture segmentation enhances local contrast, such as the contrast of detailed areas of the picture, while avoiding excessive amplification of noise in the background and in flat areas, so that the image quality is better improved.
Fig. 11 is a flow chart illustrating a screen display according to an exemplary embodiment. As shown in fig. 11, the screen display device includes: the color gamut conversion unit, the average value calculation unit, the input control unit, and three processing branches, each of which includes a histogram statistics unit, a mapping function calculation unit (such as the histogram processing unit in the above embodiment), a luminance mapping unit, an output control unit, and a color gamut reverse conversion unit.
In this embodiment, an input end of the color gamut conversion unit receives an input frame, and an output end of the color gamut conversion unit is connected to an input end of the average value calculation unit, so as to perform neighborhood average value calculation on pixels in the frame.
The output end of the average value calculation unit is connected with the input end of the input control unit, and the output end of the input control unit is connected with each of the three processing branches. The input control unit is used for generating an enable signal of the picture frame after the average value calculation of all pixels in the picture frame is completed, so that the picture frame continues to be input to the corresponding processing branch for subsequent operations.
The input ends of the histogram statistics units in the three processing branches are connected with the output end of the input control unit, the output end of each histogram statistics unit is connected with the input end of the corresponding mapping function calculation unit, and the input end of each luminance mapping unit is connected with the output end of the mapping function calculation unit and with the output end of the input control unit respectively. The input control unit generates an enable signal of the fusion picture frame (i.e., the picture frame one frame apart from the current picture frame) after the pixels in the fusion picture frame complete the neighborhood average value calculation, so that the pixels in the fusion picture frame can be input to the luminance mapping unit; the luminance mapping unit can then perform luminance mapping on the currently input pixels of the fusion picture frame using the luminance cumulative values of the current picture frame, thereby obtaining the fused target luminance component for the same pixel position.
Further, the output end of the luminance mapping unit is connected with the input end of the output control unit, the input end of the output control unit is also connected with the output end of the input control unit, and the output end of the output control unit is connected with the color gamut reverse conversion unit. The input control unit may input the enable signal corresponding to each picture frame in the three branches to the output control unit, where the enable signals of different picture frames are input at different times. Under the control of the enable signal, the output control unit may input the target luminance components of the pixels in the enabled picture frame into the color gamut reverse conversion unit for reverse conversion, so as to obtain the target RGB of the pixels in that picture frame.
As shown in fig. 11, two RAMs are included in each processing branch, one RAM for the histogram statistic unit and the other RAM for the map function calculation unit.
Illustratively, the RGB of the pixels in picture frame 1 is input to the color gamut conversion unit at time T to obtain the YCbCr information of the pixels in picture frame 1; the Y component in the YCbCr information is input to the neighborhood mean value calculation unit, which outputs the luminance component and the average luminance component of each pixel, and these are input to the input control unit. After the neighborhood calculation of all pixels of picture frame 1 is completed, the input control unit can generate the valid enable signal Y1_en of picture frame 1, as well as the valid enable signal zone_con of the picture block where a pixel is located. At time T+1 the input control unit feeds picture frame 1 to the histogram statistics unit 1 in processing branch 1 for statistics, and the histogram is cached in RAM1-1. At time T+2 the histogram is input to the mapping function calculation unit 1 to obtain the luminance cumulative values of picture frame 1, which are cached in RAM1-2. At time T+3 the input control unit may input the luminance information of the pixels of picture frame 3 and its enable signal (Y3_en), so that the luminance mapping unit 1 can perform luminance mapping on the luminance information of the pixels in picture frame 3 based on the luminance cumulative values of picture frame 1 stored in RAM1-2, obtaining Y1' for the same pixel in picture frame 3, i.e. the target luminance component of the pixel.

It will be appreciated that after the luminance mapping is completed at time T+3, the histogram of picture frame 1 cached in RAM1-1 can be cleared and used for the histogram cache of picture frame 4.

At time T+3 the output control unit can receive the enable signals of picture frame 1 and picture frame 3 output by the input control unit; under the control of these enable signals it inputs Y1' into the color gamut reverse conversion unit at this time. The Cb and Cr of the pixel can be obtained from the color gamut conversion unit, and the three components are combined to obtain the pixel's YCbCr', i.e. the target YCbCr information of the pixel; a reverse conversion is performed on the pixel's YCbCr' to finally obtain the R'G'B' of the pixels in picture frame 3.
The RGB of the pixels in picture frame 2 is input to the color gamut conversion unit at time T+1 to obtain the YCbCr information of the pixels in picture frame 2; the neighborhood mean value calculation unit outputs the luminance component and the average luminance component of each pixel, which are input to the input control unit. After the neighborhood calculation of all pixels of picture frame 2 is completed, the input control unit can generate the valid enable signal Y2_en of picture frame 2, as well as the valid enable signal zone_con of the picture block where a pixel is located. At time T+2 the input control unit feeds picture frame 2 to the histogram statistics unit 2 in processing branch 2 for statistics, and the histogram is cached in RAM2-1. At time T+3 the histogram is input to the mapping function calculation unit 2 to obtain the luminance cumulative values of picture frame 2, which are cached in RAM2-2. At time T+4 the input control unit may input the luminance information of the pixels of picture frame 4 and its enable signal (Y4_en), so that the luminance mapping unit 2 can perform luminance mapping on the luminance information of the pixels in picture frame 4 based on the luminance cumulative values of picture frame 2 stored in RAM2-2, obtaining Y2' for the same pixel in picture frame 4, i.e. the target luminance component of the pixel. It will be appreciated that after the luminance mapping is completed at time T+4, the histogram cached in RAM2-1 can be cleared and used for the histogram cache of picture frame 5.

At time T+4 the output control unit can receive the enable signals of picture frame 2 and picture frame 4 output by the input control unit; under the control of these enable signals it inputs Y2' into the color gamut reverse conversion unit at this time. The Cb and Cr of the pixel can be obtained from the color gamut conversion unit, and the three components are combined to obtain the pixel's YCbCr', i.e. the target YCbCr information of the pixel; a reverse conversion is performed on the pixel's YCbCr' to finally obtain the R'G'B' of the pixels in picture frame 4.
The RGB of the pixels in picture frame 3 is input to the color gamut conversion unit at time T+2 to obtain the YCbCr information of the pixels in picture frame 3; the neighborhood mean value calculation unit outputs the luminance component and the average luminance component of each pixel, which are input to the input control unit. After the neighborhood calculation of all pixels of picture frame 3 is completed, the input control unit can generate the valid enable signal Y3_en of picture frame 3, as well as the valid enable signal zone_con of the picture block where a pixel is located. At time T+3 the input control unit feeds picture frame 3 to the histogram statistics unit 3 in processing branch 3 for statistics, and the histogram is cached in RAM3-1. At time T+4 the histogram is input to the mapping function calculation unit 3 to obtain the luminance cumulative values of picture frame 3, which are cached in RAM3-2. At time T+5 the input control unit may input the luminance information of the pixels of picture frame 5 and its enable signal (Y5_en), so that the luminance mapping unit 3 can perform luminance mapping on the luminance information of the pixels in picture frame 5 based on the luminance cumulative values of picture frame 3 stored in RAM3-2, obtaining Y3' for the same pixel in picture frame 5, i.e. the target luminance component of the pixel. It will be appreciated that after the luminance mapping is completed at time T+5, the histogram cached in RAM3-1 can be cleared and used for the histogram cache of picture frame 6.

At time T+5 the output control unit can receive the enable signals of picture frame 3 and picture frame 5 output by the input control unit; under the control of these enable signals it inputs Y3' into the color gamut reverse conversion unit at this time. The Cb and Cr of the pixel can be obtained from the color gamut conversion unit, and the three components are combined to obtain the pixel's YCbCr', i.e. the target YCbCr information of the pixel; a reverse conversion is performed on the pixel's YCbCr' to finally obtain the R'G'B' of the pixels in picture frame 5.
The picture frame 4 is input at the time t+3, the picture frame 4 is input to the processing branch 1, the picture frame 4 is processed at the time t+4, the time t+5 and the time t+6 by three units in the processing branch 1, and the luminance information of the pixels in the picture frame 6 is subjected to fusion processing based on the luminance integrated value of the picture frame 4 to output R ' G ' B ' of the pixels in the picture frame 6.
A picture frame 5 is input at time T+4 and fed to processing branch 2; picture frame 5 is processed at times T+5, T+6, and T+7 by the three units in processing branch 2 respectively, and the luminance information of the pixels in picture frame 7 is fused based on the luminance cumulative value of picture frame 5 to output the R'G'B' of the pixels in picture frame 7.
A picture frame 6 is input at time t+5, the picture frame 6 is input to the processing branch 3, the picture frame 6 is processed at time t+6, time t+7 and time t+8 by three units in the processing branch 3, respectively, and the luminance information of the pixels in the picture frame 8 is subjected to fusion processing based on the luminance integrated value of the picture frame 6 to output R ' G ' B ' of the pixels in the picture frame 8.
A picture frame 7 is input at time T+6 and fed to processing branch 1; picture frame 7 is processed at times T+7, T+8, and T+9 by the three units in processing branch 1 respectively, and the luminance information of the pixels in picture frame 9 is fused based on the luminance cumulative value of picture frame 7 to output the R'G'B' of the pixels in picture frame 9.
Similarly, the picture frames input to processing branch 1 may be frames 1, 4, 7, 10, …, 3n-2; the picture frames input to processing branch 2 may be frames 2, 5, 8, 11, …, 3n-1; and the picture frames input to processing branch 3 may be frames 3, 6, 9, 12, …, 3n.
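The round-robin assignment of frames to the three processing branches can be written down directly; the 1-based indexing below mirrors the frame numbering used in this example and is otherwise just an illustrative sketch.

```python
def processing_branch(frame_index):
    """Return which of the three processing branches receives picture frame i (1-based).

    Branch 1 handles frames 1, 4, 7, ..., 3n-2; branch 2 handles 2, 5, 8, ..., 3n-1;
    branch 3 handles 3, 6, 9, ..., 3n.
    """
    return (frame_index - 1) % 3 + 1

def fusion_frame(frame_index):
    """The fusion picture frame is one frame apart from the frame being processed,
    i.e. frame i supplies the luminance cumulative values used to remap frame i + 2."""
    return frame_index + 2
```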
It will be appreciated that, as shown in fig. 11, the enable signals input by the input control unit to the output control unit may be labeled Yi_en, Yi+1_en, and Yi+2_en. It should be noted that, when the input control unit inputs the enable signal of each picture frame to the output control unit, it also inputs the control signal of the luminance mapping unit (contrast enhancement start, ce-st) to the output control unit; the luminance mapping unit can be put into the operating state by ce-st.
For example, it may be assumed that picture frame 1 acquires its luminance information at time T=0 and is then input to the histogram statistics unit 1 of processing branch 1 at time T=1; the states of the two RAMs in each of the three processing branches from time T=1 to time T=9 are shown in Table 1.
As shown in fig. 11, when a picture frame is input to the picture display device, a vertical synchronization signal (Vertical Synchronization, vs) and a data enable signal (de) are also input in synchronization, wherein the Vs may be used as a frame reset signal. de indicates that the inputted picture data is valid at the high level.
Fig. 12 is a block diagram of an electronic device, according to an example embodiment. As shown in fig. 12, the display apparatus 1200 includes the screen display device described in the above embodiments. A display apparatus, which may also be called a display or simply a screen, is a device that can output images or tactile information, such as a braille display designed for the blind. The display apparatus may include a plasma display panel (PDP), a liquid crystal display (LCD), a thin film transistor (TFT) liquid crystal display, an organic light-emitting diode display (OLED), a surface-conduction electron-emitter display (SED), an (experimental) laser video display, a carbon nanotube display, a quantum dot display, an interferometric modulator display (IMOD), and the like. The embodiments of the present application are not particularly limited in this respect.
There is also provided, according to an embodiment of the present application, a display apparatus including: a processor; a memory for storing the processor-executable instructions, wherein the processor is configured to execute the instructions to implement the picture display method as described above.
In order to implement the above embodiment, the present application also proposes a storage medium.
Wherein the instructions in the storage medium, when executed by the processor of the electronic device, enable the electronic device to perform the picture display method as described above.
To achieve the above embodiments, the present application also provides a computer program product.
Wherein the computer program product, when executed by a processor of the electronic device, enables the electronic device to perform the picture display method as described above.
Fig. 13 is a block diagram of an electronic device, according to an example embodiment. The electronic device shown in fig. 13 is only an example, and should not impose any limitation on the functionality and scope of use of the embodiments of the present application.
As shown in fig. 13, the electronic device 1300 includes a processor 1301 that can perform various appropriate actions and processes according to a program stored in a Read Only Memory (ROM) 1302 or a program loaded from a Memory 1306 into a random access Memory (RAM, random Access Memory) 1303. In the RAM 1303, various programs and data necessary for the operation of the electronic apparatus 1300 are also stored. The processor 1301, the ROM 1302, and the RAM 1303 are connected to each other through a bus 1304. An Input/Output (I/O) interface 1305 is also connected to bus 1304.
The following components are connected to the I/O interface 1305: a memory 1306 including a hard disk and the like; and a communication section 1307 including a network interface card such as a LAN (local area network ) card, a modem, or the like, the communication section 1307 performing communication processing via a network such as the internet; the drive 1308 is also connected to the I/O interface 1305 as needed.
In particular, according to embodiments of the present application, the processes described above with reference to flowcharts may be implemented as computer software programs. For example, embodiments of the present application include a computer program embodied on a computer readable medium, the computer program containing program code for performing the methods shown in the flowcharts. In such an embodiment, the computer program can be downloaded and installed from the network through the communication portion 1307. The above-described functions defined in the methods of the present application are performed when the computer program is executed by the processor 1301.
In an exemplary embodiment, a storage medium is also provided, such as a memory, including instructions executable by the processor 1301 of the electronic apparatus 1300 to perform the above-described method. Alternatively, the storage medium may be a non-transitory computer readable storage medium, which may be, for example, ROM, random Access Memory (RAM), CD-ROM, magnetic tape, floppy disk, optical data storage device, and the like.
In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present application, however, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
In this application, the electronic device 1300 also includes a display 1309 for displaying picture frames. The display 1309 may be PDP, LCD, TFT, OLED or the like. The embodiments of the present application are not particularly limited.
Fig. 14 is a block diagram illustrating a structure of a display chip according to an exemplary embodiment. The electronic device shown in fig. 14 is only an example, and should not impose any limitation on the functions and scope of use of the embodiments of the present application. As shown in fig. 14, the electronic device 1400 includes a processor 1401 and a memory 1402. Wherein the memory 1402 is used for storing program codes, and the processor 1401 is connected with the memory 1402 and used for reading the program codes from the memory 1402 to realize the screen display method in the above embodiment.
Alternatively, the number of processors 1401 may be one or more.
Optionally, the electronic device may further comprise an interface 1403, the number of the interface 1403 may be a plurality. The interface 1403 may be connected to an application program, and may receive data of an external device such as a sensor, or the like.
Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. This application is intended to cover any variations, uses, or adaptations of the invention following, in general, the principles of the invention and including such departures from the present disclosure as come within known or customary practice within the art to which the invention pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the application being indicated by the following claims.
It is to be understood that the present application is not limited to the precise arrangements and instrumentalities shown in the drawings, which have been described above, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the application is limited only by the appended claims.

Claims (32)

1. A picture display method, comprising:
acquiring a histogram of a picture frame i according to brightness information of pixels in the picture frame i at a time T+1, obtaining a brightness cumulative value of the picture frame i based on the histogram of the picture frame i at a time T+2, and outputting target RGB of the pixels in the picture frame i+2 based on the brightness cumulative value of the picture frame i and the brightness information of the pixels in the picture frame i+2 at a time T+3, wherein i and T are natural numbers meeting a set rule from 1;
acquiring a histogram of the picture frame i+1 based on the brightness information of the pixels in the picture frame i+1 at the time T+2, acquiring a brightness cumulative value of the picture frame i+1 based on the histogram of the picture frame i+1 at the time T+3, and outputting a target RGB of the pixels in the picture frame i+3 based on the brightness cumulative value of the picture frame i+1 and the brightness information of the pixels in the picture frame i+3 at the time T+4;
Acquiring a histogram of the picture frame i+2 based on the brightness information of the pixels in the picture frame i+2 at the time T+3, acquiring a brightness cumulative value of the picture frame i+2 based on the histogram of the picture frame i+2 at the time T+4, and outputting target RGB of the pixels in the picture frame i+4 based on the brightness cumulative value of the picture frame i+2 and the brightness information of the pixels in the picture frame i+4 at the time T+5.
2. The method according to claim 1, wherein the method further comprises:
at the time T, acquiring brightness information of pixels in the picture frame i;
at the time T+1, acquiring brightness information of pixels in the picture frame i+1;
at the time T+2, acquiring brightness information of pixels in the picture frame i+2;
wherein the luminance information includes a luminance component and an average luminance component.
3. The method of claim 2, wherein the acquiring the luminance information of the pixels in any one of the frames of the picture comprises:
for pixel P in any picture frame xy Converting from RGB domain to YCbCr domain to obtain the pixel P xy Wherein the pixel P xy Representing an x-th row and a y-th column of pixels, wherein x is an integer greater than or equal to 2, and y is an integer greater than or equal to 1;
Based on the instituteThe pixel P xy YCbCr information of (a) to obtain the pixel P xy Is a luminance component of (1);
acquiring the pixel P xy And based on the pixel P xy And the luminance component of each adjacent pixel to obtain the P xy Is included in the image data.
4. The method according to claim 3, wherein obtaining the average luminance component of the pixel P_xy based on the luminance component of the pixel P_xy and the luminance components of the adjacent pixels comprises:

in the process of acquiring the luminance information of the pixels on the (x+1)-th row one by one, if the current pixel on the (x+1)-th row is an adjacent pixel of the pixel P_xy, acquiring from the first buffer queue the luminance information of the adjacent pixels of the pixel P_xy belonging to the (x-1)-th row, and acquiring from the second buffer queue the luminance information of the adjacent pixels of the pixel P_xy belonging to the x-th row;

averaging the luminance component of the pixel P_xy with the luminance components of the adjacent pixels to obtain the average luminance component of the pixel P_xy.
5. The method according to claim 4, wherein the method further comprises:
caching the brightness information of the pixels on the x+1th row;
When the average brightness component of the last pixel on the x line is obtained, the brightness information of the pixels on the x-1 line in the first cache queue is cleared, and the brightness information of the pixels on the x+1 line is cached in the first cache queue; or, the brightness information of the pixels on the x line cached in the second cache queue is cached in the first cache queue, and the brightness information of the pixels on the x+1 line is cached in the first cache queue.
6. The method according to any one of claims 3-5, wherein obtaining the average luminance component of the pixel P_xy based on the luminance component of the pixel P_xy and the luminance components of the adjacent pixels comprises:

for a pixel P_xy located on an edge row or an edge column, determining the existing adjacent pixels and the missing adjacent pixels of P_xy;

acquiring the luminance information of the existing adjacent pixels, and completing the luminance information of the missing adjacent pixels;

obtaining the average luminance component of P_xy based on the luminance component of the pixel P_xy and the luminance components of the adjacent pixels.
7. The method according to claim 1, wherein the method further comprises:
After brightness information of pixels in any picture frame is obtained, carrying out picture segmentation on the any picture frame;
for any one of different picture blocks, acquiring a histogram of the picture block based on brightness information of pixels in the picture block, acquiring a brightness cumulative value of the picture block based on the histogram of the picture block, and outputting target RGB of the pixels in the picture block based on the brightness cumulative value of the picture block and the brightness information of the pixels in the same picture block in a fusion picture frame;
the processing time of the picture block in the different subsequent processing stages is the same as that of the picture frame to which the picture block belongs, and the fusion picture frame is separated from the picture frame to which the picture block belongs by one frame.
8. The method according to claim 1 or 7, wherein the process of acquiring the histogram for any one of the picture data to be processed in the picture frame and the picture block comprises:
gating the first memory and the second memory;
counting the brightness components of the pixels in odd columns in the picture data to be processed through the first memory, and counting the brightness components of the pixels in even columns in the picture data to be processed through the second memory;
and adding the statistical result of the odd-numbered columns of pixels and the statistical result of the even-numbered columns of pixels to obtain the histogram of the picture data to be processed.
9. The method of claim 8, wherein the method further comprises:
after obtaining the brightness cumulative value of the picture data to be processed based on the histogram of the picture data to be processed, resetting the first memory and the second memory.
10. The method of claim 8, wherein the method further comprises:
determining a reading address corresponding to each luminance component in the picture data to be processed, wherein the storage space indicated by the reading address is used for storing the number of occurrences of the corresponding luminance component;
traversing the pixels in the picture data to be processed; when the luminance component of the current traversed pixel is counted, acquiring the accumulated occurrence count of that luminance component based on the reading address of the luminance component of the current pixel, and updating and storing the accumulated count back into the storage space corresponding to the reading address, until all pixels in the picture data to be processed have been traversed.
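Claim 10 reads as a read-modify-write counter update: the luminance component of the traversed pixel selects a read address, the stored count is fetched, incremented and written back. A minimal software analogue (the names are illustrative):

    def accumulate_histogram(luminance_stream, bins=256):
        memory = [0] * bins                     # one storage space per reading address
        for luminance in luminance_stream:      # traverse the pixels of the picture data to be processed
            read_address = luminance            # reading address determined by the luminance component
            count = memory[read_address]        # accumulated occurrence count fetched from storage
            memory[read_address] = count + 1    # updated count stored back to the same address
        return memory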
11. The method according to claim 1 or 7, wherein, for picture data to be processed that is either a picture frame or a picture block in a picture frame, the determination of the brightness cumulative value comprises:
obtaining the total brightness average value and the maximum brightness component of the picture data to be processed based on the brightness components of the pixels in the picture data to be processed;
obtaining an adjustment threshold of the histogram according to the total brightness average value and the maximum brightness component;
adjusting the histogram of the picture data to be processed based on the adjustment threshold value to obtain an adjusted histogram of the picture data to be processed;
and accumulating the adjusted histogram of the picture data to be processed to obtain the brightness cumulative value of the picture data to be processed.
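Claim 11 does not give the formula that turns the total brightness average and the maximum brightness component into an adjustment threshold; the ratio used in the sketch below is an assumption chosen only to show the clip-then-accumulate flow:

    import numpy as np

    def brightness_cumulative_value(hist, luma):
        # hist: histogram of the picture data to be processed; luma: its luminance components
        total_mean = float(np.mean(luma))
        max_component = float(np.max(luma))
        # assumed threshold: average bin height scaled by the mean-to-maximum luminance ratio
        adjust_threshold = hist.sum() / hist.size * (total_mean / (max_component + 1.0))
        adjusted_hist = np.minimum(hist, adjust_threshold)   # adjusted histogram, clipped at the threshold
        return np.cumsum(adjusted_hist)                      # brightness cumulative value per luminance bin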
12. The method according to claim 1 or 7, wherein, for picture data to be processed that is either a picture frame or a picture block in a picture frame, the process of outputting the target RGB of the pixels comprises:
determining fusion picture data corresponding to the picture data to be processed, wherein the picture frame to which the fusion picture data belongs is separated by one frame from the picture frame to which the picture data to be processed belongs;
determining a target brightness component of the pixels in the picture data to be processed based on the brightness component of the pixels in the fusion picture data and the brightness cumulative value of the picture data to be processed;
and obtaining and outputting the target RGB information of the pixels in the picture data to be processed based on the target brightness component, the blue color difference component and the red color difference component of the pixels in the picture data to be processed.
13. The method according to claim 12, wherein obtaining and outputting the target RGB information of the pixels in the picture data to be processed based on the target brightness component, the blue color difference component and the red color difference component of the pixels in the picture data to be processed comprises:
obtaining target YCbCr information of the pixels in the picture data to be processed based on the target brightness component, the blue color difference component and the red color difference component of the pixels in the picture data to be processed;
inversely converting the target YCbCr information of the pixels in the picture data to be processed from the YCbCr domain to the RGB domain to obtain and output the target RGB information of the pixels in the picture data to be processed.
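The inverse gamut conversion of claim 13, sketched with the widely used BT.601 full-range coefficients; the claims do not name a particular conversion matrix, so these coefficients are an assumption:

    def ycbcr_to_rgb(y, cb, cr):
        # y, cb, cr: target luminance, blue color difference and red color difference components (0..255)
        r = y + 1.402 * (cr - 128)
        g = y - 0.344136 * (cb - 128) - 0.714136 * (cr - 128)
        b = y + 1.772 * (cb - 128)
        clamp = lambda v: max(0, min(255, int(round(v))))
        return clamp(r), clamp(g), clamp(b)     # target RGB information of the pixel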
14. The method of claim 12, wherein determining the target brightness component of the pixels in the picture data to be processed based on the brightness component of the pixels in the fusion picture data and the brightness cumulative value of the picture data to be processed comprises:
Determining the maximum brightness component of the picture frame to which the picture data to be processed belongs based on the brightness information of the pixels in the picture frame to which the picture data to be processed belongs;
determining a reading address corresponding to the brightness component of the current pixel in the fusion picture data, and acquiring a target brightness cumulative value stored in a storage space indicated by the reading address;
obtaining a brightness component mapping parameter of pixels in the picture data to be processed based on the target brightness cumulative value, the maximum brightness component and the total number of pixels included in the picture data to be processed;
and obtaining the target brightness component according to the brightness component mapping parameter and the average brightness component of the picture frame to which the fusion picture data belongs.
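Claim 14 leaves open exactly how the mapping parameter and the average brightness of the fusion frame are combined; the blend below is a placeholder assumption, with hypothetical names for the inputs:

    def map_target_luminance(cumulative_hist, fused_luminance, max_component, pixel_count, fused_frame_mean):
        # cumulative_hist: brightness cumulative values indexed by luminance component
        target_cumulative = cumulative_hist[fused_luminance]              # looked up at the fused pixel's luminance
        mapping_param = target_cumulative * max_component / pixel_count   # luminance component mapping parameter
        # assumed combination of the mapping parameter with the fusion frame's average brightness
        return 0.5 * mapping_param + 0.5 * fused_frame_mean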
15. The method of claim 12, wherein if the picture data to be processed is a picture block, the fusion picture data is the picture block at the same position in the fusion picture frame.
16. A method according to claim 3, characterized in that the method further comprises:
acquiring the RGB information of the pixel P_xy and a first gamut mapping matrix;
performing a matrix operation on the first gamut mapping matrix and the RGB information of the pixel P_xy to obtain the YCbCr information of the pixel P_xy, wherein the YCbCr information includes the luminance component, the blue color difference component and the red color difference component of the pixel P_xy.
17. The method according to claim 16, wherein performing the matrix operation on the first gamut mapping matrix and the RGB information of the pixel P_xy to obtain the YCbCr information of the pixel P_xy comprises:
for any one of the luminance component, the blue color difference component and the red color difference component, performing a shift operation on the matrix operation result of that component to obtain a shift operation result of that component;
and truncating the shift operation result to a set number of bits, counted from the low order to the high order, to obtain the target operation result of that component.
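Claims 16 and 17 describe a fixed-point matrix conversion followed by a shift and a low-order truncation. The sketch below uses the common BT.601 integer coefficients scaled by 256, an 8-bit shift and an 8-bit truncation width; all three choices are assumptions made only for illustration:

    GAMUT_MAPPING_MATRIX = (
        ( 66, 129,  25),    # row producing the luminance component
        (-38, -74, 112),    # row producing the blue color difference component
        (112, -94, -18),    # row producing the red color difference component
    )
    OFFSETS = (16, 128, 128)
    SHIFT_BITS, OUTPUT_BITS = 8, 8

    def rgb_to_ycbcr_fixed(r, g, b):
        components = []
        for row, offset in zip(GAMUT_MAPPING_MATRIX, OFFSETS):
            matrix_result = row[0] * r + row[1] * g + row[2] * b   # matrix operation result
            shifted = matrix_result >> SHIFT_BITS                  # shift operation result
            # keep the set number of bits, counted from the low order upward
            truncated = (shifted + offset) & ((1 << OUTPUT_BITS) - 1)
            components.append(truncated)
        return tuple(components)    # (Y, Cb, Cr) target operation results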
18. The method of claim 7, wherein the method further comprises:
and determining the picture block to which a pixel belongs based on the coordinates of the pixel, so as to generate a first enabling signal of the picture block, wherein the first enabling signal is used for indicating that the picture block is in a state in which related processing operations can be performed.
19. The method according to claim 1 or 7, characterized in that the method further comprises:
and when a pixel in any picture frame is received, generating a second enabling signal of that picture frame based on the frame identification of the pixel, wherein the second enabling signal is used for indicating that the picture frame is in a state in which related processing operations can be performed.
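Claims 18 and 19 tie enabling signals to incoming pixels. A toy sketch of that bookkeeping, with an assumed regular block grid and dictionary-based flags standing in for the hardware signals:

    def raise_enabling_signals(x, y, frame_id, block_height, block_width, block_enable, frame_enable):
        # block_enable / frame_enable: dictionaries of boolean flags keyed by block index and frame identification
        block_index = (x // block_height, y // block_width)   # picture block the pixel belongs to
        block_enable[block_index] = True    # first enabling signal: the block may be processed
        frame_enable[frame_id] = True       # second enabling signal: the frame may be processed
        return block_index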
20. A picture display device, comprising:
a first processing module, configured to acquire, at a time T+1, a histogram of a picture frame i based on the brightness information of the pixels in the picture frame i, acquire, at a time T+2, a brightness cumulative value of the picture frame i based on the histogram of the picture frame i, and output, at a time T+3, the target RGB of the pixels in a picture frame i+2 based on the brightness cumulative value of the picture frame i and the brightness information of the pixels in the picture frame i+2, where i and T are natural numbers starting from 1 and satisfying a set rule;
a second processing module, configured to acquire, at the time T+2, a histogram of a picture frame i+1 based on the brightness information of the pixels in the picture frame i+1, acquire, at the time T+3, a brightness cumulative value of the picture frame i+1 based on the histogram of the picture frame i+1, and output, at a time T+4, the target RGB of the pixels in a picture frame i+3 based on the brightness cumulative value of the picture frame i+1 and the brightness information of the pixels in the picture frame i+3;
a third processing module, configured to acquire, at the time T+3, a histogram of the picture frame i+2 based on the brightness information of the pixels in the picture frame i+2, acquire, at the time T+4, a brightness cumulative value of the picture frame i+2 based on the histogram of the picture frame i+2, and output, at a time T+5, the target RGB of the pixels in a picture frame i+4 based on the brightness cumulative value of the picture frame i+2 and the brightness information of the pixels in the picture frame i+4.
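The three processing modules of claim 20 work on consecutive frames with a one-slot stagger. The helper below only tabulates that timing (it performs no image processing), to make the overlap explicit:

    def stage_schedule(i=1):
        # returns, per processing module, which frame is handled at which time slot
        schedule = {}
        for m in range(3):                  # first, second and third processing modules
            frame = i + m
            schedule[f"module {m + 1}"] = {
                f"T+{m + 1}": f"histogram of frame {frame}",
                f"T+{m + 2}": f"brightness cumulative value of frame {frame}",
                f"T+{m + 3}": f"target RGB of frame {frame + 2} using frame {frame}",
            }
        return schedule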
21. The apparatus of claim 20, wherein the apparatus further comprises:
a brightness extraction module, configured to acquire, at a time T, the brightness information of the pixels in the picture frame i; acquire, at the time T+1, the brightness information of the pixels in the picture frame i+1; and acquire, at the time T+2, the brightness information of the pixels in the picture frame i+2; wherein the brightness information includes a luminance component and an average luminance component.
22. The apparatus of claim 21, wherein the brightness extraction module comprises:
a color gamut conversion unit, configured to convert a pixel P_xy in any picture frame from the RGB domain to the YCbCr domain to obtain the YCbCr information of the pixel P_xy, and to obtain the luminance component of the pixel P_xy based on the YCbCr information of the pixel P_xy; wherein the pixel P_xy represents the pixel in the x-th row and the y-th column, x is an integer greater than or equal to 2, and y is an integer greater than or equal to 1;
a neighborhood mean value calculation unit, configured to acquire the luminance component of each adjacent pixel of the pixel P_xy, and to obtain the average luminance component of the pixel P_xy based on the luminance component of the pixel P_xy and the luminance component of each adjacent pixel.
23. The apparatus of claim 20, wherein any processing module is further configured to:
after the brightness information of the pixels in a picture frame is obtained, perform picture segmentation on the picture frame;
for any one of the resulting picture blocks, acquire a histogram of the picture block based on the brightness information of the pixels in the picture block, acquire a brightness cumulative value of the picture block based on the histogram of the picture block, and output the target RGB of the pixels in the picture block based on the brightness cumulative value of the picture block and the brightness information of the pixels in the same picture block of a fusion picture frame;
wherein the picture block and the picture frame to which it belongs have the same processing time in the different subsequent processing stages, and the fusion picture frame is separated by one frame from the picture frame to which the picture block belongs.
24. The apparatus of claim 20 or 23, wherein either processing module comprises:
a histogram statistics unit, configured to acquire a histogram of the picture data to be processed based on the brightness information of the pixels in the picture data to be processed, wherein the picture data to be processed is a picture frame or a picture block in a picture frame;
a histogram processing unit, configured to obtain a brightness cumulative value of the picture data to be processed based on the histogram of the picture data to be processed;
and a brightness mapping unit, configured to fuse the brightness information of the pixels in the fusion picture frame based on the brightness cumulative value of the picture data to be processed so as to output the target RGB of the pixels in the picture data to be processed, wherein the fusion picture frame is separated by one frame from the picture frame to which the picture data to be processed belongs.
25. The apparatus of claim 24, wherein the apparatus further comprises:
and an output control module, configured to obtain and output the target RGB information of the pixels in the picture data to be processed based on the target brightness component, the blue color difference component and the red color difference component of the pixels in the picture data to be processed.
26. The apparatus of claim 25, wherein the output control module comprises:
an output control unit for outputting a target luminance component of a pixel in the picture data to be processed;
a color gamut inverse conversion unit, configured to obtain the target YCbCr information of the pixels in the picture data to be processed based on the target brightness component, the blue color difference component and the red color difference component of the pixels in the picture data to be processed, and to inversely convert the target YCbCr information of the pixels in the picture data to be processed from the YCbCr domain to the RGB domain to obtain and output the target RGB information of the pixels in the picture data to be processed.
27. The apparatus of claim 24, wherein the apparatus further comprises:
and an input control module, configured to determine the picture block to which a pixel belongs based on the coordinates of the pixel, so as to generate a first enabling signal of the picture block, wherein the first enabling signal is used for indicating that the picture block is in a state in which related processing operations can be performed.
28. The apparatus of claim 27, wherein the input control module is further configured to:
and when receiving the pixel in any picture frame, generating a second enabling signal of the any picture frame based on the frame identification of the pixel, wherein the second enabling signal is used for indicating that the any picture frame is in a state capable of performing related processing operation.
29. A display device, characterized by comprising: the device of any one of claims 20 to 28.
30. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the method of any one of claims 1 to 19.
31. A non-transitory computer readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any one of claims 1 to 19.
32. A computer program product comprising a computer program which, when executed by a processor, implements the method of any one of claims 1 to 19.
CN202310429517.1A 2023-04-20 2023-04-20 Picture display method and device Pending CN116434722A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310429517.1A CN116434722A (en) 2023-04-20 2023-04-20 Picture display method and device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310429517.1A CN116434722A (en) 2023-04-20 2023-04-20 Picture display method and device

Publications (1)

Publication Number Publication Date
CN116434722A true CN116434722A (en) 2023-07-14

Family

ID=87094188

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310429517.1A Pending CN116434722A (en) 2023-04-20 2023-04-20 Picture display method and device

Country Status (1)

Country Link
CN (1) CN116434722A (en)


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination