CN113538271A - Image display method, image display device, electronic equipment and computer readable storage medium - Google Patents


Info

Publication number
CN113538271A
CN113538271A (application CN202110786222.0A)
Authority
CN
China
Prior art keywords
gray scale
image
gray
pixels
transition region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110786222.0A
Other languages
Chinese (zh)
Inventor
郑超 (Zheng Chao)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110786222.0A
Publication of CN113538271A
Legal status: Pending


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/73: Deblurring; Sharpening
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/50: Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/90: Dynamic range modification of images or parts thereof
    • G06T 5/94: Dynamic range modification of images or parts thereof based on local image properties, e.g. for local contrast enhancement
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20212: Image combination
    • G06T 2207/20221: Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the application relate to an image display method, an image display apparatus, an electronic device and a computer-readable storage medium. The image display method comprises: acquiring an image to be processed with a first color depth; determining a gray-scale transition region in the image to be processed, wherein the difference between the gray-scale values of at least two adjacent pixels in the region is greater than a first threshold; and performing interpolation processing on the gray-scale values of a plurality of pixels in the gray-scale transition region to obtain a display image with a second color depth, the second color depth being greater than the first color depth. In the embodiments of the application, the gray-scale transition region can be understood as a region where the gray scale changes abruptly. By determining this region and interpolating the gray-scale values of its pixels, transition gray scales can be added, which effectively improves the color-transition effect of the image and thus the clarity of the displayed image.

Description

Image display method, image display device, electronic equipment and computer readable storage medium
Technical Field
The embodiment of the application relates to the technical field of image display, in particular to an image display method, an image display device, electronic equipment and a computer-readable storage medium.
Background
In recent years, as electronic devices have been continuously upgraded, they have come to support increasingly clear and rich image display. However, because conventional video sources have low image quality, even advanced electronic devices cannot produce a high-quality display from them. In particular, in terms of color display, the color-transition effect of conventional video sources is poor, which leads to blurred image display.
Disclosure of Invention
The embodiment of the application provides an image display method, an image display device, electronic equipment and a computer-readable storage medium, which can optimize the transition effect of image colors, thereby improving the definition of image display.
A method of displaying an image, the method comprising:
acquiring an image to be processed with a first color depth;
determining a gray scale transition region in the image to be processed, wherein the difference value between the gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold value;
and respectively carrying out interpolation processing on the gray scale values of a plurality of pixels in the gray scale transition region to obtain a display image with a second color depth, wherein the second color depth is greater than the first color depth.
An image display apparatus comprising:
the image acquisition module is used for acquiring an image to be processed with a first color depth;
the area determination module is used for determining a gray scale transition area in the image to be processed, and the difference value between the gray scale values of at least two adjacent pixels in the gray scale transition area is greater than a first threshold value;
and the interpolation processing module is used for respectively carrying out interpolation processing on the gray scale values of the pixels in the gray scale transition region so as to obtain a display image with a second color depth, wherein the second color depth is greater than the first color depth.
An electronic device comprises a memory and a processor, wherein the memory stores a computer program, and the computer program, when executed by the processor, causes the processor to execute the steps of the image display method.
A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image display method as described above.
With the image display method, the image display apparatus, the electronic device and the computer-readable storage medium, the following steps are performed: acquiring an image to be processed with a first color depth; determining a gray-scale transition region in the image to be processed, wherein the difference between the gray-scale values of at least two adjacent pixels in the region is greater than a first threshold; and performing interpolation processing on the gray-scale values of a plurality of pixels in the gray-scale transition region to obtain a display image with a second color depth, the second color depth being greater than the first color depth. In the embodiments of the application, the gray-scale transition region can be understood as a region where the gray scale changes abruptly. By determining this region and interpolating the gray-scale values of its pixels, transition gray scales can be added, which effectively improves the color-transition effect of the image and thus the clarity of the displayed image.
Drawings
To more clearly illustrate the technical solutions in the embodiments of the present application or in the related art, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a diagram of an application environment of an image display method according to an embodiment;
FIG. 2 is a flowchart illustrating an image displaying method according to an embodiment;
FIG. 3 is a diagram illustrating an embodiment of an image to be processed;
FIG. 4 is a diagram illustrating an embodiment of displaying an image;
FIG. 5 is a sub-flowchart illustrating the steps of interpolating gray-scale values of a plurality of pixels in the gray-scale transition region, respectively, according to one embodiment;
FIG. 6 is a sub-flowchart illustrating an embodiment of interpolating gray-scale values of a plurality of pixels in the gray-scale transition region according to the number of newly added gray-scales;
FIG. 7 is a sub-flowchart illustrating the steps of interpolating gray-scale values of a plurality of pixels in the gray-scale transition region according to the number of newly added gray-scales according to an embodiment;
FIG. 8 is a diagram illustrating a first layer according to an embodiment;
FIG. 9 is a diagram illustrating a second layer according to an embodiment;
FIG. 10 is a second flowchart of an image displaying method according to an embodiment;
FIG. 11 is a third flowchart illustrating an image displaying method according to an embodiment;
FIG. 12 is a diagram illustrating a first layer after inserting a newly added pixel according to an embodiment;
FIG. 13 is a diagram illustrating a second layer after inserting a new pixel according to an embodiment;
FIG. 14 is a diagram illustrating a first image layer after filtering according to an embodiment;
FIG. 15 is a schematic diagram of a fused display image according to an embodiment;
FIG. 16 is a block diagram showing the structure of an image display device according to an embodiment;
fig. 17 is a schematic diagram of an internal structure of an electronic device in one embodiment.
Detailed Description
To facilitate an understanding of the embodiments of the present application, the embodiments of the present application will be described more fully below with reference to the accompanying drawings. Preferred embodiments of the present application are shown in the drawings. The embodiments of the present application may, however, be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the embodiments of this application belong. The terminology used herein in the description of the embodiments of the present application is for the purpose of describing particular embodiments only and is not intended to be limiting of the embodiments of the present application. As used herein, the term "and/or" includes any and all combinations of one or more of the associated listed items.
It will be understood that the terms "first", "second" and the like may be used herein to describe various elements, but these elements are not limited by these terms. These terms are only used to distinguish one element from another. For example, a first layer may be referred to as a second layer, and similarly, a second layer may be referred to as a first layer, without departing from the scope of the present application. Both the first layer and the second layer are layers, but they are not the same layer.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In the description of the present application, "plurality" means at least two, e.g., two, three, etc., unless specifically limited otherwise. In the description of the present application, "a number" means at least one, such as one, two, etc., unless specifically limited otherwise.
Fig. 1 is an application environment diagram of an image display method according to an embodiment. As shown in Fig. 1, the application environment includes a terminal 110 and a server 120. The terminal 110 obtains an image to be processed; the terminal 110 may be a terminal device such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a vehicle-mounted computer or a wearable device. The terminal 110 may download various types of image resources from the server 120 to use as images to be processed. The server 120 may be a single server or a server cluster. In the embodiments of the present application, the image display method may be applied to the terminal 110, and the terminal 110 directly executes the image display method of each embodiment to generate a display image with high color depth and natural color gradation.
Fig. 2 is a flowchart of an image display method according to an embodiment, and referring to fig. 2, the method includes steps 202 to 206.
Step 202, acquiring an image to be processed with a first color depth.
The color depth is used to represent the number of gray scales an image can express: the greater the color depth, the more colors the image can represent. For example, a 1-bit image can express 2^1 = 2 gray scales, namely the two colors black and white. A 2-bit image can express 2^2 = 4 gray scales, that is, two intermediate gray tones in addition to black and white. Fig. 3 is a schematic diagram of an image to be processed according to an embodiment; the image shown in Fig. 3 contains 4 different gray-scale values, which can be defined, from low to high brightness, as gray level 0, gray level 1, gray level 2 and gray level 3. Referring to Fig. 3, the abrupt change between gray level 0 and gray level 1 is the most obvious, and it greatly affects the user's viewing experience.
It should be noted that the image to be processed may be any image. For example, it may be an image captured by a camera on the terminal. Alternatively, in a possible implementation, the image to be processed may be an image or screenshot acquired in real time from a network rather than captured by the terminal's camera; this embodiment imposes no limitation here.
Step 204, determining a gray scale transition region in the image to be processed, wherein a difference value between gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold value.
The gray level transition region can be understood as a region where gray level abrupt change pixels exist. It is understood that the gray level transition region may be a regular region, such as a rectangular region, or an irregular region, which may be determined according to the image recognition result.
For example, the image may be analyzed with an edge-detection algorithm and the gray-scale transition region extracted from the detected edges. For instance, edge pixels in the image to be processed can be extracted via edge features, and the detected edge pixels then connected to form the gray-scale transition region. As another example, edge detection may be performed by examining the pixel values or brightness values of adjacent pixels row by row and column by column, marking pixels whose values change sharply as edge pixels, and connecting those edge pixels to form the gray-scale transition region. As a further example, the gray-scale transition region may also be determined by applying, but not limited to, a Roberts, Sobel or Laplacian edge operator to the image. It should be understood that the above examples are for illustration only and do not limit the protection scope of this embodiment.
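The row-by-row, column-by-column neighbour comparison described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function name, the set-based region representation and the sample image are all assumptions for the example.

```python
# Illustrative sketch: a pixel is marked as a transition (edge) pixel when
# its gray-scale difference to a right or bottom neighbour exceeds the
# first threshold, as in the row/column scan described above.

def find_transition_pixels(gray, threshold):
    """Return the set of (row, col) pixels whose gray-scale value differs
    from a right or bottom neighbour by more than threshold."""
    rows, cols = len(gray), len(gray[0])
    transition = set()
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((0, 1), (1, 0)):  # right and bottom neighbours
                nr, nc = r + dr, c + dc
                if nr < rows and nc < cols:
                    if abs(gray[r][c] - gray[nr][nc]) > threshold:
                        transition.add((r, c))
                        transition.add((nr, nc))
    return transition

# A 4x4 image with an abrupt jump between gray levels 0 and 3:
image = [
    [0, 0, 3, 3],
    [0, 0, 3, 3],
    [0, 0, 3, 3],
    [0, 0, 3, 3],
]
edges = find_transition_pixels(image, threshold=1)
# Only the two columns on either side of the jump are marked.
```

In a full implementation the marked pixels would then be connected into a (possibly irregular) region, as the description notes.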
And step 206, respectively performing interpolation processing on the gray scale values of the plurality of pixels in the gray scale transition region to obtain a display image with a second color depth, wherein the second color depth is greater than the first color depth.
Fig. 4 is a schematic diagram of a display image according to an embodiment; the display image shown in Fig. 4 includes 5 different gray-scale values. Comparing Fig. 3 and Fig. 4: by performing interpolation on the gray-scale values of the plurality of pixels, a transition gray level is inserted between gray level 0 and gray level 1 of the embodiment of Fig. 3, so that the gray-scale transition effect in the image is effectively improved and the abrupt boundary in the image appears weakened to the viewer, thereby improving the user's viewing experience.
In particular, the second color depth may be determined by the hardware capabilities of the terminal, e.g., jointly by the display capability of the terminal's display panel, the data processing capability of its processor, and the like. Illustratively, the first color depth may be 8 bits; in terms of color transition, an 8-bit image to be processed exhibits relatively serious banding (false contouring), which makes the image look blurred and its transitions unnatural. Taking a terminal that supports displaying at most 10-bit images as an example, the second color depth may be either 9 bits or 10 bits, and the specific value may be selected according to an instruction input by the user. If the user selects a second color depth of 9 bits, the display quality of the resulting image is better than that of the unprocessed image, but the color transition is less natural than with a second color depth of 10 bits. If the user selects 10 bits, the display quality is optimal, but the processor's data-processing load is large and the terminal may heat up.
In this embodiment, by determining the gray-scale transition region and performing interpolation on the gray-scale values of its pixels, transition gray scales can be added. This effectively improves the color-transition effect of the image: regions where the original gray scales jumped sharply now transition more naturally, the gradient-transition effect is improved, the banding caused by color-gradient transitions is reduced, and the clarity of the displayed image is further improved.
Fig. 5 is a sub-flowchart illustrating steps of interpolating gray-scale values of a plurality of pixels in the gray-scale transition region, respectively, according to an embodiment, and referring to fig. 5, the steps include steps 502 to 504.
Step 502, determining the number of newly added gray scales during interpolation processing according to the first color depth and the second color depth.
Specifically, if the first color depth is m bits and the second color depth is n bits, where m and n are positive integers, the total number of newly added gray scales is 2^n - 2^m. As an illustration, consider inserting transition gray levels between gray level 0 and gray level 1. If the first color depth is 8 bits and the second color depth is 10 bits, the difference between the two color depths is 2 bits and the number of newly added gray scales is 768; equivalently, 3 transition gray levels can be inserted between any two adjacent original gray levels, so the three new levels inserted between gray level 0 and gray level 1 can be defined as the 0.25, 0.5 and 0.75 gray levels. If the first color depth is 8 bits and the second color depth is 9 bits, the difference is 1 bit and the number of newly added gray scales is 256; equivalently, 1 transition gray level can be inserted between any two adjacent original gray levels, so the new level inserted between gray level 0 and gray level 1 can be defined as the 0.5 gray level.
Step 504, performing interpolation processing on the gray scale values of the plurality of pixels in the gray scale transition region according to the number of the newly added gray scales.
The interpolation for a pixel's gray-scale value may be performed based on the gray-scale values of the four pixels above, below, to the left of and to the right of it. Taking mean-value interpolation as an example, if the gray-scale values of those four pixels are 0, 0, 0 and 1 respectively, the pixel's gray-scale value should be set to (0+0+0+1)/4, i.e., the 0.25 gray level. Based on the number of newly added gray scales obtained in step 502, if the difference between the two color depths is 2 bits, a 0.25 gray level can be inserted. If the difference is 1 bit, no 0.25 gray level exists; the calculated value then needs further processing, and the gray-scale value of the region is set to the 0 or the 0.5 gray level according to a preset rule.
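The mean-value interpolation step, including the snapping of the result to a representable new gray level, can be sketched as below. This is a minimal sketch under the example's assumptions; the helper names and the quantization rule (round to nearest step) are illustrative.

```python
# Illustrative sketch: mean-value interpolation from the four neighbours,
# then quantization to the nearest representable new gray level
# (step = 0.25 for a 2-bit depth increase, 0.5 for a 1-bit increase).

def mean_interpolate(up, down, left, right):
    return (up + down + left + right) / 4

def snap_to_step(value, step):
    """Quantize to the nearest multiple of the gray-scale step."""
    return round(value / step) * step

g = mean_interpolate(0, 0, 0, 1)      # the example above
assert g == 0.25
assert snap_to_step(g, 0.25) == 0.25  # representable with a 2-bit increase
assert snap_to_step(g, 0.5) in (0.0, 0.5)  # 1-bit increase: no 0.25 level
```

With a 1-bit increase the 0.25 level does not exist, so the result must land on 0 or 0.5, matching the "preset rule" mentioned in the description.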
In the embodiment, the number of newly added gray scales that can be inserted can be accurately obtained according to the first color depth and the second color depth, so that the gray scale data can be conveniently processed in the interpolation processing stage to obtain an accurate display image.
Fig. 6 is a sub-flowchart illustrating the steps of interpolating the gray-scale values of the pixels in the gray-scale transition region according to the number of newly added gray-scales according to an embodiment, and referring to fig. 6, in this embodiment, the steps include steps 602 to 606.
Step 602, determining a target pixel in the gray-scale transition region, where a difference between a gray-scale value of the target pixel and a gray-scale value of at least one adjacent pixel is greater than a second threshold, where the second threshold is less than or equal to the first threshold. The target pixel may be determined based on the data acquired in the gray scale transition region identification step, for example, the determined edge contour point may be used as the target pixel, so as to reduce the data computation amount of the embodiment, and further improve the processing speed of the image display method.
Step 604, obtaining a gray scale gradient of the target pixel according to the gray scale values of a plurality of pixels adjacent to the target pixel.
The gray scale gradient can be used for representing the gray scale change condition around the pixel, and further, the gray scale gradient direction can be understood as the direction with the fastest gray scale change. Specifically, for most pixels, the gray gradient direction is usually not the row direction or the column direction. That is, the speed of change in the row direction and the speed of change in the column direction of the gray-scale values of the pixels are not exactly the same. Accordingly, if the gray-scale value of the pixel is simply acquired by the mean value interpolation method, the result of the interpolation process may be insufficiently accurate. Therefore, by acquiring the gradation gradient of the target pixel to be interpolated, more accurate interpolation processing can be realized. It is to be understood that the present embodiment does not limit the method for acquiring the gray scale gradient, and any method capable of acquiring the gray scale gradient may be applied to the present embodiment.
Step 606, performing weighted interpolation processing according to the gray scale gradient of the target pixel and the gray scale value of at least one adjacent pixel.
The weighted interpolation may be performed according to the gray-scale values of a plurality of pixels around the target pixel. Further, the number of adjacent pixels used to acquire the gray-scale gradient may differ from the number used in the weighted interpolation. For example, the gradient may be estimated from a larger number of adjacent pixels while the weighted interpolation uses a smaller number. Since the gray-scale gradient has a relatively large influence on the accuracy of the interpolation, acquiring a more accurate gradient while performing the weighted interpolation quickly yields a more efficient interpolation process.
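The patent does not give a concrete weighting formula, so the following is only one plausible sketch of gradient-aware interpolation: estimate the vertical and horizontal gray-scale gradients around the target pixel and weight the column and row neighbour averages inversely to the gradient magnitude, so the interpolation follows the direction of slowest change. All names and the inverse-gradient weighting are assumptions for the example.

```python
# Hypothetical sketch of gradient-weighted interpolation (not the
# patent's formula): directions that change fast get less weight.

def weighted_interpolate(up, down, left, right, eps=1e-6):
    gv = abs(up - down)      # vertical gradient estimate
    gh = abs(left - right)   # horizontal gradient estimate
    wv = 1.0 / (gv + eps)    # fast vertical change -> small vertical weight
    wh = 1.0 / (gh + eps)
    return (wv * (up + down) / 2 + wh * (left + right) / 2) / (wv + wh)

# Gray level changes along the row but not along the column, so the
# result should lean toward the (constant) vertical neighbours:
v = weighted_interpolate(up=1, down=1, left=0, right=1)  # close to 1.0
```

Compare with plain mean interpolation, which would give (1+1+0+1)/4 = 0.75 here regardless of the gradient direction.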
Fig. 7 is a sub-flowchart illustrating the steps of interpolating the gray-scale values of the pixels in the gray-scale transition region according to the number of newly added gray-scales according to an embodiment, and referring to fig. 7, in this embodiment, the steps include steps 702 to 706.
And step 702, taking the gray scale transition area as a first image layer, and taking the residual area in the image to be processed as a second image layer. Specifically, based on the image to be processed in the embodiment of fig. 3, a first image layer shown in the embodiment of fig. 8 and a second image layer shown in the embodiment of fig. 9 may be obtained.
Step 704, performing interpolation processing on the gray-scale values of the pixels in the first layer according to the newly added gray-scale quantity, wherein the number of the gray-scale values in the interpolated first layer is greater than the number of the gray-scale values in the image to be processed. In particular, the region in the second layer may be understood as a non-gray level transition region, i.e. a region where the transition of the pixel gray level is relatively natural. Therefore, even if the gray-scale values of the pixels in the second layer are not subjected to interpolation processing, the display quality of the display image is not affected, but the data processing amount of the processor can be greatly reduced.
Step 706, performing fusion processing on the second layer and the interpolated first layer. Specifically, the display image shown in the embodiment of fig. 4 can be obtained by performing fusion processing on the second layer and the interpolated first layer.
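The split/merge flow of steps 702 to 706 can be sketched as follows: mask out the transition region as a first layer, keep the rest as a second layer, and recombine them losslessly. The dict-based layer representation and all names are illustrative assumptions; a real pipeline would hand the first layer to dedicated hardware, as the next paragraph describes.

```python
# Illustrative sketch of the layer split (step 702) and fusion (step 706).

def split_layers(image, transition_pixels):
    """Split an image into a transition layer and a remainder layer."""
    first, second = {}, {}
    for r, row in enumerate(image):
        for c, g in enumerate(row):
            (first if (r, c) in transition_pixels else second)[(r, c)] = g
    return first, second

def fuse_layers(first, second, rows, cols):
    """Merge the two layers back into a full image."""
    merged = [[0] * cols for _ in range(rows)]
    for layer in (second, first):  # first layer wins on any overlap
        for (r, c), g in layer.items():
            merged[r][c] = g
    return merged

img = [[0, 0, 1], [0, 0, 1]]
first, second = split_layers(img, {(0, 1), (0, 2), (1, 1), (1, 2)})
assert fuse_layers(first, second, 2, 3) == img  # lossless round trip
```

In the real flow, the first layer's gray-scale values would be interpolated (step 704) between the split and the fusion; the round trip above only demonstrates that the split and merge themselves lose nothing.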
In this embodiment, a Graphics Processing Unit (GPU) may extract the gray-scale transition region as a separate layer, and a processing module in the DPU (Data Processing Unit) responsible for image integration finally fuses the two layers and outputs the complete display image. By processing different areas of the image to be processed differently, the load on the processor can be reduced to a great extent, which speeds up the image display method.
Fig. 10 is a second flowchart of the image display method according to the embodiment, and referring to fig. 10, the method includes steps 1002 to 1014.
Step 1002, obtain an image to be processed having a first color depth.
Step 1004, determining a gray scale transition region in the image to be processed, wherein a difference value between gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold value.
And step 1006, taking the gray scale transition area as a first image layer, and taking the remaining area in the image to be processed as a second image layer.
The specific implementation of steps 1002 to 1004 may refer to steps 202 and 204 in the foregoing embodiment in a one-to-one correspondence manner, and the specific implementation of step 1006 may refer to step 702 in the foregoing embodiment, which is not described again in this embodiment.
Step 1008, performing super-resolution processing on the first image layer and the second image layer respectively.
It can be understood that, in the related art, when a picture or video frame is processed, the content source is directly stretched or shrunk on the display link to the resolution of the screen and then output for display. In some scenes, black borders are added around the content source, so that only the area with the corresponding resolution shows content and the rest of the screen is black. The content-source data is then not distorted, but displaying only a small part strongly affects the user experience: a high-end display terminal whose small active area looks no different from a low-end one degrades the display effect.
Here, super-resolution may be understood as increasing the resolution of the image. For example, the resolution of the image before super-resolution processing may be 1920 × 1080, and the resolution after super-resolution processing may be 3840 × 2160.
Alternatively, the super-resolution processing may employ a conventional algorithm, a neural network algorithm, a dictionary-based method, or the like. A conventional algorithm may be, for example, directional interpolation or principal component analysis. A neural network algorithm may be, for example, SRCNN (Super-Resolution Convolutional Neural Network), EDVR (Video Restoration with Enhanced Deformable Convolutional Networks) or VDSR (Very Deep network for Super-Resolution). A dictionary-based method may be, for example, RAISR (Rapid and Accurate Image Super Resolution). It should be noted that the above processing methods are only exemplary and do not limit the protection scope of this embodiment.
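The learned methods listed above are far beyond a short sketch; the toy nearest-neighbour upscale below only illustrates the resolution change itself (e.g. doubling each dimension, as in 1920 × 1080 to 3840 × 2160). The function name and sample data are assumptions for the example, and nearest-neighbour replication is the crudest possible stand-in for real super-resolution.

```python
# Toy nearest-neighbour upscale: each pixel becomes a factor x factor
# block. Real super-resolution (SRCNN, EDVR, RAISR, ...) reconstructs
# detail; this only shows the resolution arithmetic.

def upscale_nearest(image, factor):
    out = []
    for row in image:
        wide = [g for g in row for _ in range(factor)]  # widen the row
        out.extend([wide[:] for _ in range(factor)])    # repeat it down
    return out

small = [[0, 3], [1, 2]]
big = upscale_nearest(small, 2)   # 2x2 -> 4x4
assert len(big) == 4 and len(big[0]) == 4
assert big[0] == [0, 0, 3, 3] and big[3] == [1, 1, 2, 2]
```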
Step 1010, determining the number of newly added gray scales during interpolation processing according to the first color depth and the second color depth.
Step 1012, performing interpolation processing on the gray scale values of the plurality of pixels in the first image layer after the super-resolution processing according to the number of newly added gray scales, wherein the number of gray scale values in the first image layer after the interpolation processing is greater than the number of gray scale values in the image to be processed.
Step 1014, performing fusion processing on the second image layer after the super-resolution processing and the first image layer after the interpolation processing, to acquire a display image with the second color depth.
Steps 1010 to 1014 perform interpolation processing on the gray scale values of the pixels in the first image layer according to the number of newly added gray scales; for the specific implementation, reference may be made to steps 502, 504, and 706 in the foregoing embodiments, respectively, which is not repeated in this embodiment. By increasing the resolution, this embodiment can make the picture fine and smooth, thereby improving the viewing experience of the user.
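As an illustrative sketch of step 1010 (not the patent's normative implementation), the number of gray scales newly added between two adjacent original levels can be derived from the two color depths, assuming the original levels are spread evenly across the wider depth; the function name here is hypothetical:

```python
def new_gray_levels(first_depth: int, second_depth: int) -> int:
    # Total levels at each depth: 2**depth. Spreading the original
    # levels evenly over the wider range leaves (ratio - 1) brand-new
    # levels between each pair of adjacent original levels.
    ratio = (1 << second_depth) // (1 << first_depth)
    return ratio - 1

# Going from 8-bit (256 levels) to 10-bit (1024 levels) inserts
# 3 new levels between adjacent original levels.
print(new_gray_levels(8, 10))
```

With this reading, an 8-bit to 12-bit conversion would insert 15 new levels per original pair.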
Fig. 11 is a third flowchart of an image display method according to an embodiment. Referring to fig. 11, the embodiment includes steps 1102 to 1118.
Step 1102, obtain an image to be processed having a first color depth. The specific implementation of step 1102 may refer to step 202 in the foregoing embodiment, and is not described in detail in this embodiment.
Step 1104, determining a gray scale transition region in the image to be processed, where a difference between gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold.
The gray scale value of each pixel includes a red gray scale value, a green gray scale value, and a blue gray scale value. Accordingly, step 1104 may include: respectively obtaining an average gray scale value of each pixel in the image to be processed, where the average gray scale value is the average of the red, green, and blue gray scale values; and determining the gray scale transition region in the image to be processed according to the average gray scale value of each pixel, where the difference between the average gray scale values of at least two adjacent pixels in the gray scale transition region is greater than the first threshold. By calculating comprehensively over the gray scale values of the different colors, this embodiment can effectively improve the accuracy of determining the gray scale transition region, thereby improving the effect of the subsequent interpolation processing step.
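The determination described above can be sketched as follows. This is a hypothetical minimal implementation: the function names and the 4-neighbor comparison are assumptions, since the patent only requires that adjacent pixels differ by more than the threshold:

```python
def average_gray(pixel):
    # Average of the red, green, and blue gray scale values.
    r, g, b = pixel
    return (r + g + b) / 3.0

def transition_mask(img, threshold):
    # img: 2-D list of (r, g, b) tuples. Returns a boolean mask that
    # is True where a pixel's average gray differs from any of its
    # 4-neighbors by more than the threshold.
    h, w = len(img), len(img[0])
    avg = [[average_gray(img[y][x]) for x in range(w)] for y in range(h)]
    mask = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0), (0, -1), (-1, 0)):
                ny, nx = y + dy, x + dx
                if 0 <= ny < h and 0 <= nx < w and \
                        abs(avg[y][x] - avg[ny][nx]) > threshold:
                    mask[y][x] = True
    return mask
```

A pure black pixel next to a pure white pixel, for example, yields an average-gray difference of 255 and both pixels fall into the transition region.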
And step 1106, taking the gray scale transition area as a first image layer, and taking the residual area in the image to be processed as a second image layer. The specific implementation of step 1106 may refer to step 702 in the foregoing embodiments, and details are not repeated in this embodiment.
Step 1108, determining the target number of the newly added pixels according to the initial resolution and the target resolution. Wherein the image to be processed has an initial resolution and the display image has a target resolution.
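A trivial way to read step 1108 (an illustrative assumption; the patent does not spell out the formula) is that the target number of newly added pixels is the difference between the pixel counts at the target and initial resolutions:

```python
def target_new_pixel_count(init_w, init_h, tgt_w, tgt_h):
    # Pixels that must be newly inserted to go from the initial
    # resolution to the target resolution.
    return tgt_w * tgt_h - init_w * init_h

# 1920x1080 -> 3840x2160 quadruples the pixel count, i.e. three new
# pixels are added per original pixel.
print(target_new_pixel_count(1920, 1080, 3840, 2160))
```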
Step 1110, inserting a plurality of newly added pixels in the first layer and the second layer according to the target number.
Here, inserting newly added pixels may be understood as expanding on the basis of the original pixels, i.e., the plurality of pixels shown in fig. 8 and fig. 9. Specifically, each original pixel may be expanded in various directions, such as at least one of the up, down, left, and right directions; the expanded pixel range may be customized, and the expanded image includes the corresponding original pixels. Where a boundary is met, the pixel values in the expanded range may be filled with preset pixel values, and the filling manner may also be customized. Illustratively, based on the first image layer shown in fig. 8 and the second image layer shown in fig. 9, the first image layer shown in fig. 12 and the second image layer shown in fig. 13 may be obtained by inserting pixels. As can be seen by referring to fig. 8 and fig. 12 in combination, inserting pixels can effectively increase the number of pixels in the image, but a certain jagged (aliasing) phenomenon may also appear in the image. The images shown in fig. 12 and fig. 13 can be understood as being expanded by one pixel range in the up direction and the right direction respectively, that is, three pixels are added on the basis of each original pixel.
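The expansion described above, with three added pixels per original pixel for a 2x upscale, can be sketched as simple pixel replication. This is an illustrative nearest-neighbor stand-in, not necessarily the patent's exact expansion scheme:

```python
def expand_pixels(layer, factor):
    # Replicate every original gray value into a factor x factor block,
    # i.e. each original pixel gains factor*factor - 1 newly added
    # pixels, all initially carrying the original pixel's value.
    out = []
    for row in layer:
        wide_row = [v for v in row for _ in range(factor)]
        for _ in range(factor):
            out.append(list(wide_row))
    return out

# A 2x2 layer expanded by a factor of 2 becomes 4x4; each original
# pixel now heads a 2x2 block (three newly added pixels per original).
print(expand_pixels([[1, 2], [3, 4]], 2))
```

Because every new pixel copies its original, hard edges are copied too, which is exactly the jagged phenomenon the subsequent filtering step removes.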
Step 1112, performing filtering processing on the first image layer into which the pixels are inserted, so as to remove edge jaggies in the first image layer. Specifically, fig. 14 shows the first image layer after the filtering processing; comparing fig. 12 and fig. 14, the filtering processing can further improve the definition of the image, thereby improving the viewing experience. It should be understood that this embodiment does not particularly limit the filtering method, and any method with an edge-smoothing or anti-aliasing effect may be applied to this embodiment.
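Since the embodiment allows any edge-smoothing filter at step 1112, a simple mean (box) filter is one hedged example; the window size and border handling here are assumptions:

```python
def box_filter(gray, k=1):
    # Mean filter over a (2k+1) x (2k+1) window, clipped at the image
    # borders; smooths the hard jagged edges left by pixel insertion.
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [gray[ny][nx]
                    for ny in range(max(0, y - k), min(h, y + k + 1))
                    for nx in range(max(0, x - k), min(w, x + k + 1))]
            out[y][x] = sum(vals) / len(vals)
    return out
```

A uniform region passes through unchanged, while an isolated bright pixel is spread over its neighborhood, softening staircase edges.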
Step 1114, determining the number of newly added gray scales for the interpolation processing according to the first color depth and the second color depth. For step 1114, reference may be made to step 502 in the foregoing embodiment, which is not repeated in this embodiment.
Step 1116, performing interpolation processing on the gray scale values of the plurality of pixels in the first image layer after the super-resolution processing according to the number of newly added gray scales, wherein the number of gray scale values in the first image layer after the interpolation processing is greater than the number of gray scale values in the image to be processed. For step 1116, reference may be made to step 504 in the foregoing embodiment, which is not repeated in this embodiment.
Specifically, interpolation processing is performed on the gray scale value of each newly added pixel in the first image layer after the super-resolution processing according to the number of newly added gray scales. Referring to fig. 14, performing interpolation processing on the gray scale values of the newly added pixels can further improve the gray scale gradation effect, thereby improving the viewing experience.
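One hedged reading of this step (a linear-ramp assumption; claim 3 also allows gradient-weighted variants) is that the newly added pixels lying between two original pixels take evenly spaced gray values:

```python
def ramp(g0, g1, n):
    # Gray values for n newly inserted pixels lying between two
    # adjacent original pixels with gray values g0 and g1.
    step = (g1 - g0) / (n + 1)
    return [g0 + step * (i + 1) for i in range(n)]

# Three new pixels between gray 0 and gray 4 get values 1, 2, 3,
# producing a smooth gray gradient instead of a hard step.
print(ramp(0, 4, 3))
```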
Step 1118, performing fusion processing on the second image layer after the super-resolution processing and the first image layer after the interpolation processing, to obtain a display image with the second color depth. For step 1118, reference may be made to step 706 in the foregoing embodiment, which is not repeated in this embodiment.
It should be understood that, although the steps in the flowcharts are shown in the sequence indicated by the arrows, they are not necessarily performed in that sequence. Unless explicitly stated otherwise, the execution order of these steps is not strictly limited, and they may be performed in other orders. Moreover, at least some of the steps in each flowchart may include multiple sub-steps or stages, which are not necessarily performed at the same moment but may be performed at different moments; their execution order is not necessarily sequential, and they may be performed in turn or alternately with other steps or with at least some of the sub-steps or stages of other steps.
Fig. 16 is a block diagram showing the structure of an image display device according to an embodiment. As shown in fig. 16, the image display apparatus includes an image acquisition module 1602, a region determination module 1604, and an interpolation processing module 1606.
The image obtaining module 1602 is configured to obtain an image to be processed with a first color depth. The region determining module 1604 is configured to determine a gray scale transition region in the image to be processed, where a difference between gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold. The interpolation processing module 1606 is configured to perform interpolation processing on the gray scale values of the pixels in the gray scale transition region, respectively, to obtain a display image with a second color depth, where the second color depth is greater than the first color depth.
The division of the modules in the image display apparatus is only for illustration, and in other embodiments, the image display apparatus may be divided into different modules as needed to complete all or part of the functions of the image display apparatus.
For specific limitations of the image display apparatus, reference may be made to the above limitations of the image display method, which are not repeated here. Each module in the image display apparatus described above may be implemented wholly or partially by software, hardware, or a combination thereof. Each module may be embedded in hardware form in, or independent of, a processor in the computer device, or stored in software form in a memory of the computer device, so that the processor can invoke and execute the operations corresponding to each module.
Fig. 17 is a schematic diagram of the internal structure of an electronic device in one embodiment. As shown in fig. 17, the electronic device includes a processor and a memory connected by a system bus. The processor is used to provide computing and control capabilities and to support the operation of the entire electronic device. The memory may include a non-volatile storage medium and an internal memory. The non-volatile storage medium stores an operating system and a computer program. The computer program can be executed by the processor to implement the image display method provided in the foregoing embodiments. The internal memory provides a cached execution environment for the operating system and the computer program in the non-volatile storage medium. The electronic device may be any terminal device such as a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sale) terminal, a vehicle-mounted computer, or a wearable device.
The implementation of each module in the image display apparatus provided in the embodiments of the present application may be in the form of a computer program. The computer program may run on a terminal or a server. Program modules constituted by such a computer program may be stored on the memory of the electronic device. When the computer program is executed by a processor, the steps of the methods described in the embodiments of the present application are performed.
The embodiment of the application also provides a computer readable storage medium. One or more non-transitory computer-readable storage media containing computer-executable instructions that, when executed by one or more processors, cause the processors to perform the steps of the image display method.
A computer program product containing instructions which, when run on a computer, cause the computer to perform an image display method.
Any reference to memory, storage, database, or other medium used herein may include non-volatile and/or volatile memory. Non-volatile memory can include read-only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory. Volatile memory can include random access memory (RAM), which acts as external cache memory. By way of illustration and not limitation, RAM is available in many forms, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchronous link (Synchlink) DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments express only a few implementations of the present application, and their description is relatively specific and detailed, but they should not be construed as limiting the scope of the invention. It should be noted that, for those skilled in the art, variations and modifications can be made without departing from the concept of the embodiments of the present application, and such variations fall within the scope of the present application. Therefore, the protection scope of the embodiments of the present application shall be subject to the appended claims.

Claims (11)

1. An image display method, characterized in that the method comprises:
acquiring an image to be processed with a first color depth;
determining a gray scale transition region in the image to be processed, wherein the difference value between the gray scale values of at least two adjacent pixels in the gray scale transition region is greater than a first threshold value;
and respectively carrying out interpolation processing on the gray scale values of a plurality of pixels in the gray scale transition region to obtain a display image with a second color depth, wherein the second color depth is greater than the first color depth.
2. The image display method according to claim 1, wherein the interpolating the gray-scale values of the plurality of pixels in the gray-scale transition region, respectively, comprises:
determining the number of newly added gray scales during interpolation processing according to the first color depth and the second color depth;
and carrying out interpolation processing on the gray scale values of the pixels in the gray scale transition region according to the number of the newly added gray scales.
3. The image display method according to claim 2, wherein the interpolating the gray-scale values of the pixels in the gray-scale transition region according to the number of the newly added gray-scales comprises:
determining a target pixel in the gray scale transition region, wherein the difference value between the gray scale value of the target pixel and the gray scale value of at least one adjacent pixel is greater than a second threshold value, and the second threshold value is less than or equal to the first threshold value;
acquiring the gray scale gradient of a target pixel according to the gray scale values of a plurality of pixels adjacent to the target pixel;
and carrying out weighted interpolation processing according to the gray gradient of the target pixel and the gray value of at least one adjacent pixel.
4. The image display method according to claim 2, wherein the interpolating the gray-scale values of the pixels in the gray-scale transition region according to the number of the newly added gray-scales comprises:
taking the gray scale transition area as a first image layer, and taking the residual area in the image to be processed as a second image layer;
performing interpolation processing on the gray-scale values of the pixels in the first image layer according to the newly added gray-scale quantity, wherein the number of the gray-scale values in the first image layer after interpolation processing is larger than the number of the gray-scale values in the image to be processed;
and performing fusion processing on the second image layer and the first image layer after interpolation processing.
5. The image display method according to claim 4, wherein before performing interpolation processing on the gray-scale values of the plurality of pixels in the first layer according to the newly added gray-scale quantity, the method further comprises:
performing super-resolution processing on the first image layer and the second image layer respectively;
the fusing the second image layer and the first image layer after the interpolation processing includes:
and performing fusion processing on the second image layer after the super-resolution processing and the first image layer after the interpolation processing.
6. The image display method according to claim 5, wherein the image to be processed has an initial resolution, the display image has a target resolution, and the performing super-resolution processing on the first image layer and the second image layer respectively comprises:
determining the target number of the newly added pixels according to the initial resolution and the target resolution;
inserting a plurality of newly added pixels into the first image layer and the second image layer according to the target quantity;
the interpolation processing of the gray scale values of the pixels in the first layer according to the newly added gray scale quantity includes:
and performing interpolation processing on the gray scale value of each newly added pixel in the first image layer after the super-resolution processing according to the newly added gray scale quantity.
7. The image display method according to claim 6, wherein before performing interpolation processing on the gray scale value of each new pixel in the super-resolution processed first image layer according to the number of new gray scales, the method further comprises:
and carrying out filtering processing on the first image layer after the pixels are inserted so as to remove edge sawteeth in the first image layer.
8. The image display method according to claim 1, wherein the gray scale values of each pixel respectively include a red gray scale value, a green gray scale value and a blue gray scale value, the determining a gray scale transition region in the image to be processed, a difference between gray scale values of at least two adjacent pixels in the gray scale transition region being greater than a first threshold value comprises:
respectively obtaining an average gray scale value of each pixel in an image to be processed, wherein the average gray scale value is an average value of a red gray scale value, a green gray scale value and a blue gray scale value;
and determining a gray scale transition region in the image to be processed according to the average gray scale value of each pixel, wherein the difference value between the average gray scale values of at least two adjacent pixels in the gray scale transition region is larger than the first threshold value.
9. An image display apparatus, comprising:
the image acquisition module is used for acquiring an image to be processed with a first color depth;
the area determination module is used for determining a gray scale transition area in the image to be processed, and the difference value between the gray scale values of at least two adjacent pixels in the gray scale transition area is greater than a first threshold value;
and the interpolation processing module is used for respectively carrying out interpolation processing on the gray scale values of the pixels in the gray scale transition region so as to obtain a display image with a second color depth, wherein the second color depth is greater than the first color depth.
10. An electronic device comprising a memory and a processor, the memory having stored therein a computer program that, when executed by the processor, causes the processor to perform the steps of the image display method according to any one of claims 1 to 8.
11. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the image display method according to any one of claims 1 to 8.
CN202110786222.0A 2021-07-12 2021-07-12 Image display method, image display device, electronic equipment and computer readable storage medium Pending CN113538271A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110786222.0A CN113538271A (en) 2021-07-12 2021-07-12 Image display method, image display device, electronic equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110786222.0A CN113538271A (en) 2021-07-12 2021-07-12 Image display method, image display device, electronic equipment and computer readable storage medium

Publications (1)

Publication Number Publication Date
CN113538271A true CN113538271A (en) 2021-10-22

Family

ID=78098707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110786222.0A Pending CN113538271A (en) 2021-07-12 2021-07-12 Image display method, image display device, electronic equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN113538271A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114222104A (en) * 2021-12-17 2022-03-22 深圳市巨烽显示科技有限公司 Method, device and equipment for improving color display capability of low-bit display equipment
CN115297311A (en) * 2022-07-22 2022-11-04 Oppo广东移动通信有限公司 Image processing method and device, electronic device and storage medium
CN117115276A (en) * 2023-01-12 2023-11-24 荣耀终端有限公司 Picture processing method, device and storage medium


Similar Documents

Publication Publication Date Title
CN113538271A (en) Image display method, image display device, electronic equipment and computer readable storage medium
EP1789922B1 (en) Apparatus and method for processing images
JP4621733B2 (en) Method and system for viewing and enhancing images
EP3407604A1 (en) Method and device for processing high dynamic range image
CN113596573B (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN109934262B (en) Picture variability judging method, device, computer equipment and storage medium
CN111489322B (en) Method and device for adding sky filter to static picture
CN108806638B (en) Image display method and device
JP2020191046A (en) Image processing apparatus, image processing method, and program
CN112602088A (en) Method, system and computer readable medium for improving quality of low light image
CN111951172A (en) Image optimization method, device, equipment and storage medium
CN107220934B (en) Image reconstruction method and device
CN108921869B (en) Image binarization method and device
CN109472874B (en) Display method, display device, VR display device and storage medium
CN111738944A (en) Image contrast enhancement method and device, storage medium and smart television
CN113053322B (en) Display optimization method of electronic ink screen and related device
CN112073718B (en) Television screen splash detection method and device, computer equipment and storage medium
KR20030066511A (en) Apparatus and method for real-time brightness control of moving images
CN112435634A (en) Image display method, image display apparatus, and readable storage medium
CN106971386B (en) Method and device for judging image integrity and page loading degree and client equipment
CN115278104B (en) Image brightness adjustment method and device, electronic equipment and storage medium
CN115049572A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111754518B (en) Image set expansion method and device and electronic equipment
CN106934814B (en) Background information identification method and device based on image
US8139898B2 (en) Image process method and apparatus for image enlargement and enhancement

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination