WO2022033230A1 - Image display method, electronic device and computer-readable storage medium - Google Patents

Image display method, electronic device and computer-readable storage medium

Info

Publication number
WO2022033230A1
Authority
WO
WIPO (PCT)
Prior art keywords
sub
image
screen
data
image data
Prior art date
Application number
PCT/CN2021/104188
Other languages
English (en)
Chinese (zh)
Inventor
郑亮
Original Assignee
中兴通讯股份有限公司 (ZTE Corporation)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 中兴通讯股份有限公司
Publication of WO2022033230A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 3/00 - Geometric image transformations in the plane of the image
    • G06T 3/40 - Scaling of whole images or parts thereof, e.g. expanding or contracting
    • G06T 3/4007 - Scaling based on interpolation, e.g. bilinear interpolation
    • G06T 3/4092 - Image resolution transcoding, e.g. by using client-server architectures
    • G06T 5/00 - Image enhancement or restoration
    • G06T 5/20 - Image enhancement or restoration using local operators
    • G06T 5/77 - Retouching; Inpainting; Scratch removal
    • G06T 7/00 - Image analysis
    • G06T 7/90 - Determination of colour characteristics
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04M - TELEPHONIC COMMUNICATION
    • H04M 1/00 - Substation equipment, e.g. for use by subscribers
    • H04M 1/02 - Constructional features of telephone sets
    • H04M 1/0202 - Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M 1/026 - Details of the structure or mounting of specific components
    • H04M 1/0266 - Details of the structure or mounting of specific components for a display module assembly
    • H04M 1/72 - Mobile telephones; Cordless telephones, i.e. devices for establishing wireless links to base stations without route selection
    • H04M 1/724 - User interfaces specially adapted for cordless or mobile telephones
    • H04M 1/72403 - User interfaces with means for local support of applications that increase the functionality

Definitions

  • the embodiments of the present application relate to, but are not limited to, the field of image technologies, and in particular, relate to an image display method, an electronic device, and a computer-readable storage medium.
  • Embodiments of the present application provide an image display method, an electronic device, and a computer-readable storage medium.
  • An embodiment of the present application provides an image display method applied to a terminal with a screen, where the screen includes a first sub-screen and a second sub-screen with different resolutions. The method includes: acquiring screen parameters of the screen and full-image data of an image to be displayed; obtaining sub-image data according to the screen parameters and the full-image data, where the sub-image data is used for display on the first sub-screen; performing adaptive processing on the sub-image data; and displaying an image on the first sub-screen and the second sub-screen according to the adaptively processed sub-image data and the full-image data.
  • An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored in the memory and executable on the processor, where the processor, when executing the program, implements the image display method described above.
  • An embodiment of the present application provides a computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to cause a computer to execute the image display method mentioned in the above embodiments.
  • FIG. 1 is a schematic structural diagram of a terminal for executing an image display method mentioned in an embodiment of the present application
  • FIG. 2 is a flowchart of an image display method provided by an embodiment of the present application.
  • FIG. 3 is a specific flowchart of step S1200 in the image display method provided by an embodiment of the present application.
  • FIG. 4 is a specific flowchart of step S1300 in the image display method provided by an embodiment of the present application.
  • FIG. 5 is a specific flowchart of step S1310 in the image display method provided by an embodiment of the present application.
  • FIG. 6 is a specific flowchart of step S1320 in the image display method provided by an embodiment of the present application.
  • FIG. 7 is a specific flowchart of step S1330 in the image display method provided by an embodiment of the present application.
  • FIG. 8 is a diagram of a specific application example in an image display method provided by an embodiment of the present application.
  • FIG. 9 is a diagram of another specific application example in the image display method provided by an embodiment of the present application.
  • FIG. 10 is a module block diagram of an electronic device provided by an embodiment of the present application.
  • Reference numerals: terminal 100; screen 110; first sub-screen 111; second sub-screen 112; processor 200; memory 300.
  • In a terminal with an under-screen camera, the light transmittance of the part of the screen above the camera must differ from that of other areas so that the camera can image through the screen. As a result, the display effect of that area differs from the rest of the screen, and display problems such as color stripes, obvious graininess, and moiré can appear.
  • The embodiments of the present application provide an image display method, an electronic device, and a computer-readable storage medium, which can reduce the display differences across the screen of the terminal.
  • The terminal 100 in this embodiment of the present application is a device having a screen 110, such as a mobile phone, a display, a tablet computer, or a camera. The screen 110 includes a first sub-screen 111 and a second sub-screen 112 with different resolutions, where the first sub-screen 111 is the area whose light transmittance or resolution differs from that of the rest of the screen 110. Together, the first sub-screen 111 and the second sub-screen 112 constitute the entire screen 110 of the terminal 100. This application is not limited to screens consisting of only a first sub-screen 111 and a second sub-screen 112; it can also be applied to terminal screens that include more than two sub-screens.
  • The resolution of the screen 110 is related to its transmittance: the lower the resolution, the higher the transmittance, and the higher the resolution, the lower the transmittance. The first sub-screen 111 and the second sub-screen 112 therefore differ in resolution or transmittance.
  • The first sub-screen 111 falls into one of two cases: a sparse sub-screen or a dense sub-screen. A sparse sub-screen means the resolution of the first sub-screen 111 is lower than that of the second sub-screen 112, for example a 720p first sub-screen with a 1080p second sub-screen. A dense sub-screen means the resolution of the first sub-screen 111 is higher than that of the second sub-screen 112, for example a 1080p first sub-screen with a 720p second sub-screen.
  • In either case, the display effects of the first sub-screen 111 and the second sub-screen 112 differ.
  • the embodiments of the present application provide an image display method applied to the terminal 100 having the screen 110 .
  • In this method, sub-image data for display on the first sub-screen 111 is obtained from the acquired screen parameters and full-image data, the sub-image data is adaptively processed, and images are displayed on the first sub-screen 111 and the second sub-screen 112 according to the adaptively processed sub-image data and the full-image data. This reduces the display differences of the screen 110 of the terminal 100 and thereby improves the user experience.
  • The image display method applied to the terminal 100 with a screen in the embodiment of the present application includes the steps: S1100, acquire screen parameters of the screen and full-image data of the image to be displayed; S1200, obtain sub-image data according to the screen parameters and the full-image data; S1300, perform adaptive processing on the sub-image data; and S1400, display an image on the first sub-screen 111 and the second sub-screen 112 according to the adaptively processed sub-image data and full-image data.
  • In step S1100, the terminal 100 acquires, by calling the corresponding interface, the screen parameters of the screen and the full-image data of the image to be displayed on the screen of the terminal 100.
  • In some embodiments, the terminal 100 can use OpenGL (Open Graphics Library) to call its interfaces to obtain the screen parameters and full-image data. OpenGL is a cross-language, cross-platform application programming interface (API) for rendering 2D and 3D vector graphics; it consists of many different function calls used to draw everything from simple graphics primitives to complex three-dimensional scenes. In this application, the screen parameters can be obtained by calling the OpenGL API, as can parameter information such as the full-image data generated by converting the image to be displayed. The texture data generated by converting the image to be displayed can also be acquired; both the texture data and the full-image data mentioned in this application are in an image data format.
  • The screen parameters include, but are not limited to, the full-screen size, the second sub-screen resolution, the first sub-screen size, the first sub-screen resolution, and the first sub-screen coordinates. The full-screen size refers to the area size or side lengths of the entire screen of the terminal 100. The second sub-screen resolution refers to the resolution of the parts of the screen other than the part corresponding to the first sub-screen. The first sub-screen size refers to the area size or side lengths of the part of the screen whose resolution differs from that of the rest of the screen, i.e. the first sub-screen 111. The first sub-screen resolution is the resolution of the first sub-screen 111, and the first sub-screen coordinates refer to the coordinate position of the first sub-screen 111 relative to the screen of the terminal 100. As noted above, the resolution and transmittance of the first sub-screen 111 and the second sub-screen 112 differ.
  • Full image data includes, but is not limited to, the full-image size and the full-image resolution, where the full-image size is the area size or side lengths of the image to be displayed and the full-image resolution is its resolution. It should be noted that the specific parameter information of the screen parameters and full-image data is not limited to the items mentioned above; it can be obtained by calling the OpenGL interface according to actual needs.
  • the parameter information is set as follows: the full screen size in the screen parameters is disRegion(disW, disH), the resolution of the second sub-screen is disRes(disW, disH), and the size of the first sub-screen is subRegion(subRegW, subRegH), the resolution of the first sub screen is subDisRes(subDisW, subDisH), the coordinates of the first sub screen are subPosition(subX, subY); the full image data is srcImage(srcW, srcH), the corresponding full image size is srcRegion(srcW, srcH), and the full image resolution is srcRes(srcW, srcH).
  • In step S1200, sub-image data is obtained according to the mapping relationship between the acquired screen parameters and the full-image data, where the sub-image data is used for display on the first sub-screen 111.
  • When the image to be displayed is shown on the screen of the terminal 100, a corresponding display image appears on the first sub-screen 111; the image data corresponding to that display image is the sub-image data.
  • The sub-image data includes, but is not limited to, the sub-image resolution, the sub-image size, and the sub-image coordinates, where the sub-image resolution is the resolution of the image displayed in this region, the sub-image size is the area size or side lengths of that image, and the sub-image coordinates are the coordinate position of that image relative to the image to be displayed.
  • the parameter information for setting the sub-image data is as follows: the sub-image data is subImage(subW, subH), the sub-image resolution is subRes(subW, subH), and the sub-image size is subImageRegion(subRegW, subRegH ), the sub-image coordinates are subImagePosition(subX, subY).
  • Step S1200 in this embodiment of the present application further includes the following steps.
  • In step S1210, to obtain the sub-image data, the full-screen size in the screen parameters is compared with the full-image size of the image to be displayed to obtain the screen-image mapping relationship Δ. The comparison can be performed proportionally, per axis, between the full-screen size and the full-image size.
  • In step S1220, the sub-image size subImageRegion(subRegW, subRegH) and the sub-image coordinates subImagePosition(subX, subY) are obtained from the screen-image mapping relationship Δ together with the first sub-screen size subRegion(subRegW, subRegH), the first sub-screen coordinates subPosition(subX, subY), and the full-image size srcRegion(srcW, srcH): the first sub-screen size and coordinates are scaled by Δ into full-image coordinates.
  • Steps S1210 and S1220 can obtain the sub-image coordinates not only from the full-screen size, full-image size, first sub-screen size, and first sub-screen coordinates as a whole, but also per axis: determine the width mapping ratio from the full-screen width (the width of the terminal screen) and the full-image width (the width of the image to be displayed), and obtain the x-coordinate of the sub-image from the width mapping ratio and the x-coordinate of the first sub-screen; likewise, determine the length mapping ratio from the full-screen length and the full-image length (the length of the image to be displayed), and obtain the y-coordinate of the sub-image from the length mapping ratio and the y-coordinate of the first sub-screen. The x- and y-coordinates together give the complete sub-image coordinates.
  • In this way, the exact sub-image data corresponding to the image to be displayed and the first sub-screen 111 is obtained through the mapping relationship between the acquired screen parameters and the full-image data, so that the sub-image data can be extracted and then adjusted adaptively. The method determines the sub-image data through a simple proportional relationship, which simplifies the computation.
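  • The per-axis mapping of steps S1210 and S1220 can be sketched as follows. This is an illustrative reading of the description, not the patent's code: the function name is hypothetical, and defining the ratio as screen size over image size (so that screen-space quantities are divided by it) is an assumption.

```python
def sub_image_region(dis_w, dis_h, src_w, src_h,
                     sub_reg_w, sub_reg_h, sub_x, sub_y):
    """Map the first sub-screen's size and position from screen space
    into full-image space, returning (width, height, x, y)."""
    # Screen-image mapping ratio, one factor per axis (step S1210).
    ratio_w = dis_w / src_w
    ratio_h = dis_h / src_h
    # Scale screen-space quantities by the ratio to reach image space (step S1220).
    return (sub_reg_w / ratio_w, sub_reg_h / ratio_h,
            sub_x / ratio_w, sub_y / ratio_h)
```

For example, a 1080x2400 screen showing a 2160x4800 image gives a ratio of 0.5 per axis, so a 200x100 sub-screen at (440, 60) corresponds to a 400x200 sub-image region at (880, 120).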
  • In step S1300, adaptive processing is performed on the sub-image data obtained in step S1200.
  • Adaptive processing adapts the sub-image data so that it displays better on the first sub-screen: it improves the display effect of the corresponding sub-image on the first sub-screen 111 and effectively reduces display differences such as color stripes and obvious graininess.
  • The adaptive processing includes one or more of the following: resolution processing, filtering processing, and color processing. Resolution processing adjusts the resolution of the sub-image data so that the sub-image resolution matches the first sub-screen resolution of the first sub-screen 111. Filtering processing applies low-pass filtering to the sub-image data to remove its high-frequency signals, which effectively reduces the moiré patterns that high-frequency content produces on the resolution-limited first sub-screen 111. Color processing suppresses the color data in the sub-image data, reducing the stripes, color particles, and similar artifacts that appear on the first sub-screen 111.
  • The three processing methods can be performed alone or in any combination: only resolution processing, filtering processing, or color processing; resolution processing and filtering processing; resolution processing and color processing; filtering processing and color processing; or all three of resolution processing, filtering processing, and color processing.
  • Step S1300 includes the steps: S1310, perform resolution processing on the sub-image data; S1320, perform corresponding filtering processing on the sub-image data according to its data type; and S1330, perform corresponding color processing on the sub-image data according to the data type of the sub-image data.
  • Step S1310 performs resolution processing on the sub-image data: the sub-image resolution is processed according to the first sub-screen resolution so that the sub-image resolution matches the resolution of the first sub-screen 111.
  • Step S1310 includes the following steps. In step S1311, the first sub-screen resolution obtained in step S1100 is compared with the sub-image resolution obtained in step S1200 to decide whether to execute step S1312 or step S1313. Depending on their relative sizes, the processing divides into dense processing and sparse processing: if the first sub-screen resolution is greater than the sub-image resolution, dense processing is required, that is, step S1312 is performed; if it is smaller, sparse processing is required, that is, step S1313 is performed; if the first sub-screen resolution is the same as the sub-image resolution, no resolution processing is needed.
  • In step S1312, if the first sub-screen resolution of the first sub-screen 111 is greater than the sub-image resolution, resolution processing in the form of data interpolation is performed on the sub-image. Data interpolation uses the gray values of known pixels (or the tristimulus values in an RGB image) to generate the gray values of unknown pixels, so that the original image can be reproduced at a higher resolution; that is, a low-resolution image is processed into a high-resolution image. Commonly used interpolation methods include nearest-neighbor, bilinear, biquadratic, bicubic, and other higher-order methods; the specific interpolation methods are not described here.
  • In step S1313, if the first sub-screen resolution of the first sub-screen 111 is smaller than the sub-image resolution, resolution processing in the form of data downsampling is performed on the sub-image. Data downsampling reduces the number of sampling points: for an N*M image with a downsampling coefficient k, one point is taken every k points in each row and column of the original image to form the new image; that is, a high-resolution image is processed into a low-resolution image. The specific downsampling methods are not described here.
  • In step S1314, after data downsampling or data interpolation, the sub-image resolution matches the first sub-screen resolution, so the sub-image can be displayed normally on the first sub-screen 111. This avoids the image loss that would degrade the display and improves the display effect of the screen.
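  • Steps S1312 and S1313 can be sketched with a simple nearest-neighbour resample, where upsampling by pixel repetition stands in for the data interpolation of S1312 (the simplest interpolation method) and index striding stands in for the every-k-th-point downsampling of S1313. The function name and list-of-rows representation are illustrative assumptions, not the patent's implementation.

```python
def adapt_resolution(sub_img, target_h, target_w):
    """Resample a 2-D image (a list of rows) to the first sub-screen's
    resolution by nearest-neighbour index mapping: each output pixel
    copies the source pixel whose index scales to its position."""
    h, w = len(sub_img), len(sub_img[0])
    return [[sub_img[r * h // target_h][c * w // target_w]
             for c in range(target_w)]
            for r in range(target_h)]
```

Downsampling a 4x4 image to 2x2 keeps every second pixel, matching the "one point every k points" description; upsampling a 2x2 image to 4x4 repeats each pixel.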
  • Step S1320 filters the sub-image data: corresponding filtering is applied according to the data type of the sub-image data to remove the high-frequency components, which reduces the moiré that appears when the first sub-screen 111 displays the image and improves the display effect of the screen.
  • The high-frequency components correspond to the sharp changes in an image, i.e. its edge contours, noise, and fine detail; they largely determine the edges and contours of the image, to which the human eye is sensitive. Because image noise is mostly high frequency, noise also corresponds to high-frequency components.
  • Filtering is applied to the sub-image data subImage(subW, subH) to obtain the data SubImageFiltered(subW, subH) with the high-frequency components removed. Specifically, the following formula can be used:
  • SubImageFiltered(subW, subH) = LPF(SubImage(subW, subH))
  • where LPF is a low-pass filter, for example a boxFilter. A boxFilter, also called box filtering or mean filtering, treats all pixels in the window, including the current pixel and its neighbors, equally and averages (or accumulates) them uniformly.
  • In another example, a filter F(x, y) of size K*K is designed in the frequency domain, where K can be adjusted according to actual needs. Each element x_ij (i, j ≤ K) can be obtained in, but is not limited to, a form based on D(x, y), the distance from the DC component, where the DC component is the average component value during filtering, together with an adjustable parameter σ that can be tuned according to actual needs: for example, when specific high-frequency components need to be filtered out, the value of σ can be adjusted, and depending on its value, high-frequency components of different magnitudes are removed. Here a_i = a_j = K/2, i.e. the median value of K is taken. The above is one specific example of the filtering process; the embodiments of the present application do not limit filtering to this method.
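  • As a concrete spatial-domain counterpart, a K*K mean (box) filter illustrates the low-pass filtering described above: every pixel becomes the average of its K*K neighbourhood, attenuating the high-frequency components that cause moiré on the first sub-screen. This is a sketch for a single 2-D channel, not the patent's frequency-domain filter; border pixels simply use the clamped neighbourhood.

```python
def box_filter(img, k):
    """K*K mean filter over a 2-D image (list of rows). Acts as a simple
    low-pass filter: averaging suppresses sharp, high-frequency changes."""
    h, w = len(img), len(img[0])
    half = k // 2
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            # Clamp the window at the image borders.
            ys = range(max(0, y - half), min(h, y + half + 1))
            xs = range(max(0, x - half), min(w, x + half + 1))
            vals = [img[j][i] for j in ys for i in xs]
            row.append(sum(vals) / len(vals))
        out.append(row)
    return out
```

A flat region is unchanged, while an isolated bright pixel is spread over its neighbourhood, which is exactly the smoothing a low-pass filter provides.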
  • The data types of the sub-image data include, but are not limited to, the RGBA data type and the YUV data type. The RGBA data type is a color space containing the three component data R (red), G (green), and B (blue) plus A (alpha, i.e. transparency); the YUV data type is a color space containing the three component data Y (luminance/luma), U, and V (chrominance/chroma), where U and V together are referred to as the chromaticity parameters.
  • step S1320 in this embodiment of the present application further includes steps:
  • In step S1321, the data type of the sub-image data is determined first; different filtering processes and filtering parameters are used for different data types. If the data type is RGBA, step S1322 is executed; if the data type is YUV, step S1323 is executed.
  • In step S1322, the three component data R, G, and B in the RGBA sub-image data are filtered separately. The specific processing is detailed in step S1320 and is not repeated here. It should be noted that only the R, G, and B components are processed; the A component is not, because the transparency of the image is not processed during display.
  • In step S1323, the three component data Y, U, and V in the YUV sub-image data are filtered separately; the specific processing is detailed in step S1320 and is not repeated here.
  • For different component data, the selected filtering parameters differ; the filtering parameters affect the degree to which each component is filtered and can be adjusted according to actual needs.
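  • The per-channel dispatch of steps S1321 to S1323 can be sketched as follows, assuming the sub-image is stored as a dict of named 2-D channel planes (an illustrative representation, not the patent's data layout):

```python
def filter_channels(pixels, data_type, lpf):
    """Apply a low-pass filter `lpf` (a function over one 2-D plane) to each
    colour component: R, G, B for RGBA data (the A plane is left untouched,
    since transparency is not processed for display), or Y, U, V for YUV data."""
    channels = ("R", "G", "B") if data_type == "RGBA" else ("Y", "U", "V")
    return {name: (lpf(plane) if name in channels else plane)
            for name, plane in pixels.items()}
```

In practice different filter parameters could be passed per channel, matching the note above that the filtering parameters differ between components.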
  • Step S1330 performs color processing on the sub-image data, specifically corresponding color processing according to the data type of the sub-image data, to remove stripes or color particles from the sub-image data and thereby improve the display effect of the screen. It should be noted that step S1330 may operate either directly on the sub-image data or on the sub-image data after filtering.
  • When the filtered sub-image data SubImageFiltered(subW, subH) is color-processed, the following formula is used:
  • SubImageChromaCompressed(subW, subH) = ChromCompressOperator(SubImageFiltered(subW, subH))
  • ChromCompressOperator is a calculation function that performs color processing on sub-image data. According to different data types of sub-image data, the calculation process of ChromCompressOperator for color processing is different.
  • Step S1330 includes the following steps. In step S1331, the data type of the sub-image data is determined: if the data type is RGBA, step S1332 is executed; if the data type is HSL/HSV, step S1333 is executed; if the data type is YUV/YCbCr, step S1334 is executed.
  • In step S1332, for RGBA data, the sub-image data is converted to the HSL/HSV data type, the saturation parameter of the HSL/HSV data is adjusted, and the adjusted sub-image data is converted back to the RGBA data type.
  • In step S1333, for HSL/HSV data, the saturation parameter of the sub-image data is adjusted directly.
  • In step S1334, for YUV/YCbCr data, the chrominance parameters of the sub-image data are adjusted; alternatively, the YUV/YCbCr sub-image data can be converted to the RGBA data type and then to the HSL/HSV data type, the saturation parameter adjusted, and the adjusted HSL/HSV sub-image data converted back to the YUV/YCbCr data type.
  • Adjusting the saturation parameter of the HSL/HSV sub-image data in steps S1332 and S1333 further includes adjusting the saturation parameter according to segmentation parameters. The segmentation parameters can be derived from the brightness parameter of the HSL/HSV sub-image data, or obtained from prior experiments. Several segmentation parameters form segment intervals; the segment interval in which the sub-image's saturation parameter falls is determined, and the corresponding adjustment algorithm is applied to obtain the adjusted saturation parameter.
  • Adjusting the saturation parameter through segmentation parameters makes the adjusted sub-image data display more naturally and with better effect, thereby improving the image display on the first sub-screen.
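  • A piecewise saturation adjustment in the spirit of AdjustSaturation can be sketched as follows. The segment structure (S0 <= S1 <= S2 splitting the saturation range, with per-segment factors k0, k1, k2) follows the parameter names in the text, but the linear per-segment form is an assumption; the patent fixes only the segmented structure.

```python
def adjust_saturation(s, k0, k1, k2, s0, s1, s2):
    """Compress a saturation value s in [0, 1] piecewise: each segment
    [0,S0), [S0,S1), [S1,S2) scales its portion of the saturation by its
    own factor, and values above S2 are clamped to the segment sum."""
    if s < s0:
        return k0 * s
    if s < s1:
        return k0 * s0 + k1 * (s - s0)
    if s < s2:
        return k0 * s0 + k1 * (s1 - s0) + k2 * (s - s1)
    return k0 * s0 + k1 * (s1 - s0) + k2 * (s2 - s1)
```

With factors below 1 in the upper segments, high saturations are compressed more strongly than low ones, which is the saturation-suppression effect the text describes.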
  • Adjusting the chrominance parameters of the YUV/YCbCr sub-image data in step S1334 further includes adjusting the chrominance parameters of the data according to the brightness parameter to obtain the adjusted chrominance parameters.
  • The brightness parameter is the value of the luminance component data in the YUV/YCbCr data. A brightness segment function determines the corresponding adjustment strategy according to the value of the brightness parameter; a segment function can include multiple segment values, each of which maps a range of the luminance parameter to an adjustment strategy for the chrominance parameters. The adjustment strategies are pre-stored in a preset parameter table: given an input chrominance parameter, the strategy yields the adjusted chrominance parameter. The strategies can be set from pre-experimental parameter values.
  • Adjusting the chromaticity parameters through the preset parameter table makes the adjusted sub-image data display more naturally and with better effect, improving the image display at the first sub-screen while reducing the time and complexity of adjusting the chromaticity parameters and simplifying the adjustment process.
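  • The preset-parameter-table lookup of step S1334 can be sketched as a table of luminance segments, each mapping to a chroma adjustment; here the adjustment strategy is reduced to a single scale factor, and the table contents are hypothetical examples rather than values from the patent.

```python
def adjust_chroma(y, u, v, table):
    """Look up the luminance y in a preset parameter table of
    (y_low, y_high, scale) rows and scale the chrominance (u, v) by the
    matching row's factor; luminance outside every segment leaves chroma
    unchanged."""
    for y_low, y_high, scale in table:
        if y_low <= y <= y_high:
            return y, u * scale, v * scale
    return y, u, v
```

A real table would carry a full strategy per segment (and could interpolate between adjacent segments, as the text suggests for out-of-interval values), but the lookup shape is the same.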
  • HSV = ColorConversion(RGB)
  • HSL = ColorConversion(RGB)
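  • The RGB-to-HSV conversion and back (step S1332) can be done with Python's standard colorsys module; here the patent's segment-based AdjustSaturation is replaced by a single illustrative scale factor:

```python
import colorsys

def compress_saturation_rgb(r, g, b, scale):
    """Convert an RGB pixel (components in [0, 1]) to HSV, scale the
    saturation channel, and convert back to RGB."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    s = max(0.0, min(1.0, s * scale))
    return colorsys.hsv_to_rgb(h, s, v)
```

Scaling the saturation of pure red to zero, for example, yields white at the same brightness, while a scale of 1 leaves the pixel unchanged.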
  • The saturation-adjustment parameters and the segment intervals determine the specific adjustment operation as follows: S'(x) is the adjusted saturation parameter value, obtained by performing the calculation associated with the segment interval, formed by the corresponding segmentation parameters, in which the saturation falls, together with the corresponding saturation-adjustment parameter.
  • The segmentation is not limited to three intervals: according to the display effect of the screen, it can be divided into more segments, or a smooth curve can be used to adjust the saturation parameter. In each case, the value of the input saturation parameter determines the value of the output saturation parameter.
  • the values of k0, k1, k2, S0, S1, and S2 can also be adjusted according to the luminance parameter v in the HSV data or the luminance parameter L in the HSL data; specifically, the adjustment can be made according to the luminance parameter v in the HSV data.
  • parm1, parm2, and parm3 are preset adjustment algorithms.
  • the adjustment methods for the values of k0, k1, k2, S0, S1, and S2 differ and can be set according to actual needs;
  • when the parameter v is not in the above preset intervals, that is, not in the intervals [v0, v1], [v2, v3], or [v4, v5],
  • the parameter values of the adjacent intervals can be interpolated to obtain it.
  • the method of adjusting the values of k0, k1, k2, S0, S1, and S2 according to the luminance parameter L in the HSL data is similar to the adjustment according to the luminance parameter v in the HSV data described above, so it is not repeated here.
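The selection of (k0, k1, k2, S0, S1, S2) from the luminance parameter v, with interpolation when v falls between the preset intervals, could look like the following sketch. The interval bounds, parameter values, and the choice of linear interpolation are illustrative assumptions; parm1/parm2/parm3 are represented simply as table entries:

```python
def select_params(v, table):
    """Pick the tuple (k0, k1, k2, S0, S1, S2) for brightness v.

    `table` maps brightness intervals (lo, hi) -- e.g. [v0, v1],
    [v2, v3], [v4, v5] -- to preset parameter tuples (the text's
    parm1, parm2, parm3).  When v falls in a gap between two
    intervals, the neighbouring parameter sets are linearly
    interpolated, as the text suggests.
    """
    spans = sorted(table.items())
    for (lo, hi), params in spans:
        if lo <= v <= hi:
            return params
    # v lies in a gap: interpolate between the surrounding intervals
    for ((_, hi_a), pa), ((lo_b, _), pb) in zip(spans, spans[1:]):
        if hi_a < v < lo_b:
            t = (v - hi_a) / (lo_b - hi_a)
            return tuple((1 - t) * a + t * b for a, b in zip(pa, pb))
    raise ValueError("brightness outside the table's range")
```

The same routine serves the HSL path by passing the luminance parameter L instead of v.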
  • Sub-image data: SubImageChromaSupressed(subW, subH) = RGB'.
  • the S parameter in the HSV/HSL data, that is, the saturation parameter,
  • is adjusted accordingly to reduce the saturation of the sub-image.
  • HSV' = AdjustSaturation(HSV, k0, k1, k2, S0, S1, S2)
  • HSL' = AdjustSaturation(HSL, k0, k1, k2, S0, S1, S2)
  • AdjustSaturation is the saturation-adjustment algorithm.
  • the specific adjustment algorithm is shown in Figure 8: k0, k1, and k2 are the saturation-adjustment parameters, S0, S1, and S2 are the segmentation parameters, and HSL' and HSV' are the adjusted data.
  • the specific adjustment operation is determined by combining Fig. 8 with the saturation-adjustment parameters and the segment intervals formed by the segmentation parameters, as follows:
  • S'(x) is the adjusted saturation parameter value; the corresponding calculation is performed over the segment interval formed by the corresponding segmentation parameter and the corresponding saturation-adjustment parameter to obtain the adjusted saturation parameter.
  • in practical applications the adjustment is not limited to three segmentation parameters, that is, the segmentation is not limited to three intervals; it can also be adjusted according to the display effect of the screen, for example by dividing the interval into more segments, by using a smooth curve to adjust the saturation parameter, or by determining the adjusted saturation parameter by means of a LUT.
  • the values of k0, k1, k2, S0, S1, and S2 can also be adjusted according to the luminance parameter v in the HSV data or the luminance parameter L in the HSL data; specifically, the adjustment can be made according to the luminance parameter v in the HSV data.
  • parm1, parm2, and parm3 are preset adjustment algorithms.
  • the adjustment methods for the values of k0, k1, k2, S0, S1, and S2 differ and can be set according to actual needs;
  • when the parameter v is not in the above preset intervals, that is, not in the intervals [v0, v1], [v2, v3], or [v4, v5],
  • the parameter values of the adjacent intervals can be interpolated to obtain it.
  • the method of adjusting the values of k0, k1, k2, S0, S1, and S2 according to the luminance parameter L in the HSL data is similar to the adjustment according to the luminance parameter v in the HSV data described above, so it is not repeated here.
  • the adjusted saturation parameter is finally obtained, yielding the color-processed sub-image data SubImageChromaSupressed(subW, subH) of the HSL/HSV data type.
  • the UV/CbCr parameters in the YUV/YCbCr data are adjusted accordingly to reduce the chrominance of the sub-image; the specific operation is as follows:
  • RGB = ColorConversion(YUV/YCbCr)
  • the saturation-adjustment parameters and the segment intervals together determine the specific adjustment operation, as follows:
  • S'(x) is the adjusted saturation parameter value
  • the adjusted saturation parameter is obtained by performing the corresponding calculation over the segment interval formed by the corresponding segmentation parameter and the corresponding saturation-adjustment parameter.
  • the segmentation is not limited to three intervals and can also be adjusted according to the display effect of the screen, such as by dividing the interval into more segments or by using a smooth curve to adjust the saturation parameter.
  • the value of the input saturation parameter determines the value of the output saturation parameter.
  • the values of k0, k1, k2, S0, S1, and S2 can also be adjusted according to the luminance parameter v in the HSV data or the luminance parameter L in the HSL data; specifically, the adjustment can be made according to the luminance parameter v in the HSV data.
  • parm1, parm2, and parm3 are preset adjustment algorithms.
  • the adjustment methods for the values of k0, k1, k2, S0, S1, and S2 differ and can be set according to actual needs;
  • when the parameter v is not in the above preset intervals, that is, not in the intervals [v0, v1], [v2, v3], or [v4, v5],
  • the parameter values of the adjacent intervals can be interpolated to obtain it.
  • the method of adjusting the values of k0, k1, k2, S0, S1, and S2 according to the luminance parameter L in the HSL data is similar to the adjustment according to the luminance parameter v in the HSV data described above, so it is not repeated here.
  • to adjust the chrominance parameters of the sub-image data of the YUV/YCbCr data type: obtain the luminance parameter of the sub-image data of the YUV/YCbCr data type; obtain from the preset parameter table, according to the luminance parameter and the luminance segmentation function,
  • the adjustment strategy of the chromaticity parameter; obtain the adjusted chromaticity parameter according to the adjustment strategy and the chromaticity parameter of the sub-image data of the YUV/YCbCr data type.
  • the segmentation function obtains the adjustment strategy of the chrominance parameter from the preset parameter table, that is, the strategy corresponding to regy0, regy1, or regy2.
  • the segment intervals can be set according to actual needs, and when the luminance parameter is not within a preset segment interval, the corresponding adjustment strategy can be determined by interpolation over the values of the adjacent intervals.
  • the specific luminance segmentation function is described in the following formula:
  • regy0, regy1, and regy2 each determine a specific chromaticity adjustment strategy in the preset parameter table: regy0 selects the LUT-table strategy LUT0, regy1 selects LUT1, and regy2 selects LUT2.
  • the input chrominance parameter UV/CbCr is processed by the chromaticity adjustment strategy, namely LUT0, LUT1, or LUT2, selected according to the luminance parameter of the sub-image data, and the adjusted chromaticity parameter UV'/CbCr' is obtained through that strategy.
  • the chromaticity parameters of the sub-image data of the YUV/YCbCr data type are thus adjusted, completing the color processing.
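The regy0/regy1/regy2 selection can be sketched as follows. This is a sketch under assumptions: the segment bounds, the per-segment LUT interface, and the chroma-toward-neutral example LUT below are all illustrative, since the contents of the preset parameter table are not given in the text:

```python
def adjust_chroma(y, u, v, luts, bounds):
    """Adjust the UV/CbCr chrominance of one YUV/YCbCr pixel.

    `bounds = (y0, y1)` splits the luminance range into the three
    segments regy0 / regy1 / regy2; `luts` holds one lookup
    function per segment, standing in for LUT0 / LUT1 / LUT2.
    The table contents are not given in the text, so the LUTs
    here are placeholders supplied by the caller.
    """
    y0, y1 = bounds
    segment = 0 if y < y0 else (1 if y < y1 else 2)
    u_adj, v_adj = luts[segment](u, v)
    return y, u_adj, v_adj
```

A toy strategy that pulls chroma toward the neutral value 128 by a per-segment factor shows the mechanism: darker pixels (lower luma) get stronger chroma suppression.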
  • the resolution adjustment, filtering processing, and color processing mentioned in the embodiments of the present application can all be combined and used according to actual requirements, so that the display effect of the screen is improved more effectively.
  • in step S1400, image display is performed on the first sub-screen 111 and the second sub-screen 112 of the terminal 100 according to the adaptively processed sub-image data and the original full-image data, where the full-image data refers to the image data corresponding to the to-be-displayed image prepared for display on the screen.
  • the sub-image corresponding to the adaptively processed sub-image data is overlaid at the corresponding position in the to-be-displayed image corresponding to the full-image data; that is, the sub-image directly covers the area of the original first sub-screen 111. The overlaid to-be-displayed image is shown on the first sub-screen 111, while the second sub-screen 112 displays the images corresponding to the other areas of the to-be-displayed image.
  • in this way the sub-image can be better displayed on the first sub-screen 111, improving the display effect of the screen and reducing the display difference between the first sub-screen 111 and the second sub-screen 112.
  • alternatively, the sub-image corresponding to the adaptively processed sub-image data and the to-be-displayed image corresponding to the full-image data are fused, in proportions given by the respective weights of the sub-image and the to-be-displayed image.
  • the fused to-be-displayed image is displayed on the screen, namely on the first sub-screen 111 and the second sub-screen 112; the corresponding weights can be calculated according to the resolutions and image sizes of the sub-image and the to-be-displayed image, and the fusion is computed according to these weights.
  • in this way the sub-image and the to-be-displayed image can be displayed more naturally on the first sub-screen 111 and the second sub-screen 112, improving the display effect of the screen and reducing the display difference between the first sub-screen 111 and the second sub-screen 112.
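Both display variants of step S1400, direct overlay and weighted fusion, can be expressed as one per-pixel blend. The sketch below assumes a single scalar weight w_sub; the text leaves the exact weight computation from resolutions and image sizes open, so that computation is not modeled here:

```python
def fuse_sub_image(full, sub, top, left, w_sub=1.0):
    """Blend the adapted sub-image into the to-be-displayed image.

    `full` and `sub` are 2-D lists of pixel values, and (top, left)
    is the position of the first sub-screen region.  w_sub = 1.0
    reproduces the direct-overlay variant; 0 < w_sub < 1 gives the
    weighted-fusion variant, with the full image weighted 1 - w_sub.
    """
    out = [row[:] for row in full]  # leave the input image untouched
    for i, sub_row in enumerate(sub):
        for j, p in enumerate(sub_row):
            base = out[top + i][left + j]
            out[top + i][left + j] = w_sub * p + (1 - w_sub) * base
    return out
```

With w_sub = 1.0 the sub-image simply replaces the first sub-screen region; intermediate weights soften the boundary between the two sub-screens.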
  • the terminal 100 mentioned in the embodiments of the present application may be a terminal 100 provided with an under-screen camera, where the under-screen camera is located below the area of the screen of the terminal 100 where the first sub-screen 111 is located; that is, the first
  • sub-screen 111 is the part of the screen of the terminal 100 that carries the under-screen camera.
  • regarding the terminal 100 with an under-screen camera, it should be noted that current mobile terminal design is developing toward large screens. To effectively expand the display area, the camera often needs special handling, hence terminals 100 with under-screen cameras; however, to receive light for imaging, the under-screen camera requires the light transmittance of the corresponding screen area to be adjusted.
  • the embodiments of the present application can solve the display problem in this region, effectively achieve consistency of the overall display effect of the screen of the terminal 100, and improve the display effect of the screen. This is also conducive to the development of under-screen cameras.
  • an image display method is provided.
  • sub-image data for display on the first sub-screen 111 is obtained from the acquired screen parameters and full-image data.
  • the obtained sub-image data is adaptively processed, and images are displayed on the first sub-screen 111 and the second sub-screen 112 according to the adaptively processed sub-image data and full-image data, which can reduce the display
  • difference of the screen of the terminal 100, thereby improving the user experience.
  • an embodiment of the present application further provides an electronic device, referring to FIG. 10 , comprising: at least one processor 200 , and a memory 300 communicatively connected to the at least one processor 200 ;
  • the processor 200 is configured to execute the image display method applied to the terminal 100 having the screen 110 in the embodiment of the first aspect by invoking the computer program stored in the memory 300 .
  • the electronic device may be a mobile terminal electronic device or a non-mobile terminal electronic device.
  • mobile terminal electronic devices can be mobile phones, tablet computers, notebook computers, PDAs, vehicle-mounted terminal devices, wearable devices, ultra-mobile personal computers, netbooks, personal digital assistants, CPE, UFI (wireless hotspot devices), etc.; non-mobile terminal electronic
  • devices may be personal computers, televisions, teller machines, self-service machines, etc. The embodiments of the present application do not specifically limit this.
  • the electronic device further includes a screen 110 connected to the processor 200 and driven by the processor 200 to display images.
  • the memory 300 can be used to store non-transitory software programs and non-transitory computer-executable programs, such as the image display method applied to the terminal 100 having the screen 110 in the embodiments of the first aspect of the present application.
  • the processor 200 implements the image display method applied to the terminal 100 having the screen 110 in the above-mentioned first aspect embodiment by running the non-transitory software program and instructions stored in the memory 300 .
  • the memory 300 may include a storage program area and a storage data area, where the storage program area may store an operating system and an application program required by at least one function, such as the image display method of the terminal 100. Additionally, the memory 300 may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid-state storage device. In some embodiments, the memory 300 may include memories located remotely with respect to the processor 200, and these remote memories may be connected to the terminal 100 through a network. Examples of such networks include, but are not limited to, the Internet, an intranet, a local area network, a mobile communication network, and combinations thereof.
  • the non-transitory software programs and instructions required to implement the image display method applied to the terminal 100 with the screen 110 in the embodiments of the first aspect are stored in the memory 300, and when executed by one or more processors 200, they perform the image display method applied to the terminal 100 having the screen 110 in the above embodiments of the first aspect.
  • the embodiments of the present application further provide a computer-readable storage medium storing computer-executable instructions, where the computer-executable instructions are used to execute the image display method applied to the terminal 100 having the screen 110 in the embodiments of the first aspect;
  • the computer-readable storage medium stores computer-executable instructions that are executed by one or more control processors 200, e.g., by a processor 200 of the electronic device of the second-aspect embodiment, so that the above one or more processors 200 execute the image display method applied to the terminal 100 having the screen 110 in the above embodiments of the first aspect.
  • the device embodiments described above are only illustrative, and the units described as separate components may or may not be physically separated; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution in this embodiment.
  • computer storage media include, but are not limited to, RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disk (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by a computer.
  • communication media typically embody computer-readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism, and can include any information delivery media, as is well known to those of ordinary skill in the art.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)

Abstract

Image display method, electronic device, and computer-readable storage medium. The image display method is applied to a terminal having a screen. The screen comprises a first sub-screen and a second sub-screen that have different resolutions. The method comprises: acquiring a screen parameter of the screen and full-image data of an image to be displayed (S1100); obtaining, according to the acquired screen parameter and the full-image data, sub-image data to be displayed on the first sub-screen (S1200); performing adaptive processing on the obtained sub-image data (S1300); and displaying an image on the first sub-screen and the second sub-screen according to the adaptively processed sub-image data and the full-image data (S1400).
PCT/CN2021/104188 2020-08-11 2021-07-02 Procédé d'affichage d'image, dispositif électronique et support d'enregistrement lisible par ordinateur WO2022033230A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010801991.9 2020-08-11
CN202010801991.9A CN114078101A (zh) 2020-08-11 2020-08-11 图像显示方法、电子设备及计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2022033230A1 true WO2022033230A1 (fr) 2022-02-17

Family

ID=80246836

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/104188 WO2022033230A1 (fr) 2020-08-11 2021-07-02 Procédé d'affichage d'image, dispositif électronique et support d'enregistrement lisible par ordinateur

Country Status (2)

Country Link
CN (1) CN114078101A (fr)
WO (1) WO2022033230A1 (fr)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114780197B (zh) * 2022-04-29 2023-12-22 北京字跳网络技术有限公司 分屏渲染方法、装置、设备及存储介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110716677A (zh) * 2019-10-08 2020-01-21 Oppo广东移动通信有限公司 屏幕组件以及终端
CN111078170A (zh) * 2019-11-29 2020-04-28 北京小米移动软件有限公司 显示控制方法、显示控制装置及计算机可读存储介质
CN111258527A (zh) * 2020-01-14 2020-06-09 北京欧铼德微电子技术有限公司 显示调整方法、装置、设备和存储介质
CN111355879A (zh) * 2018-12-24 2020-06-30 北京小米移动软件有限公司 包含特效图案的图像获取方法、装置和电子设备
CN111383166A (zh) * 2018-12-29 2020-07-07 北京小米移动软件有限公司 处理待显示图像的方法及装置、电子设备、可读存储介质
CN111625213A (zh) * 2019-02-28 2020-09-04 北京小米移动软件有限公司 画面显示方法、装置和存储介质
CN111650995A (zh) * 2020-05-21 2020-09-11 Oppo广东移动通信有限公司 图像显示方法、装置、移动终端及存储介质

Also Published As

Publication number Publication date
CN114078101A (zh) 2022-02-22

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21855283

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 03.07.2023)

122 Ep: pct application non-entry in european phase

Ref document number: 21855283

Country of ref document: EP

Kind code of ref document: A1