WO2020191516A1 - Image data processing apparatus and method - Google Patents

Image data processing apparatus and method

Info

Publication number
WO2020191516A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
spr
sub
ddic
pixels
Prior art date
Application number
PCT/CN2019/079208
Other languages
English (en)
French (fr)
Inventor
韦育伦
许景翔
牛泽宇
刘洋
Original Assignee
Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd.
Priority to CN201980092287.6A (published as CN113439442A)
Priority to PCT/CN2019/079208
Publication of WO2020191516A1

Classifications

    • G: PHYSICS
    • G09: EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G: ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G3/00: Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
    • G09G3/20: Control arrangements or circuits for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix, no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00: Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42: Methods or arrangements characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T1/00: General purpose image data processing
    • G06T1/20: Processor architectures; Processor configuration, e.g. pipelining

Definitions

  • This application relates to the field of display technology, and in particular to an image data processing device and method.
  • the display panels are required to have high resolution, high contrast, and wide color gamut.
  • the display panels are also required to respond faster, to be thinner, to consume less power, and to be foldable.
  • image processing and image display are usually completed by different modules, and the processed image data needs to be sent from the image processing module to the image display module for display.
  • the display resolution of the display panel increases, the data volume of the image to be displayed is also greatly increased. Therefore, the bandwidth and power consumption required to transmit the image data to be displayed from the image processing module to the image display module also increase.
  • the embodiments of the present application provide an image data processing apparatus and method, which are used to reduce the bandwidth and power consumption of image data transmission.
  • the first aspect of the present application provides an image data processing device, which includes an application processor AP, a display driver integrated circuit DDIC, and a transmission interface. The AP is used to perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image; the SPR image can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed. The AP is also used to encode the SPR image to obtain a first data stream, and to send the first data stream to the DDIC through the transmission interface; the DDIC is used to decode the first data stream to obtain a decoded image.
  • SPA displays are usually used.
  • the original input three-component image data must be processed by SPR before it can be displayed correctly on the SPA display.
  • the number of sub-pixels contained in each pixel of the image data becomes smaller, and the amount of image data is greatly reduced.
  • if the image data after SPR processing is in an RGBG arrangement or an RGB delta arrangement, the data after SPR processing is 2/3 of the original image data; if the image data after SPR processing is in an SPR1.5 arrangement, the data after SPR processing is 1/2 of the original image data.
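  • The 2/3 and 1/2 ratios follow directly from the sub-pixels-per-pixel counts of each arrangement; a small sketch (the panel resolution used here is only an illustrative assumption):

```python
def subpixel_count(width, height, subpixels_per_pixel):
    """Total sub-pixel count for an image of width x height pixels."""
    return width * height * subpixels_per_pixel

w, h = 1440, 3200                     # illustrative panel resolution
rgb = subpixel_count(w, h, 3)         # original RGB: 3 sub-pixels per pixel
rgbg = subpixel_count(w, h, 2)        # RGBG / RGB delta after SPR: 2 per pixel
spr15 = subpixel_count(w, h, 1.5)     # SPR1.5: 1.5 sub-pixels per pixel on average

assert rgbg / rgb == 2 / 3            # RGBG data is 2/3 of the original
assert spr15 / rgb == 1 / 2           # SPR1.5 data is 1/2 of the original
```

The ratios are independent of the resolution chosen, since the per-pixel factor cancels the width-times-height term.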
  • since the amount of image data after SPR processing is greatly reduced, the amount of encoded image data is also greatly reduced, and the bandwidth and power consumption required to transmit the image data from the AP to the DDIC are reduced accordingly. Further, the amount of data processed by the DDIC and the storage space required on the DDIC side are greatly reduced, which greatly improves the performance of the image processing device. As for the SPR processing core, because integration is higher on the SOC side, the hardware logic area occupied by the SPR processing core is smaller, and the cost and power consumption of the SPR processing core, or intellectual property (IP) core, itself also decrease.
  • both the AP and DDIC sides include transmission interfaces.
  • the transmission interface includes a sending interface and a receiving interface.
  • the transmission interface on the AP side is the sending interface, and the transmission interface on the DDIC side is the receiving interface.
  • the AP includes an SPR integrated circuit or solidified SPR hardware logic, which is specifically used to perform sub-pixel rendering SPR on the original image to be displayed to obtain the SPR image.
  • the SPR image is an RGBG image
  • the AP further includes 4 processing channels for encoding the 4 components R, G, B, and G of the RGBG image, respectively, to obtain the first data stream, where each of the 4 processing channels processes one component.
  • the AP is specifically used to convert the RGBG image into a bright-color separated (that is, luminance-chrominance separated) SPR image; the bright-color separated SPR image includes 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component. The 4 processing channels are specifically used to encode the 4 components U, Y, V, and Y of the bright-color separated SPR image, respectively, to obtain the first data stream, where each of the 4 processing channels processes one component.
  • converting the RGBG image to a bright-color separated color space further reduces the amount of image data and saves transmission bandwidth. Moreover, because the correlation between the luminance and chrominance of the image is reduced, at the same compression ratio the image distortion caused by encoding and decoding is reduced, so the decoded image has a better display effect and less distortion.
  • the bright-color separated SPR image includes: UYVY image, YVYU image, YUYV image, or VYUY image.
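  • As an illustration of how one RGBG pixel pair could map to the four components U, Y, V, and Y, the sketch below borrows the missing R/B components from the shared neighbours and uses BT.601-style weights; the patent does not disclose its actual conversion matrix, so these coefficients are assumptions:

```python
def rgbg_pair_to_uyvy(r, g0, b, g1):
    """Map one RGBG pixel pair (R,G0 and B,G1 sub-pixels) to U, Y0, V, Y1.

    The first pixel's missing B and the second pixel's missing R are
    borrowed from the shared neighbours, and BT.601-style weights are
    used; the coefficients are illustrative assumptions only.
    """
    y0 = 0.299 * r + 0.587 * g0 + 0.114 * b      # luma of first pixel
    y1 = 0.299 * r + 0.587 * g1 + 0.114 * b      # luma of second pixel
    g = (g0 + g1) / 2                            # shared green estimate
    u = -0.169 * r - 0.331 * g + 0.5 * b + 128   # blue-difference chroma
    v = 0.5 * r - 0.419 * g - 0.081 * b + 128    # red-difference chroma
    return u, y0, v, y1

# A neutral grey pair stays grey: Y equals the input level, U and V sit at 128.
u, y0, v, y1 = rgbg_pair_to_uyvy(100, 100, 100, 100)
```

Reordering the returned tuple gives the YVYU, YUYV, or VYUY layouts listed above.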
  • the decoded image includes the four components U, Y, V, and Y
  • the DDIC is also used to convert the four components U, Y, V, and Y of the decoded image into the four components R, G, B, and G of an RGBG image.
  • the SPR image is an RGB delta image
  • the AP is specifically used to: perform format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels of the mapped RGB image is equal to the number of sub-pixels of the RGB delta image; convert the RGB image to the bright-color separated YCoCg color space to obtain a YCoCg image, where the YCoCg image includes 3 components Y, Co, and Cg; and encode the YCoCg image to obtain the first data stream.
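  • The RGB-to-YCoCg conversion is conventionally the following linear transform; the patent does not state that it uses exactly these coefficients, so treat this as a sketch of the standard transform:

```python
def rgb_to_ycocg(r, g, b):
    """Standard RGB -> YCoCg forward transform:
    Y = R/4 + G/2 + B/4, Co = (R - B)/2, Cg = (2G - R - B)/4."""
    y = r / 4 + g / 2 + b / 4     # luminance: green-weighted average
    co = r / 2 - b / 2            # orange chrominance axis
    cg = -r / 4 + g / 2 - b / 4   # green chrominance axis
    return y, co, cg

# White maps to full luminance with zero chrominance.
assert rgb_to_ycocg(255, 255, 255) == (255.0, 0.0, 0.0)
```

Because the chrominance planes of natural images concentrate near zero after this transform, they compress better than the raw R and B planes.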
  • the AP includes 4 processing channels; 3 of the 4 processing channels are used to encode the 3 components R, G, and B of the RGB image, respectively, to obtain the first data stream, where one processing channel processes one component; or, 3 of the 4 processing channels are used to encode the 3 components Y, Co, and Cg of the YCoCg image, respectively, to obtain the first data stream, where one processing channel processes one component.
  • the AP's 4 processing channels can encode 4-component data simultaneously or encode 3-component data simultaneously; for 3-component data, any 3 of the 4 channels can encode the 3 components of the image separately and simultaneously.
  • the decoded image is the YCoCg image
  • the DDIC is also used to: convert the three components Y, Co, and Cg of the YCoCg image into the three components R, G, and B of the RGB image; and convert the RGB image into an RGB delta image that can be displayed on the SPA display screen.
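  • The DDIC-side step can be sketched with the standard YCoCg-to-RGB inverse transform (again assuming the conventional coefficients, which the patent does not spell out):

```python
def ycocg_to_rgb(y, co, cg):
    """Inverse of the standard YCoCg transform
    (Y = R/4 + G/2 + B/4, Co = (R - B)/2, Cg = (2G - R - B)/4)."""
    r = y + co - cg   # recover red
    g = y + cg        # recover green
    b = y - co - cg   # recover blue
    return r, g, b

# Zero chrominance reproduces a neutral grey at the luminance level.
assert ycocg_to_rgb(100.0, 0.0, 0.0) == (100.0, 100.0, 100.0)
```

With exact arithmetic this is the algebraic inverse of the forward transform, so the conversion itself introduces no loss; any distortion comes from the codec in between.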
  • the AP is integrated on the system chip SOC, and the DDIC is outside the SOC.
  • the device further includes: a folding screen including a first display screen and a second display screen;
  • the DDIC includes a first DDIC and a second DDIC; the first DDIC is used to drive the first display screen, and the second DDIC is used to drive the second display screen.
  • the transmission interface is a Mobile Industry Processor Interface (MIPI) interface, such as the Display Serial Interface (DSI) standardized by MIPI, or an embedded DisplayPort (eDP) standardized by the Video Electronics Standards Association (VESA).
  • a second aspect of the present application provides an image data processing device, which includes an application processor AP and a display driver integrated circuit DDIC. The AP includes a sub-pixel rendering processing core, an encoder, and a sending interface, and the DDIC includes a receiving interface and a decoder. The sub-pixel rendering processing core is used to perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image, which can be displayed on a sub-pixel arrangement SPA display screen.
  • the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed;
  • the encoder is used to perform the SPR image Encode to obtain a first data stream;
  • the sending interface is used to send the first data stream to the receiving interface of the DDIC;
  • the decoder is used to decode the first data stream to obtain a decoded SPR image.
  • SPA displays are usually used.
  • the original input three-component image data must be processed by SPR before it can be displayed correctly on the SPA display.
  • the SPR processing is completed in the AP. Since the number of sub-pixels contained in each pixel of the image data after SPR processing becomes smaller, the amount of image data is greatly reduced. For example, if the image data after SPR processing is in an RGBG arrangement or an RGB delta arrangement, the data after SPR processing is 2/3 of the original image data; if the image data after SPR processing is in an SPR1.5 arrangement, the data after SPR processing is 1/2 of the original image data.
  • because the amount of image data after SPR processing is greatly reduced, the amount of encoded image data is also greatly reduced, the bandwidth and power consumption required to transmit the image data from the AP to the DDIC are reduced, and the amount of data processed by the DDIC and the storage space on the DDIC side are greatly reduced, which greatly improves the performance of the image processing device.
  • as for the SPR processing core, because integration is higher on the AP side, the hardware logic area occupied by the SPR processing core is smaller, and the cost and power consumption of the SPR processing core, or intellectual property (IP) core, itself also decrease.
  • the sub-pixel rendering processing core is a dedicated integrated hardware circuit or a dedicated solidified hardware core.
  • the DDIC further includes a driver for driving the display screen to display the decoded SPR image.
  • the AP further includes a first color space conversion module, configured to convert the SPR image into a bright-color separated color space to obtain a bright-color separated SPR image; the encoder is specifically used to encode the bright-color separated SPR image to obtain the first data stream.
  • converting the RGBG image to a bright-color separated color space further reduces the amount of image data and saves transmission bandwidth. Moreover, because the correlation between the luminance and chrominance of the image is reduced, at the same compression ratio the image distortion caused by encoding and decoding is reduced, so the decoded image has a better display effect and less distortion.
  • the first color space conversion module is used to stretch and rotate the SPR image in the color space.
  • the first color space conversion module is specifically configured to perform color space conversion on the SPR image based on a color space conversion matrix.
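  • Per pixel, a matrix-based color space conversion of this kind is a 3x3 matrix multiply plus an optional offset. The sketch below is generic, since the matrix values are not disclosed; the identity matrix is only a placeholder:

```python
def convert_color_space(pixel, matrix, offset=(0.0, 0.0, 0.0)):
    """Apply a 3x3 color space conversion matrix to one pixel:
    out = M @ in + offset. The matrix values the patent's module
    would use are not disclosed, so callers must supply their own."""
    return tuple(
        sum(m * c for m, c in zip(row, pixel)) + o
        for row, o in zip(matrix, offset)
    )

# Placeholder matrix: the identity leaves the pixel unchanged.
IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
```

Substituting the BT.601 or YCoCg coefficients for the placeholder matrix reproduces the conversions described elsewhere in this document.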
  • the decoded SPR image is located in the bright-color separated color space
  • the DDIC further includes a second color space conversion module for converting the decoded SPR image from the bright-color separated color space to the color space where the SPR image is located.
  • the SPR image is an RGBG image
  • the color space of the bright color separation is a YUV color space
  • the first color space conversion module is specifically configured to convert the RGBG image to the YUV color space to obtain the bright-color separated SPR image, which includes 4 components P0, P1, P2, and P3.
  • the 4 components P0, P1, P2, and P3 contain two luminance signal components and two chrominance signal components.
  • the bright-color separated SPR image includes: UYVY image, YVYU image, YUYV image, or VYUY image.
  • the decoded SPR image is located in the YUV color space
  • the decoded SPR image includes the four components P0, P1, P2, and P3
  • the second color space conversion module is specifically configured to convert the four components P0, P1, P2, and P3 of the decoded SPR image into the four components R, G, B, and G of the RGBG image.
  • the SPR image is an RGB delta image
  • the color space of the bright color separation is the YCoCg color space
  • the AP further includes: a first format mapping module for converting the RGB delta image into an RGB image
  • the first color space conversion module is specifically used to: convert the RGB image to the YCoCg color space to obtain a YCoCg image
  • the YCoCg image includes three components Y, Co, and Cg.
  • the decoded SPR image is the YCoCg image
  • the second color space conversion module is specifically configured to convert the three components Y, Co, and Cg of the YCoCg image into the three components R, G, and B of the RGB image; the DDIC also includes a second format mapping unit, used to convert the RGB image into the RGB delta image.
  • the encoder includes 4 processing channels. The SPR image is an RGBG image, and the 4 processing channels are used to encode the 4 components R, G, B, and G of the RGBG image, respectively, to obtain the first data stream, where one processing channel processes one component; or, the SPR image is an RGB delta image, the device further includes a first format mapping unit for converting the RGB delta image into an RGB image, and 3 of the 4 processing channels encode the 3 components R, G, and B of the RGB image, respectively, to obtain the first data stream, where one processing channel processes one component.
  • the encoder includes 4 processing channels; the 4 processing channels are used to encode the 4 components P0, P1, P2, and P3, respectively, to obtain the first data stream; or, 3 of the 4 processing channels are used to encode the 3 components Y, Co, and Cg of the YCoCg image, respectively, to obtain the first data stream.
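  • The channel layout can be sketched as up to four independent per-component encoders running in parallel; the run-length coder below is only a stand-in for whatever codec the device actually uses:

```python
from concurrent.futures import ThreadPoolExecutor

def encode_component(name, samples):
    """Stand-in per-channel encoder: run-length encodes one component
    plane. A real display-link codec would compress here instead."""
    out, prev, run = [], None, 0
    for s in samples:
        if s == prev:
            run += 1
        else:
            if prev is not None:
                out.append((prev, run))
            prev, run = s, 1
    if prev is not None:
        out.append((prev, run))
    return name, out

def encode_channels(components):
    """Encode up to 4 component planes on parallel channels,
    one channel per component, as the encoder description suggests."""
    assert len(components) <= 4, "the encoder has 4 processing channels"
    with ThreadPoolExecutor(max_workers=4) as pool:
        return dict(pool.map(lambda kv: encode_component(*kv), components.items()))
```

Passing four planes (R, G, B, G or P0..P3) uses all channels; passing three (Y, Co, Cg) leaves one channel idle, matching the 3-of-4 case above.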
  • the AP is integrated on the system chip SOC, and the DDIC is outside the SOC.
  • the transmission interface is a Mobile Industry Processor Interface (MIPI) interface, such as the Display Serial Interface (DSI) standardized by MIPI, or an embedded DisplayPort (eDP) standardized by the Video Electronics Standards Association (VESA).
  • the transmission interface includes an HDMI or V-By-One interface.
  • the DDIC further includes: a screen brightness compensator, which is used to perform Demura processing on the decoded SPR image to obtain the target displayed SPR image.
  • the AP further includes an encapsulation module, configured to perform format encapsulation on the first data stream to obtain a second data stream, where the second data stream is a data stream matched to the sending interface and the receiving interface; the sending interface is specifically used to send the second data stream to the receiving interface of the DDIC.
  • the DDIC further includes: a decapsulation module, configured to decapsulate the second data stream received by the receiving interface to obtain the first data stream.
  • the sending interface is used to perform format encapsulation on the first data stream to obtain a second data stream, where the second data stream is a data stream matched to the sending interface and the receiving interface
  • the sending interface is also used to send the second data stream to the receiving interface of the DDIC; the receiving interface is used to decapsulate the second data stream to obtain the first data stream.
  • the third aspect of the present application provides an image data processing device.
  • the device includes a processor and a transmission interface. The processor is used to perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image, which can be displayed on a sub-pixel arrangement SPA display screen; the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed. The SPR image is encoded to obtain the first data stream, and the first data stream is sent through the transmission interface.
  • the transmission interface may be considered as a part of the processor, and the processor sends or receives data through the transmission interface.
  • the processor and the transmission interface jointly constitute the chip
  • the processor is a solidified hardware circuit and/or solidified hardware logic with arithmetic processing functions in the chip; drivers and other program instructions are solidified in these hardware circuits or hardware logic.
  • the transmission interface is an interface for receiving or sending data in the chip.
  • the processor sends the first data stream to the display driver integrated circuit DDIC through the transmission interface.
  • the SPR image is an RGBG image
  • the processor further includes 4 processing channels for encoding the 4 components R, G, B, and G of the RGBG image, respectively, to obtain the first data stream, where each of the 4 processing channels processes one component.
  • the processor is specifically configured to convert the RGBG image into a bright-color separated SPR image, the bright-color separated SPR image including 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component; the 4 processing channels are specifically used to encode the 4 components U, Y, V, and Y of the bright-color separated SPR image, respectively, to obtain the first data stream, where each of the 4 processing channels processes one component.
  • the SPR image is an RGB delta image
  • the processor is specifically configured to: perform format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels of the mapped RGB image is equal to the number of sub-pixels of the RGB delta image; convert the RGB image to the bright-color separated YCoCg color space to obtain a YCoCg image, where the YCoCg image includes three components Y, Co, and Cg; and encode the YCoCg image to obtain the first data stream.
  • the processor is integrated on a system chip SOC, and the DDIC is outside the SOC.
  • the transmission interface is a Mobile Industry Processor Interface (MIPI) interface, such as the Display Serial Interface (DSI) standardized by MIPI, or an embedded DisplayPort (eDP) standardized by the Video Electronics Standards Association (VESA).
  • the processor includes 4 processing channels; 3 of the 4 processing channels are used to encode the 3 components R, G, and B of the RGB image, respectively, to obtain the first data stream, where one processing channel processes one component; or, 3 of the 4 processing channels are used to encode the 3 components Y, Co, and Cg of the YCoCg image, respectively, to obtain the first data stream, where one processing channel processes one component.
  • a fourth aspect of the present application provides a display driver integrated circuit DDIC.
  • the DDIC includes a decoder and a transmission interface. The transmission interface is used to receive a first data stream, which includes a sub-pixel rendered SPR image; the SPR image can be displayed on a sub-pixel arrangement SPA display, the number of physical sub-pixels of the SPA display is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed;
  • the decoder is used to decode the first data stream to obtain decoded image data.
  • the first data stream is sent by the application processor AP.
  • the SPR image is an RGBG image
  • the decoded image data includes four components U, Y, V, and Y, where U and V are chrominance signal components, and Y is a luminance signal component.
  • the DDIC also includes a first color space conversion processing integrated circuit; the color space conversion processing integrated circuit is used to convert the four components U, Y, V, and Y of the decoded image into the four components R, G, B, and G of the RGBG image.
  • the SPR image is an RGB delta image
  • the decoded image data is an RGB image
  • the DDIC further includes a format mapping processing integrated circuit or solidified format mapping hardware logic; the format mapping processing integrated circuit or solidified hardware logic is used to perform format mapping on the RGB image to obtain the RGB delta image.
  • the decoder includes 4 processing channels.
  • the DDIC includes a first sub-DDIC and a second sub-DDIC, and the first sub-DDIC and the second sub-DDIC are coupled.
  • the AP is integrated on the system chip SOC, and the DDIC is outside the SOC.
  • the transmission interface is a Mobile Industry Processor Interface (MIPI) interface, such as the Display Serial Interface (DSI) standardized by MIPI, or an embedded DisplayPort (eDP) standardized by the Video Electronics Standards Association (VESA).
  • the fifth aspect of the present application provides a method for image data processing.
  • the method includes: performing sub-pixel rendering SPR on an original image to be displayed to obtain an SPR image, which can be displayed on a sub-pixel arrangement SPA display screen.
  • the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed; the SPR image is encoded to obtain the first data stream; and the first data stream is sent through the transmission interface.
  • the SPR image is an RGBG image
  • encoding the SPR image to obtain the first data stream specifically includes: encoding the four components R, G, B, and G of the RGBG image, respectively, to obtain the first data stream.
  • encoding the SPR image to obtain the first data stream specifically includes: converting the RGBG image into a bright-color separated SPR image, where the bright-color separated SPR image includes 4 components U, Y, V, and Y, U and V are chrominance signal components, and Y is a luminance signal component; and encoding the four components U, Y, V, and Y of the bright-color separated SPR image separately to obtain the first data stream.
  • the bright-color separated SPR image includes: UYVY image, YVYU image, YUYV image, or VYUY image.
  • the SPR image is an RGB delta image
  • encoding the SPR image to obtain the first data stream specifically includes: performing format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels in the mapped RGB image is equal to the number of sub-pixels in the RGB delta image; converting the RGB image to the bright-color separated YCoCg color space to obtain a YCoCg image, where the YCoCg image includes 3 components Y, Co, and Cg; and encoding the YCoCg image to obtain the first data stream.
  • a sixth aspect of the present application provides an image data processing method. The method includes: receiving a first data stream, where the first data stream includes a sub-pixel rendered SPR image that can be displayed on a sub-pixel arrangement SPA display screen; the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed; and decoding the first data stream to obtain decoded image data.
  • the SPR image is an RGBG image
  • the decoded image data includes four components U, Y, V, and Y, where U and V are chrominance signal components, and Y is a luminance signal component.
  • the method further includes: converting the four components U, Y, V, and Y of the decoded image into four components R, G, B, and G of the RGBG image.
  • the SPR image is an RGB delta image
  • the decoded image data is an RGB image
  • the method further includes: performing format mapping on the RGB image to obtain the RGB delta image.
  • the SPR image is an RGB delta image
  • the decoded image data is the YCoCg image
  • the method further includes: converting the three components Y, Co, and Cg of the YCoCg image into the three components R, G, and B of the RGB image; and converting the RGB image into an RGB delta image that can be displayed on the SPA display screen.
  • the seventh aspect of the present application provides an image data processing method, which includes: the AP performs sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image, which can be displayed on a sub-pixel arrangement SPA display screen,
  • the number of physical sub-pixels of the SPA display screen is less than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is less than the number of sub-pixels of the original image to be displayed;
  • the AP encodes the SPR image to obtain the first data stream; the AP sends the first data stream to the DDIC; and the DDIC decodes the first data stream to obtain the decoded SPR image.
  • the method further includes: the DDIC driving the display screen to display the decoded SPR image.
  • the AP encodes the SPR image to obtain the first data stream, which specifically includes: the AP converts the SPR image into a bright-color separated color space to obtain a bright-color separated SPR image; and the AP encodes the bright-color separated SPR image to obtain the first data stream.
  • converting the RGBG image to a bright-color separated color space further reduces the amount of image data and saves transmission bandwidth. Moreover, because the correlation between the luminance and chrominance of the image is reduced, at the same compression ratio the image distortion caused by encoding and decoding is reduced, so the decoded image has a better display effect and less distortion.
  • the AP converts the SPR image into the bright-color separated color space, which specifically includes: the AP converts the SPR image into the bright-color separated color space based on a color space conversion matrix.
  • the decoded SPR image is located in the color space of the bright color separation
  • the method further includes: DDIC converts the decoded SPR image from the color space of the bright color separation to the color space where the SPR image is located.
  • the SPR image is an RGBG image
  • the color space of the bright color separation is the YUV color space
  • the AP converts the SPR image into the bright-color separated color space, which specifically includes: the AP converts the RGBG image to the YUV color space to obtain the bright-color separated SPR image.
  • the bright-color separated SPR image includes 4 components P0, P1, P2, and P3.
  • the 4 components P0, P1, P2, and P3 contain two luminance signal components and two chrominance signal components.
  • the bright-color separated SPR image includes: UYVY image, YVYU image, YUYV image, or VYUY image.
  • the decoded SPR image is located in the YUV color space
  • the decoded SPR image includes the 4 components P0, P1, P2, and P3
  • the DDIC converting the decoded SPR image from the bright-color separated color space to the color space where the SPR image is located specifically includes: the DDIC converts the four components P0, P1, P2, and P3 of the decoded SPR image into the four components R, G, B, and G of the RGBG image.
  • the SPR image is an RGB delta image
  • the color space of the bright color separation is the YCoCg color space
  • the AP converts the SPR image into the bright-color separated color space, which specifically includes: the AP converts the RGB delta image into an RGB image; and the AP converts the RGB image into the YCoCg color space to obtain a YCoCg image, where the YCoCg image includes three components Y, Co, and Cg.
  • the decoded SPR image is the YCoCg image
  • the DDIC converts the decoded SPR image from the bright-color separated color space to the color space where the SPR image is located, specifically including: the DDIC converts the three components Y, Co, and Cg of the YCoCg image into the three components R, G, and B of the RGB image; and the DDIC converts the RGB image into the RGB delta image.
  • the method further includes: the DDIC performs Demura processing on the decoded SPR image to obtain the target display SPR image.
  • the method further includes: the AP performs format encapsulation on the first data stream to obtain a second data stream, where the second data stream is a data stream matched to the sending interface and the receiving interface; the AP sends the second data stream to the receiving interface of the DDIC; and the DDIC decapsulates the second data stream received by the receiving interface to obtain the first data stream.
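  • The encapsulation/decapsulation pair can be sketched as simple framing around the first data stream; real MIPI-DSI or eDP packet formats differ, so the 6-byte header used here is purely illustrative:

```python
import struct

def encapsulate(payload: bytes, stream_id: int = 0) -> bytes:
    """Wrap the encoded (first) data stream in a simple link-layer frame:
    a 2-byte stream id plus a 4-byte big-endian payload length. This only
    illustrates the second data stream being a wrapped form of the first."""
    return struct.pack(">HI", stream_id, len(payload)) + payload

def decapsulate(frame: bytes) -> bytes:
    """Recover the first data stream from the framed second stream by
    parsing the header and slicing out the payload."""
    stream_id, length = struct.unpack(">HI", frame[:6])
    return frame[6 : 6 + length]
```

The round trip `decapsulate(encapsulate(data))` returns the original bytes, mirroring the AP-side encapsulation and DDIC-side decapsulation described above.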
  • the eighth aspect of the present application provides a computer-readable storage medium with instructions stored therein, which, when run on a computer or processor, cause the computer or processor to execute the method in the fifth aspect or any of its possible implementations.
  • the ninth aspect of the present application provides a computer-readable storage medium with instructions stored therein, which, when run on a computer or processor, cause the computer or processor to execute the method in the sixth aspect or any of its possible implementations.
  • the tenth aspect of the present application provides a computer-readable storage medium with instructions stored therein, which, when run on a computer or processor, cause the computer or processor to execute the method in the seventh aspect or any of its possible implementations.
  • the eleventh aspect of the present application provides a computer program product containing instructions which, when run on a computer or processor, cause the computer or processor to execute the method in the fifth aspect or any of its possible implementations.
  • the twelfth aspect of the present application provides a computer program product containing instructions which, when run on a computer or processor, cause the computer or processor to execute the method in the sixth aspect or any of its possible implementations.
  • the thirteenth aspect of this application provides a computer program product containing instructions which, when run on a computer or processor, cause the computer or processor to execute the method in the seventh aspect or any of its possible implementations.
  • the method is not limited to the implementations described above.
  • FIG. 1a is an exemplary RGB Stripe display screen, the pixel arrangement commonly used in traditional display screens, provided by an embodiment of the application;
  • FIG. 1b is an exemplary SPA arrangement display screen provided by an embodiment of the application.
  • Fig. 1c is another exemplary SPA arrangement display screen provided by an embodiment of the application.
  • FIG. 1d is another exemplary SPA arrangement display screen provided by an embodiment of the application.
  • FIG. 2a is a schematic structural diagram of an exemplary image processing device provided by an embodiment of the application.
  • FIG. 2b is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 3 is an example of encoding SPR image data provided by an embodiment of the application.
  • FIG. 4a is another example of encoding SPR image data provided by an embodiment of the application.
  • FIG. 4b is another example of encoding SPR image data provided by an embodiment of the application.
  • FIG. 5a is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 5b is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 5c is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 6a is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 6b is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 6c is a schematic structural diagram of another exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 7 is a schematic diagram of input and output of an exemplary color space conversion provided by an embodiment of the application.
  • FIG. 8 is a schematic diagram of another exemplary color space conversion input and output provided by an embodiment of the application.
  • FIG. 9 is a schematic diagram of the hardware architecture of an exemplary image processing apparatus provided by an embodiment of the application.
  • FIG. 10 is a schematic flowchart of an image processing method provided by an embodiment of this application.
  • FIG. 11 is a schematic flowchart of another image processing method provided by an embodiment of the application.
  • At least one (item) refers to one or more, and “multiple” refers to two or more.
  • “And/or” is used to describe the association relationship of associated objects and indicates that three types of relationships can exist; for example, “A and/or B” can mean: only A, only B, or both A and B, where A and B can be singular or plural.
  • the character “/” generally indicates that the associated objects are in an “or” relationship.
  • “at least one of the following items” or similar expressions refers to any combination of these items, including any combination of a single item or plural items.
  • At least one of a, b, or c can mean: a, b, c, “a and b”, “a and c”, “b and c”, or “a and b and c”, where a, b, and c can be single or multiple.
  • VESA Video Electronics Standards Association
  • DSC Display Stream Compression
  • DSC One of the standard compression algorithms for image compression, released by VESA in 2014. It features real-time encoding and decoding, a fixed compression ratio, low cost, and high display quality. Compared with traditional compression algorithms, the DSC compression algorithm does not require multi-frame data or extra storage resources inside the multimedia chip, and it is compatible with the display interface. DSC is mainly aimed at the high-quality audio and video transmission requirements of high-end electronic devices. Illustratively, the DSC compression algorithm is suitable for encoding audio and video data transmitted based on the Mobile Industry Processor Interface (MIPI).
  • MIPI Mobile Industry Processor Interface
  • Color can be the different perceptions of the eyes to light of different frequencies, or it can represent light of different frequencies that exist objectively.
  • the color space is the color range defined by the coordinate system established by people to express colors.
  • the color gamut defines a color space together with the color model.
  • the color model is an abstract mathematical model that uses a set of color components to express colors.
  • the color model may include, for example, a three-primary color mode (red green blue, RGB) and a printing four-color mode (cyan magenta yellow key plate, CMYK).
  • Color gamut refers to the sum of colors that a system can produce.
  • Adobe RGB and sRGB are two different color spaces based on the RGB model.
  • RGB color space (Red Green Blue color space): RGB defines the colors of the three primary colors red, green, and blue. When the color value of one primary color takes its maximum value and the color values of the other two are zero, the corresponding color is that primary color. Exemplarily, the color values R, G, and B each range from 0 to 255; when R and G are both zero and B is 255, the color represents blue. Each pixel in an RGB format image contains three sub-pixels: R, G, and B.
  • YUV A bright-color separated color space.
  • Each pixel in a YUV format image contains a luminance component Y, a chrominance component U, and a chrominance component V, where Y represents luminance (Luma) and U and V represent chrominance (Chroma).
  • YCoCg A bright-color separated color space with good transform coding gain.
  • Each pixel in the YCoCg format image contains a luminance value Y and two chrominance values Co and Cg, where Co represents chrominance (orange), and Cg represents chrominance (green).
  • the conversion relationship between RGB color space and YCoCg color space is shown in formula 1:
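The formula itself (formula 1) is not reproduced in this text. As a hedged illustration, the following sketch implements the standard float-form YCoCg transform and its inverse; this matches the commonly published definition, though it is not taken from the patent's own formula:

```python
def rgb_to_ycocg(r, g, b):
    """Forward transform (float form of the standard YCoCg matrix)."""
    y = r / 4 + g / 2 + b / 4
    co = r / 2 - b / 2
    cg = -r / 4 + g / 2 - b / 4
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    """Inverse transform: recovers R, G, B exactly in float arithmetic."""
    r = y + co - cg
    g = y + cg
    b = y - co - cg
    return r, g, b

# Round trip on a sample pixel
y, co, cg = rgb_to_ycocg(200, 100, 50)
assert ycocg_to_rgb(y, co, cg) == (200.0, 100.0, 50.0)
```

The forward and inverse matrices cancel exactly, which is why the float form round-trips without loss.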
  • UYVY A data format belonging to the YUV color space. Every 2 pixels in the UYVY format image contains two Y components, one U component and one V component.
  • P0-P3 represent 4 pixels. When these 4 pixels are in the RGB color space, each pixel contains one R, one G, and one B. When these 4 pixels are in the YUV color space, each pixel contains one Y, one U, and one V. When these 4 pixels are in the UYVY format, every two pixels contain two Ys, one U, and one V.
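As a rough illustration of the UYVY packing described above, the sketch below packs per-pixel (Y, U, V) values into the U-Y-V-Y order, two pixels at a time. Averaging the pair's chroma is an assumption made here for the sketch; the text does not specify the subsampling filter:

```python
def pack_uyvy(yuv_pixels):
    """Pack per-pixel (Y, U, V) tuples into a UYVY sequence.

    UYVY stores, for every pair of pixels, one shared U, the first Y,
    one shared V, then the second Y (4:2:2 chroma subsampling). The
    shared chroma is taken as the average of the pair's U and V values
    (one common choice, assumed here).
    """
    assert len(yuv_pixels) % 2 == 0, "UYVY packs pixels in pairs"
    out = []
    for (y0, u0, v0), (y1, u1, v1) in zip(yuv_pixels[::2], yuv_pixels[1::2]):
        out += [(u0 + u1) // 2, y0, (v0 + v1) // 2, y1]
    return out

# Two pixels -> 4 values in U, Y0, V, Y1 order
assert pack_uyvy([(16, 128, 128), (235, 128, 128)]) == [128, 16, 128, 235]
```

This is why UYVY carries only 4 components per 2 pixels, matching the 4 encoder channels described later.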
  • PPI Pixel Per Inch
  • the size of the display panel of a display device is often limited, and increasing the resolution of the display panel without enlarging the panel greatly increases the process difficulty and manufacturing cost of the display panel.
  • the industry adopts sub-pixel arrangement (SPA) display panels.
  • the number of physical sub-pixels of the SPA display panel is less than the number of sub-pixels of the image to be displayed.
  • each pixel includes three sub-pixels: a red sub-pixel R, a green sub-pixel G, and a blue sub-pixel B.
  • an exemplary SPA display screen provided by this embodiment of the application.
  • the SPA display screen in Figure 1b is arranged in RGBG, and each pixel contains two sub-pixels.
  • Pixel 1 includes two sub-pixels of R and G
  • pixel 2 includes two sub-pixels of B and G
  • pixel 3 includes two sub-pixels of R and G
  • pixel 4 includes two sub-pixels of B and G, which alternately appear in a combination of RG and BG.
  • Fig. 1c another exemplary SPA display screen provided by this embodiment of the application.
  • the SPA display screen in Fig. 1c adopts the RGB delta arrangement; each pixel includes three RGB sub-pixels, and two adjacent pixels share sub-pixels. Taking the a-th row in FIG. 1c as an example, pixel 1 and pixel 2 share a blue sub-pixel B2, and pixel 2 and pixel 3 share a red sub-pixel R2 and a green sub-pixel G3.
  • In the RGB Stripe display, three pixels are composed of 9 sub-pixels; in the RGBG display, three pixels are composed of 6 sub-pixels; in the RGB delta display, three pixels are composed of 6 sub-pixels. Therefore, for the same number of pixels, the RGBG arrangement and the RGB delta arrangement require fewer sub-pixels than the RGB Stripe arrangement.
  • the SPA display screen with the RGBG arrangement only needs 3840*2160*2 sub-pixels to display the image to be displayed normally. Under the premise of ensuring the resolution, the number of sub-pixels of the display screen is greatly reduced.
  • SPR sub-pixel rendering
  • the RGBG arrangement and the RGB delta arrangement are two exemplary arrangements of SPA display screens, and the ratio of the number of pixels to the number of sub-pixels in these two arrangements is 1:2. Optionally, there are other arrangements of SPA display screens.
  • the number of sub-pixels contained in one pixel can also be less than 2.
  • one pixel can contain only 1.75 sub-pixels, or one pixel can contain only 1.5 sub-pixels, as shown in FIG. 1d, which shows an exemplary SPA arrangement, SPR1.5.
  • In the SPR1.5 arrangement, one pixel contains 1.5 sub-pixels.
  • In the display panel, there are 4 pixels and 6 sub-pixels, which is equivalent to each pixel containing 1.5 sub-pixels.
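A quick calculation makes the sub-pixel savings of the arrangements above concrete, using the 3840*2160 panel from the earlier example:

```python
# Sub-pixel counts for a 3840x2160 panel under different arrangements.
# Sub-pixels per pixel: RGB Stripe = 3, RGBG / RGB delta = 2, SPR1.5 = 1.5.
width, height = 3840, 2160
pixels = width * height            # 8,294,400 pixels

stripe = pixels * 3                # traditional RGB Stripe
rgbg = pixels * 2                  # RGBG or RGB delta SPA arrangement
spr15 = int(pixels * 1.5)          # SPR1.5 SPA arrangement

assert stripe == 24_883_200
assert rgbg == 16_588_800          # 1/3 fewer sub-pixels than Stripe
assert spr15 == 12_441_600         # half the sub-pixels of Stripe
```

The same ratios (2/3 and 1/2) reappear later as the data-volume reduction achieved by SPR processing.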
  • the embodiment of the application proposes a new architecture of the image processing device.
  • the amount of data transmitted from the image processing module to the image display driving module is greatly reduced, thereby greatly reducing the transmission bandwidth and power consumption required to transmit image data from the image processing module to the image display driving module.
  • the architecture proposed in the embodiments of this application is suitable for active-matrix organic light emitting diode (AMOLED), liquid crystal display (LCD), micro light emitting diode (Micro Light Emitting Diode Display, MicroLED), and other display screens. The applicable product form is not limited to mobile terminals such as mobile phones, tablet computers, cameras, and video cameras; it can also be applied to electronic devices with displays such as computers and TVs.
  • the architecture proposed in the embodiments of the present application is particularly suitable for electronic devices that require high-quality image or video display.
  • FIG. 2a it is a schematic structural diagram of an exemplary image processing apparatus provided in an embodiment of this application.
  • the image processing device 200 includes a System on Chip (SOC) 201, a Display Driving Integrated Circuit (DDIC) 202, and a display screen 203.
  • SOC System on Chip
  • DDIC Display Driving Integrated Circuit
  • the SOC and DDIC are two independent chips.
  • DDIC is the integrated circuit chip of the display screen and is the main part of the imaging system of the display screen.
  • the DDIC 202 is used to drive the display screen 203 and can also be used to control the driving current.
  • the image processing device may include multiple DDICs.
  • the display screen 203 may be an LCD, AMOLED, microLED, light-emitting diode (Light Emitting Diode, LED) display, organic light-emitting diode (Organic Light-Emitting Diode, OLED) display screen, cathode ray tube (Cathode Ray Tube, CRT) display screen, etc.
  • the display screen 203 is a SPA display screen, and its arrangement includes but is not limited to RGBG arrangement, RGB delta arrangement, or SPR1.5 arrangement.
  • the SOC 201 includes a sub-pixel rendering module 2011, an encoder 2012 and a sending interface 2013.
  • the sub-pixel rendering module 2011 is configured to perform sub-pixel rendering SPR processing on the image to be displayed to obtain SPR image data.
  • the display screen 203 is a SPA display screen
  • the number of physical sub-pixels contained in the display screen is less than the number of sub-pixels of the original image to be displayed.
  • if the original image to be displayed is not processed by SPR, it cannot be displayed normally, or can only be displayed at a lower resolution.
  • SPR image data is obtained.
  • the number of sub-pixels of the SPR image data and the arrangement of sub-pixels conform to the display screen 203, so the SPR data can be correctly displayed on the display screen 203 .
  • the obtained SPR image data may be RGBG arrangement image data or RGB delta arrangement image data.
  • the SPR processing includes various existing sub-pixel rendering processing, for example, it may be the SPR processing in the Chinese patent application with the application number 201810281666.7 and the invention name "Pixel Processing Method and Device".
  • the sub-pixel rendering module 2011 is a dedicated hardened hardware logic or dedicated hardware integrated circuit in the SOC, for example, it may be a hardened hardware core in a GPU.
  • the sub-pixel rendering module may also be a software module running on the processor.
  • the encoder 2012 is used for encoding and compressing the SPR image data processed by the sub-pixel rendering module 2011 to obtain compressed image data.
  • the encoder 2012 includes 4 processing channels.
  • the encoder 2012 can encode and compress image data of 4 input components, or can encode and compress image data of less than 4 input components, for example, It can encode 3 input components.
  • when the encoder 2012 encodes 3 input components, there is no fixed correspondence between the input components and the channels, and any 3 of the encoder's 4 channels can be selected.
  • SPR image data is RGB delta arrangement data
  • the encoder includes 4 channels: channel one, channel two, channel three, and channel four.
  • the RGB delta data is formatted to obtain mapped RGB data.
  • Each pixel of the mapped RGB data contains three components R, G, and B. It should be understood that the number of pixels remains unchanged before and after the mapping, and the format mapping of RGB delta images to RGB images will not result in an increase in data volume.
  • the R component is processed by channel one
  • the G component is processed by channel two
  • the B component is processed by channel three
  • channel four is idle.
  • the channel allocation method in Figure 3 is just an example; optionally, the idle channel can be any of the 4 channels, and there is no fixed correspondence between channels and data components.
  • channel one can process the G component
  • channel two can process the B component
  • channel three can process the R component. There is no restriction on this.
  • the encoder further includes a color space conversion (Color Space Conversion, CSC) module.
  • CSC Color Space Conversion
  • the CSC module may be implemented by a dedicated hardware integrated circuit or hardware logic.
  • the CSC module is used to convert RGB data into YUV or YCoCg data.
  • YUV or YCoCg data contains 3 sub-components, each of which is processed by a channel.
  • the CSC module can also be external to the encoder, and the CSC module sends YUV or YCoCg data to the encoder for processing.
  • FIG. 4a another example of encoding SPR image data provided by this embodiment of the application.
  • SPR image data is RGBG arranged data
  • the encoder also includes 4 channels: channel one, channel two, channel three and channel four, which can process the image data of four input components at the same time.
  • the data arranged in RGBG has 4 components R, G, B, and G for every two pixels, which are respectively processed by channel 1 to channel 4 of the encoder to obtain compressed data. It should be understood that there is no correspondence between the input component and the channel.
  • FIG. 4a only shows an exemplary corresponding form of the input component and the channel, and other corresponding forms may also exist, which is not limited in this application.
  • the RGBG data undergoes color space conversion (CSC) to obtain UYVY data.
  • the UYVY data contains 4 components for every two pixels: one U component, one V component, and two Y components, which are sent to the 4 channels of the encoder respectively; each channel processes one component to obtain compressed data.
  • format mapping or color space conversion can also be implemented in the encoder, and in this case, the encoder further includes a CSC module.
  • the encoding algorithm used by the encoder 2012 can be DSC1.2.
  • DSC1.2 is a compression algorithm proposed by VESA.
  • DSC1.2 has 4 processing channels, which can encode UYVY data and SPR image data in the RGBG or RGB delta arrangement.
  • Encoder 2012 can also use the DSC1.1 compression algorithm to encode SPR data. However, since DSC1.1 can only encode three-component image data, before using DSC1.1 to encode SPR data, it is necessary to up-sample the SPR image data so that each pixel in the processed SPR image data also contains 3 components. It should be understood that three-component data refers to image data in which 1 pixel contains 3 sub-pixels.
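As a hedged sketch of the up-sampling step needed before DSC1.1 encoding, the following expands RGBG pixel pairs into three-component pixels by borrowing the missing sub-pixel from the neighbouring pixel of the pair. This borrowing rule is an assumption made for the sketch; the text does not specify the up-sampling filter:

```python
def rgbg_to_three_component(pairs):
    """Expand RGBG pixel pairs to three-component pixels for a 3-channel codec.

    Each RGBG pixel carries only two sub-pixels (RG or BG). Here the
    missing sub-pixel is borrowed from the other pixel of the pair,
    a simple nearest-neighbour choice (assumed, not from the source).
    """
    out = []
    for (r, g1), (b, g2) in pairs:      # an RG pixel followed by a BG pixel
        out.append((r, g1, b))          # borrow B from the neighbour
        out.append((r, g2, b))          # borrow R from the neighbour
    return out

# One RGBG pair -> two full three-component pixels
assert rgbg_to_three_component([((10, 20), (30, 40))]) == [(10, 20, 30), (10, 40, 30)]
```

Note that this expansion increases the data volume back to 3 sub-pixels per pixel, which is why DSC1.2's 4-channel mode is preferred for SPR data.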
  • the sending interface 2013 is used to send the compressed data obtained by the encoder 2012 to the DDIC 202.
  • the transmission interface 2013 may be a MIPI transmission interface (or MIPI transmitter), a MIPI standardized display serial interface (Display Serial Interface, DSI), or a VESA standardized embedded display port (Embedded Display Port, eDP).
  • the transmission interface 2013 may also be a high-definition multimedia interface (HDMI) or a V-By-One interface, a digital interface standard developed for image transmission.
  • the compressed data obtained by compression encoding by the encoder 2012 matches the transmission interface 2013, or the format of the compressed data meets the format requirements of the transmission interface 2013.
  • the system chip 201 further includes an encapsulation module for format encapsulation of the compressed data obtained by the encoder, so that the encapsulated data stream meets the format requirements of the sending interface 2013.
  • the encapsulation module may be solidified hardware logic or a dedicated integrated circuit.
  • the data stream sent by the sending interface 2013 is transmitted to the receiving interface of DDIC in accordance with the transmission protocol of the sending interface.
  • the transmission protocol is MIPI D-PHY or C-PHY.
  • the SOC 201 may also include at least one of the following: a memory, a microprocessor, a microcontroller (Microcontroller Unit, MCU), an application processor (AP), an image signal processor (Image Signal Processor, ISP), a digital signal processor (Digital Signal Processor, DSP), and a graphics processing unit (Graphics Processing Unit, GPU).
  • DSP Digital Signal Processor
  • GPU Graphics Processing Unit
  • the SOC 201 may also include a sharpness processing module, a contrast processing module, and a color correction module. It should be understood that the sharpness processing module, contrast processing module, and color correction module may all be solidified hardware logic or dedicated integrated circuits.
  • the sub-pixel rendering module 2011 may be located in an AP, ISP, GPU or other general-purpose processor, and the embodiment of the present application does not limit the location of the sub-pixel rendering module.
  • the DDIC 202 includes a receiving interface 2021 and a decoder 2022.
  • the DDIC 202 may also include a screen brightness compensator 2023.
  • the receiving interface 2021 is used to receive the compressed data sent by the sending interface 2013.
  • the receiving interface may be a MIPI receiver, MIPI standardized DSI, VESA standardized eDP, HDMI receiver, or V-By-One interface.
  • the receiving interface 2021 and the transmitting interface 2013 are matched: when the transmitting interface 2013 is a MIPI transmitter, the receiving interface 2021 is a MIPI receiver; when the sending interface 2013 is an HDMI transmitter, the receiving interface 2021 is an HDMI receiver; and so on.
  • the compressed data received by the receiving interface 2021 matches the format requirements of the receiving interface 2021.
  • the DDIC 202 further includes a decapsulation module, which is used to decapsulate the received compressed data to obtain compressed data that can be recognized by the decoder.
  • the decoder 2022 is used to decode the compressed image data to obtain SPR image data.
  • the decoder 2022 matches the encoder 2012, and the decoder 2022 can identify the encoded and compressed data obtained by the encoder 2012, and can decompress the encoded and compressed data into SPR image data before compression.
  • the SPR image data contains the color information to be shown on the display screen.
  • the encoder 2012 adopts the DSC1.2 compression algorithm
  • the decoder 2022 adopts the decoding algorithm corresponding to DSC1.2.
  • the screen brightness compensator 2023 is used to perform brightness compensation on the SPR image data to compensate for the problem of inconsistent brightness of the screen.
  • the display screen can individually control the switch of each sub-pixel. Due to limitations of the manufacturing process, the display units of the sub-pixels are not completely identical, and various types of display screens will more or less exhibit brightness non-uniformity, which is also called the Mura phenomenon in the industry.
  • the so-called Mura phenomenon means that when the same pixel value is set for the display unit of each sub-pixel, the brightness displayed by each sub-pixel display unit is not consistent, so the brightness of the display panel appears to be uneven to human eyes.
  • the screen brightness correction module is the Demura module.
  • the Demura technology detects the Mura area, obtains compensation data for the pixels in the Mura area, and compensates those pixels based on the compensation data, so as to eliminate the brightness non-uniformity caused by display process limitations.
  • the Demura module is used to Demura the SPR image data to compensate for the uneven brightness caused by the screen Mura phenomenon.
  • the DDIC 202 further includes a gamma corrector for gamma correction on the display screen 203.
  • the DDIC 202 also includes a digital-to-analog converter (DAC), which is used to convert a digital image into an analog signal.
  • the analog signal is a drive current; the DDIC controls the magnitude of the drive current according to the pixel values of the digital image, so that pixels with different pixel values show different brightness values on the display screen.
  • the DDIC 202 further includes a memory, for example, it may include a Static Random Access Memory (SRAM). Since the amount of image data after SPR processing is greatly reduced, the storage space occupied on the DDIC side is also reduced.
  • SRAM Static Random Access Memory
  • the DDIC 202 sends the SPR image data corrected by Demura and gamma to the display screen 203 for display.
  • SPA displays are usually used.
  • the original input three-component image data must be processed by SPR before it can be displayed correctly on the SPA display.
  • Conventionally, the SPR processing that adapts image data to the display screen is completed in the DDIC.
  • In this embodiment of the application, the SPR processing is completed in the SOC, so the number of sub-pixels contained in each pixel becomes smaller before transmission, and the amount of image data is greatly reduced.
  • If the SPR-processed image data is RGBG or RGB delta arrangement data, the SPR-processed data is 2/3 of the original image data; if the SPR-processed image data is SPR1.5 arrangement data, the SPR-processed data is 1/2 of the original image data. Since the amount of SPR-processed image data is greatly reduced, the amount of data after encoding is also greatly reduced, the bandwidth and power consumption required to transmit image data from the SOC to the DDIC are greatly reduced, and the amount of data processed by the DDIC and the storage space on the DDIC side are greatly reduced, greatly improving the performance of the image processing device.
  • the hardware logic area occupied by the SPR module is smaller, and the cost and power consumption of the SPR module or its Intellectual Property (IP) core will also be reduced.
  • the encoder of the embodiment of the present application includes 4 processing channels and can encode image data with up to 4 input components. Therefore, the SPR-processed RGBG and RGB delta data can be directly encoded to obtain compressed data conforming to the transmission interface format and transmission protocol.
  • the image processing device may also include only the SOC and DDIC, without the display screen; in this case the display screen belongs to a display device outside the image processing device, as shown in FIG. 2b.
  • the system chip and display driver chip please refer to the description in the part of Figure 2a, which will not be repeated here.
  • the display screen 300 is a display screen of an electronic device other than the image processing apparatus 200.
  • the image processing device may be an integrated chip or processor product containing SOC and DDIC in a mobile phone, and the display screen 300 is the display screen of the mobile phone.
  • the image processing device 200 and the display screen 300 together constitute a mobile phone or another terminal with a display screen.
  • the image processing device 500 includes an application processor AP 501, a display driver integrated circuit DDIC 502, and a display screen 503, wherein the AP 501 and DDIC 502 are integrated inside the system chip SOC.
  • the AP 501 includes a sub-pixel rendering module 5011, an encoder 5012, and a transmitting interface 5013.
  • the DDIC 502 includes a receiving interface 5021, a decoder 5022, and a screen brightness compensator 5023.
  • the application processor AP 501 may also include a packaging module, and the DDIC 502 may also include a decapsulation module, a gamma corrector, and a memory.
  • the memory of the DDIC is used to store the encoded and compressed data stream of the receiving interface, or the SPR image data decoded by the decoder.
  • the display screen 503 is a SPA display screen.
  • the types of the display screen 503 include but are not limited to LCD, AMOLED, microLED, LED, OLED, and CRT.
  • the compressed image data is transmitted to the DDIC 502 through the sending interface.
  • after the receiving interface receives the compressed image data, the data is decoded to obtain the SPR image that can be displayed on the screen.
  • the screen brightness compensator 5023 performs brightness compensation on the decoded SPR image data to compensate for the uneven brightness of the screen.
  • the screen brightness compensator 5023 performs Demura processing on the SPR image data to compensate for the uneven brightness caused by the Mura phenomenon of the screen.
  • the DDIC 502 further includes a DAC, which is used to convert image data into a driving current, and the DDIC drives the display screen 503 to display the image on the display screen.
  • the AP 501 may also include a preprocessing module for the image data, which is used to perform sharpness processing, contrast adjustment, color enhancement, color correction, scaling processing, etc., on the image.
  • Optionally, the AP 501 is integrated inside the SOC, while the DDIC 502 is located outside the SOC as an independent integrated chip, as shown in Figure 5b.
  • the image processing device 500 includes an AP 501 and a DDIC 502, but does not include a display screen 503, as shown in FIG. 5c.
  • the image processing device may include AP, but not DDIC and display screen.
  • FIG. 6a it is a schematic structural diagram of another image processing apparatus provided by an embodiment of this application.
  • the image processing device 600 includes a system chip SOC601, a display drive integrated circuit DDIC 602, and a display screen 603.
  • SOC 601 includes:
  • Sub-pixel rendering module 6011 used to perform sub-pixel rendering SPR processing on the image to be displayed to obtain SPR image data.
  • the SPR image data is image data that can be normally displayed on the SPA screen.
  • the sub-pixel rendering module 6011 is a dedicated hardened hardware logic or dedicated hardware integrated circuit in the SOC, for example, it may be a hardened hardware core in a GPU.
  • Color space conversion module 6012, used to convert the SPR image into the bright-color separated color space to obtain the bright-color separated SPR image.
  • the SPR image may belong to an RGB color space
  • the SPR image may be an RGBG image or an RGB delta image
  • the bright color separation color space may be a YUV color space or a YCoCg color space.
  • the SPR image is an RGBG image
  • the color space of the bright color separation is a YUV color space.
  • the color space conversion module 6012 is specifically used to convert the RGBG image to the YUV color space to obtain the bright-color separated SPR image.
  • the bright-color separated SPR image includes 4 components P0, P1, P2, and P3.
  • the components P0, P1, P2, and P3 contain two luminance signal components and two chrominance signal components.
  • P0, P1, P2, and P3 may be: U, Y, V, and Y.
  • the bright-color separated SPR image includes but is not limited to: UYVY image, YVYU image, YUYV image, or VYUY image.
  • the SPR image is an RGB delta image
  • the color space of the bright color separation is the YCoCg color space.
  • the SOC 601 also includes a first format mapping module, which is used to convert RGB delta images into RGB images. It should be understood that when the RGB delta image format is mapped to the RGB image, the number of sub-pixels before and after the mapping remains unchanged, so format mapping of an RGB delta image to an RGB image will not increase the amount of data. For example, as shown in Figure 1c, in the RGB delta arrangement the a-th row contains 3 pixels in total: pixel 1, pixel 2, and pixel 3.
  • Pixel 1 contains (R1, G1, B2)
  • pixel 2 contains (R2, G3, B2)
  • pixel 3 contains (R2, G3, B3)
  • pixel 1 and pixel 2 share a blue sub-pixel B2
  • pixel 2 and pixel 3 share the red sub-pixel R2 and the green sub-pixel G3.
  • Row a contains 2 pixels, namely (R1, G1, B2), (R2, G3, B3), before and after format mapping, The number of sub-pixels has not changed.
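The sub-pixel-preserving nature of this format mapping can be sketched with a few lines of code. This is an illustrative sketch, not the patented implementation: it collects the unique sub-pixels of a delta row (named after the Figure 1c example) and regroups them into full RGB pixels.

```python
# Sketch (not the patent's implementation): format-mapping an RGB delta row
# to plain RGB pixels while preserving the set of unique sub-pixels.
# Sub-pixel names like "R1", "G3" follow the Figure 1c example.

delta_row = [              # 3 delta pixels, with shared sub-pixels
    ("R1", "G1", "B2"),
    ("R2", "G3", "B2"),    # shares B2 with pixel 1
    ("R2", "G3", "B3"),    # shares R2, G3 with pixel 2
]

# Collect unique sub-pixels in first-seen order.
unique = []
for px in delta_row:
    for sp in px:
        if sp not in unique:
            unique.append(sp)

# Regroup into full RGB pixels, three sub-pixels each.
mapped_row = [tuple(unique[i:i + 3]) for i in range(0, len(unique), 3)]

print(mapped_row)   # [('R1', 'G1', 'B2'), ('R2', 'G3', 'B3')]
print(len(unique))  # 6 sub-pixels before and after mapping
```

The 3 delta pixels collapse to 2 mapped pixels, but the 6 underlying sub-pixels are unchanged, matching the row-a example above.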
  • The color space conversion module 6012 is specifically used to convert the RGB image to the YCoCg color space to obtain a YCoCg image.
  • the YCoCg image includes three components Y, Co, and Cg.
  • Converting the SPR image to the luminance-chrominance separated color space further reduces the data volume of the image data and saves transmission bandwidth; moreover, at the same compression ratio, it reduces the image distortion caused by compression and decoding, so that the decoded image is displayed better.
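The commonly cited RGB-to-YCoCg transform can illustrate this conversion. The patent does not pin down the exact coefficients used by module 6012, so the sketch below uses the standard ones; note the transform is exactly invertible.

```python
# Illustrative sketch of a standard RGB <-> YCoCg conversion; these are the
# commonly cited coefficients, not necessarily the ones in module 6012.

def rgb_to_ycocg(r, g, b):
    y  = r / 4 + g / 2 + b / 4   # luma
    co = r / 2 - b / 2           # chroma orange
    cg = -r / 4 + g / 2 - b / 4  # chroma green
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    r = y + co - cg
    g = y + cg
    b = y - co - cg
    return r, g, b

y, co, cg = rgb_to_ycocg(200, 120, 40)
print(ycocg_to_rgb(y, co, cg))  # recovers (200.0, 120.0, 40.0)
```

Because the divisions are by powers of two, the round trip is exact for integer inputs, which is one reason YCoCg is attractive before compression.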
  • The color space conversion module 6012 is a dedicated hardened hardware logic or a dedicated hardware integrated circuit in the SOC.
  • Module 6012 may be hardened hardware logic or a hardened hardware core in the AP, or may be hardened hardware logic or a hardened hardware core in the GPU.
  • the color space conversion module 6012 includes a 4*4 color space conversion coefficient matrix.
  • The color space conversion coefficient matrix includes 16 decorrelation coefficients X0-X15; the input is the four components R, G, B, G of the RGBG image, and the output is also four components P0, P1, P2, and P3, as shown in Figure 7. At this time, the color space conversion matrix is solidified in the hardware logic or hardware core. Compared with the input R, G, B, G, the correlation between luminance and chrominance in the output P0, P1, P2, P3 is greatly reduced.
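The 4*4 decorrelation can be sketched as a single matrix-vector product per pixel pair. The patent only states that the matrix carries 16 coefficients X0-X15 mapping (R, G, B, G) to (P0, P1, P2, P3); the numeric values below are invented, BT.601-style coefficients chosen so that P0..P3 come out as (U, Y, V, Y), not the patented values.

```python
import numpy as np

# Illustrative 4x4 decorrelation matrix (stand-in for X0-X15 in module 6012).
# Each luma row uses one of the two G samples; the chroma rows average both.
X = np.array([
    [-0.169, -0.1655,  0.500, -0.1655],  # P0 = U  (chroma)
    [ 0.299,  0.587,   0.114,  0.0   ],  # P1 = Y of pixel 1 (first G)
    [ 0.500, -0.2095, -0.081, -0.2095],  # P2 = V  (chroma)
    [ 0.299,  0.0,     0.114,  0.587 ],  # P3 = Y of pixel 2 (second G)
])

rgbg = np.array([200.0, 120.0, 40.0, 130.0])  # R, G, B, G of one pixel pair
p = X @ rgbg                                  # P0, P1, P2, P3
print(p.round(2))
```

In hardware this matrix would be solidified in the logic, so the conversion costs one multiply-accumulate pass per pixel pair.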
  • The encoder 6013 is used to encode the SPR image data after decorrelation processing to obtain compressed image data.
  • The sending interface 6014 is used to send the compressed image data to the DDIC 602. For details, please refer to the description of the sending interface 2013, which will not be repeated here.
  • the SOC 601 also includes an encapsulation module, an image preprocessing module, and so on.
  • DDIC 602 includes:
  • The receiving interface 6021 is used to receive the compressed image data sent by the sending interface 6014.
  • For details of the receiving interface 6021, please refer to the description of the receiving interface 2021, which will not be repeated here.
  • the decoder 6022 is used to decode the compressed image data to obtain decoded SPR image data. For details, please refer to the description of the decoder 2022, which will not be repeated here.
  • the color space conversion module 6023 is used to convert the decoded SPR image data obtained by the decoder into the color space where the SPR image data is located.
  • the decoded SPR image data obtained by the decoder is located in the YUV color space.
  • the decoded SPR image contains 4 components P0, P1, P2, and P3.
  • the decoded SPR image data may include but is not limited to: a UYVY image, a YVYU image, a YUYV image, or a VYUY image.
  • the color space conversion module 6023 is specifically configured to convert the four components P0, P1, P2, and P3 of the decoded SPR image into the four components R, G, B, and G of the RGBG image.
  • the color space conversion module 6023 is specifically configured to convert the three components Y, Co, and Cg of the YCoCg image into the three components R, G, and B of the RGB image.
  • the DDIC also includes a second format mapping unit for converting RGB images into RGB delta images.
  • the second format mapping unit is the inverse mapping of the format mapping in the first format mapping unit.
  • the first format mapping unit maps RGB delta data to RGB data
  • the second format mapping unit maps RGB data to RGB delta data. During this process, the number of sub-pixels in the image remains unchanged.
  • The color space conversion module 6023 is a dedicated hardened hardware logic or a dedicated hardware integrated circuit in the DDIC.
  • Module 6023 may be hardened hardware logic or a hardened hardware core.
  • The color space conversion module 6023 also includes a 4*4 color space conversion matrix, which includes 16 coefficients Y0-Y15; the input is the four pixel components P0, P1, P2, and P3, and the output is the four pixel components R, G, B, and G of the RGBG image, as shown in Figure 8.
  • the matrix is solidified in the hardware logic or hardware core.
  • the matrix is the inverse matrix of the color space conversion matrix in the color space conversion module 6012.
  • Alternatively, the matrix may not be exactly the inverse matrix of the color space conversion matrix in 6012.
  • In that case, the inverse matrix of the color space conversion matrix can be fine-tuned, or several of its coefficients replaced, to obtain the matrix in 6023.
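When the DDIC-side matrix (Y0-Y15) is taken as the exact inverse of the SOC-side matrix (X0-X15), the RGBG components are recovered exactly up to arithmetic precision. The sketch below demonstrates that round trip; the 4*4 values are illustrative stand-ins, not the patented coefficients.

```python
import numpy as np

# Stand-in SOC-side decorrelation matrix (X0-X15); illustrative values only.
X = np.array([
    [-0.169, -0.1655,  0.500, -0.1655],
    [ 0.299,  0.587,   0.114,  0.0   ],
    [ 0.500, -0.2095, -0.081, -0.2095],
    [ 0.299,  0.0,     0.114,  0.587 ],
])
Y = np.linalg.inv(X)                # DDIC-side reconstruction matrix (Y0-Y15)

rgbg = np.array([200.0, 120.0, 40.0, 130.0])
p = X @ rgbg                        # decorrelated components P0..P3 (module 6012)
restored = Y @ p                    # back to R, G, B, G (module 6023)
print(np.allclose(restored, rgbg))  # True (up to floating-point error)
```

A hardware implementation would fix Y at design time (possibly fine-tuned, as noted above) rather than inverting the matrix at run time.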
  • the screen brightness compensator 6024 is used to perform brightness compensation on the SPR image data to eliminate the problem of inconsistent screen brightness. For details, please refer to the description of the screen brightness compensator 2023, which will not be repeated here.
  • the DDIC 602 also includes a gamma corrector, a DAC, and a memory.
  • For details of the gamma corrector, please refer to the description of the foregoing embodiment, which will not be repeated here.
  • Performing sub-pixel rendering SPR processing on the image data to be displayed on the SOC side greatly reduces the amount of data to be transmitted and reduces the bandwidth and power consumption required to send image data from the SOC to the DDIC; it also reduces the amount of data processed and the storage space needed on the DDIC side, which lowers the power consumption of the image processing device and improves the performance of the device.
  • The embodiment of the present application performs color space conversion on the SOC side to convert the SPR image into the luminance-chrominance separated SPR image, which further reduces the amount of image data, saves transmission bandwidth, and reduces the correlation between the luminance and chrominance of the image, thereby reducing the image distortion caused by compression and decoding. At the same compression ratio, the image obtained after decoding is displayed better on the display screen with less distortion.
  • In addition, the hardware logic area occupied by the SPR module is smaller, and the cost and power consumption of the SPR module, or Intellectual Property (IP) core, itself are also reduced.
  • the image processing device 600 includes a system chip 601 and a DDIC 602, but does not include a display screen 603, as shown in FIG. 6b.
  • the product form of the image processing device 600 may be a processor chip.
  • the processor chip and the display screen together constitute an electronic device or mobile terminal, such as a smart phone, a camera, or a TV.
  • the sub-pixel rendering module, the color space conversion module, the encoder, and the transmission interface are all integrated in the AP, and the AP and DDIC are integrated in the SOC, as shown in Figure 6c.
  • the image processing device includes a folding screen.
  • The folding screen of the image processing device includes a first display screen and a second display screen; the first display screen and the second display screen are coupled and can be used together to display the same image.
  • The image processing device also includes a first DDIC and a second DDIC, where the first DDIC is used to drive the first display screen and the second DDIC is used to drive the second display screen; data transmission and communication between the multiple display screens is based on the multiple DDICs. Since SPR processing needs to refer to surrounding pixels, if SPR processing is performed in the DDIC, there is data interaction between the multiple DDICs of an image processing device with multiple folding screens.
  • Placing SPR processing on the SOC side avoids data interaction between multiple DDICs, reduces data interaction and data processing between chips, and avoids electromagnetic interference (EMI) and electrostatic discharge (ESD) issues.
  • FIG. 9 is a schematic diagram of the hardware architecture of an exemplary image processing apparatus provided by an embodiment of this application.
  • the hardware architecture of the image processing device 900 may be applicable to an SOC or an AP.
  • the image processing device 900 includes at least one central processing unit (CPU), at least one memory, GPU, decoder, dedicated video or graphics processor, receiving interface, sending interface, and the like.
  • the image processing device 900 may also include a microprocessor, a microcontroller MCU, and so on.
  • The above-mentioned parts of the image processing device 900 are coupled through connectors. It should be understood that, in the various embodiments of the present application, coupling refers to mutual connection in a specific manner, including direct connection or indirect connection through other devices, such as through various interfaces, transmission lines, or buses.
  • interfaces are usually electrical communication interfaces, but it is not excluded that they may be mechanical interfaces or other forms of interfaces, which are not limited in this embodiment.
  • Optionally, the above-mentioned parts are integrated on the same chip; in another optional case, the CPU, GPU, decoder, receiving interface, and sending interface are integrated on one chip, and the various parts of the chip access external memory through a bus.
  • the dedicated video/graphics processor can be integrated with the CPU on the same chip, or it can exist as a separate processor chip.
  • the dedicated video/graphics processor can be a dedicated ISP.
  • The chip involved in the embodiments of this application is a system manufactured on the same semiconductor substrate by an integrated circuit process, also called a semiconductor chip; it is a collection of integrated circuits formed, using an integrated circuit process, on a substrate (usually a semiconductor material such as silicon), and its outer layer is usually encapsulated by a semiconductor packaging material.
  • The integrated circuit may include various types of functional devices; each type of functional device includes transistors such as logic gate circuits, Metal-Oxide-Semiconductor (MOS) transistors, bipolar transistors, or diodes, and may also include capacitors, resistors, inductors, and other components.
  • Each functional device can work independently or under the action of necessary driver software, and can realize various functions such as communication, calculation, or storage.
  • the CPU may be a single-CPU processor or a multi-CPU processor; optionally, the CPU may be a processor group composed of multiple processors, between multiple processors Coupled to each other through one or more buses.
  • part of the processing of the image signal or video signal is completed by the GPU, part is completed by a dedicated video/graphics processor, and may also be completed by software code running on a general-purpose CPU or GPU.
  • The memory can be used to store computer program instructions, including an operating system (OS), various user application programs, and various computer program codes used to execute the solution of this application; the memory can also be used to store video data, image data, etc.; the CPU can be used to execute the computer program codes stored in the memory to implement the methods in the embodiments of the present application.
  • The memory may be a non-volatile memory, such as an Embedded MultiMedia Card (EMMC), Universal Flash Storage (UFS), or Read-Only Memory (ROM), or another type of static storage device that can store static information and instructions; it may also be a volatile memory, such as Random Access Memory (RAM), or another type of dynamic storage device that can store information and instructions; it may also be an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Compact Disc Read-Only Memory (CD-ROM) or other optical disc storage, or any other medium that can be used to carry or store program code.
  • the receiving interface may be an interface for data input of the processor chip.
  • the receiving interface may be MIPI, HDMI, Display Port (DP), or the like.
  • In one example, the format of the input image is RGB, the SPR image data obtained after sub-pixel rendering is an RGBG image, and the display screen is an SPA display screen.
  • It should be understood that the embodiment of this application places no limitation on the input image format, the image format after SPR processing, etc.
  • the input image format can also be YUV, YCoCg, raw image in the original format, etc.
  • the SPR data can also be RGB delta or SPR1.5.
  • the method includes:
  • Step 1: Perform sub-pixel rendering SPR processing on the image to be displayed to obtain SPR image data, which is an image that can be normally displayed on the SPA display screen.
  • the image to be displayed is an RGB image
  • The image obtained by SPR processing is an RGBG image. Every 2 pixels of the RGBG image contain 4 components: one R component, one B component, and two G components; the number of sub-pixels of the RGBG image is less than the number of sub-pixels contained in the RGB image.
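The data reduction can be checked with simple arithmetic: an RGB image carries 3 sub-pixels per pixel, while the RGBG (or RGB delta) arrangement averages 2 sub-pixels per pixel, giving the 2/3 ratio stated elsewhere in this application, and SPR1.5 averages 1.5 sub-pixels per pixel, giving 1/2. The frame size below is an arbitrary example.

```python
# Back-of-the-envelope check of the data savings from SPR processing
# for an example 1080 x 2340 frame, counting sub-pixel samples.
width, height = 1080, 2340
pixels = width * height

rgb_subpixels   = pixels * 3        # original: R, G, B per pixel
rgbg_subpixels  = pixels * 2        # RGBG / RGB delta: 4 components per 2 pixels
spr15_subpixels = pixels * 3 // 2   # SPR1.5: 3 components per 2 pixels

print(rgbg_subpixels / rgb_subpixels)    # 2/3 of the original data
print(spr15_subpixels / rgb_subpixels)   # 1/2 of the original data
```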
  • Step 2: Encode the RGBG image to obtain a first data stream.
  • The first data stream is a bit stream, and the data amount of the first data stream is less than the data amount of the input RGBG image; the encoding is done by the encoder.
  • The encoder contains 4 processing channels and simultaneously encodes the 4 components of every 2 pixels of the RGBG image; each channel processes one component, and the processing results of the 4 channels together constitute the first data stream.
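The split of the RGBG stream into the 4 parallel processing channels can be sketched as follows. The round-robin channel assignment is an assumption for illustration; the patent only states that each channel processes one of the 4 components.

```python
# Sketch of how an interleaved RGBG stream could be split into the 4 component
# planes consumed in parallel by the encoder's 4 processing channels.

def split_rgbg(stream):
    """stream: flat list of samples, 4 per pixel pair, ordered R, G, B, G."""
    channels = ([], [], [], [])
    for i, sample in enumerate(stream):
        channels[i % 4].append(sample)   # round-robin: one component per channel
    return channels

r, g1, b, g2 = split_rgbg([10, 20, 30, 40, 50, 60, 70, 80])
print(r, g1, b, g2)  # [10, 50] [20, 60] [30, 70] [40, 80]
```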
  • Step 3 Encapsulate the first data stream to obtain the second data stream.
  • the second data stream is a data stream matching the transmission interface (including the sending interface of the AP and the receiving interface of the DDIC) and the transmission protocol.
  • the transmission interface including the sending interface of the AP and the receiving interface of the DDIC
  • the transmission protocol is the C-phy protocol of the MIPI standard
  • the second data stream is a data stream compliant with the MIPI standard.
  • The encapsulation may be completed by the sending interface of the AP, or by the encapsulation module, and the encapsulation module may be a dedicated hardened hardware logic in the AP.
  • Step 4 Send the second data stream to DDIC. Exemplarily, it is sent to the receiving interface of DDIC through the sending interface of the AP.
  • Step 5 Decapsulate the second data stream to obtain the first data stream.
  • The decapsulation may be completed by the receiving interface of the DDIC, or by the decapsulation module, and the decapsulation module may be a dedicated hardened hardware logic in the DDIC.
  • the first data stream obtained by decapsulation is a data stream that can be processed or recognized by the decoder.
  • Step 6 Decode the first data stream to obtain RGBG image data that can be displayed on the display screen.
  • Due to limitations of the display screen manufacturing process, the display screen may exhibit a mura phenomenon.
  • the method also includes:
  • Step 7 Perform screen brightness compensation processing on the decoded RGBG image data to obtain compensated RGBG image data.
  • Demura processing is performed on the decoded RGBG image data, and pixels in the mura area are compensated based on the compensation data.
  • Step 8 Drive the SPA display screen and display the compensated RGBG image data on the display screen.
  • Steps 1 to 3 are completed on the AP side,
  • step 4 is completed through the cooperation of the AP and the DDIC,
  • and steps 5 to 8 are completed by the DDIC.
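Taken together, steps 1-8 can be sketched as a toy pipeline. Every function here is a stand-in invented for illustration; the real blocks are hardened hardware described in this application, not Python, and the encode/encapsulate formats are placeholders.

```python
# Toy end-to-end sketch: AP side (SPR -> encode -> encapsulate -> send),
# DDIC side (decapsulate -> decode -> display). All details are stand-ins.

def spr(rgb):            # step 1 (toy): 2 RGB pixels -> one RGBG group;
    # R, G from pixel 1 and B, G from pixel 2
    return [(r1, g1, b2, g2) for (r1, g1, _), (_, g2, b2) in zip(rgb[::2], rgb[1::2])]

def encode(img):         # step 2: stand-in for the 4-channel encoder
    return repr(img).encode()

def encapsulate(bits):   # step 3: stand-in for MIPI-style framing
    return b"HDR" + bits

def decapsulate(pkt):    # step 5
    return pkt[3:]

def decode(bits):        # step 6 (toy parsing only)
    return eval(bits.decode())

rgb = [(200, 120, 40), (190, 125, 45)]  # 2 input pixels
pkt = encapsulate(encode(spr(rgb)))     # AP side, steps 1-3 (step 4: send pkt)
shown = decode(decapsulate(pkt))        # DDIC side, steps 5-6
print(shown)  # [(200, 120, 45, 125)]
```

Steps 7-8 (Demura compensation and driving the panel) would then operate on `shown`.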
  • the image after SPR processing is an RGB delta image.
  • the YCoCg image includes 3 components Y, Co, Cg;
  • the YCoCg image is encoded to obtain the first data stream.
  • The encoder on the AP side may have only 3 processing channels, with the 3 processing channels respectively encoding the 3 components Y, Co, and Cg at the same time; or the encoder on the AP side has 4 processing channels, any 3 of which respectively encode the 3 components Y, Co, and Cg at the same time.
  • the AP sends the first data stream to the DDIC through the transmission interface
  • After the DDIC receives the first data stream through the transmission interface, it decodes the first data stream, and the decoded image obtained is a YCoCg image.
  • the method further includes:
  • DDIC converts the three components Y, Co, Cg of YCoCg image into three components R, G, B of RGB image;
  • the 4-component input may be simultaneously encoded, or the 3-component input may be simultaneously encoded.
  • FIG. 11 is a schematic flowchart of another image processing method provided by an embodiment of this application.
  • The method embodiment corresponding to FIG. 11 performs color space conversion in the AP on the RGBG data obtained by sub-pixel rendering, converting the RGBG data to the luminance-chrominance separated color space. This further reduces the amount of image data, saves transmission bandwidth, and reduces the correlation between the luminance and chrominance of the image, so that the image obtained after encoding and decoding is less distorted and displays better.
  • the method includes:
  • Step 1 Perform sub-pixel rendering processing on the image to be displayed to obtain SPR image data.
  • the image to be displayed is an RGB image
  • the SPR image is an RGBG image.
  • the image to be displayed may also be in formats such as YUV, YCoCg, raw, etc.
  • the color space of the image to be displayed may be converted to obtain an RGB image
  • the SPR image data may be an RGB delta or SPR1.5 image.
  • Step 1 can be completed by the AP, by other general-purpose or dedicated processors on the SOC, or by the aforementioned sub-pixel rendering module, sub-pixel rendering processing core, or sub-pixel rendering hardened hardware logic.
  • Step 2: Perform color space conversion on the RGBG image data, converting the RGBG image to the luminance-chrominance separated color space to obtain 4 output components P0, P1, P2, P3, which include two luminance signals and two chrominance signals.
  • Every 2 pixels correspond to 4 input components and 4 output components: with 2 pixels as a set of input, the input includes the four components R, G, B, and G, and the output includes P0, P1, P2, and P3.
  • For example, the RGBG image can be converted to the YUV space.
  • In that case, the four output components are U, Y, V, Y, and the result of the color space conversion can be a UYVY image, a YVYU image, a YUYV image, a VYUY image, etc.
  • Step 2 can be completed by the AP or by other general-purpose or special-purpose processors on the SOC, or by the aforementioned color space conversion module, a dedicated color space conversion integrated circuit, or a dedicated color space conversion hardened hardware core.
  • The output image is an image signal with separated luminance and chrominance. If the chrominance signal is disturbed during transmission, it will not affect the brightness of the image when it is restored to the RGB space for display; this reduces the correlation between the chrominance and luminance signals, and the data volume of the converted image is further reduced, which can further save bandwidth.
  • Step 3 Encode the 4 output components to obtain the first data stream.
  • Encoding is implemented by AP's encoder.
  • the encoder includes 4 processing channels, which are used to process the 4 components R, G, B, and G of the RGBG image, and one channel processes one component. For details, please refer to the description of step 2 in FIG. 10, which will not be repeated here.
  • the method may also include:
  • Step 4 Encapsulate the first data stream to obtain the second data stream.
  • The encapsulation can be completed by the sending interface on the AP side, by the encapsulation module on the AP side, or by a dedicated encapsulation integrated circuit.
  • Step 5 Send the second data stream to DDIC. Exemplarily, it is sent to the receiving interface of DDIC through the sending interface of the AP.
  • the method may also include:
  • Step 6 Decapsulate the second data stream to obtain the first data stream.
  • The decapsulation can be completed by the receiving interface of the DDIC, by the decapsulation module on the DDIC side, or by a decapsulation application-specific integrated circuit.
  • Step 7 Decode the first data stream to obtain a decoded image.
  • the decoded image is located in the luminance-chrominance separated color space.
  • the decoded image includes 4 components P0, P1, P2, and P3.
  • the decoded image may be a UYVY image, YVYU image, YUYV image, VYUY image, or the like.
  • the decoding is completed by the decoder on the DDIC side.
  • Step 8 Perform color space conversion on the decoded image to obtain an RGBG image, which can be displayed on the SPA display screen.
  • The color space conversion can be completed by a color space conversion module on the DDIC side; the color space conversion module is a dedicated integrated circuit or dedicated hardened hardware logic.
  • the method may also include:
  • Step 9 Perform screen brightness compensation processing on the RGBG image data to obtain compensated RGBG image data.
  • Demura processing is performed on the decoded RGBG image data, and pixels in the mura area are compensated based on the compensation data.
  • the screen brightness compensation processing can be completed by the screen brightness compensator on the DDIC side.
  • Step 10 Drive the SPA display screen and display the compensated RGBG image data on the display screen.
  • step ten is completed by the driver on the DDIC side.
  • The embodiment of the present application also provides a computer-readable storage medium that stores instructions; when the instructions run on a computer or a processor, the computer or the processor executes part or all of the functions of the methods provided in the embodiments of the present application.
  • the embodiments of the present application also provide a computer program product containing instructions, which when run on a computer or a processor, cause the computer or the processor to execute any of the methods provided in the embodiments of the present application.

Abstract

The embodiments of this application disclose an image processing apparatus and method. The image processing apparatus includes an application processor AP, a display driver integrated circuit DDIC, and a transmission interface. The AP performs sub-pixel rendering SPR and encoding on the original image to be displayed and then sends the result to the DDIC; after decoding, the DDIC obtains the SPR image, which can be displayed directly on the SPA display screen. Because the amount of image data is greatly reduced after the AP performs SPR processing on the data, the bandwidth and power consumption required to transmit the image data from the AP to the DDIC are also greatly reduced; further, the amount of data processed by the DDIC and the storage space on the DDIC side are both greatly reduced, which greatly improves the performance of the image processing apparatus.

Description

Apparatus and Method for Image Data Processing — Technical Field
This application relates to the field of display technology, and in particular to an apparatus and a method for image data processing.
Background
To provide users with a better experience, the market currently places ever higher demands on display panels, requiring, for example, high resolution, high contrast, and a wide color gamut, as well as faster response, thinner and lighter form factors, lower power consumption, and foldability. In a display apparatus, image processing and image display are usually performed by different modules, and the processed image data needs to be sent from the image processing module to the image display module for display. As the display resolution of display panels increases, the amount of data in the image to be displayed also increases substantially, so the bandwidth and power consumption required to transmit the image data to be displayed from the image processing module to the image display module increase accordingly.
How to reduce the power consumption of the display apparatus as much as possible while displaying high-resolution images is crucial to improving user experience and meeting user needs.
Summary
The embodiments of this application provide an apparatus and a method for image data processing, used to reduce the bandwidth and power consumption of image data transmission.
A first aspect of this application provides an apparatus for image data processing, the apparatus including an application processor AP, a display driver integrated circuit DDIC, and a transmission interface. The AP is used to perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed. The AP is further used to encode the SPR image to obtain a first data stream, and to send the first data stream to the DDIC through the transmission interface. The DDIC is used to decode the first data stream to obtain a decoded image.
To meet ever-increasing resolution requirements, electronic devices with high-quality image or video display needs usually adopt SPA display screens, and the original three-component input image data must undergo SPR processing before it can be displayed correctly on an SPA display screen. In the architecture of the image processing apparatus proposed in the embodiments of this application, after the AP performs SPR processing on the data, the number of sub-pixels contained in each pixel of the image data becomes smaller and the amount of image data is greatly reduced. Exemplarily, if the image data after SPR processing is data in an RGBG or RGB delta arrangement, the data after SPR processing is 2/3 of the original image data; if the image data after SPR processing is data in an SPR1.5 arrangement, the data after SPR processing is 1/2 of the original image data. Since the amount of image data after SPR processing is greatly reduced, the amount of data after encoding is also greatly reduced, and the bandwidth and power consumption required to transmit the image data from the AP to the DDIC are greatly reduced; further, the amount of data processed by the DDIC and the storage space on the DDIC side are both greatly reduced, which greatly improves the performance of the image processing apparatus. Moreover, for the SPR processing core, because the degree of integration on the SOC side is higher, the hardware logic area occupied by the SPR processing core is smaller, and the cost and power consumption of the SPR processing core, or SPR Intellectual Property (IP) core, itself are also reduced.
It should be understood that both the AP side and the DDIC side include transmission interfaces, which include a sending interface and a receiving interface. When the AP sends image data to the DDIC, the transmission interface on the AP side is the sending interface and the transmission interface on the DDIC side is the receiving interface.
In a possible implementation, the AP includes an SPR integrated circuit or SPR hardened hardware logic, which is specifically used to perform sub-pixel rendering SPR on the original image to be displayed to obtain the SPR image.
In a possible implementation, the SPR image is an RGBG image, and the AP further includes 4 processing channels used to separately encode the 4 components R, G, B, G of the RGBG image to obtain the first data stream, where each of the 4 processing channels processes one component.
In a possible implementation, the AP is specifically used to convert the RGBG image into a luminance-chrominance separated SPR image including 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component; the 4 processing channels are specifically used to separately encode the 4 components U, Y, V, and Y of the luminance-chrominance separated SPR image to obtain the first data stream, where each of the 4 processing channels processes one component.
In the embodiments of this application, converting RGBG into the luminance-chrominance separated color space further reduces the amount of image data and saves transmission bandwidth; moreover, because the correlation between the luminance and chrominance of the image is reduced, at the same compression ratio the image distortion caused by compression and decoding is reduced, so that the decoded image is displayed better with less distortion.
In a possible implementation, the luminance-chrominance separated SPR image includes: a UYVY image, a YVYU image, a YUYV image, or a VYUY image.
In a possible implementation, the decoded image includes the 4 components U, Y, V, and Y, and the DDIC is further used to convert the four components U, Y, V, and Y of the decoded image into the four components R, G, B, G of an RGBG image.
In a possible implementation, the SPR image is an RGB delta image, and the AP is specifically used to: perform format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels of the mapped RGB image equals the number of sub-pixels of the RGB delta image; convert the RGB image into the luminance-chrominance separated YCoCg color space to obtain a YCoCg image including 3 components Y, Co, Cg; and encode the YCoCg image to obtain the first data stream.
In a possible implementation, the AP includes 4 processing channels, and 3 of the 4 processing channels are used to separately encode the 3 components R, G, B of the RGB image to obtain the first data stream, where one processing channel processes one component; or, 3 of the 4 processing channels are used to separately encode the 3 components Y, Co, Cg of the YCoCg image to obtain the first data stream, where one processing channel processes one component.
The 4 processing channels of the AP can simultaneously encode either 4-component data or 3-component data; when 3-component data is encoded, any 3 of the 4 channels simultaneously encode the 3 components of the image respectively.
In a possible implementation, the decoded image is the YCoCg image, and the DDIC is further used to: convert the three components Y, Co, Cg of the YCoCg image into the three components R, G, B of the RGB image; and convert the RGB image into an RGB delta image that can be displayed on the SPA display screen.
In a possible implementation, the AP is integrated on a system-on-chip SOC, and the DDIC is outside the SOC.
In a possible implementation, the apparatus further includes a folding screen including a first display screen and a second display screen; the DDIC includes a first DDIC used to drive the first display screen and a second DDIC used to drive the second display screen.
An image processing apparatus with multiple folding screens has multiple DDICs. Since SPR processing needs to refer to surrounding pixels, if SPR processing is performed in the DDIC, pixels at the junction of the folding screens need to refer to pixels of the adjacent display screen during SPR processing, so data interaction exists between the multiple DDICs. The embodiments of this application place SPR processing on the AP side, which avoids data interaction between multiple DDICs, reduces inter-chip data interaction and data processing, and avoids electromagnetic interference (EMI) and electrostatic discharge (ESD) problems.
In a possible implementation, the transmission interface is a Mobile Industry Processor Interface MIPI, a MIPI-standardized Display Serial Interface DSI, or a VESA-standardized embedded DisplayPort eDP.
A second aspect of this application provides an apparatus for image data processing, the apparatus including an application processor AP and a display driver integrated circuit DDIC. The AP includes a sub-pixel rendering processing core, an encoder, and a sending interface; the DDIC includes a receiving interface and a decoder. The sub-pixel rendering processing core is used to perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed. The encoder is used to encode the SPR image to obtain a first data stream; the sending interface is used to send the first data stream to the receiving interface of the DDIC; the decoder is used to decode the first data stream to obtain a decoded SPR image.
To meet ever-increasing resolution requirements, electronic devices with high-quality image or video display needs usually adopt SPA display screens, and the original three-component input image data must undergo SPR processing before it can be displayed correctly on an SPA display screen. In the architecture of the image processing apparatus proposed in the embodiments of this application, SPR processing is completed in the AP. Since the number of sub-pixels contained in each pixel of the image data after SPR processing becomes smaller, the amount of image data is greatly reduced. Exemplarily, if the image data after SPR processing is data in an RGBG or RGB delta arrangement, the data after SPR processing is 2/3 of the original image data; if the image data after SPR processing is data in an SPR1.5 arrangement, the data after SPR processing is 1/2 of the original image data. Since the amount of image data after SPR processing is greatly reduced, the amount of data after encoding is also greatly reduced, the bandwidth and power consumption required to transmit the image data from the AP to the DDIC are greatly reduced, and the amount of data processed by the DDIC and the storage space on the DDIC side are both greatly reduced, which greatly improves the performance of the image processing apparatus. Moreover, for the SPR processing core, because the degree of integration on the AP side is higher, the hardware logic area occupied by the SPR processing core is smaller, and the cost and power consumption of the SPR processing core, or SPR Intellectual Property (IP) core, itself are also reduced.
In a possible implementation, the sub-pixel rendering processing core is a dedicated integrated hardware circuit or a dedicated hardened hardware core.
In a possible implementation, the DDIC further includes a driver used to drive the display screen to display the decoded SPR image.
In a possible implementation, the AP further includes a first color space conversion module used to convert the SPR image into the luminance-chrominance separated color space to obtain a luminance-chrominance separated SPR image; the encoder is specifically used to encode the luminance-chrominance separated SPR image to obtain the first data stream.
In the embodiments of this application, converting RGBG into the luminance-chrominance separated color space further reduces the amount of image data and saves transmission bandwidth; moreover, because the correlation between the luminance and chrominance of the image is reduced, at the same compression ratio the image distortion caused by compression and decoding is reduced, so that the decoded image is displayed better with less distortion.
In a possible implementation, the first color space conversion module is used to perform stretch-rotation on the SPR image.
In a possible implementation, the first color space conversion module is specifically used to perform color space conversion on the SPR image based on a color space conversion matrix.
In a possible implementation, the decoded SPR image is located in the luminance-chrominance separated color space, and the DDIC further includes a second color space conversion module used to convert the decoded SPR image from the luminance-chrominance separated color space back to the color space in which the SPR image is located.
In a possible implementation, the SPR image is an RGBG image, the luminance-chrominance separated color space is the YUV color space, and the first color space conversion module is specifically used to convert the RGBG image into the YUV color space to obtain the luminance-chrominance separated SPR image, which includes 4 components P0, P1, P2, and P3 containing two luminance signal components and two chrominance signal components.
In a possible implementation, the luminance-chrominance separated SPR image includes: a UYVY image, a YVYU image, a YUYV image, or a VYUY image.
In a possible implementation, the decoded SPR image is located in the YUV color space and contains the 4 components P0, P1, P2, and P3; the second color space conversion module is specifically used to convert the four components P0, P1, P2, and P3 of the decoded SPR image into the four components R, G, B, G of an RGBG image.
In a possible implementation, the SPR image is an RGB delta image, the luminance-chrominance separated color space is the YCoCg color space, and the AP further includes a first format mapping module used to convert the RGB delta image into an RGB image; the first color space conversion module is specifically used to convert the RGB image into the YCoCg color space to obtain a YCoCg image including 3 components Y, Co, Cg.
In a possible implementation, the decoded SPR image is the YCoCg image; the second color space conversion module is specifically used to convert the three components Y, Co, Cg of the YCoCg image into the three components R, G, B of the RGB image; the DDIC further includes a second format mapping unit used to convert the RGB image into the RGB delta image.
In a possible implementation, the encoder includes 4 processing channels. When the SPR image is an RGBG image, the 4 processing channels are used to separately encode the 4 components R, G, B, G of the RGBG image to obtain the first data stream, where one processing channel processes one component; or, when the SPR image is an RGB delta image, the apparatus further includes a first format mapping unit used to convert the RGB delta image into an RGB image, and 3 of the 4 processing channels separately encode the 3 components R, G, B of the RGB image to obtain the first data stream, where one processing channel processes one component.
In a possible implementation, the encoder includes 4 processing channels used to separately encode the 4 components P0, P1, P2, and P3 to obtain the first data stream; or, 3 of the 4 processing channels are used to separately encode the 3 components Y, Co, Cg of the YCoCg image to obtain the first data stream.
In a possible implementation, the AP is integrated on a system-on-chip SOC, and the DDIC is outside the SOC.
In a possible implementation, the transmission interface is a Mobile Industry Processor Interface MIPI, a MIPI-standardized Display Serial Interface DSI, or a VESA-standardized embedded DisplayPort eDP.
In an optional implementation, the transmission interface includes an HDMI or V-By-One interface.
In a possible implementation, the DDIC further includes a screen brightness compensator used to perform Demura processing on the decoded SPR image to obtain the SPR image for target display.
In a possible implementation, the AP further includes an encapsulation module used to format-encapsulate the first data stream to obtain a second data stream matching the sending interface and the receiving interface; the sending interface is specifically used to send the second data stream to the receiving interface of the DDIC. The DDIC further includes a decapsulation module used to decapsulate the second data stream received by the receiving interface to obtain the first data stream.
In a possible implementation, the sending interface is used to format-encapsulate the first data stream to obtain a second data stream matching the sending interface and the receiving interface, and to send the second data stream to the receiving interface of the DDIC; the receiving interface is used to decapsulate the second data stream to obtain the first data stream.
A third aspect of this application provides an apparatus for image data processing, the apparatus including a processor and a transmission interface. The processor is used to: perform sub-pixel rendering SPR on the original image to be displayed to obtain an SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed; encode the SPR image to obtain a first data stream; and send the first data stream through the transmission interface.
It should be understood that the transmission interface may be regarded as part of the processor, and the processor sends or receives data through the transmission interface. When the image data processing apparatus is a chip with image data processing functions, the processor and the transmission interface together constitute the chip; the processor is the hardened hardware circuit and/or hardened hardware logic with arithmetic processing functions in the chip, in which drivers and other program instructions are solidified, and the transmission interface is the interface in the chip that receives or sends data.
In a possible implementation, the processor sends the first data stream to the display driver integrated circuit DDIC through the transmission interface.
In a possible implementation, the SPR image is an RGBG image, and the processor further includes 4 processing channels used to separately encode the 4 components R, G, B, G of the RGBG image to obtain the first data stream, where each of the 4 processing channels processes one component.
In a possible implementation, the processor is specifically used to convert the RGBG image into a luminance-chrominance separated SPR image including 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component; the 4 processing channels are specifically used to separately encode the 4 components U, Y, V, and Y of the luminance-chrominance separated SPR image to obtain the first data stream, where each of the 4 processing channels processes one component.
In a possible implementation, the SPR image is an RGB delta image, and the processor is specifically used to: perform format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels of the mapped RGB image equals the number of sub-pixels of the RGB delta image; convert the RGB image into the luminance-chrominance separated YCoCg color space to obtain a YCoCg image including 3 components Y, Co, Cg; and encode the YCoCg image to obtain the first data stream.
In a possible implementation, the processor is integrated on a system-on-chip SOC, and the DDIC is outside the SOC.
In a possible implementation, the transmission interface is a Mobile Industry Processor Interface MIPI, a MIPI-standardized Display Serial Interface DSI, or a VESA-standardized embedded DisplayPort eDP.
In a possible implementation, the processor includes 4 processing channels, and 3 of the 4 processing channels are used to separately encode the 3 components R, G, B of the RGB image to obtain the first data stream, where one processing channel processes one component; or, 3 of the 4 processing channels are used to separately encode the 3 components Y, Co, Cg of the YCoCg image to obtain the first data stream, where one processing channel processes one component.
A fourth aspect of this application provides a display driver integrated circuit DDIC, the DDIC including a decoder and a transmission interface. The transmission interface is used to receive a first data stream that includes a sub-pixel rendered SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed; the decoder is used to decode the first data stream to obtain decoded image data.
In a possible implementation, the first data stream is sent by an application processor AP.
In a possible implementation, the SPR image is an RGBG image, the decoded image data includes 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component, and the DDIC further includes a first color space conversion processing integrated circuit used to convert the four components U, Y, V, and Y of the decoded image into the four components R, G, B, G of an RGBG image.
In a possible implementation, the SPR image is an RGB delta image, the decoded image data is an RGB image, and the DDIC further includes a format mapping processing integrated circuit or format mapping processing hardened hardware logic used to perform format mapping on the RGB image to obtain the RGB delta image.
In a possible implementation, the decoder includes 4 processing channels.
In a possible implementation, the DDIC includes a first sub-DDIC and a second sub-DDIC, and the first sub-DDIC and the second sub-DDIC are coupled.
In a possible implementation, the AP is integrated on a system-on-chip SOC, and the DDIC is outside the SOC.
In a possible implementation, the transmission interface is a Mobile Industry Processor Interface MIPI, a MIPI-standardized Display Serial Interface DSI, or a VESA-standardized embedded DisplayPort eDP.
A fifth aspect of this application provides a method for image data processing, the method including: performing sub-pixel rendering SPR on an original image to be displayed to obtain an SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed; encoding the SPR image to obtain a first data stream; and sending the first data stream through a transmission interface.
In a possible implementation, the SPR image is an RGBG image, and encoding the SPR image to obtain the first data stream specifically includes: separately encoding the 4 components R, G, B, G of the RGBG image to obtain the first data stream.
In a possible implementation, encoding the SPR image to obtain the first data stream specifically includes: converting the RGBG image into a luminance-chrominance separated SPR image including 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component; and separately encoding the 4 components U, Y, V, and Y of the luminance-chrominance separated SPR image to obtain the first data stream.
In a possible implementation, the luminance-chrominance separated SPR image includes: a UYVY image, a YVYU image, a YUYV image, or a VYUY image.
In a possible implementation, the SPR image is an RGB delta image, and encoding the SPR image to obtain the first data stream specifically includes: performing format mapping on the RGB delta image to obtain a mapped RGB image, where the number of sub-pixels of the mapped RGB image equals the number of sub-pixels of the RGB delta image; converting the RGB image into the luminance-chrominance separated YCoCg color space to obtain a YCoCg image including 3 components Y, Co, Cg; and encoding the YCoCg image to obtain the first data stream.
A sixth aspect of this application provides a method for image data processing, the method including: receiving a first data stream that includes a sub-pixel rendered SPR image that can be displayed on a sub-pixel arrangement SPA display screen, where the number of physical sub-pixels of the SPA display screen is smaller than the number of sub-pixels of the original image to be displayed, and the number of sub-pixels of the SPR image is smaller than the number of sub-pixels of the original image to be displayed; and decoding the first data stream to obtain decoded image data.
In a possible implementation, the SPR image is an RGBG image, the decoded image data includes 4 components U, Y, V, and Y, where U and V are chrominance signal components and Y is a luminance signal component, and the method further includes: converting the four components U, Y, V, and Y of the decoded image into the four components R, G, B, G of an RGBG image.
In a possible implementation, the SPR image is an RGB delta image, the decoded image data is an RGB image, and the method further includes: performing format mapping on the RGB image to obtain the RGB delta image.
In a possible implementation, the SPR image is an RGB delta image, the decoded image data is the YCoCg image, and the method further includes: converting the three components Y, Co, Cg of the YCoCg image into the three components R, G, B of the RGB image; and converting the RGB image into an RGB delta image that can be displayed on the SPA display screen.
本申请第七方面提供了一种图像数据处理的方法,该方法包括:AP对原始待显示图像进行子像素渲染SPR,得到SPR图像,该SPR图像能够在子像素排布SPA显示屏上显示,该SPA显示屏的物理子像素的个数小于该原始待显示图像的子像素的个数,该SPR图像的子像素的个数小于该原始待显示图像的子像素的个数;该AP对该SPR图像进行编码,得到第一数据流;该AP将该第一数据流发送给该DDIC;该DDIC解码该第一数据流,得到解码SPR图像。
在一种可能的实施方式中,该方法还包括:该DDIC驱动显示屏显示该解码SPR图像。
在一种可能的实施方式中,该AP对该SPR图像进行编码,得到第一数据流,具体包括:该AP将该SPR图像转换到亮色分离的颜色空间中,得到亮色分离的SPR图像;该AP对该亮色分离的SPR图像进行编码,得到该第一数据流。
在本申请实施例中,将RGBG转换到亮色分离的色彩空间中,进一步降低了图像数据的数据量,节省了传输带宽;并且,由于降低了图像的亮度与色度的相关性,在相同的压缩比下,减少了由压缩和解码导致的图像失真,使得解码获得的图像显示效果更好、失真更少。
在一种可能的实施方式中,该AP将该SPR图像转换到亮色分离的颜色空间中,具体包括:该AP基于颜色空间转换矩阵将该SPR图像转换到亮色分离的颜色空间中。
在一种可能的实施方式中,该解码SPR图像位于该亮色分离的颜色空间,该方法还包括:DDIC将该解码SPR图像从该亮色分离的颜色空间转换到该SPR图像所在的颜色空间。
在一种可能的实施方式中,该SPR图像为RGBG图像,该亮色分离的颜色空间为YUV颜色空间,该AP将该SPR图像转换到亮色分离的颜色空间中,具体包括:该AP将该RGBG图像转换到该YUV颜色空间,得到该亮色分离的SPR图像,该亮色分离的SPR图像包括4个分量P0、P1、P2和P3,该4个分量P0、P1、P2和P3中包含两个亮度信号分量和两个色度信号分量。
在一种可能的实施方式中,该亮色分离的SPR图像包括:UYVY图像、YVYU图像、YUYV图像或VYUY图像。
在一种可能的实施方式中,该解码SPR图像位于该YUV颜色空间,该解码SPR图像包含该4个分量P0、P1、P2和P3,该DDIC将该解码SPR图像从该亮色分离的颜色空间转换到该SPR图像所在的颜色空间,具体包括:该DDIC将该解码SPR图像的四个分量P0、P1、P2和P3转换为RGBG图像的四个分量R、G、B、G。
在一种可能的实施方式中,该SPR图像为RGB delta图像,该亮色分离的颜色空间为YCoCg颜色空间,该AP将该SPR图像转换到亮色分离的颜色空间中,具体包括:该AP将该RGB delta图像转换为RGB图像;该AP将该RGB图像转换到该YCoCg颜色空间,得到YCoCg图像,该YCoCg图像包括3个分量Y、Co、Cg。
在一种可能的实施方式中,该解码SPR图像为该YCoCg图像,该DDIC将该解码SPR图像从该亮色分离的颜色空间转换到该SPR图像所在的颜色空间,具体包括:该DDIC将该YCoCg图像的三个分量Y、Co、Cg转换为该RGB图像的三个分量R、G、B;该DDIC将该RGB图像转换为该RGB delta图像。
在一种可能的实施方式中,该方法还包括:该DDIC对该解码SPR图像进行Demura处理,得到目标显示的SPR图像。
在一种可能的实施方式中,该方法还包括:该AP对该第一数据流进行格式封装,得到第二数据流,该第二数据流为与该发送接口和该接收接口相匹配的数据流;将该第二数据流发送给该DDIC的接收接口;该DDIC对该接收接口接收的该第二数据流进行解封装,得到该第一数据流。
本申请第八方面提供了一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第五方面或者其任一种可能的实施方式中所述的方法。
本申请第九方面提供了一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第六方面或者其任一种可能的实施方式中所述的方法。
本申请第十方面提供了一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第七方面或者其任一种可能的实施方式中所述的方法。
本申请第十一方面提供了一种包含指令的计算机程序产品,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第五方面或者其任一种可能的实施方式中所述的方法。
本申请第十二方面提供了一种包含指令的计算机程序产品,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第六方面或者其任一种可能的实施方式中所述的方法。
本申请第十三方面提供了一种包含指令的计算机程序产品,当其在计算机或处理器上运行时,使得该计算机或处理器执行如上述第七方面或者其任一种可能的实施方式中所述的方法。
附图说明
图1a为本申请实施例提供的一种示例性的传统显示屏常用的像素排布方式RGB Stripe显示屏;
图1b为本申请实施例提供的一种示例性的SPA排布显示屏;
图1c为本申请实施例提供的另一种示例性的SPA排布显示屏;
图1d为本申请实施例提供的另一种示例性的SPA排布显示屏;
图2a为本申请实施例提供的一种示例性的图像处理装置的架构示意图;
图2b为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图3为本申请实施例提供的一种示例性的对SPR图像数据进行编码的示例;
图4a为本申请实施例提供的另一种对SPR图像数据进行编码的示例;
图4b为本申请实施例提供的另一种对SPR图像数据进行编码的示例;
图5a为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图5b为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图5c为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图6a为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图6b为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图6c为本申请实施例提供的另一种示例性的图像处理装置的架构示意图;
图7为本申请实施例提供的一种示例性的颜色空间转换的输入输出示意图;
图8为本申请实施例提供的另一种示例性的颜色空间转换的输入输出示意图;
图9为本申请实施例提供的一种示例性的图像处理装置的硬件架构示意图;
图10为本申请实施例提供的一种图像处理的方法流程示意图;
图11为本申请实施例提供的另一种图像处理的方法流程示意图。
具体实施方式
本申请的说明书实施例和权利要求书及上述附图中的术语“第一”、“第二”等是用于区别类似的对象,而不必用于描述特定的顺序或先后次序。此外,术语“包括”和“具有”以及他们的任何变形,意图在于覆盖不排他的包含,例如,包含了一系列步骤或单元。方法、系统、产品或设备不必限于清楚地列出的那些步骤或单元,而是可包括没有清楚地列出的或对于这些过程、方法、产品或设备固有的其它步骤或单元。
应当理解,在本申请中,“至少一个(项)”是指一个或者多个,“多个”是指两个或两个以上。“和/或”,用于描述关联对象的关联关系,表示可以存在三种关系,例如,“A和/或B”可以表示:只存在A,只存在B以及同时存在A和B三种情况,其中A,B可以是单数或者复数。字符“/”一般表示前后关联对象是一种“或”的关系。“以下至少一项(个)”或其类似表达,是指这些项中的任意组合,包括单项(个)或复数项(个)的任意组合。例如,a,b或c中的至少一项(个),可以表示:a,b,c,“a和b”,“a和c”,“b和c”,或“a和b和c”,其中a,b,c可以是单个,也可以是多个。
首先,为了便于理解本申请实施例,对本申请实施例涉及的一些概念或术语进行解释。
视频电子标准协会(Video Electronics Standards Association,VESA):制定计算机和小型工作站视频设备标准的国际组织,图像压缩方面的标准工作组之一,VESA在2014年发布了显示数据流压缩(Display Stream Compression,DSC)算法。
DSC:图像压缩方面的标准压缩算法之一,2014年由VESA发布,具备实时编码解码的特性、固定的压缩比例、低成本和高显示质量等特点。DSC压缩算法与传统的压缩算法相比,不需要更多帧数据,在多媒体芯片内部无需更多的存储资源,并且与显示接口兼容。DSC主要针对高端电子设备的高质量音视频传输需求,示例性的,DSC压缩算法适用于对基于移动产业处理器接口(Mobile Industry Processor Interface,MIPI)传输音视频数据时对音视频数据进行编码。
颜色空间(Color Space):颜色可以是眼睛对于不同频率的光线的不同感受,也可以表示客观存在的不同频率的光。颜色空间是人们建立起用来表示色彩的坐标系统所定义的色彩范围。色域与色彩模型一起定义一个颜色空间。其中,色彩模型是用一组颜色成分表示颜色的抽象数学模型。色彩模型例如可以包括三原色光模式(red green blue,RGB)、印刷四色模式(cyan magenta yellow key plate,CMYK)。色域是指一个系统能够产生的颜色的总合。示例性的,Adobe RGB和sRGB是两个基于RGB模型的不同的颜色空间。
RGB颜色空间(Red Green Blue color space):RGB定义了红色、绿色与蓝色三原色的颜色,其中,当三原色中某一个颜色的色彩值取最大值、其它两个颜色的色彩值均为零时,所对应的颜色即为该取最大值的颜色。示例性的,红色、绿色与蓝色三原色中,色彩值R、G和B的取值均为0-255,则当R、G取值均为零、B取值为255时,所对应的颜色表示蓝色。RGB格式的图像中每个像素包含R、G、B三个子像素。
YUV:为一种亮色分离的颜色空间,YUV格式的图像中每个像素包含一个亮度分量Y、一个色度分量U、一个色度分量V,其中,Y表示明亮度(Luminance、Luma),U和V表示色度(Chroma)。
YCoCg:为一种亮色分离的颜色空间,具有良好的变换编码增益。YCoCg格式的图像中每个像素包含一个亮度值Y,两个色度值Co和Cg,其中Co表示橙色色度(chrominance orange),Cg表示绿色色度(chrominance green)。RGB颜色空间和YCoCg颜色空间转换关系如公式1所示:
公式1:
Y = (R + 2G + B)/4
Co = (R - B)/2
Cg = (-R + 2G - B)/4
其逆变换为:R = Y + Co - Cg,G = Y + Cg,B = Y - Co - Cg。
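作为一个示意性的说明(并非本申请装置中的固化硬件实现),下面的Python片段按标准YCoCg变换实现RGB与YCoCg的相互转换,可用于验证该变换是可逆的:

```python
# 示意性实现:标准YCoCg变换(公式1)的正变换与逆变换。
# 仅用于说明原理,并非本申请的硬件实现。

def rgb_to_ycocg(r, g, b):
    y = r / 4 + g / 2 + b / 4
    co = r / 2 - b / 2
    cg = -r / 4 + g / 2 - b / 4
    return y, co, cg

def ycocg_to_rgb(y, co, cg):
    # 逆变换:R = Y + Co - Cg, G = Y + Cg, B = Y - Co - Cg
    r = y + co - cg
    g = y + cg
    b = y - co - cg
    return r, g, b

y, co, cg = rgb_to_ycocg(255, 0, 0)
print((y, co, cg))              # (63.75, 127.5, -63.75)
print(ycocg_to_rgb(y, co, cg))  # (255.0, 0.0, 0.0)
```

可以看到,纯红色像素经正变换、逆变换后可以无损地还原。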
UYVY:属于YUV颜色空间的一种数据格式,UYVY格式的图像中每2个像素包含两个Y分量,一个U分量和一个V分量。
表1
像素    RGB颜色空间    YUV颜色空间    UYVY格式
P0      R0、G0、B0     Y0、U0、V0     U0、Y0
P1      R1、G1、B1     Y1、U1、V1     V0、Y1
P2      R2、G2、B2     Y2、U2、V2     U1、Y2
P3      R3、G3、B3     Y3、U3、V3     V1、Y3
如表1所示,P0-P3表示4个像素,当这4个像素在RGB颜色空间中时,每个像素包含一个R、G、B,当这4个像素在YUV颜色空间中时,每个像素包含一个Y、U、V,当这4个像素采用UYVY格式时,每2个像素包含两个Y、一个U和一个V。
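下面用一个示意性的Python片段说明表1中UYVY的打包方式:每2个像素共享一个U分量和一个V分量,4个像素的数据量从YUV 4:4:4的12个分量降为8个分量。其中色度的取法(直接取每对像素中第一个像素的U、V)仅为本示例的假设,实际的色度下采样方式可能不同:

```python
# 示意:将YUV 4:4:4像素序列打包为UYVY(每2个像素共享一组U/V)。
# 色度取样方式(取每对中第一个像素的U/V)仅为演示假设。

def pack_uyvy(yuv_pixels):
    packed = []
    for i in range(0, len(yuv_pixels), 2):
        (y0, u0, v0), (y1, _u1, _v1) = yuv_pixels[i], yuv_pixels[i + 1]
        packed += [u0, y0, v0, y1]   # 按 U Y V Y 的顺序排列
    return packed

pixels = [(16, 128, 128), (32, 130, 126), (48, 120, 140), (64, 121, 139)]
print(pack_uyvy(pixels))
# [128, 16, 128, 32, 120, 48, 140, 64]
```

4个像素打包后只剩8个分量,数据量降为4:4:4时的2/3。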
每英寸的像素数(Pixels Per Inch,PPI)越高的显示面板能够提供更加细腻精美的画面,提升客户观看的视觉效果。PPI越高,显示面板包含的像素就越多,显示分辨率就越高。然而显示装置的显示面板的尺寸往往是有限制的,若要在尽量不扩展显示面板尺寸的情况下提升显示面板的分辨率,会大大提升显示面板的工艺难度和制造成本。为了克服该问题,业界采用子像素排布(sub-pixel arrangement,SPA)的显示面板,SPA显示面板的物理子像素个数比要显示的图像的子像素的个数少。
如图1a所示,为本申请实施例提供的一种传统显示屏常用的像素排布方式RGB Stripe显示屏,在RGB Stripe显示屏中,每个像素包含三个子像素,分别为红色子像素R、绿色子像素G和蓝色子像素B。
如图1b所示,为本申请实施例提供的一种示例性的SPA显示屏,图1b中的SPA显示屏采用RGBG排布,每个像素包含两个子像素,图1b中第a行中的像素1包含R、G两个子像素,像素2包含B、G两个子像素,像素3包含R、G两个子像素,像素4包含B、G两个子像素,以RG、BG的组合交替出现。
如图1c所示,为本申请实施例提供的另一种示例性的SPA显示屏,图1c中的SPA显示屏采用RGB delta排布,每个像素均包含RGB三个子像素,相邻两个像素存在共用的子像素,以图1c中的第a行为例,像素1和像素2共用一个蓝色子像素B2,像素2和像素3共用红色子像素R2和绿色子像素G3。
可以看出,在RGB Stripe排布的显示屏中,三个像素由9个子像素组成;在RGBG排布的显示屏中,三个像素由6个子像素组成;在RGB delta排布的显示屏中,三个像素由6个子像素组成。因此,当像素个数相同时,RGBG排布和RGB delta排布比RGB Stripe排布所需的子像素个数更少。示例性的,假如待显示图像的尺寸为3840*2160*3,RGBG排布的SPA显示屏只需要有3840*2160*2个子像素就可以正常显示待显示图像,在保证分辨率的前提下大大降低了显示屏的子像素的个数。但是,原始图像数据必须经过子像素渲染(Sub-pixel Rendering,SPR)处理才可以在SPA屏上正常显示。SPR技术是通过专有子像素呈现算法使得高分辨率的图像可以显示在低分辨率的显示面板上的技术。应当理解,RGB和RGBG都属于RGB颜色空间。
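以上面的4K例子作一个简单的数据量估算(假设每个子像素占用相同的位宽,仅为示意):

```python
# 示意:比较RGB Stripe与RGBG/RGB delta排布下一帧图像的子像素总数。
w, h = 3840, 2160
rgb_subpixels = w * h * 3   # RGB Stripe:每像素3个子像素
spr_subpixels = w * h * 2   # RGBG / RGB delta:平均每像素2个子像素
print(rgb_subpixels, spr_subpixels, spr_subpixels / rgb_subpixels)
# 24883200 16588800 0.6666666666666666
```

即SPR处理后的数据量约为原始图像数据的2/3,与正文的结论一致。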
RGBG排布和RGB delta排布为两种示例性的SPA显示屏的排布方式,且这两种排布方式像素个数与子像素个数的比例都是1:2,可选的,还存在其他排布方式的SPA显示屏,在一些SPA显示屏中,一个像素包含的子像素个数也可以小于2,例如一个像素可以只包含1.75个子像素或者一个像素只包含1.5个子像素,如图1d所示,为一种示例性的SPA排布方式SPR1.5,在SPR1.5排布中,一个像素包含1.5个子像素,在该显示面板中,有4个像素,6个子像素,相当于每个像素包含1.5个子像素。
随着显示屏的显示分辨率越来越高,图像处理模块与图像显示模块之间的数据传输量也越来越大,因此传输图像数据的带宽和功耗都随之增加。本申请实施例提出一种图像处理装置的新架构,在本申请实施例提供的图像处理装置中,图像处理模块传到图像显示驱动模块的数据量大大减小,从而降低图像数据从图像处理模块传到图像显示驱动模块所需的传输带宽和功耗。本申请实施例提出的架构适用于有源矩阵有机发光二极体(Active-matrix organic light emitting diode,AMOLED)、液晶显示器(Liquid Crystal Display,LCD)、微发光二极管显示器(Micro Light Emitting Diode Display,microLED)等显示屏,并且适用的产品形态不局限于手机、平板电脑、照相机、摄像机等可移动终端,同时可适用于电脑、电视等具有显示屏的电子设备。示例性的,本申请实施例提出的架构尤其适用于有高质量图像或视频显示需求的电子设备。
如图2a所示,为本申请实施例提供的一种示例性的图像处理装置的架构示意图。
该图像处理装置200包括系统芯片(System on Chip,SOC)201,显示驱动集成电路(Display Driving Integrated Circuit,DDIC)202,以及显示屏203。图2a所示的架构中,SOC和DDIC是两个独立的芯片,待显示的图像数据在SOC 201中处理之后,传送到DDIC 202处理,最终显示在显示屏203上。DDIC是显示屏的集成电路芯片,是显示屏成像系统的主要部分,DDIC 202用于驱动显示屏203,还可以用于控制驱动电流,可选的,图像处理装置可以包含多个DDIC。示例性的,显示屏203可以是LCD、AMOLED、microLED、发光二级管(Light Emitting Diode,LED)显示器,有机发光二极管(Organic Light-Emitting Diode,OLED)显示屏、阴极射线管(Cathode Ray Tube,CRT)显示屏等。显示屏203为SPA显示屏,其排布方式包括但不限于RGBG排布、RGB delta排布或SPR1.5排布等。
SOC 201包括子像素渲染模块2011、编码器2012和发送接口2013。
子像素渲染模块2011,用于对待显示图像进行子像素渲染SPR处理,得到SPR图像数据。
由于显示屏203为SPA显示屏,显示屏包含的物理子像素的个数小于原始的待显示图像的子像素的个数,原始的待显示图像不经过SPR处理,无法显示在具有较低分辨率的显示屏203上。子像素渲染模块2011对待显示图像进行SPR处理之后,得到SPR图像数据,该SPR图像数据的子像素个数以及子像素的排布方式符合显示屏203,因此SPR数据可以正确显示在显示屏203上。示例性的,得到的SPR图像数据可以为RGBG排布的图像数据或者RGB delta排布的图像数据。SPR处理包括现有的各种子像素渲染处理,例如可以为申请号为201810281666.7,发明名称为《像素处理方法及装置》的中国专利申请中的SPR处理。示例性的,子像素渲染模块2011为SOC内专用的固化硬件逻辑或专用的硬件集成电路,例如可以是GPU内的一个固化硬件核。在一种可选的方案中,子像素渲染模块也可以是跑在处理器上的软件模块。
编码器2012,用于对2011处理得到的SPR数据进行编码压缩,得到压缩图像数据。
在一种可选的方案中,编码器2012包括4个处理通道,编码器2012可以对4个输入分量的图像数据进行编码压缩,也可以对小于4个输入分量的图像数据进行编码压缩,例如可以对3个输入分量进行编码压缩。当编码器2012对3个输入分量进行编码压缩时,输入分量与通道之间无对应性,从编码器的4个通道中任选3个通道即可。
如图3所示,为本申请实施例提供的一种对SPR图像数据进行编码的示例。SPR图像数据为RGB delta排布的数据,编码器包括4个通道:通道一、通道二、通道三和通道四。在编码器对RGB delta排布的数据进行编码压缩之前,对RGB delta数据进行格式映射得到映射RGB数据,映射RGB数据每个像素包含三个分量R、G、B,应当理解,映射前后,子像素个数保持不变,将RGB delta图像进行格式映射到RGB图像不会导致数据量增加。在图3中,R分量由通道一处理,G分量由通道二处理,B分量由通道三处理,通道四闲置,应当理解,图3中的通道分配方式只是一种示例,可选的,闲置的通道可以是4个通道中的任一个通道,且通道与数据分量之间也没有对应性,例如可以由通道一处理G分量,通道二处理B分量,通道三处理R分量,本申请实施例对此不做限定。
在一种可选的方案中,编码器中还包括颜色空间转换(Color Space Conversion,CSC)模块,示例性的,该CSC模块可以由专用的硬件集成电路或硬件逻辑实现。CSC模块用于将RGB数据转换为YUV或YCoCg数据,YUV或YCoCg数据包含3个子分量,每个子分量各由一个通道来处理。可选的,该CSC模块还可以在编码器外部,CSC模块将YUV或者YCoCg数据送给编码器处理。
如图4a所示,为本申请实施例提供的另一种对SPR图像数据进行编码的示例。
SPR图像数据为RGBG排布的数据,编码器也包括4个通道:通道一、通道二、通道三和通道四,可以同时处理四个输入分量的图像数据。RGBG排布的数据,每两个像素具有4个分量R、G、B、G,分别由编码器的通道一至通道四处理,得到压缩数据。应当理解,输入分量与通道之间不具有对应性,图4a只示出了输入分量与通道的一种示例性的对应形式,还可以存在其他对应形式,本申请对此不作限定。
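图4a中“每个通道处理一个分量”的拆分方式可以用如下示意性的Python片段表示,其中通道与分量的对应关系仅为一种假设:

```python
# 示意:把一行RGBG数据按 R、G(第一绿)、B、G(第二绿) 拆成4路,
# 分别送给编码器的4个处理通道。每2个像素为一组:像素1=(R,G),像素2=(B,G)。

def split_rgbg(row):
    # row形如 [R0, G0, B0, G1, R1, G2, B1, G3, ...],以4个子像素为一组
    ch1, ch2, ch3, ch4 = [], [], [], []
    for i in range(0, len(row), 4):
        r, g_a, b, g_b = row[i:i + 4]
        ch1.append(r)
        ch2.append(g_a)
        ch3.append(b)
        ch4.append(g_b)
    return ch1, ch2, ch3, ch4

row = [10, 20, 30, 40, 50, 60, 70, 80]
print(split_rgbg(row))
# ([10, 50], [20, 60], [30, 70], [40, 80])
```

4路分量流可以并行编码,各通道的结果共同构成压缩数据流。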
在一种可选的方案中,如图4b所示,在编码器对RGBG数据进行编码压缩之前,将RGBG数据经过颜色空间转换CSC得到UYVY数据,UYVY数据每两个像素包含4个分量:一个U分量、一个V分量、两个Y分量,将这4个分量分别送给编码器的4个通道,每个通道处理一路分量数据,得到压缩数据。在一种可选的方案中,格式映射或颜色空间转换也可以在编码器内实现,此时编码器还包括CSC模块。
示例性的,编码器2012使用的编码算法可以为DSC1.2,DSC1.2为VESA提出的一种压缩算法,DSC1.2具有4条处理通道,可以对UYVY数据、RGBG排布或RGB delta排布的SPR图像数据进行编码处理。编码器2012也可以采用DSC1.1压缩算法对SPR数据进行编码,但是由于DSC1.1只能对三分量图像数据进行编码,在采用DSC1.1对SPR数据进行编码之前,需要对SPR图像数据进行上采样,使得处理后的SPR图像数据中每个像素也包含3个分量。应当理解,三分量数据是指1个像素包含3个子像素的图像数据。
发送接口2013,用于将编码器2012得到的压缩数据发送给DDIC 202。
示例性的,发送接口2013可以为MIPI发送接口(或者说MIPI发送器)、MIPI标准化的显示串行接口(Display Serial Interface,DSI)或者VESA标准化的嵌入式显示端口(Embedded Display Port,eDP)。在一种可选的情况中,发送接口2013也可以是高清晰度多媒体接口(High Definition Multimedia Interface,HDMI)或者V-By-One接口,V-By-One接口是一种面向图像传输开发的数字接口标准。
在一种可选的方案中,编码器2012压缩编码得到的压缩数据与发送接口2013相匹配,或者说压缩数据的格式符合发送接口2013的格式要求。
在一种可选的方案中,系统芯片201还包括封装模块,用于对编码器得到的压缩数据进行格式封装,封装数据流符合发送接口2013的格式要求。应当理解,封装模块可以为固化的硬件逻辑或专用的集成电路。
在一种可选的方案中,发送接口2013发出的数据流配合发送接口的传输协议传送到DDIC的接收接口,示例性的,该传输协议为MIPI的D-phy或者C-phy。
应当理解,虽然图中未示出,SOC 201还可以包括以下中的至少一项:存储器、微处理器和微控制器(Microcontroller Unit,MCU)、应用处理器(Application Processor,AP)、图像信号处理器(Image Signal Processor,ISP)、数字信号处理器(Digital Signal Processor,DSP)、图形处理单元(Graphics Processing Unit,GPU)或者具有图像或视频处理功能的集成电路等。SOC 201还可以包括清晰度处理模块、对比度处理模块和颜色校正模块等,应当理解,清晰度处理模块、对比度处理模块和颜色校正模块等均可以为固化的硬件逻辑或专用的集成电路。子像素渲染模块2011可以位于AP、ISP、GPU或者其他通用处理器中,本申请实施例对子像素渲染模块的位置不做限定。
DDIC 202包括接收接口2021和解码器2022,可选的,DDIC202还可以包括屏幕亮度补偿器2023。
接收接口2021,用于接收发送接口2013发送过来的压缩数据。
示例性的,接收接口可以为MIPI接收器、MIPI标准化的DSI、VESA标准化的eDP、HDMI接收器或者V-By-One接口,接收接口2021与发送接口2013是相匹配的,当发送接口2013是MIPI发送器时,接收接口2021是MIPI接收器,当发送接口2013是HDMI发送器时,接收接口2021是HDMI接收器,以此类推。
在一种可选的方案中,接收接口2021接收的压缩数据与接收接口2021的格式要求相匹配。
在一种可选的方案中,DDIC 202还包括解封装模块,用于对接收的压缩数据进行格式解封装,得到解码器可以识别的压缩数据。
解码器2022,用于对压缩图像数据进行解码,得到SPR图像数据。
应当理解,解码器2022与编码器2012相匹配,解码器2022可以识别编码器2012得到的编码压缩数据,并可以将编码压缩数据解压为压缩前的SPR图像数据,SPR图像数据包含色彩信息,可以在显示屏上显示。示例性的,当编码器2012采用DSC1.2压缩算法时,对应的,解码器2022采用DSC1.2对应的解码算法。
屏幕亮度补偿器2023,用于对SPR图像数据进行亮度补偿,以补偿屏幕的亮度不一致的问题。
应当理解,显示屏幕能够做到单独控制每个子像素的开关,由于制作工艺的局限性,每个子像素的显示单元并不是完全一致,各种类型的显示屏或多或少都会出现一种亮度不均匀的现象,业内也称为Mura现象。所谓Mura现象是指在为每个子像素的显示单元设定相同的像素值情况下,每个子像素显示单元显示的亮度却是不一致的,所以人眼看上去显示面板的亮度是不均匀的。
示例性的,该屏幕亮度补偿器2023为Demura模块,Demura技术通过检测Mura区域、获取Mura区域的像素点的补偿数据,并基于这些补偿数据对Mura区域的像素进行补偿,以消除显示屏工艺限制带来的亮度不均匀现象。Demura模块用于对SPR图像数据进行Demura处理,以补偿屏幕Mura现象带来的亮度不均匀问题。
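Demura补偿的一种常见做法是为每个子像素存储一份补偿数据,下面的Python片段给出一个示意性的草图,其中逐子像素的线性增益/偏移模型仅为本示例的假设,实际的补偿模型和补偿数据的格式由具体面板和标定流程决定:

```python
# 示意:基于逐子像素的增益/偏移表做亮度补偿(线性模型仅为演示假设)。

def demura(pixels, gains, offsets):
    out = []
    for v, g, o in zip(pixels, gains, offsets):
        c = v * g + o
        out.append(max(0, min(255, round(c))))  # 裁剪到8bit灰阶范围
    return out

# 三个子像素输入同为100,经标定得到的补偿数据不同,输出亮度被拉平
print(demura([100, 100, 100], [1.0, 0.95, 1.05], [0, 2, -3]))
# [100, 97, 102]
```

补偿表(gains、offsets)通常在产线标定时逐屏生成并存入DDIC侧的存储器。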
可选的,DDIC 202还包括伽马Gamma校正器,用于对显示屏203进行gamma校正。
应当理解,图像数据在显示屏幕上显示时,由于人眼对不同亮度的敏感度不一致和屏幕自身的光电特性,屏幕显示出的图像亮度与图像的原始亮度之间存在着一定偏差。因此,屏幕显示出的图像和输入图像相比存在失真。屏幕显示器的物理特性决定了如果输入的灰阶值是线性变化的,输出的亮度值就不是线性的,对屏幕进行Gamma校正,可以消除屏幕显示出的图像与输入图像之间的偏差。应当理解,在一种可选的情况中,伽马校正器也可以在SOC的AP中。
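Gamma校正可以示意为对归一化灰阶做幂运算,下面是一个最小的Python示意,其中gamma=2.2仅为常见的假设取值,实际曲线由屏幕特性决定:

```python
# 示意:对0~1的归一化灰阶做gamma校正(gamma=2.2为假设的常见取值)。

def gamma_correct(value, gamma=2.2):
    # value为0~1的归一化灰阶,返回校正后的灰阶
    return value ** (1.0 / gamma)

print(gamma_correct(0.5))  # 中间灰阶被提升,约0.73
```

端点0和1保持不变,中间灰阶被非线性地抬升,用以抵消屏幕自身的光电响应。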
可选的,DDIC 202还包括数模转换器(Digital to analog converter,DAC),用于将数字图像转换为模拟信号,示例性的,该模拟信号为驱动电流,DDIC根据数字图像的像素值控制驱动电流的大小,使得不同像素值的像素在显示屏上显示出不同的亮度值。
可选的,DDIC 202还包括存储器,例如可以包括静态随机存储器(Static Random Access Memory,SRAM)。由于SPR处理之后的图像数据量极大减小,在DDIC侧占用的存储空间也变小。
DDIC 202将经过Demura和gamma校正的SPR图像数据发送给显示屏203进行显示。
具有高质量图像或视频显示需求的电子设备为了满足不断提升的分辨率要求,通常会采用SPA显示屏,原始输入的三分量的图像数据必须经过SPR处理才可以在SPA显示屏上正确显示。现有技术中,对图像数据适应显示屏的SPR处理是在DDIC中完成的,本申请实施例提出的图像处理装置的架构中,SPR处理在SOC中完成,由于SPR处理之后的图像数据每个像素包含的子像素个数变小,图像数据量大大减小,示例性的,如果SPR处理后的图像数据为RGBG排布或者RGB delta排布的数据,则SPR处理后的数据是原始图像数据的2/3;如果SPR处理后的图像数据为SPR1.5排布的数据,则SPR处理后的数据是原始图像数据的1/2。由于SPR处理后的图像数据量大大减小,对图像数据进行编码之后的数据量也大大减小,将图像数据从SOC传输到DDIC所需的带宽和功耗也大大降低,DDIC处理的数据量以及DDIC侧的存储空间都极大减小,大大提高了图像处理装置的性能。
并且对于SPR模块来说,由于SOC侧的集成度更高,SPR模块占用的硬件逻辑的面积更小,SPR模块或者说SPR知识产权(Intellectual Property,IP)核本身的成本和功耗也会减少。
进一步的,本申请实施例的编码器包含4条处理通道,可以对小于或等于4分量输入的图像数据进行编码处理,因此可以对SPR处理后的RGBG和RGB delta数据直接进行编码处理,并得到符合传输接口格式和传输协议的压缩数据。
在一种可选的方案中,图像处理装置也可以包括SOC和DDIC,而不包括显示屏,显示屏为图像处理装置之外的显示设备的显示屏,如图2b所示。关于系统芯片和显示驱动芯片请参考图2a部分的描述,此处不再赘述。显示屏300为图像处理装置200之外的电子设备的显示屏。示例性的,这种情况下,该图像处理装置可以为手机中包含SOC和DDIC的集成芯片或处理器产品,显示屏300为手机的显示屏,此时,图像处理装置200和显示屏300共同构成手机或其他具有显示屏的终端。
如图5a所示,为本申请实施例提供的另一种示例性的图像处理装置的架构示意图。图像处理装置500包括应用处理器AP 501、显示驱动集成电路DDIC 502和显示屏503,其中AP 501和DDIC 502集成在系统芯片SOC内部。AP 501包括子像素渲染模块5011、编码器5012和发送接口5013,DDIC 502包括接收接口5021、解码器5022和屏幕亮度补偿器5023。可选的,应用处理器AP 501还可以包括封装模块,DDIC 502还可以包括解封装模块、gamma校正器和存储器等。对于子像素渲染模块5011、编码器5012、发送接口5013、接收接口5021、解码器5022、屏幕亮度补偿器5023、封装模块、解封装模块、gamma校正器请参考前述实施例,此处不再赘述。DDIC的存储器用于存储接收接口的编码压缩的数据流,或者解码器解码得到的SPR图像数据。显示屏503为SPA显示屏,显示屏503的种类包括但不限于LCD、AMOLED、microLED、LED、OLED以及CRT等。
待显示图像数据在AP 501进行子像素渲染和编码压缩处理之后,通过发送接口将压缩图像数据传送给DDIC 502,接收接口接收到压缩的图像数据之后,进行解码处理,得到屏幕可以显示的SPR图像数据,进一步的,屏幕亮度补偿器5023对解码得到的SPR图像数据进行亮度补偿,以补偿屏幕的亮度不均匀问题。示例性的,5023对SPR图像数据进行Demura处理,以补偿屏幕的Mura现象带来的亮度不均匀问题。可选的,DDIC 502还包括DAC,用于将图像数据转换为驱动电流,DDIC驱动显示屏503,将图像显示在显示屏上。可选的,AP 501还可以包括预处理模块,预处理模块用于对图像数据进行清晰度处理、对比度调整、颜色增强、颜色校正、缩放处理等。
在一种可能的方案中,AP 501集成在SOC内部,DDIC 502位于SOC之外,为独立集成的芯片,如图5b所示。
在一种可能的方案中,图像处理装置500包括AP 501和DDIC 502,但是不包括显示屏503,如图5c所示。
在一种可能的方案中,图像处理装置可以包括AP,但是不包括DDIC和显示屏。
如图6a所示,为本申请实施例提供的另一种图像处理装置的架构示意图。
图像处理装置600包括系统芯片SOC 601、显示驱动集成电路DDIC 602和显示屏603。
SOC 601包括:
子像素渲染模块6011:用于对待显示图像进行子像素渲染SPR处理,得到SPR图像数据,SPR图像数据为可以在SPA屏上正常显示的图像数据。示例性的,子像素渲染模块6011为SOC内专用的固化硬件逻辑或专用的硬件集成电路,例如可以是GPU内的一个固化硬件核。
具体请参考对子像素渲染模块2011的描述,此处不再赘述。
颜色空间转换模块6012:用于将SPR图像转换到亮色分离的颜色空间中,得到亮色分离的SPR图像。
示例性的,该SPR图像可以属于RGB颜色空间,该SPR图像可以为RGBG图像或者RGB delta图像,该亮色分离的颜色空间可以为YUV颜色空间或者YCoCg颜色空间。
示例性的,该SPR图像为RGBG图像,该亮色分离的颜色空间为YUV颜色空间。此时,颜色空间转换模块6012:具体用于将RGBG图像转换到YUV颜色空间,得到该亮色分离的SPR图像,该亮色分离的SPR图像包括4个分量P0、P1、P2和P3,该4个分量P0、P1、P2和P3中包含两个亮度信号分量和两个色度信号分量。示例性的,P0、P1、P2和P3可以为:U、Y、V和Y。该亮色分离的SPR图像包括但不限于:UYVY图像、YVYU图像、YUYV图像或VYUY图像。
示例性的,该SPR图像为RGB delta图像,该亮色分离的颜色空间为YCoCg颜色空间。此时,SOC 601还包括第一格式映射模块,用于将RGB delta图像转换为RGB图像,应当理解,将RGB delta图像格式映射为映射RGB图像时,映射前后数据的子像素的个数保持不变,将RGB delta图像进行格式映射到RGB图像不会导致数据量增加;示例性的,如图1c所示为RGB delta排布,第a行包含像素1、像素2、像素3共3个像素,其中像素1包含(R1、G1、B2),像素2包含(R2、G3、B2),像素3包含(R2、G3、B3),像素1和像素2共用一个蓝色子像素B2,像素2和像素3共用红色子像素R2和绿色子像素G3。对RGB delta图像进行格式映射后得到的映射RGB图像子像素个数不变,第a行包含2个像素,分别为(R1、G1、B2)、(R2、G3、B3),格式映射前后,子像素的个数并没有发生变化。
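以上述第a行为例,下面的Python片段示意“格式映射前后子像素个数不变”:映射时去掉共用子像素的重复引用,每个物理子像素只保留一次(该映射规则为本示例的假设):

```python
# 示意:按上文第a行的共用关系,把3个RGB delta像素映射为2个RGB像素。
# 去重规则仅为演示假设:每个物理子像素只保留第一次出现。

delta_row = [("R1", "G1", "B2"), ("R2", "G3", "B2"), ("R2", "G3", "B3")]

def map_delta_to_rgb(row):
    seen, subpixels = set(), []
    for pix in row:
        for sp in pix:
            if sp not in seen:
                seen.add(sp)
                subpixels.append(sp)
    # 每3个物理子像素组成一个映射RGB像素
    return [tuple(subpixels[i:i + 3]) for i in range(0, len(subpixels), 3)]

print(map_delta_to_rgb(delta_row))
# [('R1', 'G1', 'B2'), ('R2', 'G3', 'B3')]
```

映射前后物理子像素均为6个(R1、G1、B2、R2、G3、B3),与正文的结论一致。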
颜色空间转换模块6012:具体用于将RGB图像转换到YCoCg颜色空间,得到YCoCg图像,YCoCg图像包括3个分量Y、Co、Cg。
将SPR图像转换到亮色分离的颜色空间中进一步降低了图像数据的数据量,节省了传输带宽;并且,在相同的压缩比下,减少了由压缩和解码导致的图像失真,使得解码获得的图像显示效果更好。
示例性的,颜色空间转换模块6012为SOC内专用的固化硬件逻辑或专用的硬件集成电路,可选的,6012可以为AP中的固化硬件逻辑或固化硬件核,或者可以为GPU中的固化硬件逻辑或固化硬件核。
示例性的,颜色空间转换模块6012中包括一个4*4的颜色空间转换系数矩阵,该颜色空间转换系数矩阵包括16个去相关系数:X0-X15,输入为RGBG图像的四个分量R、G、B、G,输出同样为四个分量P0、P1、P2和P3,如图7所示。此时,该颜色空间转换矩阵被固化在硬件逻辑或硬件核中,与输入的R、G、B、G相比,输出的P0、P1、P2、P3的亮度与色度之间的相关性大大减小。
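图7所示的4*4颜色空间转换可以示意为一次矩阵乘法。下面的Python片段中的矩阵系数仅为演示用的假设值,实际的X0-X15由具体设计确定并固化在硬件逻辑中:

```python
# 示意:用4x4系数矩阵把每2个像素的(R, G1, B, G2)转换为(P0, P1, P2, P3)。
# 矩阵系数仅为演示假设,并非实际固化在硬件中的X0~X15。

M = [
    [0.25, 0.50, 0.25, 0.00],    # P0
    [0.50, 0.00, -0.50, 0.00],   # P1
    [-0.25, 0.50, -0.25, 0.00],  # P2
    [0.00, 0.00, 0.00, 1.00],    # P3
]

def csc_4x4(rgbg):
    return [sum(m * v for m, v in zip(row, rgbg)) for row in M]

print(csc_4x4([255, 0, 0, 0]))
# [63.75, 127.5, -63.75, 0.0]
```

对应的逆变换(图8中的Y0-Y15)在DDIC侧同样可以用一个4*4矩阵乘法实现。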
编码器6013:用于对去相关处理后的SPR图像数据进行编码,得到压缩的图像数据。
具体请参考对编码器2012的描述,此处不再赘述。
发送接口6014,用于将压缩的图像数据发送给DDIC 602。具体请参考对发送接口2013的描述,此处不再赘述。
可选的,SOC 601还包括封装模块、图像预处理模块等。
DDIC 602包括:
接收接口6021,用于接收发送接口6014发送过来的压缩的图像数据。具体请参考对接收接口2021的描述,此处不再赘述。
解码器6022,用于对压缩的图像数据进行解码,得到解码SPR图像数据。具体请参考对解码器2022的描述,此处不再赘述。
颜色空间转换模块6023,用于将解码器得到的解码SPR图像数据转换到SPR图像数据所在的颜色空间。
解码器得到的解码SPR图像数据位于YUV颜色空间,解码SPR图像包含4个分量P0、P1、P2和P3,示例性的,该解码SPR图像数据可以包括但不限于:UYVY图像、YVYU图像、YUYV图像或VYUY图像。此时,颜色空间转换模块6023,具体用于将解码SPR图像的四个分量P0、P1、P2和P3转换为RGBG图像的四个分量R、G、B、G。
当解码SPR图像为YCoCg图像时,颜色空间转换模块6023,具体用于将YCoCg图像的三个分量Y、Co、Cg转换为所述RGB图像的三个分量R、G、B。此时,DDIC还包括第二格式映射单元,用于将RGB图像转换为RGB delta图像。应当理解,第二格式映射单元执行的是第一格式映射单元中格式映射的逆映射,第一格式映射单元将RGB delta数据映射为RGB数据,第二格式映射单元将RGB数据映射为RGB delta数据,在此过程中,图像的子像素个数保持不变。
示例性的,颜色空间转换模块6023为SOC内专用的固化硬件逻辑或专用的硬件集成电路,可选的,6023可以为AP中的固化硬件逻辑或固化硬件核,或者可以为GPU中的固化硬件逻辑或固化硬件核。
示例性的,颜色空间转换模块6023中也包括一个4*4的颜色空间转换矩阵,该矩阵包括16个系数:Y0-Y15,输入为四个像素分量P0、P1、P2和P3,输出为RGBG图像的四个像素分量R、G、B、G,如图8所示。此时,该矩阵被固化在硬件逻辑或硬件核中。示例性的,该矩阵为颜色空间转换模块6012中的颜色空间转换矩阵的逆矩阵;可选的,该矩阵也可以不是6012中的颜色空间转换矩阵的逆矩阵,例如可以对颜色空间转换矩阵的逆矩阵进行微调或者更换若干个系数得到6023中的矩阵。
屏幕亮度补偿器6024,用于对SPR图像数据进行亮度补偿,以消除屏幕亮度不一致的问题。具体请参考对屏幕亮度补偿器2023的描述,此处不再赘述。
可选的,DDIC 602还包括伽马Gamma校正器、DAC和存储器,具体请参考前述实施例的描述,此处不再赘述。
在SOC侧对待显示的图像数据进行子像素渲染SPR处理,大大降低了要传输的数据量,降低了将图像数据从SOC发送到DDIC所需的带宽和功耗,同时,也降低了DDIC侧处理的数据量和存储空间,降低了图像处理装置的功耗,提升了装置性能。进一步的,本申请实施例在SOC侧进行颜色空间转换,将SPR图像转换到亮色分离的SPR图像,进一步降低了图像的数据量,节省传输带宽,并且降低了图像的亮度与色度的相关性,减少了由压缩和解码导致的图像失真,在保证相同压缩比的前提下,解码之后得到的图像显示到显示屏上效果更好,失真更少。
并且对于SPR模块来说,由于SOC侧的集成度更高,SPR模块占用的硬件逻辑的面积更小,SPR模块或者说SPR知识产权(Intellectual Property,IP)核本身的成本和功耗也会减少。
在一种可选的方案中,图像处理装置600包含系统芯片601和DDIC 602,但是不包括显示屏603,如图6b所示,此时,图像处理装置600的产品形态可以是处理器芯片,处理器芯片和显示屏共同构成电子设备或移动终端,如智能手机、相机或者电视机等。
在一种可选的方案中,子像素渲染模块、颜色空间转换模块、编码器和发送接口均集成在AP内部,AP和DDIC集成在SOC中,如图6c所示。
在一种可能的方案中,图像处理装置包括折叠屏,例如图像处理装置的折叠屏包括第一显示屏和第二显示屏,第一显示屏和第二显示屏耦合,且第一显示屏和第二显示屏能够共同用于显示同一个图像;图像处理装置还包括第一DDIC和第二DDIC,其中,第一DDIC用于驱动第一显示屏,第二DDIC用于驱动第二显示屏,多个显示屏之间基于多个DDIC进行数据传输和沟通,由于SPR处理需要参考周围的像素,如果SPR处理在DDIC中,则对于多折叠屏的图像处理装置,多个DDIC之间存在数据交互,本申请实施例将SPR处理放在SOC侧,避免了多个DDIC之间存在数据交互,减少了芯片间的数据交互和数据处理,避免了电磁干扰(Electromagnetic Interference,EMI)和静电放电(Electrostatic Discharge,ESD)问题。
如图9所示,为本申请实施例提供的一种示例性的图像处理装置的硬件架构示意图。图像处理装置900的硬件架构可以适用于SOC和AP。
示例性的,该图像处理装置900包括至少一个中央处理单元(Central Processing Unit,CPU)、至少一个存储器、GPU、解码器、专用的视频或图形处理器、接收接口和发送接口等。可选的,图像处理装置900还可以包括微处理器和微控制器MCU等。在一种可选的情况中,图像处理装置900的上述各个部分通过连接器相耦合,应当理解,本申请的各个实施例中,耦合是指通过特定方式的相互联系,包括直接相连或者通过其他设备间接相连,例如可以通过各类接口、传输线或总线等相连,这些接口通常是电性通信接口,但是也不排除可能是机械接口或其它形式的接口,本实施例对此不做限定。在一种可选的情况中,上述各部分集成在同一个芯片上;在另一种可选的情况中,CPU、GPU、解码器、接收接口以及发送接口集成在一个芯片上,该芯片内部的各部分通过总线访问外部的存储器。专用视频/图形处理器可以与CPU集成在同一个芯片上,也可以作为单独的处理器芯片存在,例如专用视频/图形处理器可以为专用ISP。在本申请实施例中涉及的芯片是以集成电路工艺制造在同一个半导体衬底上的系统,也叫半导体芯片,其可以是利用集成电路工艺制作在衬底(通常是例如硅一类的半导体材料)上形成的集成电路的集合,其外层通常被半导体封装材料封装。所述集成电路可以包括各类功能器件,每一类功能器件包括逻辑门电路、金属氧化物半导体(Metal-Oxide-Semiconductor,MOS)晶体管、双极晶体管或二极管等晶体管,也可包括电容、电阻或电感等其他部件。每个功能器件可以独立工作或者在必要的驱动软件的作用下工作,可以实现通信、运算、或存储等各类功能。
可选的,CPU可以是一个单核(single-CPU)处理器或多核(multi-CPU)处理器;可选的,CPU可以是多个处理器构成的处理器组,多个处理器之间通过一个或多个总线彼此耦合。在一种可选的情况中,对于图像信号或视频信号的处理一部分由GPU完成,一部分由专用视频/图形处理器完成,还有可能由跑在通用CPU或GPU上的软件代码完成。
存储器,可用于存储计算机程序指令,包括操作系统(Operation System,OS)、各种用户应用程序、以及用于执行本申请方案的程序代码在内的各类计算机程序代码;存储器还可以用于存储视频数据、图像数据等;CPU可以用于执行存储器中存储的计算机程序代码,以实现本申请实施例中的方法。可选的,存储器可以是非掉电易失性存储器,例如是嵌入式多媒体卡(Embedded Multi Media Card,EMMC)、通用闪存存储(Universal Flash Storage,UFS)或只读存储器(Read-Only Memory,ROM),或者是可存储静态信息和指令的其他类型的静态存储设备,还可以是掉电易失性存储器(volatile memory),例如随机存取存储器(Random Access Memory,RAM)或者可存储信息和指令的其他类型的动态存储设备,也可以是电可擦可编程只读存储器(Electrically Erasable Programmable Read-Only Memory,EEPROM)、只读光盘(Compact Disc Read-Only Memory,CD-ROM)或其他光盘存储、光碟存储(包括压缩光碟、激光碟、光碟、数字通用光碟、蓝光光碟等)、磁盘存储介质或者其他磁存储设备、或者能够用于携带或存储具有指令或数据结构形式的程序代码并能够由计算机存取的任何其他计算机可读存储介质,但不限于此。
该接收接口可以为处理器芯片的数据输入的接口,在一种可选的情况下,该接收接口可以是MIPI、HDMI或Display Port(DP)等。
如图10所示,为本申请实施例提供的一种图像处理的方法流程示意图。在图10对应的方法示例中,输入的图像的格式为RGB,子像素渲染之后得到的SPR图像数据为RGBG图像,显示屏为SPA显示屏,应当理解,本申请实施例并不对输入的图像格式、SPR处理之后的图像格式等构成限定,例如输入的图像格式还可以为YUV、YCoCg、原始格式的raw图像等,SPR数据还可以是RGB delta或SPR1.5等。
该方法包括:
步骤一、对待显示图像进行子像素渲染SPR处理,得到SPR图像数据,该SPR图像数据为能够在SPA显示屏上正常显示的图像。示例性的,该待显示图像为RGB图像,SPR处理得到的图像为RGBG图像,RGBG图像每2个像素包含4个分量:一个R分量,一个B分量,两个G分量,RGBG图像包含的子像素个数小于RGB图像包含的子像素个数。
步骤二、对RGBG图像进行编码,得到第一数据流,示例性的,该第一数据流为比特流,第一数据流的数据量小于输入的RGBG图像的数据量;编码由编码器完成,编码器包含4个处理通道,对RGBG的2个像素的4个分量同时进行编码,每个通道处理一个分量,4个通道的处理结果共同构成第一数据流。
步骤三、对第一数据流进行封装,得到第二数据流。该第二数据流为与传输接口(包括AP的发送接口和DDIC的接收接口)以及传输协议相匹配的数据流。例如,当发送接口为MIPI发送器,接收接口为MIPI接收器,传输协议为MIPI标准的C-phy协议,则第二数据流为符合MIPI标准的数据流。示例性的,封装可以由AP的发送接口完成,也可以由封装模块完成,该封装模块可以为AP内一个专用的固化硬件逻辑。
步骤四、将第二数据流发送给DDIC。示例性的,通过AP的发送接口发送给DDIC的接收接口。
步骤五、对该第二数据流进行解封装,得到该第一数据流。示例性的,解封装可以由DDIC的接收接口完成,也可以由解封装模块完成,该解封装模块可以为DDIC内一个专用的固化硬件逻辑。解封装得到的第一数据流为解码器可以处理或识别的数据流。
步骤六、对第一数据流进行解码,得到能够在显示屏上显示的RGBG图像数据。
由于显示屏制造工艺的限制,显示屏存在mura现象,为了补偿显示屏的mura现象,可选的,该方法还包括:
步骤七、对解码得到的RGBG图像数据进行屏幕亮度补偿处理,得到补偿处理后的RGBG图像数据。示例性的,对解码得到的RGBG图像数据进行Demura处理,基于补偿数据对mura区域的像素进行补偿。
步骤八、驱动SPA显示屏,将补偿处理后的RGBG图像数据显示在显示屏上。
图10对应的方法实施例中,步骤一-步骤三由AP侧完成,步骤四由AP和DDIC配合完成,步骤五-步骤八由DDIC完成。
在一种可选的情况中,经过SPR处理后的图像为RGB delta图像。
对SPR图像进行编码,得到第一数据流,具体包括:
对RGB delta图像进行格式映射,得到映射RGB图像,应当理解,这里的格式映射前后,图像的子像素数的总量不会发生改变,并不会导致图像的数据量增加;
将RGB图像进行颜色空间转换,将RGB图像转换到亮色分离的YCoCg颜色空间,得到YCoCg图像,YCoCg图像包括3个分量Y、Co、Cg;
对YCoCg图像进行编码,得到该第一数据流。
此时,AP侧的编码器可以只有三个处理通道,该三个处理通道分别对3个分量Y、Co、Cg同时进行编码处理;或者AP侧的编码器有4个处理通道,该4个处理通道中的任意3个处理通道分别对3个分量Y、Co、Cg同时进行编码处理。
在一种可选的情况中,AP将第一数据流通过传输接口发送给DDIC;
对应的,DDIC在通过传输接口收到第一数据流之后,对第一数据流进行解码,得到的解码图像为YCoCg图像,该方法还包括:
DDIC将YCoCg图像的三个分量Y、Co、Cg转换为RGB图像的三个分量R、G、B;
将RGB图像转换为能够在SPA显示屏上显示的RGB delta图像。
本申请实施例中,当AP侧的编码器有4个处理通道时,既可以对4分量的输入同时进行编码处理,也可以对3分量的输入同时进行编码处理。
如图11所示,为本申请实施例提供的另一种图像处理的方法流程示意图。与图10所示的方法实施例相比,图11对应的方法实施例在AP中对子像素渲染得到的RGBG数据进行颜色空间转换,将RGBG数据转换到亮色分离的颜色空间,进一步降低了图像的数据量,节省传输带宽,且降低了图像的亮度和色度之间的相关性,在对图像进行编码、解码之后得到的图像失真更少,显示效果更佳。
该方法包括:
步骤一、对待显示图像进行子像素渲染处理,得到SPR图像数据。
示例性的,该待显示图像为RGB图像,该SPR图像为RGBG图像。应当理解,该待显示图像还可以是YUV、YCoCg、raw等格式,此时,可以对待显示图像进行颜色空间转换得到RGB图像,该SPR图像数据可以是RGB delta或者SPR1.5图像。可选的,步骤一可以由AP完成,或者可以由SOC上的其他通用或专用处理器完成,或者也可以由前述的子像素渲染模块、子像素渲染处理核或者子像素渲染处理固化硬件逻辑完成。
步骤二、对RGBG图像数据进行颜色空间转换,将RGBG图像转换到亮色分离的颜色空间,得到4个输出分量P0、P1、P2、P3,4个输出分量包括两个亮度信号和两个色度信号。应当理解,每2个像素对应4个输入分量和4个输出分量,以2个像素为一组输入,输入包括R、G、B、G四个输入分量,输出包括P0、P1、P2和P3四个输出分量,示例性的,可以将RGBG图像转换到YUV空间中,此时,4个输出分量为:U、Y、V、Y,颜色空间转换的结果可以为UYVY图像、YVYU图像、YUYV图像或VYUY图像等。可选的,步骤二可以由AP、SOC上的其他通用或专用处理器完成,或者可以由前述的颜色空间转换模块、专用的颜色空间转换集成电路、专用的颜色空间转换固化硬件核完成。
输出图像为亮度色度分离的图像信号,若色度信号在传输过程中受到干扰,在还原到RGB空间进行显示的时候不会影响图像的亮度,降低了图像的色度和亮度信号的相关性,并且转换之后的图像的数据量进一步减小,从而可进一步节省带宽。
步骤三、对4个输出分量进行编码,得到第一数据流。
编码由AP的编码器实现,编码器包括4条处理通道,分别用于处理步骤二得到的4个输出分量P0、P1、P2和P3,其中,一条通道处理一个分量。具体请参考图10中步骤二的描述,此处不再赘述。
可选的,该方法还可以包括:
步骤四、对第一数据流进行封装,得到第二数据流。具体请参考图10中步骤三的描述,此处不再赘述。可选的,封装可以由AP侧的发送接口完成,或者由AP侧的封装模块或专用的封装集成电路完成。
步骤五、将第二数据流发送给DDIC。示例性的,通过AP的发送接口发送给DDIC的接收接口。
示例性的,AP的发送接口请参考对发送接口2013的描述,此处不再赘述。
可选的,该方法还可以包括:
步骤六、对该第二数据流进行解封装,得到该第一数据流。具体请参考图10中步骤五的描述,此处不再赘述。可选的,解封装可以由DDIC的接收接口完成,或者由DDIC侧的解封装模块或者解封装专用集成电路完成。
步骤七、对第一数据流进行解码,得到解码图像,该解码图像位于亮色分离的颜色空间,该解码图像包括4个分量P0、P1、P2和P3。示例性的,该解码图像可以为UYVY图像、YVYU图像、YUYV图像或VYUY图像等。可选的,解码由DDIC侧的解码器完成。
步骤八、对该解码图像进行颜色空间转换,得到RGBG图像,RGBG图像可以显示在SPA显示屏上。可选的,该颜色空间转换可以由DDIC侧的颜色空间转换模块完成,该颜色空间转换模块为专用的集成电路或者专用的固化硬件逻辑。
可选的,该方法还可以包括:
步骤九、对RGBG图像数据进行屏幕亮度补偿处理,得到补偿处理后的RGBG图像数据。示例性的,对解码得到的RGBG图像数据进行Demura处理,基于补偿数据对mura区域的像素进行补偿。可选的,屏幕亮度补偿处理可以由DDIC侧的屏幕亮度补偿器完成。
步骤十、驱动SPA显示屏,将补偿处理后的RGBG图像数据显示在显示屏上。可选的,步骤十由DDIC侧的驱动器完成。
应当理解,为了便于理解,图10、11对应的方法实施例以步骤的形式对方法进行描述,但是在某些情况下,可以以不同于此处的顺序执行所描述的步骤。
本申请实施例还提供一种计算机可读存储介质,该计算机可读存储介质中存储有指令,当其在计算机或处理器上运行时,使得计算机或处理器执行本申请实施例提供的方法中的部分或全部功能。
本申请实施例还提供一种包含指令的计算机程序产品,当其在计算机或处理器上运行时,使得计算机或处理器执行本申请实施例提供的任一个方法。
以上所述实施例仅用以说明本申请的技术方案,而非对其限制;尽管参照前述实施例对本申请进行了详细的说明,本领域的普通技术人员应当理解:其依然可以对前述各实施例所记载的技术方案进行修改,或者对其中部分技术特征进行等同替换;而这些修改或者替换,并不使相应技术方案的本质脱离本申请各实施例技术方案的范围。

Claims (28)

  1. 一种图像数据处理的装置,其特征在于,所述装置包括:应用处理器AP和显示驱动器集成电路DDIC;
    所述AP,用于对原始待显示图像进行子像素渲染SPR,得到SPR图像,所述SPR图像能够在子像素排布SPA显示屏上显示,所述SPA显示屏的物理子像素的个数小于所述原始待显示图像的子像素的个数,所述SPR图像的子像素的个数小于所述原始待显示图像的子像素的个数;
    所述AP,还用于对所述SPR图像进行编码,得到第一数据流;
    所述AP,还用于通过传输接口将所述第一数据流发送给所述DDIC;
    所述DDIC,用于解码所述第一数据流,得到解码图像。
  2. 根据权利要求1所述的装置,其特征在于,所述AP包括:SPR集成电路或SPR固化硬件逻辑,
    所述SPR集成电路或SPR固化硬件逻辑,具体用于对所述原始待显示图像进行子像素渲染SPR,得到所述SPR图像。
  3. 根据权利要求1或2所述的装置,其特征在于,所述SPR图像为RGBG图像,所述AP还包括4个处理通道,用于:
    对所述RGBG图像的4个分量R、G、B、G分别进行编码处理,得到所述第一数据流,其中,所述4个处理通道中的每个处理通道分别处理一个分量。
  4. 根据权利要求3所述的装置,其特征在于,所述AP具体用于:
    将所述RGBG图像转换为亮色分离的SPR图像,所述亮色分离的SPR图像包括4个分量U、Y、V和Y,其中,所述U和所述V为色度信号分量,所述Y为亮度信号分量;
    所述4个处理通道,具体用于对所述亮色分离的SPR图像的4个分量U、Y、V和Y分别进行编码处理,得到所述第一数据流,其中,所述4个处理通道中的每个处理通道分别处理一个分量。
  5. 根据权利要求4所述的装置,其特征在于,所述解码图像包括所述4个分量U、Y、V和Y,所述DDIC还用于:
    将所述解码图像的四个分量U、Y、V和Y转换为RGBG图像的四个分量R、G、B、G。
  6. 根据权利要求1或2所述的装置,其特征在于,所述SPR图像为RGB delta图像,所述AP具体用于:
    对所述RGB delta图像进行格式映射,得到映射RGB图像,所述映射RGB图像的子像素的个数等于所述RGB delta图像的子像素的个数;
    将所述RGB图像转换到亮色分离的YCoCg颜色空间,得到YCoCg图像,所述YCoCg图像包括3个分量Y、Co、Cg;
    对所述YCoCg图像进行编码,得到所述第一数据流。
  7. 根据权利要求6所述的装置,其特征在于,所述解码图像为所述YCoCg图像,所述DDIC还用于:
    将所述YCoCg图像的三个分量Y、Co、Cg转换为所述RGB图像的三个分量R、G、B;
    将所述RGB图像转换为能够在所述SPA显示屏上显示的RGB delta图像。
  8. 根据权利要求1至7任一项所述的装置,其特征在于,所述AP集成在系统芯片SOC上,所述DDIC在所述SOC之外。
  9. 根据权利要求1至8任一项所述的装置,其特征在于,所述装置还包括:折叠屏,所述折叠屏包括第一显示屏和第二显示屏;
    所述DDIC包括:第一DDIC和第二DDIC,所述第一DDIC用于驱动所述第一显示屏,所述第二DDIC用于驱动所述第二显示屏。
  10. 根据权利要求1至9任一项所述的装置,其特征在于,所述传输接口为移动产业处理器接口MIPI、MIPI标准化的显示串行接口DSI或者视频电子标准协会VESA标准化的嵌入式显示端口eDP。
  11. 一种图像数据处理的装置,其特征在于,所述装置包括:处理器和传输接口;
    所述处理器用于:
    对原始待显示图像进行子像素渲染SPR,得到SPR图像,所述SPR图像能够在子像素排布SPA显示屏上显示,所述SPA显示屏的物理子像素的个数小于所述原始待显示图像的子像素的个数,所述SPR图像的子像素的个数小于所述原始待显示图像的子像素的个数;
    对所述SPR图像进行编码,得到第一数据流;
    通过所述传输接口发送所述第一数据流。
  12. 根据权利要求11所述的装置,其特征在于,所述SPR图像为RGBG图像,所述处理器还包括4个处理通道,用于对所述RGBG图像的4个分量R、G、B、G分别进行编码处理,得到所述第一数据流,其中,所述4个处理通道中的每个处理通道分别处理一个分量。
  13. 根据权利要求12所述的装置,其特征在于,所述处理器具体用于:
    将所述RGBG图像转换为亮色分离的SPR图像,所述亮色分离的SPR图像包括4个分量U、Y、V和Y,其中,所述U和所述V为色度信号分量,所述Y为亮度信号分量;
    所述4个处理通道,具体用于对所述亮色分离的SPR图像的4个分量U、Y、V和Y分别进行编码处理,得到所述第一数据流,其中,所述4个处理通道中的每个处理通道分别处理一个分量。
  14. 根据权利要求11所述的装置,其特征在于,所述SPR图像为RGB delta图像,所述处理器具体用于:
    对所述RGB delta图像进行格式映射,得到映射RGB图像,所述映射RGB图像的子像素的个数等于所述RGB delta图像的子像素的个数;
    将所述RGB图像转换到亮色分离的YCoCg颜色空间,得到YCoCg图像,所述YCoCg图像包括3个分量Y、Co、Cg;
    对所述YCoCg图像进行编码,得到所述第一数据流。
  15. 一种显示驱动器集成电路DDIC,其特征在于,所述DDIC包括:解码器和传输接口;
    所述传输接口,用于接收第一数据流,所述第一数据流中包括子像素渲染SPR图像,所述SPR图像能够在子像素排布SPA显示屏上显示,所述SPA显示屏的物理子像素的个数小于所述原始待显示图像的子像素的个数,所述SPR图像的子像素的个数小于所述原始待显示图像的子像素的个数;
    所述解码器,用于解码所述第一数据流,得到解码图像数据。
  16. 根据权利要求15所述的DDIC,其特征在于,所述SPR图像为RGBG图像,所述解码图像数据包括4个分量U、Y、V和Y,其中,所述U和所述V为色度信号分量,所述Y为亮度信号分量,所述DDIC还包括第一颜色空间转换处理集成电路;
    所述颜色空间转换处理集成电路,用于将所述解码图像的四个分量U、Y、V和Y转换为RGBG图像的四个分量R、G、B、G。
  17. 根据权利要求15所述的DDIC,其特征在于,所述SPR图像为RGB delta图像,所述解码图像数据为RGB图像,所述DDIC还包括:格式映射处理集成电路或格式映射处理固化硬件逻辑;
    所述格式映射处理集成电路或格式映射处理固化硬件逻辑,用于对所述RGB图像进行格式映射,得到所述RGB delta图像。
  18. 根据权利要求15至17任一项所述的DDIC,其特征在于,所述解码器包括4条处理通道。
  19. 根据权利要求15至18任一项所述的DDIC,其特征在于,所述DDIC包括第一子DDIC和第二子DDIC,所述第一子DDIC和所述第二子DDIC耦合。
  20. 一种图像数据处理的方法,其特征在于,所述方法包括:
    对原始待显示图像进行子像素渲染SPR,得到SPR图像,所述SPR图像能够在子像素排布SPA显示屏上显示,所述SPA显示屏的物理子像素的个数小于所述原始待显示图像的子像素的个数,所述SPR图像的子像素的个数小于所述原始待显示图像的子像素的个数;
    对所述SPR图像进行编码,得到第一数据流;
    通过传输接口发送所述第一数据流。
  21. 根据权利要求20所述的方法,其特征在于,所述SPR图像为RGBG图像,所述对所述SPR图像进行编码,得到第一数据流,具体包括:
    对所述RGBG图像的4个分量R、G、B、G分别进行编码处理,得到所述第一数据流。
  22. 根据权利要求21所述的方法,其特征在于,所述对所述SPR图像进行编码,得到第一数据流,具体包括:
    将所述RGBG图像转换为亮色分离的SPR图像,所述亮色分离的SPR图像包括4个分量U、Y、V和Y,其中,所述U和所述V为色度信号分量,所述Y为亮度信号分量;
    对所述亮色分离的SPR图像的4个分量U、Y、V和Y分别进行编码处理,得到所述第一数据流。
  23. 根据权利要求20所述的方法,其特征在于,所述SPR图像为RGB delta图像, 所述对所述SPR图像进行编码,得到第一数据流,具体包括:
    对所述RGB delta图像进行格式映射,得到映射RGB图像,所述映射RGB图像的子像素的个数等于所述RGB delta图像的子像素的个数;
    将所述RGB图像转换到亮色分离的YCoCg颜色空间,得到YCoCg图像,所述YCoCg图像包括3个分量Y、Co、Cg;
    对所述YCoCg图像进行编码,得到所述第一数据流。
  24. 一种图像数据处理的方法,其特征在于,所述方法包括:
    接收第一数据流,所述第一数据流中包括子像素渲染SPR图像,所述SPR图像能够在子像素排布SPA显示屏上显示,所述SPA显示屏的物理子像素的个数小于所述原始待显示图像的子像素的个数,所述SPR图像的子像素的个数小于所述原始待显示图像的子像素的个数;
    解码所述第一数据流,得到解码图像数据。
  25. 根据权利要求24所述的方法,其特征在于,所述SPR图像为RGBG图像,所述解码图像数据包括4个分量U、Y、V和Y,其中,所述U和所述V为色度信号分量,所述Y为亮度信号分量,所述方法还包括:
    将所述解码图像的四个分量U、Y、V和Y转换为RGBG图像的四个分量R、G、B、G。
  26. 根据权利要求24所述的方法,其特征在于,所述SPR图像为RGB delta图像,所述解码图像数据为RGB图像,所述方法还包括:
    对所述RGB图像进行格式映射,得到所述RGB delta图像。
  27. 一种计算机可读存储介质,所述计算机可读存储介质中存储有指令,当所述指令在计算机或处理器上运行时,使得所述计算机或处理器执行如权利要求20至23或者24至26任一项所述的方法。
  28. 一种包含指令的计算机程序产品,当其在计算机或处理器上运行时,使得所述计算机或处理器执行如权利要求20至23或者24至26任一项所述的方法。
PCT/CN2019/079208 2019-03-22 2019-03-22 一种图像数据处理的装置和方法 WO2020191516A1 (zh)
