CN111277862B - Video color gamut detection method and system based on embedded CPU - Google Patents


Info

Publication number
CN111277862B
Authority
CN
China
Prior art keywords
rgb
coordinate
coordinates
color gamut
xyz
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010123305.7A
Other languages
Chinese (zh)
Other versions
CN111277862A (en)
Inventor
袁三男
吴立新
刘志超
孙哲
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai University of Electric Power
Original Assignee
Shanghai University of Electric Power
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai University of Electric Power filed Critical Shanghai University of Electric Power
Priority to CN202010123305.7A priority Critical patent/CN111277862B/en
Publication of CN111277862A publication Critical patent/CN111277862A/en
Application granted granted Critical
Publication of CN111277862B publication Critical patent/CN111277862B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/20Servers specifically adapted for the distribution of content, e.g. VOD servers; Operations thereof
    • H04N21/23Processing of content or additional data; Elementary server operations; Server middleware
    • H04N21/234Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs
    • H04N21/23418Processing of video elementary streams, e.g. splicing of video streams or manipulating encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Processing Of Color Television Signals (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a video color gamut detection method and system based on an embedded CPU (central processing unit). A control module reads the RGB (red, green, blue) coordinates of the pixel points in the video frame to be detected and transmits them to a calculation module for processing; the calculation module normalizes the read RGB coordinates and converts the normalized coordinates into XYZ coordinate values using an XYZ color model; a comparison module compares the R, G and B coordinate values of the pixel points separately until the XYZ coordinate values converted from the maximum coordinates are obtained, and then informs the control module to stop reading the video frame; finally, the calculation module uses the GNTSC formula and the comparison results to obtain the color gamut of the current video frame. By processing the video stream on the CPU, the invention calculates the color gamut value of each video segment, monitors segments with a low color gamut so that they can be deleted or corrected, and thereby improves the visual experience of the audience.

Description

Video color gamut detection method and system based on embedded CPU
Technical Field
The invention relates to the technical field of image processing, in particular to a video color gamut detection method and system based on an embedded CPU.
Background
A color gamut is a way of encoding color and also refers to the complete set of colors that a technical system can produce. In computer graphics processing, a color gamut is a complete subset of colors, most commonly used to accurately characterize a given situation, such as a given color space or the gamut of an output device.
The color gamut denotes the range of colors that a given color representation can express. In general, the larger the color gamut, the richer the colors that can be represented. Different industries have different color requirements, so different color gamut standards exist. NTSC is the color television broadcast standard established by the National Television System Committee in December 1952. To improve the visual experience of the audience, real-time calculation and detection of the video color gamut is necessary when inspecting video.
Disclosure of Invention
This section is for the purpose of summarizing some aspects of embodiments of the invention and to briefly introduce some preferred embodiments. In this section, as well as in the abstract and the title of the invention of this application, simplifications or omissions may be made to avoid obscuring the purpose of the section, the abstract and the title, and such simplifications or omissions are not intended to limit the scope of the invention.
The present invention has been made in view of the above-mentioned conventional problems.
Therefore, the invention provides a video color gamut detection method based on an embedded CPU, which can delete low-color-gamut video from a video segment so that the audience obtains a better viewing experience.
In order to solve the above technical problem, the invention provides the following technical scheme: a control module reads the RGB coordinates of the pixel points in the video frame to be detected and transmits them to a calculation module for processing; the calculation module normalizes the read RGB coordinates and converts the normalized coordinates into XYZ coordinate values using an XYZ color model; a comparison module compares the R, G and B coordinate values of the pixel points separately until the XYZ coordinate values converted from the maximum coordinates are obtained, and then informs the control module to stop reading the video frame; and the calculation module uses the GNTSC formula and the comparison results to obtain the color gamut of the current video frame.
As a preferred scheme of the video color gamut detection method based on the embedded CPU according to the present invention: the comparison module compares the R, G and B coordinate values of the pixel points as follows: the comparison module compares the R coordinate of each pixel point until the XYZ coordinates Xr and Yr converted from the maximum red coordinate MR are obtained; the comparison module compares the G coordinate of each pixel point until the XYZ coordinates Xg and Yg converted from the maximum green coordinate MG are obtained; and the comparison module compares the B coordinate of each pixel point until the XYZ coordinates Xb and Yb converted from the maximum blue coordinate MB are obtained.
As a preferred scheme of the video color gamut detection method based on the embedded CPU according to the present invention: the reading of the RGB coordinates by the control module comprises decoding the video frame into a data matrix of each pixel point, solving the matrix values, and obtaining the RGB coordinate values.
As a preferred scheme of the video color gamut detection method based on the embedded CPU according to the present invention: decoding the video frame further comprises resolving each matrix value into a three-component value recorded as the RGB coordinate value; marking the R, G and B coordinates as the red, green and blue values of the pixel point; and combining the RGB coordinates of each pixel point into the matrix for the control module to read.
As a preferred scheme of the video color gamut detection method based on the embedded CPU according to the present invention: the normalization of the RGB coordinate values comprises the calculation module substituting the RGB coordinate values into the following formula, which eliminates the influence of magnitude and yields the corresponding color proportions:
[Formula image: R' = R/(R+G+B), G' = G/(R+G+B), B' = B/(R+G+B).]
wherein R', G', B' are the normalized RGB coordinate values and R, G, B are the RGB coordinate values of the pixel point; in RGB space the color depends on the proportions of R, G and B.
As a preferred scheme of the video color gamut detection method based on the embedded CPU in the present invention, wherein: comprising converting the normalized RGB coordinate values into the XYZ coordinate values using the XYZ color model as follows,
[Formula image: the converted coordinates x, y, z are obtained from r, g, b by a 3×3 RGB-to-XYZ conversion matrix; the coefficients appear only in the original figure.]
wherein x, y, z are the converted coordinate values and r, g, b are the normalized RGB coordinates.
As a preferred scheme of the video color gamut detection method based on the embedded CPU in the present invention, wherein: solving the color gamut of the current frame comprises substituting the Xr, Yr, Xg, Yg, Xb and Yb into the GNTSC formula, solving the NTSC color gamut of CIE1931 and GB21520-2015 standards, the GNTSC formula is as follows,
[Formula images: the chromaticity points (Xr, Yr), (Xg, Yg), (Xb, Yb) span a triangle on the chromaticity diagram, and GNTSC is obtained from this triangle relative to the NTSC reference triangle; the exact expressions appear only in the original figures.]
wherein GNTSC is the NTSC color gamut of the video frame; xr, yr are the XYZ coordinates of the pixel point corresponding to the maximum R coordinate in the RGB color system; xg, yg are those of the pixel point corresponding to the maximum G coordinate; and xb, yb are those of the pixel point corresponding to the maximum B coordinate.
As a preferred scheme of the video color gamut detection system based on the embedded CPU according to the present invention: the system comprises a control module, a calculation module and a comparison module. The control module reads the RGB coordinate values from the video frame whose color gamut is to be detected and comprises a decoding body, which decodes the video frame into the data matrix, and an assembly body, which combines the RGB coordinate values into the matrix. The calculation module is connected to the control module, processes the RGB coordinate values read by the control module, converts them into the XYZ coordinate values, and calculates the color gamut of the video frame. The comparison module compares the R, G and B coordinate values of the pixel points to obtain the XYZ coordinate values converted from the maximum coordinates.
The invention has the following beneficial effects: by processing the video stream on the CPU, the invention calculates the color gamut value of each video segment, monitors segments with a low color gamut so that they can be deleted or corrected, and thereby improves the visual experience of the audience.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without inventive effort. Wherein:
fig. 1 is a schematic flowchart of a video color gamut detection method based on an embedded CPU according to a first embodiment of the present invention;
fig. 2 is a schematic diagram illustrating a distribution of module structures of a video gamut detecting system based on an embedded CPU according to a second embodiment of the present invention.
Detailed Description
In order to make the aforementioned objects, features and advantages of the present invention comprehensible, specific embodiments accompanied with figures are described in detail below, and it is apparent that the described embodiments are a part of the embodiments of the present invention, not all of the embodiments. All other embodiments, which can be obtained by a person skilled in the art without making creative efforts based on the embodiments of the present invention, shall fall within the protection scope of the present invention.
In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention, but the present invention may be practiced in other ways than those specifically described and will be readily apparent to those of ordinary skill in the art without departing from the spirit of the present invention, and therefore the present invention is not limited to the specific embodiments disclosed below.
Furthermore, reference herein to "one embodiment" or "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment is included in at least one implementation of the invention. The appearances of the phrase "in one embodiment" in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments.
The present invention will be described in detail with reference to the drawings, wherein the cross-sectional views illustrating the structure of the device are not enlarged partially in general scale for convenience of illustration, and the drawings are only exemplary and should not be construed as limiting the scope of the present invention. In addition, the three-dimensional dimensions of length, width and depth should be included in the actual fabrication.
Meanwhile, in the description of the present invention, it should be noted that the terms "upper, lower, inner and outer" and the like indicate orientations or positional relationships based on the orientations or positional relationships shown in the drawings, and are only for convenience of describing the present invention and simplifying the description, but do not indicate or imply that the referred device or element must have a specific orientation, be constructed in a specific orientation and operate, and thus, cannot be construed as limiting the present invention. Furthermore, the terms first, second, or third are used for descriptive purposes only and are not to be construed as indicating or implying relative importance.
The terms "mounted, connected and connected" in the present invention are to be understood broadly, unless otherwise explicitly specified or limited, for example: can be fixedly connected, detachably connected or integrally connected; they may be mechanically, electrically, or directly connected, or indirectly connected through intervening media, or may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in specific cases to those skilled in the art.
Example 1
The color gamut denotes the range of colors that a given color representation can express. To solve for the color gamut of a video frame, a CPU processes the video stream to obtain the RGB coordinates of the pixels of the frame; the RGB coordinates are normalized and converted into XYZ coordinates; the RGB coordinates of the pixels are then compared one by one to obtain the XYZ coordinates converted from the maximum red coordinate MR, the maximum green coordinate MG and the maximum blue coordinate MB; and these maximum primary-color coordinates are used to obtain the NTSC color gamut under the CIE 1931 and GB 21520-2015 standards, i.e. the color gamut value of the video frame. The color gamut value obtained in this way can be used to monitor the display quality of the video's colors, and videos that do not meet the color gamut standard are corrected, improving the visual experience of the audience.
Referring to fig. 1, a first embodiment of the present invention provides a video gamut detection method based on an embedded CPU, including:
s1: the control module 100 reads the RGB coordinates of the pixel points in the video frame to be detected, and transmits the RGB coordinates to the calculation module 200 for processing. It should be noted that the reading of the RGB coordinates by the control module 100 includes:
Decoding the video frame into a data matrix of each pixel point, solving the matrix values, and obtaining the RGB coordinate values.
Further, decoding the video frame further comprises:
solving the matrix value to be 3bit and recording as RGB coordinate value;
respectively marking the R coordinate, the G coordinate and the B coordinate as red, green and blue numerical values of the pixel points;
the RGB coordinates of each pixel are combined into a matrix for the control module 100 to read.
S2: the calculation module 200 performs normalization processing on the read RGB coordinates, and converts the processed RGB coordinates into XYZ coordinate values using an XYZ color model. It should be noted that the normalization processing of the RGB coordinate values includes:
the calculation module 200 substitutes the RGB coordinate values into a formula for normalization processing to eliminate dimensional influence and obtain a corresponding color ratio, wherein the formula is as follows,
[Formula image: R' = R/(R+G+B), G' = G/(R+G+B), B' = B/(R+G+B).]
wherein R', G', B' are the normalized RGB coordinate values and R, G, B are the RGB coordinate values of the pixel point; in RGB space the color depends on the proportions of R, G and B.
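A minimal C sketch of this normalization step follows, continuing the sketch above; the proportional form R/(R+G+B) is an assumption based on the surrounding description, since the original formula is reproduced only as an image.

/* Normalize an RGB coordinate to proportions so that only the color ratio
 * remains; a black pixel (R + G + B == 0) is mapped to all zeros. */
static void normalize_rgb(uint8_t R, uint8_t G, uint8_t B,
                          double *r, double *g, double *b)
{
    double sum = (double)R + (double)G + (double)B;
    if (sum == 0.0) { *r = *g = *b = 0.0; return; }
    *r = R / sum;
    *g = G / sum;
    *b = B / sum;
}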
Further, the normalized RGB coordinate values are converted into XYZ coordinate values using an XYZ color model, as follows:
[Formula image: the converted coordinates x, y, z are obtained from r, g, b by a 3×3 RGB-to-XYZ conversion matrix; the coefficients appear only in the original figure.]
wherein x, y, z are the converted coordinate values and r, g, b are the normalized RGB coordinates.
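A sketch of the conversion step follows. The patent's conversion matrix is reproduced only as an image, so the widely used sRGB/BT.709 (D65) RGB-to-XYZ coefficients are substituted here as an assumption.

/* Convert normalized r, g, b into XYZ coordinate values with a 3x3 matrix.
 * The coefficients below are the sRGB (D65) matrix, used as a stand-in. */
static void rgb_to_xyz(double r, double g, double b,
                       double *x, double *y, double *z)
{
    *x = 0.4124 * r + 0.3576 * g + 0.1805 * b;
    *y = 0.2126 * r + 0.7152 * g + 0.0722 * b;
    *z = 0.0193 * r + 0.1192 * g + 0.9505 * b;
}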
S3: the comparison module 300 compares the obtained coordinates of the pixel point R, G, B respectively until the XYZ coordinates of the maximum coordinate corresponding to the transformation are obtained, and notifies the control module 100 to stop reading the video frame. It should be further explained that the comparing module 300 respectively compares the coordinate values of the pixel points R, G, B specifically includes:
the comparison module 300 compares the R coordinates of each pixel point until the XYZ coordinates Xr and Yr of the maximum red MR are obtained;
the comparison module 300 compares the G coordinates of each pixel point until XYZ coordinates Xg and Yg converted correspondingly to the maximum green coordinate MG are obtained;
the comparison module 300 compares the B coordinates of each pixel point until the converted XYZ coordinates Xb and Yb corresponding to the maximum blue coordinate MB are obtained.
S4: the calculation module 200 uses the GNTSC formula and the comparison result data to obtain the color gamut of the current video frame. It should be further noted that, the obtaining the color gamut of the current frame includes:
substituting Xr, Yr, Xg, Yg, Xb and Yb into the GNTSC formula to solve the NTSC color gamut of CIE1931 and GB21520-2015 standards, wherein the GNTSC formula is as follows,
[Formula images: the chromaticity points (Xr, Yr), (Xg, Yg), (Xb, Yb) span a triangle on the chromaticity diagram, and GNTSC is obtained from this triangle relative to the NTSC reference triangle; the exact expressions appear only in the original figures.]
wherein GNTSC is the NTSC color gamut of the video frame; xr, yr are the XYZ coordinates of the pixel point corresponding to the maximum R coordinate in the RGB color system; xg, yg are those of the pixel point corresponding to the maximum G coordinate; and xb, yb are those of the pixel point corresponding to the maximum B coordinate.
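A sketch of the gamut calculation follows, reusing chroma_t from the previous sketch. The exact GNTSC formula is reproduced only as an image, so the ratio of the triangle area spanned by (Xr, Yr), (Xg, Yg), (Xb, Yb) to the area of the NTSC reference triangle is used here as an assumption.

#include <math.h>

/* Area of the triangle spanned by three chromaticity points. */
static double triangle_area(chroma_t a, chroma_t b, chroma_t c)
{
    return 0.5 * fabs(a.x * (b.y - c.y) + b.x * (c.y - a.y) + c.x * (a.y - b.y));
}

/* NTSC gamut value: area of the frame's triangle over the NTSC triangle. */
static double gamut_ntsc(chroma_t cr, chroma_t cg, chroma_t cb)
{
    /* CIE 1931 chromaticities of the NTSC primaries. */
    const chroma_t nr = {0.670, 0.330}, ng = {0.210, 0.710}, nb = {0.140, 0.080};
    return triangle_area(cr, cg, cb) / triangle_area(nr, ng, nb);
}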
Preferably, whereas the traditional color gamut detection method requires a color analyzer and its detection effect is not ideal, so that color quality and the viewing experience cannot be improved well, the present method adopts an embedded CPU (HI3798CV200, with an integrated quad-core 64-bit high-performance Cortex-A53 processor) supporting 4K2KP60@10bit ultra-high-definition video decoding and display. It calculates the color gamut value of each video segment according to the NTSC calculation standard, monitors low-color-gamut video, and deletes or corrects it so that the audience obtains a better viewing experience.
Preferably, in order to verify and explain the technical effects of the method of the present invention, the experimental results are compared by scientific demonstration to verify its actual effect. To verify that the method improves the visual experience of the audience, in this embodiment a color gamut calculation program is used to measure the color gamut of a segment of video under test: the automatic test equipment is started, the video is fed into the color gamut calculation program, the color gamut is calculated from the video-stream data decoded by the decoder, and low color gamut values are deleted. The test results are shown in the following table:
table 1: and (5) a color gamut test table.
Xr,Yr Xg,Yg Xb,Yb CI1931-NTSC GB21520-NTSC
0.670,0.330 0.210,0.710 0.140,0.080 0.1233 0.1321
0.325,0.221 0.123,0.654 0.440,0.125 0.1512 0.1356
0.216,0.652 0.225,0.321 0.325,0.587 0.2216 0.2311
0.561,0.332 0.256,0.114 0.234,0.255 0.3422 0.3523
0.728,0.225 0.652,0.415 0.112,0.332 0.5126 0.4121
Referring to Table 1, the CIE1931-NTSC and GB21520-NTSC color gamut values are in ascending order, i.e. the displayed color space becomes increasingly large; this result visually verifies that the method of the present invention can improve color display and the viewing experience of the audience.
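For illustration, the sketches above can be tied together as a small driver; the low-gamut threshold and the reporting format are assumptions, since the patent does not specify them.

#include <stdio.h>

/* Color gamut of one decoded frame, using the helpers sketched above. */
static double frame_gamut_ntsc(const frame_t *f)
{
    chroma_t cr, cg, cb;
    find_max_primaries(f, &cr, &cg, &cb);
    return gamut_ntsc(cr, cg, cb);
}

/* Report the gamut and flag a frame whose gamut falls below a threshold so
 * that the corresponding segment can be deleted or corrected. */
static void check_frame(const frame_t *f, double threshold)
{
    double g = frame_gamut_ntsc(f);
    printf("NTSC gamut = %.4f%s\n", g,
           g < threshold ? "  (low gamut: delete or correct)" : "");
}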
Example 2
Referring to fig. 2, a second embodiment of the present invention, different from the first embodiment, provides a video color gamut detection system based on an embedded CPU, which comprises a control module 100, a calculation module 200 and a comparison module 300.
the control module 100 is configured to read RGB coordinate values in a video frame of a color gamut to be detected, and includes a decoding body 101 and an assembly body 102, where the decoding body 101 is configured to decode the video frame into a data matrix, and the assembly body 102 is configured to combine the RGB coordinate values into a matrix.
The calculation module 200 is connected to the control module 100, and is configured to calculate and process the RGB coordinate values read by the control module 100, convert the RGB coordinate values into XYZ coordinate values, and calculate the color gamut of the video frame.
The comparison module 300 is configured to compare the R, G and B coordinate values of the pixel points to obtain the XYZ coordinate values converted from the maximum coordinates.
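Purely as an illustration of this module split (not an interface defined by the patent), the three modules could be represented as follows, reusing the types from the sketches in the first embodiment; all structure and member names are assumptions.

/* Control module 100: decoding body 101 and assembly body 102. */
typedef struct {
    int  (*decode)(const uint8_t *stream, size_t len, frame_t *out);  /* 101 */
    void (*assemble)(frame_t *frame);                                 /* 102 */
} control_module;

/* Calculation module 200: normalization, XYZ conversion, gamut calculation. */
typedef struct {
    void   (*normalize)(uint8_t R, uint8_t G, uint8_t B,
                        double *r, double *g, double *b);
    void   (*to_xyz)(double r, double g, double b,
                     double *x, double *y, double *z);
    double (*gamut)(chroma_t cr, chroma_t cg, chroma_t cb);
} calculation_module;

/* Comparison module 300: finds the maximum-coordinate pixels. */
typedef struct {
    void (*find_max)(const frame_t *f,
                     chroma_t *cr, chroma_t *cg, chroma_t *cb);
} comparison_module;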
It should be recognized that embodiments of the present invention can be realized and implemented by computer hardware, a combination of hardware and software, or by computer instructions stored in a non-transitory computer readable memory. The methods may be implemented in a computer program using standard programming techniques, including a non-transitory computer-readable storage medium configured with the computer program, where the storage medium so configured causes a computer to operate in a specific and predefined manner, according to the methods and figures described in the detailed description. Each program may be implemented in a high level procedural or object oriented programming language to communicate with a computer system. However, the program(s) can be implemented in assembly or machine language, if desired. In any case, the language may be a compiled or interpreted language. Furthermore, the program can be run on a programmed application specific integrated circuit for this purpose.
Further, the operations of processes described herein can be performed in any suitable order unless otherwise indicated herein or otherwise clearly contradicted by context. The processes described herein (or variations and/or combinations thereof) may be performed under the control of one or more computer systems configured with executable instructions, and may be implemented as code (e.g., executable instructions, one or more computer programs, or one or more applications) collectively executed on one or more processors, by hardware, or combinations thereof. The computer program includes a plurality of instructions executable by one or more processors.
Further, the method may be implemented in any type of computing platform operatively connected to a suitable interface, including but not limited to a personal computer, mini computer, mainframe, workstation, networked or distributed computing environment, separate or integrated computer platform, or in communication with a charged particle tool or other imaging device, and the like. Aspects of the invention may be embodied in machine-readable code stored on a non-transitory storage medium or device, whether removable or integrated into a computing platform, such as a hard disk, optically read and/or write storage medium, RAM, ROM, or the like, such that it may be read by a programmable computer, which when read by the storage medium or device, is operative to configure and operate the computer to perform the procedures described herein. Further, the machine-readable code, or portions thereof, may be transmitted over a wired or wireless network. The invention described herein includes these and other different types of non-transitory computer-readable storage media when such media include instructions or programs that implement the steps described above in conjunction with a microprocessor or other data processor. The invention also includes the computer itself when programmed according to the methods and techniques described herein. A computer program can be applied to input data to perform the functions described herein to transform the input data to generate output data that is stored to non-volatile memory. The output information may also be applied to one or more output devices, such as a display. In a preferred embodiment of the invention, the transformed data represents physical and tangible objects, including particular visual depictions of physical and tangible objects produced on a display.
As used in this application, the terms "component," "module," "system," and the like are intended to refer to a computer-related entity, either hardware, firmware, a combination of hardware and software, or software in execution. For example, a component may be, but is not limited to being: a process running on a processor, an object, an executable, a thread of execution, a program, and/or a computer. By way of example, both an application running on a computing device and the computing device can be a component. One or more components can reside within a process and/or thread of execution and a component can be localized on one computer and/or distributed between two or more computers. In addition, these components can execute from various computer readable media having various data structures thereon. The components may communicate by way of local and/or remote processes such as in accordance with a signal having one or more data packets (e.g., data from one component interacting with another component in a local system, distributed system, and/or across a network such as the internet with other systems by way of the signal).
It should be noted that the above-mentioned embodiments are only for illustrating the technical solutions of the present invention and not for limiting, and although the present invention has been described in detail with reference to the preferred embodiments, it should be understood by those skilled in the art that modifications or equivalent substitutions may be made on the technical solutions of the present invention without departing from the spirit and scope of the technical solutions of the present invention, which should be covered by the claims of the present invention.

Claims (1)

1. A video color gamut detection method based on an embedded CPU, characterized in that it comprises the following steps:
the control module (100) reads the RGB coordinates of the pixel points in the video frame to be detected and transmits the RGB coordinates to the calculation module (200) for processing;
the calculation module (200) normalizes the read RGB coordinates and converts the normalized coordinates into XYZ coordinate values using an XYZ color model;
the comparison module (300) compares the R, G and B coordinate values of the pixel points separately until the XYZ coordinate values converted from the maximum coordinates are obtained, and informs the control module (100) to stop reading the video frame;
the calculation module (200) utilizes a GNTSC formula and comparison result data to obtain the color gamut of the current video frame;
wherein the comparison module (300) compares the R, G and B coordinate values of the pixel points as follows:
the comparison module (300) compares the R coordinate of each pixel point until the XYZ coordinates Xr and Yr converted from the maximum red coordinate MR are obtained;
the comparison module (300) compares the G coordinate of each pixel point until the XYZ coordinates Xg and Yg converted from the maximum green coordinate MG are obtained;
the comparison module (300) compares the B coordinate of each pixel point until the XYZ coordinates Xb and Yb converted from the maximum blue coordinate MB are obtained;
the control module (100) reading the RGB coordinates includes,
decoding the video frame into a data matrix of each pixel point, solving a matrix value, and obtaining RGB coordinate values;
decoding the video frame further comprises,
resolving each matrix value into a three-component value, recorded as the RGB coordinate value;
respectively marking the R coordinate, the G coordinate and the B coordinate as red, green and blue numerical values of the pixel points;
combining the RGB coordinates of each pixel point into a matrix for reading by the control module (100);
the normalization processing of the RGB coordinate values includes,
the calculation module (200) substitutes the RGB coordinate values into the following formula, which eliminates the influence of magnitude and yields the corresponding color proportions,
[Formula image: R' = R/(R+G+B), G' = G/(R+G+B), B' = B/(R+G+B).]
wherein R', G', B' are the normalized RGB coordinate values and R, G, B are the RGB coordinate values of the pixel point; in RGB space the color depends on the proportions of R, G and B;
converting the normalized RGB coordinate values into the XYZ coordinate values using the XYZ color model, as follows,
[Formula image: the converted coordinates x, y, z are obtained from r, g, b by a 3×3 RGB-to-XYZ conversion matrix; the coefficients appear only in the original figure.]
wherein x, y, z are the converted coordinate values and r, g, b are the normalized RGB coordinates;
the deriving of the color gamut for the current frame comprises,
substituting Xr, Yr, Xg, Yg, Xb and Yb into the GNTSC formula to solve the NTSC color gamut under the CIE 1931 and GB 21520-2015 standards, wherein the GNTSC formula is as follows,
[Formula images: the chromaticity points (Xr, Yr), (Xg, Yg), (Xb, Yb) span a triangle on the chromaticity diagram, and GNTSC is obtained from this triangle relative to the NTSC reference triangle; the exact expressions appear only in the original figures.]
wherein GNTSC is the NTSC color gamut of the video frame; xr, yr are the XYZ coordinates of the pixel point corresponding to the maximum R coordinate in the RGB color system; xg, yg are those of the pixel point corresponding to the maximum G coordinate; and xb, yb are those of the pixel point corresponding to the maximum B coordinate.
CN202010123305.7A 2020-02-27 2020-02-27 Video color gamut detection method and system based on embedded CPU Active CN111277862B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010123305.7A CN111277862B (en) 2020-02-27 2020-02-27 Video color gamut detection method and system based on embedded CPU

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010123305.7A CN111277862B (en) 2020-02-27 2020-02-27 Video color gamut detection method and system based on embedded CPU

Publications (2)

Publication Number Publication Date
CN111277862A CN111277862A (en) 2020-06-12
CN111277862B true CN111277862B (en) 2021-11-16

Family

ID=70999361

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010123305.7A Active CN111277862B (en) 2020-02-27 2020-02-27 Video color gamut detection method and system based on embedded CPU

Country Status (1)

Country Link
CN (1) CN111277862B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2022067615A1 (en) * 2020-09-30 2022-04-07 京东方科技集团股份有限公司 Video color gamut analysis and display method, apparatus, system, and computer device
CN113628286B (en) * 2021-08-09 2024-03-22 咪咕视讯科技有限公司 Video color gamut detection method, device, computing equipment and computer storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714771A (en) * 2012-10-03 2014-04-09 索尼公司 Image display unit, method of driving image display unit, signal generator, signal generation program, and signal generation method
CN109243365A (en) * 2018-09-20 2019-01-18 合肥鑫晟光电科技有限公司 Display methods, the display device of display device

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6775326B2 (en) * 2016-05-13 2020-10-28 シナプティクス・ジャパン合同会社 Color adjustment method, color adjustment device and display system

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103714771A (en) * 2012-10-03 2014-04-09 索尼公司 Image display unit, method of driving image display unit, signal generator, signal generation program, and signal generation method
CN109243365A (en) * 2018-09-20 2019-01-18 合肥鑫晟光电科技有限公司 Display methods, the display device of display device

Also Published As

Publication number Publication date
CN111277862A (en) 2020-06-12

Similar Documents

Publication Publication Date Title
EP3796648A1 (en) Method and device for video signal processing
CN111277862B (en) Video color gamut detection method and system based on embedded CPU
US10490157B2 (en) Compression of distorted images for head-mounted display
US8559709B1 (en) Method and apparatus for progressive encoding for text transmission
US11120725B2 (en) Method and apparatus for color gamut mapping color gradient preservation
US10750188B2 (en) Method and device for dynamically monitoring the encoding of a digital multidimensional signal
US20100129001A1 (en) Method, device and program for measuring image quality adjusting ability, and method, device and program for adjusting image quality
KR20090013934A (en) Method and system of immersive generation for two-dimension still image and factor dominating method, image content analysis method and scaling parameter prediction method for generating immersive
US20220237754A1 (en) Image processing method and apparatus
US20110032984A1 (en) Methods circuits and systems for transmission of video
JP4777185B2 (en) Image processing apparatus and control method thereof, computer program, computer-readable storage medium, and image encoding apparatus
US10152945B2 (en) Image processing apparatus capable of performing conversion on input image data for wide dynamic range
US20170339408A1 (en) Content providing apparatus, display apparatus, and control method therefor
US20240046836A1 (en) Image processing methods and apparatuses, electronic devices and storage media
US20140333654A1 (en) Image color adjusting method and electronic device using the same
US8861850B2 (en) Digital image color correction
CN107534763A (en) Adaptive color levels interpolation method and equipment
JP6702602B2 (en) Self image diagnostic method, self image diagnostic program, display device, and self image diagnostic system
CN109685861B (en) Picture compression method, device and equipment and computer readable storage medium
US20130258199A1 (en) Video processor and video processing method
US20230086245A1 (en) Image processing method, electronic device, and image display system
US20140375672A1 (en) Image processing apparatus, image adjustment system, image processing method, and recording medium
US10735703B2 (en) Electronic device and associated image processing method
US20220358623A1 (en) Saturation enhancement method and device, and computer readable storage medium
KR20090060029A (en) Methdo and system of immersive enhancement for video sequence displaying

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant