CN113613007A - Three-dimensional color lookup table generation method and display device - Google Patents

Three-dimensional color lookup table generation method and display device

Info

Publication number
CN113613007A
CN113613007A (application CN202110813079.XA)
Authority
CN
China
Prior art keywords
color space
color
pixel point
processed
pixel
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110813079.XA
Other languages
Chinese (zh)
Other versions
CN113613007B (en)
Inventor
郝才研 (Hao Caiyan)
查林 (Zha Lin)
庞曼伟 (Pang Manwei)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Qingdao Xinxin Microelectronics Technology Co Ltd
Original Assignee
Qingdao Xinxin Microelectronics Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Qingdao Xinxin Microelectronics Technology Co Ltd filed Critical Qingdao Xinxin Microelectronics Technology Co Ltd
Priority to CN202110813079.XA priority Critical patent/CN113613007B/en
Publication of CN113613007A publication Critical patent/CN113613007A/en
Application granted granted Critical
Publication of CN113613007B publication Critical patent/CN113613007B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/186Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a colour or a chrominance component
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/10Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding
    • H04N19/169Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding
    • H04N19/182Methods or arrangements for coding, decoding, compressing or decompressing digital video signals using adaptive coding characterised by the coding unit, i.e. the structural portion or semantic portion of the video signal being the object or the subject of the adaptive coding the unit being a pixel
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/42Methods or arrangements for coding, decoding, compressing or decompressing digital video signals characterised by implementation details or hardware specially adapted for video compression or decompression, e.g. dedicated software implementation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N19/00Methods or arrangements for coding, decoding, compressing or decompressing digital video signals
    • H04N19/44Decoders specially adapted therefor, e.g. video decoders which are asymmetric with respect to the encoder

Abstract

The application relates to the technical field of image data processing and discloses a method for generating a three-dimensional color lookup table (3D LUT) and a display device. The method first constructs color characteristic data of a display panel, where the color characteristic data describes the mapping relation between a first color space and a second color space when colors of the first color space are displayed through the display panel. The first color space is device-dependent; the second color space is device-independent and describes the result of displaying colors of the first color space on the panel. A mapping relation between a third color space and the first color space is then constructed based on a specified mapping relation between the third color space and the second color space, yielding the 3D LUT, so that the 3D LUT can be generated automatically.

Description

Three-dimensional color lookup table generation method and display device
Technical Field
The present application relates to the field of image data processing technologies, and in particular, to a method for generating a three-dimensional color lookup table and a display device.
Background
Color gamut refers to the range of colors that a display device can render. In general, the wider the color gamut of a display panel, the more colors it can present and the more vivid its displayed image. As display panel technology develops, panel color gamuts keep widening, the variety of reproducible colors grows, and user demand for wide color gamut continues to rise.
When a display panel displays colors, a 3D LUT (three-dimensional look-up table) is usually required for the colors to be displayed correctly. The present application proposes a new method of building a 3D LUT.
Disclosure of Invention
The embodiments of the application provide a method for generating a three-dimensional color lookup table and a display device, which provide a new way of establishing a 3D LUT.
In a first aspect, an embodiment of the present application provides a method for generating a three-dimensional color lookup table, where the method includes:
constructing color characteristic data of a display panel, wherein the color characteristic data is used for describing a mapping relation between a first color space and a second color space when the color of the first color space is displayed through the display panel; the first color space is a device-dependent color space, the second color space is a device-independent color space, and the second color space is used for describing a result of displaying colors of the first color space through the display panel;
and constructing a mapping relation between a third color space and the first color space based on a specified mapping relation between the third color space and the second color space to obtain a three-dimensional color look-up table (3D LUT).
In some possible embodiments, the constructing the color characteristic data of the display panel includes:
sampling the first color space to obtain a plurality of test points;
outputting each test point in the plurality of test points to the display panel for display, and measuring the color value of the display result of each test point in the second color space;
and constructing the color characteristic data based on the corresponding relation between the color value of each test point in the first color space and the color value of each test point in the second color space.
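The three steps above can be sketched in Python. The helper names and the measurement matrix are assumptions: a real characterization drives the panel with each test point and reads a colorimeter, whereas here a fixed sRGB-like RGB-to-XYZ matrix stands in for the instrument.

```python
import numpy as np

def sample_rgb_cube(steps=5, bit_depth=8):
    """Uniformly sample the device-dependent (first) color space."""
    levels = np.linspace(0, 2 ** bit_depth - 1, steps).round().astype(int)
    grid = np.stack(np.meshgrid(levels, levels, levels, indexing="ij"), axis=-1)
    return grid.reshape(-1, 3)  # one RGB test point per row

def measure_xyz(rgb, bit_depth=8):
    """Stand-in for the colorimeter: a fixed sRGB-like RGB->XYZ matrix."""
    m = np.array([[0.4124, 0.3576, 0.1805],
                  [0.2126, 0.7152, 0.0722],
                  [0.0193, 0.1192, 0.9505]])
    return (rgb / (2 ** bit_depth - 1)) @ m.T

def build_characteristic_data(steps=5):
    """Map each device RGB test point to its measured XYZ display result."""
    test_points = sample_rgb_cube(steps)
    xyz = measure_xyz(test_points)
    return {tuple(p): tuple(v) for p, v in zip(test_points, xyz.round(4))}

data = build_characteristic_data()
```

With 5 levels per channel this yields 125 test points, each paired with the second-color-space value its display produces.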
In some possible embodiments, the third color space comprises N pixel points to be processed, where N depends on the requirements of the 3D LUT; and the constructing a mapping relationship between the third color space and the first color space based on a specified mapping relationship between the third color space and the second color space to obtain the three-dimensional color look-up table (3D LUT) comprises:
determining the color value of each pixel point to be processed in the second color space based on the designated mapping relation;
for each pixel point to be processed, screening out, from the pixel points in the second color space recorded in the color characteristic data, the pixel point closest to the color value of the pixel point to be processed, as a target pixel point;
determining a pixel point of a first color space corresponding to the target pixel point as a mapping pixel point of the pixel point to be processed based on the color characteristic data;
and constructing a corresponding relation between each pixel point to be processed and the corresponding mapping pixel point to obtain the 3D LUT.
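A minimal sketch of this lookup construction, assuming the characteristic data is given as parallel arrays of device RGB values and their measured second-color-space (e.g., XYZ) values. It uses a brute-force nearest-neighbor search over the full color value rather than the two-stage luminance/chrominance screen the later embodiments refine it to; all names and the demo data are illustrative.

```python
import numpy as np

def build_3dlut(char_rgb, char_xyz, grid_xyz, grid_shape):
    """For each pixel point to be processed (given by its value in the
    device-independent space), pick the characterized point whose measured
    value is nearest and map back to that point's device RGB value."""
    lut = np.empty(grid_shape + (3,), dtype=char_rgb.dtype)
    flat_out = lut.reshape(-1, 3)
    for i, target in enumerate(grid_xyz.reshape(-1, 3)):
        d = np.sum((char_xyz - target) ** 2, axis=1)  # squared distances
        flat_out[i] = char_rgb[np.argmin(d)]
    return lut

# Tiny demo: two characterized points and a 1x1x2 grid of target values.
char_rgb = np.array([[0, 0, 0], [255, 255, 255]])
char_xyz = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
grid_xyz = np.array([[[[0.1, 0.1, 0.1], [0.9, 0.9, 0.9]]]])
lut = build_3dlut(char_rgb, char_xyz, grid_xyz, (1, 1, 2))
```

Each LUT entry thus stores the device RGB value whose displayed result best reproduces the target color.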
In some possible embodiments, the screening out, from the pixel points in the second color space recorded in the color characteristic data, of the pixel point closest to the color value of the pixel point to be processed as a target pixel point comprises:
screening out, from the pixel points of the second color space recorded in the color characteristic data, the pixel points whose brightness difference from the pixel point to be processed in the second color space is smaller than a preset brightness threshold, as candidate pixel points;
and screening out, from the candidate pixel points, the pixel point closest to the color value of the pixel point to be processed based on its chromaticity difference from the pixel point to be processed in the second color space, as the target pixel point of the pixel point to be processed.
In some possible embodiments, the preset brightness threshold is a fixed brightness threshold; or the preset brightness threshold value is positively correlated with the brightness value of the pixel point to be processed.
In some possible embodiments, the color values of the second color space are described in terms of luminance and chrominance, the chrominance comprising a first chrominance component and a second chrominance component;
before screening out, from the candidate pixel points, the pixel point closest to the color value of the pixel point to be processed based on its chromaticity difference from the pixel point to be processed in the second color space as the target pixel point, the method further comprises:
initializing a residual of the first chrominance component and a residual of the second chrominance component;
the screening out, from the candidate pixel points, of the pixel point closest to the color value of the pixel point to be processed based on its chromaticity difference from the pixel point to be processed in the second color space, as the target pixel point of the pixel point to be processed, comprises:
traversing each candidate pixel point; if a first difference value, between the first chrominance component of any candidate pixel point and that of the pixel point to be processed, and a second difference value, between their second chrominance components, meet a residual update condition, updating the residual of the first chrominance component and the residual of the second chrominance component, wherein the residual update condition is used to reduce the residuals;
and adopting, as the target pixel point, the candidate pixel point corresponding to the final residual of the first chrominance component and the final residual of the second chrominance component.
In some possible embodiments, the residual update condition comprises any one of the following conditions:
Condition 1: |x' - x| < Diff_x, and |y' - y| < Diff_y;
Condition 2: |x' - x| < Diff_x, and ||Diff_x| - |x' - x|| >= ||Diff_y| - |y' - y||, and ||x' - x| - |y' - y|| < ||Diff_x| - |Diff_y||;
Condition 3: |y' - y| < Diff_y, and ||Diff_y| - |y' - y|| >= ||Diff_x| - |x' - x||, and ||x' - x| - |y' - y|| < ||Diff_x| - |Diff_y||;
wherein x is the first chrominance component of the pixel point to be processed, x' is the first chrominance component of the candidate pixel point, Diff_x is the residual of the first chrominance component, y is the second chrominance component of the pixel point to be processed, y' is the second chrominance component of the candidate pixel point, and Diff_y is the residual of the second chrominance component.
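Putting the screening steps together, a sketch in plain Python, assuming color values are given in xyY form (x, y chrominance, Y luminance). Only Condition 1 (both residuals shrink) is implemented; Conditions 2 and 3 act as tie-breaking refinements and are omitted here, so this is a simplification of the patented traversal.

```python
def screen_target(pixel, candidates, y_thresh):
    """Two-stage screen: a brightness pre-filter, then a traversal that
    keeps shrinking the chrominance residuals (Condition 1 only)."""
    x, y, Y = pixel
    diff_x = diff_y = float("inf")   # initialize the residuals
    target = None
    for cand in candidates:
        cx, cy, cY = cand
        if abs(cY - Y) >= y_thresh:  # brightness difference too large:
            continue                 # not a candidate pixel point
        dx, dy = abs(cx - x), abs(cy - y)
        if dx < diff_x and dy < diff_y:  # Condition 1: both residuals shrink
            diff_x, diff_y = dx, dy
            target = cand
    return target

pixel = (0.30, 0.30, 0.50)                 # (x, y, Y) to be processed
candidates = [(0.40, 0.40, 0.50),          # chrominance farther away
              (0.31, 0.31, 0.50),          # chrominance nearest
              (0.30, 0.30, 0.95)]          # fails the brightness filter
best = screen_target(pixel, candidates, y_thresh=0.1)
```

Because the residuals only ever shrink, the traversal ends at the candidate whose chrominance is nearest among those passing the brightness filter.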
In some possible embodiments, the sampling the first color space comprises:
uniformly sampling the first color space; or,
sampling the first color space according to a specified rule, wherein the specified rule includes one or more of: covering the brightest region of the first color space, and covering the darkest region of the first color space.
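Both sampling options can be sketched along one axis of the color space as follows; the exact densification rule for the darkest and brightest regions is an illustrative assumption, since the embodiment only requires that those regions be covered.

```python
import numpy as np

def sample_uniform(steps, max_code=255):
    """Evenly spaced codes across one axis of the first color space."""
    return np.linspace(0, max_code, steps).round().astype(int)

def sample_with_rule(steps, max_code=255, dense_ends=3):
    """Specified-rule sampling: add extra points in the darkest and
    brightest regions (the densification rule here is illustrative)."""
    mid = np.linspace(0, max_code, steps)
    dark = np.linspace(0, max_code // 8, dense_ends)
    bright = np.linspace(max_code - max_code // 8, max_code, dense_ends)
    codes = np.concatenate([dark, mid, bright]).round().astype(int)
    return np.unique(codes)
```

Denser sampling near black and white helps because panel response is typically least linear at the extremes of its luminance range.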
In some possible embodiments, the constructing the color characteristic data based on a correspondence between color values of the test points in the first color space and color values of the test points in the second color space includes:
for each test point, establishing a corresponding relation between the test point and a color value of a second color space output by the test point through the display panel;
for pixel points in the first color space other than the test points, determining the color value of the pixel point in the second color space by linear interpolation using the color values of the test points in the second color space, and constructing a correspondence between the pixel point and the corresponding color value of the second color space.
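As a one-dimensional illustration of this interpolation step, consider hypothetical measurements along the gray axis; intermediate codes are filled in by linear interpolation as the embodiment describes (the full method applies the same idea per grid cell in three dimensions, i.e., trilinear interpolation). The measured values below are invented for the example.

```python
import numpy as np

# Hypothetical test-point measurements along the gray axis: device code
# versus measured luminance in the second color space.
test_codes = np.array([0, 64, 128, 191, 255])
measured_Y = np.array([0.0, 0.05, 0.21, 0.50, 1.00])

def interp_Y(code):
    """Luminance of a non-test code by linear interpolation."""
    return float(np.interp(code, test_codes, measured_Y))
```

For example, code 96 lies halfway between the test codes 64 and 128, so its interpolated luminance is the midpoint of their measured values.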
In a second aspect, an embodiment of the present application provides a display device, including: a display panel and a controller;
the display panel is used for displaying images;
the controller configured to perform any of the methods of the first aspect described above.
In a third aspect, an embodiment of the present application provides a computer-readable storage medium having stored thereon computer program instructions, which, when executed by a processor, implement the steps of any of the methods described above.
Additional features and advantages of the application will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by the practice of the application. The objectives and other advantages of the application may be realized and attained by the structure particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings needed to be used in the embodiments of the present application will be briefly described below, and it is obvious that the drawings described below are only some embodiments of the present application, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
FIG. 1 illustrates an application environment diagram in accordance with some embodiments;
a block diagram of a hardware configuration of a display device 200 according to some embodiments is illustrated in fig. 2;
fig. 3 is a schematic diagram illustrating a specific connection between the display device 200 and the external device 300;
fig. 4 is a schematic flowchart illustrating a method for generating a three-dimensional color lookup table according to an embodiment of the present application;
fig. 5 is a partial schematic flow chart illustrating the determination of a 3D LUT provided by an embodiment of the present application;
fig. 6 exemplarily illustrates a flow chart of screening a target pixel provided in the embodiment of the present application.
Detailed Description
To make the objects, embodiments, and advantages of the present application clearer, the exemplary embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings. It is to be understood that the described exemplary embodiments are only a part of the embodiments of the present application, not all of them.
All other embodiments obtained by a person skilled in the art from the exemplary embodiments described herein without inventive effort are intended to fall within the scope of the appended claims. In addition, while the disclosure has been presented in terms of one or more exemplary examples, it should be appreciated that each aspect of the disclosure may also be implemented as a complete embodiment on its own.
It should be noted that the brief descriptions of the terms in the present application are only for the convenience of understanding the embodiments described below, and are not intended to limit the embodiments of the present application. These terms should be understood in their ordinary and customary meaning unless otherwise indicated.
The terms "first", "second", "third", and the like in the description, claims, and drawings of this application are used to distinguish similar or analogous objects or entities and do not necessarily define a particular order or sequence unless otherwise indicated. It is to be understood that the terms so used are interchangeable under appropriate circumstances, such that the embodiments described herein can, for example, be practiced in sequences other than those illustrated or described herein.
Furthermore, the terms "comprises" and "comprising," as well as any variations thereof, are intended to cover a non-exclusive inclusion, such that a product or device that comprises a list of elements is not necessarily limited to those elements explicitly listed, but may include other elements not expressly listed or inherent to such product or device.
The term "unit" as used herein refers to any known or later developed hardware, software, firmware, artificial intelligence, fuzzy logic, or combination of hardware and/or software code that is capable of performing the functionality associated with that element.
The term "remote control" as used in this application refers to a component of an electronic device, such as the display device disclosed in this application, that can typically control the device wirelessly over a short distance. It usually connects with the electronic device using infrared and/or radio frequency (RF) signals and/or Bluetooth, and may also include WiFi, wireless USB, motion-sensor, and similar modules. For example, a hand-held touch remote control replaces most of the physical built-in hard keys of a conventional remote control with a user interface on a touch screen.
The term "gesture" as used in this application refers to a user's behavior through a change in hand shape or an action such as hand motion to convey a desired idea, action, purpose, or result.
Fig. 1 is a schematic diagram illustrating an operation scenario between a display device and a control apparatus according to an embodiment. As shown in fig. 1, a user may operate the display apparatus 200 through a mobile terminal 1002 and a control device 1001.
In some embodiments, the control device 1001 may include a measurement unit, a calculation unit, and a storage unit, where the measurement unit is configured to measure a color value of a pixel point in a second color space when the display device 200 displays a color in a first color space. The calculation unit is used for generating a three-dimensional color lookup table (3D LUT), and the storage unit is used for storing the generated 3D LUT.
In some embodiments, the control device 1001 may be a remote controller, and the communication between the remote controller and the display device includes an infrared protocol communication or a bluetooth protocol communication, and other short-distance communication methods, etc. to control the display device 200 in a wireless or other wired manner. The user may input a user command through a key on a remote controller, voice input, control panel input, etc. to control the display apparatus 200. Such as: the user can input a corresponding control command through a volume up/down key, a channel control key, up/down/left/right moving keys, a voice input key, a menu key, a power on/off key, etc. on the remote controller, to implement the function of controlling the display device 200.
In some embodiments, mobile terminals, tablets, computers, laptops, and other smart devices may also be used to control the display device 200. For example, the display device 200 is controlled using an application program running on the smart device. The application, through configuration, may provide the user with various controls in an intuitive User Interface (UI) on a screen associated with the smart device.
In some embodiments, the mobile terminal 1002 may install a software application with the display device 200, implement connection communication through a network communication protocol, and implement the purpose of one-to-one control operation and data communication. Such as: the control instruction protocol can be established between the mobile terminal 1002 and the display device 200, the remote control keyboard is synchronized to the mobile terminal 1002, and the function of controlling the display device 200 is realized by controlling the user interface on the mobile terminal 1002. The audio and video content displayed on the mobile terminal 1002 can also be transmitted to the display device 200, so as to realize the synchronous display function.
As also shown in fig. 1, the display apparatus 200 also performs data communication with the server 400 through various communication means. The display apparatus 200 may be communicatively connected through a Local Area Network (LAN), a Wireless Local Area Network (WLAN), or other networks. The server 400 may provide various contents and interactions to the display apparatus 200. Illustratively, the display apparatus 200 receives software program updates, or accesses a remotely stored digital media library, by sending and receiving information and through Electronic Program Guide (EPG) interactions. The server 400 may be one cluster or a plurality of clusters, and may include one or more types of servers. The server 400 also provides other web service contents such as video on demand and advertisement services.
The display device 200 may be a liquid crystal display, an OLED display, a projection display device. The particular display device type, size, resolution, etc. are not limiting, and those skilled in the art will appreciate that the display device 200 may be modified in performance and configuration as desired.
The display apparatus 200 may additionally provide an intelligent network tv function of a computer support function including, but not limited to, a network tv, an intelligent tv, an Internet Protocol Tv (IPTV), and the like, in addition to the broadcast receiving tv function.
A hardware configuration block diagram of a display device 200 according to an exemplary embodiment is exemplarily shown in fig. 2.
In some embodiments, at least one of the controller 250, the tuner demodulator 210, the communicator 220, the detector 230, the input/output interface 255, the display panel 275, the audio output interface 285, the memory 260, the power supply 290, the user interface 265, and the external device interface 240 is included in the display apparatus 200.
In some embodiments, a display panel 275 receives image signals from the first processor output and displays video content and images and components of the menu manipulation interface.
In some embodiments, the display panel 275 includes a display screen assembly for presenting a picture and a driving assembly for driving the display of an image.
In some embodiments, the displayed video content may come from broadcast television, from various broadcast signals received via wired or wireless communication protocols, or from image content sent by a network server and received via a network communication protocol.
In some embodiments, the display panel 275 is used to present a user manipulation UI interface generated in the display apparatus 200 and used to control the display apparatus 200.
In some embodiments, a driving assembly for driving the display is further included according to the type of the display panel 275.
In some embodiments, display panel 275 is a projection display panel and may also include a projection device and a projection screen.
In some embodiments, the communicator 220 is a component for communicating with external devices or external servers according to various communication protocol types. For example, the communicator may include at least one of a WiFi chip, a Bluetooth communication protocol chip, a wired Ethernet communication protocol chip or other network communication protocol chip, a near field communication protocol chip, and an infrared receiver.
In some embodiments, the display apparatus 200 may establish control signal and data signal transmission and reception with the external control device 1001 or the content providing apparatus through the communicator 220.
In some embodiments, the user interface 265 may be configured to receive infrared control signals from a control device 1001 (e.g., an infrared remote control, etc.).
In some embodiments, the detector 230 is a component used by the display apparatus 200 to collect signals from the external environment or to interact with the outside.
In some embodiments, the detector 230 includes a light receiver, i.e., a sensor for collecting the intensity of ambient light, so that display parameters can be adaptively changed according to the collected ambient light.
In some embodiments, an image collector 232 in the detector 230, such as a camera, a video camera, etc., may be used to collect external environment scenes, collect attributes of a user or gestures interacted with the user, adaptively change display parameters, and also recognize user gestures, so as to implement a function of interaction with the user.
In some embodiments, the detector 230 may also include a temperature sensor or the like, such as by sensing ambient temperature.
In some embodiments, the display apparatus 200 may adaptively adjust the display color temperature of an image. For example, the display apparatus 200 may be adjusted to display a cool tone in a high-temperature environment, or a warm tone in a low-temperature environment.
In some embodiments, the detector 230 may further include a sound collector 231, such as a microphone, for collecting voice data, wherein when the user speaks an instruction by voice, the microphone can collect voice data including the instruction spoken by the user. For example, the sound collector 231 may collect a voice signal including a control instruction of the user to control the display device 200, or collect an ambient sound for recognizing an ambient scene type, so that the display device 200 may adaptively adapt to an ambient noise.
In some embodiments, as shown in fig. 2, the input/output interface 255 is configured to allow data transfer between the controller 250 and external other devices or other controllers 250. Such as receiving video signal data and audio signal data of an external device, or command instruction data, etc.
In some embodiments, the external device interface 240 may include, but is not limited to, the following: the interface can be any one or more of a high-definition multimedia interface (HDMI), an analog or data high-definition component input interface, a composite video input interface, a USB input interface, an RGB port and the like. The plurality of interfaces may form a composite input/output interface.
In some embodiments, as shown in fig. 2, the tuner demodulator 210 is configured to receive a broadcast television signal through a wired or wireless receiving manner, perform modulation and demodulation processing such as amplification, mixing, and resonance, and demodulate, from a plurality of wireless or wired broadcast television signals, the audio and video signal carried in the television channel frequency selected by the user, as well as an EPG data signal.
In some embodiments, the frequency points demodulated by the tuner demodulator 210 are controlled by the controller 250; the controller 250 can send out control signals according to the user's selection, so that the tuner demodulator responds to the television signal frequency selected by the user and demodulates the television signal carried on that frequency.
In some embodiments, the broadcast television signal may be classified into a terrestrial broadcast signal, a cable broadcast signal, a satellite broadcast signal, an internet broadcast signal, or the like according to the broadcasting system of the television signal. Or may be classified into a digital modulation signal, an analog modulation signal, and the like according to a modulation type. Or the signals are classified into digital signals, analog signals and the like according to the types of the signals.
In some embodiments, the controller 250 and the tuner demodulator 210 may be located in different separate devices; that is, the tuner demodulator 210 may also be located in a device external to the main device where the controller 250 is located, such as an external set-top box. In that case, the set-top box outputs the television audio and video signals demodulated from the received broadcast television signals to the main device, and the main device receives the audio and video signals through the first input/output interface.
In some embodiments, the controller 250 controls the operation of the display device and responds to user operations through various software control programs stored in memory. The controller 250 may control the overall operation of the display apparatus 200. For example: in response to receiving a user command for selecting a UI object displayed on the display panel 275, the controller 250 may perform an operation related to the object selected by the user command.
In some embodiments, the object may be any one of selectable objects, such as a hyperlink or an icon. Operations related to the selected object, such as: displaying an operation connected to a hyperlink page, document, image, or the like, or performing an operation of a program corresponding to the icon. The user command for selecting the UI object may be a command input through various input means (e.g., a mouse, a keyboard, a touch pad, etc.) connected to the display apparatus 200 or a voice command corresponding to a voice spoken by the user.
As shown in fig. 2, the controller 250 includes at least one of a Random Access Memory 251 (RAM), a Read-Only Memory 252 (ROM), a video processor 270, an audio processor 280, other processors 253 (e.g., a Graphics Processing Unit (GPU)), a Central Processing Unit 254 (CPU), a communication interface, and a communication bus 256 that connects the respective components.
In some embodiments, RAM 251 is used to store temporary data for the operating system or other programs that are running.
In some embodiments, ROM 252 is used to store instructions for various system boots.
In some embodiments, the ROM 252 is used to store a Basic Input Output System (BIOS), which completes the power-on self-test of the system, initializes each functional module in the system, provides the drivers for the system's basic input/output, and boots the operating system.
In some embodiments, when the display apparatus 200 is powered on upon receiving the power-on signal, the CPU executes the system boot instructions in the ROM 252 and copies the temporary data of the operating system stored in memory into the RAM 251 so that the operating system can be booted. After the operating system has started, the CPU copies the temporary data of the various application programs from memory into the RAM 251 so that the various application programs can be started or run.
In some embodiments, CPU processor 254 is used to execute operating system and application program instructions stored in memory. And executing various application programs, data and contents according to various interactive instructions received from the outside so as to finally display and play various audio and video contents.
In some example embodiments, the CPU processor 254 may comprise a plurality of processors, including a main processor and one or more sub-processors. The main processor performs some operations of the display apparatus 200 in the pre-power-up mode and/or displays the screen in the normal mode; the one or more sub-processors handle operations in the standby mode and the like.
In some embodiments, the graphics processor 253 is used to generate various graphics objects, such as icons, operation menus, and graphics displayed in response to user input instructions. It comprises an arithmetic unit, which operates on the various interactive instructions input by the user and produces various objects according to their display attributes, and a renderer, which renders the objects obtained from the arithmetic unit for display on the display panel.
In some embodiments, the video processor 270 is configured to receive an external video signal, and perform video processing such as decompression, decoding, scaling, noise reduction, frame rate conversion, resolution conversion, image synthesis, and the like according to a standard codec protocol of the input signal, so as to obtain a signal that can be displayed or played on the direct display device 200.
In some embodiments, video processor 270 includes a demultiplexing module, a video decoding module, an image synthesis module, a frame rate conversion module, a display formatting module, and the like.
The demultiplexing module demultiplexes the input audio and video data stream; for example, an input MPEG-2 stream is demultiplexed into a video signal and an audio signal.
The video decoding module processes the demultiplexed video signal, including decoding, scaling, and the like.
The image synthesis module superimposes and mixes the GUI signal, input by the user or generated by the graphics generator, with the scaled video image to generate an image signal for display.
The frame rate conversion module converts the frame rate of the input video, for example from a 60Hz frame rate to a 120Hz or 240Hz frame rate; the conversion is typically implemented by frame interpolation.
The display format module converts the frame-rate-converted video output signal into a signal conforming to the display format, for example an RGB data signal.
In some embodiments, the graphics processor 253 and the video processor may be integrated or configured separately. When integrated, they can jointly process the graphics signal output to the display panel; when configured separately, they can perform different functions, for example in a GPU + FRC (Frame Rate Conversion) architecture.
In some embodiments, the audio processor 280 is configured to receive an external audio signal, decompress and decode the received audio signal according to a standard codec protocol of the input signal, and perform noise reduction, digital-to-analog conversion, and amplification processes to obtain an audio signal that can be played in a speaker.
In some embodiments, video processor 270 may comprise one or more chips. The audio processor may also comprise one or more chips.
In some embodiments, the video processor 270 and the audio processor 280 may be separate chips or may be integrated together with the controller in one or more chips.
In some embodiments, the audio output receives, under the control of the controller 250, the sound signal output by the audio processor 280. In addition to the speaker 286 carried by the display device 200 itself, the audio output may include an external sound output terminal that can output to a sound-generating device of an external device, such as an external sound interface or an earphone interface, and may also include a near-field communication module in the communication interface, for example a Bluetooth module for outputting sound to a Bluetooth speaker.
The power supply 290, under the control of the controller 250, supplies the display device 200 with power input from an external power source. The power supply 290 may include a built-in power supply circuit installed inside the display apparatus 200, or a power supply interface installed outside the display apparatus 200 that provides external power to the display apparatus 200.
The user interface 265 receives a user's input signal and transmits it to the controller 250. The user input signal may be a remote control signal received through an infrared receiver; various user control signals may also be received through the network communication module.
In some embodiments, the user inputs a user command through the control device 1001 or the mobile terminal 1002, the user input interface receives the user input, and the display apparatus 200 responds to it through the controller 250.
In some embodiments, a user may input a user command on a Graphical User Interface (GUI) displayed on the display panel 275, and the user input interface receives the user input command through the Graphical User Interface (GUI). Alternatively, the user may input the user command by inputting a specific sound or gesture, and the user input interface receives the user input command by recognizing the sound or gesture through the sensor.
In some embodiments, a "user interface" is a media interface for interaction and information exchange between an application or operating system and a user that enables conversion between an internal form of information and a form that is acceptable to the user. A commonly used presentation form of the User Interface is a Graphical User Interface (GUI), which refers to a User Interface related to computer operations and displayed in a graphical manner. It may be an interface element such as an icon, a window, a control, etc. displayed in the display screen of the electronic device, where the control may include a visual interface element such as an icon, a button, a menu, a tab, a text box, a dialog box, a status bar, a navigation bar, a Widget, etc.
The memory 260 stores various software modules for driving the display device 200, such as the various software modules stored in the first memory, including at least one of: a basic module, a detection module, a communication module, a display control module, a browser module, and various service modules.
The base module is a bottom layer software module for signal communication between various hardware in the display device 200 and for sending processing and control signals to the upper layer module. The detection module is used for collecting various information from various sensors or user input interfaces, and the management module is used for performing digital-to-analog conversion and analysis management.
As shown in fig. 3, a detailed connection diagram of the display device 200 and the external device 300 is shown. The display device 200 includes a mapping unit 302, a display panel 303, wherein the mapping unit 302 further includes a 3D LUT unit 3021, an internal and external interface unit 3022, and a test unit 3023, and the external device 300 includes a measurement unit 304, a calculation unit 305, and a storage unit 306. Here, the external device 300 may be the control device 1001 or may be another device, and is only an example here.
Some terms in the embodiments of the present application will be explained below to facilitate understanding by those skilled in the art.
(1) The test signal, which is output from the test unit 3023, is used to test the display color gamut of the display panel 303.
(2) The first color space refers to a color space related to the device, for example, RGB (Red-Green-Blue, color model based on additive color primaries, a color standard, which is a color space obtained by changing three color channels of Red, Green, and Blue and superimposing the color channels with each other).
(3) The second color space refers to a device-independent color space, for example, Yxy (Y denotes luminance, and x and Y denote chromaticity characteristics of color) color space. And the result of the first color space being displayed in the device may also be represented in a second color space.
(4) The third color space, which is a color space associated with the device, may be the same color space as the first color space or may be a different color space from the first color space.
(5) The target signal is a signal that a user needs to display on the device, and is displayed on the device by using a color value corresponding to the target signal.
Having explained some of the terms used in the embodiments of the present application, the inventive concept of the present application will now be described:
first, the calculation unit 305 controls the test unit 3023, through the internal and external interface unit 3022, to send a test signal to the display panel 303 (in this case, the test unit 3023 is connected to the display panel 303). After the test signal is displayed on the display panel 303, the measurement unit 304 measures the color value, in the second color space, with which the first-color-space test signal is displayed on the display panel 303, and sends it to the calculation unit 305, which determines the color characteristic data of the display panel 303. The target signal and the color value corresponding to the target signal in the third color space are then input to the calculation unit 305 to determine the mapping relationship between the target signal and the test signal, which is further sent to the 3D LUT unit 3021 and the storage unit 306 through the internal and external interface unit 3022. Here, the measurement unit 304 may be a color analyzer of any suitable brand and model.
After the mapping relationship between the target signal and the test signal is input in the 3D LUT unit 3021, the signal in the signal input unit 301 is input to the 3D LUT unit 3021, the signal is mapped by the 3D LUT unit 3021, and is displayed in the display panel 303 (when the 3D LUT unit 3021 is connected to the display panel 303), and finally, wide color gamut display is realized.
Illustratively, assuming that the target signal color gamut requirement is bt.709, for example, the target signal is (255,0, 0), the color value corresponding to the target signal is determined to be (0.64,0.33) according to the color gamut standard requirement, and the color value of the display panel corresponding to the test signal (251, 3, 2) is known to be (0.64,0.33) through the test of the color characteristic of the display panel by the test signal, the requirement of displaying the bt.709 color gamut is achieved by establishing the mapping relationship between the target signal (255,0, 0) and the test signal (251, 3, 2).
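The single mapping described above can be sketched as one entry of a 3D LUT implemented as a lookup table. The (255, 0, 0) → (251, 3, 2) pair is taken from the example; the dictionary structure and the identity fallback for unmapped codes are illustrative assumptions.

```python
# One 3D-LUT entry from the BT.709 example: the target code (255, 0, 0) is
# remapped to the panel code (251, 3, 2), whose measured chromaticity matches
# the BT.709 red primary (0.64, 0.33).

lut = {}                          # target RGB -> panel RGB (illustrative)
lut[(255, 0, 0)] = (251, 3, 2)

def map_signal(rgb):
    """Look up the panel drive value for a target signal; codes without an
    entry fall back to the identity mapping (an assumption for this sketch)."""
    return lut.get(rgb, rgb)
```

In a full table every target code has an entry, so the fallback never fires; it is only a convenience for this toy sketch.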
In the related art, a 3D LUT is usually debugged manually and then used to implement a wide color gamut. Even though, to save time, only a fixed number of pixel points are debugged manually and the remaining pixel points are filled in by interpolation, debugging still takes a large amount of time, so display efficiency is low.
In order to solve the above problem, in the embodiment of the present application, first, color characteristic data of the display panel is constructed, a mapping relationship between the first color space and the second color space is determined, and then, based on a specified mapping relationship between the third color space and the second color space, a mapping relationship between the third color space and the first color space is constructed, so that a 3D LUT can be automatically generated.
After introducing the design concept of the embodiment of the present application, some simple descriptions are provided below for application scenarios to which the technical solution of the embodiment of the present application can be applied, and it should be noted that the application scenarios described below are only used for describing the embodiment of the present application and are not limited. In specific implementation, the technical scheme provided by the embodiment of the application can be flexibly applied according to actual needs.
The application provides a method for generating a three-dimensional color lookup table, which is shown in fig. 4 and includes the following steps:
s401, constructing color characteristic data of the display panel, wherein the color characteristic data is used for describing a mapping relation between a first color space and a second color space when the color of the first color space is displayed through the display panel; the first color space is a device dependent color space, the second color space is a device independent color space, and the second color space is used to describe a result of displaying colors of the first color space via the display panel.
Illustratively, the first color space is represented by an RGB color space and the second color space is represented by a Yxy color space. The specific first color space related to the device and the specific second color space not related to the device are not limited, and may be adjusted according to the practical application.
In one embodiment of the application, a plurality of test points are obtained by sampling a first color space, then each test point in the plurality of test points is output to a display panel to be displayed, and the color value of the display result of each test point in a second color space is measured; and constructing color characteristic data based on the corresponding relation between the color value of each test point in the first color space and the color value of each test point in the second color space.
For example, after the test point (100,0,0) in the first color space is output to the display panel for display, the measured display result shows that its color value in the second color space is (0.635, 0.32, 100); the correspondence between the color value of the test point (100,0,0) in the first color space and its color value in the second color space is therefore (100,0,0) - (0.635, 0.32, 100).
By displaying in the second color space using the plurality of sampling points of the first color space, more accurate color characteristic data of the display panel can be obtained.
In some possible embodiments, the first color space is sampled uniformly; alternatively, the first color space is sampled according to a specified rule, where the specified rule includes one or both of covering the brightest region of the first color space and covering the darkest region of the first color space.
Taking uniform sampling as an example, the number of test points is determined by the sampling density of the first color space. For example, for an RGB first color space with an 8-bit signal there are at most 256³ different test signals, i.e. 256³ test points. If 17 test points are uniformly sampled in each dimension, the 17 levels 0, 16, 32, ..., 224, 240, 255 are selected in each of the R, G, and B channels, for a total of 17³ = 4913 test points.
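The uniform sampling above can be sketched as follows. The 17-level spacing (a step of 16, with the last level clamped to 255) follows the example; the function names are illustrative.

```python
# Sketch of uniformly sampling an 8-bit RGB (first) color space with 17
# levels per channel, as in the example above.

def sample_levels(count=17, max_value=255):
    """Return `count` levels spaced by 16, with the last level clamped to
    max_value: 0, 16, 32, ..., 224, 240, 255."""
    return [min(i * 16, max_value) for i in range(count)]

def uniform_test_points(count=17):
    """Cartesian product of the per-channel levels: count**3 test points."""
    levels = sample_levels(count)
    return [(r, g, b) for r in levels for g in levels for b in levels]

points = uniform_test_points()    # 17**3 = 4913 test points in total
```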
The sampling points can cover the first color space by uniformly sampling or sampling according to a specified rule, and accurate color characteristic data of the display panel can be obtained after each sampling point utilizing the first color space is displayed in the second color space.
As described above, after the correspondence between each test point and its color value in the second color space as output by the display panel has been established, for the pixel points of the first color space other than the test points, the color values in the second color space are determined by linear interpolation from the color values of the test points in the second color space, and the correspondence between those pixel points and their color values in the second color space is constructed.
For example, assume that the correspondence between the color value of the test point (100,0,0) in the first color space and its color value in the second color space is (100,0,0) - (0.635, 0.32, 100), and that the pixel points (100,1,0) and (100,0,1) are not sampling points, i.e., not test points. Because the color values of the pixel points (100,1,0) and (100,0,1) in the first color space lie within a specified range around the color value of the test point (100,0,0) in the first color space, their color values in the second color space, determined by linear interpolation, lie within a specified range around the color value of the test point (100,0,0) in the second color space. For example, the color value of the pixel point (100,1,0) in the second color space is (0.634, 0.31, 100) and that of the pixel point (100,0,1) is (0.633, 0.33, 100); that is, the correspondences are (100,1,0) - (0.634, 0.31, 100) and (100,0,1) - (0.633, 0.33, 100).
After the correspondence between the color values of the test points in the first color space and their color values in the second color space is determined, the correspondence between the remaining pixel points of the first color space and their color values in the second color space is determined by linear interpolation, so the color characteristic data of the display panel can be obtained accurately.
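As a minimal sketch of this interpolation step, the snippet below fills in a non-test pixel along a single channel by linear interpolation between two neighboring measured test points. A real implementation would interpolate across all three channels; the measured values used here are hypothetical.

```python
# Linear interpolation along the R channel between two measured test points.
# The Yxy-like values measured at R=96 and R=112 are made up for illustration.

def lerp(a, b, t):
    """Component-wise linear interpolation between tuples a and b."""
    return tuple(av + (bv - av) * t for av, bv in zip(a, b))

# hypothetical measurements: first-color-space level -> (x, y, Y)
measured = {96: (0.630, 0.315, 95.0), 112: (0.640, 0.325, 105.0)}

def interp_red(r, lo=96, hi=112):
    """Estimate the second-color-space value of (r, 0, 0) from the two
    surrounding measured test points."""
    t = (r - lo) / (hi - lo)
    return lerp(measured[lo], measured[hi], t)

val = interp_red(100)   # R=100 lies a quarter of the way from R=96 to R=112
```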
After obtaining the color characteristic data of the display panel, continuing to execute step S402, based on the specified mapping relationship between the third color space and the second color space, constructing the mapping relationship between the third color space and the first color space, and obtaining the three-dimensional color look-up table 3D LUT.
Wherein the third color space comprises N to-be-processed pixel points, N depending on the requirements of the 3D LUT. That is, the number of the pixels to be processed in the third color space is the same as that of the pixels in the first color space.
In some possible embodiments, fig. 5 shows a partial flow diagram for determining the 3D LUT. The method comprises the following steps:
s501, determining color values of the pixel points to be processed in the second color space based on the designated mapping relation.
The specified mapping relationship gives the color value in the second color space that is obtained when a pixel point to be processed in the third color space is mapped to the second color space.
For example, assume that the color value in the second color space is described by luminance and chrominance. The chrominance is determined by the target color gamut and the target color temperature, while the luminance is determined by the required maximum luminance and the luminance curve corresponding to it. If the maximum luminance is set to the maximum luminance achievable by the white field of the display panel, the luminance curve may be gamma 2.2, 2.4, or 2.6, or an HDR curve such as ST 2084 or HLG. If the target color gamut and target color temperature of the pixel points to be processed in the third color space define a first mapping relationship and the luminance curve defines a second mapping relationship, the first and second mapping relationships together form the specified mapping relationship.
For example, based on the specified mapping relationship between the third color space and the second color space, it can be known that the color value of the target pixel (100,0,0) in the second color space is (0.635, 0.32, 100), (0.635, 0.32) represents chroma, and 100 represents brightness.
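A hedged sketch of the luminance part of the specified mapping: the luminance of a target code is taken from a gamma-2.2 curve scaled by an assumed peak white luminance. The BT.709 luminance weights and the 200 cd/m² peak are illustrative assumptions, not values from the text.

```python
# Luminance half of a "specified mapping": gamma-2.2 curve scaled by an
# assumed peak white luminance. Weights are the BT.709 luminance
# coefficients, used here as an assumption.

YMAX = 200.0  # assumed peak white luminance, cd/m^2

def target_luminance(r, g, b, y_r=0.2126, y_g=0.7152, y_b=0.0722, gamma=2.2):
    """Target luminance Y for an 8-bit (r, g, b) code under a gamma curve."""
    return YMAX * (y_r * (r / 255) ** gamma +
                   y_g * (g / 255) ** gamma +
                   y_b * (b / 255) ** gamma)

# full red at 8 bits maps to the red primary's luminance share of peak white
Y_red = target_luminance(255, 0, 0)
```

The chrominance half would come from the target gamut and color temperature, as described above.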
And S502, aiming at each pixel point to be processed, screening out the pixel point closest to the color value of the pixel point to be processed from the pixel points in the second color space recorded in the color characteristic data as a target pixel point.
Exemplarily, assuming that the color value of the to-be-processed pixel (255,0, 0) in the second color space is determined to be (0.64,0.33, 200) based on the designated mapping relationship, and the color value of the screened pixel (250,2, 4) in the color characteristic data in the second color space is (0.637,0.328,195), since the color value of the to-be-processed pixel (255,0, 0) is closest to the color value of the pixel (250,2, 4), the pixel (250,2, 4) is taken as the target pixel.
In an optional embodiment, among the pixel points of the second color space recorded in the color characteristic data, those whose luminance difference from the pixel point to be processed in the second color space is smaller than a preset luminance threshold are first screened out as candidate pixel points. Then, among the candidate pixel points, the one closest to the color value of the pixel point to be processed, based on the chrominance difference in the second color space, is selected as the target pixel point of the pixel point to be processed.
Exemplarily, it is assumed that 10 pixel points of the second color space recorded in the color characteristic data are obtained, 4 candidate pixel points are obtained after the screening of the luminance difference, and then the screening of the chrominance difference is continued to determine the target pixel point with the closest color value.
Screening the pixel points of the second color space first by luminance difference and then re-screening the result by chrominance difference eliminates candidates with obviously different luminance first, and then selects the target pixel point on the finer-grained chrominance difference, so that the resulting target pixel point has the closest color value.
Wherein the preset brightness threshold is a fixed brightness threshold; or the preset brightness threshold value is positively correlated with the brightness value of the pixel point to be processed.
For example, when the luminance curve corresponding to the luminance value of the pixel point to be processed is gamma 2.2 and the preset luminance threshold is positively correlated with that luminance value, the luminance difference meets the preset luminance threshold when the luminance Y of a pixel point in the second color space lies between the gamma 2.3 and gamma 2.1 curves, as shown in Equation 1 and Equation 2, where y_r, y_g, and y_b are the constant luminance weights of the R, G, and B channels, respectively, and Y_max is the maximum luminance value of the pixel point to be processed.

Y > Y_max (y_r (R/255)^2.3 + y_g (G/255)^2.3 + y_b (B/255)^2.3)   (Equation 1)

Y < Y_max (y_r (R/255)^2.1 + y_g (G/255)^2.1 + y_b (B/255)^2.1)   (Equation 2)
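Equations 1 and 2 can be sketched as a bracket test: a candidate passes when its measured luminance Y lies between the gamma-2.3 and gamma-2.1 curves for the target code. The BT.709 luminance weights and the peak value are assumptions for illustration.

```python
# Luminance-bracket test from Equations 1 and 2: a candidate pixel passes
# when its measured luminance Y lies between the gamma-2.3 (lower) and
# gamma-2.1 (upper) curves around the target gamma-2.2 curve.

def curve(r, g, b, gamma, ymax=200.0,
          y_r=0.2126, y_g=0.7152, y_b=0.0722):
    """Y_max (y_r (R/255)^g + y_g (G/255)^g + y_b (B/255)^g)."""
    return ymax * (y_r * (r / 255) ** gamma +
                   y_g * (g / 255) ** gamma +
                   y_b * (b / 255) ** gamma)

def passes_luminance(Y, r, g, b):
    """True when curve(gamma=2.3) < Y < curve(gamma=2.1) for (r, g, b)."""
    return curve(r, g, b, 2.3) < Y < curve(r, g, b, 2.1)
```

Note that for inputs below 1.0 a larger gamma gives a smaller value, so gamma 2.3 is the lower bound and gamma 2.1 the upper bound, matching the two inequalities.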
By setting the preset luminance threshold appropriately, the pixel points of the second color space can be screened accurately, yielding the target pixel point with the closest color value.
In an alternative embodiment, the color value of the second color space is described by using luminance and chrominance, and the chrominance includes a first chrominance component and a second chrominance component; the first chrominance component and the second chrominance component may be parameters of different spatial dimensions. For example, the color value of the second color space is represented as (x, Y), (x, Y) representing chroma, Y representing luma, x representing the first chroma component, and Y representing the second chroma component.
Before the pixel point closest in color value to the pixel point to be processed is screened out of the candidate pixel points, based on the chrominance difference in the second color space, as the target pixel point of the pixel point to be processed, the residual of the first chrominance component and the residual of the second chrominance component are initialized. For example, Diff_x denotes the residual of the first chrominance component and Diff_y the residual of the second chrominance component. Here both Diff_x and Diff_y are initialized to 10000, which is merely an example and does not limit their specific values.
Each candidate pixel point is then traversed; if the first difference between the first chrominance components of a candidate pixel point and the pixel point to be processed and the second difference between their second chrominance components meet the residual update condition, the residual of the first chrominance component and the residual of the second chrominance component are updated, where the residual update condition serves to reduce the residuals. The candidate pixel point corresponding to the final residuals of the first and second chrominance components is adopted as the target pixel point.
Here, the residual update condition includes any one of the following conditions:
Condition 1: |x' - x| < Diff_x and |y' - y| < Diff_y;
Condition 2: |x' - x| < Diff_x, and ||Diff_x| - |x' - x|| > ||Diff_y| - |y' - y||, and ||x' - x| - |y' - y|| < |Diff_x - Diff_y|;
Condition 3: |y' - y| < Diff_y, and ||Diff_x| - |x' - x|| < ||Diff_y| - |y' - y||, and ||x' - x| - |y' - y|| < |Diff_x - Diff_y|;
wherein x is the first chrominance component of the pixel point to be processed, x' is the first chrominance component of the candidate pixel point, Diff_x is the residual of the first chrominance component, y is the second chrominance component of the pixel point to be processed, y' is the second chrominance component of the candidate pixel point, and Diff_y is the residual of the second chrominance component.
Exemplarily, after the luminance screening there are 3 candidate pixel points, A, B, and C, whose chrominance values are denoted (x_A, y_A), (x_B, y_B), and (x_C, y_C), respectively.
First, pixel point A is compared with the pixel point to be processed. Because the initial values of Diff_x and Diff_y are large, the first difference between the first chrominance components of pixel point A and the pixel point to be processed and the second difference between their second chrominance components meet the residual update condition, so the residuals of the first and second chrominance components are updated: Diff_x = |x_A - x|, Diff_y = |y_A - y|.
Then pixel point B is selected and compared with the pixel point to be processed. If the comparison still meets the residual update condition, the residuals of the first and second chrominance components are updated again: Diff_x = |x_B - x|, Diff_y = |y_B - y|.
Then pixel point C is selected and compared with the pixel point to be processed. If the residual update condition is no longer met, candidate pixel point B is taken as the target pixel point.
By initializing the residuals of the first and second chrominance components and updating them on each chrominance comparison, the target pixel point with the closest color value is obtained. In addition, setting different residual conditions reflects the color difference between two pixel points more accurately, so the final mapping is performed accurately.
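The traversal above can be sketched as follows. For brevity this sketch applies only Condition 1 (both chrominance residuals shrink) and omits the mixed Conditions 2 and 3; the candidate chromaticities and the 10000 initial residuals are taken from or modeled on the examples in the text.

```python
# Shrinking-residual traversal over luminance-screened candidates,
# simplified to Condition 1 only: a candidate replaces the current best
# when both of its chrominance residuals are smaller than the stored ones.

def nearest_by_chroma(target_xy, candidates):
    """candidates: list of (pixel_id, (x, y)) chrominance pairs.
    Returns the id accepted last under the shrinking-residual rule."""
    x, y = target_xy
    diff_x = diff_y = 10000.0     # initial residuals, as in the example
    best = None
    for pid, (cx, cy) in candidates:
        if abs(cx - x) < diff_x and abs(cy - y) < diff_y:   # Condition 1
            diff_x, diff_y = abs(cx - x), abs(cy - y)
            best = pid
    return best

cands = [("A", (0.60, 0.30)), ("B", (0.638, 0.329)), ("C", (0.62, 0.34))]
best = nearest_by_chroma((0.64, 0.33), cands)
```

Here A is accepted first (any residual beats 10000), B then shrinks both residuals and replaces it, and C fails the condition, so B is returned, mirroring the A/B/C walkthrough above.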
S503, based on the color characteristic data, determining a pixel point of the first color space corresponding to the target pixel point as a mapping pixel point of the pixel point to be processed.
S504, establishing corresponding relations between the pixel points to be processed and the corresponding mapping pixel points to obtain a 3D LUT.
Setting the number of pixel points to be processed in the third color space equal to the number of pixel points in the first color space allows more pixel points to be debugged accurately, achieving accurate display of more colors and avoiding the situation in the related art where, when only a small fixed number of pixel points are debugged, the accuracy of the color values of the remaining pixel points cannot be guaranteed. Mapping the pixel points of the third color space into the second color space and screening for the target pixel point with the closest color value allows the 3D LUT to be obtained more accurately.
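Steps S501-S504 can be sketched end to end on toy data: map each target code to its second-color-space chromaticity, pick the recorded panel pixel with the nearest chromaticity, and store the correspondence. Both tables below are tiny illustrative stand-ins for the real characteristic data and specified mapping, and the L1 chromaticity distance is an assumption; in practice the residual-update screening described above would be used.

```python
# End-to-end sketch of S501-S504 on toy data.

# panel characteristic data: first-color-space code -> measured (x, y)
characteristic = {(251, 3, 2): (0.640, 0.330), (250, 2, 4): (0.637, 0.328)}
# specified mapping: third-color-space code -> target (x, y)
specified = {(255, 0, 0): (0.640, 0.330)}

def build_lut():
    """For each target code, find the panel code whose measured chromaticity
    is nearest (L1 distance, an assumption) and record the correspondence."""
    lut = {}
    for code, (tx, ty) in specified.items():
        best = min(characteristic,
                   key=lambda c: abs(characteristic[c][0] - tx) +
                                 abs(characteristic[c][1] - ty))
        lut[code] = best
    return lut

lut = build_lut()
```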
In an embodiment of the application, before the pixel points of the second color space recorded in the color characteristic data are screened by the luminance difference and the chrominance difference, a screening range can further be determined based on the color value of the pixel point to be processed in the third color space, the color value of the second color space, and a preset value, where the preset value is positively correlated with the signal size corresponding to the pixel point to be processed.
For example, assume that the color value of the pixel point to be processed in the third color space is (R, G, B), the signal size corresponding to the pixel point to be processed is 8 bits, and the preset value corresponding to 8 bits is 140; then the color values (R1, G1, B1) of the second color space respectively correspond to the screening ranges R1 ∈ [R-140, R+140], G1 ∈ [G-140, G+140], B1 ∈ [B-140, B+140]. If the signal size corresponding to the pixel point to be processed is 6 bits and the preset value corresponding to 6 bits is 120, the screening ranges corresponding to the color values (R1, G1, B1) of the second color space are R1 ∈ [R-120, R+120], G1 ∈ [G-120, G+120], B1 ∈ [B-120, B+120].
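The bit-depth-dependent screening range can be sketched as below. The half-widths 140 (8-bit) and 120 (6-bit) are the values from the example; the function and table names are illustrative, and no clamping to the valid code range is applied, matching the raw ranges in the text.

```python
# Screening range per channel: [value - half, value + half], where the
# half-width (the "preset value") grows with the signal's bit depth.

PRESET = {8: 140, 6: 120}   # half-widths from the example above

def screening_range(value, bits):
    """Return the (low, high) screening bounds for one channel value."""
    half = PRESET[bits]
    return value - half, value + half

r_lo, r_hi = screening_range(255, 8)
```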
The application provides a method for generating a three-dimensional color lookup table, which comprises the steps of firstly constructing color characteristic data of a display panel, determining a mapping relation between a first color space and a second color space, and then constructing a mapping relation between a third color space and the first color space based on a specified mapping relation between the third color space and the second color space, so that a 3D LUT can be automatically generated.
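The overall generation flow just described can be sketched as follows. The data structures and function names are illustrative assumptions, not taken from the patent: the color characteristic data is modeled as a mapping from first-color-space values to measured second-color-space values, and the nearest-point search is passed in as a parameter:

```python
# End-to-end sketch (assumed structure) of 3D LUT generation:
# 1) the panel is measured to obtain the first -> second mapping
#    (the color characteristic data);
# 2) each node of the third color space is mapped into the second color
#    space via the specified mapping;
# 3) the closest measured point is found and its first-color-space value
#    is recorded in the LUT.

def build_3d_lut(panel_measurements, third_space_nodes, nearest):
    """panel_measurements: {first_space_rgb: second_space_value};
    third_space_nodes: {node: second_space_value via the specified mapping};
    nearest: function picking the closest measured second-space value."""
    lut = {}
    for node, target_value in third_space_nodes.items():
        # screen for the measured second-space value closest to the target
        match = nearest(target_value, panel_measurements.values())
        # map back to the first-color-space pixel that produced it
        for rgb, measured in panel_measurements.items():
            if measured == match:
                lut[node] = rgb
                break
    return lut
```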
As shown in fig. 6, a schematic flow chart of screening a target pixel point is shown, which includes the following steps:
S601, constructing color characteristic data of the display panel, wherein the color characteristic data is used for describing a mapping relation between a first color space and a second color space when the colors of the first color space are displayed through the display panel;
S602, determining the color value of each pixel point to be processed in the second color space based on the designated mapping relation between the third color space and the second color space;
S603, determining a screening range based on the color value of the pixel point to be processed in the third color space and its color value in the second color space;
S604, screening out, from the pixel points of the second color space recorded in the color characteristic data, pixel points whose luminance difference in the second color space from the pixel point to be processed is smaller than a preset luminance threshold, as candidate pixel points;
S605, initializing a residual of the first chrominance component and a residual of the second chrominance component, and screening out, from the candidate pixel points and based on the chrominance difference in the second color space from the pixel point to be processed, the pixel point closest to the color value of the pixel point to be processed as the target pixel point of the pixel point to be processed.
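The two-stage screening in steps S604 and S605 can be sketched as follows. This is an assumed minimal implementation, not the patent's own code: second-color-space values are taken to be (L, x, y) luminance-chromaticity triples, and only residual update condition 1 (both chroma differences strictly smaller than the current residuals) is applied; all names are illustrative:

```python
# Minimal sketch (assumed) of steps S604-S605: luminance screening,
# then chroma-residual shrinking to find the closest candidate.

def find_target_pixel(target, measured, luma_threshold):
    """target: (L, x, y) of the pixel to be processed in the second color space;
    measured: {first_space_pixel: (L, x, y)} from the color characteristic data."""
    L, x, y = target
    # Step S604: keep candidates whose luminance difference is under the threshold
    candidates = [(p, lxy) for p, lxy in measured.items()
                  if abs(lxy[0] - L) < luma_threshold]
    # Step S605: initialize the chroma residuals, then traverse candidates,
    # shrinking (Diff_x, Diff_y) whenever a closer point is found
    best, diff_x, diff_y = None, float("inf"), float("inf")
    for p, (_, x1, y1) in candidates:
        # residual update condition 1: |x'-x| < Diff_x and |y'-y| < Diff_y
        if abs(x1 - x) < diff_x and abs(y1 - y) < diff_y:
            diff_x, diff_y = abs(x1 - x), abs(y1 - y)
            best = p
    return best
```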
Embodiments of the present application also provide a computer storage medium in which computer program instructions are stored; when the instructions are run on a computer, the computer is caused to execute the steps of the above three-dimensional color lookup table generation method.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.

Claims (10)

1. A method for generating a three-dimensional color lookup table, the method comprising:
constructing color characteristic data of a display panel, wherein the color characteristic data is used for describing a mapping relation between a first color space and a second color space when the color of the first color space is displayed through the display panel; the first color space is a device-dependent color space, the second color space is a device-independent color space, and the second color space is used for describing a result of displaying colors of the first color space through the display panel;
and constructing a mapping relation between a third color space and the first color space based on a specified mapping relation between the third color space and the second color space to obtain a three-dimensional color look-up table (3D LUT).
2. The method of claim 1, wherein constructing color characterization data for the display panel comprises:
sampling the first color space to obtain a plurality of test points;
outputting each test point in the plurality of test points to the display panel for display, and measuring the color value of the display result of each test point in the second color space;
and constructing the color characteristic data based on the corresponding relation between the color value of each test point in the first color space and the color value of each test point in the second color space.
3. The method of claim 1, wherein the third color space comprises N to-be-processed pixel points, N depending on the requirements of the 3D LUT; the constructing a mapping relationship between a third color space and the first color space based on a specified mapping relationship between the third color space and the second color space to obtain a three-dimensional color look-up table (3D LUT) comprises:
determining the color value of each pixel point to be processed in the second color space based on the designated mapping relation;
for each pixel point to be processed, screening out a pixel point closest to the color value of the pixel point to be processed from the pixel points in the second color space recorded in the color characteristic data as a target pixel point;
determining a pixel point of a first color space corresponding to the target pixel point as a mapping pixel point of the pixel point to be processed based on the color characteristic data;
and constructing a corresponding relation between each pixel point to be processed and the corresponding mapping pixel point to obtain the 3D LUT.
4. The method according to claim 3, wherein the screening out, as a target pixel, a pixel closest to a color value of the pixel to be processed among the pixels in the second color space recorded in the color characteristic data includes:
screening out, from the pixel points of the second color space recorded in the color characteristic data, pixel points whose luminance difference in the second color space from the pixel point to be processed is smaller than a preset luminance threshold, as candidate pixel points;
and screening out the pixel point closest to the color value of the pixel point to be processed as a target pixel point of the pixel point to be processed based on the chromaticity difference of the second color space with the pixel point to be processed from the candidate pixel points.
5. The method of claim 4, wherein the preset brightness threshold is a fixed brightness threshold; or the preset brightness threshold value is positively correlated with the brightness value of the pixel point to be processed.
6. The method of claim 4, wherein the color values of the second color space are described in terms of luma and chroma, the chroma comprising a first chroma component and a second chroma component;
before screening out, from the candidate pixel points and based on the chromaticity difference in the second color space from the pixel point to be processed, a pixel point closest to the color value of the pixel point to be processed as a target pixel point of the pixel point to be processed, the method further comprises:
initializing a residual of the first chrominance component and a residual of the second chrominance component;
the step of screening out a pixel point closest to the color value of the pixel point to be processed as a target pixel point of the pixel point to be processed based on the chromaticity difference of the second color space with the pixel point to be processed from the candidate pixel points comprises the following steps:
traversing each candidate pixel point, and if a first difference value between a first chrominance component of any candidate pixel point and the pixel point to be processed and a second difference value between second chrominance components meet a residual error updating condition, updating a residual error of the first chrominance component and a residual error of the second chrominance component, wherein the residual error updating condition is used for reducing the residual error;
and adopting the candidate pixel point corresponding to the final residual error of the first chrominance component and the residual error of the second chrominance component as the target pixel point.
7. The method of claim 6, wherein the residual update condition comprises any one of the following conditions:
condition 1: |x'-x| < Diff_x, and |y'-y| < Diff_y;
condition 2: |x'-x| < Diff_x, and ||Diff_x| - |x'-x|| > ||Diff_y| - |y'-y||, and ||x'-x| - |y'-y|| < |Diff_x - Diff_y|;
condition 3: |y'-y| < Diff_y, and ||Diff_x| - |x'-x|| < ||Diff_y| - |y'-y||, and ||x'-x| - |y'-y|| < |Diff_x - Diff_y|;
wherein x is the first chrominance component of the pixel point to be processed, x 'is the first chrominance component of the candidate pixel point, Diff _ x is the residual error of the first chrominance component, y is the second chrominance component of the pixel point to be processed, y' is the second chrominance component of the candidate pixel point, and Diff _ y is the residual error of the second chrominance component.
8. The method of claim 2, wherein sampling the first color space comprises:
uniformly sampling the first color space; or,
sampling the first color space according to a specified rule, wherein the specified rule is one or more of: covering the brightest area of the first color space, and covering the darkest area of the first color space.
9. The method of claim 2, wherein constructing the color characterization data based on a correspondence between color values of the test points in the first color space and color values of the test points in the second color space comprises:
for each test point, establishing a corresponding relation between the test point and a color value of a second color space output by the test point through the display panel;
for the pixel points in the first color space other than the test points, determining the color values of the pixel points in the second color space by linear interpolation using the color values of the test points in the second color space; and constructing a corresponding relation between the pixel points and the corresponding color values of the second color space.
10. A display device, comprising: a display panel and a controller;
the display panel is used for displaying images;
the controller configured to perform the method of any of claims 1-9.
CN202110813079.XA 2021-07-19 2021-07-19 Three-dimensional color lookup table generation method and display device Active CN113613007B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110813079.XA CN113613007B (en) 2021-07-19 2021-07-19 Three-dimensional color lookup table generation method and display device


Publications (2)

Publication Number Publication Date
CN113613007A true CN113613007A (en) 2021-11-05
CN113613007B CN113613007B (en) 2024-03-05

Family

ID=78337901

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110813079.XA Active CN113613007B (en) 2021-07-19 2021-07-19 Three-dimensional color lookup table generation method and display device

Country Status (1)

Country Link
CN (1) CN113613007B (en)

Citations (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH02116893A (en) * 1988-10-27 1990-05-01 Nippon I B M Kk Method and device for quantizing color image
CA2153908A1 (en) * 1994-07-15 1996-01-16 Hiroaki Ueda Compact Image Conversion System Operable at a High Speed
CA2348325A1 (en) * 2000-05-23 2001-11-23 Jonathan Martin Shekter System for manipulating noise in digital images
CN1413338A (en) * 1999-09-16 2003-04-23 西尔弗布鲁克研究股份有限公司 Method and apparatus for producing print from Bayer image
EP1389003A2 (en) * 2002-08-08 2004-02-11 Canon Kabushiki Kaisha Color conversion table generation method and apparatus
JP2005295153A (en) * 2004-03-31 2005-10-20 Canon Inc Color conversion device and its method
US20090122075A1 (en) * 2007-11-12 2009-05-14 Samsung Electronics Co., Ltd. Color conversion method and apparatus for display device
US20110116137A1 (en) * 2009-11-17 2011-05-19 Canon Kabushiki Kaisha Image processing apparatus, image processing method, and lookup table generation method
TWI553622B (en) * 2015-05-07 2016-10-11 鈺緯科技開發股份有限公司 Image processing device with image compensation function and image processing method thereof
US20170061926A1 (en) * 2015-09-02 2017-03-02 Intel Corporation Color transformation using non-uniformly sampled multi-dimensional lookup table
CN107493464A (en) * 2016-06-10 2017-12-19 汤姆逊许可公司 The method of the source color of the LUT mapping graph pictures of input color outside the active colour gamut of apparatus
CN108701351A (en) * 2016-10-17 2018-10-23 华为技术有限公司 A kind of image shows Enhancement Method and device
CN109492543A (en) * 2018-10-18 2019-03-19 广州市海林电子科技发展有限公司 The small target detecting method and system of infrared image
CN109741279A (en) * 2019-01-04 2019-05-10 Oppo广东移动通信有限公司 Image saturation method of adjustment, device, storage medium and terminal
CN109859702A (en) * 2018-12-28 2019-06-07 南京奥视威电子科技股份有限公司 A kind of 3D lookup table generating method, display color calibrating method, display color correction system
WO2019221934A1 (en) * 2018-05-16 2019-11-21 Synaptics Incorporated High dynamic range (hdr) data conversion and color space mapping
CN111445427A (en) * 2020-05-20 2020-07-24 青岛信芯微电子科技股份有限公司 Video image processing method and display device
CN112004075A (en) * 2020-09-03 2020-11-27 北京印刷学院 Neighborhood chromatic aberration compensation method and device
CN112118468A (en) * 2020-09-17 2020-12-22 海信视像科技股份有限公司 Method for changing color of peripheral equipment along with color of picture and display equipment

Also Published As

Publication number Publication date
CN113613007B (en) 2024-03-05

Similar Documents

Publication Publication Date Title
CN113630655B (en) Method for changing color of peripheral equipment along with picture color and display equipment
CN112214189B (en) Image display method and display device
CN111899680B (en) Display device and setting method thereof
CN111752518A (en) Screen projection method of display equipment and display equipment
CN112118400B (en) Display method of image on display device and display device
CN112118468A (en) Method for changing color of peripheral equipment along with color of picture and display equipment
CN113240644B (en) Display processing method of high dynamic range image and related device
US20130308055A1 (en) Information processor, information processing method, and computer program product
CN112073788A (en) Video data processing method and device and display equipment
CN113094142A (en) Page display method and display equipment
CN112073774A (en) Image quality processing method and display device
CN112073795B (en) Video data processing method and device and display equipment
CN113518257B (en) Multisystem screen projection processing method and equipment
CN112289271B (en) Display device and dimming mode switching method
CN112214190A (en) Display equipment resource playing method and display equipment
CN113613007B (en) Three-dimensional color lookup table generation method and display device
CN115547265A (en) Display apparatus and display method
CN112218156B (en) Method for adjusting video dynamic contrast and display equipment
CN114694599B (en) Display device and mura processing method
CN219418469U (en) Correction system
US20190213974A1 (en) Color Matching for Output Devices
CN115119035B (en) Display device, image processing method and device
CN114694552A (en) Display device and display abnormality processing method
CN114913796A (en) Display device and display brightness adjustment method
WO2022105410A1 (en) Display method, and device parameter memory method and recovery method therefor

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant