WO2019023835A1 - A wide color gamut image display method and device - Google Patents
A wide color gamut image display method and device
- Publication number
- WO2019023835A1 · PCT/CN2017/095123 · CN2017095123W
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- color gamut
- image
- displayed
- color
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G5/00—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
- G09G5/02—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed
- G09G5/06—Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the way in which colour is displayed using colour palettes, e.g. look-up tables
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/46—Colour picture communication systems
- H04N1/56—Processing of colour picture signals
- H04N1/60—Colour correction or control
- H04N1/6058—Reduction of colour to a range of reproducible colours, e.g. to ink- reproducible colour gamut
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
- H04N9/67—Circuits for processing colour signals for matrixing
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0666—Adjustment of display parameters for control of colour parameters, e.g. colour temperature
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2340/00—Aspects of display data processing
- G09G2340/06—Colour space transformation
Definitions
- the present application relates to the field of image processing technologies, and in particular, to a wide color gamut image display method and device.
- A color gamut usually refers to a range of colors, that is, the complete set of colors that a given method of defining colors can produce.
- The color gamut of the content displayed by a terminal is usually limited by the terminal's software and hardware. As technology evolves, the range of color gamuts that terminals can support keeps growing.
- Commonly used color gamuts mainly include DisplayP3, AdobeRGB, and sRGB (standard Red Green Blue).
- Wide color gamut (WCG) is a general term for gamuts larger than sRGB. The positions of these three gamuts in the CIE 1931 chromaticity diagram can be seen in Figure 1.
- sRGB is a standard way of defining colors proposed by Microsoft together with Epson, HP, and others, so that computer peripherals and applications such as displays, printers, and scanners have a common language for color.
- DisplayP3 and AdobeRGB can support a larger color range than sRGB.
- When displaying an image, the terminal usually maps images in other color gamuts to the color gamut it supports.
- For example, a device supporting DCI-P3 may be used to display a DisplayP3 image.
- DisplayP3 covers the same color gamut as DCI-P3; the only difference between them is the gamma.
- A comparison of DCI-P3 and ARGB (AdobeRGB) in gamut space is shown in Figure 2: neither gamut completely contains the other. As a result, when a screen supporting the DisplayP3 gamut displays an ARGB image, or a screen supporting the ARGB gamut displays a DisplayP3 image, some colors in the image cannot be displayed correctly on the screen.
- Embodiments of the present application provide a wide color gamut image display method and device, so as to reduce color loss when performing gamut mapping.
- In a first aspect, an embodiment of the present application provides a wide color gamut image display method.
- The method is applicable to a terminal that supports a first color gamut (for example, a terminal that supports DCI-P3). The method includes: determining an image to be displayed, where the color gamut of the image to be displayed is a second color gamut and the second color gamut includes a wide color gamut (for example, the second color gamut is the ARGB color gamut); mapping the image to be displayed from the second color gamut to a third color gamut, where the color range of the third color gamut covers the color ranges of both the first color gamut and the second color gamut; mapping the image to be displayed from the third color gamut to the first color gamut; and displaying the image to be displayed that has been mapped to the first color gamut.
- With the embodiments of the present application, for an image that requires gamut conversion before it can be displayed, the image's gamut is first mapped to a gamut (the third color gamut, also called the super gamut) that completely contains both the gamut to be converted to (the first color gamut) and the image's own gamut (the second color gamut), and the image is then mapped from that wide gamut to the gamut it needs to be converted to. This reduces the color loss caused by gamuts that do not contain each other, so that the image is displayed with higher color fidelity, or with stronger color enhancement.
- the second color gamut includes one or more of sRGB, ARGB, and DisplayP3.
- The embodiments of the present application can support conversion among the three mainstream color gamuts, which makes the approach widely applicable.
- In another optional implementation, the third color gamut is:
R = (0.68, 0.32);
G = (0.2125, 0.7368);
B = (0.15, 0.06);
W = (0.3127, 0.3290);
Gamma = sRGB. The gamma can alternatively take a value of 2.2, among other values.
- The white point of the third color gamut can be chosen as D65, D75, or another white point. With the embodiments of the present application, the third color gamut completely encloses P3, sRGB, and ARGB; its area can of course be defined to be even larger.
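- The following is a minimal illustrative sketch (not part of the patent): it shows how the RGB-to-XYZ matrix of this third color gamut (the "super gamut") can be derived from the primaries and white point listed above, using the standard colorimetric construction. The function names are chosen here for illustration only.

```python
# Minimal illustrative sketch (not part of the patent): derive the RGB-to-XYZ matrix of
# the third color gamut ("super gamut") from its primaries and white point using the
# standard colorimetric construction.
import numpy as np

def xy_to_XYZ(x, y, Y=1.0):
    """Convert a chromaticity coordinate (x, y) with luminance Y to an XYZ tristimulus vector."""
    return np.array([x * Y / y, Y, (1.0 - x - y) * Y / y])

def rgb_to_xyz_matrix(r_xy, g_xy, b_xy, w_xy):
    """3x3 matrix mapping linear RGB of a gamut (given by primaries and white point) to XYZ."""
    primaries = np.column_stack([xy_to_XYZ(*r_xy), xy_to_XYZ(*g_xy), xy_to_XYZ(*b_xy)])
    white = xy_to_XYZ(*w_xy)                    # white point, normalized to Y = 1
    scale = np.linalg.solve(primaries, white)   # per-primary scaling so that R = G = B = 1 maps to white
    return primaries * scale                    # scale each column by its primary's weight

# The third color gamut ("super gamut") defined above, with a D65 white point.
M_super = rgb_to_xyz_matrix((0.68, 0.32), (0.2125, 0.7368), (0.15, 0.06), (0.3127, 0.3290))
M_super_inv = np.linalg.inv(M_super)            # XYZ -> super-gamut linear RGB
```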
- In yet another optional implementation, mapping the image to be displayed from the third color gamut to the first color gamut includes: performing color enhancement or color restoration on the image to be displayed that has been mapped to the third color gamut, and then mapping the color-enhanced or color-restored image from the third color gamut to the first color gamut.
- The embodiments of the present application thus provide both a color-reproduction mode and a color-enhancement mode; color enhancement can be used by default, giving a better display effect.
- In a further optional implementation, the image to be displayed includes application data whose color gamut is the second color gamut. In this case, mapping the image to be displayed from the second color gamut to the third color gamut and mapping it from the third color gamut to the first color gamut include: mapping the application data of the image to be displayed from the second color gamut to the third color gamut, and then mapping the application data of the image to be displayed from the third color gamut to the first color gamut.
- With this embodiment, the gamut conversion can be performed on the application data only, which keeps the amount of computation small; since the operating system data generally corresponds to white, skipping its conversion has very little effect on the display of the converted image.
- In a further optional implementation, the image to be displayed also includes operating system data, and the operating system data is in the third color gamut. Before the image to be displayed that has been mapped to the first color gamut is displayed, the method further includes: mapping the operating system data of the image to be displayed from the third color gamut to the first color gamut.
- In a further optional implementation, the application data includes a content source whose color gamut is the second color gamut. In this case, mapping the image to be displayed from the second color gamut to the third color gamut and mapping it from the third color gamut to the first color gamut include: mapping the content source of the image to be displayed from the second color gamut to the third color gamut, and then mapping the content source of the image to be displayed from the third color gamut to the first color gamut.
- With this embodiment, the gamut conversion can be performed only on the content displayed by the application; for example, for a gallery application, only the images in the gallery need to be gamut-converted, which keeps the amount of computation small without affecting the display effect.
- In a second aspect, an embodiment of the present application provides a terminal.
- The terminal has the functions of implementing the terminal behavior in the method of the first aspect and its optional implementations. These functions can be implemented by hardware, or by hardware executing corresponding software.
- the hardware or software includes one or more units corresponding to the functions described above.
- the terminal includes a display, a processor, and a memory
- the display is for displaying the processed image
- The memory is used to store data and programs.
- When the terminal runs, the processor executes the computer-executable instructions stored in the memory, so that the terminal performs the image display method of the first aspect and its various optional implementations.
- In a third aspect, an embodiment of the present application provides a computer-readable storage medium including computer-readable instructions.
- When a computer reads and executes the computer-readable instructions, the computer is caused to perform the method of the first aspect and its optional implementations.
- In a fourth aspect, an embodiment of the present application provides a computer program product including computer-readable instructions; when a computer reads and executes the computer-readable instructions, the computer is caused to perform the method of the first aspect and its optional implementations.
- Figure 1 is a schematic view showing the position of a color gamut standard in a CIE1931 chromaticity diagram
- Figure 2 is a comparison of color gamut standards in gamut space
- FIG. 3 is a schematic structural diagram of a part of a mobile phone according to an embodiment of the present disclosure
- FIG. 4 is a flowchart of a method for displaying a wide color gamut image according to an embodiment of the present application
- FIG. 5 is a terminal for displaying an image according to an embodiment of the present application.
- FIG. 6 is a schematic structural diagram of a color characteristic file according to an embodiment of the present invention.
- FIG. 7 is a schematic structural diagram of a part of a GUI system of Android
- FIG. 8 is a schematic flow chart of a wide color gamut image display method
- FIG. 9 is a flowchart of another wide color gamut image display method according to an embodiment of the present application.
- FIG. 10 is an example provided by an embodiment of the present application.
- Figure 11 is another example provided by the embodiment of the present application.
- FIG. 12 is a schematic structural diagram of an image display apparatus according to an embodiment of the present application.
- Embodiments of the present invention provide a wide color gamut image display method and terminal.
- An image whose gamut differs from the wide color gamut supported by the terminal is mapped to a wide color gamut that completely encompasses both the terminal-supported wide gamut and the image's own gamut; the image is then mapped, with screen correction, to the wide color gamut supported by the terminal so that the terminal can display it.
- This enables a wide color gamut display that supports multiple color gamut images with no color loss.
- the terminal involved in the embodiment of the present invention may include a mobile phone, a tablet computer, a PDA (Personal Digital Assistant), a POS (Point of Sales), a vehicle-mounted computer, and the like.
- the terminal includes at least a memory, a processor, and a display, the terminal supporting a first color gamut.
- The memory is used to store data and programs, and the processor can run the programs and process the data stored in the memory.
- the processor is configured to determine an image to be displayed, and map the image to be displayed from the second color gamut to the third color gamut, and then map the image to be displayed from the third color gamut to the first color gamut.
- the color gamut of the image to be displayed is a second color gamut, and the second color gamut includes a wide color gamut; and the color range of the third color gamut includes a color range of the first color gamut and the second color gamut.
- The display is configured to display images, for example, to display the image to be displayed after it has been mapped to the first color gamut.
- The processor is further configured to perform color enhancement or color restoration on the image to be displayed that has been mapped to the third color gamut, and to map the color-enhanced or color-restored image from the third color gamut to the first color gamut.
- the image to be displayed includes data of an application
- the color gamut of the data of the application is a second color gamut
- the processor is further configured to map data of the application of the image to be displayed from the second color gamut to a third color gamut; data of an application of the image to be displayed is mapped from the third color gamut to the first color gamut.
- the image to be displayed further includes data of an operating system, and the data of the operating system is a third color gamut;
- the processor is further configured to map the data of the operating system of the image to be displayed from the third color gamut to the first color gamut.
- the data of the application includes a content source, where the content source is a second color gamut
- The processor is further configured to map the content source of the image to be displayed from the second color gamut to the third color gamut, and then map the content source of the image to be displayed from the third color gamut to the first color gamut.
- FIG. 3 is a schematic structural diagram of a part of a mobile phone according to an embodiment of the present application.
- The mobile phone 100 includes an RF (Radio Frequency) circuit 110, a memory 120, other input devices 130, a display screen 140, a sensor 150, an audio circuit 160, an I/O subsystem 170, a processor 180, a power supply 190, and other components.
- Those skilled in the art will understand that the mobile phone structure shown in FIG. 3 does not constitute a limitation on the mobile phone; the mobile phone may include more or fewer components than illustrated, combine some components, split some components, or use a different arrangement of components.
- Those skilled in the art will also understand that the display screen 140 belongs to the user interface (UI, User Interface), and that the handset 100 may include fewer user interface elements than illustrated.
- the components of the mobile phone 100 will be specifically described below with reference to FIG. 3:
- The RF circuit 110 can be used for receiving and transmitting signals during the sending and receiving of information or during a call. In particular, downlink information from the base station is received and handed to the processor 180 for processing, and uplink data is sent to the base station.
- RF circuits include, but are not limited to, an antenna, at least one amplifier, a transceiver, a coupler, an LNA (Low Noise Amplifier), a duplexer, and the like.
- RF circuitry 110 can also communicate with the network and other devices via wireless communication.
- the wireless communication may use any communication standard or protocol, including but not limited to GSM (Global System of Mobile communication), GPRS (General Packet Radio Service), CDMA (Code Division Multiple Access). , Code Division Multiple Access), WCDMA (Wideband Code Division Multiple Access), LTE (Long Term Evolution), e-mail, SMS (Short Messaging Service), and the like.
- the memory 120 can be used to store software programs and modules, and the processor 180 executes various functional applications and data processing of the mobile phone 100 by running software programs and modules stored in the memory 120.
- The memory 120 may mainly include a program storage area and a data storage area. The program storage area may store the operating system and the application programs required for at least one function (such as a sound playback function or an image playback function); the data storage area may store data created according to the use of the mobile phone 100 (such as audio data and a phone book).
- In addition, the memory 120 can include high-speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
- Other input devices 130 can be used to receive input numeric or character information, as well as generate key signal inputs related to user settings and function controls of the handset 100.
- Specifically, the other input devices 130 may include, but are not limited to, one or more of a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, a joystick, and a light mouse (a light mouse is a touch-sensitive surface that does not display visual output, or an extension of the touch-sensitive surface formed by a touch screen).
- Other input devices 130 are coupled to other input device controllers 171 of I/O subsystem 170 for signal interaction with processor 180 under the control of other device input controllers 171.
- the display screen 140 can be used to display information input by the user or information provided to the user as well as various menus of the mobile phone 100, and can also accept user input.
- the specific display screen 140 may include a display panel 141 and a touch panel 142.
- the display panel 141 can be configured by using an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), or the like.
- The touch panel 142, also referred to as a touch screen or touch-sensitive screen, can collect the user's touch or proximity operations on or near it (for example, operations performed by the user with a finger, a stylus, or any other suitable object or accessory on or near the touch panel 142, which may also include somatosensory operations; the operations include single-point and multi-point control operations, among other types), and drive the corresponding connection device according to a preset program.
- Optionally, the touch panel 142 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position and gesture, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into information that the processor can handle, sends it to the processor 180, and can receive and execute commands sent by the processor 180.
- the touch panel 142 can be implemented by using various types such as resistive, capacitive, infrared, and surface acoustic waves, and the touch panel 142 can be implemented by any technology developed in the future.
- Further, the touch panel 142 can cover the display panel 141. The user can, based on the content displayed on the display panel 141 (the displayed content includes, but is not limited to, a soft keyboard, a virtual mouse, virtual keys, icons, and so on), perform operations on or near the touch panel 142 that covers the display panel 141.
- After detecting an operation on or near it, the touch panel 142 passes the operation to the processor 180 through the I/O subsystem 170 to determine the user input, and the processor 180 then provides the corresponding visual output on the display panel 141 through the I/O subsystem 170 according to the user input.
- Although in FIG. 3 the touch panel 142 and the display panel 141 are shown as two separate components implementing the input and output functions of the mobile phone 100, in some embodiments the touch panel 142 may be integrated with the display panel 141 to implement the input and output functions of the mobile phone 100.
- the handset 100 can also include at least one type of sensor 150, such as a light sensor, motion sensor, and other sensors.
- The light sensor may include an ambient light sensor and a proximity sensor; the ambient light sensor can adjust the brightness of the display panel 141 according to the ambient light, and the proximity sensor can turn off the display panel 141 and/or the backlight when the mobile phone 100 is moved to the ear.
- As one kind of motion sensor, the accelerometer can detect the magnitude of acceleration in all directions (generally along three axes) and, when stationary, the magnitude and direction of gravity. It can be used in applications that recognize the phone's attitude (such as landscape/portrait switching, related games, and magnetometer attitude calibration) and in vibration-recognition functions (such as a pedometer or tap detection).
- The mobile phone 100 can also be configured with a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and other sensors, which are not described in detail here.
- the audio circuit 160, the speaker 161, and the microphone 162 can provide an audio interface between the user and the handset 100.
- The audio circuit 160 can convert received audio data into a signal and transmit it to the speaker 161, which converts it into an audible sound output; conversely, the microphone 162 converts the collected sound into an electrical signal, which the audio circuit 160 receives and converts into audio data.
- The audio data is then output to the RF circuit 110 for transmission to, for example, another mobile phone, or output to the memory 120 for further processing.
- the I/O subsystem 170 is used to control external devices for input and output, and may include other device input controllers 171, sensor controllers 172, and display controllers 173.
- one or more other input control device controllers 171 receive signals from other input devices 130 and/or send signals to other input devices 130.
- Other input devices 130 may include physical buttons (press buttons, rocker buttons, etc.) , dial, slide switch, joystick, click wheel, light mouse (light mouse is a touch-sensitive surface that does not display visual output, or an extension of a touch-sensitive surface formed by a touch screen). It is worth noting that other input control device controllers 171 can be connected to any one or more of the above devices.
- Display controller 173 in I/O subsystem 170 receives signals from display 140 and/or transmits signals to display 140. After the display 140 detects the user input, the display controller 173 converts the detected user input into an interaction with the user interface object displayed on the display screen 140, ie, implements human-computer interaction.
- The sensor controller 172 can receive signals from one or more sensors 150 and/or send signals to one or more sensors 150.
- The processor 180 is the control center of the handset 100. It connects the various parts of the entire handset through various interfaces and lines, and it performs the various functions of the mobile phone 100 and processes data by running or executing the software programs and/or modules stored in the memory 120 and invoking the data stored in the memory 120, thereby monitoring the mobile phone as a whole.
- the processor 180 may include one or more processing units; preferably, the processor 180 may integrate an application processor and a modem processor, where the application processor mainly processes an operating system, a user interface, an application, and the like.
- the modem processor primarily handles wireless communications. It can be understood that the above modem processor may not be integrated into the processor 180.
- the handset 100 also includes a power source 190 (such as a battery) that supplies power to the various components.
- the power source can be logically coupled to the processor 180 via a power management system to manage functions such as charging, discharging, and power consumption through the power management system.
- the mobile phone 100 may further include a camera, a Bluetooth module, and the like, and details are not described herein.
- FIG. 4 is a flowchart of a method for displaying a wide color gamut image according to an embodiment of the present application.
- the method is applicable to a terminal that supports a first color gamut. As shown in FIG. 4, the method specifically includes:
- S410: Determine an image to be displayed, where the color gamut of the image to be displayed is a second color gamut, and the second color gamut includes a wide color gamut.
- the image to be displayed includes application data (APP buffer), and may also include operating system data (status bar or system bar) and the like.
- the data of the application may refer to an image in a gallery, a web page in a browser, or a user interface of other applications, etc.
- the data of the operating system may refer to an operating system's status bar, display controls, and the like.
- FIG. 5 is a terminal for displaying an image according to an embodiment of the present application.
- the image displayed by the terminal includes a system window, an APP window, and the like.
- FIG. 5 shows a fully rendered image. From the perspective of the graphical user interface (GUI), application data is generally displayed through an application window (Application Window), and in some cases through an application sub-window (Sub Window).
- the data of the operating system is generally displayed through the system window.
- Specifically, the application windows include all windows created by the application itself, as well as the windows that the system is responsible for displaying before the application starts.
- A sub-window can be, for example, an application-defined dialog box or an input-method window; a sub-window is attached to a particular application window (they are set with the same token).
- A system window (System Window) is generally created for the system and does not belong to any application window, for example the status bar (Status Bar), the navigation bar (Navigation Bar), the wallpaper (Wallpaper), the incoming-call window (Phone), the lock screen (KeyGuard), the toast window (Toast), the volume adjustment window, the mouse cursor, and so on.
- the application window, the application sub-window, and the system window that need to be displayed on the terminal constitute an image to be displayed.
- the color gamut of the image to be displayed may refer to the color gamut of the data of the application.
- the color gamut of the data of the operating system is generally preset, and the color gamut of the operating system of the same version is generally fixed.
- the color gamut of the data of the operating system may be a fourth color gamut different from the first color gamut, the second color gamut, and the third color gamut, or may be the second color gamut or the third color gamut.
- the color gamut of the operating system's data can be sRGB.
- In addition, the application data may include APP display controls and a content source. The APP display controls and the content source may correspond to one or more layers; for example, the APP display controls and the content source may correspond to different layers, and the content source itself may correspond to one or more layers.
- The one or more layers corresponding to the APP display controls and the content source may involve one or more color gamuts: each layer may correspond to a different color gamut, or all layers or some of the layers may correspond to the same color gamut.
- For ease of description, the color gamuts corresponding to the application data are collectively referred to as the second color gamut, although the second color gamut may refer to one or more different color gamuts.
- In another implementation, the color gamut of the image to be displayed may refer to all color gamuts corresponding to the image to be displayed, that is, both the color gamuts of the application data and the color gamut of the operating system data; all of these gamuts are collectively referred to as the second color gamut.
- As shown in FIG. 5, the image to be displayed includes operating system data and application data. The operating system data includes the data corresponding to the status bar 511 and the navigation bar 512. The application data includes the data corresponding to the APP display control 521, the APP display control 523, and the content 522 (shown in FIG. 5 as a photo in the gallery), and may further include preview images of other photos (not shown). The photo in the gallery is the content source; the photo to be displayed, the APP display controls, and the preview images of other photos belong to different layers, and each layer may correspond to a different color gamut. The APP display controls may be set to the third color gamut, while the photo's color gamut can vary: an image downloaded from the network is generally in the sRGB gamut, an image taken with an SLR camera is generally in the ARGB gamut, an image captured by a device supporting the Display-P3 gamut is generally Display-P3, and so on.
- S420: Map the image to be displayed from the second color gamut to the third color gamut, where the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut. The third color gamut may also be called the super gamut.
- In one implementation, when converting the color gamut of the image to be displayed, only the color gamut of the application data may be mapped to the third color gamut. For example, the image to be displayed may include the graphical user interface (GUI) of an application; in that case the operating system data generally corresponds to white (for instance, the system status bar is generally white), so the effect is the same before and after conversion.
- In another implementation, only the color gamut of the operating system data may be mapped to the third color gamut. For example, when the terminal displays the operating system's desktop or lock-screen window, only the color gamut of the desktop wallpaper or screen saver may be converted; the application icons shown on the desktop may also be gamut-converted.
- In yet another implementation, both the operating system data and the application data may be converted when the color gamut of the image to be displayed is converted.
- In addition, since the image to be displayed is organized in layers, the gamut conversion can be performed layer by layer. For example, the format of the data corresponding to a layer may be recognized, and when the format is recognized as a picture format such as JPG, JPEG, or SVG, the color gamut of that layer is converted.
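- As a small illustrative sketch of this per-layer dispatch (the layer representation and the to_super_gamut() callable below are assumptions made for illustration, not structures defined by the patent):

```python
# Illustrative sketch only: per-layer gamut conversion dispatch. The layer representation
# and the to_super_gamut() callable are hypothetical, not structures defined by the patent.
PICTURE_FORMATS = {"JPG", "JPEG", "SVG"}

def convert_layers(layers, to_super_gamut):
    """layers: list of dicts like {"format": "JPEG", "gamut": "ARGB", "pixels": ...}.
    to_super_gamut(pixels, src_gamut): maps a layer from its own gamut to the third gamut."""
    for layer in layers:
        if layer.get("format", "").upper() in PICTURE_FORMATS:   # only recognized picture layers
            layer["pixels"] = to_super_gamut(layer["pixels"], layer["gamut"])
            layer["gamut"] = "super_gamut"
    return layers
```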
- FIG. 6 is a schematic structural diagram of a color characteristic file according to an embodiment of the present invention.
- As shown in FIG. 6, the image to be displayed includes an International Color Consortium profile (ICC profile), and the ICC profile includes the parameters for mapping from the second color gamut to the third color gamut.
- Taking ARGB as the second color gamut as an example, the tags redTRCTag, greenTRCTag, and blueTRCTag, as well as redMatrixColumnTag, greenMatrixColumnTag, and blueMatrixColumnTag, are read from the ICC profile.
- The image is then converted to the third color gamut by degamma correction, a 3×3 matrix transform, and gamma correction.
- The parameters required for the degamma and gamma corrections can be parsed from redTRCTag, greenTRCTag, and blueTRCTag, and the parameters of the 3×3 matrix can be parsed from redMatrixColumnTag, greenMatrixColumnTag, and blueMatrixColumnTag.
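- The following sketch illustrates the degamma → 3×3 matrix → gamma chain for a single pixel. In practice the transfer curves and matrix come from the ICC-profile tags listed above; the sRGB transfer function used below is only a stand-in for those curves, an assumption made for illustration.

```python
# Illustrative sketch of the degamma -> 3x3 matrix -> gamma chain for one pixel. In
# practice the transfer curves and matrix are parsed from the ICC profile tags named
# above; the sRGB transfer function below is only a stand-in for those curves.
import numpy as np

def srgb_decode(v):   # degamma correction: non-linear [0, 1] -> linear [0, 1]
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def srgb_encode(v):   # gamma correction: linear [0, 1] -> non-linear [0, 1]
    v = np.clip(np.asarray(v, dtype=float), 0.0, 1.0)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

def convert_pixel(rgb_nonlinear, matrix_3x3):
    """Map one pixel between gamuts: degamma, 3x3 linear transform, then gamma."""
    linear_src = srgb_decode(rgb_nonlinear)   # degamma correction
    linear_dst = matrix_3x3 @ linear_src      # 3x3 matrix transform in linear light
    return srgb_encode(linear_dst)            # gamma correction
```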
- The 3×3 matrix that maps the ARGB gamut to the third color gamut is M4 = M1 × M2, where M2 is the matrix that maps ARGB to XYZ space and M1 is the matrix that maps XYZ space to the third color gamut.
- Similarly, the 3×3 matrix that maps the Display-P3 gamut to the third color gamut is M5 = M1 × M3, where M3 is the matrix that maps the Display-P3 gamut to XYZ space.
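- A short sketch of this composition is given below, reusing rgb_to_xyz_matrix() and M_super from the earlier sketch. The AdobeRGB and Display-P3 primaries used here are the commonly published values and are not taken from the patent text.

```python
# Sketch of composing M4 = M1 x M2 and M5 = M1 x M3, reusing rgb_to_xyz_matrix() and
# M_super from the earlier sketch. The AdobeRGB and Display-P3 primaries used here are
# the commonly published values; they are not taken from the patent text.
import numpy as np

M2 = rgb_to_xyz_matrix((0.64, 0.33), (0.21, 0.71), (0.15, 0.06), (0.3127, 0.3290))        # ARGB -> XYZ
M3 = rgb_to_xyz_matrix((0.680, 0.320), (0.265, 0.690), (0.150, 0.060), (0.3127, 0.3290))  # Display-P3 -> XYZ
M1 = np.linalg.inv(M_super)   # XYZ -> third color gamut (super gamut)

M4 = M1 @ M2   # ARGB linear RGB       -> super-gamut linear RGB
M5 = M1 @ M3   # Display-P3 linear RGB -> super-gamut linear RGB
```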
- In one implementation of the third color gamut, the color range of the third color gamut contains the color ranges of the Display-P3, sRGB, and ARGB gamuts. For example, the third color gamut may take the values:
R = (0.68, 0.32);
G = (0.2125, 0.7368);
B = (0.15, 0.06);
W = (0.3127, 0.3290);
Gamma = sRGB.
- S430: Map the image to be displayed from the third color gamut to the first color gamut, and display it.
- The image to be displayed can be mapped to the first color gamut by degamma correction, color gamut conversion, and gamma correction, and displayed after screen correction.
- The color gamut conversion can use a three-dimensional lookup table (3D LUT) to achieve color reproduction or color enhancement.
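- A minimal sketch of a 3D LUT lookup with trilinear interpolation follows, as one way such a color-reproduction or color-enhancement step could be implemented; the LUT layout is an assumption made for illustration.

```python
# Minimal sketch of a 3D LUT lookup with trilinear interpolation; `lut` is assumed to be
# an N x N x N x 3 table indexed by (r, g, b) in [0, 1].
import numpy as np

def apply_3d_lut(rgb, lut):
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0                              # fractional position inside the LUT cell
    out = np.zeros(3)
    for dr in (0, 1):                         # trilinear interpolation over the 8 corners
        for dg in (0, 1):
            for db in (0, 1):
                w = (f[0] if dr else 1 - f[0]) * (f[1] if dg else 1 - f[1]) * (f[2] if db else 1 - f[2])
                out += w * lut[i1[0] if dr else i0[0], i1[1] if dg else i0[1], i1[2] if db else i0[2]]
    return out
```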
- By default, the system displays the color-enhanced image to be displayed; alternatively, according to the user's setting, it may display the image to be displayed after the color enhancement or color reproduction corresponding to that setting.
- Taking the Android architecture as an example, Android's GUI system includes the following parts: the window and view system (Window and View Manager System), the display composition system (SurfaceFlinger), the input system (InputManager System), and the application framework system (Activity Manager System).
- The embodiments of the present invention are primarily applicable to the display composition system.
- As shown in FIG. 7, in the Android operating system SurfaceFlinger is an independent system. It takes the surfaces of all windows as input and, based on parameters such as z-order (a term describing stacking order), transparency, size, and position, computes the position of each surface in the final composited image; it then hands the result to HWComposer or OpenGL to generate the final display buffer, which is output to a specific display device.
- HWComposer defines a set of HAL-layer interfaces that each chip vendor implements according to its hardware characteristics. Its main job is to composite the layers whose display parameters have been computed by SurfaceFlinger into a single display buffer. Note that SurfaceFlinger is not HWComposer's only input: some surfaces are not managed by Android's WindowManager. For example, the camera's preview buffer can be written directly by hardware and then used as one of HWComposer's inputs for the final composition together with SurfaceFlinger's output.
- OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL 3D graphics API.
- OpenGL is a 2D/3D graphics library that requires support from the underlying hardware (GPU) and drivers.
- Applications (such as a gallery APP) can also call OpenGL functions directly to implement complex graphical interfaces.
- EGL is the interface between OpenGL ES and SurfaceFlinger.
- Display is an abstraction of Android's output display device.
- The traditional Display device is the LCD screen on the terminal, but other external display devices, such as HDMI and Wi-Fi Display, can also be supported.
- The input of a Display is the set of surfaces of all windows filtered according to the LayerStack value mentioned above.
- The output is a buffer with the same size as the display device.
- This buffer is finally sent to the hardware framebuffer (FB) device, an HDMI device, or a remote Wi-Fi Display sink device for display. SurfaceFlinger, OpenGL, and HWComposer sit on this path from input to output.
- FIG. 8 is a schematic flow chart of a wide color gamut image display method.
- As shown in FIG. 8, in that method the terminal maps the color gamuts of the application data and the operating system data to the Display-P3 color gamut, and then maps them from the Display-P3 color gamut to the DCI-P3 color gamut for the terminal to display.
- It is easy to see that once the ARGB color gamut has been converted to the Display-P3 color gamut, color loss occurs.
- FIG. 9 is a schematic flowchart diagram of a method for displaying a wide color gamut image according to an embodiment of the present application.
- As shown in FIG. 9, in SurfaceFlinger, the application data and the operating system data are received.
- The application data is in the ARGB color gamut, and the operating system data is in the sRGB color gamut.
- The color gamuts of the application data and the operating system data are mapped to the super gamut (D65) by degamma correction, a 3×3 matrix transform, and gamma correction (the super gamut here is an example of the third color gamut described in the embodiments above). The parameters used for mapping to the super gamut (D65) can be obtained from the ICC profile descriptions of the application data and the operating system data.
- Here, D65 is a white point; the super gamut can also use other white points, for example D75. The super gamut generally refers to the third color gamut.
- The color gamuts of the application data and the operating system data are then mapped from the super gamut (D65) to the color gamut of the display (DCI-P3) for display.
- Mapping from the super gamut (D65) to DCI-P3 requires degamma correction, an interpolation step, and gamma correction; in the interpolation step, either color reproduction or color enhancement can be achieved.
- When the color gamuts of the application data and the operating system data are mapped from the super gamut (D65) to the color gamut of the display, it is generally also necessary to perform processing such as screen color correction and color temperature adjustment, because the ideal color gamut generally deviates from the actual color gamut of the screen.
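- As an illustrative sketch only (this simple per-panel correction is an assumption, not the patent's prescribed method), such a screen color correction with color-temperature adjustment could look like the following:

```python
# Illustrative sketch only (an assumption, not the patent's prescribed method): a simple
# screen color correction via a 3x3 correction matrix plus per-channel white-point gains
# for color-temperature adjustment.
import numpy as np

def screen_correct(linear_rgb, panel_to_xyz, ideal_to_xyz, wp_gains=(1.0, 1.0, 1.0)):
    """panel_to_xyz: measured RGB->XYZ matrix of the real panel; ideal_to_xyz: RGB->XYZ
    matrix of the ideal target gamut. Returns drive values for the real panel."""
    correction = np.linalg.inv(panel_to_xyz) @ ideal_to_xyz
    corrected = correction @ np.asarray(linear_rgb, dtype=float)
    return np.clip(corrected * np.asarray(wp_gains, dtype=float), 0.0, 1.0)
```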
- For example, in the direct mapping path, the ARGB non-linear value (255, 235, 217) is degamma-corrected and transformed with a 3×3 matrix to obtain the DisplayP3 linear value (261.2939, 214.9968, 181.5335); the out-of-range red component is then directly clipped to give (255, 214.9968, 181.5335), and after the gamma operation the DisplayP3 non-linear value is (255, 237, 219).
- In the super-gamut path, the ARGB non-linear value (255, 235, 217) undergoes a degamma operation, a 3×3 matrix transform, and a gamma operation to map it to the super gamut (D65); it then undergoes a degamma operation, a 3D LUT transform, and a gamma operation from the super gamut (D65), resulting in the DisplayP3 non-linear value (255, 234, 217).
- The color differences of the two results, computed according to DeltaE2000, are 1.8679 and 0.6058, respectively.
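- The following sketch reproduces the structure of this comparison, reusing the helpers from the earlier sketches (srgb_decode/srgb_encode, rgb_to_xyz_matrix, M2, M3, M4, M_super, apply_3d_lut). The 3D LUT here is built from a plain matrix, so the exact numbers quoted above, which come from the patent's own matrices and LUT, will not be reproduced.

```python
# Sketch reproducing the structure of the comparison above, reusing helpers from the
# earlier sketches. The LUT below is built from a plain matrix for illustration only.
import numpy as np

argb_nl = np.array([255, 235, 217]) / 255.0
M_argb_to_p3 = np.linalg.inv(M3) @ M2           # ARGB linear RGB       -> Display-P3 linear RGB
M_super_to_p3 = np.linalg.inv(M3) @ M_super     # super-gamut linear RGB -> Display-P3 linear RGB

# Path 1: direct ARGB -> DisplayP3, with out-of-range components simply clipped.
direct = srgb_encode(np.clip(M_argb_to_p3 @ srgb_decode(argb_nl), 0.0, 1.0)) * 255

# Path 2: ARGB -> super gamut -> DisplayP3, with a 3D LUT in the second stage.
n = 17                                           # small illustrative LUT resolution
grid = np.stack(np.meshgrid(*[np.linspace(0, 1, n)] * 3, indexing="ij"), axis=-1)
lut = np.clip(grid @ M_super_to_p3.T, 0.0, 1.0)  # LUT entries: super-gamut RGB -> P3 RGB
super_nl = srgb_encode(np.clip(M4 @ srgb_decode(argb_nl), 0.0, 1.0))
via_super = srgb_encode(apply_3d_lut(srgb_decode(super_nl), lut)) * 255

# The example above reports DeltaE2000 differences of 1.8679 (direct) vs 0.6058 (via super gamut).
```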
- FIG. 11 shows some simulation experiments of gamut mapping via the super gamut. According to these results, performing the color gamut conversion by first mapping to the super gamut yields a smaller color difference and less color loss.
- FIG. 12 is a schematic structural diagram of an image display apparatus according to an embodiment of the present application. As shown in FIG. 12, the device supports a first color gamut, and the device includes:
- the determining unit 1201 is configured to determine an image to be displayed, and the color gamut of the image to be displayed is a second color gamut.
- the first mapping unit 1202 is configured to map the image to be displayed from the second color gamut to the third color gamut, wherein the color range of the third color gamut covers the color range of the first color gamut and the second color gamut.
- the second mapping unit 1203 is configured to map the image to be displayed from the third color gamut to the first color gamut.
- the display unit 1204 is configured to display an image to be displayed mapped to the first color gamut.
- the second color gamut includes one or more of sRGB, ARGB, and DisplayP3;
- The third color gamut is: R = (0.68, 0.32); G = (0.2125, 0.7368); B = (0.15, 0.06); W = (0.3127, 0.3290); Gamma = sRGB.
- the device further includes: a third mapping unit 1205, configured to perform color enhancement or color restoration on the image to be displayed mapped to the third color gamut;
- the second mapping unit 1203 is specifically configured to map the image to be displayed after color enhancement or color reproduction from the third color gamut to the first color gamut.
- Optionally, the image to be displayed includes application data whose color gamut is the second color gamut, and the first mapping unit 1202 is specifically configured to map the application data of the image to be displayed from the second color gamut to the third color gamut;
- the second mapping unit 1203 is specifically configured to map data of the application of the image to be displayed from the third color gamut to the first color gamut.
- Optionally, the image to be displayed further includes operating system data, and the operating system data is in the third color gamut; the second mapping unit 1203 is further configured to map the operating system data of the image to be displayed from the third color gamut to the first color gamut.
- the data of the application includes a content source
- the content source is the second color gamut
- the first mapping unit 1202 is specifically configured to map the content source of the image to be displayed from the second color gamut to the third color gamut;
- the second mapping unit 1203 is specifically configured to map the content source of the image to be displayed from the third color gamut to the first color gamut.
- the processor may be a central processing unit (CPU), and may be other general-purpose processors, digital signal processors (DSPs), application specific integrated circuits (ASICs), Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic device, discrete hardware components, etc.
- the general purpose processor may be a microprocessor or the processor or any conventional processor or the like.
- the memory can include read only memory and random access memory, store program code, and provide instructions and data to the processor.
- the communication bus may include a power bus, a control bus, and a status signal bus in addition to the data bus.
- the various buses are labeled as communication buses in the figures.
- each step of the above method may be completed by an integrated logic circuit of hardware in a processor or an instruction in a form of software.
- the steps of the method disclosed in the embodiments of the present invention may be directly implemented as a hardware processor, or may be performed by a combination of hardware and software modules in the processor.
- the software module can be located in a conventional storage medium such as random access memory, flash memory, read only memory, programmable read only memory or electrically erasable programmable memory, registers, and the like.
- the storage medium is located in the memory, and the processor reads the information in the memory and combines the hardware to complete the steps of the above method. To avoid repetition, it will not be described in detail here.
- a component can be, but is not limited to being, a process running on a processor, a processor, an object, an executable, a thread of execution, a program, and/or a computer.
- an application running on a computing device and a computing device can be a component.
- One or more components can reside within a process and/or execution thread, and the components can be located on one computer and/or distributed between two or more computers.
- these components can execute from various computer readable media having various data structures stored thereon.
- These components may, for example, communicate by way of local and/or remote processes based on a signal having one or more data packets (for example, data from one component interacting with another component in a local system or a distributed system, and/or interacting with other systems across a network such as the Internet).
- the embodiment of the present application provides a computer program product comprising instructions for performing the method or step of FIG. 4 above when the instructions are run on a computer.
- An embodiment of the present application provides a computer-readable storage medium for storing instructions which, when executed on a computer, cause the computer to perform the method or steps of FIG. 4 above.
- the present invention may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
- software it may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions.
- When the computer program instructions are loaded and executed on a computer, the processes or functions described in the embodiments of the present invention are produced in whole or in part.
- the computer can be a general purpose computer, a special purpose computer, a computer network, or other programmable device.
- The computer instructions can be stored in a computer-readable storage medium or transferred from one computer-readable medium to another; for example, the computer instructions can be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave).
- The computer-readable storage medium can be any available medium that the computer can access, or a data storage device such as a server or data center that integrates one or more available media.
- the usable medium may be a magnetic medium (eg, a floppy disk, a hard disk, a magnetic tape), an optical medium (eg, a DVD), or a semiconductor medium (eg, a solid state hard disk) or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Controls And Circuits For Display Device (AREA)
- Image Processing (AREA)
Abstract
Embodiments of the present application provide a wide color gamut image display method and terminal. For an image that requires gamut conversion before it can be displayed, the image's color gamut is first mapped to a wide color gamut (a third color gamut) that completely contains both the gamut to be converted to (a first color gamut) and the image's own gamut (a second color gamut), and the image is then mapped from that wide gamut to the gamut it needs to be converted to. This reduces the color loss caused by gamuts that do not contain each other, so that the image is displayed with higher color fidelity, or with stronger color enhancement.
Description
本申请涉及图像处理技术领域,尤其涉及一种广色域图像显示方法及设备。
色域通常是指一个色彩范围,也就是一个定义色彩的方法能够产生的所有颜色的总和。终端显示内容的色域,通常受到终端软件以及硬件的限制。随着技术的不断发展,终端能够支持的色域的范围也在不断的增加。
常用的色域主要包括DisplayP3、AdobeRGB以及sRGB(standard Red Green Blue)三种。广色域(wide color gamut,WCG)是相对于sRGB的一个统称。其中,三种色域在CIE1931色度图的位置可参见图1。sRGB是由微软联合爱普生、惠普等提供一种标准方法来定义色彩,让显示、打印和扫描等各种计算机外部设备与应用软件对于色彩有一个共通的语言。DisplayP3以及AdobeRGB相较于sRGB能够支持更大的色彩范围。终端在显示图像时,通常是将其他色域的图像映射到其支持的色域进行显示,例如,支持DCI-P3的设备用于显示DisplayP3的图像。其中,DisplayP3与DCI-P3色域相同,区别仅为gamma。其中,DCI-P3与ARGB在色域空间中的对比如图2所示。可见,DCI-P3与ARGB并不是完全包含与被包含的关系。
在使用支持DisplayP3色域的屏幕显示ARGB的图像,或者使用ARGB色域的屏幕显示DisplayP3的图像时,由于这两种色域不是相互包含,会使得图像中的部分颜色在屏幕上无法正确显示。
发明内容
本申请实施例提供了一种广色域图像显示方法及设备。以实现在色域映射时,介绍颜色丢失。
本申请实施例提供了一种广色域图像显示方法。该方法适用于终端,该终端支持第一色域(例如,该终端可以支持DCI-P3),该方法包括:确定待显示图像,该待显示图像的色域为第二色域,该第二色域包括广色域(例如,第二色域为ARGB色域);将待显示图像由第二色域映射到第三色域,第三色域的色彩范围涵盖第一色域以及第二色域的色彩范围;将待显示图像由第三色域映射到第一色域,并显示映射到第一色域的待显示图像。通过本申请实施例,对于需要进行色域转换来显示的图像,首先将该图像的色域映射到一个完全包含住需要转换到的色域(第一色域)以及图像自身的色域(第二色域)的广色域(第三色域,或者称为超级色域(super gamut)),然后在将该图像由该广色域映射到需要转换到的色域,从而降低由于色域之间的不相互包含造成的颜色丢失,使得图像的显示效果更好还原度更高,或者色彩增强更强。
在一个可选地实现中,第二色域包括,sRGB、ARGB和DisplayP3中的一种或多种。通过本申请实施例可以支持三种主流色域的转换,普适性强。
在另一个可选的实现中,第三色域为:
R=(0.68,0.32);
G=(0.2125,0.7368);
B=(0.15,0.06);
W=(0.3127,0.3290);
Gamma=sRGB。另外,Gamma还可以取值2.2等等。该第三色域的白点可以选择D65、D75或者其他。通过本申请实施例可以实现第三色域完全包住P3、sRGB以及ARGB,当然区域可以定义的更大。
在又一个可选地实现中,所述将所述待显示图像由所述第三色域映射到第一色域包括:对映射到第三色域的待显示图像进行色彩增强或者色彩还原;将色彩增强或色彩还原后的待显示图像,由所述第三色域映射到所述第一色域。通过本申请实施例可以实现,色彩还原或者色彩增强量中模式,且可以默认为色彩增强,显示效果更好。
在再一个可选地实现中,待显示图像包括应用程序的数据,该应用程序的数据的色域为所述第二色域,所述将所述待显示图像由所述第二色域映射到第三色域,以及将所述待显示图像由所述第三色域映射到第一色域包括:将所述待显示图像的应用程序的数据由所述第二色域映射到所述第三色域;将所述待显示图像的应用程序的数据由所述第三色域映射到所述第一色域。通过本申请实施例可以实现仅对应用程序的数据进行色域的转换,计算量小,由于操作系统的数据一般对应的为白色,所以不做转换对转换后的待显示图像的显示效果影响非常小。
在再一个可选地实现中,所述待显示图像还包括操作系统的数据,所述操作系统的数据为第三色域;在显示映射到第一色域的待显示图像之前,所述方法还包括:将所述待显示图像的操作系统的数据由所述第三色域映射到所述第一色域。
在再一个可选地实现中,所述应用程序的数据包括内容源,所述内容源为所述第二色域,所述将所述待显示图像由所述第二色域映射到第三色域,以及将所述待显示图像由所述第三色域映射到第一色域包括:将所述待显示图像的内容源由所述第二色域映射到所述第三色域;将所述待显示图像的内容源由所述第三色域映射到所述第一色域。通过本申请实施例可以实现仅对应用程序显示的内容进行色域转换,例如,对于图库应用可以仅对图库中的图像进行色域转换,计算量小,且不影响效果。
第二方面,本申请实施例提供了一种终端。该终端具有实现上述第一方面以及其可选地实现中的方法实际中终端的行为的功能。该功能可以通过硬件实现,也可以通过硬件执行相应的软件实现。所述硬件或软件包括一个或多个与上述功能相对应的单元。
在一个可选地实现中,该终端包括:显示器、处理器以及存储器,显示器用于显示处理后的图像,存储器用于用户存储数据和程序。当终端运行时,处理器执行存储器存储的计算机执行指令,以使故障处理设备执行如第一方面以及第一方面的各种可选方式中的图像显示方法。
第三方面,本申请实施例提供了一种计算机可读存储介质,包括计算机可读指令,
当计算机读取并执行所述计算机可读指令时,使得计算机执行前述第一方面以及其可选地实现中的方法。
第四方面,本申请实施例提供了一种计算机程序产品,包括计算机可读指令,当计算机读取并执行所述计算机可读指令,使得计算机执行前述第一方面以及其可选地实现中的方法。
图1为色域标准在CIE1931色度图的位置示意图;
图2为色域标准在色域空间中的对比图;
图3为本申请实施例提供的一种手机的部分结构示意图;
图4为本申请实施例提供的一种广色域图像显示方法流程图;
图5为本申请实施例提供的一种显示图像的终端;
图6为本发明实施例提供的一种色彩特性文件结构示意图;
图7为Android的GUI系统部分结构示意图;
图8为一种广色域图像显示方法的流程示意图;
图9为本申请实施例提供的另一种广色域图像显示方法流程图;
图10为本申请实施例提供的一个示例;
图11为本申请实施例提供的另一个示例;
图12为本申请实施例提供的一种图像显示装置结构示意图。
本发明实施例提供了一种广色域图像显示方法及终端。通过将与终端支持的广色域不同的图像,映射到一个完全包含终端支持的广色域和图像的色域的广色域,再将图像通过屏幕矫正映射到终端支持的广色域,以便终端进行显示。以此可以实现支持多种色域图像的广色域显示,且无颜色丢失。
本发明实施例涉及的终端可以包括手机、平板电脑、PDA(Personal Digital Assistant,个人数字助理)、POS(Point of Sales,销售终端)、车载电脑等。该终端至少包括存储器、处理器以及显示器,该终端支持第一色域。其中,存储器用于用户存储数据和程序,处理器可以运行存储器中存储的数据和程序。具体地,处理器,用于确定待显示图像,并将待显示图像由第二色域映射到第三色域,再将待显示图像由第三色域映射到第一色域。其中,该待显示图像的色域为第二色域,该第二色域包括广色域;第三色域的色彩范围涵盖第一色域以及第二色域的色彩范围。显示器,用于显示图像,例如,显示映射到第一色域的待显示图像。
可选地,处理器还用于,对映射到第三色域的待显示图像进行色彩增强或者色彩还原;将色彩增强或色彩还原后的待显示图像,由所述第三色域映射到所述第一色域。
可选地,待显示图像包括应用程序的数据,该应用程序的数据的色域为第二色域,该处理器还用于,将待显示图像的应用程序的数据由第二色域映射到第三色域;将待显示图像的应用程序的数据由第三色域映射到所述第一色域。
可选地,待显示图像还包括操作系统的数据,该操作系统的数据为第三色域;该
处理器还用于,将待显示图像的操作系统的数据由第三色域映射到第一色域。
可选地,应用程序的数据包括内容源,该内容源为第二色域,该处理器还用于,将待显示图像的内容源由第二色域映射到第三色域;将待显示图像的内容源由第三色域映射到第一色域。
以终端为手机为例,图3为本申请实施例提供的一种手机的部分结构示意图。参考图3,手机100包括、RF(Radio Frequency,射频)电路110、存储器120、其他输入设备130、显示屏140、传感器150、音频电路160、I/O子系统170、处理器180、以及电源190等部件。本领域技术人员可以理解,图2中示出的手机结构并不构成对手机的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。本领领域技术人员可以理解显示屏140属于用户界面(UI,User Interface),且手机100可以包括比图示或者更少的用户界面。
下面结合图3对手机100的各个构成部件进行具体的介绍:
RF电路110可用于收发信息或通话过程中,信号的接收和发送,特别地,将基站的下行信息接收后,给处理器180处理;另外,将设计上行的数据发送给基站。通常,RF电路包括但不限于天线、至少一个放大器、收发信机、耦合器、LNA(Low Noise Amplifier,低噪声放大器)、双工器等。此外,RF电路110还可以通过无线通信与网络和其他设备通信。所述无线通信可以使用任一通信标准或协议,包括但不限于GSM(Global System of Mobile communication,全球移动通讯系统)、GPRS(General Packet Radio Service,通用分组无线服务)、CDMA(Code Division Multiple Access,码分多址)、WCDMA(Wideband Code Division Multiple Access,宽带码分多址)、LTE(Long Term Evolution,长期演进)、电子邮件、SMS(Short Messaging Service,短消息服务)等。
存储器120可用于存储软件程序以及模块,处理器180通过运行存储在存储器120的软件程序以及模块,从而执行手机100的各种功能应用以及数据处理。存储器120可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的应用程序(比如声音播放功能、图象播放功能等)等;存储数据区可存储根据手机100的使用所创建的数据(比如音频数据、电话本等)等。此外,存储器120可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。
其他输入设备130可用于接收输入的数字或字符信息,以及产生与手机100的用户设置以及功能控制有关的键信号输入。具体地,其他输入设备130可包括但不限于物理键盘、功能键(比如音量控制按键、开关按键等)、轨迹球、鼠标、操作杆、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)等中的一种或多种。其他输入设备130与I/O子系统170的其他输入设备控制器171相连接,在其他设备输入控制器171的控制下与处理器180进行信号交互。
显示屏140可用于显示由用户输入的信息或提供给用户的信息以及手机100的各种菜单,还可以接受用户输入。具体的显示屏140可包括显示面板141,以及触控面板142。其中显示面板141可以采用LCD(Liquid Crystal Display,液晶显示器)、OLED(Organic Light-Emitting Diode,有机发光二极管)等形式来配置显示面板141。触控面板142,也称为触摸屏、触敏屏等,可收集用户在其上或附近的接触或者非接触操
作(比如用户使用手指、触笔等任何适合的物体或附件在触控面板142上或在触控面板142附近的操作,也可以包括体感操作;该操作包括单点控制操作、多点控制操作等操作类型。),并根据预先设定的程式驱动相应的连接装置。可选的,触控面板142可包括触摸检测装置和触摸控制器两个部分。其中,触摸检测装置检测用户的触摸方位、姿势,并检测触摸操作带来的信号,将信号传送给触摸控制器;触摸控制器从触摸检测装置上接收触摸信息,并将它转换成处理器能够处理的信息,再送给处理器180,并能接收处理器180发来的命令并加以执行。此外,可以采用电阻式、电容式、红外线以及表面声波等多种类型实现触控面板142,也可以采用未来发展的任何技术实现触控面板142。进一步的,触控面板142可覆盖显示面板141,用户可以根据显示面板141显示的内容(该显示内容包括但不限于,软键盘、虚拟鼠标、虚拟按键、图标等等),在显示面板141上覆盖的触控面板142上或者附近进行操作,触控面板142检测到在其上或附近的操作后,通过I/O子系统170传送给处理器180以确定用户输入,随后处理器180根据用户输入通过I/O子系统170在显示面板141上提供相应的视觉输出。虽然在图3中,触控面板142与显示面板141是作为两个独立的部件来实现手机100的输入和输入功能,但是在某些实施例中,可以将触控面板142与显示面板141集成而实现手机100的输入和输出功能。
手机100还可包括至少一种传感器150,比如光传感器、运动传感器以及其他传感器。具体地,光传感器可包括环境光传感器及接近传感器,其中,环境光传感器可根据环境光线的明暗来调节显示面板141的亮度,接近传感器可在手机100移动到耳边时,关闭显示面板141和/或背光。作为运动传感器的一种,加速计传感器可检测各个方向上(一般为三轴)加速度的大小,静止时可检测出重力的大小及方向,可用于识别手机姿态的应用(比如横竖屏切换、相关游戏、磁力计姿态校准)、振动识别相关功能(比如计步器、敲击)等;至于手机100还可配置的陀螺仪、气压计、湿度计、温度计、红外线传感器等其他传感器,在此不再赘述。
音频电路160、扬声器161,麦克风162可提供用户与手机100之间的音频接口。音频电路160可将接收到的音频数据转换后的信号,传输到扬声器161,由扬声器161转换为声音信号输出;另一方面,麦克风162将收集的声音信号转换为信号,由音频电路160接收后转换为音频数据,再将音频数据输出至RF电路108以发送给比如另一手机,或者将音频数据输出至存储器120以便进一步处理。
I/O子系统170用来控制输入输出的外部设备,可以包括其他设备输入控制器171、传感器控制器172、显示控制器173。可选的,一个或多个其他输入控制设备控制器171从其他输入设备130接收信号和/或者向其他输入设备130发送信号,其他输入设备130可以包括物理按钮(按压按钮、摇臂按钮等)、拨号盘、滑动开关、操纵杆、点击滚轮、光鼠(光鼠是不显示可视输出的触摸敏感表面,或者是由触摸屏形成的触摸敏感表面的延伸)。值得说明的是,其他输入控制设备控制器171可以与任一个或者多个上述设备连接。所述I/O子系统170中的显示控制器173从显示屏140接收信号和/或者向显示屏140发送信号。显示屏140检测到用户输入后,显示控制器173将检测到的用户输入转换为与显示在显示屏140上的用户界面对象的交互,即实现人机交互。传感器控制器172可以从一个或者多个传感器150接收信号和/或者向一个或者
多个传感器150发送信号。
处理器180是手机100的控制中心,利用各种接口和线路连接整个手机的各个部分,通过运行或执行存储在存储器120内的软件程序和/或模块,以及调用存储在存储器120内的数据,执行手机100的各种功能和处理数据,从而对手机进行整体监控。可选的,处理器180可包括一个或多个处理单元;优选的,处理器180可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器180中。
手机100还包括给各个部件供电的电源190(比如电池),优选的,电源可以通过电源管理系统与处理器180逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗等功能。
尽管未示出,手机100还可以包括摄像头、蓝牙模块等,在此不再赘述。
图4为本申请实施例提供的一种广色域图像显示方法流程图。该方法适用于终端,该终端支持第一色域。如图4所示,该方法具体包括:
S410,确定待显示图像,该待显示图像的色域为第二色域。第二色域包括广色域。
其中,待显示图像包括应用程序的数据(APP buffer),还可以包括操作系统的数据(status bar或者system bar)等等。例如,应用程序的数据可以是指图库中的图像、浏览器中的网页或者其他应用程序的用户界面等等;操作系统的数据可以是指操作系统的状态栏以及显示控件等等。
图5为本申请实施例提供的一种显示图像的终端。结合图5所示,终端显示的图像包括系统窗口以及APP窗口等等。其中,图5为显示完成的图像,从图形用户界面(graphical user interface,GUI)的角度来说,应用程序的数据一般通过应用程序窗口(Application Window)来显示,在一些情况下还可以通过应用程序的子窗口(Sub Window)来显示。操作系统的数据一般通过系统窗口来显示。
具体的,应用程序窗口包括所有应用程序自己创建的窗口,以及在应用起来之前系统负责显示的窗口。
子窗口可以包括应用自定义的对话框,或者输入法窗口等等,子窗口依附于某个应用窗口(设置相同的token)等等。
系统窗口(System Window)一般为系统设计的,不依附于任何应用的窗口,比如说,状态栏(Status Bar),导航栏(Navigation Bar),壁纸(Wallpaper),来电显示窗口(Phone),锁屏窗口(KeyGuard),信息提示窗口(Toast),音量调整窗口以及鼠标光标等等。
The application windows, application sub windows, and system windows that need to be displayed on the terminal constitute the image to be displayed.
In this embodiment of this application, the color gamut of the image to be displayed may refer to the color gamut of the data of the application program. The color gamut of the data of the operating system is generally preset, and for a given version of the operating system it is generally fixed. The color gamut of the data of the operating system may be a fourth color gamut different from the first, second, and third color gamuts, or may be the second or third color gamut. For example, the color gamut of the data of the operating system may be sRGB.
In addition, the data of the application program may include APP display controls and a content source, where the APP display controls and the content source may correspond to one or more layers. For example, the APP display controls and the content source may correspond to different layers, and the content source itself may correspond to one or more layers. The one or more layers corresponding to the APP display controls and the content source may involve one or more color gamuts; each layer may correspond to a different color gamut, or all or some of the layers may correspond to the same color gamut. For convenience of description, the color gamuts corresponding to the data of the application program are collectively referred to as the second color gamut, although the second color gamut may refer to one or more different color gamuts.
In another implementation, the color gamut of the image to be displayed may also refer to all the color gamuts corresponding to the image to be displayed; that is, all the color gamuts, including the color gamut of the data of the application program and the color gamut of the data of the operating system, are collectively referred to as the second color gamut.
As shown in FIG. 5, the image to be displayed includes data of the operating system and data of an application program. The data of the operating system includes the data corresponding to the status bar 511 and the navigation bar 512; the data of the application program includes the data corresponding to the APP display control 521, the APP display control 523, and the content 522 (shown in FIG. 5 as a photo in the gallery), and may further include preview images of other photos (not shown in the figure), and so on. The photo in the gallery is the content source. The photo to be displayed, the APP display controls, and the preview images of other photos belong to different layers, and the color gamut corresponding to each layer may differ. The APP display controls may be set to the third color gamut, and the photos may be in various color gamuts: for example, an image downloaded from the network is generally in the sRGB color gamut, an image taken with a single-lens reflex camera is generally in the ARGB color gamut, and an image taken with a device supporting the Display-P3 color gamut is generally in Display-P3, and so on.
S420: Map the image to be displayed from the second color gamut to a third color gamut, where the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut. The third color gamut may be referred to as a super gamut.
In one implementation, when the color gamut of the image to be displayed is converted, only the color gamut of the data of the application program may be mapped to the third color gamut. For example, the image to be displayed may include the graphical user interface (GUI) of the application program. In this case, the color corresponding to the data of the operating system is generally white (for example, the system status bar is generally white), so the effect before and after the conversion is the same.
In another implementation, when the color gamut of the image to be displayed is converted, only the color gamut of the data of the operating system may be mapped to the third color gamut. For example, when the terminal displays the desktop or the lock screen window of the operating system, only the color gamut of the desktop wallpaper or of the screensaver may be converted; of course, the color gamuts of the application icons displayed on the desktop may also be converted in this case.
In yet another implementation, when the color gamut of the image to be displayed is converted, both the data of the operating system and the data of the application program may be converted.
In addition, because the image to be displayed is organized in layers, the color gamut conversion of the image to be displayed may be performed layer by layer. For example, the format of the data corresponding to a layer may be identified, and when the format is identified as an image format such as JPG, JPEG, or SVG, the color gamut of that layer is converted.
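A minimal sketch of such layer-by-layer, format-gated conversion is shown below. The layer fields, the converter callable, and everything beyond the formats named above are illustrative assumptions, not structures taken from the embodiment.

```python
from dataclasses import dataclass

@dataclass
class Layer:
    name: str
    data_format: str   # e.g. "jpg", "svg", or a non-image format such as "ui"
    source_gamut: str  # e.g. "sRGB", "ARGB", "Display-P3"
    pixels: object = None

def convert_layers(layers, to_third_gamut, image_formats=("jpg", "jpeg", "svg")):
    """Convert only the layers whose data is in a recognized image format."""
    for layer in layers:
        if layer.data_format.lower() in image_formats:
            layer.pixels = to_third_gamut(layer.pixels, layer.source_gamut)
    return layers

# Usage sketch with a dummy converter that only reports which layers would be converted.
layers = [Layer("status_bar", "ui", "sRGB"), Layer("gallery_photo", "jpg", "ARGB")]
convert_layers(layers, lambda px, gamut: print("converting layer in", gamut) or px)
```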
FIG. 6 is a schematic structural diagram of a color profile according to an embodiment of the present invention. As shown in FIG. 6, the image to be displayed includes a color profile (International Color Consortium profile, ICC profile), and the ICC profile includes the parameters for mapping from the second color gamut to the third color gamut.
Taking ARGB as an example of the second color gamut, the tags "redTRCTag, greenTRCTag, blueTRCTag" and "redMatrixColumnTag, greenMatrixColumnTag, blueMatrixColumnTag" are determined from the ICC profile. The image is converted to the third color gamut through degamma correction, a 3×3 matrix conversion, gamma correction, and so on. The parameters required for the degamma correction and the gamma correction can be obtained by parsing "redTRCTag, greenTRCTag, blueTRCTag", and the parameters required for the 3×3 matrix conversion can be obtained by parsing "redMatrixColumnTag, greenMatrixColumnTag, blueMatrixColumnTag".
The 3×3 matrix for mapping the ARGB color gamut to the third color gamut is M4 = M1 × M2, where M2 is the matrix for mapping ARGB to the XYZ space, and M1 is the matrix for mapping the XYZ space to the third color gamut.
The 3×3 matrix for mapping the Display-P3 color gamut to the third color gamut is M5 = M1 × M3, where M3 is the matrix for mapping the Display-P3 color gamut to the XYZ space.
In one implementation of the third color gamut, the color range of the third color gamut may cover the color ranges of the Display-P3, sRGB, and ARGB color gamuts. For example, the third color gamut may take the following values:
R = (0.68, 0.32);
G = (0.2125, 0.7368);
B = (0.15, 0.06);
W = (0.3127, 0.3290);
Gamma = sRGB.
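A standard way to derive the matrices mentioned above from the primaries and white point listed here is sketched below. This is general color-science practice rather than text from the embodiment, and the variable names are illustrative.

```python
import numpy as np

def xy_to_XYZ(x, y):
    """Convert chromaticity (x, y) to an XYZ vector with Y = 1."""
    return np.array([x / y, 1.0, (1 - x - y) / y])

def rgb_to_xyz_matrix(r_xy, g_xy, b_xy, w_xy):
    """Build the RGB -> XYZ matrix so that RGB = (1, 1, 1) maps to the white point."""
    P = np.column_stack([xy_to_XYZ(*r_xy), xy_to_XYZ(*g_xy), xy_to_XYZ(*b_xy)])
    S = np.linalg.solve(P, xy_to_XYZ(*w_xy))   # per-primary scale factors
    return P * S                                # scale each column by its factor

M_third_to_XYZ = rgb_to_xyz_matrix((0.68, 0.32), (0.2125, 0.7368),
                                   (0.15, 0.06), (0.3127, 0.3290))
M1 = np.linalg.inv(M_third_to_XYZ)   # XYZ -> third gamut, as used in M4 = M1 * M2
```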
S430: Map the image to be displayed from the third color gamut to the first color gamut, and display it.
The image to be displayed may be mapped to the first color gamut through degamma correction, color gamut conversion, and gamma correction, and displayed after screen correction. The color gamut conversion may be implemented using a three-dimensional look-up table (3D look-up table, 3DLUT) to achieve color reproduction, color enhancement, and the like.
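The 3DLUT step can be pictured as the trilinear interpolation sketched below. The look-up-table contents (which would encode the color reproduction or enhancement) and the array layout are assumptions for illustration, not values from the embodiment.

```python
import numpy as np

def apply_3dlut(rgb, lut):
    """Look up an RGB triple (0..1) in an (N, N, N, 3) table with trilinear interpolation."""
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    i0 = np.floor(pos).astype(int)
    i1 = np.minimum(i0 + 1, n - 1)
    f = pos - i0
    out = np.zeros(3)
    # Accumulate the 8 surrounding lattice points, weighted trilinearly.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (i1[0] if dr else i0[0],
                       i1[1] if dg else i0[1],
                       i1[2] if db else i0[2])
                out += w * lut[idx]
    return out

# Identity LUT example: with no enhancement the input passes through unchanged.
n = 17
grid = np.linspace(0, 1, n)
identity_lut = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3dlut([0.5, 0.25, 0.8], identity_lut))  # ~ [0.5, 0.25, 0.8]
```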
By default, the system displays the color-enhanced image to be displayed; alternatively, according to a user setting, the color-enhanced or color-reproduced image corresponding to that setting may be displayed.
The embodiments of the present invention are further described below with reference to more specific examples and the accompanying drawings.
Taking the architecture of the Android operating system as an example, the Android GUI system includes the following parts: the window and view system (Window and View Manager System), the display composition system (SurfaceFlinger), the user input system (InputManager System), and the application framework system (Activity Manager System).
The embodiments of the present invention mainly apply to the display composition system. As shown in FIG. 7, in the Android operating system, SurfaceFlinger is an independent system. It receives the surfaces (Surface) of all windows (Window) as input, calculates the position of each Surface in the final composed image according to parameters such as z-order (a computing term used to set stacking order), transparency, size, and position, and then hands the result to HWComposer or OpenGL to generate the final display buffer, which is then shown on a specific display device.
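Purely as a conceptual sketch of the composition role described here (this is not Android source code, and the class and field names are invented), the surfaces can be thought of as being merged into one output buffer in z-order:

```python
from dataclasses import dataclass, field

@dataclass
class Surface:
    z_order: int
    x: int = 0
    y: int = 0
    pixels: dict = field(default_factory=dict)   # {(row, col): (r, g, b)}

def compose(surfaces, width, height):
    """Merge surfaces into one output buffer in z-order (opaque over-draw only)."""
    framebuffer = {(r, c): (0, 0, 0) for r in range(height) for c in range(width)}
    for s in sorted(surfaces, key=lambda s: s.z_order):   # lower z drawn first
        for (r, c), rgb in s.pixels.items():
            pos = (r + s.y, c + s.x)                       # apply the surface position
            if pos in framebuffer:
                framebuffer[pos] = rgb                     # higher z overwrites
    return framebuffer

buf = compose([Surface(1, pixels={(0, 0): (255, 0, 0)}),
               Surface(0, pixels={(0, 0): (0, 0, 255)})], width=2, height=2)
print(buf[(0, 0)])   # -> (255, 0, 0): the surface with the higher z-order wins
```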
HWComposer defines a set of HAL-layer interfaces, which each chip vendor implements according to the characteristics of its hardware. Its main job is to compose the layers (Layer), whose display parameters have been calculated by SurfaceFlinger, onto one final display buffer. Note that SurfaceFlinger is not the only input of HWComposer: some Surfaces are not managed by Android's WindowManager, for example the camera preview input buffer, which can be written directly by hardware and then serve as one of the inputs of HWComposer to be composed with the output of SurfaceFlinger in the final step.
OpenGL ES (OpenGL for Embedded Systems) is a subset of the OpenGL three-dimensional graphics API. OpenGL is a 2D/3D graphics library and requires the support of the underlying hardware (GPU) and drivers. An application (for example, a gallery APP) can also call OpenGL functions directly to implement complex graphical interfaces.
EGL is the interface between OpenGL ES and SurfaceFlinger.
Display is Android's abstraction of an output display device. The traditional Display device is the LCD screen on the terminal, but other external display devices, such as HDMI and Wi-Fi Display, can also be supported. The input of a Display is the Surfaces of all Windows filtered according to the LayerStack value mentioned above, and its output is a buffer with the same size as the display device. This buffer is finally sent to the hardware FB device, the HDMI device, or a remote Wi-Fi Display sink device for display. On the path from input to output are SurfaceFlinger, OpenGL, and HWComposer.
With reference to the architecture shown in FIG. 7, the embodiments of this application are further described by taking FIG. 8 and FIG. 9 as examples.
FIG. 8 is a schematic flowchart of a wide color gamut image display method. As shown in FIG. 8, in this method the terminal maps the color gamuts of the application data and the operating system data to the Display-P3 color gamut, and then maps them from the Display-P3 color gamut to the DCI-P3 color gamut for display by the terminal. It is easy to see that after conversion from the ARGB color gamut to the Display-P3 color gamut, color loss occurs.
FIG. 9 is a schematic flowchart of a wide color gamut image display method according to an embodiment of this application. As shown in FIG. 9, SurfaceFlinger receives the data of the application program and the data of the operating system, where the data of the application program is in the ARGB color gamut and the data of the operating system is in the sRGB color gamut.
The color gamut of the application data and the color gamut of the operating system data are mapped to the super gamut (D65) (an example of the third color gamut in the embodiment shown in FIG. 2) through degamma correction, a 3×3 matrix conversion, and gamma correction, where the parameters for mapping to the super gamut (D65) can be obtained from the profile description of the application data and the operating system data. It should be noted that D65 is the white point; the super gamut may also use another white point, for example D75. In the embodiments of this application, the super gamut generally refers to the third color gamut.
The color gamut of the application data and the color gamut of the operating system data are then mapped from the super gamut (D65) to the color gamut of the display (DCI-P3), so that the display can show them.
Mapping from the super gamut (D65) to DCI-P3 requires degamma correction, interpolation, and gamma correction. During the interpolation, two effects, color reproduction and color enhancement, can be achieved.
In addition, when the color gamut of the application data and the color gamut of the operating system data are mapped from the super gamut (D65) to the color gamut of the display, processing such as screen color correction and color temperature adjustment is generally also required. For example, the image to be displayed is mapped from the super gamut (D65) to a virtual color gamut, then from the virtual color gamut to an ideal color gamut, and from the ideal color gamut to the actual color gamut of the screen, where there is generally a deviation between the ideal color gamut and the actual color gamut of the screen.
For example, in the method of mapping ARGB directly to Display P3: first, the non-linear ARGB value (255, 235, 217) undergoes a degamma operation and a 3×3 matrix conversion to obtain the linear Display P3 value (261.2939, 214.9968, 181.5335); this is then directly clipped to (255, 214.9968, 181.5335), and a gamma operation yields the non-linear Display P3 value (255, 237, 219).
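The clipping and gamma steps of this direct-mapping example can be checked with the short sketch below, using the sRGB transfer curve that Display P3 shares; the degamma and 3×3 matrix values themselves are not given in the text, so only the tail of the pipeline is reproduced here.

```python
import numpy as np

def srgb_gamma(v):
    """sRGB / Display P3 transfer curve applied to linear values in 0..1."""
    v = np.asarray(v, dtype=float)
    return np.where(v <= 0.0031308, v * 12.92, 1.055 * v ** (1 / 2.4) - 0.055)

linear_p3 = np.array([261.2939, 214.9968, 181.5335])   # red channel is out of range
clipped = np.clip(linear_p3, 0, 255)                    # (255, 214.9968, 181.5335)
nonlinear_p3 = np.round(srgb_gamma(clipped / 255) * 255)
print(nonlinear_p3)   # -> [255. 237. 219.], matching the (255, 237, 219) in the text
```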
In this embodiment of this application, the non-linear ARGB value (255, 235, 217) undergoes a degamma operation, a 3×3 matrix conversion, and a gamma operation to be mapped to the super gamut (D65), and then undergoes a degamma operation, a 3DLUT operation, and a gamma operation from the super gamut (D65), yielding the non-linear Display P3 value (255, 234, 217).
As shown in FIG. 10, the color differences of the two results calculated according to DeltaE2000 are 1.8679 and 0.6058, respectively.
FIG. 11 shows some simulation experiments of color gamut mapping using the super gamut. From these experiments it can be seen that performing the color gamut conversion by mapping to the super gamut results in a smaller color difference and less color loss.
FIG. 12 is a schematic structural diagram of an image display apparatus according to an embodiment of this application. As shown in FIG. 12, the apparatus supports a first color gamut and includes:
a determining unit 1201, configured to determine an image to be displayed, where the color gamut of the image to be displayed is a second color gamut;
a first mapping unit 1202, configured to map the image to be displayed from the second color gamut to a third color gamut, where the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut;
a second mapping unit 1203, configured to map the image to be displayed from the third color gamut to the first color gamut; and
a display unit 1204, configured to display the image to be displayed that has been mapped to the first color gamut.
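A minimal sketch of how these units could cooperate is shown below; the class, parameter, and function names are illustrative assumptions, with the actual mapping steps passed in as callables standing in for the degamma/matrix/gamma and 3DLUT processing described earlier.

```python
class WideGamutDisplayApparatus:
    """Sketch of the cooperating units of FIG. 12 (names are illustrative only)."""

    def __init__(self, to_third_gamut, to_first_gamut, panel):
        self.to_third_gamut = to_third_gamut   # stands in for first mapping unit 1202
        self.to_first_gamut = to_first_gamut   # stands in for second mapping unit 1203
        self.panel = panel                     # stands in for display unit 1204

    def show(self, image_in_second_gamut):
        # Determining unit 1201: the image to be displayed arrives in the second gamut.
        image = self.to_third_gamut(image_in_second_gamut)  # second -> third gamut
        image = self.to_first_gamut(image)                   # third -> first gamut
        self.panel(image)                                    # display the mapped image

# Usage sketch with identity mappings standing in for the real conversions:
apparatus = WideGamutDisplayApparatus(lambda im: im, lambda im: im, print)
apparatus.show([0.5, 0.25, 0.8])
```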
Optionally, the second color gamut includes one or more of sRGB, ARGB, and Display P3.
Optionally, the third color gamut is:
R = (0.68, 0.32);
G = (0.2125, 0.7368);
B = (0.15, 0.06);
W = (0.3127, 0.3290);
Gamma = sRGB.
Optionally, the apparatus further includes a third mapping unit 1205, configured to perform color enhancement or color reproduction on the image to be displayed that has been mapped to the third color gamut; and the second mapping unit 1203 is specifically configured to map the color-enhanced or color-reproduced image to be displayed from the third color gamut to the first color gamut.
Optionally, the image to be displayed includes data of an application program, and the color gamut of the data of the application program is the second color gamut. The first mapping unit 1202 is specifically configured to map the application data of the image to be displayed from the second color gamut to the third color gamut, and the second mapping unit 1203 is specifically configured to map the application data of the image to be displayed from the third color gamut to the first color gamut.
Optionally, the image to be displayed further includes data of the operating system, and the data of the operating system is in the third color gamut; the second mapping unit 1203 is further configured to map the operating system data of the image to be displayed from the third color gamut to the first color gamut.
Optionally, the data of the application program includes a content source, and the content source is in the second color gamut. The first mapping unit 1202 is specifically configured to map the content source of the image to be displayed from the second color gamut to the third color gamut, and the second mapping unit 1203 is specifically configured to map the content source of the image to be displayed from the third color gamut to the first color gamut.
It should be understood that, in the foregoing embodiments of the present invention, the processor may be a central processing unit (CPU), or may be another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. The general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
The memory may include a read-only memory and a random access memory, store program code, and provide instructions and data to the processor.
In addition to a data bus, the communication bus may further include a power bus, a control bus, a status signal bus, and the like. However, for clarity of description, all the buses are labeled as the communication bus in the figure.
In an implementation process, the steps of the foregoing methods may be completed by an integrated logic circuit of hardware in the processor or by instructions in the form of software. The steps of the methods disclosed with reference to the embodiments of the present invention may be directly embodied as being executed by a hardware processor, or executed by a combination of hardware and software modules in the processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a flash memory, a read-only memory, a programmable read-only memory, an electrically erasable programmable memory, or a register. The storage medium is located in the memory, and the processor reads the information in the memory and completes the steps of the foregoing methods in combination with its hardware. To avoid repetition, the details are not described here again.
The terms "component", "module", "system", and the like used in this specification denote a computer-related entity, hardware, firmware, a combination of hardware and software, software, or software being executed. For example, a component may be, but is not limited to, a process running on a processor, a processor, an object, an executable file, a thread of execution, a program, and/or a computer. As an illustration, both an application running on a computing device and the computing device itself may be components. One or more components may reside within a process and/or a thread of execution, and a component may be located on one computer and/or distributed between two or more computers. In addition, these components may execute from various computer-readable media on which various data structures are stored. The components may communicate by means of local and/or remote processes, for example, according to a signal having one or more data packets (such as data from two components interacting with another component in a local system or a distributed system, and/or interacting with other systems via a network such as the Internet by means of signals).
An embodiment of this application provides a computer program product containing instructions; when the instructions are run on a computer, the method or steps in FIG. 4 above are performed.
An embodiment of this application provides a computer-readable storage medium for storing instructions; when the instructions are executed on a computer, the method or steps in FIG. 4 above are performed.
The foregoing embodiments of the present invention may be implemented completely or partially by software, hardware, firmware, or any combination thereof. When software is used for implementation, they may be implemented completely or partially in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present invention are completely or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any usable medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more usable media. The usable medium may be a magnetic medium (for example, a floppy disk, a hard disk, or a magnetic tape), an optical medium (for example, a DVD), a semiconductor medium (for example, a solid-state drive), or the like.
The foregoing descriptions are merely specific implementations of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or substitution that can readily be conceived by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.
Claims (23)
- A wide color gamut image display method, wherein the method is applicable to a terminal, the terminal supports a first color gamut, and the method comprises: determining an image to be displayed, wherein the color gamut of the image to be displayed is a second color gamut, and the second color gamut comprises a wide color gamut; mapping the image to be displayed from the second color gamut to a third color gamut, wherein the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut; and mapping the image to be displayed from the third color gamut to the first color gamut, and displaying the image to be displayed that has been mapped to the first color gamut.
- The method according to claim 1, wherein the second color gamut comprises one or more of sRGB, ARGB, and Display P3.
- The method according to claim 2, wherein the third color gamut is: R = (0.68, 0.32); G = (0.2125, 0.7368); B = (0.15, 0.06); W = (0.3127, 0.3290); Gamma = sRGB.
- The method according to any one of claims 1 to 3, wherein mapping the image to be displayed from the third color gamut to the first color gamut comprises: performing color enhancement or color reproduction on the image to be displayed that has been mapped to the third color gamut; and mapping the color-enhanced or color-reproduced image to be displayed from the third color gamut to the first color gamut.
- The method according to any one of claims 1 to 4, wherein the image to be displayed comprises data of an application program, the color gamut of the data of the application program is the second color gamut, and mapping the image to be displayed from the second color gamut to the third color gamut and mapping the image to be displayed from the third color gamut to the first color gamut comprise: mapping the application data of the image to be displayed from the second color gamut to the third color gamut; and mapping the application data of the image to be displayed from the third color gamut to the first color gamut.
- The method according to claim 5, wherein the image to be displayed further comprises data of an operating system, and the data of the operating system is in the third color gamut; and before displaying the image to be displayed that has been mapped to the first color gamut, the method further comprises: mapping the operating system data of the image to be displayed from the third color gamut to the first color gamut.
- The method according to claim 5, wherein the data of the application program comprises a content source, the content source is in the second color gamut, and mapping the image to be displayed from the second color gamut to the third color gamut and mapping the image to be displayed from the third color gamut to the first color gamut comprise: mapping the content source of the image to be displayed from the second color gamut to the third color gamut; and mapping the content source of the image to be displayed from the third color gamut to the first color gamut.
- A wide color gamut image display apparatus, wherein the apparatus supports a first color gamut and comprises: a determining unit, configured to determine an image to be displayed, wherein the color gamut of the image to be displayed is a second color gamut, and the second color gamut comprises a wide color gamut; a first mapping unit, configured to map the image to be displayed from the second color gamut to a third color gamut, wherein the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut; a second mapping unit, configured to map the image to be displayed from the third color gamut to the first color gamut; and a display unit, configured to display the image to be displayed that has been mapped to the first color gamut.
- The apparatus according to claim 8, wherein the second color gamut comprises one or more of sRGB, ARGB, and Display P3.
- The apparatus according to claim 9, wherein the third color gamut is: R = (0.68, 0.32); G = (0.2125, 0.7368); B = (0.15, 0.06); W = (0.3127, 0.3290); Gamma = sRGB.
- The apparatus according to any one of claims 8 to 10, further comprising: a third mapping unit, configured to perform color enhancement or color reproduction on the image to be displayed that has been mapped to the third color gamut; wherein the second mapping unit is further configured to map the color-enhanced or color-reproduced image to be displayed from the third color gamut to the first color gamut.
- The apparatus according to any one of claims 8 to 11, wherein the image to be displayed comprises data of an application program, and the color gamut of the data of the application program is the second color gamut; the first mapping unit is further configured to map the application data of the image to be displayed from the second color gamut to the third color gamut; and the second mapping unit is further configured to map the application data of the image to be displayed from the third color gamut to the first color gamut.
- The apparatus according to claim 12, wherein the image to be displayed further comprises data of an operating system, and the data of the operating system is in the third color gamut; and the second mapping unit is further configured to map the operating system data of the image to be displayed from the third color gamut to the first color gamut.
- The apparatus according to claim 12, wherein the data of the application program comprises a content source, and the content source is in the second color gamut; the first mapping unit is further configured to map the content source of the image to be displayed from the second color gamut to the third color gamut; and the second mapping unit is further configured to map the content source of the image to be displayed from the third color gamut to the first color gamut.
- A terminal, wherein the terminal supports a first color gamut and comprises: a processor, configured to determine an image to be displayed, wherein the color gamut of the image to be displayed is a second color gamut, and the second color gamut comprises a wide color gamut; the processor is further configured to map the image to be displayed from the second color gamut to a third color gamut, wherein the color range of the third color gamut covers the color ranges of the first color gamut and the second color gamut; the processor is further configured to map the image to be displayed from the third color gamut to the first color gamut; and a display, configured to display the image to be displayed that has been mapped to the first color gamut.
- The terminal according to claim 15, wherein the second color gamut comprises one or more of sRGB, ARGB, and Display P3.
- The terminal according to claim 16, wherein the third color gamut is: R = (0.68, 0.32); G = (0.2125, 0.7368); B = (0.15, 0.06); W = (0.3127, 0.3290); Gamma = sRGB.
- The terminal according to any one of claims 15 to 17, wherein the processor is further configured to: perform color enhancement or color reproduction on the image to be displayed that has been mapped to the third color gamut; and map the color-enhanced or color-reproduced image to be displayed from the third color gamut to the first color gamut.
- The terminal according to any one of claims 15 to 18, wherein the image to be displayed comprises data of an application program, and the color gamut of the data of the application program is the second color gamut; and the processor is further configured to map the application data of the image to be displayed from the second color gamut to the third color gamut, and map the application data of the image to be displayed from the third color gamut to the first color gamut.
- The terminal according to claim 19, wherein the image to be displayed further comprises data of an operating system, and the data of the operating system is in the third color gamut; and the processor is further configured to map the operating system data of the image to be displayed from the third color gamut to the first color gamut.
- The terminal according to claim 19, wherein the data of the application program comprises a content source, and the content source is in the second color gamut; and the processor is further configured to map the content source of the image to be displayed from the second color gamut to the third color gamut, and map the content source of the image to be displayed from the third color gamut to the first color gamut.
- A computer-readable storage medium, comprising computer-readable instructions, wherein when a computer reads and executes the computer-readable instructions, the computer is caused to perform the method according to any one of claims 1 to 7.
- A computer program product, comprising computer-readable instructions, wherein when a computer reads and executes the computer-readable instructions, the computer is caused to perform the method according to any one of claims 1 to 7.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780068941.0A CN109923606B (zh) | 2017-07-31 | 2017-07-31 | Wide color gamut image display method and device |
PCT/CN2017/095123 WO2019023835A1 (zh) | 2017-07-31 | 2017-07-31 | Wide color gamut image display method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/095123 WO2019023835A1 (zh) | 2017-07-31 | 2017-07-31 | Wide color gamut image display method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2019023835A1 true WO2019023835A1 (zh) | 2019-02-07 |
Family
ID=65233423
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/095123 WO2019023835A1 (zh) | 2017-07-31 | 2017-07-31 | 一种广色域图像显示方法及设备 |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN109923606B (zh) |
WO (1) | WO2019023835A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110691194A (zh) * | 2019-09-19 | 2020-01-14 | 锐迪科微电子(上海)有限公司 | Wide color gamut image determination method and apparatus |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110378973B (zh) * | 2019-07-17 | 2022-08-12 | Oppo广东移动通信有限公司 | Image information processing method, apparatus, and electronic device |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101523480A (zh) * | 2006-10-12 | 2009-09-02 | 皇家飞利浦电子股份有限公司 | Color mapping method |
CN101534374A (zh) * | 2008-03-14 | 2009-09-16 | 富士施乐株式会社 | Color processing apparatus and color processing method |
CN102577396A (zh) * | 2009-09-21 | 2012-07-11 | 三星电子株式会社 | System and method for generating RGB primary colors for a wide color gamut, and color encoding system using the RGB primary colors |
US8902246B2 (en) * | 2010-12-22 | 2014-12-02 | Apple Inc. | Color correction for wide gamut systems |
CN104427319A (zh) * | 2013-09-03 | 2015-03-18 | 索尼公司 | Information processing device, information processing method, program, and image display device |
CN104966508A (zh) * | 2015-07-17 | 2015-10-07 | 上海天马有机发光显示技术有限公司 | Driving method, control chip, and display device |
CN105957497A (zh) * | 2016-04-28 | 2016-09-21 | 苏州佳世达电通有限公司 | Electronic device and control method thereof |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004088345A (ja) | 2002-08-26 | 2004-03-18 | Konica Minolta Holdings Inc | Image forming method, image processing apparatus, print producing apparatus, and storage medium |
US8379971B2 (en) * | 2006-08-16 | 2013-02-19 | Tp Vision Holding B.V. | Image gamut mapping |
KR20080095671A (ko) | 2007-04-25 | 2008-10-29 | 삼성전자주식회사 | Method and apparatus for displaying an image with a wide color range |
CN101754030A (zh) | 2010-01-13 | 2010-06-23 | 山东大学 | Color gamut extension mapping system for laser television and method thereof |
JP2012234382A (ja) | 2011-05-02 | 2012-11-29 | Ricoh Co Ltd | Image display system and image display method |
US9799305B2 (en) * | 2014-09-19 | 2017-10-24 | Barco N.V. | Perceptually optimised color calibration method and system |
CN105654455B (zh) | 2014-11-11 | 2018-06-15 | 曲阜师范大学 | An "image-to-device" color gamut mapping algorithm |
CN104767983B (zh) | 2015-03-19 | 2018-05-04 | 华为技术有限公司 | Image processing method and apparatus |
US11277610B2 (en) * | 2015-09-23 | 2022-03-15 | Arris Enterprises Llc | Single layer high dynamic range coding with standard dynamic range backward compatibility |
CN106782428B (zh) | 2016-12-27 | 2019-05-07 | 上海天马有机发光显示技术有限公司 | Color gamut adjustment method and color gamut adjustment system for a display device |
- 2017-07-31 WO PCT/CN2017/095123 patent/WO2019023835A1/zh active Application Filing
- 2017-07-31 CN CN201780068941.0A patent/CN109923606B/zh active Active
Also Published As
Publication number | Publication date |
---|---|
CN109923606A (zh) | 2019-06-21 |
CN109923606B (zh) | 2020-12-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17920266 Country of ref document: EP Kind code of ref document: A1 |
 | NENP | Non-entry into the national phase | Ref country code: DE |
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 17920266 Country of ref document: EP Kind code of ref document: A1 |