CN114972080A - Image processing method, device and medium - Google Patents
Image processing method, device and medium
- Publication number
- CN114972080A (application CN202210513367.8A)
- Authority
- CN
- China
- Prior art keywords
- dynamic range
- image
- high dynamic
- color space
- processed
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
- G06T5/92—Dynamic range modification of images or parts thereof based on global image properties
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformations in the plane of the image
- G06T3/40—Scaling of whole images or parts thereof, e.g. expanding or contracting
- G06T3/4007—Scaling of whole images or parts thereof, e.g. expanding or contracting based on interpolation, e.g. bilinear interpolation
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10024—Color image
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20172—Image enhancement details
- G06T2207/20208—High dynamic range [HDR] image processing
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
Abstract
The application discloses an image processing method comprising the following steps: decoding an image to be processed to obtain high-dynamic-range data; converting the image to be processed from a first color space to a second color space, and performing linear interpolation on the image converted to the second color space through a CDC 3D lookup table to obtain a second image; and converting the second image from the high dynamic range to the standard dynamic range according to the high-dynamic-range metadata. By using the 3D lookup table, the original IC can support the high-dynamic-range function directly through a firmware upgrade, achieving an upgrade at zero hardware cost.
Description
Technical Field
The present application relates to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, and a medium.
Background
The high dynamic range can adjust picture color dynamically for each specific scene and is widely used in televisions and displays. The standard dynamic range is the common color display mode; it carries less information than the high dynamic range but is more widely adopted. In practical applications, therefore, it is often necessary to convert from the high-dynamic-range color display mode to the standard-dynamic-range color display mode.
To convert the high-dynamic-range color display mode into the standard-dynamic-range mode, related-art schemes generally require a hardware upgrade, such as AML's X4, so that the conversion from high dynamic range to standard dynamic range is supported at the chip level. The disadvantage is that hardware must be upgraded to perform the conversion, which increases hardware development cost.
Therefore, the above technical problems of the related art need to be solved.
Disclosure of Invention
The present application is directed to solving one of the technical problems in the related art. Therefore, embodiments of the present application provide an image processing method, an image processing apparatus, and an image processing medium, which can convert a high dynamic range color display mode into a standard dynamic range color display mode through software.
According to an aspect of an embodiment of the present application, there is provided an image processing method, including:
decoding an image to be processed to obtain high dynamic range data;
converting the image to be processed from a first color space to a second color space, and performing linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image;
and converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
In one embodiment, after the second image is converted from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range, the method further comprises:
and synthesizing the first image and the second image and outputting the synthesized image.
In one embodiment, after obtaining the high dynamic range data, the method further comprises:
sending the high dynamic range data to the HWC unit.
In one embodiment, converting the image to be processed from a first color space to a second color space includes:
converting the image to be processed from an RGB color space of 255 × 255 × 255 into a preset color space through an HWC unit.
In one embodiment, the predetermined color space is a 17 × 17 × 17 RGB color space.
In one embodiment, decoding the image to be processed includes:
and decoding the high dynamic range data of the image to be processed through VE.
In one embodiment, converting the intermediate image from the high dynamic range to the standard dynamic range space according to the metadata of the high dynamic range includes:
and converting the intermediate image from the high dynamic range to a standard dynamic range space according to the metadata of the high dynamic range through a preset algorithm.
In one embodiment, after the intermediate image is converted from the high dynamic range to the standard dynamic range space according to the metadata of the high dynamic range, the method further comprises:
and dynamically transmitting the CDC 3D lookup table to a display driver along with the layer parameters, and writing the CDC 3D lookup table into a CDC register.
According to an aspect of embodiments of the present application, there is provided an image processing apparatus, including:
the acquisition module is used for decoding the image to be processed to obtain high dynamic range data;
the decoding module is used for converting the image to be processed from a first color space to a second color space and carrying out linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image;
and the conversion module is used for converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
According to an aspect of the embodiments of the present application, there is provided a storage medium storing a program executable by a processor, wherein the program, when executed by the processor, implements the image processing method of the foregoing embodiments.
The image processing method provided by the embodiments of the present application has the following beneficial effects: an image to be processed is obtained; the image is decoded to obtain high-dynamic-range metadata; the RGB color space of the image is converted into a preset color space, and the remaining intermediate points are filled by linear interpolation through a CDC 3D lookup table to obtain an intermediate image; and the intermediate image is converted from the high dynamic range to the standard-dynamic-range space according to the high-dynamic-range metadata. By using the 3D lookup table, the original IC can support the high-dynamic-range function directly through a firmware upgrade, achieving an upgrade at zero hardware cost.
Additional aspects and advantages of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
To illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram illustrating an image processing method according to an embodiment of the present disclosure;
fig. 2 is a flowchart of an image processing method provided in an embodiment of the present application;
FIG. 3 is a schematic diagram of an apparatus for implementing a high dynamic range to standard dynamic range according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram of another apparatus for implementing high dynamic range to standard dynamic range according to an embodiment of the present application.
Detailed Description
To help those skilled in the art better understand the technical solutions, the solutions in the embodiments of the present application are described below clearly and completely with reference to the drawings. The described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by a person skilled in the art from these embodiments without creative effort fall within the protection scope of the present application.
The terms "first," "second," "third," and "fourth," etc. in the description and claims of this application and the accompanying drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
Dynamic range (DR) is used in many fields to denote the ratio between the maximum and minimum values of a variable. In a digital image, the dynamic range is the ratio between the maximum and minimum gray values within the displayable range, i.e., the number of gray levels from the image's "brightest" to its "darkest". The larger the dynamic range, the richer the brightness levels the image can represent, and the more vivid its visual effect. The dynamic range of a natural scene in the real world spans roughly 10⁻³ to 10⁶, which is very large and is called high dynamic range (HDR). Relative to a high-dynamic-range image, the dynamic range of an ordinary image is a low dynamic range (LDR), also referred to as standard dynamic range (SDR). The high dynamic range can adjust picture color dynamically for each specific scene and is widely used in televisions and displays. The standard dynamic range is the common color display mode; it carries less information than the high dynamic range but is more widely adopted. In practical applications, therefore, it is often necessary to convert from the high-dynamic-range color display mode to the standard-dynamic-range mode.
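As an illustration outside the patent text, the ratio definition above can be computed directly. The helper below is hypothetical (not part of the disclosed method) and assumes a NumPy environment:

```python
import numpy as np

def dynamic_range(image: np.ndarray) -> float:
    """Ratio of the maximum to the minimum non-zero gray value
    (illustrative helper, not from the patent)."""
    gray = image.astype(np.float64)
    lo = gray[gray > 0].min()  # skip pure-black pixels to avoid division by zero
    return float(gray.max() / lo)

# An 8-bit SDR image can span at most a 255:1 ratio, far below the
# roughly 10^6 / 10^-3 range of real-world scenes.
sdr = np.array([[1, 128, 255]], dtype=np.uint8)
print(dynamic_range(sdr))  # 255.0
```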
In order to convert the color display mode with high dynamic range into the color display mode with standard dynamic range, the related art scheme generally needs to upgrade hardware, such as X4 of AML, to support the function of converting the high dynamic range into the standard dynamic range at the chip level, which has the disadvantage that the hardware needs to be upgraded to convert the high dynamic range into the standard dynamic range, thereby increasing the development cost of the hardware.
In particular, HDR10+ is a high-dynamic-range standard established and promoted by Samsung, Sony, and others, and is currently one of the most widely adopted high-dynamic-range standards. The technology evolved from HDR10, supports dynamic metadata processing, and can adjust picture colors dynamically for each specific scene. Foreign operators typically require support for the HDR10+ standard, and in recent years domestic operators have increasingly required it as well. The high dynamic range described in this application includes at least HDR10+.
In order to solve the above problem, the present application proposes an image processing method, the principle of which is as follows:
Fig. 1 is a schematic diagram of an image processing method according to an embodiment of the present application. As shown in fig. 1, the principle of the present application is to use a 17 × 17 × 17 color space to represent the 255 × 255 × 255 space, map the lattice points into corresponding color-space values with Samsung's official HDR10+-to-SDR algorithm, and fill the remaining points by linear interpolation to reduce the amount of computation and make a software implementation feasible. Finally, the 3D lookup table required by the CDC is generated and transmitted to the DE driver.
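The patent does not publish LUT code. The sketch below (assuming a NumPy environment; all names are hypothetical) shows how a 17 × 17 × 17 RGB lookup table of the kind described can be applied to full-range pixel values with trilinear interpolation — the "linear interpolation processing" the method relies on:

```python
import numpy as np

GRID = 17                      # LUT sample points per channel
STEP = 255.0 / (GRID - 1)      # spacing between grid points over 0..255

def apply_3d_lut(pixel, lut):
    """Trilinearly interpolate one RGB pixel (0..255 per channel)
    through a (17, 17, 17, 3) lookup table."""
    p = np.asarray(pixel, dtype=np.float64) / STEP
    i0 = np.clip(np.floor(p).astype(int), 0, GRID - 2)   # lower grid corner
    f = p - i0                                           # fractional offsets
    out = np.zeros(3)
    # Accumulate the 8 surrounding lattice points, weighted by sub-cell volume.
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                out += w * lut[i0[0] + dr, i0[1] + dg, i0[2] + db]
    return out

# Identity LUT: each lattice point maps to its own RGB coordinates,
# so interpolation should reproduce the input pixel.
idx = np.arange(GRID) * STEP
identity = np.stack(np.meshgrid(idx, idx, idx, indexing="ij"), axis=-1)
print(apply_3d_lut([10, 100, 200], identity))  # ≈ [10, 100, 200]
```

In the real pipeline the LUT entries would of course hold tone-mapped SDR values rather than the identity; the identity table here only verifies the interpolation itself.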
Specifically, after the source sends an image to the VE, the SEI of the image is decoded to obtain the metadata; a 17 × 17 × 17 color space then represents the 255 × 255 × 255 space according to preset parameters and is filled by linear interpolation, reducing the amount of computation.
Fig. 2 is a flowchart of an image processing method provided in an embodiment of the present application, and as shown in fig. 2, the image processing method provided in the present application specifically includes:
s201, decoding the image to be processed to obtain high dynamic range data.
S202, converting the image to be processed from the first color space to the second color space, and performing linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image.
In this embodiment, decoding the image to be processed includes decoding its high-dynamic-range data through the VE. The VE is a video decoder whose workflow includes initialization, creation, bitstream reading, decoding, and display. The VE decoder can decode the image to obtain the various items of information it contains, including the high-dynamic-range metadata of the image to be processed.
Optionally, after obtaining the metadata with high dynamic range, the method further includes: sending the high dynamic range metadata to the HWC unit.
S203, converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
In the present embodiment, the HWC unit converts the RGB color space of 255 × 255 × 255 into a preset color space. HWC (HWComposer) is the HAL-layer module in Android for window (Layer) composition and display; its implementation is device-specific and is usually provided by the display-device manufacturer to give hardware support to the SurfaceFlinger service. Because display devices vary widely in capability, it is difficult to express directly through an API how many layers a device can composite, whether layers support rotation and blending modes, and what restrictions apply to layer positioning and hardware composition.
In step S203, the preset color space is a 17 × 17 × 17 RGB color space. Step S203 may specifically be: the HWC unit converts the RGB color space of 255 × 255 × 255 into the RGB color space of 17 × 17 × 17.
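As a sketch of this downsampling step (not taken from the patent; the names are illustrative), the 17 per-channel sample values that stand in for the full 0–255 range can be generated as follows:

```python
import numpy as np

GRID = 17  # LUT sample points per channel, as in the preset 17x17x17 space

def lut_sample_points() -> np.ndarray:
    """The 17 evenly spaced per-channel values that represent the
    full 0..255 range (17**3 = 4913 lattice points in total)."""
    return np.linspace(0, 255, GRID)

pts = lut_sample_points()
print(len(pts) ** 3)    # 4913 lattice points instead of 255**3
print(pts[1] - pts[0])  # 15.9375 — spacing between neighbouring samples
```

Only these 4913 lattice points need to be tone-mapped by the full algorithm; everything in between is recovered by the linear interpolation described above, which is what makes a pure-software implementation cheap enough.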
In this embodiment, after step S203 converts the intermediate image from the high dynamic range to the standard-dynamic-range space according to the high-dynamic-range metadata, the method further includes synthesizing the original image to be processed with the image converted to the standard-dynamic-range space and outputting the result. The standard dynamic range (SDR) is a very common color display mode; it carries less information than the high dynamic range and is widely adopted. It is generally used for the display color range of devices that do not need a wide color gamut (for example, word processing or programming rather than drawing or clipping), or for the color range of non-professional cameras and photos.
Optionally, converting the intermediate image from the high dynamic range to the standard-dynamic-range space according to the high-dynamic-range metadata includes: the HWC unit converts the intermediate image from the high dynamic range to the standard-dynamic-range space according to the metadata using a preset algorithm. The conversion combines Samsung's official HDR10+ algorithm, i.e., the software can directly support an HDR10+ source and reproduce the HDR10+ effect on a standard-dynamic-range television.
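Samsung's HDR10+ algorithm itself is not disclosed here. As a heavily hedged stand-in, the sketch below combines the standard PQ (SMPTE ST 2084) EOTF with a simple Reinhard-style curve to illustrate the general shape of an HDR-to-SDR mapping; it is an assumption-laden illustration, not the preset algorithm of the patent:

```python
import numpy as np

def pq_to_linear(v):
    """SMPTE ST 2084 (PQ) EOTF: normalized code value -> luminance in cd/m^2."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    vp = np.power(np.clip(v, 0, 1), 1 / m2)
    return 10000 * np.power(np.maximum(vp - c1, 0) / (c2 - c3 * vp), 1 / m1)

def tone_map_to_sdr(v, peak_nits=1000.0, sdr_white=100.0):
    """Reinhard-style stand-in for the proprietary per-scene curve:
    clip to the mastering peak, compress highlights, normalize to SDR."""
    lum = np.minimum(pq_to_linear(v), peak_nits)
    mapped = lum / (1.0 + lum / sdr_white)      # compress highlights smoothly
    return np.clip(mapped / sdr_white, 0, 1)    # normalize into the SDR range

codes = np.linspace(0, 1, 5)
sdr = tone_map_to_sdr(codes)
print(np.all(np.diff(sdr) >= 0))  # monotone mapping: True
```

A real HDR10+ implementation would additionally read the per-scene dynamic metadata to reshape this curve frame by frame; the mapped values would then be baked into the 17 × 17 × 17 LUT described above.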
Optionally, after the intermediate image is converted from the high dynamic range to the standard-dynamic-range space according to the high-dynamic-range metadata, the method further includes: the HWC unit dynamically transmits the CDC 3D lookup table to the display driver along with the layer parameters and writes it into a CDC register. The VE decodes the high-dynamic-range data and the image and passes them to the HWC for processing; the HWC passes the processed CDC lookup table and image data to the driver to complete the display. The CDC object provides member functions for handling the display device context, as well as members for handling the display context corresponding to the window client.
Referring to fig. 3, an embodiment of the present invention further provides a device for implementing a high dynamic range to a standard dynamic range, including:
an obtaining module 301, configured to decode an image to be processed to obtain high dynamic range data;
a decoding module 302, configured to convert the image to be processed from the first color space to the second color space, and perform linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image;
a conversion module 303, configured to convert the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
It can be seen that the contents in the foregoing method embodiments are all applicable to this apparatus embodiment, the functions specifically implemented by this apparatus embodiment are the same as those in the foregoing method embodiment, and the advantageous effects achieved by this apparatus embodiment are also the same as those achieved by the foregoing method embodiment.
Referring to fig. 4, an embodiment of the present application provides an apparatus for implementing a high dynamic range to a standard dynamic range, including:
at least one processor 401;
at least one memory 402 for storing at least one program;
the at least one program, when executed by the at least one processor 401, causes the at least one processor 401 to implement the image processing method of the foregoing embodiment.
Similarly, the contents of the method embodiments are all applicable to the apparatus embodiments, the functions specifically implemented by the apparatus embodiments are the same as the method embodiments, and the beneficial effects achieved by the apparatus embodiments are also the same as the beneficial effects achieved by the method embodiments.
An embodiment of the present invention further provides a storage medium storing a program, which is used to implement the image processing method of the foregoing embodiment when the program is executed by a processor.
Similarly, the contents of the foregoing method embodiments are all applicable to this storage-medium embodiment; the functions specifically implemented by this embodiment are the same as those of the method embodiments, and the beneficial effects achieved are also the same.
In alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed and in which sub-operations described as part of larger operations are performed independently.
Furthermore, although the present application is described in the context of functional modules, it should be understood that, unless otherwise stated to the contrary, one or more of the functions and/or features may be integrated in a single physical device and/or software module, or one or more functions and/or features may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion regarding the actual implementation of each module is not necessary for an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be understood within the ordinary skill of an engineer, given the nature, function, and internal relationship of the modules. Accordingly, those skilled in the art can, using ordinary skill, practice the present application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative of and not intended to limit the scope of the application, which is defined by the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product stored in a storage medium and including instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and other various media capable of storing program codes.
The logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions that can be considered to implement logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include the following: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical-fiber device, and a portable compact disc read-only memory (CD-ROM). Additionally, the computer-readable medium could even be paper or another suitable medium upon which the program is printed, as the program can be captured electronically, for instance via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer memory.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or combination of the following technologies, which are well known in the art, may be used: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
In the foregoing description of the specification, reference to the description of "one embodiment/example," "another embodiment/example," or "certain embodiments/examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the present application have been shown and described, it will be understood by those of ordinary skill in the art that: numerous changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the application, the scope of which is defined by the claims and their equivalents.
The above embodiments are only used to illustrate the technical solutions of the present application, and not to limit the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; and such modifications or substitutions do not depart from the spirit and scope of the corresponding technical solutions in the embodiments of the present application.
Claims (10)
1. An image processing method, characterized in that the method comprises:
decoding an image to be processed to obtain high dynamic range data;
converting the image to be processed from a first color space to a second color space, and performing linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image;
and converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
2. An image processing method according to claim 1, wherein after the second image is converted from a high dynamic range to a standard dynamic range according to the metadata of the high dynamic range, the method further comprises:
and synthesizing the first image and the second image and outputting the synthesized image.
3. An image processing method according to claim 1, wherein after obtaining the high dynamic range data, the method further comprises:
sending the high dynamic range data to the HWC unit.
4. An image processing method according to claim 1, wherein converting the image to be processed from a first color space to a second color space comprises:
converting the image to be processed from an RGB color space of 255 × 255 × 255 into a preset color space through an HWC unit.
5. The image processing method according to claim 4, wherein the preset color space is a 17 × 17 × 17 RGB color space.
6. An image processing method according to claim 1, wherein decoding the image to be processed comprises:
and decoding the high dynamic range data of the image to be processed through VE.
7. The method according to claim 1, wherein converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range comprises:
and converting, through a preset algorithm, the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
8. An image processing method according to claim 1, wherein after the second image is converted from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range, the method further comprises:
and dynamically transmitting the CDC 3D lookup table to a display driver along with the layer parameters, and writing the CDC 3D lookup table into a CDC register.
9. An image processing apparatus, characterized in that the apparatus comprises:
the acquisition module is used for decoding the image to be processed to obtain high dynamic range data;
the decoding module is used for converting the image to be processed from a first color space to a second color space and carrying out linear interpolation processing on the image to be processed converted to the second color space through a CDC 3D lookup table to obtain a second image;
and the conversion module is used for converting the second image from the high dynamic range to the standard dynamic range according to the metadata of the high dynamic range.
10. A storage medium, characterized in that the storage medium stores a program executable by a processor, wherein the program, when executed by the processor, implements the image processing method according to any one of claims 1 to 8.
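Claim 1 recites linear interpolation through a CDC 3D lookup table, and claims 4-5 recite a preset color space with 17 nodes per channel. The patent does not disclose the interpolation formula; the sketch below is an assumption (conventional trilinear sampling, shown here against an identity LUT), not the patent's implementation, and the function names are illustrative:

```python
def make_identity_lut(n=17):
    """Identity 3D LUT: node (i, j, k) stores the normalized RGB value of that node."""
    g = [i / (n - 1) for i in range(n)]
    return [[[[g[i], g[j], g[k]] for k in range(n)] for j in range(n)] for i in range(n)]

def trilinear_lookup(rgb, lut):
    """Sample an n-point 3D LUT at rgb (each channel in [0, 1]) by trilinear interpolation."""
    n = len(lut)
    # Locate the LUT cell containing the input and the fractional position inside it
    idx, frac = [], []
    for c in rgb:
        p = c * (n - 1)
        i = min(int(p), n - 2)  # lower corner, clamped so i + 1 stays in range
        idx.append(i)
        frac.append(p - i)
    # Blend the 8 corners of the cell with their trilinear weights
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((frac[0] if dr else 1 - frac[0]) *
                     (frac[1] if dg else 1 - frac[1]) *
                     (frac[2] if db else 1 - frac[2]))
                node = lut[idx[0] + dr][idx[1] + dg][idx[2] + db]
                for ch in range(3):
                    out[ch] += w * node[ch]
    return out

lut = make_identity_lut(17)
print(trilinear_lookup([0.25, 0.5, 0.75], lut))  # identity LUT returns the input unchanged
```

A real CDC lookup table would hold the device's color transform at each node rather than the identity mapping; the interpolation step is the same.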
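Claim 7 leaves the high-dynamic-range-to-standard-dynamic-range conversion to an unspecified "preset algorithm". One common choice, shown here purely as an illustrative assumption and not as the patent's algorithm, is the extended Reinhard curve driven by the peak luminance carried in the HDR metadata:

```python
def tonemap_extended_reinhard(lum, max_lum):
    """Map an HDR luminance value into [0, 1] so that max_lum maps exactly to 1.0.

    lum: scene luminance (e.g. in nits); max_lum: peak luminance taken from the
    HDR metadata (e.g. a MaxCLL-style field). Both names are illustrative,
    not terms from the patent."""
    return lum * (1.0 + lum / (max_lum * max_lum)) / (1.0 + lum)

# With a 1000-nit peak, the peak maps to 1.0 and mid-tones are compressed gently
print(tonemap_extended_reinhard(1000.0, 1000.0))  # ≈ 1.0
print(tonemap_extended_reinhard(100.0, 1000.0))   # a mid-tone, well below 1.0
```

In a full pipeline this curve would be applied to the luminance channel of the second image, with chroma rescaled accordingly.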
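Claim 2 recites synthesizing the first image and the second image for output but does not specify the blend operator. A per-pixel alpha "over" composite, assumed here for illustration only, is the usual choice:

```python
def composite_over(fg_px, bg_px, alpha):
    """Blend one RGB pixel of a foreground layer over a background pixel.

    alpha in [0, 1] is the foreground coverage; the argument names are
    illustrative and do not come from the patent."""
    return tuple(alpha * f + (1.0 - alpha) * b for f, b in zip(fg_px, bg_px))

# Quarter-opaque white over black yields 25% grey
print(composite_over((1.0, 1.0, 1.0), (0.0, 0.0, 0.0), 0.25))  # → (0.25, 0.25, 0.25)
```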
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210513367.8A CN114972080A (en) | 2022-05-12 | 2022-05-12 | Image processing method, device and medium |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210513367.8A CN114972080A (en) | 2022-05-12 | 2022-05-12 | Image processing method, device and medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114972080A true CN114972080A (en) | 2022-08-30 |
Family
ID=82981690
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210513367.8A Pending CN114972080A (en) | 2022-05-12 | 2022-05-12 | Image processing method, device and medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114972080A (en) |
2022-05-12: application CN202210513367.8A filed in China (CN); publication CN114972080A; legal status: Pending.
Similar Documents
Publication | Publication Date | Title |
---|---|---|
AU2009225336B2 (en) | Method of compositing variable alpha fills supporting group opacity | |
US9984446B2 (en) | Video tone mapping for converting high dynamic range (HDR) content to standard dynamic range (SDR) content | |
EP3306944B1 (en) | Display method and display device | |
CN101809617B (en) | Enhancing dynamic ranges of images | |
US8068116B2 (en) | Methods, systems, and data structures for generating a rasterizer | |
JP6234920B2 (en) | High dynamic range image signal generation and processing | |
KR100604102B1 (en) | Methods and apparatus for processing DVD video | |
US20080165190A1 (en) | Apparatus and method of displaying overlaid image | |
US20050213853A1 (en) | Image processor | |
CN110192223B (en) | Display mapping of high dynamic range images | |
JP7359521B2 (en) | Image processing method and device | |
US20080291208A1 (en) | Method and system for processing data via a 3d pipeline coupled to a generic video processing unit | |
JP2017502353A (en) | Method and device for tone mapping high dynamic range images | |
Kainz et al. | Technical introduction to OpenEXR | |
CN107533832B (en) | Image processing apparatus, image processing method, and program | |
CN112449169A (en) | Method and apparatus for tone mapping | |
CN111833417A (en) | Method and system for realizing black and white mode of android application program | |
CN112740278A (en) | Blending adjacent bins | |
CN114972080A (en) | Image processing method, device and medium | |
CN114096988A (en) | Controlling display brightness while rendering composite scene-related and outputting related content | |
WO2023125467A1 (en) | Image processing method and apparatus, electronic device and readable storage medium | |
CN112309341A (en) | Electronic device for blending layers of image data | |
US11908079B2 (en) | Variable rate tessellation | |
CN107273072B (en) | Picture display method and device and electronic equipment | |
CN110930480B (en) | Method for directly rendering startup animation video of liquid crystal instrument |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||