CN109274985B - Video transcoding method and device, computer equipment and storage medium


Info

Publication number: CN109274985B (granted publication of application CN109274985A)
Application number: CN201811188614.1A
Authority: CN (China)
Prior art keywords: video, space, target, mapping, brightness
Legal status: Active (application granted)
Other languages: Chinese (zh)
Inventors: 翟海昌, 廖念波, 汪亮, 李浩
Assignee (current and original): Tencent Technology Shenzhen Co Ltd
Priority and filing date: 2018-10-12 (priority to CN201811188614.1A)

Classifications

    • H04N 21/234309: Processing of video elementary streams involving reformatting operations of video signals for distribution or compliance with end-user requests or end-user device requirements, by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 or from Quicktime to Realvideo (under H04N 21/00, Selective content distribution, e.g. interactive television or video on demand [VOD])
    • H04N 21/440218: Processing of video elementary streams involving reformatting operations of video signals for household redistribution, storage or real-time display, by transcoding between formats or standards, e.g. from MPEG-2 to MPEG-4 (under H04N 21/40, Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB])

Abstract

The application provides a video transcoding method and device, a computer device, and a storage medium. The method includes: acquiring a video to be transcoded, where the video format of the video to be transcoded is standard dynamic range (SDR); mapping the source space of the video to be transcoded to a linear RGB space to obtain a first video; mapping the space of the first video to an HSI space, expanding the local color gamut of S in the HSI space to a target color gamut, and mapping the expanded HSI space back to the linear RGB space; expanding the brightness domain of the first video to a target brightness domain, where the target color gamut and target brightness domain are determined according to the video format of the target video, which is high dynamic range (HDR); and mapping the space of the expanded first video back to the source space to obtain the target video. The method converts SDR video into HDR video, so that a user can watch an SDR video with HDR effect on HDR-capable equipment, greatly improving the viewing experience.

Description

Video transcoding method and device, computer equipment and storage medium
Technical Field
The present application relates to the field of video processing technologies, and in particular to a video transcoding method and apparatus, a computer device, and a storage medium.
Background
With the development of video technology, a number of distinctive video format standards have emerged in the market, such as standard dynamic range (SDR) and high dynamic range (HDR). Among them, the HDR format offers richer colors than SDR.
At present, most videos are in the SDR format, and when video playback equipment plays them in SDR format, the viewing experience is poor.
Disclosure of Invention
The application provides a video transcoding method, a video transcoding device, a computer device, and a storage medium to solve the problem in the related art that playing video in SDR format results in a poor viewing experience.
An embodiment of one aspect of the present application provides a video transcoding method, including:
acquiring a video to be transcoded; the video format of the video to be transcoded is a standard dynamic range SDR;
mapping a source space of the video to be transcoded to a linear RGB space to obtain a first video;
mapping the space of the first video to an HSI space, expanding a local color gamut of S in the HSI space to a target color gamut, and mapping the expanded HSI space to a linear RGB space;
expanding a luminance domain of the first video to a target luminance domain; the target color gamut and the target brightness gamut are determined according to a video format of a target video, and the video format of the target video is a High Dynamic Range (HDR);
and mapping the space of the expanded first video to the source space to obtain the target video.
The video transcoding method provided by the embodiments of the application acquires a video to be transcoded, maps the source space of the video to be transcoded to a linear RGB space to obtain a first video, maps the space of the first video to an HSI space, expands the local color gamut of S in the HSI space to the target color gamut, maps the expanded HSI space back to the linear RGB space, expands the brightness domain of the first video to the target brightness domain, and finally maps the space of the expanded first video back to the source space to obtain the target video. In this way, after the video to be transcoded is mapped from the source space to the linear RGB space, color gamut expansion and brightness domain expansion are performed, and the expanded first video is mapped from the linear RGB space back to the source space to obtain the HDR video. SDR video is thus converted into HDR video, a user can watch an SDR video with HDR effect on HDR-capable equipment, and the viewing experience is greatly improved.
Another embodiment of the present application provides a video transcoding device, including:
the acquisition module is used for acquiring a video to be transcoded; the video format of the video to be transcoded is a standard dynamic range SDR;
the mapping module is used for mapping the source space of the video to be transcoded to a linear RGB space to obtain a first video;
the expansion module is used for mapping the space of the first video to an HSI space, expanding the local color gamut of S in the HSI space to a target color gamut and mapping the expanded HSI space to a linear RGB space;
the expansion module is further used for expanding the brightness domain of the first video to a target brightness domain; the target color gamut and the target brightness gamut are determined according to a video format of a target video, and the video format of the target video is a High Dynamic Range (HDR);
the mapping module is further configured to map the space of the expanded first video to the source space, so as to obtain the target video.
The video transcoding device provided by the embodiments of the application acquires a video to be transcoded, maps the source space of the video to be transcoded to a linear RGB space to obtain a first video, maps the space of the first video to an HSI space, expands the local color gamut of S in the HSI space to the target color gamut, maps the expanded HSI space back to the linear RGB space, expands the brightness domain of the first video to the target brightness domain, and finally maps the space of the expanded first video back to the source space to obtain the target video. In this way, after the video to be transcoded is mapped from the source space to the linear RGB space, color gamut expansion and brightness domain expansion are performed, and the expanded first video is mapped from the linear RGB space back to the source space to obtain the HDR video. SDR video is thus converted into HDR video, a user can watch an SDR video with HDR effect on HDR-capable equipment, and the viewing experience is greatly improved.
Another embodiment of the present application provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the video transcoding method as described in the above embodiment is implemented.
Another embodiment of the present application provides a non-transitory computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the non-transitory computer-readable storage medium implements the video transcoding method according to the above embodiment.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the application.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present application and together with the description, serve to explain the principles of the application.
Fig. 1 is a schematic flowchart of a video transcoding method according to an embodiment of the present disclosure;
fig. 2 is a schematic flowchart of another video transcoding method according to an embodiment of the present disclosure;
fig. 3 is a schematic flowchart of another video transcoding method according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a video transcoding apparatus according to an embodiment of the present disclosure;
FIG. 5 illustrates a block diagram of an exemplary computer device suitable for use in implementing embodiments of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. Where the following description refers to the drawings, the same numbers in different drawings denote the same or similar elements unless otherwise indicated. The implementations described in the following exemplary embodiments do not represent all embodiments consistent with the present application; rather, they are merely examples of apparatus and methods consistent with certain aspects of the present application, as detailed in the appended claims.
The video transcoding method, apparatus, computer device, and storage medium of the embodiments of the present application are described below with reference to the accompanying drawings.
With the development of terminal technology, more and more devices support HDR, but at present most videos are in SDR format, and playing them in SDR format affects the viewing experience. To solve this problem, an embodiment of the present application provides a video transcoding method.
In this video transcoding method, after the video to be transcoded is mapped from the source space to a linear RGB space, color gamut expansion and brightness domain expansion are performed, and the expanded first video is mapped from the linear RGB space back to the source space to obtain an HDR video. SDR video is thus converted into HDR video, a user can watch an SDR video with HDR effect on HDR-capable equipment, and the viewing experience is greatly improved.
Fig. 1 is a schematic flowchart of a video transcoding method according to an embodiment of the present application.
The video transcoding method in this embodiment may be executed by a video transcoding device, which may be a hardware device or software installed in a hardware device. The hardware device may be, for example, a mobile terminal or a server.
As shown in fig. 1, the video transcoding method may include the following steps:
step 101, acquiring a video to be transcoded; the video format of the video to be transcoded is standard dynamic range SDR.
In the related art, SDR denotes standard dynamic range images and HDR denotes high dynamic range images; HDR video has a wider color gamut and richer colors than SDR video. The embodiments of the present application therefore provide a method for converting video in SDR format into HDR format, so as to improve the viewing experience.
In this embodiment, a video with a video format of SDR is used as a video to be transcoded.
Step 102, mapping a source space of a video to be transcoded to a linear RGB space to obtain a first video.
Color is usually described by three independent attributes; the three independent variables combine to form spatial coordinates, that is, a color space. The described color itself is objective; different spaces merely measure the same object from different angles.
In this embodiment, the source space may be a YUV space. YUV is a color coding method that separates color from luminance, where "Y" represents brightness and "U" and "V" are the chrominance components. The RGB space is based on the three physical primary colors red, green, and blue; it is a primary color space.
In practice, video is generally transmitted in a YUV space, and the SDR video to be transcoded in this embodiment is in a YUV space. To facilitate the subsequent expansion of the color gamut and brightness domain of the first video, the YUV space of the video to be transcoded can be mapped to a linear RGB space to obtain the first video. Color gamut expansion is performed in the HSI space and brightness domain expansion in the RGB space; if the YUV space were mapped directly to the HSI space, the color gamut would have to be expanded first and the result then mapped to the RGB space for brightness domain expansion. In this embodiment the SDR video to be transcoded is therefore first mapped to the linear RGB space: if the brightness domain is expanded first, this is done directly, and the result is then mapped to the HSI space for color gamut expansion; if the color gamut is expanded first, the linear RGB space is mapped to the HSI space for color gamut expansion, and after the expansion the HSI space is mapped back to the linear RGB space for brightness domain expansion.
Step 103, mapping the space of the first video to an HSI space, expanding the local color gamut of S in the HSI space to a target color gamut, and mapping the expanded HSI space to a linear RGB space.
The HSI space reflects the way the human visual system perceives color, namely through the three basic characteristic quantities of hue, saturation, and intensity. That is, the HSI space describes color with the three parameters H, S, and I, where H defines the dominant wavelength of the color and is called hue; S represents how deep or pale the color is and is called saturation; and I denotes intensity or brightness.
Since the HSI space adapts to the visual characteristics of a human, in this embodiment, the color gamut is expanded after the linear RGB space of the first video is mapped to the HSI space.
Specifically, after the first video is obtained, the linear space of the first video is mapped to the HSI space, then the local color gamut of S in the HSI space is expanded to the target color gamut, and then the expanded HSI space is mapped to the linear RGB space.
The target color gamut is determined according to the format of the target video, which is HDR. Taking HDR10 as an example, the target color gamut may be the full color gamut, eighty percent of the full color gamut, or the like.
In a specific implementation, the linear RGB space of the first video may first be mapped to the XYZ space, with XYZ serving as a bridge, and the XYZ space is then mapped to the HSI space according to the chromaticity coordinates of the target video, for example the BT.2020 chromaticity coordinates. BT.2020 is the standard promulgated by the International Telecommunication Union Radiocommunication Sector (ITU-R) for the new generation of ultra-high definition (UHD) video production and display systems.
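As a non-normative illustration, the following Python sketch chains these mappings: linear BT.709 RGB is converted to linear BT.2020 RGB (the XYZ bridge folded into a single widely published 3x3 matrix, as in ITU-R BT.2087), and the result is converted to HSI with the standard formulas. The exact coefficients, the clipping, and the function names are assumptions of this sketch, not part of the patented method.

```python
import numpy as np

# Widely published linear BT.709 -> linear BT.2020 primaries conversion
# (the XYZ bridge pre-multiplied into one matrix, as in ITU-R BT.2087).
M_709_TO_2020 = np.array([
    [0.6274, 0.3293, 0.0433],
    [0.0691, 0.9195, 0.0114],
    [0.0164, 0.0880, 0.8956],
])

def rgb709_to_rgb2020(rgb):
    """Map linear BT.709 RGB in [0, 1] to linear BT.2020 RGB."""
    return np.clip(rgb @ M_709_TO_2020.T, 0.0, 1.0)

def rgb_to_hsi(rgb):
    """Standard RGB -> HSI conversion; H in radians, S and I in [0, 1]."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    i = (r + g + b) / 3.0
    s = np.where(i > 1e-8,
                 1.0 - np.minimum(np.minimum(r, g), b) / np.maximum(i, 1e-8),
                 0.0)
    num = 0.5 * ((r - g) + (r - b))
    den = np.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-8
    theta = np.arccos(np.clip(num / den, -1.0, 1.0))
    h = np.where(b <= g, theta, 2.0 * np.pi - theta)
    return np.stack([h, s, i], axis=-1)
```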
After the space of the first video is mapped to the HSI space, for S in the HSI space, the local color gamut with the highest chroma values in S is obtained, and that local color gamut is expanded to the target color gamut using a linear pull-up algorithm or according to a preset curve; meanwhile, the hue H can be adjusted appropriately according to a manual preset or pre-analysis, and I can be kept unchanged.
The linear pull-up algorithm multiplies each chroma value in the local color gamut by a fixed multiple to obtain the expanded chroma value; that is, the multiple is the same for every chroma value. Expanding the color gamut according to a preset curve means that the multiple differs from one chroma value to another.
For example, each pixel in the first video corresponds to a chroma value. For each pixel, if its chroma value falls within the top 10%, it is expanded with the linear pull-up algorithm; otherwise it is not expanded. Here, the top 10% refers to the range occupied by the 10% of all pixels of the first video with the highest chroma values, and the local color gamut is the set of chroma values within this top 10%, as sketched below.
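A minimal sketch of this top-10% expansion, assuming the HSI data produced above and a hypothetical fixed multiple `gain` (the patent does not fix its value; it depends on the target color gamut):

```python
import numpy as np

def expand_saturation(hsi, gain=1.2):
    """Linear pull-up of the saturation values that lie within the top 10%."""
    h, s, i = hsi[..., 0], hsi[..., 1].copy(), hsi[..., 2]
    threshold = np.percentile(s, 90)             # boundary of the top-10% local gamut
    mask = s >= threshold
    s[mask] = np.clip(s[mask] * gain, 0.0, 1.0)  # same fixed multiple for every value
    return np.stack([h, s, i], axis=-1)          # H and I are left unchanged here
```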
In this embodiment, the linear RGB space of the first video is transformed to the HSI space and the local color gamut of S in the HSI space is expanded to the target color gamut, so that the color gamut of the first video is expanded and the colors of the video are enriched.
Step 104, expanding the brightness domain of the first video to a target brightness domain.
The target luminance domain is determined according to the format of the target video, HDR. Taking HDR10 as an example, the target luminance domain is [0, 10000] nit, where the nit is the unit of luminance (luminous intensity per unit area), representing the intensity of light reaching the eye from an object in a given direction.
The luminance domain of SDR is [0, 100] nit. Since the SDR luminance domain tops out at 100 nit while the HDR10 luminance domain extends to 10000 nit, and SDR luminance is relative whereas HDR luminance is absolute, the 100-nit SDR luminance domain must be mapped into the HDR luminance domain.
Since the wider the luminance domain, the more image details are provided, in this embodiment, the luminance domain of the first video can be expanded. As a possible implementation manner, the luminance domain of the first video may be expanded to the target luminance domain by using preset parameters and corresponding preset functions. The preset function is shown in formulas (1), (2) and (3).
y = a × x^r    (1)
Where y is the target luminance value after expansion, x is the luminance value before expansion, a is a coefficient, and r is a preset parameter, for example, r is 1.68.
y = a × sin(x^γ) + β    (2)
Where y is the target luminance value after expansion, x is the luminance value before expansion, a represents a coefficient, γ represents a preset parameter, and β is a constant.
[Formula (3) appears only as an image (GDA0002364550350000051) in the original publication and is not reproduced here.]
where y is the target brightness value after expansion, x is the brightness value before expansion, α, μ, θ, and σ are coefficients, β and ω are constants, and γ represents a preset parameter.
Further, before the brightness domain is expanded by using the preset function, the initial preset parameters can be adjusted according to the brightness value of each pixel point.
Specifically, the brightness value of each pixel point in the first video can be obtained, and the initial preset parameter is adjusted according to these brightness values. For example, with an initial preset parameter of 1.68, the brightness values are sorted from high to low, the top 10% are selected and averaged, and the preset parameter is adjusted according to the average: the smaller the average, the smaller the adjusted preset parameter. A smaller average indicates that the video has larger dark areas and smaller bright areas, and an overly large preset parameter would make the contrast between dark and bright too pronounced; hence the smaller the average, the smaller the adjusted parameter.
After the preset parameter is adjusted, the brightness domain is expanded according to the adjusted preset parameter and the corresponding preset function.
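The following sketch combines this parameter adjustment with formula (1). The rule that maps the top-10% average to the adjusted parameter is hypothetical: the text only requires that a smaller average yield a smaller parameter.

```python
import numpy as np

def expand_luminance(x, a=1.0, r0=1.68):
    """Formula (1), y = a * x**r, with r adjusted from the top-10% mean.

    x holds normalized luminance values in [0, 1]; r0 = 1.68 is the example
    initial preset parameter from the text.
    """
    top10 = np.sort(x.ravel())[-max(1, x.size // 10):]   # brightest 10% of pixels
    r = 1.0 + (r0 - 1.0) * top10.mean()  # hypothetical rule: darker content -> smaller r
    return a * np.power(x, r)
```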
Step 105, mapping the space of the expanded first video to the source space to obtain the target video.
After the color gamut expansion and luminance domain expansion are performed on the first video, the first video is in a linear RGB space. To convert it into a playable video format, the linear RGB space of the first video needs to be mapped back to the source space, yielding a target video whose format is HDR. In this way, the present application converts a source video in SDR format into a target video in HDR format.
Specifically, the linear RGB space of the first video is first mapped according to the PQ curve specified for the target video, that is, the PQ curve defined by the HDR format, to obtain a nonlinear RGB space. The nonlinear RGB space is then mapped to the source space to obtain the target video in HDR format.
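For reference, a sketch of the PQ encoding fixed by SMPTE ST 2084 (the transfer function used by HDR10); the constants are those published in the standard:

```python
import numpy as np

# SMPTE ST 2084 (PQ) constants.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(luminance_nits):
    """Linear light in nits (up to 10000) -> non-linear PQ signal in [0, 1]."""
    y = np.clip(luminance_nits / 10000.0, 0.0, 1.0)
    ym = np.power(y, M1)
    return np.power((C1 + C2 * ym) / (1.0 + C3 * ym), M2)
```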
According to this video transcoding method, the video in SDR format is mapped from the source space to a linear RGB space, color gamut expansion and brightness domain expansion are then performed, and the linear RGB space is mapped back to the source space to obtain the target video in HDR format. The video in SDR format is thus converted into HDR, a user can view the HDR effect of the SDR video on HDR equipment, and the viewing experience is improved.
On the basis of the foregoing embodiment, mapping the source space of the video to be transcoded to the linear RGB space to obtain the first video may be implemented in the manner shown in fig. 2. Fig. 2 is a schematic flowchart of another video transcoding method according to an embodiment of the present application.
As shown in fig. 2, mapping the source space of the video to be transcoded to the linear RGB space may include the following steps:
step 201, mapping a source space of a video to be transcoded to a nonlinear RGB space.
In practical applications, when the source space is a YUV space, the YUV space of the video to be transcoded can be converted into a nonlinear RGB space according to the YUV-to-nonlinear-RGB conversion formula defined by the relevant SDR specification (such as BT.709 or BT.601).
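As an illustration, such a conversion might look like the sketch below; it assumes full-range Y'CbCr normalized to [0, 1], whereas real SDR streams are usually limited-range and need a range expansion first, so treat it as a simplification:

```python
import numpy as np

def ycbcr709_to_rgb(yuv):
    """Full-range BT.709 Y'CbCr in [0, 1] -> non-linear R'G'B' in [0, 1]."""
    y = yuv[..., 0]
    cb = yuv[..., 1] - 0.5            # center the chroma components
    cr = yuv[..., 2] - 0.5
    r = y + 1.5748 * cr               # BT.709 conversion coefficients
    g = y - 0.1873 * cb - 0.4681 * cr
    b = y + 1.8556 * cb
    return np.clip(np.stack([r, g, b], axis=-1), 0.0, 1.0)
```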
Step 202, normalizing the nonlinear RGB space of the video to be transcoded to obtain a normalized nonlinear RGB space.
To meet precision and expansion requirements, the nonlinear RGB space can be normalized to obtain a normalized nonlinear RGB space.
For example, an SDR-format video is quantized with a depth of 8 bits, that is, each color channel is quantized with 8 bits. During normalization, the R, G, and B values of each pixel may be divided by 255, normalizing the 8-bit nonlinear RGB space to the floating-point domain [0, 1], where 1 represents the maximum luminance and 0 the minimum luminance.
Step 203, mapping the normalized nonlinear RGB space to a linear RGB space to obtain a first video.
Video transmission uses a nonlinear RGB space, in which the dark range is enlarged: the number of pixels in dark regions is large, and since each pixel occupies the same number of bits, the total number of bits allocated to dark regions is increased. For the RGB space to reflect true luminance values, however, the nonlinear RGB space must be mapped to a linear RGB space.
In a specific implementation, the inverse of the opto-electronic transfer function (OETF) defined in BT.709 may be used to map the normalized nonlinear RGB space of the video to be transcoded to a linear RGB space, obtaining the first video.
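A minimal sketch of steps 202 and 203 together: normalize 8-bit R'G'B' to [0, 1] and invert the BT.709 OETF (the forward curve is V = 4.5L for L < 0.018 and V = 1.099L^0.45 - 0.099 otherwise):

```python
import numpy as np

def rgb8_to_linear(rgb8):
    """8-bit non-linear R'G'B' -> normalized linear RGB via inverse BT.709 OETF."""
    v = rgb8.astype(np.float64) / 255.0        # step 202: normalize to [0, 1]
    # Step 203: inverse OETF; 0.081 = 4.5 * 0.018 is the segment boundary.
    return np.where(v < 0.081,
                    v / 4.5,
                    np.power((v + 0.099) / 1.099, 1.0 / 0.45))
```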
According to this video transcoding method, the video to be transcoded is mapped to a nonlinear RGB space, normalization is performed, and the normalized nonlinear RGB space is mapped to a linear RGB space to obtain the first video, which facilitates the subsequent color gamut and brightness domain expansion.
In the color gamut expansion described above, the linear RGB space is mapped to the XYZ space and then to the HSI space, and the color gamut expansion is performed on S.
As another possible implementation of the color gamut expansion, the linear RGB space of the first video may be mapped to the CIELAB space. The CIELAB space is a uniform space: uniformity means that when the values change uniformly, human perception also changes uniformly. The CIELAB space comprises three parameters L*, a*, and b*, where L* represents lightness (L* = 0 indicates black, L* = 100 indicates white), a* indicates the position between red/magenta and green (negative values indicate green, positive values indicate magenta), and b* indicates the position between yellow and blue (negative values indicate blue, positive values indicate yellow).
In practical use, since the RGB space cannot be directly converted into the CIELAB space, the linear RGB space may be first converted into the XYZ space, and then mapped into the CIELAB space from the XYZ space.
After the linear RGB space is mapped to the CIELAB space, the local color gamut with the highest values in each dimension of the CIELAB space is expanded to the corresponding target color gamut; specifically, the local color gamuts with the highest values of L*, a*, and b* are expanded to their corresponding target color gamuts, thereby achieving the color gamut expansion of the first video.
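For reference, the standard XYZ-to-CIELAB conversion this alternative relies on can be sketched as follows; the D65 reference white is an assumption here, since the text does not name one:

```python
import numpy as np

XN, YN, ZN = 0.95047, 1.0, 1.08883    # assumed D65 reference white

def xyz_to_lab(xyz):
    """Standard CIE XYZ -> CIELAB; L* in [0, 100], a* and b* signed."""
    def f(t):
        delta = 6.0 / 29.0
        return np.where(t > delta ** 3,
                        np.cbrt(t),
                        t / (3.0 * delta ** 2) + 4.0 / 29.0)
    fx = f(xyz[..., 0] / XN)
    fy = f(xyz[..., 1] / YN)
    fz = f(xyz[..., 2] / ZN)
    l = 116.0 * fy - 16.0
    a = 500.0 * (fx - fy)
    b = 200.0 * (fy - fz)
    return np.stack([l, a, b], axis=-1)
```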
In the above embodiment, after the luminance domain of the first video is expanded with the preset parameter and the corresponding preset function, color halos may appear in the high-luminance portions of the image, mainly caused by excessive differences between adjacent luminance values.
To prevent such halos from degrading the HDR effect, the difference between the brightness value of each pixel in the first video and that of its adjacent pixels may be compared with a preset difference threshold. When the difference exceeds the threshold, the gap between the pixel and its neighbors is too large; the pixel can then be filtered with a filter to smooth the difference and avoid halos.
The filter has 8 coefficients, whose values can be determined by the filter type. For example, the pixel may be filtered with the filter {-1, 4, -11, 40, 40, -11, 4, -1}; these 8 coefficient values are those of a cosine-type filter.
As an example of filtering a pixel with the filter: for the current pixel, the luminance values of the 4 pixels to its left and the 4 pixels to its right in the horizontal direction are obtained, and these 8 values, taken from left to right, are weighted and summed with the 8 filter coefficients to obtain a horizontal result. Then the luminance values of the 4 pixels above and the 4 pixels below the current pixel in the vertical direction are obtained, and these 8 values, taken from top to bottom, are weighted and summed with the 8 filter coefficients to obtain a vertical result. Finally, the average of the horizontal and vertical results is taken as the luminance value of the current pixel, smoothing the luminance difference between the current pixel and its neighbors.
It is understood that the vertical direction may also be computed first and the horizontal direction second; this embodiment places no limit on the order.
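A minimal sketch of this smoothing step, assuming a luminance plane `luma` and ignoring image borders; the coefficients are the cosine-type filter quoted above, normalized by their sum (64):

```python
import numpy as np

COEFFS = np.array([-1, 4, -11, 40, 40, -11, 4, -1], dtype=np.float64)
COEFFS /= COEFFS.sum()    # the coefficients sum to 64, so this normalizes the filter

def smooth_pixel(luma, row, col):
    """Average the horizontal and vertical 8-tap responses at one pixel."""
    horiz = np.concatenate([luma[row, col - 4:col],        # 4 neighbors to the left
                            luma[row, col + 1:col + 5]])   # 4 neighbors to the right
    vert = np.concatenate([luma[row - 4:row, col],         # 4 neighbors above
                           luma[row + 1:row + 5, col]])    # 4 neighbors below
    return 0.5 * (horiz @ COEFFS + vert @ COEFFS)
```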
Further, to improve the HDR effect of the video, after the luminance domain of the first video is expanded to the target luminance domain with the preset function and preset parameter, each image block of each image may be further fine-tuned. Fig. 3 is a flowchart illustrating another video transcoding method according to an embodiment of the present application.
After the preset parameters and the corresponding preset functions are adopted to expand the luminance domain of the first video to the target luminance domain, as shown in fig. 3, the video transcoding method further includes:
step 301, for each image in a first video, dividing the image into a plurality of image blocks.
Since a video is composed of frame-by-frame images, in this embodiment each image in the first video is divided into a plurality of image blocks, for example into 3 × 3 blocks of equal area.
Step 302, for each image block, adjusting the preset parameter according to the difference of the brightness values between the image block and the adjacent image block.
In this embodiment, for each image block, the preset parameter may be adjusted according to the difference between the brightness value of the image block and that of its adjacent blocks. The brightness value of an image block may be taken as the sum of the brightness values of all pixels in the block divided by the number of pixels the block contains.
In a specific implementation, a correspondence between the difference and the adjustment amount of the preset parameter may be established in advance; after the difference between the brightness value of an image block and that of its adjacent blocks is determined, the adjustment amount is found by looking up this correspondence, and the preset parameter is adjusted accordingly. The larger the difference, the larger the adjusted preset parameter.
Step 303, adjusting the brightness value of the image block by adopting the adjusted preset parameter and the corresponding preset function.
After the preset parameter is adjusted, the brightness value of the image block is adjusted using the adjusted preset parameter and the corresponding preset function. The preset function formulas are as described in the foregoing embodiments.
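The per-block fine-tuning might be sketched as follows. Two simplifications are assumptions of the sketch: the luminance gap is averaged over all other blocks rather than only adjacent ones, and a linear rule stands in for the pre-built correspondence table (larger difference, larger parameter):

```python
import numpy as np

def adjust_blocks(luma, a=1.0, r=1.68, step=0.02):
    """Steps 301-303: split into 3x3 blocks and nudge the exponent per block."""
    h, w = luma.shape
    blocks = [(i * h // 3, (i + 1) * h // 3, j * w // 3, (j + 1) * w // 3)
              for i in range(3) for j in range(3)]
    # Block luminance: sum of pixel luminances divided by the pixel count.
    means = np.array([luma[r0:r1, c0:c1].mean() for r0, r1, c0, c1 in blocks])
    out = luma.copy()
    for k, (r0, r1, c0, c1) in enumerate(blocks):
        diff = np.abs(means - means[k]).mean()   # hypothetical stand-in for the table lookup
        out[r0:r1, c0:c1] = a * np.power(luma[r0:r1, c0:c1], r + step * diff)
    return out
```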
According to this video transcoding method, each image in the video is divided into a plurality of image blocks and the brightness values are fine-tuned block by block, which can improve the brightness expansion effect.
After the brightness values of the image blocks of each image in the first video are adjusted, it can be determined whether the difference between the brightness value of each pixel in the first video and that of its adjacent pixels exceeds the preset difference threshold, so that pixels exceeding the threshold are filtered with the filter. This avoids color halos, further improves the brightness expansion effect of the video, and thus improves its HDR effect.
In order to implement the foregoing embodiments, an embodiment of the present application further provides a video transcoding device. Fig. 4 is a schematic structural diagram of a video transcoding device according to an embodiment of the present application.
As shown in fig. 4, the video transcoding apparatus includes: an acquisition module 410, a mapping module 420, and an expansion module 430.
The obtaining module 410 is configured to obtain a video to be transcoded; the video format of the video to be transcoded is a standard dynamic range SDR.
The mapping module 420 is configured to map a source space of the video to be transcoded to a linear RGB space, so as to obtain a first video.
The expansion module 430 is configured to map a space of the first video to an HSI space, expand a local color gamut of S in the HSI space to a target color gamut, and map the expanded HSI space to a linear RGB space.
The expansion module 430 is further configured to expand the luminance domain of the first video to a target luminance domain; the target color gamut and the target brightness gamut are determined according to the video format of the target video, and the video format of the target video is a high dynamic range HDR.
The mapping module 420 is further configured to map the space of the expanded first video to a source space, so as to obtain a target video.
In one possible implementation manner of the embodiment of the present application, the mapping module 420 is specifically configured to,
mapping a source space of a video to be transcoded to a nonlinear RGB space;
normalizing the nonlinear RGB space of the video to be transcoded to obtain a normalized nonlinear RGB space;
and mapping the normalized nonlinear RGB space to a linear RGB space to obtain a first video.
In one possible implementation manner of the embodiment of the present application, the expansion module 430 is specifically configured to,
mapping a linear RGB space of a first video to an XYZ space;
mapping the XYZ space to the HSI space according to the chromaticity coordinate of the target video;
for S in the HSI space, obtaining the local color gamut with the highest chroma values in S;
and expanding the local color gamut to the target color gamut by adopting a linear pull-up algorithm or according to a preset curve.
In a possible implementation manner of the embodiment of the present application, the mapping module 420 is further configured to map a source space of a video to be transcoded to a linear RGB space to obtain a first video, and then map the linear RGB space of the first video to a CIELAB space;
the expansion module 430 is further configured to expand the local color gamut with the highest value in each dimension in the CIELAB space to the corresponding target color gamut.
In one possible implementation manner of the embodiment of the present application, the expansion module 430 is specifically configured to,
expanding the brightness domain of the first video to a target brightness domain by adopting preset parameters and a corresponding preset function;
for each pixel point in the first video, when the difference between the brightness values of the pixel point and an adjacent pixel point is larger than a preset difference threshold, filtering the pixel point with a filter to smooth the difference between the pixel point and the adjacent pixel point, where the filter is determined according to the brightness values of the adjacent pixel points.
In a possible implementation manner of the embodiment of the present application, the expansion module 430 is further specifically configured to,
for each image in the first video, dividing the image into a plurality of image blocks;
for each image block, adjusting the preset parameter according to the difference of the brightness values between the image block and the adjacent image block;
and adjusting the brightness value of the image block by adopting the adjusted preset parameters and the corresponding preset functions.
In a possible implementation manner of the embodiment of the present application, the expansion module 430 is further specifically configured to,
acquiring the brightness value of each pixel point in the first video;
and adjusting initial preset parameters according to the brightness value of each pixel point.
In one possible implementation manner of the embodiment of the present application, the mapping module 420 is specifically configured to,
mapping a linear RGB space of a first video according to a PQ curve specified by a target video to obtain a nonlinear RGB space;
and mapping the nonlinear RGB space to a source space to obtain a target video.
It should be noted that the foregoing explanation on the embodiment of the video transcoding method is also applicable to the video transcoding apparatus of this embodiment, and therefore, the details are not repeated herein.
The video transcoding device provided by the embodiments of the application acquires a video to be transcoded, maps the source space of the video to be transcoded to a linear RGB space to obtain a first video, maps the space of the first video to an HSI space, expands the local color gamut of S in the HSI space to the target color gamut, maps the expanded HSI space back to the linear RGB space, expands the brightness domain of the first video to the target brightness domain, and finally maps the space of the expanded first video back to the source space to obtain the target video. In this way, after the video to be transcoded is mapped from the source space to the linear RGB space, color gamut expansion and brightness domain expansion are performed, and the expanded first video is mapped from the linear RGB space back to the source space to obtain the HDR video. SDR video is thus converted into HDR video, a user can watch an SDR video with HDR effect on HDR-capable equipment, and the viewing experience is greatly improved.
In order to implement the foregoing embodiments, an embodiment of the present application further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and running on the processor, where when the processor executes the computer program, the video transcoding method as described in the foregoing embodiments is implemented.
FIG. 5 illustrates a block diagram of an exemplary computer device suitable for use in implementing embodiments of the present application. The computer device 12 shown in fig. 5 is only an example and should not bring any limitation to the function and scope of use of the embodiments of the present application.
As shown in FIG. 5, computer device 12 is in the form of a general purpose computing device. The components of computer device 12 may include, but are not limited to: one or more processors or processing units 16, a system memory 28, and a bus 18 that couples various system components including the system memory 28 and the processing unit 16.
Bus 18 represents one or more of any of several types of bus structures, including a memory bus or memory controller, a peripheral bus, an accelerated graphics port, and a processor or local bus using any of a variety of bus architectures. These architectures include, by way of example and not limitation, the Industry Standard Architecture (ISA) bus, the Micro Channel Architecture (MCA) bus, the Enhanced ISA (EISA) bus, the Video Electronics Standards Association (VESA) local bus, and the Peripheral Component Interconnect (PCI) bus.
Computer device 12 typically includes a variety of computer system readable media. Such media may be any available media that is accessible by computer device 12 and includes both volatile and nonvolatile media, removable and non-removable media.
The system Memory 28 may include computer system readable media in the form of volatile Memory, such as Random Access Memory (RAM) 30 and/or cache Memory 32. Computer device 12 may further include other removable/non-removable, volatile/nonvolatile computer system storage media. By way of example only, storage system 34 may be used to read from and write to non-removable, nonvolatile magnetic media (not shown in FIG. 5, and commonly referred to as a "hard drive"). Although not shown in FIG. 5, a disk drive for reading from and writing to a removable, nonvolatile magnetic disk (e.g., a "floppy disk") and an optical disk drive for reading from or writing to a removable, nonvolatile optical disk (e.g., a Compact disk Read Only Memory (CD-ROM), a Digital versatile disk Read Only Memory (DVD-ROM), or other optical media) may be provided. In these cases, each drive may be connected to bus 18 by one or more data media interfaces. Memory 28 may include at least one program product having a set (e.g., at least one) of program modules that are configured to carry out the functions of embodiments of the application.
A program/utility 40 having a set (at least one) of program modules 42 may be stored, for example, in memory 28, such program modules 42 including, but not limited to, an operating system, one or more application programs, other program modules, and program data, each of which examples or some combination thereof may comprise an implementation of a network environment. Program modules 42 generally perform the functions and/or methodologies of the embodiments described herein.
Computer device 12 may also communicate with one or more external devices 14 (e.g., keyboard, pointing device, display 24, etc.), with one or more devices that enable a user to interact with computer device 12, and/or with any devices (e.g., network card, modem, etc.) that enable computer device 12 to communicate with one or more other computing devices. Such communication may be through an input/output (I/O) interface 22. Moreover, computer device 12 may also communicate with one or more networks (e.g., a Local Area Network (LAN), a Wide Area Network (WAN), and/or a public Network such as the Internet) via Network adapter 20. As shown in FIG. 5, the network adapter 20 communicates with the other modules of the computer device 12 via the bus 18. It should be appreciated that although not shown in FIG. 5, other hardware and/or software modules may be used in conjunction with computer device 12, including but not limited to: microcode, device drivers, redundant processing units, external disk drive arrays, RAID systems, tape drives, and data backup storage systems, among others.
The processing unit 16 executes various functional applications and data processing by executing programs stored in the system memory 28, for example, implementing the video transcoding method provided by the above-described embodiment.
Embodiments of the present application further provide a non-transitory computer-readable storage medium, on which a computer program is stored, and when the computer program is executed by a processor, the non-transitory computer-readable storage medium implements the video transcoding method according to the foregoing embodiments.
The non-transitory computer readable storage medium described above may take any combination of one or more computer readable media. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read Only Memory (ROM), an Erasable Programmable Read Only Memory (EPROM) or flash Memory, an optical fiber, a portable compact disc Read Only Memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations of the present application may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" programming language or similar languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the remote-computer case, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
It should be noted that, in the description of the present application, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. In addition, in the description of the present application, "a plurality" means two or more unless otherwise specified.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
It should be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented by software or firmware stored in memory and executed by a suitable instruction execution system. If implemented in hardware, as in another embodiment, any one or a combination of the following techniques known in the art may be used: a discrete logic circuit with logic gates for implementing logic functions on data signals, an application-specific integrated circuit (ASIC) with appropriate combinational logic gates, a programmable gate array (PGA), a field programmable gate array (FPGA), and the like.
It will be understood by those skilled in the art that all or part of the steps carried by the method for implementing the above embodiments may be implemented by hardware related to instructions of a program, which may be stored in a computer readable storage medium, and when the program is executed, the program includes one or a combination of the steps of the method embodiments.
In addition, functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules are integrated into one module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. The integrated module, if implemented in the form of a software functional module and sold or used as a stand-alone product, may also be stored in a computer readable storage medium.
The storage medium mentioned above may be a read-only memory, a magnetic or optical disk, etc.
In the description herein, reference to the description of the term "one embodiment," "some embodiments," "an example," "a specific example," or "some examples," etc., means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described above, it is understood that the above embodiments are exemplary and should not be construed as limiting the present application, and that variations, modifications, substitutions and alterations may be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (9)

1. A method of video transcoding, comprising:
acquiring a video to be transcoded; the video format of the video to be transcoded is a standard dynamic range SDR;
mapping a source space of the video to be transcoded to a linear RGB space to obtain a first video;
mapping the space of the first video to an HSI space, expanding a local color gamut of S in the HSI space to a target color gamut, and mapping the expanded HSI space to a linear RGB space; wherein the expanding the local color gamut of S in the HSI space to the target color gamut comprises: for S in the HSI space, obtaining the local color gamut with the highest chroma values in S; and expanding the local color gamut to the target color gamut by adopting a linear pull-up algorithm or according to a preset curve;
expanding the brightness domain of the first video to a target brightness domain by adopting preset parameters and a corresponding preset function; the target color gamut and the target brightness gamut are determined according to a video format of a target video, and the video format of the target video is a High Dynamic Range (HDR);
the expanding the brightness domain of the first video to the target brightness domain by adopting the preset parameters and the corresponding preset functions comprises: acquiring the brightness value of each pixel point in the first video; adjusting initial preset parameters according to the brightness value of each pixel point; the expanding the brightness domain of the first video to the target brightness domain by adopting the preset parameters and the corresponding preset functions comprises: for each image in the first video, dividing the image into a plurality of image blocks; aiming at the image blocks, adjusting preset parameters according to the difference of the brightness values between the image blocks and the adjacent image blocks; adjusting the brightness value of the image block by adopting the adjusted preset parameters and the corresponding preset functions;
and mapping the space of the expanded first video to the source space to obtain the target video.
2. The method of claim 1, wherein mapping a source space of the video to be transcoded to a linear RGB space to obtain a first video comprises:
mapping a source space of the video to be transcoded to a nonlinear RGB space;
normalizing the nonlinear RGB space of the video to be transcoded to obtain a normalized nonlinear RGB space;
and mapping the normalized nonlinear RGB space to a linear RGB space to obtain the first video.
3. The method of claim 1, wherein the mapping the space of the first video to an HSI space, and the expanding the local color gamut of S in the HSI space to a target color gamut, comprise:
mapping a linear RGB space of the first video to an XYZ space;
and mapping the XYZ space to the HSI space according to the chromaticity coordinates of the target video.
4. The method of claim 1, wherein after mapping a source space of the video to be transcoded to a linear RGB space to obtain the first video, the method further comprises:
mapping a linear RGB space of the first video to a CIELAB space;
and expanding the local color gamut with the highest numerical value of each dimension in the CIELAB space to the corresponding target color gamut.
5. The method of claim 1, wherein the expanding the luminance domain of the first video to a target luminance domain comprises:
for each pixel point in the first video, when the difference between the brightness values of the pixel point and an adjacent pixel point is larger than a preset difference threshold, filtering the pixel point with a filter to smooth the difference between the pixel point and the adjacent pixel point; wherein the filter is determined according to the brightness values of the adjacent pixel points.
6. The method of claim 1, wherein the mapping the space of the expanded first video to the source space to obtain the target video comprises:
mapping the linear RGB space of the first video according to a PQ curve specified for the target video to obtain a nonlinear RGB space;
and mapping the nonlinear RGB space to the source space to obtain the target video.
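The PQ curve named in claim 6 is the SMPTE ST 2084 inverse EOTF; the constants below are those of the standard, while the per-channel application to linear light normalised against the 10,000 cd/m^2 PQ reference peak is the conventional usage assumed here.

```python
import numpy as np

M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(linear, peak_nits=10000.0):
    """Map linear RGB (in cd/m^2) to nonlinear PQ code values in [0, 1]."""
    y = np.clip(linear / peak_nits, 0.0, 1.0)
    ym = np.power(y, M1)
    return np.power((C1 + C2 * ym) / (1.0 + C3 * ym), M2)
```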
7. A video transcoding apparatus, comprising:
an acquisition module, configured to acquire a video to be transcoded, wherein a video format of the video to be transcoded is standard dynamic range (SDR);
a mapping module, configured to map a source space of the video to be transcoded to a linear RGB space to obtain a first video;
an expansion module, configured to map the space of the first video to an HSI space, expand a local color gamut of the S component in the HSI space to a target color gamut, and map the expanded HSI space back to a linear RGB space; wherein the expanding the local color gamut of S in the HSI space to the target color gamut comprises: obtaining, for the S component in the HSI space, a local color gamut with the highest saturation values in S; and expanding the local color gamut to the target color gamut by a linear pull-up algorithm or according to a preset curve;
wherein the expansion module is further configured to expand the brightness domain of the first video to a target brightness domain by using preset parameters and a corresponding preset function; the target color gamut and the target brightness domain are determined according to a video format of a target video, and the video format of the target video is high dynamic range (HDR);
wherein the expansion module is further configured to: before expanding the brightness domain of the first video to the target brightness domain by using the preset parameters and the corresponding preset function, acquire a brightness value of each pixel point in the first video and adjust initial preset parameters according to the brightness value of each pixel point; and, when expanding the brightness domain of the first video to the target brightness domain, divide each image in the first video into a plurality of image blocks, adjust, for each image block, the preset parameters according to the difference between the brightness values of the image block and its adjacent image blocks, and adjust the brightness value of the image block by using the adjusted preset parameters and the corresponding preset function;
the mapping module is further configured to map the space of the expanded first video to the source space, so as to obtain the target video.
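For orientation, the claim-7 modules can be wired together from the earlier sketches; the composition below is illustrative glue under the same assumptions, not the patent's implementation.

```python
class Transcoder:
    """Illustrative wiring of the claim-7 modules, reusing the earlier
    sketches (to_linear_rgb, source_rgb_to_target_hsi, pull_up_saturation,
    expand_luminance_blocks); the composition is an assumption."""

    def __init__(self, target_peak=1.0):
        self.target_peak = target_peak

    def transcode_frame(self, yuv):
        rgb = to_linear_rgb(yuv)                      # mapping module
        hsi = source_rgb_to_target_hsi(rgb)           # expansion module: gamut
        s = hsi[..., 1]
        hsi[..., 1] = pull_up_saturation(s, float(s.max()) + 1e-6, 1.0)
        hsi[..., 2] = expand_luminance_blocks(hsi[..., 2], self.target_peak)
        # The inverse HSI -> linear RGB mapping and the PQ / source-space
        # steps of claim 6 (see pq_encode above) would complete the pipeline.
        return hsi
```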
8. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, the processor when executing the program implementing a video transcoding method as claimed in any one of claims 1 to 6.
9. A non-transitory computer readable storage medium having stored thereon a computer program, which when executed by a processor implements a method of video transcoding as claimed in any of claims 1 to 6.
CN201811188614.1A 2018-10-12 2018-10-12 Video transcoding method and device, computer equipment and storage medium Active CN109274985B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811188614.1A CN109274985B (en) 2018-10-12 2018-10-12 Video transcoding method and device, computer equipment and storage medium

Publications (2)

Publication Number Publication Date
CN109274985A (en) 2019-01-25
CN109274985B (en) 2020-04-28

Family

ID=65196496

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811188614.1A Active CN109274985B (en) 2018-10-12 2018-10-12 Video transcoding method and device, computer equipment and storage medium

Country Status (1)

Country Link
CN (1) CN109274985B (en)

Families Citing this family (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110298896A * 2019-06-27 2019-10-01 Beijing QIYI Century Science and Technology Co., Ltd. Picture transcoding method and device, and electronic device
CN110475149B * 2019-08-30 2020-04-03 Guangzhou Bosi Information Technology Co., Ltd. Method and system for processing ultra-high-definition video
CN112261442B * 2020-10-19 2022-11-11 Shanghai Wondertek Software Co., Ltd. Method and system for real-time HDR and SDR transcoding of video
CN113489930B * 2021-06-10 2024-03-19 China Media Group Video signal processing method, device and storage medium
CN113676773B * 2021-08-16 2023-11-14 Guangzhou Huya Information Technology Co., Ltd. Video playing method, system, device, computer equipment and storage medium
CN114422829A * 2022-01-30 2022-04-29 Hangzhou Wulian Technology Co., Ltd. HDR cloud video processing method, system and equipment
CN114501023B * 2022-03-31 2022-07-26 Shenzhen SmartMore Information Technology Co., Ltd. Video processing method, device, computer equipment and storage medium
CN114693567B * 2022-05-30 2022-09-20 Shenzhen SmartMore Information Technology Co., Ltd. Image color adjusting method and device, computer equipment and storage medium
CN116167950B * 2023-04-26 2023-08-04 Rongming Microelectronics (Shanghai) Co., Ltd. Image processing method, device, electronic equipment and storage medium
CN117319620B * 2023-11-30 2024-03-08 Shenzhen Zunzheng Digital Video Co., Ltd. HDR preview-grade on-site real-time color grading method

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
ES2694858T3 (en) * 2014-05-28 2018-12-27 Koninklijke Philips N.V. Methods and apparatus for encoding HDR images, and methods and apparatus for the use of such encoded images
US10863201B2 (en) * 2015-12-21 2020-12-08 Koninklijke Philips N.V. Optimizing high dynamic range images for particular displays
US9984446B2 (en) * 2015-12-26 2018-05-29 Intel Corporation Video tone mapping for converting high dynamic range (HDR) content to standard dynamic range (SDR) content
EP3312798A1 (en) * 2016-10-20 2018-04-25 Thomson Licensing Method and device for inverse tone mapping
GB2558234B (en) * 2016-12-22 2020-05-13 Apical Ltd Image processing
CN107657594A * 2017-09-22 2018-02-02 Wuhan University A fast and high-quality tone mapping method and system
CN108495108B * 2018-03-02 2019-11-05 Shenzhen Skyworth-RGB Electronic Co., Ltd. Image conversion method and device, terminal, storage medium

Patent Citations (10)

Publication number Priority date Publication date Assignee Title
CN105850114A * 2013-12-27 2016-08-10 Thomson Licensing Method for inverse tone mapping of an image
CN106233706A * 2014-02-25 2016-12-14 Apple Inc. Apparatus and method for providing backward compatibility of video with both standard and high dynamic range
CN104978945A * 2014-04-14 2015-10-14 Shenzhen TCL New Technology Co., Ltd. Image saturation enhancement method and apparatus
CN105205787A * 2014-06-20 2015-12-30 Thomson Licensing Method and apparatus for dynamic range expansion of LDR video sequence
CN104463820A * 2014-10-29 2015-03-25 Guangdong University of Technology Reverse tone mapping algorithm based on frequency domain
CN107211141A * 2015-01-30 2017-09-26 Thomson Licensing Method and apparatus for inverse tone mapping (ITM) of an image
CN106878694A * 2015-12-10 2017-06-20 Realtek Semiconductor Corp. High dynamic range signal processing system and method
CN108605125A * 2016-01-28 2018-09-28 Koninklijke Philips N.V. Encoding and decoding HDR videos
CN107154059A * 2017-06-26 2017-09-12 Hangzhou Danghong Technology Co., Ltd. A high dynamic range video processing method
CN108109180A * 2017-12-12 2018-06-01 Shanghai Shunjiu Electronic Technology Co., Ltd. Method and display device for processing an input high dynamic range image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant