CN115393197A - Image synthesis method and device and image processing equipment


Info

Publication number: CN115393197A
Application number: CN202211323465.1A
Authority: CN (China)
Prior art keywords: image, dynamic range, luminance, images, linear
Legal status: Granted; currently Active (the legal status listed is an assumption, not a legal conclusion; no legal analysis has been performed)
Other languages: Chinese (zh)
Other versions: CN115393197B (en)
Inventor: not disclosed
Current and Original Assignee: Moore Threads Technology Co Ltd
Application filed by Moore Threads Technology Co Ltd
Priority to CN202211323465.1A
Publication of CN115393197A; application granted; publication of CN115393197B


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G06T3/04
    • G06T5/90
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10004 Still image; Photographic image
    • G06T2207/10016 Video; Image sequence
    • G06T2207/20 Special algorithmic details
    • G06T2207/20172 Image enhancement details
    • G06T2207/20208 High dynamic range [HDR] image processing

Abstract

The invention discloses an image synthesis method and apparatus and an image processing device, relating to the field of image processing. The image synthesis method includes: converting the non-linear luminance signals of a plurality of first images into linear luminance signals, respectively, to obtain a plurality of first linear luminance images; synthesizing the plurality of first linear luminance images into one second linear luminance image in a spatial order; and converting the second linear luminance image into a second image whose dynamic range standard differs from at least one of the dynamic range standards of the plurality of first images. Because the images to be stitched are converted into a linear luminance space and stitched there, their original luminance information is preserved during stitching; and because the stitched image is converted into an image with a uniform dynamic range standard, the stitched whole image can be displayed correctly.

Description

Image synthesis method and device and image processing equipment
Technical Field
The present invention relates to the field of image processing, and in particular, to an image synthesis method and apparatus, and a graphics processing device.
Background
With the development of multimedia technology, image data and video data are spread ever more widely, and their data formats have diversified: images are classified, for example, into the Standard Dynamic Range (SDR) format, the High Dynamic Range (HDR) format, and so on. On the display side, however, a display often supports images or video of only one dynamic range standard.
How to display, on the same display, a whole image composed of multiple images with different dynamic range standards has therefore become an urgent problem to be solved.
This section is intended to provide a background or context to the embodiments of the invention that are recited in the claims. The description herein is not admitted to be prior art by inclusion in this section.
Disclosure of Invention
In order to solve at least one of the above problems or other similar problems, embodiments of the present invention provide an image synthesis method, apparatus, and graphics processing device.
According to a first aspect of embodiments of the present invention, there is provided an image synthesis method, including: converting the non-linear luminance signals of a plurality of first images into linear luminance signals, respectively, to obtain a plurality of first linear luminance images; synthesizing the plurality of first linear luminance images into one second linear luminance image in a spatial order; and converting the second linear luminance image into a second image, the dynamic range standard of the second image being different from at least one of the dynamic range standards of the plurality of first images.
According to a second aspect of the embodiments of the present invention, there is provided an image synthesizing apparatus including: a first conversion unit which converts the nonlinear luminance signals of the plurality of first images into linear luminance signals, respectively, to obtain a plurality of first linear luminance images; a synthesizing unit that synthesizes the plurality of first linear luminance images into one second linear luminance image in a spatial order; a second conversion unit that converts the second linear luminance image into a second image having a dynamic range standard different from at least one of dynamic range standards of the plurality of first images.
According to a third aspect of embodiments of the present invention, there is provided a graphics processing apparatus including: a communication interface, communicatively connected to a display device, for obtaining a target dynamic range standard of the display device and a target peak brightness of the display device, where the target dynamic range standard is a highest dynamic range standard that the display device can support, and the target peak brightness is a maximum peak brightness that the display device can display; and a control device which converts the non-linear luminance signals of the plurality of first images into linear luminance signals to obtain a plurality of first linear luminance images, synthesizes the plurality of first linear luminance images into a second linear luminance image in a spatial order, and converts the second linear luminance image into a second image according to the target dynamic range standard, the second image having a dynamic range standard which is different from at least one of the dynamic range standards of the plurality of first images and the same as the target dynamic range standard.
The embodiment of the invention also provides an image synthesis method, which comprises the following steps: converting a nonlinear brightness signal of a target image into a linear brightness signal to obtain a corresponding target linear brightness image, wherein the target image is a first image different from a second image in dynamic range standard in a plurality of first images; converting the target linear luminance image into a fifth image having the same dynamic range standard as the second image; and synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to a spatial sequence.
The embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and the processor implements the method when executing the computer program.
An embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, and the computer program, when executed by a processor, implements the above method.
An embodiment of the present invention further provides a computer program product, where the computer program product includes a computer program, and when the computer program is executed by a processor, the computer program implements the method described above.
In the embodiment of the invention, the images to be stitched are converted into a linear luminance space and stitched in that space, so that the original luminance information of the images is preserved during stitching, and the stitched image is converted into an image with a uniform dynamic range standard, so that the stitched whole image can be displayed correctly.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to the drawings without creative efforts.
Fig. 1 is a schematic diagram of an image synthesis method according to an embodiment of the first aspect of the present invention.
Fig. 2 is another schematic diagram of an image synthesis method according to an embodiment of the first aspect of the present invention.
Fig. 3 is a schematic diagram of a data flow of the image synthesizing method shown in fig. 2.
Fig. 4 is yet another schematic diagram of an image synthesis method according to an embodiment of the first aspect of the present invention.
Fig. 5 is a schematic diagram of a second image of an embodiment of the first aspect of the invention.
Fig. 6 is a flow chart of obtaining a corrected histogram according to an embodiment of the first aspect of the present invention.
Fig. 7 is a schematic diagram of an image synthesizing apparatus according to an embodiment of the second aspect of the present invention.
FIG. 8 is a schematic diagram of a graphics processing apparatus according to an embodiment of the third aspect of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention more apparent, the embodiments of the present invention are further described in detail below with reference to the accompanying drawings. The exemplary embodiments and descriptions of the present invention are provided to explain the present invention, but not to limit the present invention.
In the related art, the display side for displaying images and videos often supports images or video of only one dynamic range standard. For example, if a display supports only playback of images or video in SDR format and it is desired to play HDR-format images or video on that display, the HDR-format images or video must first be converted to SDR format.
In a scenario where multiple images of different dynamic range standards need to be displayed simultaneously on the same display, the dynamic range standard supported by the display cannot satisfy all of the images at once; for example, it may be desired to display an HDR-format video and an SDR-format video together on an SDR display, or together on an HDR display. The inventors found that if image data of the two formats is stitched in a non-linear luminance space, for example an HDR-format image or video stitched with an SDR-format image or video, the display cannot know where the splice lies and performs a single non-linear-to-linear luminance conversion on the whole stitched image. As a result, on an SDR display only the SDR-format portion of the whole image is displayed correctly, while the luminance and color of the HDR-format portion are not; on an HDR display only the HDR-format portion is displayed correctly, while the luminance and color of the SDR-format portion are not, so the intended display effect cannot be achieved.
To solve the above problems, the present disclosure provides an image synthesis method, an apparatus, and a graphics processing device.
Embodiments of the first aspect
An embodiment of the first aspect of the present invention provides an image synthesis method, and fig. 1 is a schematic diagram of the image synthesis method of the embodiment of the first aspect of the present invention.
As shown in fig. 1, the method 100 includes:
step 101: respectively converting the nonlinear brightness signals of the plurality of first images into linear brightness signals to obtain a plurality of first linear brightness images;
step 103: synthesizing the plurality of first linear brightness images into a second linear brightness image according to a spatial sequence;
step 105: converting the second linear luminance image into a second image, the dynamic range standard of the second image being different from at least one of the dynamic range standards of the plurality of first images.
In the embodiment of the present invention, the "non-linear luminance signal" refers to, for example, that the electric signal of a pixel in an image has a non-linear relationship with the luminance actually displayed by the pixel when the image is displayed using the display, and the "linear luminance signal" refers to, for example, that the electric signal of a pixel in an image has a linear relationship with the luminance actually displayed by the pixel when the image is displayed using the display. The meaning of the "non-linear luminance signal" and the "linear luminance signal" may refer to the related art, and the embodiment of the present invention does not limit this.
Further, for convenience of description hereinafter, an image having "linear luminance signal" is sometimes referred to as an image located in a linear luminance space, or the like, for example, a first linear luminance image is referred to as an image located in a linear luminance space; the "converting a non-linear luminance signal of an image into a linear luminance signal" is also sometimes referred to as "converting an image into a linear luminance space", or the like.
In this way, the images to be stitched are converted into a linear luminance space and stitched in that space, the original luminance information of the images is preserved during stitching, and the stitched image is converted into an image with a uniform dynamic range standard, so that the stitched whole image can be displayed correctly.
For convenience of description, the dynamic range standard of the first image may be referred to as the "first dynamic range standard" and the dynamic range standard of the second image as the "second dynamic range standard". In addition, the dynamic range standards of the plurality of first images may be the same or different; that is, the plurality of first dynamic range standards may be the same dynamic range standard or different ones. For example, the plurality of first images may all be images of the HDR standard, may all be images of the SDR standard, or may include both images of the HDR standard and images of the SDR standard.
In at least one embodiment, the first dynamic range standard or the second dynamic range standard may be an HDR standard; for example, the first image or the second image may be an image or video of Hybrid Log-Gamma (HLG), Perceptual Quantization (PQ), HDR10+, Dolby Vision, HDR Vivid, or the like. The first dynamic range standard or the second dynamic range standard may also be an SDR standard or a Low Dynamic Range (LDR) standard, which is not limited by the embodiment of the present invention.
Wherein the second image has a different dynamic range criterion than at least one of the dynamic range criteria of the plurality of first images. When a first image with the same dynamic range standard as that of a second image exists, the non-linear brightness signal of the first image can be converted to obtain a corresponding first linear brightness image, and the first linear brightness images corresponding to other first images are synthesized into a second linear brightness image according to the spatial sequence.
Fig. 2 is another schematic diagram of an image synthesis method according to an embodiment of the first aspect of the present invention.
In at least one embodiment, as shown in fig. 2, step 101 may comprise:
step 1011: converting the non-linear brightness signal of the corresponding first image into a corresponding linear brightness signal according to an electro-optical conversion function corresponding to the first dynamic range standard to obtain a corresponding third linear brightness image;
step 1013: the plurality of third linear luminance images are converted into a corresponding plurality of first linear luminance images according to the target color gamut.
In step 103', the plurality of first linear luminance images are spatially combined into one second linear luminance image.
In step 105', the second linear luminance image is converted into the second image according to a photoelectric conversion function.
Fig. 3 is a schematic diagram of a data flow of the image synthesizing method shown in fig. 2.
As shown in fig. 3, the Electro-Optical Transfer Function (EOTF) of the first image 301 is 311, the EOTF of the first image 302 is 321, and the color conversion corresponding to the target color gamut is 303. The first image 301 passes through EOTF 311 to become a third linear luminance image 312, and the first image 302 passes through EOTF 321 to become a third linear luminance image 322. The third linear luminance image 312 passes through the color conversion 303 to become a first linear luminance image 313, and the third linear luminance image 322 likewise becomes a first linear luminance image 323. The stitching process 304 spatially combines the first linear luminance images 313 and 323 into one second linear luminance image 305, which is then converted by the Opto-Electrical Transfer Function (OETF) 306 into the second image 307.
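The fig. 3 data flow can be sketched end to end as follows. This is a minimal illustration only: the transfer functions, the identity color conversion, and the left-right stitching order are simplified stand-ins, not the patented implementation.

```python
import numpy as np

def eotf_gamma(e, gamma=2.2):
    # EOTF (311/321): electrical signal -> linear luminance.
    return np.power(e, gamma)

def oetf_gamma(y, gamma=2.2):
    # OETF (306): linear luminance -> electrical signal.
    return np.power(y, 1.0 / gamma)

def to_target_gamut(rgb, matrix=None):
    # Color conversion (303): identity here; a real system would apply
    # a 3x3 color-space matrix or a 3D LUT.
    return rgb if matrix is None else rgb @ matrix.T

def synthesize(first_images, eotfs):
    # Steps 101/103/105: linearize each image, stitch in spatial order
    # (left-right here), then re-encode the stitched whole image once.
    linear = [to_target_gamut(f(img)) for img, f in zip(first_images, eotfs)]
    stitched = np.concatenate(linear, axis=1)
    return oetf_gamma(stitched)

img_a = np.full((2, 2, 3), 0.25)
img_b = np.full((2, 2, 3), 0.75)
second_image = synthesize([img_a, img_b], [eotf_gamma, eotf_gamma])
print(second_image.shape)  # (2, 4, 3)
```

Because stitching happens between the EOTF and the OETF, the pixel values of both tiles survive the round trip unchanged, which is the point of working in the linear luminance space.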
For example, in step 1011, a first image of the HDR standard is converted into a third linear luminance image using the electro-optical transfer function corresponding to the HDR standard, for example a PQ curve or an HLG curve, and a first image of the SDR standard is converted using the electro-optical transfer function corresponding to the SDR standard, for example a gamma curve.
For example, in step 1013, suppose the target color gamut is the BT.709 or NTSC color gamut standard corresponding to the SDR standard. For a third linear luminance image corresponding to the HDR standard, its color gamut, for example the BT.2020 or DCI-P3 color gamut standard, is converted into the target color gamut through a color space conversion matrix or a three-dimensional color space lookup table. For a third linear luminance image corresponding to the SDR standard, no conversion is performed if its color gamut standard is already the same as the target color gamut; otherwise, it can likewise be converted through a color space conversion matrix or a three-dimensional color space lookup table. Conversely, if the target color gamut is the BT.2020 or DCI-P3 color gamut corresponding to the HDR standard, the color gamut of a third linear luminance image corresponding to the SDR standard, or of an HDR image whose gamut differs from the target, may be converted into the target color gamut in the same way. The embodiment of the present invention does not limit the color space conversion matrix or the three-dimensional color space lookup table; reference may be made to the related art.
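A color-space conversion matrix of the kind mentioned in step 1013 can be sketched as below. The 3x3 matrix is the commonly published linear BT.2020-to-BT.709 conversion; the patent does not specify matrix values, so these numbers are an illustrative assumption.

```python
import numpy as np

# Commonly published matrix for converting linear BT.2020 RGB to linear
# BT.709 RGB (illustrative; not taken from the patent itself).
BT2020_TO_BT709 = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

def convert_gamut(linear_rgb, m=BT2020_TO_BT709):
    # Apply the matrix per pixel; out-of-gamut results are clipped to [0, 1].
    out = np.einsum('ij,...j->...i', m, linear_rgb)
    return np.clip(out, 0.0, 1.0)

white = np.ones((1, 1, 3))
print(convert_gamut(white))  # white maps (approximately) to white
```

A three-dimensional lookup table would replace the matrix multiply with a per-pixel interpolation into a sampled 3D grid, which can also model non-linear gamut-mapping strategies.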
Therefore, by stitching the images in the linear luminance space, the original luminance information of the images to be stitched is preserved during stitching, so that essentially no luminance information of the whole image is lost after the dynamic range standard conversion, and the displayed picture achieves the expected effect.
In addition, in fig. 3, the example that two first images are synthesized into one second image is described, but the embodiment of the present invention is not limited thereto, and three or more first images may be synthesized, and the method may refer to the method for synthesizing two first images, and is not described herein again.
In at least one embodiment, the electrical-to-optical transfer function is constructed from at least one of the first dynamic range criterion, a peak brightness of the first image, and metadata information of the first image.
For example, in step 1011, the first dynamic range standard is, for example, the SDR standard, and the electro-optical transfer function is, for example, a gamma (GAMMA) curve. In the case of gamma 2.2, the EOTF may be the following equation (1):
Y = (E')^2.2 (1),
where E' represents the electrical signal of the current pixel and Y represents the normalized luminance.
in at least one embodiment, the first dynamic range criterion may be, for example, an HDR criterion, the EOTF may be, for example, a Perceptual Quantization (PQ) curve, and, for example, PQ 1000, the EOTF may be the following equation (2):
Y=(max(E'^(1/m2)-c1,0)/(c2-c3*E'^(1/m2)))^(1/m1) (2),
wherein E' represents the electrical signal of the current pixel;
y represents the normalized luminance;
m1 = 2610/16384 = 0.1593;
m2 = 2523 × 128/4096 = 78.84375;
c1 = 3424/4096 = 0.8359;
c2 = 2413 × 32/4096 = 18.8516;
c3 = 2392 × 32/4096 = 18.6875.
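Equation (2) can be checked numerically. The sketch below implements the PQ EOTF with the standard ST 2084 constants written out in full precision; the function name is illustrative.

```python
import numpy as np

# ST 2084 / BT.2100 PQ EOTF constants, as in equation (2).
m1 = 2610 / 16384        # 0.1593017578125
m2 = 2523 * 128 / 4096   # 78.84375
c1 = 3424 / 4096         # 0.8359375
c2 = 2413 * 32 / 4096    # 18.8515625
c3 = 2392 * 32 / 4096    # 18.6875

def pq_eotf(e):
    # e: normalized electrical signal E' in [0, 1]; returns normalized
    # luminance Y in [0, 1], where Fd = 10000 * Y in nits.
    ep = np.power(e, 1.0 / m2)
    return np.power(np.maximum(ep - c1, 0.0) / (c2 - c3 * ep), 1.0 / m1)

print(float(pq_eotf(1.0)))  # full signal -> Y = 1.0 (10000 nits)
print(float(pq_eotf(0.0)))  # zero signal -> Y = 0.0
```

Note that at E' = 1 the numerator and denominator both equal 1 − c1 = c2 − c3, so the curve reaches exactly Y = 1 at peak signal.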
Furthermore, the actual luminance value Fd may be expressed as 10000 × Y, in units of nits (nit). In addition, the actual luminance value may also be converted to a fixed-point representation based on 2^24 and 10000 × Y.
In at least one embodiment, the first dynamic range standard may be, for example, an HDR standard, the EOTF may be, for example, a Hybrid Log-Gamma (HLG) curve, and the EOTF may be the following equation (3):
Fd = EOTF[max(0, (1 − beta) × E' + beta)] = OOTF[OETF^(-1)[max(0, (1 − beta) × E' + beta)]] (3),
wherein E' is the electrical signal of the current pixel;
the formula of the OOTF is:
Fd = OOTF[E] = alpha × Ys^(gamma − 1) × E
Ys = 0.2627 × Rs + 0.6780 × Gs + 0.0593 × Bs
where Rs, Gs, Bs are linear RGB data;
gamma can be 1.2 or another value, and can be set according to actual needs, which is not limited in the embodiment of the present invention;
beta and alpha are predefined variables, conforming to the definition in the ITU-R BT.2100 standard, where alpha is, for example, associated with the peak brightness of the display, beta is, for example, associated with the black level of the display, which is, for example, the brightness of the display at an input value of 0 for the signal;
the formula of OETF^(-1) is:
when 0 <= x <= 1/2, OETF^(-1)[x] = x^2 / 3;
when 1/2 < x <= 1, OETF^(-1)[x] = {exp((x − c)/a) + b} / 12;
where a = 0.17883277, b = 1 − 4a = 0.28466892, c = 0.5 − a × ln(4a) = 0.55991073.
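The HLG inverse OETF can be sketched directly from the ITU-R BT.2100 definitions, where the lower branch is x²/3 and c = 0.5 − a·ln(4a). The continuity of the two branches at x = 1/2 (both give 1/12) is a useful sanity check.

```python
import math

# HLG constants from ITU-R BT.2100.
a = 0.17883277
b = 1 - 4 * a                   # 0.28466892
c = 0.5 - a * math.log(4 * a)   # 0.55991073

def hlg_inverse_oetf(x):
    # Maps a non-linear HLG signal x in [0, 1] back to linear scene
    # light E in [0, 1].
    if x <= 0.5:
        return (x * x) / 3.0
    return (math.exp((x - c) / a) + b) / 12.0

print(hlg_inverse_oetf(0.5))  # branch boundary: 1/12
print(hlg_inverse_oetf(1.0))  # peak signal: approximately 1.0
```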
Furthermore, the metadata information of the first image may include at least one of dynamic metadata information and static metadata information. Metadata information describes key information or features of the video or image during processing. For example, the static metadata information may be an upper limit of the maximum pixel-level luminance of the entire image, as defined in the SMPTE ST 2086 standard; the dynamic metadata information may be an upper or lower limit of luminance for each frame or each scene in the video, as defined in the SMPTE ST 2094 standard. The embodiment of the present invention does not limit this.
In step 1013, the target color gamut may be a color gamut standard executed by the output device itself, for example, bt.2020, DCI-P3, bt.709, NTSC, and the like, and the output device may be, for example, a display, a projector, and the like, which is not limited in this embodiment of the present invention; further, the target color gamut may be acquired in advance or stored in advance according to actual needs.
In step 103 or step 103', the spatial order may be an up-down order or a left-right order, or an order defined by a predetermined template such as a 2 × 2 grid or a 3 × 3 grid. When the number of first images is not enough to fill every area of the predetermined template, an area where no first image is placed may be filled with a blank image, which may be a solid-color or predetermined-pattern image conforming to the same target dynamic range standard. The embodiment of the present invention does not limit the specific spatial order of the stitching: it may be user-defined or preset according to actual needs, and for the details of how stitching is performed, reference may be made to the related art.
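Stitching into a grid template with blank-image fill can be sketched as below. The row-major placement order and the solid-color fill value are illustrative assumptions.

```python
import numpy as np

def stitch_grid(images, rows, cols, tile_h, tile_w, fill=0.0):
    # Place images into a rows x cols template in row-major order;
    # cells with no image are filled with a solid-color blank tile.
    canvas = np.full((rows * tile_h, cols * tile_w, 3), fill)
    for idx, img in enumerate(images):
        r, col = divmod(idx, cols)
        canvas[r * tile_h:(r + 1) * tile_h,
               col * tile_w:(col + 1) * tile_w] = img
    return canvas

tiles = [np.full((4, 4, 3), v) for v in (0.2, 0.5, 0.8)]  # only 3 images
grid = stitch_grid(tiles, rows=2, cols=2, tile_h=4, tile_w=4)
print(grid.shape)  # (8, 8, 3); the bottom-right cell stays blank
```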
In at least one embodiment, the photoelectric conversion function is constructed from at least one of a target dynamic range criterion, a target peak luminance, a target minimum luminance, and an ambient illuminance parameter.
For example, in step 105', the OETF may be the inverse function of the EOTF in step 1011. Whereas the EOTF in step 1011 is constructed from information such as the dynamic range standard and peak luminance of the first image, the OETF replaces that information with the corresponding information of the output device: the target dynamic range standard and the target peak luminance. For example, the target dynamic range standard may be the dynamic range standard of the display (the HDR standard for an HDR display, the SDR standard for an SDR display), and the target peak luminance is the maximum luminance the output device can provide, for example the peak luminance a display can output or the maximum luminance a projector can output; the embodiment of the present invention does not limit this. In addition, the target dynamic range standard, target peak luminance, and similar information may be acquired or stored in advance according to actual needs.
In addition, when constructing the OETF, the parameters of the function may be adjusted in combination with other parameters of the output device and the illuminance of the environment in which the output device is located. For example, the black level of the display, that is, the luminance of the display when the input signal value is 0, may be taken into account; hereinafter the "black level" is sometimes referred to as the "target minimum luminance". The black level value of the display or the ambient illuminance may be converted into the non-linear domain, with the black level value corresponding to the lowest value of the non-linear mapping and the ambient illuminance value corresponding to the highest value. An interface may also be provided for the user to input the ambient illuminance, making it convenient to adjust for the actual environment. When the OETF is constructed with the display's black level taken into account, the luminance range of the image can be mapped as completely as possible into the luminance range of the display, so that the capability of the display is fully exploited; when the OETF is constructed with the ambient illuminance taken into account, the magnitude and range of the luminance output by the display can be adjusted to suit the characteristics of human visual perception, providing comfortable brightness and contrast for the user.
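One way such a device-aware OETF could look is sketched below: linear luminance is mapped into the display's usable range between black level and target peak before inverse-gamma encoding. All parameter names and the specific mapping are hypothetical illustrations, not the patented construction.

```python
import numpy as np

def build_oetf(target_peak_nits, black_level_nits, gamma=2.2):
    # Hypothetical sketch: clamp linear luminance (in nits) to the
    # display's usable range [black_level, target_peak], normalize,
    # then apply an inverse-gamma encoding.
    def oetf(y_nits):
        y = np.clip(y_nits, black_level_nits, target_peak_nits)
        norm = (y - black_level_nits) / (target_peak_nits - black_level_nits)
        return np.power(norm, 1.0 / gamma)
    return oetf

oetf = build_oetf(target_peak_nits=300.0, black_level_nits=0.1)
print(float(oetf(np.array(300.0))))  # peak luminance maps to full signal 1.0
```

An ambient-illuminance parameter would adjust `target_peak_nits` or `gamma` in the same constructor, which is why the text suggests exposing it through a user interface.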
In at least one embodiment, as shown in fig. 2, the method 100 may further comprise:
step 107: expanding a dynamic range of the second image according to at least one of a luminance value of the second image and a target peak luminance.
Thus, by expanding the dynamic range of the image, the contrast of the image can be adjusted, and the sense of gradation of the image can be increased.
In step 107, the dynamic range of the second image may be expanded according to the luminance values of the second image, for example, a histogram of the luminance values of all pixels of the second image may be counted, and the dynamic range of the second image may be expanded by histogram equalization; the dynamic range of the second image may also be expanded according to the luminance value of the second image and the target peak luminance, and for example, a luminance mapping curve may be constructed based on the luminance value of the second image and the target peak luminance, and the dynamic range of the second image may be expanded using the luminance mapping curve. In addition, the dynamic range of the second image may be expanded according to the target peak luminance, for example, the luminance values of the pixels in the second image whose luminance values exceed the target peak luminance may all be changed to the target peak luminance, and the luminance values of the other pixels in the second image may be adjusted accordingly.
The embodiment of the present invention does not limit the manner of expanding the dynamic range, and for example, the dynamic range of the second image may be expanded according to the histogram of the luminance values of the second image. For example, in the case where the second image is the SDR standard, the dynamic range of the second image may be extended according to a histogram of luminance values of the second image. However, the embodiment of the present invention is not limited to this, and when the second image is the HDR standard, the dynamic range of the second image may also be expanded according to the histogram of the luminance values of the second image.
Fig. 4 is yet another schematic diagram of an image synthesis method according to an embodiment of the first aspect of the present invention.
As shown in fig. 4, step 107 may include:
step 1071: counting in histogram form the frequency of luminance values of pixels within at least a portion of the area of the second image;
step 1072: calculating the total pixel number by which the frequencies exceed a predetermined threshold, where the predetermined threshold may be set to the median of the frequencies or set in another manner;
step 1073: calculating the average pixel number obtained by distributing the total pixel number equally over all sections;
step 1074: correcting the frequency of each section according to the average pixel number to obtain a corrected histogram;
step 1075: and adjusting the distribution range of the brightness values of the pixels in the at least one part of region of the second image according to the cumulative histogram of the correction histogram.
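Steps 1071 to 1075 can be sketched as a clipped-histogram equalization in Python. The function name, the 8-bit/256-bin layout, and the median as the default threshold (one of the options the text mentions) are illustrative assumptions, not fixed by the embodiment:

```python
import numpy as np

def equalize_with_clipped_histogram(luma, num_bins=256, thr=None):
    """Sketch of steps 1071-1075: count a luminance histogram, clip the
    excess above a threshold, spread that excess evenly over all bins,
    then remap luminance through the cumulative histogram."""
    hist, _ = np.histogram(luma, bins=num_bins, range=(0, num_bins))
    if thr is None:
        thr = np.median(hist)                   # step 1072: threshold = median frequency
    excess = np.sum(np.maximum(hist - thr, 0))  # step 1072: total clipped pixels
    avg = excess / num_bins                     # step 1073: spread evenly over all bins
    corrected = np.minimum(hist, thr) + avg     # step 1074: corrected histogram
    cdf = np.cumsum(corrected)
    cdf = cdf / cdf[-1]                         # normalized cumulative histogram
    # step 1075: remap each pixel through the CDF to stretch the range
    bins = np.clip(luma.astype(int), 0, num_bins - 1)
    return (cdf[bins] * (num_bins - 1)).astype(np.uint8)
```

A flat (uniform) corrected histogram produces an identity-like ramp, while a peaked one stretches the crowded luminance range, which is the equalization effect the steps aim at.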
Fig. 5 is a schematic diagram of a second image of an embodiment of the first aspect of the invention.
For example, as shown in fig. 5, the second image may be divided into 4 × 4 regions, and step 107 may be performed for each region. However, the embodiment of the present invention is not limited to this: the second image may also be divided into another number of regions according to actual needs, histogram statistics may be performed only on a certain part of the second image, or histogram statistics may be performed on the entire second image without dividing it into regions.
Further, when the second image is divided into a plurality of regions for processing, as shown in fig. 4, step 107 may further include:
step 1076: and for the spatial domain pixels in the preset range of the boundary position of the adjacent regions, calculating corresponding correction values according to the cumulative histograms of the regions where the spatial domain pixels are located and the cumulative histograms of the adjacent regions in a preset number, and correcting the pixel values of the spatial domain pixels according to the correction values.
The "spatial domain pixels" may be, for example, pixels within a predetermined number of pixels from the boundary position of adjacent regions, or pixels within a predetermined distance from the boundary position of adjacent regions, where the "predetermined number" is, for example, 100 or 20, and the "predetermined distance" is, for example, 1mm or 4mm.
Thus, the spatial continuity of the image is ensured by interpolating the pixels at the divided boundaries in the image.
For example, as shown in fig. 5, for the spatial domain pixel a of the area A, the correction histograms corresponding to the area A and to the areas B, C, and D adjacent to the area A are calculated, and the cumulative histograms HIST_BLOCK_00, HIST_BLOCK_01, HIST_BLOCK_10, and HIST_BLOCK_11 of the areas A, B, C, and D, respectively, are calculated from the correction histograms of the respective areas.
From the cumulative histogram HIST_BLOCK_00, the ratio curr_lt of the number of pixels whose pixel value is equal to or less than the original pixel value of the pixel a to the total number of pixels in the area A is calculated; for example, the ratio of the histogram value of the interval corresponding to the original pixel value of the pixel a in HIST_BLOCK_00 to the value of the rightmost interval of the histogram is calculated.
From the cumulative histogram HIST_BLOCK_01, the ratio curr_rt of the number of pixels whose pixel value is equal to or less than the original pixel value of the pixel a to the total number of pixels in the area B is calculated; for example, the ratio of the histogram value of the interval corresponding to the original pixel value of the pixel a in HIST_BLOCK_01 to the value of the rightmost interval of the histogram is calculated.
From the cumulative histogram HIST_BLOCK_10, the ratio curr_lb of the number of pixels whose pixel value is equal to or less than the original pixel value of the pixel a to the total number of pixels in the area C is calculated; for example, the ratio of the histogram value of the interval corresponding to the original pixel value of the pixel a in HIST_BLOCK_10 to the value of the rightmost interval of the histogram is calculated.
From the cumulative histogram HIST_BLOCK_11, the ratio curr_rb of the number of pixels whose pixel value is equal to or less than the original pixel value of the pixel a to the total number of pixels in the area D is calculated; for example, the ratio of the histogram value of the interval corresponding to the original pixel value of the pixel a in HIST_BLOCK_11 to the value of the rightmost interval of the histogram is calculated.
Interpolation is then performed according to weight_w and weight_h to obtain the final correction value of the pixel a.
For example, as shown in fig. 5, weight _ w is a distance of pixel a from a left boundary of the area a, weight _ h is a distance of pixel a from an upper boundary of the area a, block _ width is a width of the area a, i.e., a distance between the left boundary and the right boundary of the area a, and block _ height is a height of the area a, i.e., a distance between the upper boundary and the lower boundary of the area a.
Horizontal interpolation is performed for the pixel a: a first horizontal correction value temp1 = (curr_lt × weight_w + curr_rt × (block_width − weight_w)) / block_width and a second horizontal correction value temp2 = (curr_lb × weight_w + curr_rb × (block_width − weight_w)) / block_width are calculated.
Vertical interpolation is then performed for the pixel a according to the first horizontal correction value and the second horizontal correction value to obtain the final correction value (temp1 × weight_h + temp2 × (block_height − weight_h)) / block_height.
The original pixel value of the pixel a is corrected by the correction value. For example, the product of the original pixel value and the correction value may be used as the final pixel value, but the embodiment of the present invention is not limited thereto, and the pixel value of the pixel a may also be corrected in other manners, and specifically, reference may be made to the related art.
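The boundary correction described above can be condensed into the following Python sketch. The dictionary keys stand in for HIST_BLOCK_00/01/10/11, each entry being the normalized cumulative histogram of one region; the final multiplication by the correction value is one of the options the text mentions:

```python
import numpy as np

def blend_boundary_pixel(v, cdfs, weight_w, weight_h, block_width, block_height):
    """Bilinearly blend the four regional CDF lookups for a boundary
    pixel with original value v, following the temp1/temp2 formulas in
    the text, then scale the original value by the blended correction."""
    curr_lt = cdfs["00"][v]   # fraction of pixels <= v in region A
    curr_rt = cdfs["01"][v]   # ... in region B
    curr_lb = cdfs["10"][v]   # ... in region C
    curr_rb = cdfs["11"][v]   # ... in region D
    temp1 = (curr_lt * weight_w + curr_rt * (block_width - weight_w)) / block_width
    temp2 = (curr_lb * weight_w + curr_rb * (block_width - weight_w)) / block_width
    corr = (temp1 * weight_h + temp2 * (block_height - weight_h)) / block_height
    return v * corr           # product of original value and correction value
```

Because the weights vary smoothly with the pixel's distance from the region boundaries, adjacent pixels on either side of a boundary receive nearly identical corrections, which is what preserves spatial continuity.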
In addition, in step 1071, if the second image is an RGB image, the RGB image may be converted into a YUV image, and the Y-channel data may be accumulated and counted to obtain histogram data matching the bit depth.
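A minimal sketch of this RGB path follows. The BT.601 luma weights are an assumption for illustration; the text only says the RGB image is converted to YUV before the Y channel is counted:

```python
import numpy as np

def luma_histogram(rgb, bit_depth=8):
    """Derive the Y channel from an RGB image and count it into a
    histogram whose bin count matches the bit depth (step 1071)."""
    r = rgb[..., 0].astype(float)
    g = rgb[..., 1].astype(float)
    b = rgb[..., 2].astype(float)
    y = 0.299 * r + 0.587 * g + 0.114 * b   # assumed BT.601 luma weights
    levels = 2 ** bit_depth                  # e.g. 256 bins for 8-bit data
    hist, _ = np.histogram(y, bins=levels, range=(0, levels))
    return hist
```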
Fig. 6 is a flow chart of obtaining a corrected histogram according to an embodiment of the first aspect of the present invention.
For example, in step 1072, the total number excess_bin of pixels exceeding the threshold thr is calculated, and in step 1073, the average value excess_ave obtained by distributing excess_bin evenly over all the sections (bins) of the histogram is calculated.
In step 1074, a corrected histogram is obtained through the flow 600 shown in fig. 6, for example. The flow 600 may include:
operation 601: determining whether any excess_bin remains; in the case of "no", ending the flow; in the case of "yes", proceeding to operation 602;
operation 602: determining whether all sections (bins) in the histogram have been traversed; in the case of "yes", proceeding to operation 603; in the case of "no", proceeding to operation 604;
operation 603: returning to the head of the histogram, i.e., setting bin to 0, and then returning to operation 601;
operation 604: determining whether the value f of the current position (bin) is greater than the threshold thr; in the case of "yes", proceeding to operation 605; in the case of "no", proceeding to operation 607;
operation 605: replacing the value f of the current position (bin) with the threshold thr, and proceeding to operation 606;
operation 606: moving to the next position, and returning to operation 601;
operation 607: determining whether the difference between the threshold thr and the value f of the current position (bin) is smaller than excess_ave; in the case of "yes", proceeding to operation 610; in the case of "no", proceeding to operation 608;
operation 608: increasing the value f of the current position (bin) by an accumulation parameter a, where a may be an integer greater than or equal to 1, and proceeding to operation 609;
operation 609: moving to the next position, and returning to operation 601;
operation 610: determining whether excess_bin is smaller than the difference between the threshold thr and the value f of the current position; in the case of "yes", proceeding to operation 611; in the case of "no", proceeding to operation 614;
operation 611: adding excess_bin to the value f of the current position, and proceeding to operation 612;
operation 612: setting excess_bin to 0, and proceeding to operation 613;
operation 613: moving to the next position, and returning to operation 601;
operation 614: subtracting the difference between the threshold thr and the value f of the current position from excess_bin, and proceeding to operation 615;
operation 615: replacing the value f of the current position with the threshold thr, and proceeding to operation 616;
operation 616: moving to the next position, and returning to operation 601.
Thus, after traversing the entire histogram, a corrected histogram is obtained.
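Taken together, the operations of flow 600 amount to the redistribution loop sketched below. Two details are assumptions the text leaves implicit: the accumulation amount in operation 608 is taken to be excess_ave, and excess_bin is decremented whenever pixels are handed out; a safety bound is also added that the flow chart does not have:

```python
def redistribute_excess(hist, thr, excess_ave):
    """Simplified sketch of flow 600 (operations 601-616): clip bins
    above thr and hand the excess back out across the histogram."""
    hist = list(hist)
    excess_bin = sum(max(f - thr, 0) for f in hist)   # total from step 1072
    i = 0
    guard = 10 * len(hist)        # safety bound, not part of the flow chart
    while excess_bin > 0 and guard > 0:               # operation 601
        guard -= 1
        if i >= len(hist):                            # operation 602
            i = 0                                     # operation 603: back to the head
        f = hist[i]
        if f > thr:                                   # operation 604
            hist[i] = thr                             # operation 605: clip to thr
        elif thr - f >= excess_ave:                   # operation 607, "no" branch
            hist[i] = f + excess_ave                  # operation 608 (amount assumed = excess_ave)
            excess_bin -= excess_ave
        elif excess_bin < thr - f:                    # operation 610
            hist[i] = f + excess_bin                  # operation 611: dump the remainder
            excess_bin = 0                            # operation 612
        else:
            excess_bin -= thr - f                     # operation 614
            hist[i] = thr                             # operation 615: fill up to thr
        i += 1                                        # operations 606/609/613/616
    return hist
```

For instance, redistributing [10, 0, 0, 0] with thr = 5 and excess_ave = 1 clips the first bin to 5 and spreads the excess over the remaining bins while preserving the total pixel count.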
Additionally, the process 600 may further include obtaining a cumulative histogram from the corrected histogram. Thereby, the dynamic range of the second image is expanded by the cumulative histogram.
In at least one embodiment, in step 107, the expansion may be further performed based on a luminance mapping curve constructed according to the luminance value of the second image and the target peak luminance. For example, in the case where the second image is the HDR standard, the dynamic range of the second image may be extended based on the luminance mapping curve. However, the embodiment of the present invention is not limited to this, and the dynamic range of the second image may also be extended based on the luminance mapping curve when the second image is the SDR standard.
For example, the luminance mapping curve causes a rate of change in luminance values of pixels in the second image to decrease as the luminance values increase.
For example, the maximum luminance value of the second image (which may also be referred to as the "peak luminance of the second image") may be counted and mapped to the peak luminance that the display device can display (which may also be referred to as the "target peak luminance"); the mapping ratio may then be placed on a logarithmic scale, and the luminance values of all pixels of the second image may be expanded according to the logarithmic ratio.
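One possible reading of this log-ratio expansion is sketched below; the exact formula is an assumption, since the text only says the peak-to-peak mapping ratio is put on a logarithmic scale before all pixels are scaled:

```python
import math

def expand_by_log_ratio(luma_values, target_peak):
    """Scale every pixel by a logarithmically compressed version of the
    (target peak / source peak) mapping ratio (assumed formula)."""
    src_peak = max(luma_values)
    ratio = target_peak / src_peak               # peak-to-peak mapping ratio
    scale = 1.0 + math.log(ratio) if ratio > 1.0 else ratio
    return [v * scale for v in luma_values]
```

Because 1 + ln(r) ≤ r for r > 1, the expanded peak never overshoots the target peak, while dimmer pixels are lifted proportionally.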
For another example, the luminance mapping curve may conform to equation (4):
Y=lw_lut_f/dst_luma (4);
wherein:
lw_lut_f = (pow(E',a)/((pow(E',a*d))*b+c))*ori_luma;
dst _ luma is the target peak brightness;
ori _ luma is the peak luminance of the second image;
E' is the electrical signal of the current pixel;
a and d are predefined values and a is less than d;
b = ((pow(0.214,a))*Lwmax-mo)/(mo*(pow(0.214,a*d)-1)*Lwmax);
c = ((pow(0.214,a*d))*mo-((pow(0.214,a))*Lwmax))/(mo*(pow(0.214,a*d)-1)*Lwmax);
mo = 0.017+0.097*Ln-0.028*po;
Lwmax=dst_luma/ori_luma;
po=over_value/under_value;
over _ value is the number of pixels in the second image whose luminance value reaches the peak luminance;
under _ value is the number of pixels in the second image, the brightness value of which is lower than the peak brightness and the pixel value of which is not 0;
Ln=ln(ave);
ave is the mean value of the brightness of the second image.
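Equation (4) and its auxiliary terms can be transcribed directly into Python. The variable names follow the text (dst_luma, ori_luma, mo, po, Ln, Lwmax); the sample values in the usage below are arbitrary and only meant to exercise the formula:

```python
import math

def luminance_map(e_prime, a, d, dst_luma, ori_luma, over_value, under_value, ave):
    """Evaluate Y = lw_lut_f / dst_luma per equation (4), with a < d the
    predefined low/high-grey controls."""
    po = over_value / under_value                 # pixels at peak vs. below-peak, non-zero pixels
    Ln = math.log(ave)                            # natural log of the mean luminance
    mo = 0.017 + 0.097 * Ln - 0.028 * po
    Lwmax = dst_luma / ori_luma
    denom = mo * ((0.214 ** (a * d)) - 1) * Lwmax
    b = ((0.214 ** a) * Lwmax - mo) / denom
    c = ((0.214 ** (a * d)) * mo - (0.214 ** a) * Lwmax) / denom
    lw_lut_f = ((e_prime ** a) / ((e_prime ** (a * d)) * b + c)) * ori_luma
    return lw_lut_f / dst_luma
```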
Further, a and d correspond, for example, to two luminance thresholds: a pixel whose luminance is lower than the threshold a may be referred to as a pixel of a "low gray region", and a pixel whose luminance is higher than the threshold d may be referred to as a pixel of a "high gray region". a may be used to control the luminance adjustment of pixels in the "low gray region", for example, the value of a affects the portion of the luminance mapping curve corresponding to the low gray region; d may be used to control the luminance adjustment of pixels in the "high gray region", for example, the value of d affects the portion of the luminance mapping curve corresponding to the high gray region.
In at least one embodiment, the luminance mapping curve may be, for example, a curve that decreases the luminance of the pixel in the second image with the luminance value lower than a predetermined luminance value and increases the luminance of the pixel in the second image with the luminance value higher than the predetermined luminance value, such as an S-shaped curve, a second-order bezier curve, or another curve that can achieve the above functions, which is not limited by the embodiment of the present invention.
In at least one embodiment, the image synthesis method may further include synthesizing the second image and a third image having a dynamic range standard identical to that of the second image into a fourth image in a spatial order.
For example, after a plurality of first images are stitched in a linear luminance space and converted into a second image, a third image having the same dynamic range standard as the second image may need to be displayed on the same display at the same time as the second image; in this case, the second image and the third image may be stitched at the pixel level according to a spatial order to obtain a fourth image.
The region of the pixel-level stitching may be defined by an upper layer or a user, and may be all pixel regions or a part of pixel regions of the fourth image to be generated.
Wherein, the dynamic range standard of a part of the first images in the plurality of first images can be the same as that of the second image, and the dynamic range standard of another part of the first images is different from that of the second image.
For example, first images of the HDR and SDR standards may each be converted into a linear luminance space and gamut-converted, e.g., to the color gamut of the SDR standard, and then stitched; the stitched image is converted into a second image of the SDR standard, which is then stitched with a third image of the SDR standard to become a fourth image of the SDR standard.
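The pipeline in the example above can be sketched as follows. The eotf, oetf, convert_gamut, and stitch helpers are simplified stand-ins (a fixed gamma, identity gamut mapping, and 1-D concatenation), not transforms defined by this embodiment; a real implementation would use the EOTF/OETF of each image's standard:

```python
# Stand-in transforms marking where each stage of the pipeline sits.
def eotf(pixels, gamma=2.2):
    return [p ** gamma for p in pixels]            # non-linear signal -> linear light

def oetf(pixels, gamma=2.2):
    return [p ** (1.0 / gamma) for p in pixels]    # linear light -> non-linear signal

def convert_gamut(pixels):
    return pixels                                   # identity stand-in for gamut mapping

def stitch(images):
    # spatial-order stitching reduced to 1-D concatenation for the sketch
    return [p for img in images for p in img]

def stitch_to_sdr(first_images):
    """Linearize each first image, map it to the target gamut, stitch in
    linear light, then re-encode the result with the target OETF."""
    linear = [convert_gamut(eotf(img)) for img in first_images]
    return oetf(stitch(linear))
```

Stitching in linear light is the point of the design: luminance from images of different standards is combined on a common physical scale before any non-linear re-encoding.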
Therefore, the second image can be obtained through the first splicing, the fourth image can also be obtained through the second splicing, and whether the first splicing or the second splicing is performed can be selected according to the definition of an upper layer or a user or the actual requirement.
In some embodiments, in the step of expanding the dynamic range of the second image according to at least one of the luminance value of the second image and the target peak luminance, may include:
expanding the dynamic range of the second image according to the histogram of the brightness values of the second image if the dynamic range standard of the second image is a standard dynamic range standard;
and in the case where the dynamic range standard of the second image is a high dynamic range standard, expanding the dynamic range of the second image based on a luminance mapping curve constructed from the luminance value of the second image and the target peak luminance.
The dynamic range standard of the second image is the same as the dynamic range standard used by the display on which the second image is displayed; by determining the manner of expanding the dynamic range of the second image according to its dynamic range standard, the expanded second image is better suited to display on the corresponding display, further improving the display effect.
According to the embodiment of the first aspect, the images to be stitched are converted into the linear luminance space, and the stitching is performed in the linear luminance space, so that the original luminance information of the images can be retained during stitching, and the stitched images are converted into the images with the uniform dynamic range standard, so that the stitched whole images can be correctly displayed.
The embodiment of the invention also provides an image synthesis method, which comprises the following steps:
converting a nonlinear brightness signal of a target image into a linear brightness signal to obtain a corresponding target linear brightness image, wherein the target image is a first image different from a second image in dynamic range standard in a plurality of first images;
converting the target linear luminance image into a fifth image having the same dynamic range standard as the second image;
and synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to a spatial sequence.
For example, in a case where a dynamic range standard of one first image (e.g., a target image) among a plurality of first images is different from a dynamic range standard of a second image, and dynamic range standards of the remaining first images are the same as the dynamic range standard of the second image, a non-linear luminance signal of the target image may be converted into a linear luminance signal to obtain a corresponding target linear luminance image, and the target linear luminance image may be converted into a fifth image. And the dynamic range standard of the fifth image is the same as that of the second image. And synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to a spatial sequence.
The number of the first images having the same dynamic range standard as that of the second image may be one or more, and the composition manner may be defined by an upper layer or a user, as described above, which is not limited in the present disclosure. It should be noted that one or more target images may be provided, and when there are multiple target images, the multiple target linear luminance images may be respectively converted into multiple fifth images having the same dynamic range standard as the second image, where the target linear luminance images correspond to the fifth images one by one, and the multiple fifth images and the first image having the same dynamic range standard as the second image are synthesized into a sixth image in a spatial order.
For example, when some of the plurality of first dynamic range standards corresponding to the plurality of first images are the same as the target dynamic range standard (the dynamic range standard of the second image) and some are different from it, the EOTF, the color gamut conversion, and the OETF may be performed only on the first images whose first dynamic range standard differs from the target dynamic range standard, converting them into images of the target dynamic range standard; these may then be merged with the first images whose first dynamic range standard is the same as the target dynamic range standard to obtain the sixth image.
For example, in the case where the plurality of first images include an HDR image and an SDR image, and the target dynamic range standard is an SDR standard, the conversion of the dynamic range standard may be performed only for the HDR image, the HDR image may be converted into a fifth image of the SDR standard, and the fifth image converted into the SDR standard may be spatially sequentially stitched with the first image of the SDR standard.
In addition, in the case where the plurality of first images include an HDR image and an SDR image, and the target dynamic range standard is the HDR standard, the conversion of the dynamic range standard may be performed only for the SDR image, the SDR image may be converted into a fifth image of the HDR standard, and the fifth image converted into the HDR standard may be spatially sequentially stitched with the first image of the HDR standard.
Further, the dynamic range expansion may be performed on the fifth image or the sixth image, and for example, the dynamic range of the fifth image or the sixth image may be expanded in accordance with at least one of the luminance value of at least a partial region in the fifth image or the sixth image and the target peak luminance. That is to say, the dynamic range of the fifth image may be expanded before the stitching, or the dynamic range of the sixth image may be expanded after the stitching.
Therefore, conversion processing is not needed for the images with the dynamic range standard being the same as the target dynamic range standard, so that the time for image processing is saved, and the speed for image processing is increased.
The implementation manner of the steps of converting the nonlinear luminance signal of the target image into the linear luminance signal to obtain the corresponding target linear luminance image, converting the target linear luminance image into the fifth image, and synthesizing the fifth image and the first image having the same dynamic range standard as the second image into the sixth image according to the spatial sequence may be referred to above, and is not described herein again.
Embodiments of the second aspect
An embodiment of a second aspect of the present invention provides an image synthesizing apparatus having the same principle as the image synthesizing method described in the embodiment of the first aspect, and the same contents are incorporated herein.
Fig. 7 is a schematic diagram of an image synthesizing apparatus according to an embodiment of the second aspect of the present invention.
As shown in fig. 7, the image synthesizing apparatus 900 may include a first conversion unit 901, a synthesizing unit 902, and a second conversion unit 903. The first conversion unit 901 converts the non-linear luminance signals of the plurality of first images into linear luminance signals, respectively, to obtain a plurality of first linear luminance images; the synthesizing unit 902 synthesizes the plurality of first linear luminance images into a second linear luminance image in a spatial order; the second conversion unit 903 converts the second linear luminance image into a second image having a dynamic range standard different from at least one of the dynamic range standards of the plurality of first images.
In at least one embodiment, the synthesizing unit 902 may further synthesize the second image and a third image having a dynamic range standard identical to that of the second image into a fourth image in a spatial order.
In at least one embodiment, the first conversion unit 901 converts a non-linear luminance signal of a corresponding first image into a corresponding linear luminance signal according to an electro-optical conversion function corresponding to a dynamic range standard of the first image to obtain a corresponding third linear luminance image, and converts a plurality of the third linear luminance images into a corresponding plurality of the first linear luminance images according to a target color gamut; the second conversion unit 903 converts the second linear luminance image into the second image according to a photoelectric conversion function. Wherein the electro-optic transfer function is constructed from at least one of the first dynamic range criterion, a peak brightness of the first image, and metadata information of the first image; the photoelectric conversion function is constructed according to at least one of a target dynamic range standard, a target peak luminance, a target minimum luminance, and an ambient illuminance parameter, and the target dynamic range standard is the same as the second dynamic range standard.
In at least one embodiment, as shown in fig. 7, the image synthesizing apparatus 900 may further include an expanding unit 904, and the expanding unit 904 expands the dynamic range of the second image according to at least one of the luminance value of the second image and the target peak luminance.
In at least one embodiment, in a case where the dynamic range standard of the second image is a standard dynamic range standard, the expansion unit 904 may further expand the dynamic range of the second image according to a histogram of luminance values of the second image; in a case where the dynamic range standard of the second image is a high dynamic range standard, the expansion unit 904 may also expand the dynamic range of the second image based on a luminance mapping curve constructed from the luminance value of the second image and the target peak luminance.
In at least one embodiment, in a case where at least one of the dynamic range standards of the plurality of first images is the same as the target dynamic range standard and at least one of the dynamic range standards of the plurality of first images is different from the target dynamic range standard, the second conversion unit 903 may further convert the gamut-converted first linear luminance image corresponding to the first image having a dynamic range standard different from the target dynamic range standard into a fifth image according to the photoelectric conversion function; the synthesizing unit 902 synthesizes the fifth image and the first image having the same dynamic range standard as the target dynamic range standard into a sixth image in spatial order. The expansion unit 904 may also expand the dynamic range of the fifth image or the sixth image according to the luminance value of at least a partial region in the fifth image or the sixth image and the target peak luminance.
According to the embodiment of the second aspect, the images to be stitched are converted into the linear brightness space, and the stitching is performed in the linear brightness space, so that the original brightness information of the images can be kept during stitching, and the stitched images are converted into the images with the uniform dynamic range standard, so that the stitched whole images can be correctly displayed.
The present disclosure provides an image synthesizing apparatus including: the system comprises a target linear brightness image unit, a target linear brightness image unit and a control unit, wherein the target linear brightness image unit is used for converting a nonlinear brightness signal of a target image into a linear brightness signal to obtain a corresponding target linear brightness image, and the target image is a first image which is different from a second image in dynamic range standard in a plurality of first images; an image conversion unit, configured to convert the target linear luminance image into a fifth image having the same dynamic range standard as the second image; and the image synthesis unit is used for synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to the spatial sequence.
Embodiments of the third aspect
Embodiments of the third aspect of the present invention provide a graphics processing apparatus having the same principles as the image synthesis method described in embodiments of the first aspect, and the same is incorporated herein.
Furthermore, the graphics processing apparatus of embodiments of the present invention may be at least part of a display interface card or display adapter. However, the embodiment of the present invention is not limited thereto, and the graphics processing apparatus may also be other apparatuses capable of performing graphics processing.
FIG. 8 is a schematic diagram of a graphics processing apparatus according to an embodiment of the third aspect of the present invention.
As shown in fig. 8, the graphic processing apparatus 1 includes a communication interface 11 and a control device 12.
The communication interface 11 is communicatively connected to the external display device 2, and acquires a target dynamic range standard of the display device 2, which is the highest dynamic range standard that the display device 2 can support, and a target peak luminance of the display device 2, which is the maximum peak luminance that the display device 2 can display.
For example, the communication interface 11 may be a Video Graphics Array (VGA) interface, a Digital Video Interface (DVI), a Separate Video (S-Video) interface, a High Definition Multimedia Interface (HDMI), and so on. The embodiment of the present invention does not limit the type of the communication interface 11, which may be selected according to actual needs.
In addition, in the case that the above parameters of the display device 2 cannot be directly acquired, the above parameters of the display device 2 may also be input through the external input device 3, and the method for acquiring the above parameters of the display device 2 is not limited in the embodiment of the present invention, and the related art may be referred to.
In at least one embodiment, the control device 12 converts the non-linear luminance signals of the plurality of first images into linear luminance signals to obtain a plurality of first linear luminance images, synthesizes the plurality of first linear luminance images into a second linear luminance image according to a spatial order, and converts the second linear luminance image into a second image according to the target dynamic range standard, wherein the second image has a dynamic range standard different from at least one of the dynamic range standards of the plurality of first images and the same as the target dynamic range standard.
In at least one embodiment, the control device 12 may further combine the second image and a third image having a dynamic range standard that is the same as the dynamic range standard of the second image into a fourth image in a spatial order.
In at least one embodiment, the communication interface 11 also obtains a target color gamut of the display device 2, which is the maximum range of color gamuts that the display device 2 can support, and a target minimum luminance, which is the minimum luminance that the display device 2 can display.
As shown in fig. 8, the graphics processing apparatus 1 may further include a storage device 13, and the storage device 13 stores an ambient illuminance parameter in an environment in which the display device 2 is located. For example, an ambient illuminance parameter in the environment where the display device 2 is located may be input through the external input device 3 and stored by the storage device 13. The input device 3 is, for example, a keyboard, a mouse, or the like.
The display device 2 according to the embodiment of the present invention is not limited, and may be a device or an apparatus having a function of displaying an image, such as a display, a television, or a projector.
In at least one embodiment, the control device 12 converts the corresponding non-linear luminance signal of the first image into a corresponding linear luminance signal according to an electro-optical conversion function corresponding to the dynamic range standard of the first image to obtain a corresponding third linear luminance image; converting the plurality of third linear luminance images into a corresponding plurality of first linear luminance images according to the target color gamut; synthesizing the plurality of first linear brightness images into a second linear brightness image according to a spatial sequence; converting the second linear luminance image into the second image according to a photoelectric conversion function constructed according to at least one of a target dynamic range standard, a target peak luminance, a target minimum luminance, and an ambient illuminance parameter.
In at least one embodiment, control device 12 may also expand the dynamic range of the second image based on at least one of the luminance value of the second image and the target peak luminance.
In at least one embodiment, in a case where the dynamic range criterion of the second image is a standard dynamic range criterion, the control device 12 may further expand the dynamic range of the second image according to a histogram of luminance values of the second image; in the case where the dynamic range standard of the second image is a high dynamic range standard, the control device 12 may further expand the dynamic range of the second image based on a luminance mapping curve constructed from the luminance value of the second image and the target peak luminance.
In at least one embodiment, the control device 12 may expand the dynamic range of the second image according to a histogram of luminance values of the second image.
For example, the control device 12 counts, in the form of a histogram, the frequencies of the luminance values of the pixels in at least a partial area of the second image; calculates the total number of pixels by which the frequencies exceed a preset threshold; calculates the average pixel number obtained by distributing that total equally to all sections; corrects the frequency of each section according to the average pixel number to obtain a corrected histogram; and adjusts the luminance range of the pixels within the at least partial area of the second image according to the corrected histogram.
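The histogram-correction steps above can be sketched in the spirit of clip-limited histogram equalization. The bin count, the clipping of over-threshold sections, and the use of the cumulative corrected histogram as the final luminance mapping are assumptions where the text leaves details open.

```python
import numpy as np

def expand_by_histogram(luma, threshold, bins=256):
    """Expand luma (array of floats in [0, 1]) via a clipped, redistributed histogram."""
    hist, _ = np.histogram(luma, bins=bins, range=(0.0, 1.0))
    # Total number of pixels by which section frequencies exceed the threshold.
    excess = np.maximum(hist - threshold, 0).sum()
    # Clip the peaks and distribute the excess equally over all sections.
    corrected = np.minimum(hist, threshold) + excess // bins
    # Adjust the luminance range via the cumulative corrected histogram.
    cdf = np.cumsum(corrected).astype(np.float64)
    cdf /= cdf[-1]
    idx = np.clip((luma * bins).astype(int), 0, bins - 1)
    return cdf[idx]
```

The mapping is monotonic in the input luminance, so the ordering of pixel brightness is preserved while the dynamic range is stretched.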
In at least one embodiment, the control device 12 may also segment the second image into a plurality of regions; adjust the distribution of the luminance values of the pixels in each region according to the cumulative histogram corresponding to the corrected histogram; and, for spatial-domain pixels located within a predetermined range of the boundary between adjacent regions, calculate a correction value from the histogram of the region in which the pixel is located and the cumulative histograms of a predetermined number of adjacent regions, and correct the pixel values of those spatial-domain pixels according to the correction value.
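A simplified two-region version of this per-region scheme can be sketched as follows; the two-region layout, the blend width, and the linear blending of the adjacent regions' mappings near the boundary are illustrative assumptions, not details given by the text.

```python
import numpy as np

def region_cdf(luma, bins=64):
    """Cumulative histogram of one region, normalized to [0, 1]."""
    hist, _ = np.histogram(luma, bins=bins, range=(0.0, 1.0))
    cdf = np.cumsum(hist).astype(np.float64)
    return cdf / cdf[-1]

def apply_cdf(luma, cdf):
    """Map luminance values through a region's cumulative histogram."""
    bins = len(cdf)
    idx = np.clip((luma * bins).astype(int), 0, bins - 1)
    return cdf[idx]

def expand_two_regions(luma, blend_width=8):
    """Equalize left/right halves separately, blending near the shared boundary."""
    h, w = luma.shape
    mid = w // 2
    cdf_l, cdf_r = region_cdf(luma[:, :mid]), region_cdf(luma[:, mid:])
    out = np.empty_like(luma, dtype=np.float64)
    out[:, :mid] = apply_cdf(luma[:, :mid], cdf_l)
    out[:, mid:] = apply_cdf(luma[:, mid:], cdf_r)
    # Correct pixels within the predetermined range of the boundary by
    # blending the two adjacent regions' mappings to avoid a visible seam.
    for dx in range(-blend_width, blend_width):
        x = mid + dx
        t = (dx + blend_width) / (2.0 * blend_width)  # 0 at left edge, 1 at right
        col = luma[:, x]
        out[:, x] = (1.0 - t) * apply_cdf(col, cdf_l) + t * apply_cdf(col, cdf_r)
    return out
```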
In at least one embodiment, the control device 12 may also perform the expansion based on a luminance mapping curve constructed from the luminance values of the second image and the target peak luminance.
For example, the luminance mapping curve causes a rate of change in luminance values of pixels in the second image to decrease as the luminance values increase.
For example, the maximum luminance value of the second image (which may also be referred to as the "peak luminance of the second image") may be counted and mapped to the peak luminance that the display device can display (which may also be referred to as the "target peak luminance"); the mapping ratio is recorded, its logarithm is taken, and the luminance values of all pixels of the second image are expanded by the post-logarithm ratio.
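One reading of this log-ratio expansion is sketched below. The offset of 1 added to the logarithm (so that a mapping ratio of 1 leaves luminance unchanged) and the clipping at the target peak are assumptions, since the text does not fix the base of the logarithm or the exact scaling rule.

```python
import math

def expand_by_log_ratio(luma, target_peak):
    """luma: sequence of linear luminance values (e.g. in nits)."""
    src_peak = max(luma)                 # peak luminance of the second image
    ratio = target_peak / src_peak       # mapping ratio to the target peak
    scale = 1.0 + math.log(ratio)        # post-logarithm expansion factor (assumed form)
    # Expand every pixel, never exceeding what the display can show.
    return [min(v * scale, target_peak) for v in luma]
```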
For another example, the luminance mapping curve conforms to equation (1):

Y = lw_lut_f / dst_luma (1);

where:

lw_lut_f = (pow(E', a) / (pow(E', a*d) * b + c)) * ori_luma;

dst_luma is the target peak luminance;

ori_luma is the peak luminance of the second image;

E' is the electrical signal of the current pixel;

a and d are predefined values, and a is less than d;

b = (pow(0.214, a) * Lwmax - mo) / (mo * (pow(0.214, a*d) - 1) * Lwmax);

c = (pow(0.214, a*d) * mo - pow(0.214, a) * Lwmax) / (mo * (pow(0.214, a*d) - 1) * Lwmax);

mo = 0.017 + 0.097*Ln - 0.028*po;

Lwmax = dst_luma / ori_luma;

po = over_value / under_value;

over_value is the number of pixels in the second image whose luminance value reaches the peak luminance;

under_value is the number of pixels in the second image whose luminance value is below the peak luminance and whose pixel value is not 0;

Ln = ln(ave);

ave is the mean luminance of the second image.
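Formula (1) and its parameter definitions can be transcribed directly as follows. The parenthesization of lw_lut_f is balanced as written above, and the sample values a=1.2, d=1.8 in the usage are illustrative — the text only requires a < d.

```python
import math

def luminance_mapping(e_prime, a, d, dst_luma, ori_luma,
                      over_value, under_value, ave):
    """Evaluate the luminance mapping curve of equation (1) for one pixel.

    e_prime: electrical signal of the current pixel (E').
    dst_luma / ori_luma: target peak luminance / peak luminance of the image.
    over_value / under_value / ave: pixel statistics of the second image.
    """
    lwmax = dst_luma / ori_luma
    po = over_value / under_value
    ln_term = math.log(ave)                       # Ln = ln(ave)
    mo = 0.017 + 0.097 * ln_term - 0.028 * po
    k = math.pow(0.214, a)
    kd = math.pow(0.214, a * d)
    b = (k * lwmax - mo) / (mo * (kd - 1.0) * lwmax)
    c = (kd * mo - k * lwmax) / (mo * (kd - 1.0) * lwmax)
    lw_lut_f = (math.pow(e_prime, a) / (math.pow(e_prime, a * d) * b + c)) * ori_luma
    return lw_lut_f / dst_luma                    # Y
```

With these constants the curve's rate of change falls off as luminance rises, matching the behavior stated above.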
For another example, the luminance mapping curve may be a curve that decreases the luminance of pixels whose luminance values are lower than a predetermined luminance value in the second image and increases the luminance of pixels whose luminance values are higher than that predetermined value; it may be, for example, an S-shaped curve, a second-order Bézier curve, or the like.
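As one concrete instance of such a curve, a smoothstep polynomial decreases luminance below 0.5 and increases it above 0.5; a sigmoid or second-order Bézier curve would serve the same purpose. The pivot value of 0.5 is an illustrative assumption, not a value given by the text.

```python
def s_curve(t):
    """S-shaped mapping on [0, 1]: darkens values below 0.5, brightens values above."""
    return t * t * (3.0 - 2.0 * t)  # smoothstep polynomial
```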
In at least one embodiment, in a case where at least one of the first dynamic range standards of the plurality of first images is the same as the target dynamic range standard and at least one of them is different from the target dynamic range standard, the control device 12 may further convert the gamut-converted first linear luminance image corresponding to each first image whose first dynamic range standard differs from the target dynamic range standard into a fifth image according to the photoelectric conversion function, and synthesize the fifth image and the first image whose first dynamic range standard is the same as the target dynamic range standard into a sixth image in spatial order.
In at least one embodiment, the control device 12 synthesizes the first linear luminance images after the color gamut conversion corresponding to the at least two first images into a target linear luminance image according to a spatial order; and converting the target linear brightness image into a second image according to the photoelectric conversion function.
According to the embodiments of the third aspect, the images to be stitched are converted into a linear luminance space and stitched in that space, so that the original luminance information of each image is preserved during stitching; the stitched result is then converted into an image with a uniform dynamic range standard, so that the stitched image as a whole can be displayed correctly.
The present disclosure also provides a graphics processing apparatus including a communication interface and a control device.
The communication interface is in communication connection with an external display device, and acquires a target dynamic range standard of the display device and a target peak brightness of the display device, wherein the target dynamic range standard is the highest dynamic range standard which can be supported by the display device, and the target peak brightness is the maximum peak brightness which can be displayed by the display device.
For example, the communication interface may be a Video Graphics Array (VGA) interface, a Digital Video Interface (DVI), a Separate Video (S-Video) interface, a High-Definition Multimedia Interface (HDMI), or the like. The embodiments of the present invention do not limit the type of the communication interface, which may be selected according to actual needs.
In addition, when a parameter of the display device cannot be acquired directly, it may be input through an external input device.
In at least one embodiment, the control device converts a nonlinear brightness signal of a target image into a linear brightness signal to obtain a corresponding target linear brightness image, wherein the target image is a first image different from a dynamic range standard of a second image in a plurality of first images; converting the target linear brightness image into a fifth image with the same dynamic range standard as the second image; and synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to the spatial sequence.
An embodiment of the present invention further provides a computer device, which includes a memory, a processor, and a computer program stored in the memory and executable on the processor, and when the processor executes the computer program, the image synthesis method of the embodiment of the first aspect is implemented.
An embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored, and when being executed by a processor, the computer program implements the image synthesis method according to the embodiment of the first aspect.
Embodiments of the present invention further provide a computer program product, where the computer program product includes a computer program, and when executed by a processor, the computer program implements the image synthesis method according to the embodiments of the first aspect.
In the embodiments of the present invention, the images to be stitched are converted into a linear luminance space and stitched in that space, so that the original luminance information of each image is preserved during stitching; the stitched result is then converted into an image with a uniform dynamic range standard, so that the stitched image as a whole is displayed correctly.
It should be understood by those skilled in the art that the values of the parameters in the formulas in the above embodiments are not limited, and may be adjusted appropriately according to actual situations, and reference may be made to related technologies.
In the embodiment of the present invention, the steps of the image synthesis method are labeled with numbers, but the order of the numbers does not represent the execution order of the steps, and the execution order of the steps may be arbitrarily combined according to the actual situation, which is not limited in the embodiment of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present invention has been described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the invention. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The above-mentioned embodiments are intended to illustrate the objects, technical solutions and advantages of the present invention in further detail, and it should be understood that the above-mentioned embodiments are only exemplary embodiments of the present invention, and are not intended to limit the scope of the present invention, and any modifications, equivalent substitutions, improvements and the like made within the spirit and principle of the present invention should be included in the scope of the present invention.

Claims (13)

1. An image synthesis method, characterized by comprising:
respectively converting the nonlinear brightness signals of the plurality of first images into linear brightness signals to obtain a plurality of first linear brightness images;
synthesizing the plurality of first linear brightness images into a second linear brightness image according to a spatial sequence;
converting the second linear luminance image into a second image having a dynamic range standard different from at least one of the dynamic range standards of the plurality of first images.
2. The image synthesis method according to claim 1, wherein the image synthesis method further comprises:
and synthesizing the second image and a third image with the same dynamic range standard as that of the second image into a fourth image according to a spatial sequence.
3. The image synthesizing method according to claim 1,
the step of converting the plurality of first images into a plurality of first linear luminance images respectively comprises:
converting the non-linear brightness signal of the corresponding first image into a corresponding linear brightness signal according to an electro-optical conversion function corresponding to the dynamic range standard of the first image to obtain a corresponding third linear brightness image; and
converting the plurality of third linear luminance images into a corresponding plurality of first linear luminance images according to the target color gamut,
in the step of converting the second linear luminance image into a second image, the second linear luminance image is converted into the second image according to a photoelectric conversion function.
4. The image synthesizing method according to claim 3,
the electro-optical conversion function is constructed according to at least one of the dynamic range standard of the first image, the peak luminance of the first image, and metadata information of the first image;
the photoelectric conversion function is constructed according to at least one of a target dynamic range standard, a target peak luminance, a target minimum luminance, and an ambient illuminance parameter, the target dynamic range standard being the same as the dynamic range standard of the second image.
5. The image synthesis method according to claim 1, wherein the image synthesis method further comprises:
expanding a dynamic range of the second image according to at least one of a luminance value of the second image and a target peak luminance.
6. The image synthesizing method according to claim 5, wherein,
in the step of expanding the dynamic range of the second image according to at least one of the luminance value of the second image and the target peak luminance,
expanding the dynamic range of the second image according to the histogram of the luminance values of the second image in the case where the dynamic range standard of the second image is a standard dynamic range standard;
and in the case where the dynamic range standard of the second image is a high dynamic range standard, expanding the dynamic range of the second image based on a luminance mapping curve constructed from the luminance value of the second image and the target peak luminance.
7. An image synthesis method, characterized by comprising:
converting a nonlinear brightness signal of a target image into a linear brightness signal to obtain a corresponding target linear brightness image, wherein the target image is a first image different from a second image in dynamic range standard in a plurality of first images;
converting the target linear brightness image into a fifth image with the same dynamic range standard as the second image;
and synthesizing the fifth image and the first image with the same dynamic range standard as the second image into a sixth image according to the spatial sequence.
8. An image synthesizing apparatus, characterized by comprising:
a first conversion unit which converts the nonlinear luminance signals of the plurality of first images into linear luminance signals, respectively, to obtain a plurality of first linear luminance images;
a synthesizing unit that synthesizes the plurality of first linear luminance images into one second linear luminance image in a spatial order;
a second conversion unit that converts the second linear luminance image into a second image having a dynamic range standard different from at least one of dynamic range standards of the plurality of first images.
9. A graphics processing apparatus, characterized in that the graphics processing apparatus comprises:
a communication interface, communicatively connected to a display device, for obtaining a target dynamic range standard of the display device and a target peak luminance of the display device, where the target dynamic range standard is a highest dynamic range standard that the display device can support, and the target peak luminance is a maximum peak luminance that the display device can display; and
and a control device which converts the nonlinear luminance signals of the plurality of first images into linear luminance signals to obtain a plurality of first linear luminance images, synthesizes the plurality of first linear luminance images into a second linear luminance image according to a spatial sequence, and converts the second linear luminance image into a second image according to the target dynamic range standard, wherein the second image has a dynamic range standard which is different from at least one of the dynamic range standards of the plurality of first images and is the same as the target dynamic range standard.
10. The graphics processing apparatus according to claim 9,
the communication interface further acquires a target color gamut of the display device, which is a color gamut that can be supported by the display device, and a target minimum luminance, which is a minimum luminance that can be displayed by the display device,
the control device converts the corresponding nonlinear brightness signal of the first image into a corresponding linear brightness signal according to an electro-optical conversion function corresponding to the dynamic range standard of the first image to obtain a corresponding third linear brightness image; converting the plurality of third linear luminance images into a corresponding plurality of first linear luminance images according to the target color gamut; synthesizing the plurality of first linear brightness images into a second linear brightness image according to a spatial sequence; converting the second linear luminance image into the second image according to a photoelectric conversion function, the photoelectric conversion function being constructed according to at least one of a target dynamic range standard, a target peak luminance, a target minimum luminance, and an ambient illuminance parameter.
11. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method of any of claims 1 to 7 when executing the computer program.
12. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program which, when executed by a processor, implements the method of any one of claims 1 to 7.
13. A computer program product, characterized in that the computer program product comprises a computer program which, when being executed by a processor, carries out the method of any one of claims 1 to 7.
CN202211323465.1A 2022-10-27 2022-10-27 Image synthesis method, device and graphic processing equipment Active CN115393197B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211323465.1A CN115393197B (en) 2022-10-27 2022-10-27 Image synthesis method, device and graphic processing equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211323465.1A CN115393197B (en) 2022-10-27 2022-10-27 Image synthesis method, device and graphic processing equipment

Publications (2)

Publication Number Publication Date
CN115393197A true CN115393197A (en) 2022-11-25
CN115393197B CN115393197B (en) 2023-05-05

Family

ID=84128517

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211323465.1A Active CN115393197B (en) 2022-10-27 2022-10-27 Image synthesis method, device and graphic processing equipment

Country Status (1)

Country Link
CN (1) CN115393197B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170330312A1 (en) * 2016-05-16 2017-11-16 Lg Electronics Inc. Image processing device and image processing method using the same
CN107590780A (en) * 2017-08-09 2018-01-16 深圳Tcl新技术有限公司 Method for displaying image, terminal and computer-readable recording medium
CN111885393A (en) * 2020-07-31 2020-11-03 广州华多网络科技有限公司 Live broadcast method, system, storage medium and equipment
CN113196378A (en) * 2018-12-13 2021-07-30 Ati科技无限责任公司 Method and system for improving visibility in a blend layer of a high dynamic range display
CN114511479A (en) * 2021-12-31 2022-05-17 阿里巴巴(中国)有限公司 Image enhancement method and device


Also Published As

Publication number Publication date
CN115393197B (en) 2023-05-05

Similar Documents

Publication Publication Date Title
US10255879B2 (en) Method and apparatus for image data transformation
EP2989793B1 (en) Workflow for content creation and guided display management of enhanced dynamic range video
US9685120B2 (en) Image formats and related methods and apparatuses
JP5196731B2 (en) Image processing apparatus and image processing method
JP6921886B2 (en) Information processing device and information processing method
US10332481B2 (en) Adaptive display management using 3D look-up table interpolation
CN114866809B (en) Video conversion method, apparatus, device, storage medium, and program product
US10645359B2 (en) Method for processing a digital image, device, terminal equipment and associated computer program
EP4135316A1 (en) Dynamic range mapping method and apparatus
CN115393228B (en) Image processing method and device and graphic processing equipment
CN111416966A (en) Information processing apparatus, information processing method, and computer readable medium
EP3002730B1 (en) Blending images using mismatched source and display electro-optical transfer functions
KR102631844B1 (en) Methods, devices, terminal equipment and associated computer programs for processing digital images
CN115393197B (en) Image synthesis method, device and graphic processing equipment
WO2019044171A1 (en) Image processing device, display device, image processing method, control program, and recording medium
US11837140B2 (en) Chromatic ambient light correction
CN118044198A (en) Dynamic spatial metadata for image and video processing

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant