WO2021143300A1 - Image processing method, apparatus, electronic device, and storage medium - Google Patents

Image processing method, apparatus, electronic device, and storage medium

Info

Publication number
WO2021143300A1
WO2021143300A1 PCT/CN2020/126521 CN2020126521W WO2021143300A1 WO 2021143300 A1 WO2021143300 A1 WO 2021143300A1 CN 2020126521 W CN2020126521 W CN 2020126521W WO 2021143300 A1 WO2021143300 A1 WO 2021143300A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
tone mapping
image
values
curve
Prior art date
Application number
PCT/CN2020/126521
Other languages
English (en)
French (fr)
Inventor
马元蛟
罗俊
权威
Original Assignee
Oppo广东移动通信有限公司
Application filed by Oppo广东移动通信有限公司
Publication of WO2021143300A1 publication Critical patent/WO2021143300A1/zh

Classifications

    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80 - Camera processing pipelines; Components thereof
    • H04N23/84 - Camera processing pipelines; Components thereof for processing colour signals
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04N - PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 - Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70 - Circuitry for compensating brightness variation in the scene
    • H04N23/71 - Circuitry for evaluating the brightness variation

Definitions

  • This application relates to image processing technology, in particular to an image processing method, device, electronic equipment and storage medium.
  • With the rapid development of image processing technology and the increasing data volume of captured video, it has become increasingly necessary to perform tone mapping processing on captured video images.
  • In the related art, the same tone mapping curve is used to tone map multiple captured frames, so as to preserve the color, contrast, detail, and other information in each image to the greatest extent.
  • However, using the same tone mapping curve for all captured frames causes the overall video stream to look discontinuous and unnatural after tone mapping.
  • the embodiments of the present application provide an image processing method, device, electronic equipment, and storage medium.
  • An embodiment of the present application provides an image processing method, which includes: acquiring video data to be processed, where the video data to be processed includes at least two images and attribute information corresponding to each of the at least two images;
  • determining, by using the attribute information respectively corresponding to the at least two images, the values corresponding to the at least two images to obtain at least two values, where each value represents the brightness value of the ambient light when the image was taken;
  • determining, based on the at least two values, the tone mapping curves respectively corresponding to the at least two images; and performing, by using the tone mapping curves respectively corresponding to the at least two images, tone mapping processing on the at least two images to obtain at least two processed images.
  • An embodiment of the application provides an image processing device, which includes:
  • An acquiring unit configured to acquire video data to be processed; the video data to be processed includes at least two images and attribute information corresponding to the at least two images respectively;
  • the first processing unit is configured to determine, by using the attribute information respectively corresponding to the at least two images, the values corresponding to the at least two images to obtain at least two values; the values represent the brightness value of the ambient light when the image was taken;
  • a second processing unit configured to determine the tone mapping curves respectively corresponding to the at least two images based on the at least two values
  • the third processing unit is configured to use tone mapping curves respectively corresponding to the at least two images to perform tone mapping processing on the at least two images to obtain at least two processed images.
  • An embodiment of the present application provides an electronic device, including a memory, a processor, and a computer program stored on the memory and executable on the processor; the processor implements the steps of any of the above methods when executing the program.
  • the embodiment of the present application provides a computer storage medium on which computer instructions are stored, and when the instructions are executed by a processor, the steps of any of the foregoing methods are implemented.
  • The image processing method, apparatus, electronic device, and storage medium provided by the embodiments of the application acquire video data to be processed, where the video data to be processed includes at least two images and attribute information corresponding to the at least two images; determine, by using the attribute information respectively corresponding to the at least two images, the values corresponding to the at least two images to obtain at least two values, where each value represents the brightness value of the ambient light when the image was taken; determine, based on the at least two values, the tone mapping curves respectively corresponding to the at least two images; and use those tone mapping curves to perform tone mapping processing on the at least two images to obtain at least two processed images.
  • the attribute information of each image in at least two images can be used to accurately predict the brightness value of the ambient light corresponding to each image.
  • Based on the predicted brightness value of the ambient light corresponding to each image, the tone mapping curve corresponding to each image can be determined, and the determined tone mapping curve is used to perform tone mapping processing on each image. This finally gives the overall video stream a smooth and natural effect; in particular, even when the illumination of the shooting scene changes, discontinuous and unnatural artifacts in the overall video stream are still avoided.
  • FIG. 1 is a schematic diagram of an implementation flow of an image processing method according to an embodiment of the present application
  • FIG. 2 is a schematic diagram of an implementation process of determining a tone mapping curve corresponding to each of at least two images in an embodiment of the present application
  • FIG. 3 is a schematic diagram of adjusting a first curve to obtain an initial tone mapping curve according to an embodiment of the present application
  • FIG. 5 is a schematic diagram of an implementation process of determining a tone mapping curve corresponding to each of at least two images in an embodiment of the present application
  • FIG. 6 is a schematic diagram of an implementation flow of tone mapping processing on each of at least two images according to an embodiment of the present application
  • FIG. 7 is a schematic diagram of determining an image after tone mapping processing according to an embodiment of the present application.
  • FIG. 8 is another schematic diagram of determining an image after tone mapping processing according to an embodiment of the present application.
  • FIG. 9 is a schematic diagram of the composition structure of an image processing device according to an embodiment of the present application.
  • FIG. 10 is a schematic diagram of the composition structure of an electronic device according to an embodiment of the present application.
  • In the related art, the same tone mapping curve, such as a gamma curve, can be used to perform tone mapping on multiple captured frames, so as to preserve the color, contrast, detail, and other information in each image to the greatest extent.
  • However, using the same gamma curve to tone map the captured multi-frame images results in a discontinuous and unnatural display effect in the overall video stream after tone mapping.
  • video data to be processed is acquired; the video data to be processed includes at least two images and attribute information corresponding to the at least two images;
  • the values corresponding to the at least two images are determined by using the attribute information corresponding to each image, to obtain at least two values, where each value represents the brightness value of the ambient light when the image was taken; based on the at least two values, the tone mapping curves respectively corresponding to the at least two images are determined; and the tone mapping curves are used to perform tone mapping processing on the at least two images respectively, to obtain at least two processed images.
  • FIG. 1 is a schematic diagram of the implementation process of the image processing method according to an embodiment of the application; as shown in FIG. 1, the method includes:
  • Step 101 Obtain video data to be processed; the video data to be processed includes at least two images and attribute information corresponding to the at least two images respectively;
  • Step 102 Determine the values corresponding to the at least two images by using the attribute information respectively corresponding to the at least two images to obtain at least two values; the values represent the brightness value of the ambient light when the image is taken;
  • Step 103 Based on the at least two values, determine the tone mapping curves respectively corresponding to the at least two images;
  • Step 104 Using tone mapping curves corresponding to the at least two images, respectively, perform tone mapping processing on the at least two images to obtain at least two processed images.
  • the at least two images may refer to at least two frames of images;
  • the attribute information may refer to shooting parameters used to shoot the images, such as exposure time, aperture value, and sensitivity.
  • the metadata information of a single frame image can be extracted, and the extracted metadata information can be used as the attribute information of the single frame image.
  • the tone mapping curve may refer to a curve capable of adjusting the relative brightness of the image; specifically, when the acquired image is a color image, the tone mapping curve may refer to a color curve capable of adjusting the color image.
  • the brightness value of the ambient light corresponding to each of the at least two images can be calculated in the following two ways.
  • the first method combines the attribute information of each image to accurately calculate the brightness value of the ambient light corresponding to each image.
  • the attribute information of the current frame image can be combined, and the brightness value of the ambient light corresponding to the current frame image can be accurately calculated according to formula (1).
  • luminance_t represents the brightness value of the ambient light corresponding to the current frame image
  • fps represents a fixed coefficient, such as 33
  • fnumber represents the shooting parameter used to shoot the current frame image, that is, the aperture value
  • exposuretime represents the shooting parameter used for the current frame image, that is, the exposure time
  • isovalue represents the shooting parameter used in shooting the current frame of image, that is, the sensitivity.
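Formula (1) itself is not reproduced in this excerpt. As a hedged sketch, ambient brightness is commonly estimated from the exposure settings, with brightness proportional to fnumber squared over exposuretime times isovalue; using the fixed coefficient fps (33) as a scale factor follows the text, but the exact combination below is an assumption:

```python
def estimate_luminance(fnumber: float, exposuretime: float, isovalue: float,
                       fps: float = 33.0) -> float:
    """Sketch of formula (1): estimate the ambient-light brightness of the
    current frame from its shooting parameters. The exact form of the
    patent's formula is not reproduced in this excerpt; this uses the
    standard exposure relation L ~ N^2 / (t * ISO), scaled by fps."""
    return fps * (fnumber ** 2) / (exposuretime * isovalue)
```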
  • the second way is to use the alpha blending function to combine the previous image of each image to accurately calculate the brightness value of the ambient light corresponding to each image.
  • the alpha blending function can be used to accurately calculate, according to formula (2), the brightness value of the ambient light when the current frame image was taken.
  • luminance_t represents the brightness value of the ambient light corresponding to the current frame image
  • alpha represents a fixed coefficient, such as 0.7
  • luminance_(t-1) represents the brightness value of the ambient light corresponding to the frame preceding the current frame image, which can be calculated according to formula (1).
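The alpha blending step above can be sketched as an exponentially weighted blend. Formula (2) itself is not reproduced in this excerpt, so the direction of the blend (previous frame weighted by alpha) is an assumption:

```python
def smooth_luminance(luminance_prev: float, luminance_curr: float,
                     alpha: float = 0.7) -> float:
    """Sketch of formula (2): alpha-blend the previous frame's brightness
    value with the current frame's raw estimate to smooth the result
    over time. alpha = 0.7 is the fixed coefficient named in the text."""
    return alpha * luminance_prev + (1.0 - alpha) * luminance_curr
```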
  • the attribute information of each image in at least two images can be used to accurately predict the brightness value of the ambient light corresponding to each image.
  • the tone mapping curve corresponding to each image can subsequently be determined based on the brightness value of the ambient light corresponding to each image.
  • Specifically, the two target brightness values closest to the brightness value of the ambient light corresponding to each image can be selected from the at least two preset target brightness values, and the tone mapping curve corresponding to each image is generated based on the initial tone mapping curves corresponding to the two selected target brightness values.
  • the determining the tone mapping curves respectively corresponding to the at least two images based on the at least two values includes:
  • For each of the at least two values, select two target brightness values that match the corresponding value from the set of brightness values; determine the initial tone mapping curves respectively corresponding to the two target brightness values matching each of the at least two values; and, based on the initial tone mapping curves, generate the tone mapping curves respectively corresponding to the at least two images.
  • Selecting two target brightness values that match the corresponding value from the set of brightness values may mean that the corresponding value lies within the range delimited by the two target brightness values.
  • the brightness value set includes 7 preset brightness values, such as 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux. If the brightness value of the ambient light corresponding to the current frame image in at least two images is 8 lux, the two closest preset brightness values are selected from the brightness value set as the target brightness values, that is, 5 lux and 10 lux.
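The selection described above can be sketched as a search for the two preset values that bracket the computed brightness, returning nothing when the value falls outside the preset range (in which case the histogram-based path described later applies):

```python
BRIGHTNESS_SET = (5, 10, 20, 50, 150, 400, 800)  # preset values in lux

def select_target_brightness(luminance):
    """Return the two preset brightness values that bracket `luminance`,
    or None when no bracketing pair exists in the preset set."""
    for lower, upper in zip(BRIGHTNESS_SET, BRIGHTNESS_SET[1:]):
        if lower <= luminance <= upper:
            return lower, upper
    return None
```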
  • the process of determining the tone mapping curve corresponding to each of the at least two images is described.
  • the process of determining the tone mapping curve corresponding to each of the at least two images includes:
  • Step 1 Obtain the attribute information of each of the at least two images.
  • the metadata information of each image is extracted, such as exposure time, aperture value (also referred to as F value), and sensitivity (also referred to as isoGain value), and the extracted metadata information is used as the attribute information of each frame of image.
  • Step 2 Use the attribute information of each image to calculate the brightness value of the ambient light corresponding to each image.
  • the brightness value of the ambient light corresponding to each image can be calculated according to formula (1); alternatively, the brightness value of the ambient light corresponding to each image can also be calculated according to formula (2).
  • Step 3 Select two target brightness values that match the brightness value of the ambient light corresponding to each image from the brightness value set.
  • the brightness value set includes 7 preset brightness values, such as 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux.
  • Step 4 Determine initial tone mapping curves respectively corresponding to the two target brightness values; generate a tone mapping curve corresponding to each image based on the initial tone mapping curve.
  • the brightness value of the ambient light corresponding to each image can be determined based on the attribute information of each image, and the most suitable tone mapping curve can be determined based on that brightness value. In this way, when each image is subsequently tone mapped, there will be neither overexposure nor insufficient brightening of the highlights, which ensures that the overall video stream has a smooth and natural effect.
  • the initial tone mapping curves corresponding to the two target brightness values can be looked up directly in a preset database; alternatively, based on a preset curve, an initial tone mapping curve corresponding to each of the two target brightness values can be generated.
  • the determining the initial tone mapping curves respectively corresponding to the two target brightness values that match each of the at least two values includes:
  • Adjust the first curve by using the second curve and the first function; the first curve represents the corresponding relationship between the input brightness value and the output brightness value. For the two target brightness values that match each of the at least two values, the adjusted first curve serves as the initial tone mapping curve corresponding to each of the two target brightness values.
  • the first curve may be a gamma curve
  • the second curve may be an ACES Filmic curve
  • the first function may be a sigmoid function
  • formula (3) may be used to express the first curve
  • formula (4) may be used to express the second curve
  • formula (5) may be used to express the first function.
  • lum_out represents the output brightness value
  • a represents the default coefficient, such as 2.51
  • b represents the default coefficient, such as 0.03
  • c represents the default coefficient, such as 2.43
  • d represents the default coefficient, such as 0.59
  • the value of e ranges between 0 and 1, such as 0.2.
  • lum_out represents the output brightness value
  • middle and k represent the default coefficients.
  • an area with a lower input brightness value is a dark area, and an area with a higher input brightness value is a bright area.
  • When the values of middle and k in formula (5) differ, the resulting curve 3 differs; the tone mapping curve corresponding to each image can subsequently be generated based on two such different curves 3. After tone mapping is performed on each image using its corresponding tone mapping curve, the contrast of each image can be improved. Compared with the related art, in which only a gamma curve is used to tone map each image, this avoids insufficient color saturation or an overall dim, less sharp image after tone mapping, thereby achieving the best tone mapping effect.
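The three building blocks named above can be sketched as follows. The formulas themselves are not reproduced in this excerpt, so each definition is an assumption based on the curves the text names: a standard gamma curve for formula (3), the ACES Filmic curve with the listed coefficients for formula (4), and a logistic sigmoid for formula (5). The gamma exponent and the sigmoid defaults are illustrative only, and how the three are combined into the adjusted first curve is not shown here:

```python
import math

def gamma_curve(lum_in: float, gamma: float = 2.2) -> float:
    """First curve (formula (3), assumed form): gamma mapping of a
    normalized input brightness in [0, 1]."""
    return lum_in ** (1.0 / gamma)

def aces_filmic(lum_in: float, a: float = 2.51, b: float = 0.03,
                c: float = 2.43, d: float = 0.59, e: float = 0.2) -> float:
    """Second curve (formula (4), assumed form): ACES Filmic tone curve
    with the default coefficients listed in the text."""
    return (lum_in * (a * lum_in + b)) / (lum_in * (c * lum_in + d) + e)

def sigmoid(lum_in: float, middle: float = 0.5, k: float = 10.0) -> float:
    """First function (formula (5), assumed form): logistic S-curve in
    which `middle` shifts and `k` steepens the transition between dark
    and bright areas; the defaults here are illustrative only."""
    return 1.0 / (1.0 + math.exp(-k * (lum_in - middle)))
```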
  • Figure 4 is a schematic diagram of the initial tone mapping curve corresponding to different target brightness values.
  • the target brightness value is one of the following: 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux
  • the initial tone mapping curve corresponding to the target brightness value of 5 lux is represented by trigger0
  • the initial tone mapping curve corresponding to the target brightness value of 10 lux is represented by trigger1
  • the initial tone mapping curve corresponding to the target brightness value of 20 lux is represented by trigger2
  • the initial tone mapping curve corresponding to the target brightness value of 50 lux is represented by trigger3, the one corresponding to 150 lux by trigger4, the one corresponding to 400 lux by trigger5, and the one corresponding to 800 lux by trigger6. It can be seen from FIG. 4 that the larger the target brightness value, the lower the tone compensation rate; the initial tone mapping curve corresponding to the lowest target brightness value, 5 lux, provides the greatest tone compensation effect.
  • the generating tone mapping curves respectively corresponding to the at least two images based on the initial tone mapping curve includes:
  • the difference between the two target brightness values corresponding to each of the at least two values is calculated to obtain a first difference; the difference between each of the at least two values and the smaller of its two corresponding target brightness values is calculated to obtain a second difference; and the quotient of the second difference corresponding to each value and the first difference corresponding to each value is calculated to obtain a first ratio corresponding to each value;
  • the first ratio corresponding to each value and the initial tone mapping curves corresponding to the two target brightness values of each value are used to generate the tone mapping curves respectively corresponding to the at least two images.
  • formula (6) may be used to express the first difference value
  • formula (7) may be used to express the second difference value
  • formula (8) may be used to express the first ratio
  • offset represents the first difference
  • trigger1 and trigger0 respectively represent two target brightness values corresponding to each image.
  • dis represents the second difference value
  • luminance_t represents the brightness value of the ambient light corresponding to the current frame image
  • trigger0 represents the minimum brightness value of the two target brightness values corresponding to the current frame image.
  • weight represents the first ratio.
  • the interpolation algorithm corresponding to formula (9) can be used to generate the tone mapping curve corresponding to each image based on the two initial tone mapping curves and the first ratio.
  • curve_inter(t) = (1 - weight) × curve_initial0 + weight × curve_initial1 (9)
  • curve_inter(t) represents the tone mapping curve corresponding to the current frame image
  • curve_initial0 and curve_initial1 respectively represent the initial tone mapping curves corresponding to the two target brightness values corresponding to the current frame image.
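The interpolation in formulas (6) through (9) can be sketched as follows, treating the curves as equal-length lookup tables. Note that for curve_inter(t) in formula (9) to blend correctly, the weight must lie between 0 and 1, so it is computed here as the second difference divided by the first; this reading is an inference from formula (9), not a quotation of formula (8):

```python
def interpolate_curve(luminance_t, trigger0, trigger1,
                      curve_initial0, curve_initial1):
    """Blend the two initial tone mapping curves for the current frame.

    trigger0 and trigger1 are the smaller and larger matched target
    brightness values; the curves are equal-length lookup tables."""
    offset = trigger1 - trigger0   # first difference, formula (6)
    dis = luminance_t - trigger0   # second difference, formula (7)
    weight = dis / offset          # first ratio, formula (8) (inferred)
    # formula (9): curve_inter(t) = (1-weight)*curve_initial0 + weight*curve_initial1
    return [(1.0 - weight) * c0 + weight * c1
            for c0, c1 in zip(curve_initial0, curve_initial1)]
```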
  • After the two initial tone mapping curves are obtained by adjusting the first curve using the second curve and the first function, a tone mapping curve corresponding to each image is generated based on the two initial tone mapping curves, and tone mapping each image with its corresponding curve can improve the contrast of that image. Compared with the related art, which uses only a gamma curve to tone map each image, this avoids insufficient color saturation or an overall dim image with reduced sharpness after tone mapping, thereby achieving the best tone mapping effect.
  • To prevent the tone-mapped images from appearing discontinuous and unnatural, when the tone mapping curve corresponding to each image is determined, the difference between the brightness values of the ambient light corresponding to two adjacent images can be calculated; when the difference is greater than or equal to a threshold, the tone mapping curve corresponding to the image taken at the later time of the two adjacent images is updated based on the tone mapping curves corresponding to the two adjacent images.
  • the method further includes:
  • the difference between the values corresponding to two adjacent images in the at least two images is calculated to obtain a third difference; it is determined whether the third difference is greater than or equal to a threshold; when the third difference is determined to be greater than or equal to the threshold, the tone mapping curve corresponding to the first image in the two adjacent images is updated based on the tone mapping curves corresponding to the two adjacent images; the two adjacent images include the first image and the second image; and the shooting time of the first image is later than the shooting time of the second image.
  • the process of determining the tone mapping curve corresponding to each of the at least two images is described.
  • the process of determining the tone mapping curve corresponding to each of the at least two images includes:
  • Step 1 Obtain the attribute information of each of the at least two images.
  • Step 2 Use the attribute information of each image to calculate the brightness value of the ambient light corresponding to each image; determine the tone mapping curve corresponding to each image based on the brightness value of the ambient light corresponding to each image.
  • Step 3 Compute the difference between the brightness values of the ambient light corresponding to two adjacent images in the at least two images to obtain a third difference value.
  • the brightness values of the ambient light corresponding to two adjacent images are represented by luminance(t) and luminance(t-1).
  • Step 4 Determine whether the third difference value is greater than or equal to the threshold value; when it is determined that the third difference value is greater than or equal to the threshold value, step 5 is executed.
  • the threshold is represented by lumDiffThresh, and the value can be 30.
  • Step 5 Based on the tone mapping curves corresponding to the two adjacent images in the at least two images, the tone mapping curve corresponding to the first image in the two adjacent images is updated.
  • the two adjacent images include a first image and a second image; the shooting time of the first image is later than the shooting time of the second image.
  • the tone mapping curve corresponding to the first image is represented by curve_inter(t)
  • the tone mapping curve corresponding to the second image is represented by curve_final(t-1).
  • the tone mapping curve curve_inter(t) corresponding to the first image is updated to curve_final(t).
  • the tone mapping curves corresponding to the two adjacent images are used to update and adjust the tone mapping curve corresponding to the image taken at the later time. In this way, it is possible to avoid the discontinuous and unnatural appearance of the tone-mapped images that would otherwise result from a large difference in the brightness of the ambient light between the two adjacent images.
  • the using the tone mapping curves respectively corresponding to the at least two images to perform tone mapping processing on the at least two images respectively includes:
  • Image signal processing is performed on the at least two images respectively to obtain YUV images corresponding to the at least two images; using the tone mapping curves corresponding to the at least two images, tone mapping processing is performed on the YUV image corresponding to each of the at least two images.
  • Y in YUV represents brightness, that is, grayscale value
  • U and V in YUV represent chromaticity, that is, image color and saturation.
  • the at least two images may also be YUV images obtained through image signal processing.
  • tone mapping processing can be performed on each image directly in the YUV domain.
  • the tone mapping ratio is determined based on the tone mapping curve corresponding to each of the at least two images, and the tone mapping process is performed on each image in the YUV domain based on the tone mapping ratio.
  • the using the tone mapping curves respectively corresponding to the at least two images to perform tone mapping processing on the YUV image corresponding to each of the at least two images includes:
  • formula (10) can be used to determine the tone mapping ratio.
  • curve_final(t) represents the tone mapping curve corresponding to the current frame image
  • Y_in represents the Y component of the YUV image corresponding to the current frame image
  • ratio represents the tone mapping ratio
  • formula (11) can be used in conjunction with the tone mapping ratio to adjust the values of the Y, U, and V components in the YUV image corresponding to the corresponding image.
  • Y_in represents the Y component of the YUV image corresponding to the current frame image
  • U_in represents the U component of the YUV image corresponding to the current frame image
  • V_in represents the V component of the YUV image corresponding to the current frame image
  • Y_out represents the Y component of the adjusted YUV image corresponding to the current frame image
  • U_out represents the U component of the adjusted YUV image corresponding to the current frame image
  • V_out represents the V component of the adjusted YUV image corresponding to the current frame image.
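Formulas (10) and (11) are not reproduced in this excerpt; a hedged reading of the description is that the curve is evaluated at Y_in to obtain the ratio, which then scales all three components. Treating the curve as an 8-bit lookup table and centring U and V at 128 before scaling are assumptions:

```python
def tone_map_yuv_pixel(y_in, u_in, v_in, curve_final):
    """Sketch of formulas (10) and (11) for a single pixel.

    curve_final is assumed to be a 256-entry lookup table indexed by
    the 8-bit Y value; the ratio is the curve output over the input Y."""
    ratio = curve_final[y_in] / y_in if y_in > 0 else 1.0  # formula (10)
    y_out = y_in * ratio                                   # formula (11)
    u_out = (u_in - 128) * ratio + 128  # chroma scaling is an assumption
    v_out = (v_in - 128) * ratio + 128
    return y_out, u_out, v_out
```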
  • the tone mapping ratio is determined based on the tone mapping curve corresponding to each of the at least two images, and the tone mapping process is performed on each image in the YUV domain based on the tone mapping ratio, which has the following advantages:
  • Under the same lighting conditions, the brightness of different images can still differ greatly, and performing statistics-based global tone mapping on the at least two images may cause images to be overexposed or too dark. Therefore, if the brightness value of the ambient light corresponding to each of the at least two images is low, for example less than 55 lux, tone mapping processing can be performed on each image based on the histogram of that image.
  • the method further includes:
  • the histogram of the image corresponding to each such value is determined; the histogram corresponding to each value is used as the tone mapping curve for performing tone mapping processing on the corresponding one of the at least two images; and the histogram is used to perform tone mapping processing on each of the at least two images to obtain at least two processed images.
  • the histogram presents the light and dark distribution of an image as a curve. When two target brightness values matching the corresponding value cannot be selected from the brightness value set, it indicates that the brightness value of the ambient light is low.
  • image signal processing can be performed on the at least two images to obtain the YUV images corresponding to the at least two images respectively; and the histogram of the YUV image corresponding to each image is determined, and the histogram is used for each The value of the Y component of the YUV image corresponding to each image is adjusted to obtain the YUV image after tone mapping processing.
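The adjustment described above can be sketched with histogram equalization on the Y channel; the text only states that the histogram serves as the tone mapping curve, so equalization is one standard realization and an assumption here:

```python
def histogram_tone_map(y_values):
    """Low-light fallback: build the Y-channel histogram, turn its
    cumulative distribution into a lookup table spanning the full
    8-bit range, and remap every Y value through it."""
    histogram = [0] * 256
    for y in y_values:
        histogram[y] += 1
    lut, cumulative = [], 0
    total = len(y_values)
    for count in histogram:
        cumulative += count
        lut.append(round(255 * cumulative / total))
    return [lut[y] for y in y_values]
```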
  • In this way, the image can be tone mapped based on the histogram of the image corresponding to the corresponding value.
  • the process of performing tone mapping processing on each of at least two images includes:
  • Step 1 Obtain at least two YUV images and their corresponding attribute information.
  • the attribute information of each YUV image may be: the exposure time is 30 ms, the aperture value is 1.75, and the sensitivity is 6400.
  • the at least two YUV images are at least two YUV images obtained after image signal processing is performed on the acquired at least two images.
  • Step 2 Determine the brightness value of the ambient light corresponding to each YUV image based on the attribute information of each YUV image.
  • the brightness value of the ambient light can also be represented by an illuminance value. Assume that the brightness value of the ambient light corresponding to the current frame of the YUV image is 5 lux.
  • Step 3 Determine the corresponding tone mapping curve based on the brightness value of the ambient light corresponding to each YUV image.
  • Step 4 Use the tone mapping curve corresponding to each YUV image to perform tone mapping processing on the three channels of each YUV image.
  • the three channels of the YUV image are subjected to tone mapping processing, and the resulting YUV image is shown in FIG. 7.
  • the YUV image is subjected to tone mapping based on the histogram of the YUV image, and the obtained YUV image is shown in FIG. 8.
  • tone mapping the YUV image in this way avoids the overexposure that an excessive tone mapping ratio would cause, and restores the black areas of the YUV image to their original grayscale, thereby improving the contrast of the YUV image and making the brightness of the overall video stream present a smooth and natural display effect.
  • the attribute information of each of the at least two images can be used to accurately predict the brightness value of the ambient light corresponding to each image. In this way, the tone mapping curve corresponding to each image can be determined based on that brightness value, and the determined tone mapping curve can be used to perform tone mapping processing on each image, finally achieving a smooth and natural effect for the overall video stream; in particular, even when the illumination of the shooting scene changes, discontinuous and unnatural artifacts in the overall video stream can still be avoided.
  • FIG. 9 is a schematic diagram of the composition structure of an image processing device according to an embodiment of the application; as shown in FIG. 9, the device includes:
  • the obtaining unit 91 is configured to obtain video data to be processed; the video data to be processed includes at least two images and attribute information corresponding to the at least two images respectively;
  • the first processing unit 92 is configured to use the attribute information respectively corresponding to the at least two images to determine the values respectively corresponding to the at least two images, obtaining at least two values; each value represents the brightness of the ambient light when the corresponding image was taken.
  • the second processing unit 93 is configured to determine the tone mapping curves respectively corresponding to the at least two images based on the at least two values;
  • the third processing unit 94 is configured to perform tone mapping processing on the at least two images by using the tone mapping curves respectively corresponding to the at least two images to obtain at least two processed images.
  • the second processing unit 93 is specifically configured to:
  • for each of the at least two values, select two target brightness values matching the corresponding value from a set of brightness values; determine the initial tone mapping curves respectively corresponding to the two target brightness values matched to each of the at least two values; and, based on the initial tone mapping curves, generate the tone mapping curves respectively corresponding to the at least two images.
  • the second processing unit 93 is specifically configured to:
  • obtain a first curve, where the first curve represents the correspondence between input brightness values and output brightness values; for the two target brightness values matched to each of the at least two values, determine the second curve and the first function corresponding to each of the two target brightness values; adjust the first curve using the second curve and the first function to obtain an adjusted first curve; and use the adjusted first curve as the initial tone mapping curve respectively corresponding to the two target brightness values.
  • the second processing unit 93 is specifically configured to:
  • compute the difference between the two target brightness values corresponding to each of the at least two values to obtain a first difference; compute the difference between each of the at least two values and the smaller of its two corresponding target brightness values to obtain a second difference; take the quotient of the first difference and the second difference corresponding to each value to obtain a first ratio corresponding to each value; and use the first ratio, together with the initial tone mapping curves respectively corresponding to the two target brightness values, to generate the tone mapping curves respectively corresponding to the at least two images.
  • the device further includes:
  • the update unit is configured to compute the difference between the values corresponding to two adjacent images in the at least two images to obtain a third difference; determine whether the third difference is greater than or equal to a threshold; and, when the third difference is greater than or equal to the threshold, update the tone mapping curve corresponding to the first of the two adjacent images based on the tone mapping curves respectively corresponding to the two adjacent images;
  • the two adjacent images include a first image and a second image; the first image was captured earlier than the second image.
  • the third processing unit 94 is specifically configured to:
  • perform image signal processing on the at least two images respectively to obtain the YUV images respectively corresponding to the at least two images; and perform tone mapping processing on the YUV image corresponding to each of the at least two images by using the tone mapping curves respectively corresponding to the at least two images;
  • for each of the at least two images, determine the Y, U, and V components of the corresponding YUV image; determine a tone mapping ratio from the Y component together with the image's tone mapping curve; adjust the values of the Y, U, and V components using the ratio; and take the adjusted YUV image as the tone-mapped YUV image.
  • the acquiring unit 91 can be implemented by a communication interface in the device; the first processing unit 92, the second processing unit 93, and the third processing unit 94 can be implemented by a processor in the device;
  • the processor can be a central processing unit (CPU), a digital signal processor (DSP), a microcontroller unit (MCU), or a field-programmable gate array (FPGA).
  • it should be noted that, when the terminal provided in the above embodiment performs image processing, the division into the above program modules is used only as an example for illustration; in practice, the above processing can be allocated to different program modules as needed, i.e., the internal structure of the device can be divided into different program modules to complete all or part of the processing described above.
  • the device provided in the above embodiment and the image processing method embodiment belong to the same concept; the specific implementation process is described in the method embodiment and will not be repeated here.
  • FIG. 10 is a schematic diagram of the hardware composition structure of the electronic device according to the embodiment of the application.
  • the electronic device 100 includes a memory 103, a processor 102, and a computer program stored on the memory 103 and executable on the processor 102; the processor 102 implements the method provided by one or more of the above technical solutions when executing the program.
  • the electronic device 100 further includes a communication interface 101, which is used to exchange information with other devices; at the same time, various components in the electronic device 100 are coupled together through the bus system 104.
  • the bus system 104 is configured to implement connection and communication between these components.
  • in addition to a data bus, the bus system 104 also includes a power bus, a control bus, and a status signal bus.
  • the memory 103 in this embodiment may be a volatile memory or a non-volatile memory, and may also include both volatile and non-volatile memory.
  • the non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory can be magnetic disk storage or tape storage.
  • the volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), synchronous static RAM (SSRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), double data rate synchronous dynamic RAM (DDR SDRAM), enhanced synchronous dynamic RAM (ESDRAM), SyncLink dynamic RAM (SLDRAM), and direct Rambus RAM (DRRAM).
  • the memories described in the embodiments of the present application are intended to include, but are not limited to, these and any other suitable types of memories.
  • the methods disclosed in the foregoing embodiments of the present application may be applied to the processor 102 or implemented by the processor 102.
  • the processor 102 may be an integrated circuit chip with signal processing capability. In the implementation process, each step of the foregoing method may be completed by an integrated logic circuit of hardware in the processor 102 or instructions in the form of software.
  • the aforementioned processor 102 may be a general-purpose processor, a DSP, or other programmable logic devices, discrete gates or transistor logic devices, discrete hardware components, and the like.
  • the processor 102 may implement or execute various methods, steps, and logical block diagrams disclosed in the embodiments of the present application.
  • the general-purpose processor may be a microprocessor or any conventional processor or the like.
  • the steps of the method disclosed in the embodiments of the present application may be directly embodied as being executed and completed by a hardware decoding processor, or executed and completed by a combination of hardware and software modules in the decoding processor.
  • the software module may be located in a storage medium, and the storage medium is located in a memory.
  • the processor 102 reads information in the memory and completes the steps of the foregoing method in combination with its hardware.
  • the embodiment of the present application also provides a storage medium, specifically a computer storage medium, and more specifically a computer-readable storage medium, on which computer instructions (i.e., a computer program) are stored; when the computer instructions are executed by a processor, the method provided by one or more of the above technical solutions is implemented.
  • in the several embodiments provided in this application, it should be understood that the disclosed method and smart device can be implemented in other ways.
  • the device embodiments described above are merely illustrative.
  • the division of the units is only a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components can be combined or integrated into another system, or some features can be ignored or not implemented.
  • the coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be electrical, mechanical, or in other forms.
  • the units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed on multiple network units; Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the embodiments of the present application can all be integrated into one processing unit, each unit can be used individually as a unit, or two or more units can be integrated into one unit; the integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
  • those of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be completed by hardware related to program instructions; the foregoing program can be stored in a computer-readable storage medium, and when the program is executed, the steps of the foregoing method embodiments are performed; the foregoing storage medium includes various media that can store program code, such as a removable storage device, ROM, RAM, a magnetic disk, or an optical disc.
  • alternatively, if the aforementioned integrated unit of the present application is implemented in the form of a software function module and sold or used as an independent product, it may also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of the present application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to execute all or part of the methods described in the various embodiments of the present application.
  • the aforementioned storage media include: removable storage devices, ROM, RAM, magnetic disks, or optical disks and other media that can store program codes.


Abstract

This application discloses an image processing method and apparatus, an electronic device, and a storage medium. The method includes: acquiring video data to be processed, the video data to be processed including at least two images and attribute information respectively corresponding to the at least two images; using the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken; determining, based on the at least two values, tone mapping curves respectively corresponding to the at least two images; and performing tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.

Description

Image processing method and apparatus, electronic device, and storage medium
Cross-reference to related applications
This application is based on, and claims priority to, Chinese patent application No. 202010059813.3 filed on January 19, 2020, the entire content of which is incorporated into this application by reference.
Technical Field
This application relates to image processing technology, and in particular to an image processing method and apparatus, an electronic device, and a storage medium.
Background
With the rapid development of image processing technology and the growing data volume of captured video, tone mapping of captured video images is becoming increasingly necessary. In the related art, the same tone mapping curve is used to tone map multiple captured frames so as to preserve as much of the color, contrast, and detail in the images as possible. However, when the illumination of the shooting scene changes, tone mapping multiple captured frames with the same curve causes the processed video stream as a whole to appear discontinuous and unnatural.
Summary
The embodiments of this application provide an image processing method and apparatus, an electronic device, and a storage medium.
The technical solution of this application can be implemented as follows:
acquiring video data to be processed, the video data to be processed including at least two images and attribute information respectively corresponding to the at least two images;
using the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken;
determining, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
performing tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
An embodiment of this application provides an image processing apparatus, including:
an acquiring unit configured to acquire video data to be processed, the video data to be processed including at least two images and attribute information respectively corresponding to the at least two images;
a first processing unit configured to use the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken;
a second processing unit configured to determine, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
a third processing unit configured to perform tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
An embodiment of this application provides an image processing apparatus including a memory, a processor, and a computer program stored on the memory and executable on the processor, where the processor implements the steps of any of the above methods when executing the program.
An embodiment of this application provides a computer storage medium on which computer instructions are stored, where the instructions implement the steps of any of the above methods when executed by a processor.
With the image processing method and apparatus, electronic device, and storage medium provided by the embodiments of this application, video data to be processed is acquired, the video data including at least two images and attribute information respectively corresponding to the at least two images; the attribute information is used to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken; tone mapping curves respectively corresponding to the at least two images are determined based on the at least two values; and the at least two images are respectively tone mapped using those curves, obtaining at least two processed images. By adopting the technical solution of the embodiments of this application, the attribute information of each of the at least two images can be used to accurately predict the brightness of the ambient light corresponding to each image; in this way, the tone mapping curve corresponding to each image can be determined based on that brightness, and each image can be tone mapped with the determined curve, finally achieving a smooth and natural effect for the overall video stream; in particular, discontinuous and unnatural artifacts in the overall video stream can still be avoided even when the illumination of the shooting scene changes.
Brief Description of the Drawings
FIG. 1 is a schematic flowchart of an implementation of an image processing method according to an embodiment of this application;
FIG. 2 is a schematic flowchart of determining the tone mapping curve corresponding to each of at least two images according to an embodiment of this application;
FIG. 3 is a schematic diagram of adjusting a first curve to obtain an initial tone mapping curve according to an embodiment of this application;
FIG. 4 is a schematic diagram of initial tone mapping curves respectively corresponding to different target brightness values according to an embodiment of this application;
FIG. 5 is a schematic flowchart of determining the tone mapping curve corresponding to each of at least two images according to an embodiment of this application;
FIG. 6 is a schematic flowchart of performing tone mapping processing on each of at least two images according to an embodiment of this application;
FIG. 7 is a schematic diagram of an image after tone mapping processing according to an embodiment of this application;
FIG. 8 is a schematic diagram of another image after tone mapping processing according to an embodiment of this application;
FIG. 9 is a schematic diagram of the composition of an image processing apparatus according to an embodiment of this application;
FIG. 10 is a schematic diagram of the composition of an electronic device according to an embodiment of this application.
Detailed Description
To make the objectives, technical solutions, and advantages of this application clearer, this application is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described here are only used to explain this application and are not intended to limit it.
In the related art, the same tone mapping curve, such as a gamma curve, can be used to tone map multiple captured frames so as to preserve as much of the color, contrast, and detail in the images as possible. However, when the illumination of the shooting scene changes, tone mapping multiple captured frames with the same gamma curve causes the processed video stream as a whole to appear discontinuous and unnatural.
On this basis, in various embodiments of this application: video data to be processed is acquired, the video data including at least two images and attribute information respectively corresponding to the at least two images; the attribute information is used to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken; tone mapping curves respectively corresponding to the at least two images are determined based on the at least two values; and the at least two images are respectively tone mapped using those curves, obtaining at least two processed images.
This application is described in further detail below with reference to the accompanying drawings and specific embodiments.
An embodiment of this application provides an image processing method. FIG. 1 is a schematic flowchart of the implementation of the image processing method of this embodiment; as shown in FIG. 1, the method includes:
Step 101: acquire video data to be processed; the video data to be processed includes at least two images and attribute information respectively corresponding to the at least two images;
Step 102: use the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values; each value represents the brightness of the ambient light when the corresponding image was taken;
Step 103: determine, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
Step 104: perform tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
Here, in step 101, in practice, the at least two images may be at least two frames, and the attribute information may be the shooting parameters used to capture the image, such as exposure time, aperture value, and sensitivity. In practice, the metadata of a single frame can be extracted and used as that frame's attribute information.
Here, in step 103, in practice, a tone mapping curve may be a curve capable of adjusting the relative lightness and darkness of an image; specifically, when the acquired image is a color image, the tone mapping curve may be a curve capable of adjusting the colors of the color image.
Here, in practice, the brightness of the ambient light corresponding to each of the at least two images can be calculated in the following two ways.
In the first way, the brightness of the ambient light corresponding to each image is calculated accurately from that image's attribute information.
Specifically, taking the current frame as an example, the brightness of the ambient light corresponding to the current frame can be calculated accurately from the frame's attribute information according to formula (1).
Figure PCTCN2020126521-appb-000001
where luminance_t represents the brightness of the ambient light corresponding to the current frame; fps represents a fixed coefficient, e.g., 33; fnumber represents the shooting parameter used to capture the current frame, namely the aperture value; exposuretime represents the shooting parameter used to capture the current frame, namely the exposure time; and isovalue represents the shooting parameter used to capture the current frame, namely the sensitivity.
In the second way, an alpha blending function is used, combining each image with its previous image, to accurately calculate the brightness of the ambient light corresponding to each image.
Specifically, to avoid a large difference between the ambient-light brightness values of two adjacent images, taking the current frame as an example, the brightness of the ambient light when the current frame was captured can be calculated accurately using an alpha blending function according to formula (2).
luminance_t = alpha × luminance_t + (1 − alpha) × luminance_{t−1}     (2)
where luminance_t represents the brightness of the ambient light corresponding to the current frame; alpha represents a fixed coefficient, e.g., 0.7; and luminance_{t−1} represents the brightness of the ambient light corresponding to the previous frame, which can be calculated according to formula (1).
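The two estimation routes above can be sketched in Python. The exact form of formula (1) appears only as an unreproduced image in this text, so the exposure-equation form below (illuminance proportional to the square of the f-number and inversely proportional to exposure time and ISO, with an assumed calibration constant `k`) is a hypothesis; formula (2) is reproduced as written.

```python
def estimate_ambient_lux(f_number, exposure_time_s, iso, k=250.0):
    # Hypothetical form of formula (1): brighter scenes allow a larger
    # f-number, a shorter exposure, or a lower ISO, so lux grows with
    # f_number^2 and shrinks with exposure time and sensitivity.
    # k is an assumed calibration constant, not from the patent.
    return k * f_number ** 2 / (exposure_time_s * iso)


def smooth_lux(lux_t, lux_prev, alpha=0.7):
    # Formula (2): alpha-blend the current estimate with the previous
    # frame's value to avoid large jumps between adjacent frames.
    return alpha * lux_t + (1.0 - alpha) * lux_prev
```

For the metadata quoted later in the text (f-number 1.75, exposure 30 ms, ISO 6400) this yields a low-lux estimate, consistent with the night-scene example.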
Here, using the attribute information respectively corresponding to the at least two images to determine the values respectively corresponding to the at least two images has the following advantage:
the attribute information of each of the at least two images can be used to accurately predict the brightness of the ambient light corresponding to that image; subsequently, the tone mapping curve corresponding to each image can be determined based on that brightness, and each image can be tone mapped with the determined curve, finally achieving a smooth and natural effect for the overall video stream.
In practice, after the ambient-light brightness corresponding to each of the at least two images is obtained, the two preset target brightness values closest to that brightness can be selected from at least two preset target brightness values, and the tone mapping curve corresponding to each image can be generated based on the initial tone mapping curves corresponding to the two selected target brightness values.
On this basis, in an embodiment, determining the tone mapping curves respectively corresponding to the at least two images based on the at least two values includes:
for each of the at least two values, selecting two target brightness values matching the corresponding value from a set of brightness values; determining the initial tone mapping curves respectively corresponding to the two target brightness values matched to each value; and generating, based on the initial tone mapping curves, the tone mapping curves respectively corresponding to the at least two images.
Here, selecting two target brightness values matching the corresponding value from the brightness value set may mean that the corresponding value falls within the range delimited by the two target brightness values.
For example, the brightness value set includes seven preset brightness values: 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux. If the ambient-light brightness corresponding to the current frame among the at least two images is 8 lux, the two closest preset values, namely 5 lux and 10 lux, are selected from the set as the target brightness values.
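Selecting the bracketing pair from the preset set can be sketched as follows; returning `None` when the value falls outside the set models the histogram-based fallback described later in the text.

```python
import bisect

TRIGGER_LUX = [5, 10, 20, 50, 150, 400, 800]  # preset brightness value set


def pick_target_pair(lux, triggers=TRIGGER_LUX):
    # Return the two preset values whose range contains `lux`, or None
    # when no matching pair exists (e.g. very dark scenes below 5 lux).
    if lux < triggers[0] or lux > triggers[-1]:
        return None
    i = bisect.bisect_right(triggers, lux)
    if i == len(triggers):        # lux equals the largest preset value
        i -= 1
    return triggers[i - 1], triggers[i]
```

With the 8 lux example from the text, this returns the pair (5, 10).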
An example is given to describe the process of determining the tone mapping curve corresponding to each of at least two images.
As shown in FIG. 2, the process of determining the tone mapping curve corresponding to each of at least two images includes:
Step 1: obtain the attribute information of each of the at least two images.
Here, the metadata of each image, such as the exposure time, aperture value (also called the F value), and sensitivity (also called the isoGain value), is extracted and used as that frame's attribute information.
Step 2: calculate the ambient-light brightness corresponding to each image from its attribute information.
Here, the brightness can be calculated according to formula (1) or, alternatively, according to formula (2).
Step 3: select, from the brightness value set, the two target brightness values matching the ambient-light brightness corresponding to each image.
The brightness value set includes seven preset brightness values: 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux.
Step 4: determine the initial tone mapping curves respectively corresponding to the two target brightness values; generate the tone mapping curve corresponding to each image based on the initial tone mapping curves.
In this example, for all application scenarios with illumination between 5 lux and 1000 lux, the ambient-light brightness corresponding to each image can be determined from the image's attribute information, and the most suitable tone mapping curve can be provided based on that brightness. In this way, subsequent tone mapping of each image neither overexposes highlights nor under-brightens shadows, ensuring a smooth and natural overall video stream.
In practice, after the two target brightness values matching the ambient-light brightness of each image are determined, the initial tone mapping curves respectively corresponding to the two target brightness values can be looked up directly in a preset database; alternatively, the initial tone mapping curve corresponding to each of the two target brightness values can be generated based on a preset curve.
On this basis, in an embodiment, determining the initial tone mapping curves respectively corresponding to the two target brightness values matched to each of the at least two values includes:
obtaining a first curve, where the first curve represents the correspondence between input brightness values and output brightness values; for the two target brightness values matched to each value, determining the second curve and the first function corresponding to each of the two target brightness values; adjusting the first curve using the second curve and the first function to obtain an adjusted first curve; and using the adjusted first curve as the initial tone mapping curve respectively corresponding to the two target brightness values.
The first curve may be a gamma curve, the second curve may be an ACES filmic curve, and the first function may be a sigmoid function.
In practice, the first curve can be expressed by formula (3), the second curve by formula (4), and the first function by formula (5).
Figure PCTCN2020126521-appb-000002
where lum_out represents the output brightness value; lum_in represents the input brightness value; and gamma_gain represents a fixed coefficient, e.g., 2.0. The larger the value of gamma_gain, the larger the dynamic range between the output and input brightness values.
Figure PCTCN2020126521-appb-000003
where lum_out represents the output brightness value; a, b, c, and d represent default coefficients, e.g., 2.51, 0.03, 2.43, and 0.59 respectively; and e ranges between 0 and 1, e.g., 0.2.
Figure PCTCN2020126521-appb-000004
where lum_out represents the output brightness value, and middle and k represent default coefficients.
Here, in practice: first, a relatively large gamma_gain is used to increase the dynamic range between the output and input brightness values of the first curve, as shown by curve 1 in FIG. 3; then, the second curve of formula (4) is used to adjust the first curve, raising the portion of the first curve corresponding to dark regions and lowering the portion corresponding to bright regions to balance the overall curve naturally, yielding curve 2 in FIG. 3; finally, the first function is used to adjust the first curve again, tuning the ratio between the dark and bright regions, yielding curve 3 in FIG. 3. The region with lower input brightness values is the dark region, and the region with higher input brightness values is the bright region. Here, for the two different target brightness values corresponding to each image, different values of middle and k in formula (5) are determined, producing different versions of curve 3; the tone mapping curve corresponding to each image can then be generated from the two different versions of curve 3. After tone mapping each image with its corresponding curve, the contrast of each image is improved; compared with tone mapping each image with a gamma curve alone, as in the related art, this avoids insufficient color saturation or an overall dim picture with reduced clarity after tone mapping, achieving the best tone mapping effect.
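The three-stage construction can be sketched numerically. Formulas (3) to (5) appear only as unreproduced images here, so standard forms are assumed: a power-law gamma curve, the widely used ACES-filmic rational curve with the coefficients quoted in the text, and a logistic sigmoid parameterized by `middle` and `k` (both values assumed).

```python
import numpy as np


def initial_tone_curve(middle=0.5, k=8.0, gamma_gain=2.0, n=256):
    x = np.linspace(0.0, 1.0, n)          # input brightness, normalized
    # Stage 1 (curve 1): gamma curve; a larger gamma_gain widens the
    # dynamic range between output and input brightness.
    y = x ** (1.0 / gamma_gain)
    # Stage 2 (curve 2): ACES-filmic-style adjustment; lifts the dark
    # region and pulls down the bright region (coefficients from the text).
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.2
    y = (y * (a * y + b)) / (y * (c * y + d) + e)
    # Stage 3 (curve 3): sigmoid re-balance of the dark/bright split;
    # different (middle, k) pairs yield the different curve-3 variants.
    y = 1.0 / (1.0 + np.exp(-k * (y - middle)))
    y = (y - y[0]) / (y[-1] - y[0])       # renormalize to [0, 1]
    return x, y
```

Each stage is strictly increasing on [0, 1], so the composite curve remains a monotonic brightness mapping.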
FIG. 4 is a schematic diagram of the initial tone mapping curves respectively corresponding to different target brightness values. As shown in FIG. 4, if the target brightness value is one of 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux, the initial tone mapping curves corresponding to 5 lux, 10 lux, 20 lux, 50 lux, 150 lux, 400 lux, and 800 lux are denoted trigger0, trigger1, trigger2, trigger3, trigger4, trigger5, and trigger6 respectively. As can be seen from FIG. 4, the larger the target brightness value, the lower the tone compensation rate, and the initial tone mapping curve corresponding to the lowest target brightness value, 5 lux, provides the greatest tone compensation.
In practice, after the two target brightness values matching each image's ambient-light brightness and their corresponding initial tone mapping curves are determined, the tone mapping curve corresponding to each image can be generated from the two initial curves, combined with the image's ambient-light brightness, using an interpolation algorithm.
On this basis, in an embodiment, generating the tone mapping curves respectively corresponding to the at least two images based on the initial tone mapping curves includes:
computing the difference between the two target brightness values corresponding to each of the at least two values to obtain a first difference; computing the difference between each of the at least two values and the smaller of its two corresponding target brightness values to obtain a second difference; taking the quotient of the first difference and the second difference corresponding to each value to obtain a first ratio corresponding to each value; and using the first ratio, together with the initial tone mapping curves respectively corresponding to the two target brightness values of each value, to generate the tone mapping curves respectively corresponding to the at least two images.
In practice, the first difference can be expressed by formula (6), the second difference by formula (7), and the first ratio by formula (8).
offset = trigger1 − trigger0     (6)
where offset represents the first difference, and trigger1 and trigger0 represent the two target brightness values corresponding to each image.
dis = luminance_t − trigger0     (7)
where dis represents the second difference; luminance_t represents the ambient-light brightness corresponding to the current frame; and trigger0 represents the smaller of the two target brightness values corresponding to the current frame.
weight = dis / offset     (8)
where weight represents the first ratio.
In practice, the interpolation algorithm of formula (9) can be used to generate the tone mapping curve corresponding to each image from the two initial tone mapping curves and the first ratio.
curve_inter(t) = (1 − weight) × curve_initial0 + weight × curve_initial1    (9)
where curve_inter(t) represents the tone mapping curve corresponding to the current frame, and curve_initial0 and curve_initial1 represent the initial tone mapping curves respectively corresponding to the current frame's two target brightness values.
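Following formulas (6) to (9) as written, the interpolation reduces to a pointwise linear blend with weight = (lux − trigger0) / (trigger1 − trigger0):

```python
def interpolate_curve(lux, trigger0, trigger1, curve_initial0, curve_initial1):
    # Formulas (6)-(8): offset = trigger1 - trigger0, dis = lux - trigger0,
    # weight = dis / offset; weight is 0 at trigger0 and 1 at trigger1.
    weight = (lux - trigger0) / (trigger1 - trigger0)
    # Formula (9): pointwise linear blend of the two initial curves.
    return [(1.0 - weight) * c0 + weight * c1
            for c0, c1 in zip(curve_initial0, curve_initial1)]
```

For the 8 lux example between the 5 lux and 10 lux presets, the weight is 0.6, so the result lies 60% of the way from trigger0's curve to trigger1's curve.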
Here, determining the two corresponding initial tone mapping curves based on the ambient-light brightness of each of the at least two images, and generating the tone mapping curve for each image from the two initial curves, has the following advantage:
since the two initial tone mapping curves are obtained by adjusting the first curve with the second curve and the first function, tone mapping each image with the curve generated from them improves the image's contrast; compared with tone mapping each image with a gamma curve alone, as in the related art, this avoids insufficient color saturation or an overall dim picture with reduced clarity after tone mapping, achieving the best tone mapping effect.
In practice, to prevent a large ambient-light brightness difference between two adjacent frames from making the tone-mapped images appear discontinuous and unnatural, after the tone mapping curve corresponding to each image is determined, the difference between the ambient-light brightness values of the two adjacent images can be calculated; when the difference is greater than or equal to a threshold, the tone mapping curve corresponding to the later-captured of the two adjacent images is updated based on the tone mapping curves corresponding to the two adjacent images.
On this basis, in an embodiment, the method further includes:
computing the difference between the values respectively corresponding to two adjacent images among the at least two images to obtain a third difference; determining whether the third difference is greater than or equal to a threshold; when the third difference is greater than or equal to the threshold, updating the tone mapping curve corresponding to the first of the two adjacent images based on the tone mapping curves respectively corresponding to the two adjacent images; the two adjacent images include a first image and a second image, and the first image was captured earlier than the second image.
An example is given to describe the process of determining the tone mapping curve corresponding to each of at least two images.
As shown in FIG. 5, the process of determining the tone mapping curve corresponding to each of at least two images includes:
Step 1: obtain the attribute information of each of the at least two images.
Step 2: calculate the ambient-light brightness corresponding to each image from its attribute information; determine the tone mapping curve corresponding to each image based on that brightness.
Step 3: compute the difference between the ambient-light brightness values respectively corresponding to two adjacent images among the at least two images to obtain a third difference.
Here, the ambient-light brightness values of the two adjacent images are denoted luminance(t) and luminance(t−1).
Step 4: determine whether the third difference is greater than or equal to a threshold; when the third difference is greater than or equal to the threshold, perform step 5.
Here, the threshold is denoted lumDiffThresh and may take the value 30.
Step 5: update the tone mapping curve corresponding to the first of the two adjacent images based on the tone mapping curves respectively corresponding to the two adjacent images.
The two adjacent images include a first image and a second image; in this example, the first image was captured later than the second image.
Here, the tone mapping curve of the first image is denoted curve_inter(t) and that of the second image curve_final(t−1); using the interpolation algorithm of formula (9), the first image's curve curve_inter(t) is updated to curve_final(t).
In this example, after the tone mapping curve corresponding to each of the at least two images is determined, when the difference between the ambient-light brightness values of two adjacent images is greater than or equal to the threshold, the curve of the later-captured image is updated using the curves of the two adjacent images; this prevents a large brightness difference between adjacent frames from making the tone-mapped images appear discontinuous and unnatural.
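Steps 3 to 5 above can be sketched as a temporal smoothing pass. The text re-applies the interpolation of formula (9) between curve_final(t−1) and curve_inter(t) without fixing the blend weight, so `w` below is an assumed parameter.

```python
LUM_DIFF_THRESH = 30.0  # lumDiffThresh, example value from the text


def update_curve(lux_t, lux_prev, curve_inter_t, curve_final_prev, w=0.5):
    # Steps 3-4: only update when adjacent frames differ enough in
    # ambient brightness; otherwise keep the interpolated curve as-is.
    if abs(lux_t - lux_prev) < LUM_DIFF_THRESH:
        return list(curve_inter_t)
    # Step 5: blend toward the previous frame's final curve, re-applying
    # the linear interpolation of formula (9); w is an assumed weight.
    return [(1.0 - w) * prev + w * cur
            for prev, cur in zip(curve_final_prev, curve_inter_t)]
```

Small brightness changes leave the curve untouched, while a jump (e.g. from 10 lux to 100 lux) pulls the new curve halfway back toward the previous one, smoothing the transition.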
In practice, to shorten the tone mapping processing time and achieve real-time display of the video stream, tone mapping can be performed on each of the at least two images directly in the YUV domain (Y denotes luma; U and V denote chroma).
On this basis, in an embodiment, performing tone mapping processing on the at least two images respectively by using their corresponding tone mapping curves includes:
performing image signal processing on the at least two images respectively to obtain the YUV images respectively corresponding to the at least two images; and performing tone mapping processing on the YUV image corresponding to each of the at least two images by using the tone mapping curves respectively corresponding to the at least two images.
Here, Y in YUV denotes luma, i.e., the grayscale value, and U and V denote chroma, i.e., image color and saturation.
Here, the at least two images may themselves be YUV images obtained through image signal processing; in that case, once the tone mapping curve corresponding to each image is determined, tone mapping can be performed on each image directly in the YUV domain.
In practice, a tone mapping ratio is determined based on the tone mapping curve corresponding to each of the at least two images, and each image is tone mapped in the YUV domain based on that ratio.
On this basis, in an embodiment, performing tone mapping processing on the YUV image corresponding to each of the at least two images by using the corresponding tone mapping curves includes:
for each of the at least two images, determining the Y, U, and V components of the corresponding YUV image; determining a tone mapping ratio from the Y component of the corresponding YUV image together with the image's tone mapping curve; adjusting the values of the Y, U, and V components of the corresponding YUV image using the tone mapping ratio; and taking the adjusted YUV image as the tone-mapped YUV image.
In practice, the tone mapping ratio can be determined using formula (10).
ratio = curve_final(t) × Y_in    (10)
where curve_final(t) represents the tone mapping curve corresponding to the current frame; Y_in represents the Y component of the current frame's YUV image; and ratio represents the tone mapping ratio.
In practice, formula (11) can be used, combined with the tone mapping ratio, to adjust the values of the Y, U, and V components of the corresponding YUV image.
Y_out = Y_in × ratio
U_out = U_in × ratio
V_out = V_in × ratio    (11)
where Y_in, U_in, and V_in represent the Y, U, and V components of the current frame's YUV image, and Y_out, U_out, and V_out represent the Y, U, and V components of the current frame's adjusted YUV image.
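Formulas (10) and (11) can be sketched as a per-pixel lookup-and-scale. Reading `curve_final(t) × Y_in` as evaluating the curve at Y_in (i.e., a lookup table indexed by the Y value) is an interpretation of the notation, not stated explicitly in the text; the same ratio then scales all three channels.

```python
import numpy as np


def tone_map_yuv(yuv, curve):
    # `yuv`: H x W x 3 uint8 array; `curve`: 256-entry ratio lookup table
    # (one ratio per possible Y value), representing curve_final(t).
    y_in = yuv[..., 0].astype(np.int32)
    ratio = np.asarray(curve, dtype=np.float64)[y_in]   # formula (10)
    out = yuv.astype(np.float64) * ratio[..., None]     # formula (11)
    return np.clip(out, 0, 255).astype(np.uint8)
```

Scaling Y, U, and V by the same ratio mirrors formula (11) as written; applying it in the YUV domain avoids the RGB round-trip mentioned below.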
Here, determining a tone mapping ratio based on the tone mapping curve corresponding to each of the at least two images and tone mapping each image in the YUV domain based on that ratio has the following advantage:
tone mapping each of the at least two images directly in the YUV domain, compared with tone mapping images in the RGB domain as in the related art, requires no YUV-to-RGB or RGB-to-YUV conversion, shortening the tone mapping processing time and achieving real-time display of the video stream, e.g., a frame rate above 30 fps.
In practice, when shooting night scenes under low illumination, e.g., a brightness below 5 lux, the brightness of different images under the same illumination can still differ greatly due to artificial lighting and other causes. Applying statistical global tone mapping to the at least two images would cause some images to be overexposed or too dark; therefore, if the ambient-light brightness corresponding to each of the at least two images is low, e.g., less than 55 lux, each image can be tone mapped based on its own histogram.
On this basis, in an embodiment, the method further includes:
for each of the at least two values, selecting two target brightness values matching the corresponding value from the brightness value set;
when no two target brightness values matching the corresponding value can be selected from the brightness value set, determining the histogram of the image corresponding to that value; using the histogram corresponding to each value as the tone mapping curve for performing tone mapping processing on each of the at least two images; and tone mapping each of the at least two images using the histogram, obtaining at least two processed images.
Here, a histogram can be used to present an image via a curve of its light-dark distribution. Failing to select two matching target brightness values from the brightness value set indicates that the ambient-light brightness is low.
In practice, image signal processing can be performed on the at least two images respectively to obtain their corresponding YUV images; the histogram of each YUV image is determined, and the histogram is used to adjust the values of the Y component of each YUV image, obtaining the tone-mapped YUV image.
Here, when no two target brightness values matching the corresponding value can be selected from the brightness value set, tone mapping the image based on the histogram of the image corresponding to that value has the following advantage:
using the histogram characteristics of an image for local tone mapping, compared with the global tone mapping of the related art, still achieves a good tone mapping effect when the shooting scene is dark, and gives two adjacent images with a large ambient-light brightness difference similar tone mapping effects, so that the overall video stream is displayed smoothly and naturally.
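A minimal sketch of the histogram-based fallback on the Y channel follows. The text does not fix the exact histogram-to-curve mapping, so plain histogram equalization via the cumulative distribution is assumed here.

```python
import numpy as np


def histogram_tone_map_y(y_channel):
    # Build the luma histogram, turn its cumulative distribution into a
    # 0-255 lookup table, and remap Y through it (an assumed equalization
    # scheme; U and V are left untouched in this sketch).
    hist = np.bincount(y_channel.ravel(), minlength=256)
    cdf = np.cumsum(hist).astype(np.float64)
    cdf /= cdf[-1]
    lut = np.round(cdf * 255.0).astype(np.uint8)
    return lut[y_channel]
```

Because the mapping is driven by each frame's own light-dark distribution, dark frames get their black areas stretched back toward the full grayscale range, matching the contrast-restoration effect described in the text.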
An example is given to describe the process of performing tone mapping processing on each of at least two images.
As shown in FIG. 6, the process of performing tone mapping processing on each of at least two images includes:
Step 1: obtain at least two YUV images and their respectively corresponding attribute information.
Here, the attribute information of each YUV image may be: an exposure time of 30 ms, an aperture value of 1.75, and a sensitivity of 6400.
Here, the at least two YUV images are obtained by performing image signal processing on the at least two acquired images.
Step 2: determine the ambient-light brightness corresponding to each YUV image based on its attribute information.
Here, the ambient-light brightness can also be expressed as an illuminance value. Assume the ambient-light brightness corresponding to the current YUV frame is 5 lux.
Step 3: determine the corresponding tone mapping curve based on the ambient-light brightness corresponding to each YUV image.
Step 4: perform tone mapping processing on the three channels of each YUV image using its corresponding tone mapping curve.
Here, tone mapping the three channels of a YUV image based on its tone mapping curve yields the YUV image shown in FIG. 7.
Here, if the ambient-light brightness corresponding to a YUV image is below a threshold, e.g., 5 lux, the YUV image is tone mapped based on its histogram, yielding the YUV image shown in FIG. 8. In low-illumination scenes, tone mapping a YUV image based on its histogram avoids overexposure caused by an excessive tone mapping ratio and restores black areas of the YUV image to their original grayscale, improving the contrast of the YUV image and giving the brightness of the overall video stream a smooth and natural display.
By adopting the technical solution of the embodiments of this application, the attribute information of each of the at least two images can be used to accurately predict the ambient-light brightness corresponding to each image; in this way, the tone mapping curve corresponding to each image can be determined based on that brightness, and each image can be tone mapped with the determined curve, finally achieving a smooth and natural effect for the overall video stream; in particular, discontinuous and unnatural artifacts in the overall video stream can still be avoided even when the illumination of the shooting scene changes.
To implement the image processing method of the embodiments of this application, an embodiment of this application also provides an image processing apparatus. FIG. 9 is a schematic diagram of the composition of the image processing apparatus of the embodiment of this application; as shown in FIG. 9, the apparatus includes:
an acquiring unit 91 configured to acquire video data to be processed, the video data to be processed including at least two images and attribute information respectively corresponding to the at least two images;
a first processing unit 92 configured to use the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, where each value represents the brightness of the ambient light when the corresponding image was taken;
a second processing unit 93 configured to determine, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
a third processing unit 94 configured to perform tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
In an embodiment, the second processing unit 93 is specifically configured to:
for each of the at least two values, select two target brightness values matching the corresponding value from a set of brightness values; determine the initial tone mapping curves respectively corresponding to the two target brightness values matched to each value; and generate, based on the initial tone mapping curves, the tone mapping curves respectively corresponding to the at least two images.
In an embodiment, the second processing unit 93 is specifically configured to:
obtain a first curve, where the first curve represents the correspondence between input brightness values and output brightness values; for the two target brightness values matched to each of the at least two values, determine the second curve and the first function corresponding to each of the two target brightness values; adjust the first curve using the second curve and the first function to obtain an adjusted first curve; and use the adjusted first curve as the initial tone mapping curve respectively corresponding to the two target brightness values.
In an embodiment, the second processing unit 93 is specifically configured to:
compute the difference between the two target brightness values corresponding to each of the at least two values to obtain a first difference; compute the difference between each value and the smaller of its two corresponding target brightness values to obtain a second difference; take the quotient of the first difference and the second difference corresponding to each value to obtain a first ratio corresponding to each value; and use the first ratio, together with the initial tone mapping curves respectively corresponding to the two target brightness values, to generate the tone mapping curves respectively corresponding to the at least two images.
In an embodiment, the apparatus further includes:
an update unit configured to compute the difference between the values respectively corresponding to two adjacent images among the at least two images to obtain a third difference; determine whether the third difference is greater than or equal to a threshold; when the third difference is greater than or equal to the threshold, update the tone mapping curve corresponding to the first of the two adjacent images based on the tone mapping curves respectively corresponding to the two adjacent images; the two adjacent images include a first image and a second image, and the first image was captured earlier than the second image.
In an embodiment, the third processing unit 94 is specifically configured to:
for each of the at least two values, select two target brightness values matching the corresponding value from the brightness value set; when no two target brightness values matching the corresponding value can be selected from the brightness value set, determine the histogram of the image corresponding to that value; use the histogram corresponding to each value as the tone mapping curve for performing tone mapping processing on each of the at least two images; and tone map each of the at least two images using the histogram, obtaining at least two processed images.
In an embodiment, the third processing unit 94 is specifically configured to:
perform image signal processing on the at least two images respectively to obtain the YUV images respectively corresponding to the at least two images;
perform tone mapping processing on the YUV image corresponding to each of the at least two images by using the tone mapping curves respectively corresponding to the at least two images.
In an embodiment, the third processing unit 94 is specifically configured to:
for each of the at least two images, determine the Y, U, and V components of the corresponding YUV image; determine a tone mapping ratio from the Y component of the corresponding YUV image together with the image's tone mapping curve; adjust the values of the Y, U, and V components of the corresponding YUV image using the tone mapping ratio; and take the adjusted YUV image as the tone-mapped YUV image.
In practice, the acquiring unit 91 can be implemented by a communication interface in the apparatus; the first processing unit 92, the second processing unit 93, and the third processing unit 94 can be implemented by a processor in the apparatus; the processor can be a central processing unit (CPU), a digital signal processor (DSP), a microcontroller unit (MCU), or a field-programmable gate array (FPGA).
It should be noted that, when the terminal provided in the above embodiment performs image processing, the division into the above program modules is used only as an example for illustration; in practice, the above processing can be allocated to different program modules as needed, i.e., the internal structure of the apparatus can be divided into different program modules to complete all or part of the processing described above. In addition, the apparatus provided in the above embodiment and the image processing method embodiment belong to the same concept; the specific implementation process is detailed in the method embodiment and will not be repeated here.
Based on the hardware implementation of the above apparatus, an embodiment of this application also provides an electronic device. FIG. 10 is a schematic diagram of the hardware composition of the electronic device of the embodiment of this application. As shown in FIG. 10, the electronic device 100 includes a memory 103, a processor 102, and a computer program stored on the memory 103 and executable on the processor 102; the processor 102 implements the method provided by one or more of the above technical solutions when executing the program.
It should be noted that the specific steps implemented when the processor 102 executes the program have been detailed above and will not be repeated here.
It can be understood that the electronic device 100 also includes a communication interface 101 used to exchange information with other devices; meanwhile, the components of the electronic device 100 are coupled together through a bus system 104. It can be understood that the bus system 104 is configured to implement connection and communication between these components. In addition to a data bus, the bus system 104 includes a power bus, a control bus, and a status signal bus.
It can be understood that the memory 103 in this embodiment can be a volatile memory or a non-volatile memory, and can also include both volatile and non-volatile memory. The non-volatile memory can be a read-only memory (ROM), a programmable read-only memory (PROM), an erasable programmable read-only memory (EPROM), an electrically erasable programmable read-only memory (EEPROM), a ferromagnetic random access memory (FRAM), a flash memory, a magnetic surface memory, an optical disc, or a compact disc read-only memory (CD-ROM); the magnetic surface memory can be magnetic disk storage or tape storage. The volatile memory can be a random access memory (RAM) used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), synchronous static RAM (SSRAM), dynamic RAM (DRAM), synchronous dynamic RAM (SDRAM), double data rate synchronous dynamic RAM (DDR SDRAM), enhanced synchronous dynamic RAM (ESDRAM), SyncLink dynamic RAM (SLDRAM), and direct Rambus RAM (DRRAM). The memories described in the embodiments of this application are intended to include, without being limited to, these and any other suitable types of memory.
The methods disclosed in the above embodiments of this application can be applied to, or implemented by, the processor 102. The processor 102 may be an integrated circuit chip with signal processing capability. During implementation, each step of the above methods can be completed by an integrated logic circuit of hardware in the processor 102 or by instructions in software form. The above processor 102 can be a general-purpose processor, a DSP, another programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, etc. The processor 102 can implement or execute the methods, steps, and logical block diagrams disclosed in the embodiments of this application. A general-purpose processor can be a microprocessor or any conventional processor. The steps of the methods disclosed in the embodiments of this application can be directly embodied as executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module can be located in a storage medium, the storage medium is located in the memory, and the processor 102 reads the information in the memory and completes the steps of the above methods in combination with its hardware.
An embodiment of this application also provides a storage medium, specifically a computer storage medium, and more specifically a computer-readable storage medium, on which computer instructions, i.e., a computer program, are stored; when the computer instructions are executed by a processor, the method provided by one or more of the above technical solutions is implemented.
In the several embodiments provided in this application, it should be understood that the disclosed method and smart device can be implemented in other ways. The device embodiments described above are merely illustrative; for example, the division of units is only a logical function division, and there can be other divisions in actual implementation, e.g., multiple units or components can be combined or integrated into another system, or some features can be ignored or not executed. In addition, the coupling, direct coupling, or communication connection between the components shown or discussed can be indirect coupling or communication connection through some interfaces, devices, or units, and can be electrical, mechanical, or in other forms.
The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units, i.e., they can be located in one place or distributed over multiple network units; some or all of the units can be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
In addition, the functional units in the embodiments of this application can all be integrated into one processing unit, each unit can serve individually as a unit, or two or more units can be integrated into one unit; the integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
Those of ordinary skill in the art can understand that all or part of the steps of the above method embodiments can be completed by hardware related to program instructions; the foregoing program can be stored in a computer-readable storage medium, and when executed, the program performs the steps of the above method embodiments; the foregoing storage medium includes various media that can store program code, such as a removable storage device, ROM, RAM, a magnetic disk, or an optical disc.
Alternatively, if the above integrated unit of this application is implemented in the form of a software functional module and sold or used as an independent product, it can also be stored in a computer-readable storage medium. Based on this understanding, the technical solutions of the embodiments of this application, in essence or in the part contributing to the prior art, can be embodied in the form of a software product; the computer software product is stored in a storage medium and includes several instructions for causing a computer device (which can be a personal computer, a server, a network device, etc.) to execute all or part of the methods described in the embodiments of this application. The foregoing storage medium includes various media that can store program code, such as a removable storage device, ROM, RAM, a magnetic disk, or an optical disc.
It should be noted that "first", "second", etc. are used to distinguish similar objects and are not necessarily used to describe a specific order or sequence.
In addition, the technical solutions described in the embodiments of this application can be combined arbitrarily as long as there is no conflict.
The above are only specific implementations of this application, but the protection scope of this application is not limited to them; any person skilled in the art can easily conceive of changes or substitutions within the technical scope disclosed by this application, and these should all be covered by the protection scope of this application.

Claims (11)

  1. An image processing method, comprising:
    acquiring video data to be processed, the video data to be processed comprising at least two images and attribute information respectively corresponding to the at least two images;
    using the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, wherein each value represents the brightness of the ambient light when the corresponding image was taken;
    determining, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
    performing tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
  2. The method according to claim 1, wherein determining the tone mapping curves respectively corresponding to the at least two images based on the at least two values comprises:
    for each of the at least two values, selecting two target brightness values matching the corresponding value from a set of brightness values;
    determining initial tone mapping curves respectively corresponding to the two target brightness values matched to each of the at least two values;
    generating, based on the initial tone mapping curves, the tone mapping curves respectively corresponding to the at least two images.
  3. The method according to claim 2, wherein determining the initial tone mapping curves respectively corresponding to the two target brightness values matched to each of the at least two values comprises:
    obtaining a first curve, the first curve representing the correspondence between input brightness values and output brightness values;
    for the two target brightness values matched to each of the at least two values, determining a second curve and a first function corresponding to each of the two target brightness values;
    adjusting the first curve using the second curve and the first function to obtain an adjusted first curve; using the adjusted first curve as the initial tone mapping curve respectively corresponding to the two target brightness values.
  4. The method according to claim 2 or 3, wherein generating the tone mapping curves respectively corresponding to the at least two images based on the initial tone mapping curves comprises:
    computing the difference between the two target brightness values corresponding to each of the at least two values to obtain a first difference;
    computing the difference between each of the at least two values and the smaller of its two corresponding target brightness values to obtain a second difference;
    taking the quotient of the first difference and the second difference corresponding to each value to obtain a first ratio corresponding to each value;
    using the first ratio and the initial tone mapping curves respectively corresponding to the two target brightness values to generate the tone mapping curves respectively corresponding to the at least two images.
  5. The method according to claim 4, further comprising:
    computing the difference between the values respectively corresponding to two adjacent images among the at least two images to obtain a third difference;
    when the third difference is determined to be greater than or equal to a threshold, updating the tone mapping curve corresponding to the first of the two adjacent images based on the tone mapping curves respectively corresponding to the two adjacent images; the two adjacent images comprising a first image and a second image, the first image being captured earlier than the second image.
  6. The method according to claim 1, wherein determining the tone mapping curves respectively corresponding to the at least two images based on the at least two values comprises:
    for each of the at least two values, selecting two target brightness values matching the corresponding value from a set of brightness values;
    when no two target brightness values matching the corresponding value can be selected from the set of brightness values, determining the histogram of the image corresponding to that value;
    using the histogram corresponding to each value as the tone mapping curve for performing tone mapping processing on each of the at least two images.
  7. The method according to claim 1, wherein performing tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images comprises:
    performing image signal processing on the at least two images respectively to obtain YUV images respectively corresponding to the at least two images;
    performing tone mapping processing on the YUV image corresponding to each of the at least two images by using the tone mapping curves respectively corresponding to the at least two images.
  8. The method according to claim 7, wherein performing tone mapping processing on the YUV image corresponding to each of the at least two images by using the tone mapping curves respectively corresponding to the at least two images comprises:
    for each of the at least two images, determining the Y, U, and V components of the corresponding YUV image;
    determining a tone mapping ratio from the Y component of the corresponding YUV image together with the tone mapping curve corresponding to that image;
    adjusting the values of the Y, U, and V components of the corresponding YUV image using the tone mapping ratio;
    taking the adjusted YUV image as the tone-mapped YUV image.
  9. An image processing apparatus, comprising:
    an acquiring unit configured to acquire video data to be processed, the video data to be processed comprising at least two images and attribute information respectively corresponding to the at least two images;
    a first processing unit configured to use the attribute information respectively corresponding to the at least two images to determine values respectively corresponding to the at least two images, obtaining at least two values, wherein each value represents the brightness of the ambient light when the corresponding image was taken;
    a second processing unit configured to determine, based on the at least two values, tone mapping curves respectively corresponding to the at least two images;
    a third processing unit configured to perform tone mapping processing on the at least two images respectively by using the tone mapping curves respectively corresponding to the at least two images, obtaining at least two processed images.
  10. An electronic device, comprising a processor and a memory for storing a computer program executable on the processor,
    wherein the processor is configured to perform the steps of the method of any one of claims 1 to 8 when running the computer program.
  11. A storage medium on which a computer program is stored, wherein the computer program implements the steps of the method of any one of claims 1 to 8 when executed by a processor.
PCT/CN2020/126521 2020-01-19 2020-11-04 Image processing method and apparatus, electronic device, and storage medium WO2021143300A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010059813.3A 2020-01-19 2020-01-19 Image processing method and apparatus, electronic device, and storage medium
CN202010059813.3 2020-01-19

Publications (1)

Publication Number Publication Date
WO2021143300A1 true WO2021143300A1 (zh) 2021-07-22

Family

ID=71029135

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/126521 WO2021143300A1 (zh) 2020-01-19 2020-11-04 Image processing method and apparatus, electronic device, and storage medium

Country Status (2)

Country Link
CN (1) CN111294575B (zh)
WO (1) WO2021143300A1 (zh)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111294575B (zh) * 2020-01-19 2022-06-28 Oppo广东移动通信有限公司 Image processing method and apparatus, electronic device, and storage medium
CN112150392B (zh) * 2020-09-30 2024-03-19 普联技术有限公司 Low-illumination image restoration method and device
WO2022077353A1 (en) * 2020-10-15 2022-04-21 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Method and apparatus for tone mapping, and computer usable medium storing software for implementing the method
CN116508326A (zh) * 2020-11-06 2023-07-28 Oppo广东移动通信有限公司 Tone mapping method and device, and computer-usable medium storing software for implementing the method
CN112565636B (zh) * 2020-12-01 2023-11-21 影石创新科技股份有限公司 Image processing method, apparatus, device, and storage medium
CN114422767A (zh) * 2022-01-29 2022-04-29 Oppo广东移动通信有限公司 Video effect enhancement method and apparatus, mobile terminal, and storage medium

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20060033749A1 (en) * 2004-03-05 2006-02-16 Matsushita Electric Industrial Co., Ltd. Image signal processing method, image signal processing apparatus, and image displaying apparatus
CN1992842A (zh) * 2005-12-30 2007-07-04 联发科技股份有限公司 Brightness compensation device and method
CN104519281A (zh) * 2014-12-05 2015-04-15 深圳市先河系统技术有限公司 Image processing method and processing device
CN105118433A (zh) * 2015-09-07 2015-12-02 西安诺瓦电子科技有限公司 Display screen display optimization method
CN105632447A (zh) * 2016-03-31 2016-06-01 青岛海信移动通信技术股份有限公司 Display brightness adjustment method and device for a liquid crystal display screen
CN111294575A (zh) * 2020-01-19 2020-06-16 Oppo广东移动通信有限公司 Image processing method and apparatus, electronic device, and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8363131B2 (en) * 2009-01-15 2013-01-29 Aptina Imaging Corporation Apparatus and method for local contrast enhanced tone mapping
JP5726914B2 (ja) * 2010-02-19 2015-06-03 Thomson Licensing Parameter interpolation for high dynamic range video tone mapping
US10504487B2 (en) * 2016-10-14 2019-12-10 Apple Inc. Ambient and content adaptive pixel manipulation
US10755392B2 (en) * 2017-07-13 2020-08-25 Mediatek Inc. High-dynamic-range video tone mapping
CN107481696A (zh) * 2017-09-27 2017-12-15 天津汇讯视通科技有限公司 Gamma-correlation-based adaptive image processing method
CN108090879B (zh) * 2017-12-12 2020-11-10 上海顺久电子科技有限公司 Method and display device for processing an input high dynamic range image
CN108198155B (zh) * 2017-12-27 2021-11-23 合肥君正科技有限公司 Adaptive tone mapping method and system


Also Published As

Publication number Publication date
CN111294575B (zh) 2022-06-28
CN111294575A (zh) 2020-06-16

Similar Documents

Publication Publication Date Title
WO2021143300A1 (zh) Image processing method and apparatus, electronic device, and storage medium
US11403740B2 (en) Method and apparatus for image capturing and processing
CN108335279B (zh) Image fusion and HDR imaging
CN112565636B (zh) Image processing method, apparatus, device, and storage medium
CN110033418B (zh) Image processing method and apparatus, storage medium, and electronic device
WO2019019904A1 (zh) White balance processing method, apparatus, and terminal
TW202022799A (zh) Photometric compensation method and related surveillance camera equipment
JP2012165069A (ja) Image processing apparatus and method
CN112907497B (zh) Image fusion method and image fusion apparatus
US11445127B2 (en) Leveraging HDR sensors for handling mixed illumination auto white balance
CN107682611B (zh) Focusing method, apparatus, computer-readable storage medium, and electronic device
CN110706162A (zh) Image processing method, apparatus, and computer storage medium
CN114429476A (zh) Image processing method, apparatus, computer device, and storage medium
TW202004662A (zh) Image enhancement method and computer system
Yang et al. Mantissa-exponent-based tone mapping for wide dynamic range image sensors
CN116645527A (zh) Image recognition method, system, electronic device, and storage medium
WO2022067761A1 (zh) Image processing method and apparatus, photographing device, movable platform, and computer-readable storage medium
CN114283100A (zh) High dynamic range image synthesis and tone mapping method, and electronic device
CN112488933A (zh) Video detail enhancement method and apparatus, mobile terminal, and storage medium
KR20130138435A (ko) Method and apparatus for improving image quality through brightness contrast enhancement
CN112133101B (zh) Method, apparatus, camera device, computing device, and storage medium for enhancing a license plate region
WO2018010026A1 (en) Method of presenting wide dynamic range images and a system employing same
CN115103129A (zh) Method and apparatus for determining shooting parameters, electronic device, and computer storage medium
CN118014915A (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium
CN118071658A (zh) Image processing method and apparatus, electronic device, and computer-readable storage medium

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20913966

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20913966

Country of ref document: EP

Kind code of ref document: A1