CN113822819A - HDR scene detection method and device, terminal and readable storage medium


Info

Publication number: CN113822819A (application number CN202111203958.7A; also published as CN113822819B)
Authority: CN (China)
Prior art keywords: image, pixel, area, exposure, difference
Legal status: Granted; active
Inventor: 邹涵江
Current assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Other languages: Chinese (zh)

Classifications

    • G06T 5/73 - Image enhancement or restoration; Deblurring; Sharpening
    • G06F 16/53 - Information retrieval of still image data; Querying
    • H04N 23/741 - Camera circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors
    • H04N 25/58 - Control of the dynamic range of solid-state image sensors involving two or more exposures
    • G06T 2207/20208 - High dynamic range [HDR] image processing

Abstract

The application discloses an HDR scene detection method, an HDR scene detection device, a terminal and a non-volatile computer-readable storage medium. The HDR scene detection method comprises the following steps: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing the calibrated pixel difference values of different pixel intensities under different exposure duration differences; calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of the overexposed area and the area of the too-dark area in the difference region; and determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and a preset weight. Compared with acquiring the overexposed and too-dark areas of the image directly through its gray-level changes, the method and the device can eliminate the influence of naturally black and/or white objects in the image, thereby improving the accuracy of HDR scene detection.

Description

HDR scene detection method and device, terminal and readable storage medium
Technical Field
The present application relates to the field of image technologies, and in particular, to an HDR scene detection method, an HDR scene detection apparatus, a terminal, and a non-volatile computer-readable storage medium.
Background
At present, users' expectations for mobile phone photography are higher and higher, but mobile phone imaging is limited by the size of its optics: if a scene contains strong backlight or high-contrast illumination, the captured image may lose a large amount of detail. The usual remedy is to increase the dynamic range of the image through HDR (High Dynamic Range) processing. However, a non-professional user does not know when HDR processing is needed. If the current shooting scene is not an HDR scene, that is, it is not suitable for HDR processing, performing multiple exposures and high-dynamic-range algorithm processing on the same scene needlessly increases computation cost and time; if the current shooting scene is an HDR scene, that is, it is suitable for HDR processing, skipping the HDR processing loses imaging information. It is therefore very important to detect HDR scenes automatically and accurately: an erroneous detection result not only fails to improve image quality but also degrades the user experience.
Disclosure of Invention
The embodiment of the application provides an HDR scene detection method, an HDR scene detection device, a terminal and a non-volatile computer readable storage medium.
The embodiment of the application provides an HDR scene detection method. The HDR scene detection method comprises the following steps: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibrated pixel difference values of different pixel intensities under different exposure duration differences; calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of the overexposed area and the area of the too-dark area in the difference region; and determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and a preset weight.
The embodiment of the application provides an HDR scene detection device. The HDR scene detection device comprises an acquisition module, a calculation module and a determination module. The acquisition module is used for acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibrated pixel difference values of different pixel intensities under different exposure duration differences; the calculation module is used for calculating the information entropy of the first image and the brightness variance of the first image, and for calculating the area of the overexposed area and the area of the too-dark area in the difference region; the determination module is used for determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and a preset weight.
The terminal of the embodiments of the present application includes one or more processors, memory, and one or more programs. The one or more programs are stored in the memory and executed by the one or more processors, the programs including instructions for performing the HDR scene detection method of embodiments of the present application. The HDR scene detection method comprises the following steps: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibrated pixel difference values of different pixel intensities under different exposure duration differences; calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of the overexposed area and the area of the too-dark area in the difference region; and determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and a preset weight.
A non-transitory computer-readable storage medium of an embodiment of the present application contains a computer program that, when executed by one or more processors, causes the processors to perform the following HDR scene detection method: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibrated pixel difference values of different pixel intensities under different exposure duration differences; calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of the overexposed area and the area of the too-dark area in the difference region; and determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and a preset weight.
The HDR scene detection method, the HDR scene detection device, the terminal and the non-volatile computer-readable storage medium obtain the difference region according to two frames of images exposed with different exposure durations and a preset lookup table, and determine whether the current scene is an HDR scene according to the information entropy and brightness variance of the image with the longer exposure duration, the area of the overexposed area in the difference region, and the area of the too-dark area in the difference region. Compared with obtaining the overexposed and too-dark areas of the image directly through its gray-level changes, the method can eliminate the influence of naturally black and/or white objects in the image, which is favorable for improving the accuracy of HDR scene detection.
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the present application.
Drawings
The above and/or additional aspects and advantages of the present application will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic flow diagram of an HDR scene detection method in some embodiments of the present application;
fig. 2 is a schematic structural diagram of an HDR scene detection apparatus according to some embodiments of the present application;
FIG. 3 is a schematic block diagram of a terminal according to some embodiments of the present application;
FIG. 4 is a schematic flow chart diagram of an HDR scene detection method in some embodiments of the present application;
FIG. 5 is a schematic diagram of acquiring a first image and a second image in an HDR scene detection method according to some embodiments of the present disclosure;
FIG. 6 is a flow diagram illustrating an HDR scene detection method according to some embodiments of the present disclosure;
FIG. 7 is a schematic diagram of acquiring a first image and a second image in an HDR scene detection method according to some embodiments of the present disclosure;
FIG. 8 is a flow diagram illustrating an HDR scene detection method according to some embodiments of the present disclosure;
FIG. 9 is a schematic diagram of obtaining a disparity region in an HDR scene detection method according to some embodiments of the present application;
FIG. 10 is a flow diagram illustrating an HDR scene detection method according to some embodiments of the present application;
FIG. 11 is a schematic diagram of obtaining a gray scale image in an HDR scene detection method according to some embodiments of the present application;
fig. 12 to 13 are schematic flow diagrams of HDR scene detection methods in some embodiments of the present application;
fig. 14 is a schematic diagram of acquiring an overexposed area and an overexposed area in a difference area in an HDR scene detection method according to some embodiments of the present disclosure;
fig. 15 to 16 are schematic flow diagrams of HDR scene detection methods in some embodiments of the present application;
FIG. 17 is a schematic diagram of a long exposure image and a short exposure image in an HDR scene detection method according to some embodiments of the present application;
FIG. 18 is a flow diagram illustrating an HDR scene detection method in some embodiments of the present application;
FIG. 19 is a schematic diagram of a connection between a non-volatile computer readable storage medium and a processor according to some embodiments of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below by referring to the drawings are exemplary only for the purpose of explaining the embodiments of the present application, and are not to be construed as limiting the embodiments of the present application.
The "dynamic range" is used to describe the light amount intensity distribution range from the darkest shaded portion to the brightest highlight portion in the screen. In photographing/photography, there are two concepts of "dynamic range of scene" and "dynamic range of camera", where "dynamic range of scene" refers to the range or ratio of maximum brightness and minimum brightness in a photographed scene, that is, the difference between the brightest area and the darkest area in a picture; and "dynamic range of the camera" refers to the range of brightness variation that is acceptable for the light sensing element. A High Dynamic Range (HDR) scene, i.e., a scene in which the Dynamic Range of the scene is larger than the Dynamic Range of the camera, has too bright or too dark regions beyond the Range that can be recorded by the photosensitive elements, and shows that a completely white (highlight overflow becomes completely white) or completely black (shadow region becomes completely black) region appears in the shot picture, and the image quality is greatly reduced due to lack of details of bright or dark portions. For such scenes, the imaging quality can currently be improved by applying HDR algorithmic processing. It is therefore first to be solved whether the shot scene is an HDR scene. The current HDR detection algorithm has a low accuracy in determining whether a current scene is suitable for taking a picture using a high dynamic range imaging mode (i.e., HDR mode), and particularly, when there is a natural black or natural white object in the scene, the image brightness is easily analyzed and erroneously determined as an underexposure region and an overexposure region, so that the scene dynamic determination is inaccurate.
In order to solve the above problem, referring to fig. 1, an embodiment of the present invention provides an HDR (High Dynamic Range) scene detection method, where the HDR scene detection method includes:
01: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing the calibrated pixel difference values of different pixel intensities under different exposure duration differences;
02: calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of an overexposure area and the area of an overexposure area in the difference area; and
03: and determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposure area and the preset weight.
Referring to fig. 2, the present embodiment further provides an HDR scene detection apparatus 100, where the HDR scene detection apparatus 100 includes an obtaining module 10, a calculating module 20, and a determining module 30. Step 01 of the HDR scene detection method may be implemented by the obtaining module 10, step 02 may be implemented by the calculating module 20, and step 03 may be implemented by the determining module 30. That is, the obtaining module 10 may be configured to obtain the difference region according to a preset lookup table, a first image including the current scene, and a second image, where the exposure duration of the first image is longer than that of the second image, and the lookup table is used to represent calibrated pixel difference values of different pixel intensities under different exposure duration differences; the calculating module 20 may be configured to calculate the information entropy of the first image and the brightness variance of the first image, and to calculate the area of the overexposed area and the area of the too-dark area in the difference region; the determining module 30 may be configured to determine whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area, and the preset weight.
Referring to fig. 3, the present embodiment further provides a terminal 1000, where the terminal 1000 includes one or more processors 200, a memory 300, and one or more programs. The one or more programs are stored in the memory 300 and executed by the one or more processors 200, the programs including instructions for performing the HDR scene detection method of embodiments of the present application. That is, when the one or more processors 200 execute the programs, the processors 200 may implement the methods in step 01, step 02, and step 03. That is, the one or more processors 200 are configured to: acquire a difference region according to a preset lookup table, a first image containing the current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing the calibrated pixel difference values of different pixel intensities under different exposure duration differences; calculate the information entropy of the first image and the brightness variance of the first image; calculate the area of the overexposed area and the area of the too-dark area in the difference region; and determine whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposed area, the area of the too-dark area and the preset weight.
Specifically, terminal 1000 can include, but is not limited to, a mobile phone, a laptop, a smart television, a tablet, a smart watch, a head-mounted display device, a drone, a digital camera, a digital camcorder, or a computer. The HDR scene detection apparatus 100 may be an integration of functional modules integrated in the terminal 1000. The present application is described only by taking the terminal 1000 as a mobile phone as an example, and the situation when the terminal 1000 is another type of device is similar to a mobile phone, and will not be described in detail.
In the HDR scene detection method, the HDR scene detection apparatus 100, and the terminal 1000 of the present application, the difference region is obtained according to two frames of images exposed with different exposure durations and a preset lookup table, and then whether the current scene is an HDR scene is determined according to the information entropy and brightness variance of the image with the longer exposure duration, the area of the overexposed area in the difference region, and the area of the too-dark area in the difference region. Compared with obtaining the overexposed and too-dark areas of the image directly through its gray-level changes, this can eliminate the influence of naturally black and/or white objects in the image, which is favorable for improving the accuracy of HDR scene detection.
In some embodiments, referring to fig. 2 and fig. 5, the HDR scene detection apparatus further includes an imaging module 40, where the imaging module 40 includes a pixel array 41, and the pixel array 41 includes a plurality of photosensitive pixels 411 arranged in a two-dimensional array. The photosensitive pixels 411 can generate pixel information after exposure, thereby obtaining an image. The imaging module 40 captures two images of the same scene (current scene) with different exposure durations, namely a first image and a second image. Wherein the exposure duration of the first image is longer than the exposure duration of the second image.
For example, referring to fig. 4 and fig. 5, in some embodiments, the HDR detection method further includes:
041: the pixel array 41 is controlled to be exposed for a first exposure duration to obtain a first image, and the pixel array 41 is controlled to be exposed for a second exposure duration to obtain a second image, the first exposure duration being longer than the second exposure duration.
Referring to fig. 2, in some embodiments, the HDR scene detection apparatus 100 further includes a control module 50, and step 041 may be implemented by the control module 50. That is, the control module 50 may be configured to control the pixel array 41 to be exposed for a first exposure duration to obtain a first image, and control the pixel array 41 to be exposed for a second exposure duration to obtain a second image, where the first exposure duration is longer than the second exposure duration.
Referring to FIG. 3, in some embodiments, step 041 may be implemented as performed by one or more processors 200. That is, the one or more processors 200 are further configured to control the pixel array 41 to be exposed for a first exposure duration to obtain a first image and to control the pixel array 41 to be exposed for a second exposure duration to obtain a second image, the first exposure duration being longer than the second exposure duration.
Specifically, the control module 50 may control the pixel array 41 to perform two exposures, so that the imaging module 40 captures the current scene to obtain two images including the current scene. For example, referring to fig. 5, in the first exposure, the pixel array 41 is exposed for a first exposure duration L to obtain a first image. At this time, the first image includes first pixel information generated after all the photosensitive pixels 411 in the pixel array 41 are exposed for the first exposure time period L. In the second exposure, the pixel array 41 is exposed for a second exposure time period S to obtain a second image, wherein the first exposure time period L is longer than the second exposure time period S. At this time, the second image includes second pixel information generated after all the photosensitive pixels 411 in the pixel array 41 are exposed for the second exposure time period S. Since all the photosensitive pixels 411 on the pixel array 41 are exposed each time, that is, the first image and the second image both include the pixel information generated after all the photosensitive pixels 411 are exposed, compared with the case where only part of the photosensitive pixels 411 are exposed each time, that is, the obtained image only includes the pixel information generated after part of the photosensitive pixels 411 are exposed, more pixel information can be obtained, which is beneficial to improving the resolution and image quality of the image.
Referring to fig. 6 and fig. 7, in some embodiments, the HDR detection method further includes:
042: controlling the pixel array to expose, at least one photosensitive pixel being exposed for a first exposure duration, at least one photosensitive pixel being exposed for a second exposure duration less than the first exposure duration; the photosensitive pixels exposed in the first exposure time period generate first pixel information to obtain a first image, and the photosensitive pixels exposed in the second exposure time period generate second pixel information to obtain a second image.
Referring to fig. 2, in some embodiments, step 042 may be implemented by control module 50. That is, the control module 50 may also be configured to control the pixel array to be exposed, with at least one photosensitive pixel being exposed for a first exposure duration and at least one photosensitive pixel being exposed for a second exposure duration that is less than the first exposure duration; the photosensitive pixels exposed in the first exposure time period generate first pixel information to obtain a first image, and the photosensitive pixels exposed in the second exposure time period generate second pixel information to obtain a second image.
Referring to fig. 3, in some embodiments, step 042 may also be implemented by one or more processors 200. That is, the one or more processors 200 are also configured to control the pixel array exposure, at least one light-sensitive pixel exposed for a first exposure duration, at least one light-sensitive pixel exposed for a second exposure duration that is less than the first exposure duration; the photosensitive pixels exposed in the first exposure time period generate first pixel information to obtain a first image, and the photosensitive pixels exposed in the second exposure time period generate second pixel information to obtain a second image.
Specifically, the control module 50 controls the pixel array 41 to expose: at least one photosensitive pixel 411 in the pixel array 41 is exposed for the first exposure duration L, and all photosensitive pixels 411 exposed for the first exposure duration L generate first pixel information to obtain the first image; at least one photosensitive pixel 411 is exposed for the second exposure duration S, which is less than the first exposure duration L, and all photosensitive pixels 411 exposed for the second exposure duration S generate second pixel information to obtain the second image. By controlling the pixel array 41 so that some photosensitive pixels 411 are exposed for the first exposure duration L and others for the second exposure duration S, two images with different exposure durations are obtained in a single exposure. Compared with controlling the pixel array 41 to perform two separate exposures, this avoids motion smear and other artifacts caused by the time interval between acquiring two frames (the first image and the second image), which facilitates the subsequent determination of whether the current scene is an HDR scene from the first image and the second image.
For example, in some embodiments, as shown in fig. 7, the photosensitive pixels 411 exposed for the first exposure duration L (labeled L in the left diagram of fig. 7) are arranged at intervals with the photosensitive pixels 411 exposed for the second exposure duration S (labeled S in the left diagram of fig. 7). The control module 50 controls the exposure of the pixel array 41, with the photosensitive pixels 411 labeled L exposed for the first exposure duration and those labeled S exposed for the second exposure duration. The photosensitive pixels 411 exposed for the first exposure duration generate first pixel information (the pixels labeled L in the first intermediate image of fig. 7), which is arranged to form a first intermediate image; this intermediate image also contains empty pixels N (the pixels labeled N in the first intermediate image of fig. 7, with pixel value 0). All empty pixels N in the first intermediate image are then filled by interpolating between two adjacent pieces of first pixel information to obtain the first image. Similarly, the photosensitive pixels 411 exposed for the second exposure duration generate second pixel information (the pixels labeled S in the second intermediate image of fig. 7), which is arranged to form a second intermediate image that also contains empty pixels N (pixel value 0); interpolation between two adjacent pieces of second pixel information fills all empty pixels N in the second intermediate image to obtain the second image. Since the photosensitive pixels 411 with different exposure durations are arranged at intervals, the subsequent interpolation of the obtained images is straightforward.
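The de-interleaving and hole-filling described above can be sketched as follows. This is a minimal illustration, assuming a single-channel readout and a boolean mask of long-exposure positions; the function name and the simple horizontal neighbor-averaging interpolation are stand-ins for the patent's interpolation of two adjacent pixels, not its exact method:

```python
import numpy as np

def split_and_interpolate(raw, long_mask):
    """Split one interleaved-exposure readout into a long- and a short-exposure image.

    raw:       2-D array of pixel values from a single readout.
    long_mask: boolean array, True where the pixel used the long duration L,
               False where it used the short duration S.
    """
    def fill(mask):
        img = np.where(mask, raw, 0).astype(np.float32)   # intermediate image
        valid = mask.astype(np.float32)
        num = np.zeros_like(img)
        den = np.zeros_like(img)
        # average the left/right valid neighbors into each empty pixel N
        for shift in (-1, 1):
            num += np.roll(img, shift, axis=1) * np.roll(valid, shift, axis=1)
            den += np.roll(valid, shift, axis=1)
        out = img.copy()
        empty = ~mask & (den > 0)   # np.roll wraps at the borders; ignored here
        out[empty] = num[empty] / den[empty]
        return out

    first_image = fill(long_mask)     # long-exposure image
    second_image = fill(~long_mask)   # short-exposure image
    return first_image, second_image
```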
It should be noted that, in some embodiments, the exposure process of the pixel array 41 may be: (1) the photosensitive pixels 411 exposed for the first exposure duration L and those exposed for the second exposure duration S are exposed sequentially (in either order), with no overlap between the two exposure intervals; (2) the two groups are exposed sequentially (in either order), with the two exposure intervals partially overlapping; (3) the exposure interval of the photosensitive pixels 411 exposed for the second exposure duration S falls entirely within the exposure interval of the photosensitive pixels 411 exposed for the first exposure duration L. In particular, exposure method (3) shortens the overall exposure time required by the pixel array 41, which is beneficial to increasing the frame rate of the image.
After the first image and the second image are obtained, the difference region is obtained according to a preset lookup table and the first image and second image containing the current scene. Specifically, referring to fig. 1 and 8, in some embodiments, step 01: obtaining a difference region according to a preset lookup table, a first image containing a current scene and a second image, comprises:
011: acquiring an information difference between each pixel in the second image and the corresponding pixel in the first image;
012: acquiring an information amount change value corresponding to each pixel in the second image according to the second image, the information difference and the lookup table, wherein the information amount change value is the difference between the information difference corresponding to the pixel and the calibrated pixel difference value corresponding to the pixel; and
013: and acquiring a difference area according to the information quantity change value corresponding to each pixel in the second image and a preset change threshold.
Referring to fig. 2, in some embodiments, step 011, step 012 and step 013 can be implemented by the obtaining module 10. That is, the obtaining module 10 may be further configured to: obtain an information difference between each pixel in the second image and the corresponding pixel in the first image; acquire an information amount change value corresponding to each pixel in the second image according to the second image, the information difference and the lookup table, wherein the information amount change value is the difference between the information difference corresponding to the pixel and the calibrated pixel difference value corresponding to the pixel; and acquire a difference region according to the information amount change value corresponding to each pixel in the second image and a preset change threshold.
Referring to fig. 3, in some embodiments, step 011, step 012, and step 013 can also be implemented by one or more processors 200. That is, the one or more processors 200 are further configured to: obtain an information difference between each pixel in the second image and the corresponding pixel in the first image; acquire an information amount change value corresponding to each pixel in the second image according to the second image, the information difference and the lookup table, wherein the information amount change value is the difference between the information difference corresponding to the pixel and the calibrated pixel difference value corresponding to the pixel; and acquire a difference region according to the information amount change value corresponding to each pixel in the second image and a preset change threshold.
After the first image and the second image are acquired, the information difference between each pixel in the second image and the corresponding pixel in the first image is acquired. For example, in some embodiments, a difference image is obtained from the difference between the pixel value of each pixel in the first image and the pixel value of the corresponding pixel in the second image; the pixel value of each pixel in the difference image is the information difference between the pixel at the corresponding position in the second image and the corresponding pixel in the first image. For example, as shown in fig. 9, the pixel value of pixel C1 arranged in the 1st row and 1st column of the difference image is the information difference between pixel B1 arranged in the 1st row and 1st column of the second image and pixel A1 arranged in the 1st row and 1st column of the first image: the pixel value of pixel C1 in the difference image equals the pixel value of pixel A1 in the first image minus the pixel value of pixel B1 in the second image.
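In code this subtraction is one line; a sketch assuming the two frames from the previous sketch, aligned and single-channel (signed arithmetic prevents unsigned wrap-around):

```python
import numpy as np

# Each pixel of the difference image is the first-image (long-exposure) value
# minus the second-image value at the same position, e.g. C1 = A1 - B1.
difference_image = first_image.astype(np.int32) - second_image.astype(np.int32)
```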
After the information difference between each pixel in the second image and the corresponding pixel in the first image is obtained, the information amount change value corresponding to each pixel in the second image is obtained according to the second image, the information difference and the lookup table; the information amount change value is the difference between the information difference corresponding to the pixel and the calibrated pixel difference value corresponding to the pixel.
It should be noted that, when only the exposure duration is changed, with all other conditions identical and no noise interference, the intensity of a pixel (i.e., its pixel value) outside the saturated regions (the overexposed and too-dark regions) should vary linearly with the exposure duration. For example, if the exposure duration is doubled, the observed image pixel intensity should also nearly double. However, noise is inevitably introduced during digital signal conversion, which makes the relationship between the change of pixel intensity (i.e., the pixel value of the pixel) and the exposure duration nonlinear. Therefore, in order to eliminate noise interference, a lookup table may be generated in advance, the lookup table being used to characterize the calibrated pixel difference values of different pixel intensities (i.e., pixel values) under different exposure duration differences. That is, the lookup table specifies how much a pixel of a given intensity should change under a given exposure duration difference. For example, in one example, assume the lookup table specifies that, for an exposure duration difference of 0.1 s, the calibrated pixel difference corresponding to a pixel with pixel value 60 is 20. This means that when a third image and a fourth image are obtained by shooting the same scene with different exposure durations, the exposure duration of the third image being 0.1 s shorter than that of the fourth image, and a pixel with pixel value 60 lies in a non-saturated region of the third image (a region other than the overexposed and too-dark regions), then the pixel at the position in the fourth image corresponding to that pixel of the third image should have pixel value 80.
Referring to fig. 9, take the pixel B1 arranged in the 1st row and 1st column of the second image as an example; obtaining the information amount change values of pixels at other positions of the second image is similar and is not repeated here. Illustratively, the pixel value of pixel B1 is read, and the calibrated pixel difference value corresponding to that pixel value is looked up in the lookup table. After the calibrated pixel difference value corresponding to pixel B1 in the second image is acquired, the difference between the information difference between pixel B1 in the second image and pixel A1 in the first image (i.e., the pixel arranged in the 1st row and 1st column of the first image) and the calibrated pixel difference value corresponding to pixel B1 is used as the information amount change value corresponding to pixel B1. For example, assume the pixel value of pixel B1 arranged in the 1st row and 1st column of the second image is m, and the pixel value of pixel C1 arranged in the 1st row and 1st column of the difference image is p, that is, the information difference between pixel B1 in the second image and pixel A1 in the first image is p. If the calibrated pixel difference for pixel value m in the lookup table is n, the information amount change value corresponding to pixel B1 is the absolute value of the difference between p and n.
In particular, if the lookup table calibrates the pixel difference values of the same pixel intensity (i.e., the same pixel value) under different exposure duration differences, then in some embodiments the information amount change value corresponding to each pixel in the second image may also be obtained according to the second image, the information difference, the lookup table, and the difference between the exposure durations of the first image and the second image. For example, assume the difference between the exposure durations of the first image and the second image is t1, the pixel value of pixel B1 arranged in the 1st row and 1st column of the second image is m, and the pixel value of pixel C1 arranged in the 1st row and 1st column of the difference image is p, that is, the information difference between pixel B1 in the second image and pixel A1 in the first image (the pixel arranged in the 1st row and 1st column of the first image) is p. Suppose the lookup table specifies: when the exposure duration difference is t1, the calibrated pixel difference corresponding to pixel value m is n1; when the exposure duration difference is t2, the calibrated pixel difference corresponding to pixel value m is n2. The information amount change value corresponding to pixel B1 is then the absolute value of the difference between p and n1.
After the information amount change value corresponding to each pixel in the second image is obtained, the difference region is obtained according to these change values and a preset change threshold. Specifically, referring to fig. 9, in some embodiments a pixel in the second image is selected arbitrarily, its information amount change value is obtained, and that value is compared with the preset change threshold. If the change value corresponding to the pixel is greater than the preset change threshold, the information content at that position is considered to have changed significantly between the first image and the second image, beyond what the exposure duration difference alone can explain, and the position corresponding to this pixel in the first image is determined to belong to the difference region. If the change value corresponding to the pixel is smaller than the preset change threshold, the information content is considered not to have changed significantly, the change being attributable to the exposure duration difference, and the position corresponding to this pixel in the first image is determined to belong to the non-difference region. Another pixel in the second image is then selected and the above steps repeated until all pixels in the second image have been traversed, yielding the difference region. For example, as shown in fig. 9, if the change value of pixel B2 arranged in the 2nd row and 2nd column of the second image is greater than the preset threshold, it may be determined that pixel A2 arranged in the 2nd row and 2nd column of the first image is located in the difference region; if the change value of pixel B1 arranged in the 1st row and 1st column of the second image is less than the preset threshold, it may be determined that pixel A1 arranged in the 1st row and 1st column of the first image is located in the non-difference region.
It should be noted that the pixels in the second image may also be traversed sequentially in a fixed order, for example from left to right and top to bottom, comparing each pixel's information amount change value with the preset change threshold; the order is not limited here. Because every pixel in the difference region has a large information change that cannot be attributed to the exposure duration difference, obtaining the overexposed and too-dark areas of the image from the difference region, rather than directly from the gray-level changes of the image, can eliminate the influence of naturally black and/or white objects in the image, improving the accuracy of HDR scene detection.
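Continuing the sketches above, the lookup-table comparison and thresholding might look as follows. The LUT contents and the change threshold are placeholders (real values come from calibration); the linear ramp merely mirrors the 60-to-20 ratio of the worked example:

```python
import numpy as np

# lut[s]: calibrated pixel difference expected for a short-exposure pixel of
# intensity s under the calibrated exposure-duration gap (placeholder values).
lut = (np.arange(256, dtype=np.int32) * 20) // 60

expected = lut[second_image.astype(np.uint8)]   # n: calibrated difference
change = np.abs(difference_image - expected)    # |p - n| per pixel

CHANGE_THRESHOLD = 12                           # assumed preset change threshold
difference_region = change > CHANGE_THRESHOLD   # boolean mask of the difference region
```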
After the difference region is obtained, the information entropy and the brightness variance of the first image are calculated, and the area of the overexposed area and the area of the too-dark area in the difference region are calculated. Specifically, referring to fig. 1 and 10, in some embodiments, calculating the information entropy of the first image and the luminance variance of the first image includes:
021: performing gray processing on the first image to obtain a gray image; and
022: and generating a gray histogram according to the gray values of all pixels in the gray image, and acquiring the information entropy and the brightness variance according to the gray histogram.
Referring to fig. 2, in some embodiments, step 021 and step 022 can be implemented by computing module 20. That is, the calculation module 20 is further configured to perform a gray-scale processing on the first image to obtain a gray-scale image; and generating a gray histogram according to the gray values of all pixels in the gray image, and acquiring the information entropy and the brightness variance according to the gray histogram.
Referring to fig. 3, in some embodiments, step 021 and step 022 may be implemented by one or more processors 200. That is, the one or more processors 200 are also configured to perform grayscale processing on the first image to obtain a grayscale image; and generating a gray histogram according to the gray values of all pixels in the gray image, and acquiring the information entropy and the brightness variance according to the gray histogram.
Specifically, referring to fig. 11, the first image is gray-processed to obtain a gray-scale image, and then a gray-scale histogram is generated according to the gray-scale values of all pixels in the gray-scale image. For example, in one embodiment, the size of all gray values in the gray image and the number of pixels that are the same gray value may be counted to generate a gray histogram. And after the gray histogram is obtained, the information entropy of the first image and the brightness variance of the first image are obtained according to the gray histogram. Specifically, when the luminance variance of the first image is calculated, the luminance mean value of the first image is calculated through the gray level histogram, wherein the luminance mean value is recorded as
Figure BDA0003306142940000101
The first brightness variance is recorded as
Figure BDA0003306142940000102
Wherein x isiThe proportion of pixels with the gray value i in the first image is represented, namely, the gray level. The entropy of the image reflects the bit average of the gray level set in the image and describes the average information content of the image information source, and the entropy of the information of the first image can be calculated by a formula
Figure BDA0003306142940000103
And (6) calculating.
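A compact sketch of these statistics, computed from the normalized gray histogram of the first image; the Rec. 601 luma weights for the grayscale conversion are an assumed choice, as the text does not specify one:

```python
import numpy as np

def entropy_and_variance(first_image_rgb):
    """Gray image, information entropy and brightness variance of the first image."""
    # grayscale conversion (Rec. 601 luma weights, an assumption)
    gray = (0.299 * first_image_rgb[..., 0]
            + 0.587 * first_image_rgb[..., 1]
            + 0.114 * first_image_rgb[..., 2]).astype(np.uint8)
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    x = hist / hist.sum()                  # x_i: proportion of gray value i
    i = np.arange(256)
    mu = (i * x).sum()                     # brightness mean
    variance = ((i - mu) ** 2 * x).sum()   # brightness variance sigma^2
    nz = x > 0                             # skip empty bins (log2 of 0)
    entropy = -(x[nz] * np.log2(x[nz])).sum()
    return gray, entropy, variance
```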
Referring to fig. 1 and 12, in some embodiments, calculating the area of the overexposed area and the area of the too-dark area in the difference region includes:
021: performing gray processing on the first image to obtain a gray-scale image;
023: acquiring the overexposed area and the too-dark area in the difference region according to the difference region and the gray values of the pixels in the gray-scale image; and
024: calculating the area of the overexposed area and the area of the too-dark area according to the number of pixels in the overexposed area, the number of pixels in the too-dark area and the number of all pixels in the first image.
Referring to fig. 2, in some embodiments, step 021, step 023 and step 024 may be implemented by the calculating module 20. That is, the calculating module 20 is further configured to perform gray processing on the first image to obtain a gray-scale image; acquire the overexposed area and the too-dark area in the difference region according to the difference region and the gray values of the pixels in the gray-scale image; and calculate the area of the overexposed area and the area of the too-dark area according to the number of pixels in the overexposed area, the number of pixels in the too-dark area and the number of all pixels in the first image.
Referring to fig. 3, in some embodiments, step 021, step 023 and step 024 may also be implemented by one or more processors 200. That is, the one or more processors 200 are further configured to perform gray processing on the first image to obtain a gray-scale image; acquire the overexposed area and the too-dark area in the difference region according to the difference region and the gray values of the pixels in the gray-scale image; and calculate the area of the overexposed area and the area of the too-dark area according to the number of pixels in the overexposed area, the number of pixels in the too-dark area and the number of all pixels in the first image.
The first image is first converted to gray scale to obtain a gray-scale image, and the overexposed area and the too-dark area in the difference region are then obtained according to the difference region and the gray values of the pixels in the gray-scale image. For example, referring to fig. 13, in some embodiments, step 023: acquiring the overexposed area and the too-dark area in the difference region according to the difference region and the gray values of the pixels in the gray-scale image, comprises:
0231: traversing all pixels in the gray level image, comparing the gray level value of each pixel with a first gray level threshold value and a second gray level threshold value, and if the gray level value of the current pixel is greater than the first gray level threshold value and the pixel corresponding to the current pixel in the first image is located in the difference area, determining that the pixel corresponding to the current pixel in the first image is located in the overexposure area; and if the gray value of the current pixel is smaller than the second gray threshold value and the pixel corresponding to the current pixel in the first image is located in the difference area, determining that the pixel corresponding to the current pixel in the first image is located in the too-dark area.
Referring to fig. 2, in some embodiments, step 0231 may be implemented by calculation module 20. That is, the calculating module 20 is further configured to traverse all pixels in the grayscale image, and compare the grayscale value of each pixel with the first grayscale threshold and the second grayscale threshold, and if the grayscale value of the current pixel is greater than the first grayscale threshold and the pixel corresponding to the current pixel in the first image is located in the difference region, determine that the pixel corresponding to the current pixel in the first image is located in the overexposure region; and if the gray value of the current pixel is smaller than the second gray threshold value and the pixel corresponding to the current pixel in the first image is located in the difference area, determining that the pixel corresponding to the current pixel in the first image is located in the too-dark area.
Referring to fig. 3, in some embodiments, step 0231 may also be implemented by one or more processors 200. That is, the one or more processors 200 may be further configured to traverse all pixels in the grayscale image, and compare the grayscale value of each pixel with the first grayscale threshold and the second grayscale threshold, and if the grayscale value of the current pixel is greater than the first grayscale threshold and the pixel corresponding to the current pixel in the first image is located in the difference region, determine that the pixel corresponding to the current pixel in the first image is located in the overexposure region; and if the gray value of the current pixel is smaller than the second gray threshold value and the pixel corresponding to the current pixel in the first image is located in the difference area, determining that the pixel corresponding to the current pixel in the first image is located in the too-dark area.
It should be noted that the first gray threshold and the second gray threshold may be preset. When the gray value of a pixel in the gray-scale image is greater than the first gray threshold, the pixel corresponding to the current pixel in the first image may be overexposed, or a white object may exist at the corresponding position in the first image; when the gray value of the pixel in the gray-scale image is less than the second gray threshold, the pixel corresponding to the current pixel in the first image may be too dark, or a black object may exist at the corresponding position. In addition, because pixels in the difference region exhibit large information changes not caused by the exposure duration difference alone, the overexposed and too-dark areas are obtained jointly from the difference region and the gray values. This avoids mistakenly judging the area where a black object is located as a too-dark area, and/or the area where a white object is located as an overexposed area, thereby improving the accuracy of HDR scene detection.
Specifically, referring to fig. 14, in some embodiments one pixel in the gray-scale image is selected arbitrarily and its gray value obtained. The gray value of the current pixel is compared with the first gray threshold and the second gray threshold. If the gray value of the current pixel is greater than the first gray threshold and the pixel corresponding to the current pixel in the first image is located in the difference region, the corresponding pixel in the first image is determined to be overexposed, that is, located in the overexposed area of the difference region; if the gray value of the current pixel is less than the second gray threshold and the corresponding pixel in the first image is located in the difference region, the corresponding pixel is determined to be located in the too-dark area of the difference region. Another pixel in the gray-scale image is then selected and the above steps repeated until all pixels in the gray-scale image have been traversed, yielding the too-dark area and the overexposed area. For example, as shown in fig. 14: the gray value of pixel D1 arranged in the 1st row and 1st column of the gray-scale image is greater than the first gray threshold, but pixel A1 arranged in the 1st row and 1st column of the first image is in the non-difference region, so pixel A1 is determined to lie in neither the overexposed area nor the too-dark area; the gray value of pixel D2 arranged in the 2nd row and 2nd column of the gray-scale image is greater than the first gray threshold and pixel A2 arranged in the 2nd row and 2nd column of the first image is located in the difference region, so pixel A2 is determined to be located in the overexposed area; the gray value of pixel D7 arranged in the 7th row and 7th column of the gray-scale image is less than the second gray threshold and pixel A7 arranged in the 7th row and 7th column of the first image is located in the difference region, so pixel A7 is determined to be located in the too-dark area.
It should be noted that the pixels in the gray-scale image may also be traversed sequentially in a fixed order, for example from left to right and top to bottom, comparing each pixel's gray value with the first and second gray thresholds; the order is not limited here. In addition, in some embodiments, a pixel located in the difference region of the first image may be selected first, and the gray value of the corresponding pixel in the gray-scale image compared with the first and second gray thresholds: if that gray value is greater than the first gray threshold, the pixel is determined to be located in the overexposed area of the difference region; if it is less than the second gray threshold, the pixel is determined to be located in the too-dark area of the difference region. Another pixel in the difference region of the first image is then selected and the steps repeated until all pixels in the difference region have been processed. In this way the too-dark and overexposed areas are likewise obtained; compared with comparing the gray values of all pixels in the gray-scale image against the two thresholds, this embodiment reduces the amount of computation while still obtaining the overexposed and too-dark areas of the difference region, which is beneficial to increasing the speed of HDR scene detection.
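Combining the gray thresholds with the difference-region mask from the earlier sketches gives the classification, and the pixel counts feed directly into the area formulas of the next paragraph; the two numeric thresholds are assumptions, since the text leaves them as presets:

```python
# assumed preset gray thresholds; the text does not give numeric values
FIRST_GRAY_THRESHOLD = 230
SECOND_GRAY_THRESHOLD = 25

# a pixel counts as saturated only if it also lies in the difference region,
# which excludes naturally white or black objects
overexposed_area = (gray > FIRST_GRAY_THRESHOLD) & difference_region
too_dark_area = (gray < SECOND_GRAY_THRESHOLD) & difference_region

# area ratios A_o and A_u as defined by the formulas that follow
area_over = overexposed_area.sum() / gray.size
area_dark = too_dark_area.sum() / gray.size
```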
After the overexposure area and the over-dark area of the difference region are obtained, the area of the overexposure area and the area of the over-dark area are calculated according to the number of pixels in the overexposure area, the number of pixels in the over-dark area, and the number of all pixels in the first image. For example, in some embodiments, the area of the overexposure area may be calculated by the formula

A_o = N(M_over) / N(I),

and the area of the over-dark area by the formula

A_u = N(M_under) / N(I),

where A_o is the area of the overexposure area, A_u is the area of the over-dark area, N(M_over) is the number of pixels in the overexposure area, N(M_under) is the number of pixels in the over-dark area, and N(I) is the number of all pixels in the first image.
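Continuing the sketch, the two areas are then simple pixel-count ratios; this is a direct transcription of the formulas above (the function and variable names are assumptions):

```python
def region_areas(over_mask, under_mask):
    n_total = over_mask.size                 # N(I): all pixels in the first image
    a_o = over_mask.sum() / n_total          # A_o = N(M_over) / N(I)
    a_u = under_mask.sum() / n_total         # A_u = N(M_under) / N(I)
    return a_o, a_u
```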
After the information entropy of the first image, the brightness variance of the first image, the area of the overexposure area, and the area of the over-dark area are obtained, whether the current scene is an HDR scene may be determined according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight. Specifically, referring to fig. 1 and 16, in some embodiments, step 03: determining whether the current scene is an HDR scene according to the information entropy of the first image, the brightness variance of the first image, the area of the overexposure area, the area of the over-dark area, and the preset weight, includes:
031: calculating an evaluation value according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight;
032: if the evaluation value is greater than a preset threshold, determining that the current scene is an HDR scene; and if the evaluation value is less than the preset threshold, determining that the current scene is a non-HDR scene.
Referring to fig. 2, in some embodiments, step 031 and step 032 can be implemented by the determining module 30. That is, the determining module 30 is further configured to calculate the evaluation value according to the information entropy, the brightness variance, the area of the overexposure area, the area of the overly dark area, and the preset weight; if the evaluation value is larger than a preset threshold value, determining that the current scene is an HDR scene; and if the evaluation value is smaller than a preset threshold value, determining that the current scene is a non-HDR scene.
Referring to fig. 3, in some embodiments, step 031 and step 032 can also be implemented by one or more processors 200. That is, the one or more processors 200 are further configured to calculate the evaluation value according to the information entropy, the brightness variance, the area of the overexposed region, the area of the overly dark region, and the preset weight; if the evaluation value is larger than a preset threshold value, determining that the current scene is an HDR scene; and if the evaluation value is smaller than a preset threshold value, determining that the current scene is a non-HDR scene.
Specifically, in some embodiments, the evaluation value may be calculated according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight. For example, in some embodiments, the evaluation value may be calculated by the formula

HDR_score = w1 * E + w2 * σ² + w3 * A_o + w4 * A_u,

where HDR_score is the evaluation value, E is the information entropy, σ² is the brightness variance, A_o is the area of the overexposure area, A_u is the area of the over-dark area, and w1, w2, w3, and w4 are the preset weights corresponding respectively to the information entropy, the brightness variance, the area of the overexposure area, and the area of the over-dark area. Further, in some embodiments, the evaluation value may also be calculated by the formula

HDR_score = w1 * E + w2 * σ² + w3 * A_o + w4 * A_u + C,

where C is a design parameter, which may be a constant designed in advance according to requirements.
After the evaluation value is obtained, it is compared with a preset threshold. If the evaluation value is greater than the preset threshold, the current scene is suitable for HDR processing, i.e., the current scene is determined to be an HDR scene; if the evaluation value is less than the preset threshold, the current scene is not suitable for HDR processing, i.e., the current scene is determined to be a non-HDR scene. HDR scene detection is thus completed. It should be noted that if the evaluation value is equal to the preset threshold, the current scene may be determined to be either an HDR scene or a non-HDR scene; this is not limited here.
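As a worked illustration of steps 031 and 032, the evaluation and thresholding might look as follows in Python; the weight values, the constant C, and the handling of the equal-to-threshold boundary case are assumptions, since the description leaves them open:

```python
def hdr_score(entropy, variance, a_o, a_u,
              w=(1.0, 1.0, 1.0, 1.0), c=0.0):
    """Weighted evaluation value: HDR_score = w1*E + w2*var + w3*A_o + w4*A_u + C."""
    w1, w2, w3, w4 = w
    return w1 * entropy + w2 * variance + w3 * a_o + w4 * a_u + c

def is_hdr_scene(score, threshold):
    # The equal-to-threshold case may be decided either way per the
    # description; this sketch arbitrarily counts it as an HDR scene.
    return score >= threshold
```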
Referring to fig. 17, in some embodiments, the HDR scene detection method further includes:
051: acquiring a short-exposure image and a long-exposure image which contain the same calibration information, wherein the exposure duration of the long-exposure image is longer than that of the short-exposure image;
052: calculating a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the long-exposure image; and
053: generating a lookup table according to the pixel value of each pixel in the short-exposure image and the corresponding first difference value.
Referring to fig. 2, in some embodiments, the HDR scene detection apparatus 100 further includes a generating module 60. Step 051, step 052 and step 053 may be implemented by the generation module 60. That is, the generating module 60 may be configured to obtain a short-exposure image and a long-exposure image that include the same calibration information, where an exposure duration of the long-exposure image is longer than an exposure duration of the short-exposure image; calculating a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the long-exposure image; and generating a lookup table according to the pixel value of each pixel in the short-exposure image and the corresponding first difference value.
Referring to fig. 3, in some embodiments, steps 051, 052 and 053 may also be implemented by one or more processors 200. That is, the one or more processors 200 may also be configured to obtain a short-exposure image and a long-exposure image that include the same calibration information, where the exposure duration of the long-exposure image is longer than the exposure duration of the short-exposure image; calculating a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the long-exposure image; and generating a lookup table according to the pixel value of each pixel in the short-exposure image and the corresponding first difference value.
It should be noted that, in some embodiments, the calibration information may be a standard color chart. Shooting such calibration information makes the pixel values (i.e., pixel intensities) at different positions of the short-exposure image differ from one another, which helps subsequently obtain calibrated pixel differences corresponding to more distinct pixel values. In addition, the process of obtaining the lookup table (i.e., steps 051, 052, and 053) only needs to be executed once after the HDR scene detection apparatus 100 or the terminal 1000 is assembled, and does not need to be executed before each HDR scene detection.
Illustratively, a short-exposure image and a long-exposure image containing the same calibration information are acquired, the exposure duration of the long-exposure image being longer than that of the short-exposure image. A first difference value corresponding to the pixel value of each pixel in the short-exposure image is then calculated from the pixel value of that pixel and the pixel value of the corresponding pixel in the long-exposure image. For example, as shown in fig. 17, taking the pixel S1 arranged in the 1st row and 1st column of the short-exposure image as an example, the first difference value corresponding to S1 equals the pixel value of the pixel L1 in the long-exposure image (i.e., the pixel arranged in the 1st row and 1st column of the long-exposure image) minus the pixel value of S1.
After the first difference value corresponding to each pixel in the short-exposure image is obtained, the lookup table is generated according to the pixel value of each pixel in the short-exposure image and the corresponding first difference value. Specifically, in some embodiments, the pixel value of one pixel in the short-exposure image and its corresponding first difference value are obtained, the two values are associated and filled into the lookup table, and then the pixel value of the next pixel in the short-exposure image is obtained; these steps are repeated until all pixels in the short-exposure image have been selected, thus producing the lookup table. For example, if the first difference value corresponding to a pixel with pixel value m1 in the short-exposure image is n1, and the first difference value corresponding to a pixel with pixel value m2 is n2, the lookup table records that the calibrated pixel difference corresponding to pixel value m1 is n1 and the calibrated pixel difference corresponding to pixel value m2 is n2.
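A hedged sketch of steps 051 through 053 for a single long exposure; storing the mean first difference value per pixel value is an assumption, since the description does not say how pixels sharing the same value are merged:

```python
import collections
import numpy as np

def build_lookup_table(short_img, long_img):
    """Map each pixel value of the short-exposure image to its calibrated
    pixel difference (long-exposure value minus short-exposure value)."""
    first_diff = long_img.astype(np.int32) - short_img.astype(np.int32)
    buckets = collections.defaultdict(list)
    for value, diff in zip(short_img.ravel(), first_diff.ravel()):
        buckets[int(value)].append(int(diff))
    # One calibrated difference per observed pixel value (mean is assumed).
    return {value: sum(d) / len(d) for value, d in buckets.items()}
```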
Referring to fig. 19, in some embodiments, the HDR scene detection method further includes:
061: acquiring a short-exposure image and multiple frames of long-exposure images containing the same calibration information, wherein the exposure duration of each long-exposure image is longer than that of the short-exposure image, and the exposure durations of the long-exposure images differ from frame to frame;
062: selecting one frame of long-exposure image from the multiple frames of long-exposure images, calculating a second difference value between the exposure duration of the selected long-exposure image and the exposure duration of the short-exposure image, and calculating a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the selected long-exposure image; then selecting another frame of long-exposure image from the multiple frames and repeating the above steps until all the long-exposure images have been selected; and
063: generating a preset lookup table according to the pixel value of each pixel in the short-exposure image, the corresponding first difference values, and the corresponding second difference values.
Referring to fig. 2, in some embodiments, step 061, step 062, and step 063 may be implemented by the generation module 60. That is, the generating module 60 may be configured to acquire a short-exposure image and multiple frames of long-exposure images containing the same calibration information, wherein the exposure duration of each long-exposure image is longer than that of the short-exposure image and the exposure durations differ from frame to frame; select one frame of long-exposure image from the multiple frames, calculate a second difference value between the exposure duration of the selected long-exposure image and the exposure duration of the short-exposure image, and calculate a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the selected long-exposure image; select another frame of long-exposure image from the multiple frames and repeat the above steps until all the long-exposure images have been selected; and generate a preset lookup table according to the pixel value of each pixel in the short-exposure image, the corresponding first difference values, and the corresponding second difference values.
Referring to fig. 3, in some embodiments, step 061, step 062, and step 063 may also be implemented by one or more processors 200. That is, the one or more processors 200 may also be configured to acquire a short-exposure image and multiple frames of long-exposure images containing the same calibration information, wherein the exposure duration of each long-exposure image is longer than that of the short-exposure image and the exposure durations differ from frame to frame; select one frame of long-exposure image from the multiple frames, calculate a second difference value between the exposure duration of the selected long-exposure image and the exposure duration of the short-exposure image, and calculate a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the selected long-exposure image; select another frame of long-exposure image from the multiple frames and repeat the above steps until all the long-exposure images have been selected; and generate a preset lookup table according to the pixel value of each pixel in the short-exposure image, the corresponding first difference values, and the corresponding second difference values.
It should be noted that, in some embodiments, the calibration information may be a standard color chart. Shooting such calibration information makes the pixel values (i.e., pixel intensities) at different positions of the short-exposure image differ from one another, which helps subsequently obtain calibrated pixel differences corresponding to more distinct pixel values. In addition, the process of obtaining the lookup table (i.e., steps 061, 062, and 063) only needs to be executed once after the HDR scene detection apparatus 100 or the terminal 1000 is assembled, and does not need to be executed before each HDR scene detection.
In an example, a short-exposure image and multiple frames of long-exposure images containing the same calibration information are acquired, the exposure duration of each long-exposure image being longer than that of the short-exposure image and differing from frame to frame. One frame of long-exposure image is selected from the multiple frames, a second difference value between the exposure duration of the selected long-exposure image and the exposure duration of the short-exposure image is calculated, and a first difference value corresponding to the pixel value of each pixel in the short-exposure image is calculated according to the pixel value of that pixel and the pixel value of the corresponding pixel in the selected long-exposure image; another frame of long-exposure image is then selected, and the above steps are repeated until all the long-exposure images have been selected. In this way, the second difference values between the exposure duration of the short-exposure image and the exposure duration of each frame of long-exposure image are obtained, as well as the first difference values between the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in each frame of long-exposure image.
In this embodiment, the specific implementation of calculating the first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of that pixel and the pixel value of the corresponding pixel in the selected long-exposure image is the same as in the foregoing embodiment with a single long-exposure image, and is not repeated here.
After the second difference values between the exposure duration of the short-exposure image and the exposure durations of the multiple frames of long-exposure images, and the first difference values between the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in each frame of long-exposure image, are obtained, the preset lookup table is generated according to the pixel value of each pixel in the short-exposure image, the corresponding first difference values, and the corresponding second difference values. Specifically, in some embodiments, one frame of long-exposure image is selected from the multiple frames, the second difference value corresponding to the selected long-exposure image (i.e., the difference between the exposure duration of the selected long-exposure image and the exposure duration of the short-exposure image) is obtained, a pixel in the short-exposure image is selected to obtain its pixel value, and the first difference value between its pixel value and the pixel value of the corresponding pixel in the selected long-exposure image is obtained. The three values are then associated and filled into the lookup table, the pixel value of the next pixel in the short-exposure image is obtained, and these steps are repeated until all pixels in the short-exposure image have been selected, yielding the calibrated pixel differences corresponding to different pixel values (i.e., pixel intensities) under that exposure-duration difference. Another frame of long-exposure image is then selected and the above steps are repeated until all the long-exposure images have been selected, so that a lookup table containing calibrated pixel differences for different pixel values under different exposure-duration differences is obtained. For example, suppose the first difference value between a pixel with pixel value m1 in the short-exposure image and the corresponding pixel in the first long-exposure image is n1; the first difference value between the same pixel with pixel value m1 and the corresponding pixel in the second long-exposure image is n2; the second difference value between the exposure duration of the short-exposure image and that of the first long-exposure image is t1; and the second difference value between the exposure duration of the short-exposure image and that of the second long-exposure image is t2. The lookup table then records that, when the exposure-duration difference is t1, the pixel value m1 corresponds to n1; and when the exposure-duration difference is t2, the pixel value m1 corresponds to n2.
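Steps 061 through 063 extend the single-exposure calibration to multiple long exposures, keyed additionally by the second difference value (the exposure-duration difference). A sketch reusing the `build_lookup_table` helper from the earlier example (again illustrative; the nested-dictionary layout is an assumption):

```python
def build_preset_lookup_table(short_img, short_exposure, long_exposures):
    """long_exposures: iterable of (long_img, long_exposure_duration) pairs.

    Returns {second_difference: {pixel_value: calibrated_difference}}.
    """
    table = {}
    for long_img, long_exposure in long_exposures:
        second_diff = long_exposure - short_exposure   # e.g. t1, t2, ...
        table[second_diff] = build_lookup_table(short_img, long_img)
    return table
```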
Referring to fig. 1 and 19, a non-volatile computer-readable storage medium 400 containing a computer program 401 is also provided in the embodiments of the present application. The computer program 401, when executed by one or more processors 200, causes the processors 200 to perform the HDR scene detection method of steps 01, 011, 012, 013, 02, 021, 022, 023, 0231, 024, 03, 031, 032, 041, 042, 051, 052, 053, 061, 062, and 063.
For example, the computer program 401, when executed by the one or more processors 200, causes the processors 200 to perform the following method:
01: acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing the calibrated pixel difference values of different pixel intensities under different exposure duration differences;
02: calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of an overexposure area and the area of an over-dark area in the difference region; and
03: determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight.
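Putting the pieces together, the following sketches how step 01 might consult the preset lookup table to obtain the difference region. The change-threshold value, the per-pixel lookup strategy, and the default of 0 for unseen pixel values are assumptions; only the structure follows the description and claim 2:

```python
import numpy as np

def difference_region(first_img, second_img, preset_table, exposure_diff,
                      change_threshold=8):
    """first_img: long-exposure frame; second_img: short-exposure frame."""
    lut = preset_table[exposure_diff]      # pixel value -> calibrated difference
    # Information difference between each pixel pair (long minus short,
    # matching the calibration convention above).
    info_diff = first_img.astype(np.int32) - second_img.astype(np.int32)
    # Calibrated difference expected for each short-exposure pixel value.
    calibrated = np.vectorize(lambda v: lut.get(int(v), 0))(second_img)
    # Information-quantity change value, compared against the preset threshold.
    change = np.abs(info_diff - calibrated)
    return change > change_threshold       # bool mask: the difference region
```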
In the description herein, reference to the description of the terms "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example" or "some examples" or the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples. Furthermore, various embodiments or examples and features of different embodiments or examples described in this specification can be combined and combined by one skilled in the art without contradiction.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and the scope of the preferred embodiments of the present application includes other implementations in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
Although embodiments of the present application have been shown and described above, it is to be understood that the above embodiments are exemplary and not to be construed as limiting the present application, and that changes, modifications, substitutions and alterations can be made to the above embodiments by those of ordinary skill in the art within the scope of the present application.

Claims (11)

1. An HDR scene detection method, comprising:
acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibration pixel difference values of different pixel intensities under different exposure duration differences;
calculating the information entropy of the first image and the brightness variance of the first image; calculating the area of an overexposure area and the area of an over-dark area in the difference region; and
determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and a preset weight.
2. The HDR scene detection method of claim 1, wherein the obtaining the difference region according to the preset lookup table and the first and second images comprising the current scene comprises:
acquiring an information difference between each pixel in the second image and the corresponding pixel in the first image;
acquiring an information quantity change value corresponding to each pixel in the second image according to the second image, the information difference, and the lookup table, wherein the information quantity change value is a difference between the information difference corresponding to the pixel and the calibrated pixel difference corresponding to the pixel; and
and acquiring the difference region according to the information quantity change value corresponding to each pixel in the second image and a preset change threshold value.
3. The HDR scene detection method of claim 1, wherein the calculating the information entropy of the first image and the brightness variance of the first image comprises:
performing gray scale processing on the first image to obtain a gray scale image;
and generating a gray histogram according to the gray values of all pixels in the gray image, and acquiring the information entropy and the brightness variance according to the gray histogram.
4. The HDR scene detection method of claim 1, wherein the calculating the area of the overexposure area and the area of the over-dark area in the difference region comprises:
performing gray scale processing on the first image to obtain a gray scale image;
acquiring an overexposure area and an over-dark area in the difference region according to the difference region and the gray values of the pixels in the grayscale image; and
calculating the area of the overexposure area and the area of the over-dark area according to the number of pixels in the overexposure area, the number of pixels in the over-dark area, and the number of all pixels in the first image.
5. The HDR scene detection method of claim 4, wherein the acquiring the overexposure area and the over-dark area in the difference region according to the difference region and the gray values of the pixels in the grayscale image comprises:
traversing all pixels in the grayscale image and comparing the gray value of each pixel with a first gray threshold and a second gray threshold; if the gray value of the current pixel is greater than the first gray threshold and the pixel corresponding to the current pixel in the first image is located in the difference region, determining that the pixel corresponding to the current pixel in the first image is located in the overexposure area; and if the gray value of the current pixel is less than the second gray threshold and the pixel corresponding to the current pixel in the first image is located in the difference region, determining that the pixel corresponding to the current pixel in the first image is located in the over-dark area.
6. The HDR scene detection method of claim 1, wherein the determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight comprises:
calculating an evaluation value according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and the preset weight;
if the evaluation value is larger than a preset threshold value, determining that the current scene is an HDR scene; and if the evaluation value is smaller than a preset threshold value, determining that the current scene is a non-HDR scene.
7. The HDR scene detection method of claim 1, further comprising:
acquiring a short-exposure image and a long-exposure image which contain the same calibration information, wherein the exposure duration of the long-exposure image is longer than that of the short-exposure image;
calculating a first difference value corresponding to the pixel value of each pixel in the short-exposure image according to the pixel value of each pixel in the short-exposure image and the pixel value of the corresponding pixel in the long-exposure image; and
and generating the lookup table according to the pixel value of each pixel in the short-exposure image and the corresponding first difference value.
8. The HDR scene detection method of claim 1, further comprising:
controlling the pixel array to expose, at least one of the photosensitive pixels being exposed for a first exposure duration, at least one of the photosensitive pixels being exposed for a second exposure duration less than the first exposure duration; the photosensitive pixels exposed in the first exposure time length generate first pixel information to obtain the first image, and the photosensitive pixels exposed in the second exposure time length generate second pixel information to obtain the second image; or
And controlling the pixel array to be exposed for a first exposure duration to obtain the first image, and controlling the pixel array to be exposed for a second exposure duration to obtain the second image, wherein the first exposure duration is longer than the second exposure duration.
9. An HDR scene detection apparatus, comprising:
the acquisition module is used for acquiring a difference region according to a preset lookup table, a first image containing a current scene and a second image, wherein the exposure duration of the first image is longer than that of the second image, and the lookup table is used for representing calibration pixel difference values of different pixel intensities under different exposure duration differences;
the calculation module is used for calculating the information entropy of the first image and the brightness variance of the first image, and calculating the area of an overexposure area and the area of an over-dark area in the difference region; and
a determining module, used for determining whether the current scene is an HDR scene according to the information entropy, the brightness variance, the area of the overexposure area, the area of the over-dark area, and a preset weight.
10. A terminal, characterized in that the terminal comprises:
one or more processors, memory; and
one or more programs, wherein one or more of the programs are stored in the memory and executed by one or more of the processors, the programs comprising instructions for performing the HDR scene detection method of any of claims 1 to 8.
11. A non-transitory computer readable storage medium storing a computer program which, when executed by one or more processors, implements the HDR scene detection method of any of claims 1 to 8.
CN202111203958.7A 2021-10-15 2021-10-15 HDR scene detection method and device, terminal and readable storage medium Active CN113822819B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111203958.7A CN113822819B (en) 2021-10-15 2021-10-15 HDR scene detection method and device, terminal and readable storage medium

Publications (2)

Publication Number Publication Date
CN113822819A true CN113822819A (en) 2021-12-21
CN113822819B CN113822819B (en) 2023-10-27

Family

ID=78916832

Country Status (1)

Country Link
CN (1) CN113822819B (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016032289A (en) * 2014-07-25 2016-03-07 日本電気株式会社 Image synthesis system, image synthesis method, image synthesis program
WO2017215527A1 (en) * 2016-06-15 2017-12-21 深圳市万普拉斯科技有限公司 Hdr scenario detection method, device, and computer storage medium
CN108322669A (en) * 2018-03-06 2018-07-24 广东欧珀移动通信有限公司 The acquisition methods and device of image, imaging device, computer readable storage medium and computer equipment
WO2019183813A1 (en) * 2018-03-27 2019-10-03 华为技术有限公司 Image capture method and device
CN109005361A (en) * 2018-08-06 2018-12-14 Oppo广东移动通信有限公司 Control method, device, imaging device, electronic equipment and readable storage medium storing program for executing
US20200045219A1 (en) * 2018-08-06 2020-02-06 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Control method, control apparatus, imaging device, and electronic device
CN111479070A (en) * 2019-01-24 2020-07-31 杭州海康机器人技术有限公司 Image brightness determination method, device and equipment
CN110619593A (en) * 2019-07-30 2019-12-27 西安电子科技大学 Double-exposure video imaging system based on dynamic scene
CN111131719A (en) * 2019-12-09 2020-05-08 北京空间机电研究所 Video pipeline processing method and device
CN111985527A (en) * 2020-07-03 2020-11-24 西安理工大学 Automatic backlight image detection method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HANNING YU ET AL.: "Luminance Attentive Networks for HDR Image and Panorama Reconstruction", Computer Vision and Pattern Recognition.
HOU Xinglin; LUO Haibo; ZHOU Peipei: "Multi-exposure control method based on maximal local information entropy", Infrared and Laser Engineering, no. 07.
ZHANG Shufang; ZHU Tong: "A traffic sign detection and recognition method based on HDR technology", Laser & Optoelectronics Progress, no. 09.


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant