WO2023040622A1 - HDR image processing method and electronic device - Google Patents

HDR image processing method and electronic device

Info

Publication number
WO2023040622A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
exposure
frame
images
exposure time
Prior art date
Application number
PCT/CN2022/114854
Other languages
English (en)
French (fr)
Other versions
WO2023040622A9 (zh)
Inventor
王宁
王宇
朱聪超
Original Assignee
Honor Device Co., Ltd. (荣耀终端有限公司)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co., Ltd. (荣耀终端有限公司)
Publication of WO2023040622A1 (zh)
Publication of WO2023040622A9 (zh)

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/76: Circuitry for compensating brightness variation in the scene by influencing the image signals
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N 23/951: Computational photography systems, e.g. light-field imaging systems by using two or more images to influence resolution, frame rate or aspect ratio

Definitions

  • the present application relates to the technical field of image processing, and in particular to an HDR image processing method and electronic equipment.
  • HDR image processing methods are usually based on the same principle: a single camera exposes the same scene multiple times with different exposures to cover the brightness range of the entire scene, and then synthesizes these images with different exposures into one HDR image.
  • the HDR video is composed of multiple HDR images.
  • the video output frame rate will decrease (for example, if the sensor outputs 60 frames of images per second, fusing every two frames into one halves the output frame rate), and the output video may exhibit problems such as stroboscopic banding and reduced signal-to-noise ratio in low light, which degrade the video playback effect.
  • the present application provides an HDR image processing method and electronic equipment, which solves the problem of lower video output frame rate caused by the HDR image processing method in the prior art.
  • the present application provides an HDR image processing method, the method comprising:
  • images are continuously collected at a preset frame rate to obtain M frames of images;
  • the M frames of images include alternately arranged long-exposure images and short-exposure images, and the exposure of the long-exposure images is greater than the exposure of the short-exposure images;
  • the fused HDR images are sequentially output according to the preset frame rate to obtain HDR video.
  • the image sensor alternately outputs long-exposure images and short-exposure images at a specific frame rate; the image processor then fuses each frame image with the previous frame image and multiplexes each frame image for fusion with the next frame image, so that the continuously output long-exposure images and short-exposure images are fused in pairs, and the multiple images obtained after fusion are output, that is, HDR video. Since each frame of image is multiplexed once, the frame rate of the HDR video finally output by the camera is consistent with the frame rate at which the image sensor outputs images.
  • the solution of this application can increase the output frame rate of the HDR video, and can also avoid the banding light flickering phenomenon in the video.
  • long-exposure images and short-exposure images are collected alternately, and each frame is fused with the previous frame and the subsequent frame through frame multiplexing to obtain multiple frames of HDR images to ensure that the frame rate remains unchanged.
  • the preset frame rate may be 60 frames per second, that is, the image sensor may collect 60 frames of images per second.
  • M can be an integer greater than or equal to 3.
  • long-exposure images and short-exposure images are arranged alternately, that is, after one frame of long-exposure image is collected, one frame of short-exposure image is collected; and after one frame of short-exposure image is collected, one frame of long-exposure image is collected, and so on. In this example there are 30 frames of long-exposure images and 30 frames of short-exposure images.
  • the method further includes: storing the M frames of images in an image memory;
  • the fusing of each frame image in the M frame images with the previous frame image, and multiplexing the each frame image and the next frame image for fusion includes:
  • i takes 2, ..., M-1 in turn.
  • the image processor reads the first frame image and the second frame image from the image memory, and fuses the first frame image with the second frame image to obtain the first frame HDR image;
  • the image processor reads the second frame image and the third frame image from the image memory, and fuses the second frame image and the third frame image to obtain the second frame HDR image;
  • the image processor reads the 3rd frame image and the 4th frame image from the image memory, and fuses the 3rd frame image with the 4th frame image to obtain the 3rd frame HDR image;
  • the image processor reads the 4th frame image and the 5th frame image from the image memory, and fuses the 4th frame image with the 5th frame image to obtain the 4th frame HDR image;
  • the image processor reads the 58th frame image and the 59th frame image from the image memory, and fuses the 58th frame image with the 59th frame image to obtain the 58th frame HDR image;
  • the 59th frame image and the 60th frame image are read from the image memory, and the 59th frame image and the 60th frame image are fused to obtain the 59th frame HDR image.
  • the continuously output long-exposure images and short-exposure images are fused in pairs, and then multiple images obtained after the fusion are output to realize HDR video output. Since each frame of image is multiplexed once, the video output frame rate will not be reduced after fusing the long exposure image and the short exposure image into one frame of image.
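The frame-multiplexed pairwise fusion described above can be sketched in Python. This is a minimal illustration, not the application's implementation: the averaging `fuse` is a placeholder for a real exposure-weighted HDR merge, and the tiny two-pixel "frames" stand in for actual image buffers.

```python
def fuse(frame_a, frame_b):
    """Placeholder fusion: average the two exposures pixel-wise.

    A real HDR merge would weight pixels by exposure; this stand-in
    only illustrates the data flow."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

def frame_multiplexed_hdr(frames):
    """Fuse each frame with its successor, reusing every frame once.

    M input frames (alternating long/short exposure) yield M-1 HDR
    frames, so the output frame rate tracks the sensor frame rate."""
    return [fuse(frames[i], frames[i + 1]) for i in range(len(frames) - 1)]

# 6 alternating long/short frames, each a tiny 2-pixel "image".
frames = [[100, 100], [20, 20], [110, 110], [30, 30], [120, 120], [40, 40]]
hdr = frame_multiplexed_hdr(frames)
print(len(hdr))  # 5 HDR frames from 6 inputs: the frame rate is preserved
```

With 60 sensor frames, the same loop yields 59 HDR frames, matching the frame-1-through-59 enumeration above.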
  • when shooting against the light, or when the contrast between bright and dark areas in the shooting environment is relatively large, the HDR mode can improve the dark parts and the bright parts of the photo at the same time. Compared with images captured in normal mode, photos or videos captured in HDR mode better present the details of bright and dark areas, improving the image display effect.
  • the continuously collecting images at a preset frame rate includes:
  • the first camera adopts the first exposure parameter to acquire the long exposure image, and adopts the second exposure parameter to acquire the short exposure image;
  • the exposure degree corresponding to the first exposure parameter is greater than the exposure degree corresponding to the second exposure parameter.
  • the exposure time of the long exposure image is longer than the exposure time of the short exposure image.
  • the method further includes: determining the total exposure time according to the preset frame rate.
  • the image sensor outputs 60 frames of images per second. If a long-exposure image and a short-exposure image are taken as one group, 30 groups are output per second, and the exposure time of each group is 1/30 second, which is about 33ms; that is to say, the total exposure time of the long-exposure image and the short-exposure image in each group is 33ms.
  • the allocation principle is: the sum of the exposure time assigned to the long-exposure image and the exposure time assigned to the short-exposure image is less than or equal to the total exposure time of 33ms.
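The budget arithmetic above can be expressed directly. The function names below are illustrative, not from the application; the only rule encoded is the one just stated (two sensor frames form one group, and the pair's exposure times must fit the group period).

```python
def total_exposure_budget_ms(sensor_fps):
    """Total exposure time available to one long+short pair.

    Two sensor frames form one group, so 60 fps -> 30 groups/s ->
    about 33 ms per group, matching the example in the text."""
    return 1000.0 / (sensor_fps / 2)

def allocation_ok(long_ms, short_ms, sensor_fps):
    """The allocation principle: long + short <= total budget."""
    return long_ms + short_ms <= total_exposure_budget_ms(sensor_fps)

print(round(total_exposure_budget_ms(60)))  # 33 (ms)
print(allocation_ok(20, 10, 60))            # True: 30 ms fits
print(allocation_ok(25, 10, 60))            # False: 35 ms exceeds ~33 ms
```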
  • the method further includes: when the exposure parameter of the first camera includes an exposure time, determining an exposure degree corresponding to the exposure parameter according to the exposure time.
  • the method further includes: when the exposure parameters of the first camera include exposure time and ISO sensitivity value, determining the exposure corresponding to the exposure parameter according to the product value of the exposure time and the ISO sensitivity value .
  • when the exposure parameters include the exposure time and the ISO sensitivity value, the product of the exposure time and the ISO sensitivity value may be used as the corresponding exposure degree.
  • when the exposure parameter of the first camera is the exposure time, the method further includes:
  • the first automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure time.
  • the exposure time of the short-exposure image can be 1ms, and the exposure time of the long-exposure image can be 4ms; or, the exposure time of the short-exposure image can be 2ms, and the exposure time of the long-exposure image can be 8ms; or, the exposure time of the short-exposure image can be 4ms, and the exposure time of the long-exposure image can be 16ms.
  • the length of the exposure time here is relative, and the specific value of the exposure time can be determined according to actual usage requirements, which is not limited in this embodiment of the present application.
  • when the exposure parameter of the first camera is the exposure time, the method further includes:
  • a second automatic exposure algorithm is used to determine the first exposure time corresponding to the long exposure image and the second exposure time corresponding to the short exposure image;
  • the second automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure time, and both the first exposure time and the second exposure time are integer multiples of the light strobe period.
  • the solution of this application proposes a corresponding processing strategy: the exposure time is controlled so that the exposure times of the long-exposure image and the short-exposure image are both integer multiples of the light strobe period, so as to avoid banding and improve the image display effect.
  • the strobe cycle of the lights is 10 milliseconds.
  • the maximum value of the total exposure time of the long exposure image and the short exposure image is (1/30) second, which is about 33ms.
  • the exposure time of the long-exposure image and the exposure time of the short-exposure image can each be an integer multiple of 10ms;
  • the exposure time of the long-exposure image is 20ms
  • the exposure time of the short-exposure image is 10ms.
  • satisfying the condition that the exposure time is an integer multiple of the 10ms period can avoid banding in shooting scenes lit by fluorescent lights.
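The second automatic exposure algorithm can be sketched as a search over strobe-period multiples. This is a minimal sketch under stated assumptions: the fixed 2:1 long-to-short ratio is a hypothetical tuning knob, and the greedy "largest pair that fits" policy is not something the application specifies.

```python
STROBE_PERIOD_MS = 10  # 50 Hz mains -> 100 Hz flicker -> 10 ms period

def second_ae_algorithm(total_ms, ratio=2):
    """Pick the largest (long, short) exposure pair such that both times
    are integer multiples of the strobe period and their sum fits the
    total exposure budget. Returns None if no pair fits."""
    max_short = (total_ms // STROBE_PERIOD_MS) * STROBE_PERIOD_MS
    for short in range(max_short, 0, -STROBE_PERIOD_MS):
        long_ = short * ratio  # integer ratio keeps long_ a period multiple
        if long_ + short <= total_ms:
            return long_, short
    return None

print(second_ae_algorithm(33))  # (20, 10): the 20 ms / 10 ms example above
```

For the 120fps case (about a 17ms budget), no 2:1 pair of 10ms multiples fits, so the sketch returns None; a real AE algorithm would then fall back to other exposure settings.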
  • the determining the first exposure time and the second exposure time includes:
  • the preset exposure ratio is the ratio between the first exposure time and the second exposure time; or, the preset exposure ratio is the ratio between the first product value and the second product value,
  • the first product value is the product value of the first exposure time and the first ISO sensitivity value
  • the second product value is the product value of the second exposure time and the second ISO sensitivity value.
  • the exposure time of the short-exposure image can be 2ms
  • the ISO sensitivity value of the short-exposure image can be 200
  • the exposure parameter of the short-exposure image is 2*200
  • the exposure time can be 16ms
  • the ISO sensitivity value of the long exposure image can be 100
  • the exposure parameter of the long exposure image is 16*100.
  • the exposure time of the short-exposure image can be 10ms
  • the ISO sensitivity value of the short-exposure image can be 100
  • the exposure parameter of the short-exposure image is 10*100
  • the exposure time of the long-exposure image can be 10ms
  • the ISO sensitivity value of the long-exposure image can be 200
  • the exposure parameter of the long-exposure image is 10*200.
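The two exposure-degree examples above can be checked with a one-line helper; the name `exposure_degree` is illustrative, but the time-times-ISO product is exactly the rule stated in the text.

```python
def exposure_degree(exposure_time_ms, iso):
    """Exposure degree as the product of exposure time and ISO sensitivity."""
    return exposure_time_ms * iso

# First example: short 2 ms @ ISO 200, long 16 ms @ ISO 100.
short_deg = exposure_degree(2, 200)    # 400
long_deg = exposure_degree(16, 100)    # 1600
print(long_deg / short_deg)            # 4.0: the preset exposure ratio

# Second example: both 10 ms, ISO 200 vs ISO 100 -> ratio of 2.
print(exposure_degree(10, 200) / exposure_degree(10, 100))  # 2.0
```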
  • before the images are continuously acquired at a preset frame rate in response to the user's first operation to obtain M frames of images, the method further includes:
  • the camera HDR mode is turned on.
  • the user can trigger the camera to turn on the HDR mode, and after the camera's HDR mode is turned on, HDR images or HDR videos can be captured to improve user experience.
  • the method also includes:
  • when it is detected that the ambient brightness of the current shooting scene is greater than or equal to the preset brightness threshold, the camera's HDR mode is automatically turned on;
  • when it is detected that the ambient brightness of the current shooting scene is lower than the preset brightness threshold, the camera's HDR mode is automatically turned off.
  • when the preset brightness condition is met, it can be determined that the current shooting scene is a scene to which HDR is applicable, and the camera's HDR mode is automatically turned on to improve the human-machine interaction experience.
  • the method also includes:
  • the long-exposure images and short-exposure images are sequentially read from the image memory by means of frame multiplexing, and each time one long-exposure image and one short-exposure image are read and registered. Further, weighted fusion is performed on the two registered frames of images to obtain an HDR image, which improves the image display effect.
  • the registering each frame of the M frames of images with the previous frame of images, and multiplexing the registration of each frame of images with the next frame of images includes:
  • the long exposure image is aligned with the short exposure image as a reference.
  • the solution of this application registers and fuses the continuously output long-exposure images and short-exposure images, and then outputs multiple images obtained after fusion to realize HDR video output. Since each frame of image is multiplexed once, the image processor performs HDR synthesis once every two frames of image registration, so that the purpose of not reducing the frame rate can be achieved.
  • if the image sensor outputs 60 frames per second, the HDR algorithm can output 60fps HDR video; if the image sensor outputs 120 frames per second, the HDR algorithm can output 120fps HDR video.
  • the frame rate of the final output HDR video can be basically consistent with the frame rate of the image output by the image sensor.
  • the image memory is a double data rate (DDR) synchronous dynamic random access memory.
  • the present application provides an HDR image processing device, which includes a unit for executing the method in the first aspect above.
  • the device may correspond to executing the method described in the first aspect above.
  • for the relevant description of the units in the device, please refer to the description of the first aspect above; details are not repeated here for brevity.
  • Hardware or software includes one or more modules or units corresponding to the functions described above. For example, a processing module or unit, a display module or unit, etc.
  • the present application provides an electronic device, the electronic device includes a processor, the processor is coupled with a memory, the memory is used to store computer programs or instructions, and the processor is used to execute the computer programs or instructions stored in the memory, so that the method in the first aspect is executed.
  • the processor is used to execute the computer programs or instructions stored in the memory, so that the apparatus performs the method in the first aspect.
  • the present application provides a computer-readable storage medium on which is stored a computer program (also referred to as an instruction or code) for implementing the method in the first aspect.
  • when the computer program is executed by a computer, the computer can execute the method in the first aspect.
  • the present application provides a chip, including a processor.
  • the processor is used to read and execute the computer program stored in the memory, so as to execute the method in the first aspect and any possible implementation manners thereof.
  • the chip further includes a memory, and the memory is connected to the processor through a circuit or wires.
  • the present application provides a chip system, including a processor.
  • the processor is used to read and execute the computer program stored in the memory, so as to execute the method in the first aspect and any possible implementation manners thereof.
  • the chip system further includes a memory, and the memory is connected to the processor through a circuit or wires.
  • the present application provides a computer program product
  • the computer program product includes a computer program (also referred to as an instruction or code), and when the computer program is executed by a computer, the computer implements the method in the first aspect.
  • Fig. 1 is a schematic diagram of underexposed and overexposed images in the embodiment of the present application
  • FIG. 2 is a schematic diagram of an interface for enabling the HDR mode in the HDR image processing method provided by the embodiment of the present application;
  • FIG. 3 is a schematic flow diagram of an HDR image processing method provided in an embodiment of the present application.
  • FIG. 4 is another schematic flowchart of an HDR image processing method provided in an embodiment of the present application.
  • FIG. 5 is a schematic diagram of frame multiplexing of images with different exposures in an HDR image processing method provided in an embodiment of the present application
  • FIG. 6 is a schematic diagram of fusing short-frame images and long-frame images in an HDR image processing method provided by an embodiment of the present application;
  • FIG. 7 is a schematic flow diagram of an HDR image processing method provided in an embodiment of the present application applied to a scene with flickering lights;
  • FIG. 8 is a schematic diagram of detection of a light flickering scene in an HDR image processing method provided in an embodiment of the present application.
  • FIG. 9 is a schematic structural diagram of an HDR image processing device provided in an embodiment of the present application.
  • FIG. 10 is a schematic structural diagram of another HDR image processing device provided by an embodiment of the present application.
  • FIG. 11 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • the terms "first", "second", and the like in the specification and claims herein are used to distinguish different objects, not to describe a specific order of objects.
  • first image and the second image are used to distinguish different images, not to describe a specific order of the images.
  • words such as "exemplary" or "for example" are used as examples, instances, or illustrations. Any embodiment or design scheme described as "exemplary" or "for example" in the embodiments of the present application shall not be interpreted as more preferred or more advantageous than other embodiments or design schemes. Rather, the use of words such as "exemplary" or "for example" is intended to present related concepts in a concrete manner.
  • "multiple" means two or more; for example, multiple processing units refers to two or more processing units, multiple components refers to two or more components, and the like.
  • Exposure refers to the intensity and duration of light perceived by the camera. When shooting with a camera, due to the limitation of the dynamic range, the exposure may be too high or too low, which directly leads to overexposure or underexposure of the subject and the background. If overexposed, the photo will be too bright; if underexposed, the photo will be too dark.
  • Exposure parameters include aperture, shutter speed, and sensitivity. In actual implementation, proper exposure can be obtained by controlling these three parameters.
  • the exposure time can be adjusted.
  • sensitivity is used to measure how sensitive the photosensitive components are to light. Specifically, the higher the sensitivity, the more light the components sense and the brighter the photo; conversely, the lower the sensitivity, the less light the components sense and the darker the photo.
  • Sensitivity is usually represented by an ISO sensitivity value, which is divided into several steps, with the exposure of adjacent steps differing by a factor of two: 50, 100, 200, 400, 800, 1600, 3200, and so on.
  • the higher the ISO sensitivity value, the more sensitive the photosensitive components.
  • the exposure during shooting can be controlled by setting or adjusting parameters such as aperture, shutter speed and/or sensitivity, so that the shooting effect can be controlled.
  • the generated HDR images can be made more vivid in color, higher in contrast, and clearer in image details.
  • the change of exposure parameters can be controlled by controlling and adjusting the exposure time
  • the change of exposure parameters can also be controlled by adjusting the ISO sensitivity value
  • the change of exposure parameters can also be controlled by adjusting the size of the aperture.
  • controlling the change of the exposure parameter by adjusting the exposure time is taken as an example for illustration herein.
  • The automatic exposure (AE) algorithm adjusts the exposure by automatically controlling exposure parameters such as the aperture (lens aperture size), ISO sensitivity value, and shutter speed.
  • Banding phenomenon: in a scene where a fluorescent lamp is the light source, the user can see rolling dark stripes appearing one after another in the image captured by the camera, that is, stroboscopic stripes. This phenomenon is called banding, or the light-strobe flicker phenomenon.
  • the sensor exposes each row separately, and each row starts exposing at a different time. Since the light energy irradiating different pixels of the sensor differs, the brightness of the image differs across rows.
  • the light flicker frequency is 100Hz
  • the brightness changes periodically every 10ms.
  • if the exposure time is an integer multiple of the light strobe period (for example, 10ms), banding can be avoided because the frequencies are consistent;
  • if the exposure time is not an integer multiple of the light strobe period (for example, 10ms), banding will occur because the frequencies are inconsistent.
  • the AE algorithm can adjust the short-frame exposure parameters to an integer multiple of the light strobe period (for example, 10ms) to ensure that no banding phenomenon occurs.
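The banding condition above can be captured as a small predicate. The 100Hz flicker figure follows the example given (50Hz mains, two brightness peaks per cycle), and `causes_banding` is an illustrative simplification of the rule, not the application's AE logic.

```python
def strobe_period_ms(flicker_hz):
    """Light flicker at 100 Hz gives a 10 ms brightness period."""
    return 1000.0 / flicker_hz

def causes_banding(exposure_ms, flicker_hz=100):
    """Banding is expected when the exposure time is not an integer
    multiple of the strobe period (a simplification of the text's rule)."""
    return exposure_ms % strobe_period_ms(flicker_hz) != 0

print(strobe_period_ms(100))   # 10.0 ms
print(causes_banding(20))      # False: 20 ms is 2x the period
print(causes_banding(15))      # True: 15 ms spans 1.5 periods
```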
  • HDR: high dynamic range.
  • The HDR mode or function can be set in the mobile phone camera. After the phone turns on the camera's HDR mode, the camera uses the AE algorithm to adjust the exposure parameters and shoots multiple frames of images in rapid succession, for example including underexposed images and overexposed images, and synthesizes these images; in the synthesized HDR image, the bright parts are not overexposed and the details of the dark parts are clearly visible.
  • Fig. 2 shows a schematic diagram of an interface for enabling the HDR mode in the camera application in the embodiment of the present application.
  • the "More” option is displayed in the camera shooting preview interface
  • the HDR control 11 is displayed in the menu bar corresponding to the "More” option
  • the electronic device enables the HDR mode.
  • an HDR logo 12 is displayed on the camera shooting preview interface, prompting the user that the current camera has enabled the HDR mode.
  • After the camera enables the HDR mode, it can expand the brightness range of the picture, finally achieving the effect that highlights are not overexposed and dark parts are not underexposed, showing more detail in the dark parts. Therefore, when taking pictures in an environment with a large contrast between light and dark, a camera with the HDR function takes better pictures.
  • Long-frame image and short-frame image: in the embodiments of the present application, for convenience of description, an image with a long exposure time is called a long-frame image or long-exposure image, and an image with a short exposure time is called a short-frame image or short-exposure image. It should be noted that the length of the exposure time here is relative, and the specific value of the exposure time can be determined according to actual usage requirements, which is not limited in the embodiments of the present application.
  • Video frame rate: a video file is composed of multiple continuous frames of images, and the frame rate can be understood as how many frames of image are recorded or played per second. The unit of the video frame rate is fps (frames per second), that is, the number of frames output per second. Exemplarily, the video frame rate may be 60fps or 120fps. The higher the video frame rate, the smoother the video playback.
  • an image sensor is usually used to output two frames of images each time, including a long frame image and a short frame image, and the two frames of images are combined into one frame image to improve the dynamic range of the image.
  • this approach has two problems:
  • this application proposes an HDR image processing method based on frame multiplexing.
  • the HDR video frame rate can be kept consistent with the frame rate of the image sensor output image.
  • when the image sensor outputs images at a frame rate of 60fps, it can be ensured that the HDR video output frame rate is also 60fps; when the image sensor outputs images at a frame rate of 120fps, it can be ensured that the HDR video output frame rate is also 120fps. Therefore, HDR video can be output without reducing the video frame rate.
  • the maximum value of the total exposure time of the long-frame image and the short-frame image is 1/30 second, about 33ms.
  • the exposure time of the long-frame image and the exposure time of the short-frame image can each be an integer multiple of 10ms; for example, the exposure time of the long-frame image is 20ms and the exposure time of the short-frame image is 10ms. Satisfying the condition that the exposure time is an integer multiple of the 10ms period can avoid banding in shooting scenes lit by fluorescent lights.
  • The execution subject of the HDR image processing method provided in the embodiments of the present application may be an electronic device (such as a mobile phone), or a functional module and/or functional entity in the electronic device capable of implementing the HDR image processing method; the solution of the present application can be realized by means of hardware and/or software, which can be determined according to actual usage requirements and is not limited in the embodiments of the present application.
  • the following uses an electronic device as an example to illustrate the HDR image processing method provided by the embodiment of the present application with reference to the accompanying drawings.
  • Embodiment 1 Realize video HDR recording in a high dynamic scene.
  • FIG. 3 is a schematic flowchart of an HDR image processing method provided by an embodiment of the present application. Referring to Fig. 3, the method includes the following steps S101-S103.
  • the long-exposure image and the short-exposure image may be alternately output according to a preset frame rate, wherein the exposure of the long-exposure image is greater than the exposure of the short-exposure image.
  • M is an integer greater than 1.
  • An image obtained by the above-mentioned HDR imaging method may be referred to as an HDR image, and the HDR image may provide a high dynamic range between darker and fully illuminated areas in the scene.
  • a video composed of HDR images may be referred to as an HDR video.
  • the frame-multiplexing HDR image processing method alternately outputs long-frame images and short-frame images at a specific frame rate, then fuses each frame image with the previous frame image and multiplexes each frame image for fusion with the next frame image, so that the continuously output long-frame images and short-frame images are fused in pairs; the multiple images obtained after fusion are then output to realize HDR video output. Since each frame of image is multiplexed once, the video output frame rate is not reduced after the long-frame image and the short-frame image are fused into one frame of image.
  • the HDR image processing method provided by the above embodiments of the present application will be described in detail below with reference to FIG. 4 .
  • the HDR image processing method provided by the embodiment of the present application may be executed in the order of steps (1), (2), (3), (4), and (5).
  • the image processor 201 may execute a preset AE algorithm in a high dynamic scene, and generate corresponding control instructions.
  • the control instructions generated by executing the AE algorithm will be referred to as AE algorithm control instructions for short below.
  • the AE algorithm control instruction can be used to control the exposure parameter change when the image sensor 202 is imaging.
  • the exposure parameter is the exposure time
  • the AE algorithm control instruction can indicate the exposure time for generating a long-frame image and the exposure time for generating a short-frame image.
  • the maximum value of the total exposure time of the long-frame image and the short-frame image can be determined.
  • the preset frame rate of the image output by the image sensor 202 is 120fps, 120 frames of images are output per second;
  • the exposure time of one group is 1/60 second, which is about 17ms; that is to say, the total exposure time of the long-frame image and short-frame image in each group is 17ms.
  • the allocation principle is: the sum of the exposure duration assigned to the long-frame images and the exposure duration assigned to the short-frame images is less than or equal to the total exposure duration of 17ms.
  • the preset frame rate of the image output by the image sensor 202 is 60fps
  • 60 frames of images are output per second
  • a long frame image and a short frame image are used as a group
  • 30 groups are output per second, wherein
• the exposure period of each group is 1/30 second, approximately 33ms; that is to say, the total exposure time of the long-frame image and the short-frame image in each group is about 33ms.
  • the allocation principle is: the sum of the exposure duration assigned to the long-frame images and the exposure duration assigned to the short-frame images is less than or equal to the total exposure duration of 33ms.
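The budget arithmetic above can be sketched in a few lines of Python. The helper names and the 4:1 long-to-short split are illustrative assumptions for this sketch, not part of the patented method:

```python
# Hypothetical sketch of the AE exposure-time budget described above:
# each group of two frames (one long, one short) shares one group period.
def total_exposure_budget_ms(sensor_fps: int, frames_per_group: int = 2) -> float:
    """Group period in ms: frames_per_group / fps seconds."""
    return 1000.0 * frames_per_group / sensor_fps

def allocate(budget_ms: float, ratio_long_to_short: float = 4.0):
    """Split the budget into (T1, T2) so that T1 + T2 <= budget
    and T1 : T2 equals the requested ratio (assumed 4:1 here)."""
    t2 = budget_ms / (1.0 + ratio_long_to_short)
    t1 = budget_ms - t2
    return t1, t2

budget = total_exposure_budget_ms(60)   # ~33.3 ms per long/short group at 60fps
t1, t2 = allocate(budget)
assert t1 + t2 <= budget + 1e-9 and t1 > t2
```

At 120fps the same helper yields the ~17ms group budget mentioned earlier.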
  • the image sensor 202 outputting images at a frame rate of 60 fps is taken as an example below for exemplary description.
• the AE algorithm control instruction may instruct the image sensor 202 to respectively output a long-frame image with an exposure time T1 and a short-frame image with an exposure time T2, where T1 is greater than T2.
• the image processor 201 may send the AE algorithm control instruction to the image sensor 202 .
  • step (2) shown in FIG. 4 the image sensor 202 receives the AE algorithm control instruction sent by the image processor 201 .
• according to the AE algorithm control instruction, the image sensor 202 can continuously and sequentially output multiple frames of images at the preset frame rate, and store the output frames of images in the memory 203 .
  • the image sensor 202 outputs images in a manner of alternately outputting long-frame images L and short-frame images S with different exposure times.
• An adjacent long-frame image and short-frame image form a group: for example, long-frame image L1 and short-frame image S1 form a group, long-frame image L2 and short-frame image S2 form a group, and so on, up to long-frame image L30 and short-frame image S30. That is to say, the image sensor 202 can generate and output 30 groups of images per second, where each of the 30 groups includes one long-frame image and one short-frame image.
  • the memory 203 may be a double data rate (DDR) synchronous dynamic random access memory, or any other memory that meets actual usage requirements, which is not limited in this embodiment of the present application.
  • DDR double data rate
• in step (3) shown in FIG. 4, when the image processor 201 reads images from the memory 203, it reads in a frame-multiplexing manner, that is, each frame of image can be read twice, so that frame multiplexing is realized when long-frame images and short-frame images are registered and fused according to the HDR algorithm provided in this application.
• the long-frame image L of each group can be registered once with the short-frame image S of the same group, and once with the short-frame image S of the previous group; the short-frame image S of each group can be registered once with the long-frame image L of the same group, and once with the long-frame image L of the next group.
• the long-frame image L1 and short-frame image S1 in the first group are read first and registered with each other; after image registration, the long-frame image L1 and a registered short-frame image S1' are obtained.
• subsequent images are read in sequence according to the above frame-multiplexing manner: a long-frame image and a short-frame image are read each time and registered.
• finally, the long-frame image L30 and the short-frame image S30 in the 30th group are read and registered; after image registration, the long-frame image L30 and a registered short-frame image S30' are obtained.
• the short-frame image S30 in the 30th group can also be registered with a long-frame image L31. If there is no long-frame image L31, the short-frame image S30 in the 30th group need not be registered or fused, and may either be output directly as a fused image (for example, as the 60th fused frame) or discarded.
  • step (4) shown in FIG. 4 the image processor 201 performs weighted fusion of the two registered images.
  • the image processor 201 performs HDR synthesis once every two frames of image registration, so that the purpose of not reducing the frame rate can be achieved.
• the image-registered long-frame image L1 and short-frame image S1' can be weighted and fused to obtain image H1; the image-registered short-frame image S1 and long-frame image L2' are weighted and fused to obtain image H2; the image-registered long-frame image L2 and short-frame image S2' are weighted and fused to obtain image H3; the image-registered short-frame image S2 and long-frame image L3' are weighted and fused to obtain image H4; and so on, until the image-registered long-frame image L30 and short-frame image S30' are weighted and fused to obtain image H59. It should be noted that if a long-frame image L31' exists, the image-registered short-frame image S30 and long-frame image L31' can be weighted and fused to obtain image H60.
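The pairing order just described (H1 from L1 and S1, H2 from S1 and L2, and so on) can be sketched with placeholder frame labels. The `multiplexed_fusion` helper and the string-based `fuse` stand-in are illustrative only; in the real pipeline each pair would be registered and weighted-fused:

```python
# Minimal sketch of the frame-multiplexing order: fuse every adjacent
# pair of frames, so each interior frame is read (multiplexed) twice.
def multiplexed_fusion(frames):
    """N input frames yield N-1 fused HDR frames, so the frame rate
    is not halved as it would be with disjoint (L, S) pairs."""
    return [f"fuse({frames[i]},{frames[i+1]})" for i in range(len(frames) - 1)]

frames = ["L1", "S1", "L2", "S2", "L3", "S3"]
hdr = multiplexed_fusion(frames)
# hdr[0] == "fuse(L1,S1)", hdr[1] == "fuse(S1,L2)", ...: 5 outputs from 6 inputs
```

With 60 sensor frames per second this gives 59 fused frames, plus H60 when a following long frame L31 is available, matching the enumeration above.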
  • the image sensor outputs images at a frame rate of 60fps
  • 60 frames of images are output per second
• through frame multiplexing, 120 frame reads are performed per second, and the read images are weighted and fused in pairs to obtain 60 frames of images per second.
  • the frame rate of the HDR video output by the image processor 201 is consistent with the frame rate of the image output by the image sensor.
  • FIG. 6 exemplarily shows a schematic diagram of fusion of a short-frame image and a long-frame image.
  • (a) in FIG. 6 shows a short-frame image
  • (b) in FIG. 6 shows a long-frame image
  • (c) in FIG. 6 shows an image obtained by fusing the short-frame image and the long-frame image.
  • the image obtained after the fusion of the short-frame image and the long-frame image can not only reflect the image details shown in the oval dotted line box, but also can reflect the image details shown in the rectangular dotted line box.
• in this way, the highlight portions are not overexposed and the dark portions are not underexposed, more detail is presented in the dark portions, and the image display effect is improved.
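As a rough illustration of how a weighted fusion might preserve both highlight and shadow detail, the sketch below blends toward the short frame as the long frame nears saturation. The weighting curve, the 0.8 knee, and the gain step are assumptions for this sketch; the patent does not specify the fusion weights:

```python
# Hypothetical per-pixel weighted fusion of a registered long/short pair.
def fuse_pixel(long_px: float, short_px: float, exposure_ratio: float = 4.0) -> float:
    """Pixel values in [0, 1]. Use the long frame in shadows and mid-tones;
    blend toward the (gained-up) short frame where the long frame saturates."""
    w_short = min(1.0, max(0.0, (long_px - 0.8) / 0.2))  # 0 below 0.8, 1 at saturation
    short_lin = min(1.0, short_px * exposure_ratio)      # match long-frame brightness
    return (1.0 - w_short) * long_px + w_short * short_lin
```

A mid-tone pixel is taken entirely from the long frame, while a saturated long-frame pixel is replaced by the gained-up short-frame value, recovering highlight detail.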
• as for the HDR mode, when shooting against the light or when the contrast between bright and dark light in the shooting environment is relatively large, the HDR mode can simultaneously improve the dark portions and the bright portions of a photo. Compared with images captured in normal mode, photos or videos captured in HDR mode can present more detail in both bright and dark portions.
  • step (5) shown in FIG. 4 the image processor 201 sequentially outputs the weighted and fused HDR images according to a preset frame rate, for example, to a display screen of an electronic device, and the display screen displays multiple frames of HDR images, thereby realizing HDR video recording.
  • the HDR algorithm can output 60fps HDR video; if the image sensor outputs 120 frames of images per second, then the HDR algorithm can output 120fps HDR video.
  • the frame rate of the final output HDR video can be consistent with the frame rate of the image sensor output image.
  • the long frame and the short frame may be alternately used as reference frames during image registration and fusion.
• accordingly, the anti-shake algorithm needs to be adapted, and the anti-shake calculation can be performed based on the reference frame.
  • the image sensor alternately outputs long-frame images and short-frame images at a specific frame rate
• the image processor fuses each frame of image with the previous frame of image, and multiplexes each frame of image for fusion with the next frame of image, so that the continuously output long-frame images and short-frame images are fused in pairs, and the multiple fused images are then output as HDR video. Since each frame of image is multiplexed once, the frame rate of the HDR video finally output by the camera can be consistent with the frame rate of the images output by the image sensor; the solution of this application can thus increase the output frame rate of HDR video.
• in the second embodiment, video HDR recording is realized in a scene with flickering lights.
• the application scheme proposes a corresponding processing strategy: by controlling the exposure time, the exposure times of the long-frame image and the short-frame image are both made integer multiples of the light strobe period (for example, 10ms), so as to avoid banding.
  • a possible implementation manner of how to perform frame multiplexing to realize video HDR recording (for example, the video frame rate is 60fps) will be described in detail below.
  • the HDR image processing method provided by the foregoing embodiments of the present application will be described in detail below with reference to FIG. 7 .
  • the HDR image processing method provided by the embodiment of the present application may be executed in the order of steps (1), (2), (3), (4), and (5).
• the difference between the second embodiment and the first embodiment is that, in the first embodiment shown in FIG. 4, the image processor 201 directly executes the preset AE algorithm for high dynamic scenes without considering whether the current scene is a light strobe scene, whereas in the second embodiment shown in FIG. 7, the image processor 201 executes the AE algorithm based on the detection result of the light flickering scene and generates corresponding control instructions.
  • the control instruction is used to control the change of exposure parameters (such as exposure time) when the image sensor 202 is imaging, so as to determine the exposure time of the long-frame image and the exposure time of the short-frame image.
• in a light strobe scene, the AE algorithm control instruction is used to control the exposure times of the long-frame image and the short-frame image to be integer multiples of the light strobe period, so as to avoid banding.
• in a scene without flickering lights, the AE algorithm control instruction does not need to consider eliminating the banding phenomenon. The specific differences between the two cases are described in detail below.
• the image processor 201 may first detect the light flickering scene, and then further execute the AE algorithm according to the scene detection result to generate corresponding control instructions.
  • the scene detection result is a scene without lights, that is, the current scene is not a scene with strobe lights.
  • banding does not occur in scenes without lights, so banding may not be considered when controlling exposure parameter changes.
  • the exposure time only needs to meet the following conditions: the sum of the long frame exposure time and the short frame exposure time is less than or equal to the maximum value of the total exposure time.
  • the maximum value of the total exposure time of the long-frame image and the short-frame image may be determined according to the frame rate of the image output by the image sensor 202 .
• if the preset frame rate of the images output by the image sensor 202 is 60fps, 60 frames of images are output per second;
• the exposure period of each group is 1/30 second, approximately 33ms; that is to say, the total exposure time of the long-frame image and the short-frame image in each group is about 33ms.
  • the allocation principle is: the sum of the exposure time allocated to the long-frame images and the exposure time to the short-frame images is less than or equal to the total exposure time of 33ms.
  • the AE algorithm when the scene detection result is a scene without light, the AE algorithm generates the following control instruction: instruct the image sensor 202 to output a long-frame image with an exposure time T1 and a short-frame image with an exposure time T2; wherein, T1 is greater than T2.
• when the maximum total exposure time is 33ms, T1 can be 8ms and T2 can be 2ms; or T1 can be 10ms and T2 can be 6ms; or T1 can be 20ms and T2 can be 13ms. Of course, T1 and T2 can also take other possible values, which can be determined according to actual usage requirements and are not limited in this embodiment of the present application.
  • the scene detection result is a scene with lights, that is, the current scene is a scene with strobe lights.
  • the exposure time is not an integer multiple of the light strobe period, banding will occur; if the exposure time is an integer multiple of the light strobe period, banding will be avoided.
• when the scene detection result is a scene with lights, the AE algorithm generates the following control instruction: instruct the image sensor 202 to respectively output long-frame images with exposure time T1 and short-frame images with exposure time T2; where T1 is greater than T2, and both T1 and T2 are required to be integer multiples of 10ms.
  • T1 can be set to 20ms
  • T2 can be set to 10ms to avoid banding in scenes with lights.
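The flicker-safe choice of T1 and T2 can be sketched as a small search: pick the largest exposure times that are integer multiples of the strobe period and still fit the total budget. The greedy search strategy and the function name are illustrative assumptions:

```python
# Hypothetical helper: find (T1, T2) with T1 > T2, both integer
# multiples of the strobe period, and T1 + T2 within the budget.
def flicker_safe_times(budget_ms: float, period_ms: float = 10.0):
    """Return the largest such (T1, T2) pair, or None if none fits."""
    n_max = int(budget_ms // period_ms)
    for n1 in range(n_max, 1, -1):          # candidate multiples for T1
        for n2 in range(n1 - 1, 0, -1):     # T2 strictly smaller than T1
            if (n1 + n2) * period_ms <= budget_ms:
                return n1 * period_ms, n2 * period_ms
    return None

assert flicker_safe_times(33.0) == (20.0, 10.0)  # the example in the text
```

With the ~33ms budget of a 60fps sensor and a 10ms strobe period, the only feasible pair is T1 = 20ms, T2 = 10ms, matching the values above.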
• for example, the aperture size and the ISO sensitivity value can be fixed, and the imaging exposure controlled by adjusting the exposure time.
• alternatively, if the exposure time is fixed, the imaging exposure can be controlled by adjusting the aperture size or the ISO sensitivity value, so that the exposure of different frames of images changes.
  • the image processor 201 may send the AE algorithm control instruction to the image sensor 202 .
• in step (2), after the image sensor 202 receives the AE algorithm control instruction sent by the image processor 201, the image sensor 202 can continuously output multiple frames of images at the preset frame rate according to the instruction, and store the output frames of images in the memory 203. Specifically, the image sensor 202 outputs images by alternately outputting long-frame images L and short-frame images S with different exposure times.
• in step (3), when the image processor 201 reads images from the memory 203, it reads in a frame-multiplexing manner: each frame of image can be read twice, so that frame multiplexing is realized when long-frame images and short-frame images are registered and fused. Further, the image processor 201 registers each read long-frame image and short-frame image.
  • step (4) the image processor 201 performs weighted fusion on the registered two frames of images.
  • step (5) the image processor 201 performs HDR synthesis once every two frames of images are registered, so it can output 60 frames of images per second, and finally output 60fps HDR video.
• for details of step (2) to step (5), refer to the detailed description of step (2) to step (5) in the first embodiment above, which is not repeated here.
  • an anti-flicker (flicker) sensor may be used to detect a light flickering scene.
  • any other possible sensor can also be used to detect the light strobe scene, which can be determined according to actual usage requirements, and is not limited in the embodiment of the present application.
• it can be detected whether bright spots and dark spots appear alternately in the images collected by the camera. If bright spots and dark spots appear alternately, it means that there is light flickering in the current shooting environment, and the scene can be determined to be a light flickering scene; otherwise, it is not a light strobe scene.
• (a) and (b) in FIG. 8 respectively show two consecutive frames of images collected by a mobile phone camera; there are bright spots in the oval dashed box in FIG. 8(a), and dark spots in the oval dashed box in FIG. 8(b). In this case, bright spots and dark spots appear alternately in two consecutive frames of images, indicating that there is light flickering in the current shooting environment, that is, the current shooting scene is a strobe-light scene.
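A crude version of this bright/dark comparison between consecutive frames might look as follows. With a rolling shutter, mains flicker tends to appear as horizontal bands whose phase alternates between frames, so per-row brightness differences alternate in sign. The per-row statistics and the threshold are assumptions for illustration, not the patent's detection method:

```python
# Hypothetical flicker check: compare per-row mean luminance of two
# consecutive frames and look for alternating, strong differences.
def looks_like_flicker(frame_a, frame_b, threshold=0.2):
    """frame_a/frame_b: 2D lists of luminance values (rows of pixels)."""
    diffs = [sum(ra) / len(ra) - sum(rb) / len(rb)
             for ra, rb in zip(frame_a, frame_b)]
    sign_flips = sum(1 for d0, d1 in zip(diffs, diffs[1:]) if d0 * d1 < 0)
    strong = sum(1 for d in diffs if abs(d) > threshold)
    return sign_flips > 0 and strong >= len(diffs) // 2
```

In practice a dedicated anti-flicker sensor (as mentioned above) would normally be used instead of image analysis.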
• the aforementioned image sensor may be a complementary metal-oxide-semiconductor (CMOS) image sensor, a charge-coupled device (CCD) image sensor, or any other sensor that meets actual usage requirements, which is not limited in this embodiment of the present application.
  • CMOS complementary metal-oxide semiconductor
  • CCD charge coupled device
  • an image sensor is also called a photosensitive element.
  • the CMOS image sensor scans and exposes row by row until all pixels are exposed.
  • the light energy received by each pixel of the image sensor may be different, that is, the brightness of the image may be different, which may cause a banding phenomenon.
  • the electronic device when the electronic device turns on the HDR mode, it can capture high-quality images and videos with a high dynamic range.
  • the HDR algorithm supports the combination of multiple images taken at the same time and at different exposure levels to generate a high dynamic range HDR image in real time, with unique highlights, shadows, and contrast.
  • the banding phenomenon can be avoided by controlling the exposure parameters.
  • Embodiment 3 Video HDR recording is realized under different environmental brightness conditions.
  • the video HDR recording method in the first embodiment and the second embodiment above may be used.
  • the preset brightness condition is met, that is, it can be determined that the current shooting scene is an HDR applicable scene.
• otherwise, the video HDR recording method of the first embodiment and the second embodiment above is not used, and video HDR recording can be performed according to the conventional video recording method.
• in the conventional frame-output manner, the image sensor alternately outputs long-frame images and short-frame images; taking an output of 60 frames per second as an example, it sequentially outputs long-frame image L1, short-frame image S1, long-frame image L2, short-frame image S2, and so on, up to long-frame image L30 and short-frame image S30. Then, long-frame image L1 and short-frame image S1 are fused to obtain image H1, long-frame image L2 and short-frame image S2 are fused to obtain image H2, and so on, until long-frame image L30 and short-frame image S30 are fused to obtain image H30. Finally, 30 frames of images are output per second, which reduces the video output frame rate to 30fps.
• when the electronic device determines that the current shooting scene is an HDR-applicable scene, the video HDR recording method of the first embodiment and the second embodiment above is adopted; when the electronic device determines that the current shooting scene is a non-HDR-applicable scene, the conventional video recording method is used instead. The method can thus be switched flexibly according to the shooting scene to improve the image shooting effect.
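The frame-rate arithmetic behind this switch can be made explicit. The helper names are illustrative; "conventional" fuses disjoint (L, S) pairs, while frame multiplexing fuses overlapping pairs:

```python
# Output frame rate for the two fusion strategies described above.
def conventional_output_fps(sensor_fps: int) -> int:
    """Disjoint pairs (L1,S1), (L2,S2), ...: rate is halved."""
    return sensor_fps // 2

def multiplexed_output_fps(sensor_fps: int) -> int:
    """Overlapping pairs: one output per input frame, except the last
    frame of the sequence, which has no successor to fuse with."""
    return sensor_fps - 1

assert conventional_output_fps(60) == 30
assert multiplexed_output_fps(60) == 59   # effectively 60fps in steady state
```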
  • the AE algorithm can calculate the long-frame exposure time and short-frame exposure time, and the exposure ratio of the short-frame image to the long-frame image according to the dynamic range of the scene and the ambient brightness.
• for example, the exposure time of the short-frame image can be 1ms and the exposure time of the long-frame image can be 4ms; or, the exposure time of the short-frame image can be 2ms and the exposure time of the long-frame image can be 8ms; or, the exposure time of the short-frame image can be 4ms and the exposure time of the long-frame image can be 16ms.
  • the exposure ratio is 1:4 as an example for illustration.
• the exposure ratio can also be 1:2, 1:8, or any other possible value, which is not limited in this embodiment of the present application.
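The ratio bookkeeping above can be captured in a small helper; the function name and the 1:4 default mirror the example in the text and are illustrative only:

```python
# Hypothetical helper: derive the short-frame exposure time from the
# long-frame time and a preset (short : long) exposure ratio.
def short_time_from_ratio(long_ms: float, ratio=(1, 4)) -> float:
    """ratio is (short, long); e.g. (1, 4) means the short frame gets
    one quarter of the long frame's exposure time."""
    s, l = ratio
    return long_ms * s / l

assert short_time_from_ratio(4.0) == 1.0    # 1ms / 4ms pair
assert short_time_from_ratio(16.0) == 4.0   # 4ms / 16ms pair
```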
• the exposure times of the long-frame images and short-frame images in the above embodiments are exemplary, and the specific values are not limited thereto; they may be determined according to actual usage requirements during specific implementation, and are not limited in this embodiment of the present application.
  • the exposure parameter is taken as the exposure time as an example for illustration.
  • the exposure parameter includes not only the exposure time, but also the ISO sensitivity value.
  • the exposure parameter includes the exposure time and the ISO sensitivity value
  • the product value of the exposure time and the ISO sensitivity value may be used as the exposure parameter.
  • the exposure time of the short-frame image can be 2ms
  • the ISO sensitivity value of the short-frame image can be 200
  • the exposure parameter of the short-frame image is 2*200
  • the exposure time can be 16ms
  • the ISO sensitivity value of the long-frame image can be 100
  • the exposure parameter of the long-frame image is 16*100.
  • the exposure time of the short-frame image can be 10ms
  • the ISO sensitivity value of the short-frame image can be 100
  • the exposure parameter of the short-frame image is 10*100
• the exposure time of the long-frame image can be 10ms
  • the ISO sensitivity value of the long-frame image can be 200
  • the exposure parameter of the long-frame image is 10*200.
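The product rule for the exposure parameter can be checked numerically. The `exposure` helper is an illustrative name; the values are the ones given in the text:

```python
# Exposure parameter defined as exposure time (ms) * ISO sensitivity.
def exposure(time_ms: float, iso: int) -> float:
    return time_ms * iso

# First example: different times and ISOs, 1:4 exposure ratio.
short = exposure(2, 200)    # 2 * 200 = 400
long_ = exposure(16, 100)   # 16 * 100 = 1600, i.e. 4x the short frame

# Second example: equal times, ratio achieved purely through ISO.
eq_short = exposure(10, 100)  # 1000
eq_long = exposure(10, 200)   # 2000, i.e. 2x despite equal exposure times
```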
• the exposure times and ISO sensitivity values in the above embodiments are exemplary, and the specific values are not limited thereto; they can be determined according to actual usage requirements during specific implementation, and are not limited in this embodiment of the present application.
  • the electronic device implementing the method includes hardware structures and/or software modules corresponding to each function.
  • the present application can be implemented in the form of hardware or a combination of hardware and computer software in combination with the units and algorithm steps of each example described in the embodiments disclosed herein. Whether a certain function is executed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. Professionals may use different methods to implement the described functions for each specific application, but such implementation should not be regarded as exceeding the protection scope of the present application.
  • the embodiment of the present application may divide the electronic device into functional modules according to the above method examples.
  • each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
  • the above-mentioned integrated modules can be implemented in the form of hardware or in the form of software function modules. It should be noted that the division of modules in the embodiment of the present application is schematic, and is only a logical function division, and there may be other feasible division methods in actual implementation. In the following, description will be made by taking the division of each functional module corresponding to each function as an example.
  • FIG. 9 is a schematic block diagram of an HDR image processing apparatus 800 provided by an embodiment of the present application.
  • the HDR image processing apparatus 800 may be used to perform the actions performed by the electronic device in the above method embodiments.
  • the HDR image processing apparatus 800 includes an image acquisition unit 810 , an image processing unit 820 and a display unit 830 .
• the image acquisition unit 810 is configured to continuously acquire images at a preset frame rate in response to the user's first operation to obtain M frames of images; the M frames of images include alternately arranged long-exposure images and short-exposure images, and the exposure of a long-exposure image is greater than the exposure of a short-exposure image;
  • the image processing unit 820 is configured to fuse each frame of the M frames of images with the previous frame of images, and multiplex the each frame of images for fusion with the next frame of images;
  • the display unit 830 is configured to sequentially output and display the fused HDR images according to a preset frame rate, and form HDR videos from multiple frames of fused HDR images.
• the image sensor alternately outputs long-exposure images and short-exposure images at a specific frame rate; the image processor then fuses each frame of image with the previous frame of image, and multiplexes each frame of image for fusion with the next frame of image, so that the continuously output long-exposure and short-exposure images are fused in pairs. The multiple fused images are then output as HDR video. Since each frame of image is multiplexed once, the frame rate of the HDR video finally output by the camera is consistent with the frame rate of the images output by the image sensor.
• This application scheme can increase the output frame rate of the HDR video, and can also avoid the banding phenomenon caused by light strobing in the video.
  • the preset frame rate may be 60 frames per second, that is, the image sensor may collect 60 frames of images per second.
• M can be an integer greater than or equal to 3.
• long-exposure images and short-exposure images are arranged alternately; that is, after one frame of long-exposure image is collected, one frame of short-exposure image is collected, and after one frame of short-exposure image is collected, one frame of long-exposure image is collected, and so on. At 60 frames per second, 30 frames are long-exposure images and 30 frames are short-exposure images.
  • the HDR image processing apparatus 800 may further include an image storage unit 840 .
  • An image storage unit 840 configured to store the M frames of images acquired by the image acquisition unit 810;
• the fusing of each frame image in the M frames of images with the previous frame image, and the multiplexing of each frame image for fusion with the next frame image, includes:
• the image processing unit 820 reads the (i-1)-th frame image and the i-th frame image from the image storage unit 840, and fuses the (i-1)-th frame image with the i-th frame image to obtain the (i-1)-th frame HDR image;
• the image processing unit 820 reads the i-th frame image and the (i+1)-th frame image from the image storage unit 840, and fuses the i-th frame image with the (i+1)-th frame image to obtain the i-th frame HDR image;
• where i takes 2, ..., M-1 in turn.
• for example, the image processing unit 820 reads the first frame image and the second frame image from the image storage unit 840, and fuses the first frame image with the second frame image to obtain the first frame HDR image; then, the image processing unit 820 reads the second frame image and the third frame image from the image storage unit 840, and fuses the second frame image with the third frame image to obtain the second frame HDR image;
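The index bookkeeping can be verified with a small helper (illustrative only): with 1-based numbering, the i-th HDR frame is the fusion of source frames i and i+1, for i = 1 to M-1:

```python
# Sanity check of the fusion indexing: M source frames yield M-1
# HDR frames, the i-th HDR frame pairing source frames i and i+1.
def hdr_index_pairs(M: int):
    return [(i, i + 1) for i in range(1, M)]

pairs = hdr_index_pairs(5)
# [(1, 2), (2, 3), (3, 4), (4, 5)]: 4 HDR frames from 5 source frames
```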
• and so on: the continuously output long-exposure images and short-exposure images are fused in pairs, and the multiple fused images are then output to realize HDR video output. Since each frame of image is multiplexed once, the video output frame rate is not reduced when a long-exposure image and a short-exposure image are fused into one frame of image.
• as for the HDR mode, when shooting against the light or when the contrast between bright and dark light in the shooting environment is relatively large, the HDR mode can simultaneously improve the dark portions and the bright portions of a photo. Compared with images taken in normal mode, photos or videos taken in HDR mode can better present the details of bright and dark portions, improving the image display effect.
  • the image acquisition unit 810 is specifically configured to:
  • the exposure degree corresponding to the first exposure parameter is greater than the exposure degree corresponding to the second exposure parameter.
  • the exposure time of the long exposure image is longer than the exposure time of the short exposure image.
  • the image processing unit 820 is further configured to: determine the total exposure time according to the preset frame rate.
• for example, the image sensor outputs 60 frames of images per second. If a long-exposure image and a short-exposure image form one group, 30 groups are output per second, and the exposure period of each group is 1/30 second, approximately 33ms; that is to say, the total exposure time of the long-exposure image and the short-exposure image in each group is about 33ms.
  • the allocation principle is: the sum of the exposure time assigned to the long-exposure image and the exposure time assigned to the short-exposure image is less than or equal to the total exposure time of 33ms.
  • the image processing unit 820 is further configured to: when the exposure parameter of the first camera includes an exposure time, determine an exposure degree corresponding to the exposure parameter according to the exposure time.
  • the image processing unit 820 is further configured to: when the exposure parameters of the first camera include exposure time and ISO sensitivity value, determine the exposure parameter corresponding to the product value of the exposure time and the ISO sensitivity value. Exposure.
  • the exposure parameter when the exposure parameter includes the exposure time and the ISO sensitivity value, the product value of the exposure time and the ISO sensitivity value may be used as the corresponding exposure degree.
  • the image processing unit 820 is further configured to:
  • the first automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure time.
  • the exposure time of the short-exposure image can be 1ms, and the exposure time of the long-exposure image can be 4ms; or, the exposure time of the short-exposure image can be 2ms, and the exposure time of the long-exposure image can be 8ms; or, the exposure time of the short-exposure image can be 4ms, and the exposure time of the long-exposure image can be 16ms.
  • the length of the exposure time here is relative, and the specific value of the exposure time can be determined according to actual usage requirements, which is not limited in this embodiment of the present application.
• the image processing unit 820 is further configured to: use the second automatic exposure algorithm to determine the first exposure time corresponding to the long-exposure image and the second exposure time corresponding to the short-exposure image;
  • the second automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure time, and both the first exposure time and the second exposure time are light Integer multiples of the strobe period.
  • the application scheme proposes a corresponding processing strategy: by controlling the exposure time, the exposure time of the long-exposure image and the short-exposure image are both integral multiples of the light strobe cycle, so as to avoid the banding phenomenon and improve the image display effect.
  • the strobe cycle of the lights is 10 milliseconds.
  • the maximum value of the total exposure time of the long exposure image and the short exposure image is (1/30) second, which is about 33ms.
  • the exposure time of the long-exposure image and the exposure time of the short-exposure image can be 10ms at the same time.
  • the exposure time of the long-exposure image is 20ms
  • the exposure time of the short-exposure image is 10ms.
  • the condition that each exposure time is an integer multiple of the 10 ms strobe period avoids the banding phenomenon in shooting scenes with fluorescent lights.
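The constraint of the second automatic exposure algorithm can be sketched like this. The patent only states the constraints (sum within budget, each time a multiple of the strobe period), not how the times are chosen, so the allocation policy below is an illustrative assumption:

```python
def strobe_safe_exposures(total_ms, period_ms=10):
    """Pick a pair of exposure times (long, short) that are both integer
    multiples of the light strobe period and whose sum fits within the
    total budget, per the second automatic exposure algorithm.
    Illustrative policy: give the short frame one period and the long
    frame the remaining whole periods."""
    budget_units = int(total_ms // period_ms)   # whole strobe periods available
    if budget_units < 2:
        raise ValueError("budget too small for two strobe-safe exposures")
    short_ms = period_ms
    long_ms = (budget_units - 1) * period_ms
    return long_ms, short_ms

# A 33 ms budget with a 10 ms strobe period yields 20 ms long / 10 ms short,
# matching the example in the text.
```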
  • the determining the first exposure time and the second exposure time includes:
  • the preset exposure ratio is the ratio between the first exposure time and the second exposure time; or, the preset exposure ratio is the ratio between the first product value and the second product value,
  • the first product value is the product value of the first exposure time and the first ISO sensitivity value
  • the second product value is the product value of the second exposure time and the second ISO sensitivity value.
  • the exposure time of the short-exposure image can be 2ms
  • the ISO sensitivity value of the short-exposure image can be 200
  • the exposure parameter of the short-exposure image is 2*200
  • the exposure time can be 16ms
  • the ISO sensitivity value of the long exposure image can be 100
  • the exposure parameter of the long exposure image is 16*100.
  • the exposure time of the short-exposure image can be 10ms
  • the ISO sensitivity value of the short-exposure image can be 100
  • the exposure parameter of the short-exposure image is 10*100
  • the exposure time of the long-exposure image can be 10ms
  • the ISO sensitivity value of the long-exposure image can be 200
  • the exposure parameter of the long-exposure image is 10*200.
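The exposure-degree arithmetic in the examples above can be captured in a small helper; the function name is an assumption for illustration, and the numbers are taken from the text:

```python
def exposure_degree(exposure_ms, iso=None):
    """Exposure degree of a frame: the exposure time itself when only
    the time is known, otherwise the product of exposure time and ISO
    sensitivity value, as described in the embodiments."""
    return exposure_ms if iso is None else exposure_ms * iso

# Short 2 ms @ ISO 200 vs long 16 ms @ ISO 100 -> 400 : 1600, i.e. 1:4.
# Short 10 ms @ ISO 100 vs long 10 ms @ ISO 200 -> 1000 : 2000, i.e. 1:2.
ratio_a = exposure_degree(2, 200) / exposure_degree(16, 100)   # 0.25
ratio_b = exposure_degree(10, 100) / exposure_degree(10, 200)  # 0.5
```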
  • the image acquisition unit 810 is also used for:
  • the HDR mode of the camera is turned on in response to the second operation of the user.
  • the user can trigger the camera to turn on the HDR mode, and after the camera's HDR mode is turned on, HDR images or HDR videos can be captured to improve user experience.
  • the image acquisition unit 810 is also used for:
  • when it is detected that the ambient brightness of the current shooting scene is greater than or equal to the preset brightness threshold, the camera's HDR mode is automatically turned on;
  • when it is detected that the ambient brightness of the current shooting scene is lower than the preset brightness threshold, the camera's HDR mode is automatically turned off.
  • when the preset brightness condition is met, the current shooting scene can be determined to be an HDR-applicable scene, and the camera's HDR mode is automatically turned on, improving the human-machine interaction experience.
  • the image processing unit 820 is further configured to:
  • the long-exposure images and short-exposure images are read in turn from the image storage unit in a frame-multiplexed manner: each time, one long-exposure image and one short-exposure image are read and registered, and likewise one short-exposure image and one long-exposure image are read and registered. Further, weighted fusion is performed on the two registered frames to obtain an HDR image, which improves the image display effect.
  • the registering each frame of the M frames of images with the previous frame of images, and multiplexing the registration of each frame of images with the next frame of images includes:
  • aligning the short-exposure image using the long-exposure image as a reference; or, aligning the long-exposure image using the short-exposure image as a reference.
  • the scheme of this application registers and fuses the continuously output long-exposure images and short-exposure images, and then outputs multiple images obtained after fusion to realize HDR video output. Since each frame of image is multiplexed once, the video output frame rate will not be reduced after fusing the long exposure image and the short exposure image into one frame of image.
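The per-pair weighted fusion described above can be sketched as follows. This is a minimal illustration with a single uniform weight; real HDR fusion uses motion registration and spatially varying weights, neither of which is shown here:

```python
def fuse_registered_pair(long_img, short_img, w_long=0.5):
    """Weighted per-pixel fusion of an already-registered long/short
    exposure pair (images as nested lists of pixel values)."""
    w_short = 1.0 - w_long
    return [
        [w_long * l + w_short * s for l, s in zip(row_l, row_s)]
        for row_l, row_s in zip(long_img, short_img)
    ]

long_img = [[200, 250], [240, 255]]   # bright frame, highlights may clip
short_img = [[100, 120], [110, 130]]  # darker frame preserving highlights
hdr = fuse_registered_pair(long_img, short_img)
# With equal weights each output pixel is the average, e.g. hdr[0][0] == 150.0
```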
  • the above-mentioned image storage unit is a double data rate (DDR) synchronous dynamic random access memory.
  • the HDR image processing device 800 may correspond to the implementation of the method described in the embodiments of the present application, and the above and other operations and/or functions of the units in the HDR image processing device 800 are intended to realize the corresponding flows of the method; for the sake of brevity, details are not repeated here.
  • FIG. 11 is a schematic structural diagram of an electronic device 900 provided by an embodiment of the present application.
  • the electronic device 900 may include a processor 910, an external memory interface 920, an internal memory 921, a universal serial bus (universal serial bus, USB) interface 930, a charge management module 940, a power management unit 941, a battery 942, an antenna 1, and an antenna 2.
  • the sensor module 980 may include a pressure sensor 980A, a gyroscope sensor 980B, an air pressure sensor 980C, a magnetic sensor 980D, an acceleration sensor 980E, a distance sensor 980F, a proximity light sensor 980G, a fingerprint sensor 980H, a temperature sensor 980I, a touch sensor 980J, and ambient light sensor 980K and bone conduction sensor 980L and the like.
  • the structure illustrated in the embodiment of the present application does not constitute a specific limitation on the electronic device 900 .
  • the electronic device 900 may include more or fewer components than shown in the figure, or combine certain components, or separate certain components, or arrange different components.
  • the illustrated components can be realized in hardware, software or a combination of software and hardware.
  • the processor 910 may include one or more processing units, for example: an application processor (AP), a modem processor, a graphics processing unit (GPU), an image signal processor (ISP), a controller, a memory, a video codec, a digital signal processor (DSP), a baseband processor, and/or a neural-network processing unit (NPU), etc. Different processing units may be independent devices, or may be integrated in one or more processors. The controller may be the nerve center and command center of the electronic device 900. The controller can generate operation control signals (such as control signals for image acquisition, control signals for adjusting exposure parameters, control signals for light-strobe scenes, etc.) to complete the control of instruction fetching and instruction execution.
  • a memory may also be provided in the processor 910 for storing instructions and data, for example, for storing the collected long-exposure images and short-exposure images, and for storing the HDR video obtained by registering and fusing the long-exposure images and short-exposure images.
  • the memory in processor 910 is a cache memory.
  • the memory may hold instructions or data that the processor 910 has just used or recycled. If the processor 910 needs to use the instruction or data again, it can be directly recalled from the memory. Repeated access is avoided, and the waiting time of the processor 910 is reduced, thereby improving the efficiency of the system.
  • the processor 910 may be configured to execute the above program codes, and call related modules to implement the HDR image processing function of the electronic device in the embodiment of the present application.
  • processor 910 may include one or more interfaces.
  • the interface may include an inter-integrated circuit (I2C) interface, a pulse code modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a general-purpose input/output (GPIO) interface, and/or a universal serial bus (USB) interface, etc.
  • the electronic device 900 implements a display function through a GPU, a display screen 994, an application processor, and the like.
  • the GPU is a microprocessor for image processing, connected to the display screen 994 and the application processor. GPUs are used to perform mathematical and geometric calculations for graphics rendering.
  • Processor 910 may include one or more GPUs that execute program instructions to generate or alter display information.
  • the display screen 994 is used for displaying images or videos, for example, for displaying HDR images or videos.
  • Display 994 includes a display panel.
  • the display panel can be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a MiniLED, a MicroLED, a Micro-OLED, quantum dot light-emitting diodes (QLED), etc.
  • the electronic device 900 may include 1 or N display screens 994, where N is a positive integer greater than 1.
  • the electronic device 900 can realize the shooting function through an ISP, a camera 993 , a video codec, a GPU, a display screen 994 , an application processor, and the like.
  • the ISP is used to process the data fed back by the camera 993. For example, when capturing an image, the shutter is opened, the light is transmitted to the photosensitive element of the camera through the lens, and the light signal is converted into an electrical signal, and the photosensitive element of the camera transmits the electrical signal to the ISP for processing, and converts it into an image visible to the naked eye.
  • ISP can also perform algorithm optimization on image noise, brightness, and skin tone. ISP can also optimize the exposure, color temperature and other parameters of the shooting scene.
  • the ISP may be located in the camera 993 .
  • the camera 993 is used to capture still images or videos, for example, to collect long exposure images and short exposure images.
  • the object generates an optical image through the lens and projects it to the photosensitive element.
  • the photosensitive element may be a charge coupled device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor.
  • the photosensitive element converts the light signal into an electrical signal, and then transmits the electrical signal to the ISP to convert it into a digital image signal.
  • the ISP outputs the digital image signal to the DSP for processing.
  • DSP converts digital image signals into standard RGB, YUV and other image signals.
  • the electronic device 900 may include 1 or N cameras 993, where N is a positive integer greater than 1.
  • the NPU is a neural-network (NN) computing processor.
  • Applications such as intelligent cognition of the electronic device 900 can be implemented through the NPU, such as image recognition, text recognition, and text understanding.
  • the external memory interface 920 may be used to connect an external memory card, such as a Micro SD card, to expand the storage capacity of the electronic device 900.
  • the external memory card communicates with the processor 910 through the external memory interface 920 to implement a data storage function. For example, files such as long-exposure images and short-exposure images collected, as well as HDR videos obtained by registration and fusion of long-exposure images and short-exposure images are saved in an external memory card.
  • the internal memory 921 may be used to store computer-executable program codes including instructions.
  • the processor 910 executes various functional applications and data processing of the electronic device 900 by executing instructions stored in the internal memory 921 .
  • the internal memory 921 may include an area for storing programs and an area for storing data.
  • the storage program area can store an operating system, an application program required by at least one function (such as an HDR image processing function, etc.), and the like.
  • the data storage area can store data created during use of the electronic device 900 (such as HDR image data, etc.).
  • the internal memory 921 may include a high-speed random access memory, and may also include a non-volatile memory, such as at least one magnetic disk storage device, flash memory device, universal flash storage (universal flash storage, UFS) and the like.
  • the pressure sensor 980A is used to sense the pressure signal and convert the pressure signal into an electrical signal.
  • pressure sensor 980A may be located on display screen 994 .
  • pressure sensors 980A such as resistive pressure sensors, inductive pressure sensors, and capacitive pressure sensors.
  • a capacitive pressure sensor may be comprised of at least two parallel plates with conductive material.
  • the electronic device 900 may also calculate the touched position according to the detection signal of the pressure sensor 980A.
  • touch operations acting on the same touch position but with different touch operation intensities may correspond to different operation instructions. For example: when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the HDR control in the camera application, an instruction to enable the HDR image processing function is executed.
  • the gyro sensor 980B can be used to determine the motion posture of the electronic device 900 .
  • the angular velocity of the electronic device 900 about three axes can be determined by the gyro sensor 980B.
  • the gyro sensor 980B can be used for image stabilization. Exemplarily, when the shutter is pressed, the gyro sensor 980B detects the shaking angle of the electronic device 900, calculates the distance that the lens module needs to compensate according to the angle, and allows the lens to counteract the shaking of the electronic device 900 through reverse movement to achieve anti-shake.
  • the gyroscope sensor 980B can also be used for navigation and somatosensory game scenes.
  • the acceleration sensor 980E can detect the acceleration of the electronic device 900 in various directions (generally three axes). When the electronic device 900 is stationary, it can detect the magnitude and direction of gravity, and can also be used to identify the posture of the electronic device, and can be used in applications such as switching between horizontal and vertical screens.
  • the distance sensor 980F is used to measure distance.
  • the electronic device 900 can measure the distance by infrared or laser. In some embodiments, in an image collection scene, the electronic device 900 can use the distance sensor 980F to measure distance to achieve fast focusing.
  • the ambient light sensor 980K is used for sensing ambient light brightness.
  • the electronic device 900 may judge whether the current shooting environment satisfies the condition for triggering the HDR image processing function according to the perceived brightness of the environment, and may automatically enable the HDR image processing function if the condition is satisfied.
  • the electronic device 900 can adaptively adjust the brightness of the display screen 994 according to the perceived ambient light brightness.
  • the ambient light sensor 980K can also be used to automatically adjust the white balance during image capture.
  • Touch sensor 980J, also known as a "touch panel".
  • the touch sensor 980J can be arranged on the display screen 994, and the touch sensor 980J and the display screen 994 form a touch screen, also called a "touchscreen".
  • the touch sensor 980J is used to detect a touch operation on or near it.
  • the touch sensor can pass the detected touch operation to the application processor to determine the type of touch event.
  • Visual output related to touch operations can be provided through the display screen 994 .
  • the touch sensor 980J may also be disposed on the surface of the electronic device 900 , which is different from the position of the display screen 994 .
  • the keys 990 include a power key, a volume key, an HDR control, and the like.
  • the key 990 may be a mechanical key. It can also be a touch button.
  • the electronic device 900 can receive key input and generate key signal input related to user settings and function control of the electronic device 900. For example, when the electronic device receives the user's input on the HDR control, the electronic device 900 can generate an instruction to trigger enabling the HDR image processing function.
  • the motor 991 can generate vibration prompts.
  • the motor 991 can be used for incoming call vibration prompts, and can also be used for touch vibration feedback.
  • touch operations acting on different applications may correspond to different vibration feedback effects.
  • the motor 991 can also correspond to different vibration feedback effects for touch operations on different areas of the display screen 994 .
  • Different application scenarios can also correspond to different vibration feedback effects.
  • the touch vibration feedback effect can also support customization.
  • the electronic device 900 may be a mobile terminal or a non-mobile terminal.
  • the electronic device 900 may be a mobile phone, a tablet computer, a notebook computer, a handheld computer, a vehicle-mounted terminal, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), wireless earphones, a wireless bracelet, wireless smart glasses, a wireless watch, or augmented reality (AR)/virtual reality (VR) equipment.
  • the embodiment of the present application does not specifically limit the device type of the electronic device 900 .
  • the electronic device 900 shown in FIG. 11 may correspond to the HDR image processing apparatus 800 shown in FIG. 9 or 10 .
  • the processor 910, the camera 993, the display screen 994, and the internal memory 921 in the electronic device 900 shown in FIG. 11 may respectively correspond to the image processing unit 820, the image acquisition unit 810, the display unit 830, and the image storage unit 840 in the HDR image processing device 800 in FIG. 10.
  • the processor 910 executes computer-executable instructions in the memory 921 to execute the operation steps of the above-mentioned method through the electronic device 900 .
  • the present application provides a chip, the chip is coupled with a memory, and the chip is used to read and execute computer programs or instructions stored in the memory, so as to execute the methods in the foregoing embodiments.
  • the present application provides an electronic device, which includes a chip, and the chip is used to read and execute computer programs or instructions stored in a memory, so that the methods in each embodiment are executed.
  • the embodiment of the present application also provides a computer-readable storage medium, which stores computer program code; when the computer program code runs on a computer, the computer is caused to execute the methods in the above embodiments.
  • the embodiments of the present application further provide a computer program product, the computer program product including computer program code; when the computer program code runs on a computer, the computer is caused to execute the methods in the above embodiments.
  • the electronic device includes a hardware layer, an operating system layer running on the hardware layer, and an application layer running on the operating system layer.
  • the hardware layer may include hardware such as a central processing unit (central processing unit, CPU), a memory management unit (memory management unit, MMU), and memory (also called main memory).
  • the operating system of the operating system layer can be any one or more computer operating systems that realize business processing through processes, for example, Linux operating system, Unix operating system, Android operating system, iOS operating system, or windows operating system.
  • the application layer may include applications such as browsers, address books, word processing software, and instant messaging software.
  • the embodiment of the present application does not specifically limit the specific structure of the execution subject of the method provided in the embodiments of the present application, as long as it can run a program recording the code of the method provided in the embodiments of the present application and thereby perform communication according to that method.
  • the subject of execution of the method provided by the embodiment of the present application may be an electronic device, or a functional module in the electronic device capable of invoking a program and executing the program.
  • Computer-readable media may include, but are not limited to, magnetic storage devices (such as hard disks, floppy disks, or tapes, etc.), optical disks (such as compact discs (compact disc, CD), digital versatile discs (digital versatile disc, DVD), etc. ), smart cards and flash memory devices (for example, erasable programmable read-only memory (EPROM), card, stick or key drive, etc.).
  • Various storage media described herein can represent one or more devices and/or other machine-readable media for storing information.
  • the term "machine-readable medium” may include, but is not limited to, wireless channels and various other media capable of storing, containing and/or carrying instructions and/or data.
  • the processors mentioned in the embodiments of the present application may be a central processing unit (CPU), or may be other general-purpose processors, digital signal processors (DSP), application-specific integrated circuits (ASIC), field-programmable gate arrays (FPGA) or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc.
  • a general-purpose processor may be a microprocessor, or the processor may be any conventional processor, or the like.
  • the memory mentioned in the embodiments of the present application may be a volatile memory or a nonvolatile memory, or may include both volatile and nonvolatile memories.
  • the non-volatile memory can be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM) or flash memory.
  • the volatile memory may be random access memory (RAM).
  • RAM can be used as an external cache.
  • RAM may include the following forms: static random access memory (SRAM), dynamic random access memory (DRAM), synchronous dynamic random access memory (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced synchronous dynamic random access memory (ESDRAM), synchlink dynamic random access memory (SLDRAM) and direct rambus random access memory (DR RAM).
  • when the processor is a general-purpose processor, DSP, ASIC, FPGA or other programmable logic device, discrete gate or transistor logic device, or discrete hardware component, the memory (storage module) may be integrated in the processor.
  • memories described herein are intended to include, but are not limited to, these and any other suitable types of memories.
  • the disclosed systems, devices and methods may be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • the division of the units is only a logical function division. In actual implementation, there may be other division methods.
  • multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be through some interfaces, and the indirect coupling or communication connection of devices or units may be in electrical, mechanical or other forms.
  • the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units, that is, they may be located in one place, or may be distributed to multiple network units. Part or all of the units can be selected according to actual needs to achieve the purpose of the solution of this embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one unit, each unit may exist separately physically, or two or more units may be integrated into one unit.
  • if the functions described above are realized in the form of software functional units and sold or used as independent products, they can be stored in a computer-readable storage medium.
  • the essence of the technical solution of this application, or the part that contributes to the prior art, or part of the technical solution, can be embodied in the form of a computer software product, which is stored in a storage medium.
  • the computer software product includes several instructions, which are used to make a computer device (which may be a personal computer, server, or network device, etc.) execute all or part of the steps of the methods described in the various embodiments of the present application.
  • the aforementioned storage medium may include, but is not limited to: various media that can store program codes such as U disk, mobile hard disk, ROM, RAM, magnetic disk or optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract

The present application provides an HDR image processing method and an electronic device, relating to the field of image processing technology. In the solution of the present application, an image sensor alternately outputs long-frame images and short-frame images at a specific frame rate; an image processor then fuses each frame with the previous frame and reuses each frame by fusing it with the next frame. In this way, the continuously output long-frame and short-frame images are fused pairwise, and the multiple HDR images obtained after fusion are output to form an HDR video. Since each frame is reused once, the frame rate of the HDR video finally output by the camera matches the frame rate at which the image sensor outputs images. The solution of the present application can increase the output frame rate of HDR video and can also avoid banding (light flicker) in the video.

Description

HDR image processing method and electronic device
This application claims priority to Chinese patent application No. 202111095578.6, entitled "HDR image processing method and electronic device", filed with the China National Intellectual Property Administration on September 17, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present application relates to the field of image processing technology, and in particular to an HDR image processing method and an electronic device.
Background
Real-world scenes often have a very high dynamic range in brightness. The dynamic range perceived by the sensor in an ordinary digital imaging device (such as a camera or mobile phone) is much smaller, so when shooting a scene with a large dynamic range it is difficult to present all the details in a single exposure: overly bright or overly dark areas become overexposed or underexposed. To make the imaging result present rich color details and tonal gradations, and to better match the human eye's perception of real-world scenes, high dynamic range (HDR) imaging has become an increasingly popular imaging technique in digital imaging devices.
At present, HDR image processing methods are usually based on the same principle: a single camera exposes the same scene multiple times with different exposure amounts to cover the brightness range of the entire scene; these images of different exposure degrees are then combined into one HDR image, and multiple HDR images form an HDR video.
However, in a scene where images of different exposure degrees are output at a specific frame rate (for example, 120 frames per second), after the images of different exposure degrees are fused using the above HDR image processing method, the video output frame rate decreases (for example, to 60 frames per second), and the output video may suffer from banding (flicker), a reduced signal-to-noise ratio in low light, and other problems, thereby affecting the video playback effect.
Summary
The present application provides an HDR image processing method and an electronic device, which solve the problem in the prior art that HDR image processing reduces the video output frame rate.
To achieve the above objective, the present application adopts the following technical solutions:
In a first aspect, the present application provides an HDR image processing method, the method including:
in response to a first operation of a user, continuously capturing images at a preset frame rate to obtain M frames of images, where the M frames of images include alternately arranged long-exposure images and short-exposure images, and the exposure degree of the long-exposure images is greater than that of the short-exposure images;
fusing each frame of the M frames of images with the previous frame, and reusing each frame by fusing it with the next frame;
outputting the fused HDR images in sequence at the preset frame rate to obtain an HDR video.
In the solution of the present application, the image sensor alternately outputs long-exposure images and short-exposure images at a specific frame rate; the image processor then fuses each frame with the previous frame and reuses each frame by fusing it with the next frame. In this way, the continuously output long-exposure and short-exposure images are fused pairwise, and the multiple fused images, i.e., the HDR video, are output. Since each frame is reused once, the frame rate of the HDR video finally output by the camera matches the frame rate at which the image sensor outputs images. The solution of the present application can increase the output frame rate of HDR video and can also avoid banding (light flicker) in the video.
During video recording, long-exposure images and short-exposure images are captured alternately, and through frame multiplexing each frame is fused with the previous frame and with the next frame respectively, obtaining multiple HDR frames while keeping the frame rate unchanged.
Exemplarily, the preset frame rate may be 60 frames per second, i.e., the image sensor may capture 60 frames of images per second.
M may be an integer greater than or equal to 3. Taking M = 60 as an example, among the 60 frames continuously captured by the image sensor, long-exposure images and short-exposure images are arranged alternately: after one long-exposure frame is captured, one short-exposure frame is captured; after one short-exposure frame is captured, one long-exposure frame is captured; and so on. There are 30 long-exposure frames and 30 short-exposure frames.
In some embodiments, by fusing the short-exposure image and the long-exposure image, the highlights are not overexposed and the dark areas are not underexposed, more detail can be presented in the dark areas, and the image display effect is improved.
In some possible implementations, after the M frames of images are obtained, the method further includes: storing the M frames of images in an image memory;
wherein the fusing each frame of the M frames of images with the previous frame and reusing each frame by fusing it with the next frame includes:
reading the (i-1)-th frame and the i-th frame from the image memory, and fusing the (i-1)-th frame with the i-th frame to obtain the (i-1)-th HDR frame;
reading the i-th frame and the (i+1)-th frame from the image memory, and fusing the i-th frame with the (i+1)-th frame to obtain the i-th HDR frame;
where i takes the values 2, ..., M-1 in sequence.
Exemplarily, taking M = 60, with i taking the values 2, ..., M-1 in sequence: the image processor reads the 1st frame and the 2nd frame from the image memory and fuses them to obtain the 1st HDR frame;
then the image processor reads the 2nd and 3rd frames from the image memory and fuses them to obtain the 2nd HDR frame;
then the image processor reads the 3rd and 4th frames from the image memory and fuses them to obtain the 3rd HDR frame;
then the image processor reads the 4th and 5th frames from the image memory and fuses them to obtain the 4th HDR frame;
and so on, until the image processor reads the 58th and 59th frames from the image memory and fuses them to obtain the 58th HDR frame;
and finally reads the 59th and 60th frames from the image memory and fuses them to obtain the 59th HDR frame.
In the present application, the continuously output long-exposure and short-exposure images are fused pairwise, and the multiple fused images are output to realize HDR video output. Since each frame is reused once, fusing a long-exposure image and a short-exposure image into one frame does not reduce the video output frame rate.
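The sliding-window frame multiplexing in the worked example above can be sketched as follows; `fuse` stands in for the registration-and-fusion step, and the helper name is illustrative:

```python
def frame_multiplexed_hdr(frames, fuse):
    """Fuse each frame with its successor (frames i-1 and i yield HDR
    frame i-1), so every interior frame is reused once: M input frames
    produce M-1 HDR frames and the output rate stays close to the
    capture rate. `fuse` is any two-frame fusion function."""
    return [fuse(frames[i - 1], frames[i]) for i in range(1, len(frames))]

# 60 alternating long/short frames -> 59 HDR frames (pairs 1&2, 2&3, ...).
frames = [("L" if i % 2 == 0 else "S", i) for i in range(60)]
hdr_video = frame_multiplexed_hdr(frames, lambda a, b: (a, b))
```

Note that each output pair always contains one long and one short frame, because the input alternates.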
In practice, when shooting against the light or when the contrast between bright and dark areas of the shooting environment is large, the HDR mode can improve both the dark and bright areas of the photo. Compared with images shot in normal mode, photos or videos shot in HDR mode better present the details of bright areas and dark areas, improving the image display effect.
In some possible implementations, the continuously capturing images at a preset frame rate includes:
capturing, by a first camera, long-exposure images with a first exposure parameter and short-exposure images with a second exposure parameter;
wherein the exposure degree corresponding to the first exposure parameter is greater than the exposure degree corresponding to the second exposure parameter.
Exemplarily, the exposure time of the long-exposure image is greater than that of the short-exposure image.
In some possible implementations, the method further includes: determining the total exposure duration according to the preset frame rate.
Exemplarily, assume the image sensor outputs 60 frames per second. If one long-exposure image and one short-exposure image form a group, 30 groups are output per second, and the exposure duration of each group is 1/30 second, about 33 ms; that is, the total exposure duration of the long-exposure image and the short-exposure image in each group is 33 ms. When allocating exposure time to the long-exposure and short-exposure images in each group, the allocation principle is: the sum of the exposure time allocated to the long-exposure image and that allocated to the short-exposure image is less than or equal to the total exposure duration of 33 ms.
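The budget arithmetic above can be checked with a one-line helper (an illustrative sketch):

```python
def total_exposure_budget_ms(sensor_fps):
    """Total exposure budget for one long+short group: the sensor outputs
    sensor_fps frames per second, so there are sensor_fps / 2 groups per
    second, and each group gets 1 / (sensor_fps / 2) seconds in total."""
    return 1000.0 * 2 / sensor_fps

# At 60 fps each long+short group shares about 33 ms, as in the example.
budget = total_exposure_budget_ms(60)
```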
In some possible implementations, the method further includes: when the exposure parameter of the first camera includes exposure time, determining the exposure degree corresponding to the exposure parameter according to the exposure time.
In some possible implementations, the method further includes: when the exposure parameters of the first camera include exposure time and ISO sensitivity value, determining the exposure degree corresponding to the exposure parameters according to the product of the exposure time and the ISO sensitivity value.
When the exposure parameters include exposure time and ISO sensitivity value, the product of the exposure time and the ISO sensitivity value may be used as the corresponding exposure degree.
In some possible implementations, when the exposure parameter of the first camera is exposure time, the method further includes:
determining, using a first automatic exposure algorithm, a first exposure time corresponding to the long-exposure image and a second exposure time corresponding to the short-exposure image;
wherein the first automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure duration.
Exemplarily, assume the exposure ratio of the short-exposure image to the long-exposure image is 1:4. When the maximum total exposure duration is 33 ms, the exposure time of the short-exposure image may be 1 ms and that of the long-exposure image 4 ms; or the short-exposure image 2 ms and the long-exposure image 8 ms; or the short-exposure image 4 ms and the long-exposure image 16 ms.
It should be noted that the length of the exposure time here is relative; the specific values of the exposure time may be determined according to actual usage requirements, which is not limited in the embodiments of the present application.
In some possible implementations, when the exposure parameter of the first camera is exposure time, the method further includes:
when the current shooting scene is illuminated by fluorescent lights, determining, using a second automatic exposure algorithm, the first exposure time corresponding to the long-exposure image and the second exposure time corresponding to the short-exposure image;
wherein the second automatic exposure algorithm includes: the sum of the first exposure time and the second exposure time is less than or equal to the total exposure duration, and both the first exposure time and the second exposure time are integer multiples of the light strobe period.
In the embodiments of the present application, when fluorescent lights illuminate the current shooting scene (i.e., a light-strobe scene), banding may occur, affecting the video effect. In view of this, the solution of the present application proposes a corresponding processing strategy: by controlling the exposure time so that the exposure times of both the long-exposure image and the short-exposure image are integer multiples of the light strobe period, banding is avoided and the image display effect is improved.
In some possible implementations, the light strobe period is 10 milliseconds.
For example, when the image sensor outputs images at a frame rate of 60 fps, the maximum total exposure time of the long-exposure image and the short-exposure image is 1/30 second, about 33 ms. In this case, the exposure times of the long-exposure image and the short-exposure image can both be multiples of 10 ms, for example 20 ms for the long-exposure image and 10 ms for the short-exposure image, satisfying the condition that the exposure time is an integer multiple of the 10 ms light strobe period, so that banding can be avoided in shooting scenes with fluorescent lights.
在一些可能实施方式中,所述确定所述第一曝光时间和所述第二曝光时间,包括:
根据预设曝光比,确定所述第一曝光时间和所述第二曝光时间;
其中,所述预设曝光比为所述第一曝光时间和所述第二曝光时间之间的比值;或者,所述预设曝光比为第一乘积值和第二乘积值之间的比值,所述第一乘积值为所述第一曝光时间和第一ISO感光值的乘积值,所述第二乘积值为所述第二曝光时间和第二ISO感光值的乘积值。
示例性地,假设总曝光时长的最大值为33ms,短曝光图像的曝光时间可以为2ms,短曝光图像的ISO感光值可以为200,短曝光图像的曝光参数为2*200;长曝光图像的曝光时间可以为16ms,长曝光图像的ISO感光值可以为100,长曝光图像的曝光参数为16*100。在此情况下,短曝光图像与长曝光图像的曝光比为(2*200):(16*100)=1:4。
再示例性地,假设总曝光时长的最大值为33ms,短曝光图像的曝光时间可以为10ms,短曝光图像的ISO感光值可以为100,短曝光图像的曝光参数为10*100;长曝光图像的曝光时间可以为10ms,长曝光图像的ISO感光值可以为200,长曝光图像的曝光参数为10*200。在此情况下,短曝光图像与长曝光图像的曝光比为(10*100):(10*200)=1:2。
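上述以曝光时间与ISO感光值的乘积作为曝光度、再计算曝光比的过程,可以用下面的Python片段示意(函数名为说明而假设,数值沿用上文两个示例):

```python
from fractions import Fraction

def exposure_level(t_ms: int, iso: int) -> int:
    """曝光参数包括曝光时间和ISO感光值时, 以二者乘积作为曝光度。"""
    return t_ms * iso

def exposure_ratio(t_short, iso_short, t_long, iso_long) -> Fraction:
    """短曝光图像与长曝光图像的曝光比(约分后的比值)。"""
    return Fraction(exposure_level(t_short, iso_short),
                    exposure_level(t_long, iso_long))

print(exposure_ratio(2, 200, 16, 100))   # 1/4, 即曝光比1:4
print(exposure_ratio(10, 100, 10, 200))  # 1/2, 即曝光比1:2
```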
在一些可能实施方式中,在所述响应于用户的第一操作,以预设帧率连续采集图像,得到M帧图像之前,所述方法还包括:
响应于用户的第二操作,开启相机HDR模式。
其中,可以根据用户实际使用需求,由用户触发开启相机HDR模式,在相机HDR模式开启后可以拍摄得到HDR图像或者HDR视频,提升用户体验。
在一些可能实施方式中,所述方法还包括:
当检测到当前拍摄场景的环境亮度大于或等于预设亮度阈值时,自动开启相机HDR模式;
或者,当检测到当前拍摄场景的环境亮度小于预设亮度阈值时,自动关闭相机HDR模式。
其中,若当前拍摄场景(例如逆光拍摄场景)的环境亮度明暗对比度比较大,则满足预设亮度条件,即可以确定当前拍摄场景为HDR适用场景,自动开启相机HDR模式,提升人机交互体验。
在一些可能实施方式中,所述方法还包括:
将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准。
其中,按照帧复用的方式从图像存储器中依次读取长曝光图像和短曝光图像,交替地读取相邻的一长曝光图像和一短曝光图像(或一短曝光图像和一长曝光图像)并进行配准。进一步地,将配准后的两帧图像进行加权融合,得到HDR图像,提升图像显示效果。
在一些可能实施方式中,所述将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准,包括:
以所述长曝光图像为基准,对齐所述短曝光图像;
或者,以所述短曝光图像为基准,对齐所述长曝光图像。
在本申请中,本申请方案将连续输出的长曝光图像和短曝光图像进行两两配准并融合,然后输出融合后得到的多个图像,实现HDR视频输出。由于每一帧图像均被复用一次,图像处理器在每两帧图像配准完成之后进行一次HDR合成,因此可以实现帧率不降低的目的。
例如,若图像传感器每秒输出60帧图像,则经HDR算法可以输出60fps的HDR视频;若图像传感器每秒输出120帧图像,则经HDR算法可以输出120fps的HDR视频。通过本申请实施例提供的帧复用的HDR图像处理方法,可以实现最终输出HDR视频的帧率与图像传感器输出图像的帧率基本一致。
在一些可能实施方式中,所述图像存储器为双倍速率DDR同步动态随机存取存储器。
第二方面,本申请提供一种HDR图像处理装置,该装置包括用于执行上述第一方面中的方法的单元。该装置可对应于执行上述第一方面中描述的方法,该装置中的单元的相关描述请参照上述第一方面的描述,为了简洁,在此不再赘述。
其中,上述第一方面描述的方法可以通过硬件实现,也可以通过硬件执行相应的软件实现。硬件或软件包括一个或多个与上述功能相对应的模块或单元。例如,处理模块或单元、显示模块或单元等。
第三方面,本申请提供一种电子设备,所述电子设备包括处理器,处理器与存储器耦合,存储器用于存储计算机程序或指令,处理器用于执行存储器存储的计算机程序或指令,使得第一方面中的方法被执行。
例如,处理器用于执行存储器存储的计算机程序或指令,使得该装置执行第一方面中的方法。
第四方面,本申请提供一种计算机可读存储介质,其上存储有用于实现第一方面中的方法的计算机程序(也可称为指令或代码)。
例如,该计算机程序被计算机执行时,使得该计算机可以执行第一方面中的方法。
第五方面,本申请提供一种芯片,包括处理器。处理器用于读取并执行存储器中存储的计算机程序,以执行第一方面及其任意可能的实现方式中的方法。
可选地,所述芯片还包括存储器,存储器与处理器通过电路或电线连接。
第六方面,本申请提供一种芯片系统,包括处理器。处理器用于读取并执行存储器中存储的计算机程序,以执行第一方面及其任意可能的实现方式中的方法。
可选地,所述芯片系统还包括存储器,存储器与处理器通过电路或电线连接。
第七方面,本申请提供一种计算机程序产品,所述计算机程序产品包括计算机程序(也可称为指令或代码),所述计算机程序被计算机执行时使得所述计算机实现第一方面中的方法。
可以理解的是,上述第二方面至第七方面的有益效果可以参见上述第一方面中的相关 描述,在此不再赘述。
附图说明
图1为本申请实施例中曝光不足和曝光过度的图像示意图;
图2为本申请实施例提供的HDR图像处理方法中启用HDR模式的界面示意图;
图3为本申请实施例提供的一种HDR图像处理方法的流程示意图;
图4为本申请实施例提供的一种HDR图像处理方法的另一流程示意图;
图5为本申请实施例提供的一种HDR图像处理方法中对不同曝光度的图像进行帧复用的示意图;
图6为本申请实施例提供的一种HDR图像处理方法中将短帧图像和长帧图像进行融合的示意图;
图7为本申请实施例提供的一种HDR图像处理方法应用于灯光频闪场景中的流程示意图;
图8为本申请实施例提供的一种HDR图像处理方法中灯光频闪场景的检测示意图;
图9为本申请实施例提供的一种HDR图像处理装置的结构示意图;
图10为本申请实施例提供的另一种HDR图像处理装置的结构示意图;
图11为本申请实施例提供的电子设备的结构示意图。
具体实施方式
为使本申请实施例的目的、技术方案和优点更加清楚,下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域普通技术人员在没有做出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本文中术语“和/或”,是一种描述关联对象的关联关系,表示可以存在三种关系,例如,A和/或B,可以表示:单独存在A,同时存在A和B,单独存在B这三种情况。本文中符号“/”表示关联对象是或者的关系,例如A/B表示A或者B。
本文中的说明书和权利要求书中的术语“第一”和“第二”等是用于区别不同的对象,而不是用于描述对象的特定顺序。例如,第一图像和第二图像等是用于区别不同的图像,而不是用于描述图像的特定顺序。
在本申请实施例中,“示例性的”或者“例如”等词用于表示作例子、例证或说明。本申请实施例中被描述为“示例性的”或者“例如”的任何实施例或设计方案不应被解释为比其它实施例或设计方案更优选或更具优势。确切而言,使用“示例性的”或者“例如”等词旨在以具体方式呈现相关概念。
在本申请实施例的描述中,除非另有说明,“多个”的含义是指两个或者两个以上,例如,多个处理单元是指两个或者两个以上的处理单元等;多个元件是指两个或者两个以上的元件等。
为便于理解本申请实施例,以下对本申请实施例的部分用语进行解释说明,以便于本领域技术人员理解。
1)曝光度:指的是相机感受到光亮的强弱及时间的长短。当通过相机拍摄时,由于受到动态范围的限制,曝光度可能过高或者过低,会直接导致被拍摄对象和背景发生过曝或者欠曝。如果曝光过度,那么照片会太亮;如果曝光不足(欠曝光),那么照片会偏暗。
举例来说,当在明暗对比相差较大的拍摄环境下使用相机的普通模式拍照时,如图1中(a)的虚线框内所示,曝光不足的图像画面会呈现暗部全黑,如图1中(b)的虚线框内所示,曝光过度的图像画面会呈现亮部过曝。
由此可知,在过曝光时拍摄的图像中的亮部过曝,无法体现亮部细节;同样,在欠曝光时拍摄的图像中的暗部欠曝,也无法体现暗部细节。
2)曝光参数:包括光圈、快门速度和感光度。在实际实现时,可以通过控制光圈、快门速度和感光度这三个参数来得到合适的曝光度。
其中,通过控制快门速度,可以调整曝光时间。快门速度越快,曝光时间越短,曝光量就越少;相反,快门速度越慢,曝光时间越长,曝光量就越多,照片亮度增加。
其中,光圈越大(数值越小)、如F2.8,曝光量越多、照片亮度增加。光圈越小(数值越大)、如F16,曝光量减少、照片亮度降低。
其中,感光度用于衡量感光元器件对于光线的敏感程度。具体地,感光度越高,对光线的解析能力越强,感应到的光线越多,照片亮度增加;相反,感光度越低,对光线的解析能力越弱,感应到的光线就越少,照片亮度降低。
感光度常用ISO感光值表示,ISO感光值可以分为若干个档位,相邻档位的曝光量相差一倍:50、100、200、400、800、1600、3200...。ISO感光值越高,说明感光元器件的感光能力越强。
因此,可以通过设置或者调整光圈、快门速度和/或感光度等参数,控制拍摄时的曝光量,从而可以控制拍摄效果。通过合理调整曝光参数,可以使得生成的HDR图像色彩更鲜明、对比度更高、图像细节更清晰。
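上文所述"相邻ISO档位的曝光量相差一倍"的关系,可以用下面的Python片段验证(仅为示意,函数名为说明而假设):

```python
import math

ISO_STOPS = [50, 100, 200, 400, 800, 1600, 3200]

# 相邻档位的感光能力相差一倍
doubling = all(hi == 2 * lo for lo, hi in zip(ISO_STOPS, ISO_STOPS[1:]))

def stops_between(iso_a: int, iso_b: int) -> float:
    """两个ISO取值之间相差的档位数(以2为底的对数)。"""
    return math.log2(iso_b / iso_a)

print(doubling)                 # True
print(stops_between(100, 800))  # 3.0, 即相差3档
```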
具体到本申请方案,可以通过控制调整曝光时间来控制曝光参数变化,也可以通过调整ISO感光值来控制曝光参数变化,还可以通过调整光圈大小来控制曝光参数变化。为了便于说明,本文中以通过调整曝光时间来控制曝光参数变化为例进行示例性说明。
3)AE算法:自动曝光(auto exposure,AE)算法,通过自动控制光圈(镜头光圈大小)、ISO感光值和快门速度等曝光参数来调整曝光度。
4)banding现象:在日光灯作为光源的场景中,用户可以看到相机拍摄的画面出现一条一条的滚动暗条纹,即频闪条纹,这种现象叫做banding或flicker灯光频闪现象。
从图像传感器(sensor)来看,sensor是每行分别曝光的,并且每行在不同的起始时间开始曝光,由于照射在sensor的不同像素(pixel)上的光能量不同,则图像的亮度不同。
从光源方面来讲,以交流市电50Hz光源为例,灯光闪烁频率为100Hz,对应的灯光频闪周期为T=1/100=10毫秒(ms);即,在日光灯作为光源的场景中,亮度每隔10ms周期性地变化。
因此,如果曝光时间是灯光频闪周期(例如10ms)的整数倍,由于频率一致,那么可以避免出现banding现象;如果曝光时间不是灯光频闪周期(例如10ms)的整数倍,由于频率不一致,那么就会出现banding现象。
结合到本申请方案,当检测到banding现象或场景时,AE算法可以将短帧曝光参数调整到灯光频闪周期(例如10ms)的整数倍,以确保不会出现banding现象。
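上述"曝光时间是否为灯光频闪周期整数倍"的判断,可以用下面的Python片段示意(函数名为说明而假设):

```python
def flicker_period_ms(mains_hz: int = 50) -> float:
    """灯光闪烁频率为市电频率的2倍, 频闪周期 T = 1/(2*市电频率)。"""
    return 1000.0 / (2 * mains_hz)

def causes_banding(exposure_ms: float, mains_hz: int = 50) -> bool:
    """曝光时间不是频闪周期整数倍时, 各行累积的光能量不同, 出现banding。"""
    return exposure_ms % flicker_period_ms(mains_hz) != 0

print(flicker_period_ms(50))  # 10.0 (ms)
print(causes_banding(20))     # False: 20ms是10ms的整数倍, 无banding
print(causes_banding(17))     # True
```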
5)HDR:高动态范围(high-dynamic range,HDR),手机相机中可以设置有HDR模式或者功能。在手机开启相机的HDR模式之后,相机采用AE算法对曝光参数进行调整,快速连续地拍摄多帧图像,例如包括欠曝光的图像和过曝光的图像,并将这些图像进行合成,合成后的HDR图像画面,亮部不过曝,暗部细节清晰可见。
图2示出了本申请实施例中在相机应用中开启HDR模式的界面示意图。如图2中(a)所示,相机拍摄预览界面中显示有“更多”选项,“更多”选项对应的菜单栏中显示有HDR控件11,响应于用户在HDR控件11上的触发操作,电子设备启用HDR模式。如图2中(b)所示,在电子设备启用HDR模式之后,相机拍摄预览界面中显示有HDR标识12,提示用户当前相机已开启HDR模式。
在相机启用HDR模式后,可以改善画面亮度的范围,最终达到高光部分不过曝,暗部不欠曝,呈现暗部更多细节的效果。因此,当在明暗对比相差比较大的环境中拍照时,带有HDR功能的相机拍照的效果更加出色。
6)长帧图像和短帧图像:在本申请实施例中,为了便于描述,将曝光时间长的图像称为长帧图像或者长曝光图像,将曝光时间短的图像称为短帧图像或者短曝光图像。需要说明的是,这里曝光时间的长短是相对而言的,具体曝光时间的取值可以根据实际使用需求确定,本申请实施例不作限定。
7)视频帧率:视频文件是由连续多帧图像组成,帧率可以理解为每秒录制或播放的视频画面中有多少帧图像。其中,视频帧率的单位记为fps(frames per second),即每秒输出的帧数。示例性地,视频帧率可以为60fps,或者可以为120fps。其中,视频帧率越高,视频播放越流畅。
目前,在录制HDR视频的过程中,通常采用图像传感器每次输出两帧图像,包括长帧图像和短帧图像,并将这两帧图像合成一帧图像,以提高图像动态范围。然而,这种方法存在如下两个问题:
问题一:当图像传感器输出图像的帧率为120fps时,由于两帧图像融合成一帧图像,因此HDR视频帧率降低为60fps。换言之,如果要实现HDR视频帧率为60fps,那么需要图像传感器输出图像的帧率为120fps,这样导致相机功耗和性能要求高。
问题二:假设图像传感器输出图像的帧率为120fps(此时HDR视频帧率为60fps),那么长帧图像和短帧图像的总曝光时间的最大值为(1/60)秒,约为17ms。在此情况下,长帧图像的曝光时间和短帧图像的曝光时间无法同时取到10ms,因此无法满足曝光时间是灯光频闪周期10ms的整数倍的条件,这样对于有日光灯的拍摄场景就会出现banding现象。
鉴于此,本申请提出一种基于帧复用的HDR图像处理方法,通过重复使用长帧图像和短帧图像进行图像融合,可以保持HDR视频帧率与图像传感器输出图像的帧率一致。示例性地,通过本申请方案,当图像传感器以60fps帧率输出图像时,可以保证HDR视频输出帧率同样为60fps;当图像传感器以120fps帧率输出图像时,可以保证HDR视频输出帧率同样为120fps。因此,可以实现在不降低视频帧率的情况下输出HDR视频。
此外,通过本申请方案,当图像传感器以60fps帧率输出图像(此时HDR视频输出帧率可以为60fps)时,长帧图像和短帧图像的总曝光时间的最大值为(1/30)秒,约为33ms。在此情况下,长帧图像的曝光时间和短帧图像的曝光时间可以同时取为10ms的整数倍,例如长帧图像的曝光时间取20ms,短帧图像的曝光时间取10ms,满足曝光时间是灯光频闪周期10ms的整数倍的条件,这样对于有日光灯的拍摄场景可以避免出现banding现象。
本申请实施例提供的HDR图像处理方法的执行主体可以为电子设备(例如手机),也可以为该电子设备中能够实现该HDR图像处理方法的功能模块和/或功能实体,并且本申请方案能够通过硬件和/或软件的方式实现,具体的可以根据实际使用需求确定,本申请实施例不作限定。下面以电子设备为例,结合附图对本申请实施例提供的HDR图像处理方法进行示例性的说明。
下面分别针对不同的应用场景,通过三个实施例对本申请实施例提供的HDR图像处理方法进行详细地描述。
第一实施例:在高动态场景下实现视频HDR录制。
图3是本申请实施例提供的HDR图像处理方法的流程示意图。参照图3所示,该方法包括下述的步骤S101-S103。
S101,按照预设帧率交替输出曝光度不同的M个图像。
在本申请实施例中,可以按照预设帧率交替输出长曝光图像和短曝光图像,其中长曝光图像的曝光度大于短曝光图像的曝光度。其中,M为大于1的整数。
S102,将M个图像中的每一帧图像与前一帧图像进行配准并融合,并复用每一帧图像与后一帧图像进行配准并融合。
S103,将融合得到的HDR图像按照预设帧率依次输出,得到HDR视频。
通过上述HDR成像方式获得的图像可以称为HDR图像,HDR图像可以提供在场景中较暗到完全被照亮的区域之间的高动态范围。由HDR图像组成的视频可以称为HDR视频。
本申请实施例提供的帧复用的HDR图像处理方法,以特定帧率交替输出长帧图像和短帧图像,然后将每一帧图像与前一帧图像进行融合,并且复用每一帧图像与后一帧图像进行融合,如此将连续输出的长帧图像和短帧图像进行两两融合,然后输出融合后得到的多个图像,实现HDR视频输出。由于每一帧图像均被复用一次,因此在将长帧图像和短帧图像融合为一帧图像之后,并不会降低视频输出帧率。
下面结合图4对本申请上述实施例提供的HDR图像处理方法进行详细描述。如图4所示,本申请实施例提供的HDR图像处理方法可以按照步骤(1)、(2)、(3)、(4)、(5)的顺序执行。
在图4所示的步骤(1)中,图像处理器201可以执行高动态场景下预设的AE算法,生成相应的控制指令。为了便于说明,以下将通过执行AE算法生成的控制指令简称为AE算法控制指令。AE算法控制指令可以用于控制图像传感器202成像时的曝光参数变化,例如当曝光参数为曝光时间时,AE算法控制指令可以指示用于生成长帧图像的曝光时间和用于生成短帧图像的曝光时间。
需要说明的是,根据图像传感器202输出图像的帧率,可以确定出长帧图像和短帧图像的总曝光时间的最大值。
示例性地,若图像传感器202输出图像的预设帧率为120fps,则每秒输出120帧图像,若以一长帧图像和一短帧图像为一组,则每秒输出60组,其中每一组的曝光时长为1/60,约为17ms,也就是说每组中的长帧图像和短帧图像的总曝光时长为17ms。在向每组中的长帧图像和短帧图像分配曝光时长时,分配原则为:分配给长帧图像的曝光时长和分配给短帧图像的曝光时长的和值小于或等于总曝光时长17ms。
再示例性地,若图像传感器202输出图像的预设帧率为60fps,则每秒输出60帧图像,若以一长帧图像和一短帧图像为一组,则每秒输出30组,其中每一组的曝光时长为1/30秒,约为33ms,也就是说每组中的长帧图像和短帧图像的总曝光时长为33ms。在向每组中的长帧图像和短帧图像分配曝光时长时,分配原则为:分配给长帧图像的曝光时长和分配给短帧图像的曝光时长的和值小于或等于总曝光时长33ms。
为了便于说明,下文中以图像传感器202按照60fps帧率输出图像为例进行示例性说明。
例如,AE算法控制指令可以指示图像传感器202分别输出曝光时间为T1的长帧图像和曝光时间为T2的短帧图像;其中,T1大于T2。
再进一步地,图像处理器201可以将AE算法控制指令发送给图像传感器202。
在图4所示步骤(2)中,图像传感器202接收到图像处理器201发送的AE算法控制指令。图像传感器202可以根据AE算法控制指令,按照预设帧率连续依次输出多帧图像,并将输出的多帧图像存储于存储器203中。
其中,图像传感器202输出图像的方式为:交替输出曝光时间不同的长帧图像L和短帧图像S。
示例性地,如图5所示,以预设帧率为60fps为例,先输出一长帧图像L1,再输出一短帧图像S1,再输出一长帧图像L2,再输出一短帧图像S2,以此出帧顺序类推,在输出一长帧图像L30后,再输出一短帧图像S30,如此交替输出长帧图像L和短帧图像S。相邻的一长帧图像和一短帧图像构成一组,例如长帧图像L1和短帧图像S1为一组,长帧图像L2和短帧图像S2为一组,以此类推,长帧图像L30和短帧图像S30为一组。也就是说,图像传感器202可以每秒生成并输出30组图像,其中这30组图像中的每组图像包括一长帧图像和一短帧图像。
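上述交替出帧与分组方式,可以用下面的Python片段模拟(帧标识沿用图5中L、S的记法,函数名为说明而假设):

```python
def sensor_stream(n_groups: int):
    """按 L1,S1,L2,S2,... 的顺序交替生成长、短帧的标识序列,
    相邻的一长帧和一短帧构成一组。"""
    frames = []
    for k in range(1, n_groups + 1):
        frames.append(f"L{k}")
        frames.append(f"S{k}")
    return frames

stream = sensor_stream(30)  # 60fps下每秒60帧, 共30组
print(stream[:4])   # ['L1', 'S1', 'L2', 'S2']
print(len(stream))  # 60
```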
可选地,存储器203可以为双倍速率(double data rate,DDR)同步动态随机存取存储器,或者可以为其他任意满足实际使用需求的存储器,本申请实施例不作限定。
在图4所示步骤(3)中,图像处理器201在从存储器203中读取图像时按照帧复用的方式进行读取,即可以重复读取每帧图像,以便在长帧图像和短帧图像按照本申请提供的HDR算法配准并融合时实现帧复用。
示例性地,结合图5所示的HDR算法,每组的长帧图像L可以与同组的短帧图像S配准一次,并与上一组的短帧图像S配准;每组的短帧图像S可以与同组的长帧图像L配准一次,并与下一组的长帧图像L配准。
如图5所示,先读取第1组中的长帧图像L1和短帧图像S1,并将长帧图像L1和短帧图像S1进行配准。在经过图像配准之后,可以得到长帧图像L1和短帧图像S1'。
然后,再读取第1组中的短帧图像S1和第2组中的长帧图像L2,并将短帧图像S1和长帧图像L2进行配准。在经过图像配准之后,可以得到短帧图像S1和长帧图像L2'。
然后,再读取第2组中的长帧图像L2和第2组中的短帧图像S2,并将长帧图像L2和短帧图像S2进行配准。在经过图像配准之后,可以得到长帧图像L2和短帧图像S2'。
然后,再读取第2组中的短帧图像S2和第3组中的长帧图像L3,并将短帧图像S2和长帧图像L3进行配准。在经过图像配准之后,可以得到短帧图像S2和长帧图像L3'。
依次类推,按照上述帧复用的方式依次进行读取,每次读取一长帧图像和一短帧图像并进行配准。示例性地,如图5所示,读取第30组中的长帧图像L30和第30组中的短帧图像S30,并将长帧图像L30和短帧图像S30进行配准。在经过图像配准之后,可以得到长帧图像L30和短帧图像S30'。
需要说明的是,若在第30组之后可以读取长帧图像L31,则可以将第30组中的短帧图像S30与长帧图像L31进行图像配准。若不存在长帧图像L31,则第30组中的短帧图像S30可以不作图像配准,不做融合,直接作为融合图像进行输出(例如短帧图像S30直接作为第60帧的融合图像输出),或者丢弃不用。
在图4所示步骤(4)中,图像处理器201将配准后的两帧图像进行加权融合。图像处理器201在每两帧图像配准完成之后进行一次HDR合成,因此可以实现帧率不降低的目的。
示例性地,如图5所示,可以将图像配准后的长帧图像L1与短帧图像S1'进行加权融合,得到图像H1;并将图像配准后的短帧图像S1与长帧图像L2'进行加权融合,得到图像H2;并将图像配准后的长帧图像L2与短帧图像S2'进行加权融合,得到图像H3;并将图像配准后的短帧图像S2与长帧图像L3'进行加权融合,得到图像H4;依次类推,将图像配准后的长帧图像L30与短帧图像S30'进行加权融合,得到图像H59。需要说明的是,若存在长帧图像L31',则可以将图像配准后的短帧图像S30与长帧图像L31'进行加权融合,得到图像H60。
如此,在图像传感器以60fps帧率输出图像的情况下,每秒输出60帧图像,通过帧复用方式读取到120帧图像,然后将该120帧图像进行加权融合,可以得到60帧图像,由此可知,图像处理器201输出HDR视频的帧率与图像传感器输出图像的帧率保持一致。
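上述帧复用的两两融合过程,可以用下面的Python片段示意(用字符串占位表示"配准+加权融合"的结果,函数名为说明而假设,并非实际的图像处理实现):

```python
def hdr_fuse_with_reuse(frames):
    """帧复用: 每一帧与前一帧融合一次, 又被后一帧复用融合一次,
    N帧输入得到N-1帧融合输出, 输出帧率与传感器出帧率基本一致。"""
    return [f"fuse({prev},{cur})" for prev, cur in zip(frames, frames[1:])]

frames = ["L1", "S1", "L2", "S2", "L3"]
print(hdr_fuse_with_reuse(frames))
# ['fuse(L1,S1)', 'fuse(S1,L2)', 'fuse(L2,S2)', 'fuse(S2,L3)']
```

可见每个中间帧(如S1、L2)各参与了两次融合,这正是"每一帧图像均被复用一次"的含义。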
图6示例性地示出了短帧图像和长帧图像进行融合的示意图。其中,图6中(a)示出了短帧图像,图6中(b)示出了长帧图像,图6中(c)示出了短帧图像和长帧图像融合后得到的图像。
由于曝光不足,图6中(a)的椭圆形虚线框内所示的图像细节无法体现,此时图6中(a)的矩形虚线框内所示的图像细节可以体现。
由于曝光过度,图6中(b)的矩形虚线框内所示的亮部过曝,此时图6中(b)的椭圆形虚线框内所示的图像细节可以体现。
如图6中(c)所示,短帧图像和长帧图像融合后得到的图像,既可以体现椭圆形虚线框内所示的图像细节,也可以体现矩形虚线框内所示的图像细节。这样,可以实现高光部分不过曝,暗部不欠曝,可以呈现暗部更多细节的效果,提升图像显示效果。
在实际实现时,当逆光拍摄或者拍摄环境明暗光差异比较大时,HDR模式可以同时提升照片暗部和亮部的效果。与采用普通模式拍摄的图像相比,在HDR模式下拍摄得到的照片或视频更能呈现亮部的细节和暗部的细节。
在图4所示步骤(5)中,图像处理器201将加权融合得到的HDR图像按照预设帧率依次输出,例如输出给电子设备的显示屏,由显示屏显示多帧HDR图像,从而实现HDR视频的录制。
示例性地,若图像传感器每秒输出60帧图像,则经HDR算法可以输出60fps的HDR视频;若图像传感器每秒输出120帧图像,则经HDR算法可以输出120fps的HDR视频。通过本申请实施例提供的帧复用的HDR图像处理方法,可以实现最终输出HDR视频的帧率与图像传感器输出图像的帧率一致。
在本申请实施例中,在图像配准和融合时可以交替用长帧和短帧作参考帧。在实际实现时,由于防抖算法需要做适配,因此可以基于参考帧做防抖计算。
本申请实施例提供的帧复用的HDR图像处理方法,图像传感器以特定帧率交替输出长帧图像和短帧图像,然后图像处理器将每一帧图像与前一帧图像进行融合,并且复用每一帧图像与后一帧图像进行融合,如此将连续输出的长帧图像和短帧图像进行两两融合,然后输出融合后得到的多个图像,即HDR视频,由于每一帧图像均被复用一次,因此可以实现相机最终输出HDR视频的帧率与图像传感器输出图像的帧率一致,本申请方案可以提升HDR视频的输出帧率。
第二实施例:在灯光频闪场景下实现视频HDR录制。
在本申请实施例中,在当前拍摄场景中有日光灯发光(即灯光频闪场景)时,可能会出现banding现象,影响视频效果。鉴于此,本申请方案提出了对应的处理策略:通过控制曝光时间,使得长帧图像和短帧图像的曝光时间均为灯光频闪周期(例如10ms)的整数倍,以避免出现banding现象。下面将详细描述如何在灯光频闪场景下进行帧复用以实现视频HDR录制(例如视频帧率为60fps)的可能实现方式。
下面结合图7对本申请上述实施例提供的HDR图像处理方法进行详细描述。如图7所示,本申请实施例提供的HDR图像处理方法可以按照步骤(1)、(2)、(3)、(4)、(5)的顺序执行。
结合图4和图7可知,第二实施例与第一实施例中所述方式不同之处在于,在图4所示第一实施例中图像处理器201没有考虑当前场景是否为灯光频闪场景,直接执行高动态场景下预设的AE算法,而在图7所示第二实施例中图像处理器201基于灯光频闪场景检测结果执行AE算法,生成相应的控制指令。该控制指令用于控制图像传感器202成像时的曝光参数(例如曝光时间)变化,以确定长帧图像的曝光时间和短帧图像的曝光时间。
需要说明的是,灯光频闪场景下的AE算法控制指令,用于控制长帧图像和短帧图像的曝光时间均是灯光频闪周期的整数倍,以避免出现banding现象,而在非灯光频闪场景下的AE算法控制指令不需要考虑消除banding现象,二者的具体区别见下文详细描述。
在步骤(1)中,图像处理器201可以先进行灯光频闪场景检测,然后进一步根据场景检测结果执行AE算法,生成相应的控制指令。
一方面,当检测到当前拍摄环境中没有日光灯等人造光源正在发光时,可以确定场景检测结果为无灯光场景,即当前场景不是灯光频闪场景。由上文描述可知,在无灯光场景中不会出现banding现象,因此在控制曝光参数变化时可以不考虑banding现象。在此情况下,在控制曝光参数(以曝光时间为例)变化时,曝光时间满足以下条件即可:长帧曝光时间和短帧曝光时间的和值小于或等于总曝光时长的最大值。
如第一实施例中所述方式,长帧图像和短帧图像的总曝光时间的最大值,可以根据图像传感器202输出图像的帧率进行确定。示例性地,若图像传感器202输出图像的预设帧率为60fps,则每秒输出60帧图像,若以一长帧图像和一短帧图像为一组,则每秒输出30组,其中每一组的曝光时长为1/30秒,约为33ms,也就是说每组中的长帧图像和短帧图像的总曝光时长为33ms。在向每组中的长帧图像和短帧图像分配曝光时长时,分配原则为:分配给长帧图像的曝光时长和分配给短帧图像的曝光时长的和值小于或等于总曝光时长33ms。
其中,当场景检测结果为无灯光场景时,AE算法生成如下控制指令:指示图像传感器202分别输出曝光时间T1的长帧图像和曝光时间T2的短帧图像;其中,T1大于T2。
例如,在总曝光时长的最大值为33ms情况下,T1可以取8ms,T2可以取2ms;或者T1可以取10ms,T2可以取6ms;或者T1可以取20ms,T2可以取13ms;当然T1和T2还可以分别取其他可能的数值,具体可以根据实际使用需求确定,本申请实施例不作限定。
另一方面,当检测到当前拍摄环境中有日光灯等人造光源正在发光时,可以确定场景检测结果为有灯光场景,即当前场景是灯光频闪场景。由上文描述可知,在有灯光场景中,若曝光时间不是灯光频闪周期的整数倍则会出现banding现象;若曝光时间是灯光频闪周期的整数倍则会避免出现banding现象。
其中,当场景检测结果为有灯光场景时,AE算法生成如下控制指令:指示图像传感器202分别输出曝光时间T1的长帧图像和曝光时间T2的短帧图像;其中,T1大于T2,并且要求T1和T2的取值均为10ms的整数倍。
例如,在总曝光时长的最大值为33ms情况下,T1可以取20ms,T2可以取10ms,以避免有灯光场景中的banding现象。这里是假设光圈大小或者ISO感光值一定的情况下通过调整曝光时间来控制成像曝光度的。当然,还有其他可能的取值,例如T1取10ms,T2取10ms,在曝光时间一定的情况下,可以通过调整光圈大小或者ISO感光值来控制成像曝光度,使得不同帧图像的曝光度变化。
再进一步地,图像处理器201可以将AE算法控制指令发送给图像传感器202。
在步骤(2)中,在图像传感器202接收到图像处理器201发送的AE算法控制指令之后,图像传感器202可以根据AE算法控制指令,按照预设帧率连续依次输出多帧图像,并将输出的多帧图像存储于存储器203中。具体地,图像传感器202输出图像的方式为:交替输出曝光时间不同的长帧图像L和短帧图像S。
在步骤(3)中,图像处理器201在从存储器203中读取图像时按照帧复用的方式进行读取,针对每帧图像可以读取得到相同的两帧图像,以便在长帧图像和短帧图像配准并融合时实现帧复用。进一步地,图像处理器201将读取得到的长帧图像和短帧图像进行配准。
在步骤(4)中,图像处理器201将配准后的两帧图像进行加权融合。
在步骤(5)中,图像处理器201在每两帧图像配准完成之后进行一次HDR合成,因此可以实现每秒输出60帧图像,最终输出60fps的HDR视频。
对于上述步骤(2)-步骤(5)的说明,具体可以参考上述第一实施例对于步骤(2)-步骤(5)的详细说明,此处不再赘述。
可选地,在本申请实施例中,可以通过防闪烁(flicker)传感器进行灯光频闪场景检测。当然还可以通过其他任意可能的传感器进行灯光频闪场景检测,具体可以根据实际使用需求确定,本申请实施例不作限定。
可选地,可以通过对摄像头连续采集的多帧图像进行比对,检测是否有亮斑和暗斑交替呈现。若有亮斑和暗斑交替呈现,说明当前拍摄环境中有灯光闪烁,则可以确定为灯光频闪场景;若没有亮斑和暗斑交替呈现,说明当前拍摄环境中无灯光闪烁,则可以确定不是灯光频闪场景。
示例性地,图8中(a)和(b)分别示出了手机摄像头采集的连续两帧图像,图8中(a)中椭圆形虚线框中呈现有亮斑,图8中(b)中椭圆形虚线框中呈现有暗斑,此时连续两帧图像中有亮斑和暗斑交替呈现,这表明当前拍摄环境中存在灯光闪烁,也就是说,当前拍摄场景为灯光频闪场景。
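上述通过比对连续多帧中亮斑、暗斑交替呈现来检测灯光频闪场景的思路,可以用下面的简化Python片段示意(以每帧平均亮度序列近似,阈值与判定逻辑均为便于说明而假设,并非实际的检测算法):

```python
def looks_like_flicker(mean_brightness_seq, threshold=8.0):
    """连续多帧的平均亮度呈现幅度足够大的亮、暗交替变化时,
    判定当前拍摄环境可能存在灯光频闪(简化示意)。"""
    diffs = [b - a for a, b in zip(mean_brightness_seq,
                                   mean_brightness_seq[1:])]
    if len(diffs) < 2:
        return False
    alternating = all(d1 * d2 < 0 for d1, d2 in zip(diffs, diffs[1:]))
    strong = all(abs(d) >= threshold for d in diffs)
    return alternating and strong

print(looks_like_flicker([120, 100, 121, 99]))       # True: 亮暗交替
print(looks_like_flicker([120, 121, 122, 123]))      # False: 单调变化
```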
需要说明的是,在本申请实施例中,上述图像传感器可以为互补型金属氧化物半导体(complementary metal-oxide semiconductor,CMOS)图像传感器,也可以为电荷耦合器件(charge coupled device,CCD)图像传感器,或者其他任意满足实际使用需求的传感器,本申请实施例不作限定。在实际实现时,图像传感器也被称为感光元件。
其中,CMOS图像传感器是逐行扫描逐行进行曝光,直至所有像素点都被曝光。当使用CMOS图像传感器时,图像传感器的各个像素上接受到的光能量可能不同,即图像亮度不同,可能会引起banding现象。
通过本申请方案,当电子设备开启HDR模式时,可以拍摄具有高动态范围的优质图像和视频。HDR算法支持将同一时间、不同曝光级别拍摄的多张图像相结合,实时生成具有独特高光、阴影和对比度的高动态范围HDR图像。并且,可以通过控制曝光参数避免banding现象。
第三实施例:在不同环境亮度情况下实现视频HDR录制。
在本申请实施例中,可以先检测当前拍摄场景的环境亮度是否满足预设亮度条件,然后根据检测结果再确定是否采用上述第一实施例和第二实施例中的视频HDR录制方式。
一方面,当检测到环境亮度满足预设亮度条件时,则可以采用上述第一实施例和第二实施例中的视频HDR录制方式。
示例性地,若当前拍摄场景(例如逆光拍摄场景)的环境亮度明暗对比度比较大,则满足预设亮度条件,即可以确定当前拍摄场景为HDR适用场景。
另一方面,若检测到环境亮度不满足预设亮度条件时,则不采用上述第一实施例和第二实施例中的视频HDR录制方式,而是可以按照常规的视频录制方式进行视频HDR录制。具体地,图像传感器按照常规的出帧方式交替输出长帧图像和短帧图像,例如以每秒输出60帧图像为例:依次输出长帧图像L1,短帧图像S1,长帧图像L2,短帧图像S2,以此类推,长帧图像L30,短帧图像S30;然后,将长帧图像L1和短帧图像S1进行融合得到图像H1,将长帧图像L2和短帧图像S2进行融合得到图像H2,以此类推,将长帧图像L30和短帧图像S30进行融合得到图像H30,最后每秒输出30帧图像,这样导致视频输出帧率降低为30fps。
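常规方式与本申请帧复用方式在输出帧率上的差异,可以用下面的Python片段直观对比(函数名为说明而假设):

```python
def hdr_output_fps(sensor_fps: int, frame_reuse: bool) -> int:
    """常规方式两帧合成一帧, 输出帧率减半;
    帧复用方式的输出帧率与传感器出帧率保持一致。"""
    return sensor_fps if frame_reuse else sensor_fps // 2

print(hdr_output_fps(60, frame_reuse=False))  # 30: 常规方式帧率降低
print(hdr_output_fps(60, frame_reuse=True))   # 60: 帧复用保持帧率
```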
这样,在拍摄时,当电子设备确定当前拍摄场景为HDR适用场景时则采用上述第一实施例和第二实施例中的视频HDR录制方式,当电子设备确定当前拍摄场景为非HDR适用场景时则采用常规的视频录制方式。可以根据拍摄场景进行灵活切换,提升图像拍摄效果。
在本申请实施例中,AE算法可以根据场景动态范围和环境亮度,计算长帧曝光时间和短帧曝光时间,以及短帧图像与长帧图像的曝光比。
示例性地,假设短帧图像与长帧图像的曝光比为1:4,那么在总曝光时长的最大值为33ms时,短帧图像的曝光时间可以取1ms,长帧图像的曝光时间可以取4ms;或者,短帧图像的曝光时间可以取2ms,长帧图像的曝光时间可以取8ms;或者,短帧图像的曝光时间可以取4ms,长帧图像的曝光时间可以取16ms。
可以理解,以上实施例中以曝光比为1:4为例进行示例性地说明,在实际实现时,曝光比还可以为1:2,也可以为1:8,或者为其他任意可能的数值,本申请实施例对此不作限定。
可以理解,以上实施例中长帧图像的曝光时间和短帧图像的曝光时间均为示例性地举例,具体长帧图像和短帧图像的曝光时间的取值不限于此,在具体实现时可以根据实际使用需求确定,本申请实施例不作限定。
需要说明的是,在本申请以上实施例中,均是以曝光参数为曝光时间为例进行示例性说明的,在实际实现时,曝光参数不仅包括曝光时间,还可以包括ISO感光值。当曝光参数包括曝光时间和ISO感光值时,可以将曝光时间和ISO感光值的乘积值,作为曝光参数。
示例性地,假设总曝光时长的最大值为33ms,短帧图像的曝光时间可以为2ms,短帧图像的ISO感光值可以为200,短帧图像的曝光参数为2*200;长帧图像的曝光时间可以为16ms,长帧图像的ISO感光值可以为100,长帧图像的曝光参数为16*100。在此情况下,短帧图像与长帧图像的曝光比为(2*200):(16*100)=1:4。
再示例性地,假设总曝光时长的最大值为33ms,短帧图像的曝光时间可以为10ms,短帧图像的ISO感光值可以为100,短帧图像的曝光参数为10*100;长帧图像的曝光时间可以为10ms,长帧图像的ISO感光值可以为200,长帧图像的曝光参数为10*200。在此情况下,短帧图像与长帧图像的曝光比为(10*100):(10*200)=1:2。
可以理解,以上实施例中曝光时间和ISO感光值均为示例性地举例,具体曝光时间的取值和ISO感光值的取值不限于此,在具体实现时可以根据实际使用需求确定,本申请实施例不作限定。
本申请实施例提供的帧复用的HDR图像处理方法,以特定帧率交替输出长帧图像和短帧图像,然后将每一帧图像与前一帧图像进行融合,并且复用每一帧图像与后一帧图像进行融合,如此将连续输出的长帧图像和短帧图像进行两两融合,然后输出融合后得到的多个图像,实现HDR视频输出。由于每一帧图像均被复用一次,因此在将长帧图像和短帧图像融合为一帧图像之后,并不会降低视频输出帧率。
也需要说明的是,在本申请实施例中,“大于”可以替换为“大于或等于”,“小于或等于”可以替换为“小于”,或者,“大于或等于”可以替换为“大于”,“小于”可以替换为“小于或等于”。
本文中描述的各个实施例可以为独立的方案,也可以根据内在逻辑进行组合,这些方案都落入本申请的保护范围中。
可以理解的是,上述各个方法实施例中由电子设备实现的方法和操作,也可以由可用于电子设备的部件(例如芯片或者电路)实现。
上文描述了本申请提供的方法实施例,下文将描述本申请提供的装置实施例。应理解,装置实施例的描述与方法实施例的描述相互对应,因此,未详细描述的内容可以参见上文方法实施例,为了简洁,这里不再赘述。
上文主要从方法步骤的角度对本申请实施例提供的方案进行了描述。可以理解的是,为了实现上述功能,实施该方法的电子设备包含了执行各个功能相应的硬件结构和/或软件模块。本领域技术人员应该可以意识到,结合本文中所公开的实施例描述的各示例的单元 及算法步骤,本申请能够以硬件或硬件和计算机软件的结合形式来实现。某个功能究竟以硬件还是计算机软件驱动硬件的方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的保护范围。
本申请实施例可以根据上述方法示例,对电子设备进行功能模块的划分,例如,可以对应各个功能划分各个功能模块,也可以将两个或两个以上的功能集成在一个处理模块中。上述集成的模块既可以采用硬件的形式实现,也可以采用软件功能模块的形式实现。需要说明的是,本申请实施例中对模块的划分是示意性的,仅仅为一种逻辑功能划分,实际实现时可以有其它可行的划分方式。下面以采用对应各个功能划分各个功能模块为例进行说明。
图9为本申请实施例提供的HDR图像处理装置800的示意性框图。该HDR图像处理装置800可以用于执行上文方法实施例中电子设备所执行的动作。该HDR图像处理装置800包括图像采集单元810、图像处理单元820和显示单元830。
图像采集单元810,用于响应于用户的第一操作,以预设帧率连续采集图像,得到M帧图像;该M帧图像包括交替排列的长曝光图像和短曝光图像,该长曝光图像的曝光度大于该短曝光图像的曝光度;
图像处理单元820,用于将该M帧图像中的每一帧图像与前一帧图像进行融合,并复用所述每一帧图像与后一帧图像进行融合;
显示单元830,用于将融合得到的HDR图像按照预设帧率依次输出显示,融合得到的多帧HDR图像形成HDR视频。
通过本申请方案,图像传感器以特定帧率交替输出长曝光图像和短曝光图像,然后图像处理器将每一帧图像与前一帧图像进行融合,并且复用每一帧图像与后一帧图像进行融合,如此将连续输出的长曝光图像和短曝光图像进行两两融合,然后输出融合后得到的多个图像,即HDR视频,由于每一帧图像均被复用一次,因此可以实现相机最终输出HDR视频的帧率与图像传感器输出图像的帧率一致,本申请方案可以提升HDR视频的输出帧率,还可以避免视频出现banding灯光频闪现象。
示例性地,预设帧率可以为每秒60帧,即图像传感器可以每秒采集60帧图像。
其中,M可以为大于或等于3的整数,例如以M为60为例进行说明,在图像传感器连续采集得到的60帧图像中,长曝光图像和短曝光图像交替排列,即在采集一帧长曝光图像之后,会采集一帧短曝光图像;以及在采集一帧短曝光图像后,会采集一帧长曝光图像,依次类推;长曝光图像有30帧,短曝光图像有30帧。
在一些实施例中,通过将短曝光图像和长曝光图像进行融合,可以实现高光部分不过曝,暗部不欠曝,可以呈现暗部更多细节的效果,提升图像显示效果。
在一些可能实施方式中,结合图9,如图10所示,该HDR图像处理装置800还可以包括图像存储单元840。
图像存储单元840,用于存储图像采集单元810采集得到的所述M帧图像;
其中,所述将所述M帧图像中的每一帧图像与前一帧图像进行融合,并复用所述每一帧图像与后一帧图像进行融合,包括:
图像处理单元820从图像存储单元840中读取第i-1帧图像和第i帧图像,将所述第i-1帧图像与所述第i帧图像进行融合,得到第i-1帧HDR图像;
图像处理单元820从图像存储单元840中读取所述第i帧图像和第i+1帧图像,将所述第i帧图像与所述第i+1帧图像进行融合,得到第i帧HDR图像;
其中,i依次取2,……,M-1。
示例性地,以M为60为例进行说明,i依次取2,……,M-1时,图像处理单元820从图像存储单元840中读取第1帧图像和第2帧图像,将第1帧图像与第2帧图像进行融合,得到第1帧HDR图像;然后,图像处理单元820从图像存储单元840中读取第2帧图像和第3帧图像,将第2帧图像与第3帧图像进行融合,得到第2帧HDR图像;图像处理单元820依次类推,从图像存储单元840中读取第58帧图像和第59帧图像,将第58帧图像与第59帧图像进行融合,得到第58帧HDR图像;然后图像处理单元820从图像存储单元840中读取第59帧图像和第60帧图像,将第59帧图像与第60帧图像进行融合,得到第59帧HDR图像。
本申请中将连续输出的长曝光图像和短曝光图像进行两两融合,然后输出融合后得到的多个图像,实现HDR视频输出。由于每一帧图像均被复用一次,因此在将长曝光图像和短曝光图像融合为一帧图像之后,并不会降低视频输出帧率。
在实际实现时,当逆光拍摄或者拍摄环境明暗光差异比较大时,HDR模式可以同时提升照片暗部和亮部的效果。与采用普通模式拍摄的图像相比,在HDR模式下拍摄得到的照片或视频更能呈现亮部的细节和暗部的细节,提升图像显示效果。
在一些可能实施方式中,所述图像采集单元810,具体用于:
采用第一曝光参数采集长曝光图像,并采用第二曝光参数采集短曝光图像;
其中,所述第一曝光参数对应的曝光度大于所述第二曝光参数对应的曝光度。
示例性地,长曝光图像的曝光时间大于短曝光图像的曝光时间。
在一些可能实施方式中,图像处理单元820还用于:根据所述预设帧率,确定所述总曝光时长。
示例性地,假设图像传感器每秒输出60帧图像,若以一长曝光图像和一短曝光图像为一组,则每秒输出30组,其中每一组的曝光时长为1/30,约为33ms,也就是说每组中的长曝光图像和短曝光图像的总曝光时长为33ms。在向每组中的长曝光图像和短曝光图像分配曝光时长时,分配原则为:分配给长曝光图像的曝光时长和分配给短曝光图像的曝光时长的和值小于或等于总曝光时长33ms。
在一些可能实施方式中,图像处理单元820还用于:当所述第一摄像头的曝光参数包括曝光时间时,根据曝光时间确定所述曝光参数对应的曝光度。
在一些可能实施方式中,图像处理单元820还用于:当所述第一摄像头的曝光参数包括曝光时间和ISO感光值时,根据曝光时间和ISO感光值的乘积值确定所述曝光参数对应的曝光度。
其中,当曝光参数包括曝光时间和ISO感光值时,可以将曝光时间和ISO感光值的乘积值,作为对应的曝光度。
在一些可能实施方式中,在第一摄像头的曝光参数为曝光时间的情况下,图像处理单元820还用于:
采用第一自动曝光算法,确定所述长曝光图像对应的第一曝光时间和所述短曝光图像 对应的第二曝光时间;
其中,所述第一自动曝光算法包括:所述第一曝光时间和所述第二曝光时间的总和小于或者等于总曝光时长。
示例性地,假设短曝光图像与长曝光图像的曝光比为1:4,那么在总曝光时长的最大值为33ms时,短曝光图像的曝光时间可以取1ms,长曝光图像的曝光时间可以取4ms;或者,短曝光图像的曝光时间可以取2ms,长曝光图像的曝光时间可以取8ms;或者,短曝光图像的曝光时间可以取4ms,长曝光图像的曝光时间可以取16ms。
需要说明的是,这里曝光时间的长短是相对而言的,具体曝光时间的取值可以根据实际使用需求确定,本申请实施例不作限定。
在一些可能实施方式中,在第一摄像头的曝光参数为曝光时间的情况下,图像处理单元820还用于:在当前拍摄场景中有日光灯照射的情况下,采用第二自动曝光算法,确定所述长曝光图像对应的第一曝光时间和所述短曝光图像对应的第二曝光时间;
其中,所述第二自动曝光算法包括:所述第一曝光时间和所述第二曝光时间的总和小于或者等于总曝光时长,且所述第一曝光时间和所述第二曝光时间均为灯光频闪周期的整数倍。
在本申请实施例中,在当前拍摄场景中有日光灯发光(即灯光频闪场景)时,可能会出现banding现象,影响视频效果。鉴于此,本申请方案提出了对应的处理策略:通过控制曝光时间,使得长曝光图像和短曝光图像的曝光时间均为灯光频闪周期的整数倍,以避免出现banding现象,提升图像显示效果。
在一些可能实施方式中,所述灯光频闪周期为10毫秒。
例如,当图像传感器以60fps帧率输出图像时,长曝光图像和短曝光图像的总曝光时间的最大值为(1/30)秒,约为33ms。在此情况下,长曝光图像的曝光时间和短曝光图像的曝光时间可以同时取为10ms的整数倍,例如长曝光图像的曝光时间取20ms,短曝光图像的曝光时间取10ms,满足曝光时间是灯光频闪周期10ms的整数倍的条件,这样对于有日光灯的拍摄场景可以避免出现banding现象。
在一些可能实施方式中,所述确定所述第一曝光时间和所述第二曝光时间,包括:
根据预设曝光比,确定所述第一曝光时间和所述第二曝光时间;
其中,所述预设曝光比为所述第一曝光时间和所述第二曝光时间之间的比值;或者,所述预设曝光比为第一乘积值和第二乘积值之间的比值,所述第一乘积值为所述第一曝光时间和第一ISO感光值的乘积值,所述第二乘积值为所述第二曝光时间和第二ISO感光值的乘积值。
示例性地,假设总曝光时长的最大值为33ms,短曝光图像的曝光时间可以为2ms,短曝光图像的ISO感光值可以为200,短曝光图像的曝光参数为2*200;长曝光图像的曝光时间可以为16ms,长曝光图像的ISO感光值可以为100,长曝光图像的曝光参数为16*100。在此情况下,短曝光图像与长曝光图像的曝光比为(2*200):(16*100)=1:4。
再示例性地,假设总曝光时长的最大值为33ms,短曝光图像的曝光时间可以为10ms,短曝光图像的ISO感光值可以为100,短曝光图像的曝光参数为10*100;长曝光图像的曝光时间可以为10ms,长曝光图像的ISO感光值可以为200,长曝光图像的曝光参数为10*200。在此情况下,短曝光图像与长曝光图像的曝光比为(10*100):(10*200)=1:2。
在一些可能实施方式中,图像采集单元810还用于:
在所述响应于用户的第一操作,以预设帧率连续采集图像,得到M帧图像之前,响应于用户的第二操作,开启相机HDR模式。
其中,可以根据用户实际使用需求,由用户触发开启相机HDR模式,在相机HDR模式开启后可以拍摄得到HDR图像或者HDR视频,提升用户体验。
在一些可能实施方式中,图像采集单元810还用于:
当检测到当前拍摄场景的环境亮度大于或等于预设亮度阈值时,自动开启相机HDR模式;
或者,当检测到当前拍摄场景的环境亮度小于预设亮度阈值时,自动关闭相机HDR模式。
其中,若当前拍摄场景(例如逆光拍摄场景)的环境亮度明暗对比度比较大,则满足预设亮度条件,即可以确定当前拍摄场景为HDR适用场景,自动开启相机HDR模式,提升人机交互体验。
在一些可能实施方式中,图像处理单元820,还用于:
将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准。
其中,按照帧复用的方式从图像存储单元中依次读取长曝光图像和短曝光图像,交替地读取相邻的一长曝光图像和一短曝光图像(或一短曝光图像和一长曝光图像)并进行配准。进一步地,将配准后的两帧图像进行加权融合,得到HDR图像,提升图像显示效果。
在一些可能实施方式中,所述将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准,包括:
以所述长曝光图像为基准,对齐所述短曝光图像;
或者,以所述短曝光图像为基准,对齐所述长曝光图像。
本申请方案将连续输出的长曝光图像和短曝光图像进行两两配准并融合,然后输出融合后得到的多个图像,实现HDR视频输出。由于每一帧图像均被复用一次,因此在将长曝光图像和短曝光图像融合为一帧图像之后,并不会降低视频输出帧率。
在一些可能实施方式中,上述图像存储单元为双倍速率DDR同步动态随机存取存储器。
根据本申请实施例的HDR图像处理装置800可对应于执行本申请实施例中描述的方法,并且HDR图像处理装置800中的单元的上述和其它操作和/或功能分别为了实现方法的相应流程,为了简洁,在此不再赘述。
图11是本申请实施例提供的电子设备900的结构示意图。该电子设备900可以包括处理器910,外部存储器接口920,内部存储器921,通用串行总线(universal serial bus,USB)接口930,充电管理模块940,电源管理单元941,电池942,天线1,天线2,移动通信模块950,无线通信模块960,音频模块970,扬声器970A,受话器970B,麦克风970C,耳机接口970D,传感器模块980,按键990,马达991,指示器992,摄像头993,显示屏994,以及用户标识模块(subscriber identification module,SIM)卡接口995等。其中传感器模块980可以包括压力传感器980A,陀螺仪传感器980B,气压传感器980C,磁传感器980D,加速度传感器980E,距离传感器980F,接近光传感器980G,指纹传感器980H,温度传感器980I,触摸传感器980J,环境光传感器980K以及骨传导传感器980L等。
可以理解的是,本申请实施例示意的结构并不构成对电子设备900的具体限定。在本申请另一些实施例中,电子设备900可以包括比图示更多或更少的部件,或者组合某些部件,或者拆分某些部件,或者不同的部件布置。图示的部件可以以硬件,软件或软件和硬件的组合实现。
处理器910可以包括一个或多个处理单元,例如:处理器910可以包括应用处理器(application processor,AP),调制解调处理器,图形处理器(graphics processing unit,GPU),图像信号处理器(image signal processor,ISP),控制器,存储器,视频编解码器,数字信号处理器(digital signal processor,DSP),基带处理器,和/或神经网络处理器(neural-network processing unit,NPU)等。其中,不同的处理单元可以是独立的器件,也可以集成在一个或多个处理器中。其中,控制器可以是电子设备900的神经中枢和指挥中心。控制器可以根据指令操作码和时序信号,产生操作控制信号(例如用于图像采集的控制信号、用于调整曝光参数的控制信号、用于图像配准及融合的控制信号、用于检测灯光频闪场景的控制信号等),完成取指令和执行指令的控制。
处理器910中还可以设置存储器,用于存储指令和数据,例如用于存储所采集的长曝光图像和短曝光图像,还可以用于存储通过长曝光图像和短曝光图像配准并融合得到的HDR视频。在一些实施例中,处理器910中的存储器为高速缓冲存储器。该存储器可以保存处理器910刚用过或循环使用的指令或数据。如果处理器910需要再次使用该指令或数据,可从存储器中直接调用。避免了重复存取,减少了处理器910的等待时间,因而提高了系统的效率。
处理器910可以用于执行上述程序代码,调用相关模块以实现本申请实施例中电子设备的HDR图像处理功能。
在一些实施例中,处理器910可以包括一个或多个接口。该接口可以包括集成电路(inter-integrated circuit,I2C)接口,脉冲编码调制(pulse code modulation,PCM)接口,通用异步收发传输器(universal asynchronous receiver/transmitter,UART)接口,通用输入输出(general-purpose input/output,GPIO)接口,和/或通用串行总线(universal serial bus,USB)接口等。可以理解的是,本申请实施例示意的各模块间的接口连接关系,只是示意性说明,并不构成对电子设备900的结构限定。在本申请另一些实施例中,电子设备900也可以采用上述实施例中不同的接口连接方式,或多种接口连接方式的组合。
电子设备900通过GPU,显示屏994,以及应用处理器等实现显示功能。GPU为图像处理的微处理器,连接显示屏994和应用处理器。GPU用于执行数学和几何计算,用于图形渲染。处理器910可包括一个或多个GPU,其执行程序指令以生成或改变显示信息。
显示屏994用于显示图像或视频等,例如用于显示HDR图像或视频。显示屏994包括显示面板。显示面板可以采用液晶显示屏(liquid crystal display,LCD),有机发光二极管(organic light-emitting diode,OLED),有源矩阵有机发光二极体或主动矩阵有机发光二极体(active-matrix organic light emitting diode,AMOLED),柔性发光二极管(flex light-emitting diode,FLED),Miniled,MicroLed,Micro-oLed,量子点发光二极管(quantum dot light emitting diodes,QLED)等。在一些实施例中,电子设备900可以包括1个或N个显示屏994,N为大于1的正整数。
电子设备900可以通过ISP、摄像头993、视频编解码器、GPU、显示屏994以及应用处理器等实现拍摄功能。
ISP用于处理摄像头993反馈的数据。例如,在采集图像时,打开快门,光线通过镜头被传递到摄像头感光元件上,光信号转换为电信号,摄像头感光元件将电信号传递给ISP处理,转化为肉眼可见的图像。ISP还可以对图像的噪点、亮度、肤色进行算法优化。ISP还可以对拍摄场景的曝光、色温等参数优化。在一些实施例中,ISP可以设置在摄像头993中。
摄像头993用于捕获静态图像或视频,例如用于采集长曝光图像和短曝光图像。物体通过镜头生成光学图像投射到感光元件。感光元件可以是电荷耦合器件(charge coupled device,CCD)或互补金属氧化物半导体(complementary metal-oxide-semiconductor,CMOS)光电晶体管。感光元件把光信号转换成电信号,之后将电信号传递给ISP转换成数字图像信号。ISP将数字图像信号输出到DSP加工处理。DSP将数字图像信号转换成标准的RGB,YUV等格式的图像信号。在一些实施例中,电子设备900可以包括1个或N个摄像头993,N为大于1的正整数。
数字信号处理器用于处理数字信号,除了可以处理数字图像信号,还可以处理其他数字信号。NPU为神经网络(neural-network,NN)计算处理器,通过借鉴生物神经网络结构,例如借鉴人脑神经元之间传递模式,对输入信息快速处理,还可以不断的自学习。通过NPU可以实现电子设备900的智能认知等应用,例如:图像识别,文本识别,文本理解等。
外部存储器接口920可以用于连接外部存储卡,例如Micro SD卡,实现扩展电子设备900的存储能力。外部存储卡通过外部存储器接口920与处理器910通信,实现数据存储功能。例如将所采集的长曝光图像和短曝光图像,以及通过长曝光图像和短曝光图像配准并融合得到的HDR视频等文件保存在外部存储卡中。
内部存储器921可以用于存储计算机可执行程序代码,可执行程序代码包括指令。处理器910通过运行存储在内部存储器921的指令,从而执行电子设备900的各种功能应用以及数据处理。内部存储器921可以包括存储程序区和存储数据区。其中,存储程序区可存储操作系统,至少一个功能所需的应用程序(比如HDR图像处理功能等)等。存储数据区可存储电子设备900使用过程中所创建的数据(比如HDR图像数据等)等。此外,内部存储器921可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件,闪存器件,通用闪存存储器(universal flash storage,UFS)等。
压力传感器980A用于感受压力信号,可以将压力信号转换成电信号。在一些实施例中,压力传感器980A可以设置于显示屏994。压力传感器980A的种类很多,如电阻式压力传感器,电感式压力传感器,电容式压力传感器等。电容式压力传感器可以是包括至少两个具有导电材料的平行板。当有力作用于压力传感器980A,电极之间的电容改变。电子设备900根据电容的变化确定压力的强度。当有触摸操作作用于显示屏994,电子设备900根据压力传感器980A检测触摸操作强度。电子设备900也可以根据压力传感器980A的检测信号计算触摸的位置。在一些实施例中,作用于相同触摸位置,但不同触摸操作强度的触摸操作,可以对应不同的操作指令。例如:当有触摸操作强度大于或等于第一压力阈值 的触摸操作作用于相机应用中的HDR控件时,执行开启HDR图像处理功能的指令。
陀螺仪传感器980B可以用于确定电子设备900的运动姿态。在一些实施例中,可以通过陀螺仪传感器980B确定电子设备900围绕三个轴(例如x,y和z轴)的角速度。陀螺仪传感器980B可以用于拍摄防抖。示例性的,当按下快门,陀螺仪传感器980B检测电子设备900抖动的角度,根据角度计算出镜头模组需要补偿的距离,让镜头通过反向运动抵消电子设备900的抖动,实现防抖。陀螺仪传感器980B还可以用于导航,体感游戏场景。
加速度传感器980E可检测电子设备900在各个方向上(一般为三轴)加速度的大小。当电子设备900静止时可检测出重力的大小及方向,还可以用于识别电子设备姿态,应用于横竖屏切换等应用。
距离传感器980F用于测量距离。电子设备900可以通过红外或激光测量距离。在一些实施例中,在图像采集场景中,电子设备900可以利用距离传感器980F测距以实现快速对焦。
环境光传感器980K用于感知环境光亮度。电子设备900可以根据感知的环境光亮度,判断当前拍摄环境是否满足触发开启HDR图像处理功能的条件,若判断满足条件,则可以自动开启HDR图像处理功能。此外,电子设备900可以根据感知的环境光亮度自适应调节显示屏994亮度。环境光传感器980K也可用于图像采集时自动调节白平衡。
触摸传感器980J,也称“触控面板”。触摸传感器980J可以设置于显示屏994,由触摸传感器980J与显示屏994组成触摸屏,也称“触控屏”。触摸传感器980J用于检测作用于其上或附近的触摸操作。触摸传感器可以将检测到的触摸操作传递给应用处理器,以确定触摸事件类型。可以通过显示屏994提供与触摸操作相关的视觉输出。在另一些实施例中,触摸传感器980J也可以设置于电子设备900的表面,与显示屏994所处的位置不同。
按键990包括开机键、音量键、HDR控件等。按键990可以是机械按键。也可以是触摸式按键。电子设备900可以接收按键输入,产生与电子设备900的用户设置以及功能控制有关的键信号输入,例如当电子设备接收到用户对HDR控件的输入,电子设备900可以产生触发启用HDR图像处理功能的指令。
马达991可以产生振动提示。马达991可以用于来电振动提示,也可以用于触摸振动反馈。例如,作用于不同应用(例如相机应用等)的触摸操作,可以对应不同的振动反馈效果。作用于显示屏994不同区域的触摸操作,马达991也可对应不同的振动反馈效果。不同的应用场景也可以对应不同的振动反馈效果。触摸振动反馈效果还可以支持自定义。
可选地,电子设备900可以为移动终端,也可以为非移动终端。示例性的,电子设备900可以为手机、平板电脑、笔记本电脑、掌上电脑、车载终端、可穿戴设备、超级移动个人计算机(ultra-mobile personal computer,UMPC)、上网本或者个人数字助理(personal digital assistant,PDA)、无线耳机、无线手环、无线智能眼镜、无线手表、增强现实(augmented reality,AR)/虚拟现实(virtual reality,VR)设备。本申请实施例对电子设备900的设备类型不予具体限定。
应理解,图11所示的电子设备900可对应于图9或图10所示的HDR图像处理装置800。其中,图11所示的电子设备900中的处理器910、摄像头993、显示屏994、内部存储器921,可以分别对应于图10中的HDR图像处理装置800中的图像处理单元820、图像采集单元810、显示单元830、图像存储单元840。
在实际实现时,在电子设备900运行时,处理器910执行存储器921中的计算机执行指令以通过电子设备900执行上述方法的操作步骤。
可选地,在一些实施例中,本申请提供一种芯片,该芯片与存储器耦合,该芯片用于读取并执行存储器中存储的计算机程序或指令,以执行上述各实施例中的方法。
可选地,在一些实施例中,本申请提供一种电子设备,该电子设备包括芯片,该芯片用于读取并执行存储器存储的计算机程序或指令,使得各实施例中的方法被执行。
可选地,在一些实施例中,本申请实施例还提供了一种计算机可读存储介质,该计算机可读存储介质存储有程序代码,当计算机程序代码在计算机上运行时,使得计算机执行上述各实施例中的方法。
可选地,在一些实施例中,本申请实施例还提供了一种计算机程序产品,该计算机程序产品包括计算机程序代码,当计算机程序代码在计算机上运行时,使得计算机执行上述各实施例中的方法。
在本申请实施例中,电子设备包括硬件层、运行在硬件层之上的操作系统层,以及运行在操作系统层上的应用层。其中,硬件层可以包括中央处理器(central processing unit,CPU)、内存管理单元(memory management unit,MMU)和内存(也称为主存)等硬件。操作系统层的操作系统可以是任意一种或多种通过进程(process)实现业务处理的计算机操作系统,例如,Linux操作系统、Unix操作系统、Android操作系统、iOS操作系统或windows操作系统等。应用层可以包含浏览器、通讯录、文字处理软件、即时通信软件等应用。
本申请实施例并未对本申请实施例提供的方法的执行主体的具体结构进行特别限定,只要能够通过运行记录有本申请实施例提供的方法的代码的程序,以根据本申请实施例提供的方法进行通信即可。例如,本申请实施例提供的方法的执行主体可以是电子设备,或者,是电子设备中能够调用程序并执行程序的功能模块。
本申请的各个方面或特征可以实现成方法、装置或使用标准编程和/或工程技术的制品。本文中使用的术语“制品”可以涵盖可从任何计算机可读器件、载体或介质访问的计算机程序。例如,计算机可读介质可以包括但不限于:磁存储器件(例如,硬盘、软盘或磁带等),光盘(例如,压缩盘(compact disc,CD)、数字通用盘(digital versatile disc,DVD)等),智能卡和闪存器件(例如,可擦写可编程只读存储器(erasable programmable read-only memory,EPROM)、卡、棒或钥匙驱动器等)。
本文描述的各种存储介质可代表用于存储信息的一个或多个设备和/或其它机器可读介质。术语“机器可读介质”可以包括但不限于:无线信道和能够存储、包含和/或承载指令和/或数据的各种其它介质。
应理解,本申请实施例中提及的处理器可以是中央处理单元(central processing unit,CPU),还可以是其他通用处理器、数字信号处理器(digital signal processor,DSP)、专用集成电路(application specific integrated circuit,ASIC)、现成可编程门阵列(field programmable gate array,FPGA)或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件等。通用处理器可以是微处理器或者该处理器也可以是任何常规的处理器等。
还应理解,本申请实施例中提及的存储器可以是易失性存储器或非易失性存储器,或可包括易失性和非易失性存储器两者。其中,非易失性存储器可以是只读存储器(read-only memory,ROM)、可编程只读存储器(programmable ROM,PROM)、可擦除可编程只读存储器(erasable PROM,EPROM)、电可擦除可编程只读存储器(electrically EPROM,EEPROM)或闪存。易失性存储器可以是随机存取存储器(random access memory,RAM)。例如,RAM可以用作外部高速缓存。作为示例而非限定,RAM可以包括如下多种形式:静态随机存取存储器(static RAM,SRAM)、动态随机存取存储器(dynamic RAM,DRAM)、同步动态随机存取存储器(synchronous DRAM,SDRAM)、DDR SDRAM、增强型同步动态随机存取存储器(enhanced SDRAM,ESDRAM)、同步连接动态随机存取存储器(synchlink DRAM,SLDRAM)和直接内存总线随机存取存储器(direct rambus RAM,DR RAM)。
需要说明的是,当处理器为通用处理器、DSP、ASIC、FPGA或者其他可编程逻辑器件、分立门或者晶体管逻辑器件、分立硬件组件时,存储器(存储模块)可以集成在处理器中。
还需要说明的是,本文描述的存储器旨在包括但不限于这些和任意其它适合类型的存储器。
本领域普通技术人员可以意识到,结合本文中所公开的实施例描述的各示例的单元及步骤,能够以电子硬件、或者计算机软件和电子硬件的结合来实现。这些功能究竟以硬件还是软件方式来执行,取决于技术方案的特定应用和设计约束条件。专业技术人员可以对每个特定的应用来使用不同方法来实现所描述的功能,但是这种实现不应认为超出本申请的保护范围。
所属领域的技术人员可以清楚地了解到,为描述的方便和简洁,上述描述的系统、装置和单元的具体工作过程,可以参考前述方法实施例中的对应过程,在此不再赘述。
在本申请所提供的几个实施例中,应该理解到,所揭露的系统、装置和方法,可以通过其它的方式实现。例如,以上所描述的装置实施例仅仅是示意性的,例如,所述单元的划分,仅仅为一种逻辑功能划分,实际实现时可以有另外的划分方式,例如多个单元或组件可以结合或者可以集成到另一个系统,或一些特征可以忽略,或不执行。此外,所显示或讨论的相互之间的耦合或直接耦合或通信连接可以是通过一些接口,装置或单元的间接耦合或通信连接,可以是电性,机械或其它的形式。
所述作为分离部件说明的单元可以是或者也可以不是物理上分开的,作为单元显示的部件可以是或者也可以不是物理单元,即可以位于一个地方,或者也可以分布到多个网络单元上。可以根据实际的需要选择其中的部分或者全部单元来实现本实施例方案的目的。
另外,在本申请各个实施例中的各功能单元可以集成在一个单元中,也可以是各个单元单独物理存在,也可以两个或两个以上单元集成在一个单元中。
所述功能如果以软件功能单元的形式实现并作为独立的产品销售或使用时,可以存储在一个计算机可读取存储介质中。基于这样的理解,本申请的技术方案本质上,或者说对现有技术做出贡献的部分,或者该技术方案的部分,可以以计算机软件产品的形式体现出来,该计算机软件产品存储在一个存储介质中,该计算机软件产品包括若干指令,该指令用以使得一台计算机设备(可以是个人计算机,服务器,或者网络设备等)执行本申请各个实施例所述方法的全部或部分步骤。前述的存储介质可以包括但不限于:U盘、移动硬盘、ROM、RAM、磁碟或者光盘等各种可以存储程序代码的介质。
除非另有定义,本文所使用的所有的技术和科学术语与属于本申请的技术领域的技术人员通常理解的含义相同。本文中在本申请的说明书中所使用的术语只是为了描述具体的实施例的目的,不是旨在于限制本申请。
以上所述,仅为本申请的具体实施方式,但本申请的保护范围并不局限于此,任何熟悉本技术领域的技术人员在本申请揭露的技术范围内,可轻易想到变化或替换,都应涵盖在本申请的保护范围之内。因此,本申请的保护范围应以所述权利要求的保护范围为准。

Claims (18)

  1. 一种高动态范围HDR图像处理方法,其特征在于,包括:
    响应于用户的第一操作,以预设帧率连续采集图像,得到M帧图像;所述M帧图像包括交替排列的长曝光图像和短曝光图像,所述长曝光图像的曝光度大于所述短曝光图像的曝光度;
    将所述M帧图像中的每一帧图像与前一帧图像进行融合,并复用所述每一帧图像与后一帧图像进行融合;
    将融合得到的HDR图像按照所述预设帧率依次输出,得到HDR视频。
  2. 根据权利要求1所述的方法,其特征在于,在所述得到M帧图像之后,所述方法还包括:将所述M帧图像存储在图像存储器中;
    其中,所述将所述M帧图像中的每一帧图像与前一帧图像进行融合,并复用所述每一帧图像与后一帧图像进行融合,包括:
    从所述图像存储器中读取第i-1帧图像和第i帧图像,将所述第i-1帧图像与所述第i帧图像进行融合,得到第i-1帧HDR图像;
    从所述图像存储器中读取所述第i帧图像和第i+1帧图像,将所述第i帧图像与所述第i+1帧图像进行融合,得到第i帧HDR图像;
    其中,i依次取2,……,M-1。
  3. 根据权利要求1或2所述的方法,其特征在于,所述以预设帧率连续采集图像,包括:
    第一摄像头采用第一曝光参数采集长曝光图像,并采用第二曝光参数采集短曝光图像;
    其中,所述第一曝光参数对应的曝光度大于所述第二曝光参数对应的曝光度。
  4. 根据权利要求3所述的方法,其特征在于,所述方法还包括:
    当所述第一摄像头的曝光参数包括曝光时间时,根据曝光时间确定所述曝光参数对应的曝光度;或者,
    当所述第一摄像头的曝光参数包括曝光时间和ISO感光值时,根据曝光时间和ISO感光值的乘积值确定所述曝光参数对应的曝光度。
  5. 根据权利要求1至4中任一项所述的方法,其特征在于,在第一摄像头的曝光参数为曝光时间的情况下,所述方法还包括:
    采用第一自动曝光算法,确定所述长曝光图像对应的第一曝光时间和所述短曝光图像对应的第二曝光时间;
    其中,所述第一自动曝光算法包括:所述第一曝光时间和所述第二曝光时间的总和小于或者等于总曝光时长。
  6. 根据权利要求1至4中任一项所述的方法,其特征在于,在第一摄像头的曝光参数为曝光时间的情况下,所述方法还包括:
    在当前拍摄场景中有日光灯照射的情况下,采用第二自动曝光算法,确定所述长曝光图像对应的第一曝光时间和所述短曝光图像对应的第二曝光时间;
    其中,所述第二自动曝光算法包括:所述第一曝光时间和所述第二曝光时间的总和小于或者等于总曝光时长,且所述第一曝光时间和所述第二曝光时间均为灯光频闪周期的整数倍。
  7. 根据权利要求6所述的方法,其特征在于,所述灯光频闪周期为10毫秒。
  8. 根据权利要求5至7中任一项所述的方法,其特征在于,所述确定所述第一曝光时间和所述第二曝光时间,包括:
    根据预设曝光比,确定所述第一曝光时间和所述第二曝光时间;
    其中,所述预设曝光比为所述第一曝光时间和所述第二曝光时间之间的比值;或者,所述预设曝光比为第一乘积值和第二乘积值之间的比值,所述第一乘积值为所述第一曝光时间和第一ISO感光值的乘积值,所述第二乘积值为所述第二曝光时间和第二ISO感光值的乘积值。
  9. 根据权利要求5至8中任一项所述的方法,其特征在于,所述方法还包括:
    根据所述预设帧率,确定所述总曝光时长。
  10. 根据权利要求1至9中任一项所述的方法,其特征在于,在所述响应于用户的第一操作,以预设帧率连续采集图像,得到M帧图像之前,所述方法还包括:
    响应于用户的第二操作,开启相机HDR模式。
  11. 根据权利要求1至10中任一项所述的方法,其特征在于,所述方法还包括:
    当检测到当前拍摄场景的环境亮度大于或等于预设亮度阈值时,自动开启相机HDR模式;或者,
    当检测到当前拍摄场景的环境亮度小于预设亮度阈值时,自动关闭相机HDR模式。
  12. 根据权利要求1至11中任一项所述的方法,其特征在于,所述方法还包括:
    将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准。
  13. 根据权利要求12所述的方法,其特征在于,所述将所述M帧图像中的每一帧图像与前一帧图像进行配准,并复用所述每一帧图像与后一帧图像进行配准,包括:
    以所述长曝光图像为基准,对齐所述短曝光图像;或者,
    以所述短曝光图像为基准,对齐所述长曝光图像。
  14. 根据权利要求2所述的方法,其特征在于,所述图像存储器为双倍速率DDR同步动态随机存取存储器。
  15. 一种电子设备,其特征在于,包括摄像头和处理器,所述处理器与存储器耦合,所述处理器用于执行所述存储器中存储的计算机程序或指令,以使得所述电子设备实现如权利要求1至14中任一项所述的方法。
  16. 一种芯片系统,其特征在于,所述芯片系统与存储器耦合,所述芯片系统用于读取并执行所述存储器中存储的计算机程序,以实现如权利要求1至14中任一项所述的方法。
  17. 一种计算机可读存储介质,其特征在于,所述计算机可读存储介质存储有计算机程序,当所述计算机程序在电子设备上运行时,使得所述电子设备执行如权利要求1至14中任一项所述的方法。
  18. 一种计算机程序产品,其特征在于,当所述计算机程序产品在计算机上运行时,使得所述计算机执行如权利要求1至14中任一项所述的方法。
PCT/CN2022/114854 2021-09-17 2022-08-25 Hdr图像处理方法及电子设备 WO2023040622A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202111095578.6 2021-09-17
CN202111095578.6A CN115842962A (zh) 2021-09-17 2021-09-17 Hdr图像处理方法及电子设备

Publications (2)

Publication Number Publication Date
WO2023040622A1 true WO2023040622A1 (zh) 2023-03-23
WO2023040622A9 WO2023040622A9 (zh) 2023-05-19

Family

ID=85574158

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/114854 WO2023040622A1 (zh) 2021-09-17 2022-08-25 Hdr图像处理方法及电子设备

Country Status (2)

Country Link
CN (1) CN115842962A (zh)
WO (1) WO2023040622A1 (zh)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117692799A (zh) * 2023-08-26 2024-03-12 荣耀终端有限公司 一种拍摄方法及相关设备
CN117689559A (zh) * 2023-08-07 2024-03-12 上海荣耀智慧科技开发有限公司 一种图像融合方法、装置、电子设备及存储介质
CN117714897A (zh) * 2023-07-24 2024-03-15 荣耀终端有限公司 确定帧率的方法、电子设备及可读存储介质
CN117750190A (zh) * 2024-02-20 2024-03-22 荣耀终端有限公司 一种图像处理方法及电子设备

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116664742B (zh) * 2023-07-24 2023-10-27 泉州华中科技大学智能制造研究院 针对激光光条成像的hdr高动态范围处理方法及装置

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101917550A (zh) * 2010-07-01 2010-12-15 清华大学 High-spatiotemporal-resolution video deblurring method and system
US20130208138A1 (en) * 2012-02-09 2013-08-15 Aptina Imaging Corporation Imaging systems and methods for generating auto-exposed high-dynamic-range images
CN104134352A (zh) * 2014-08-15 2014-11-05 青岛比特信息技术有限公司 Video vehicle feature detection system based on combined long and short exposures and detection method thereof
CN106131445A (zh) * 2016-07-08 2016-11-16 深圳天珑无线科技有限公司 Shooting method and apparatus
CN113038028A (zh) * 2021-03-24 2021-06-25 浙江光珀智能科技有限公司 Image generation method and system

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016034094A (ja) * 2014-07-31 2016-03-10 ソニー株式会社 Image processing apparatus, image processing method, program, and image sensor
KR102628911B1 (ko) * 2019-01-07 2024-01-24 삼성전자주식회사 Image processing method and image processing apparatus performing the same
CN112738414B (zh) * 2021-04-06 2021-06-29 荣耀终端有限公司 Photographing method, electronic device, and storage medium

Also Published As

Publication number Publication date
WO2023040622A9 (zh) 2023-05-19
CN115842962A (zh) 2023-03-24

Similar Documents

Publication Publication Date Title
WO2023040622A1 (zh) HDR image processing method and electronic device
WO2021052232A1 (zh) Time-lapse photography shooting method and device
EP3893491A1 (en) Method for photographing the moon and electronic device
CN113475057B (zh) Video recording frame rate control method and related apparatus
CN116582741B (zh) Shooting method and device
US10200623B1 (en) Image capture setting determination in flash photography operations
CN109618102B (zh) Focusing processing method and apparatus, electronic device, and storage medium
CN116055897B (zh) Photographing method and related device
CN115633262B (zh) Image processing method and electronic device
CN116320771B (zh) Shooting method and electronic device
CN116744120B (zh) Image processing method and electronic device
CN116055890A (zh) Method for generating high dynamic range video and electronic device
WO2022252649A1 (zh) Video processing method and electronic device
CN115633252B (zh) Shooting method and related device
CN117201930A (zh) Photographing method and electronic device
EP4366289A1 (en) Photographing method and related apparatus
CN115883957A (zh) Shooting mode recommendation method
CN113891008B (zh) Exposure intensity adjustment method and related device
CN116055855B (zh) Image processing method and related device
CN117750190B (zh) Image processing method and electronic device
RU2789447C1 (ru) Multi-channel video recording method and device
CN116709042B (zh) Image processing method and electronic device
CN117692799A (zh) Shooting method and related device
CN103179337A (zh) Digital photographing apparatus and control method thereof
CN115379039A (zh) Video shooting method, apparatus, and electronic device

Legal Events

Date Code Title Description
NENP Non-entry into the national phase

Ref country code: DE