WO2019183813A1 - Image capture method and device - Google Patents

Image capture method and device

Info

Publication number
WO2019183813A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
brightness
frame
value
short exposure
Prior art date
Application number
PCT/CN2018/080734
Other languages
French (fr)
Chinese (zh)
Inventor
孙涛
朱聪超
Original Assignee
华为技术有限公司 (Huawei Technologies Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co., Ltd. (华为技术有限公司)
Priority to PCT/CN2018/080734 priority Critical patent/WO2019183813A1/en
Priority to CN201880077221.5A priority patent/CN111418201B/en
Publication of WO2019183813A1 publication Critical patent/WO2019183813A1/en

Links

Images

Classifications

    • H — ELECTRICITY
    • H04 — ELECTRIC COMMUNICATION TECHNIQUE
    • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 — Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 — Control of cameras or camera modules

Definitions

  • the embodiments of the present invention relate to the field of communications technologies, and in particular, to a shooting method and device.
  • The existing high dynamic range image shooting method is shown in FIG. 1: a long exposure image, a short exposure image, and a normal exposure image used as a reference are synthesized into one image by an exposure fusion algorithm.
  • The dark areas of the image are brightened by the long exposure and the bright areas are recovered by the short exposure, so that both the bright and the dark parts of the entire image retain detail.
  • However, this shooting method has the following problems: the feature points of images of different brightness are inconsistent, which makes registration difficult and leaves residual motion between images; the long exposure image may be blurred because of its long exposure time; and the overall brightening of the long exposure image lowers the image contrast. As a result, the quality of the captured image is poor.
  • The embodiments of the present application provide a photographing method, device, and system, to solve the problem of poor image quality when capturing high dynamic range images.
  • A first aspect of the embodiments of the present application provides a photographing method, applied to an electronic device including a camera and an image signal processor (ISP). The method includes: the electronic device calculates a short exposure amount and a number of shooting frames M according to a preview image of a high dynamic range image; sends the short exposure amount and the number of shooting frames M to the camera through the ISP; controls the camera to acquire M frames of short exposure images according to the short exposure amount and the number of shooting frames M; performs multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image; sends the RAW image to the ISP for processing to obtain a YUV image; and compression-encodes the YUV image to obtain a target image.
  • In this solution, the shooting parameters (the short exposure amount and the number of shooting frames M) are calculated from the currently captured preview image, the multi-frame short exposure images are captured using the calculated shooting parameters, and the multi-frame short exposure images undergo noise reduction, local brightness adjustment, and similar processing to obtain a RAW image, which is then processed by the ISP to obtain the target image.
  • The captured image therefore has better noise performance, and the highlight detail of the image is preserved by the local brightness adjustment; in addition, because the RAW image is fed back into the ISP and the ISP's processing is fast, shooting efficiency is improved.
  • Calculating the short exposure amount based on the preview image includes: calculating the brightness average value of the highlight region composed of the overexposed pixels in the preview image, and determining the short exposure amount according to the brightness average value of the highlight region, the exposure value of the preview image, and the target brightness value that the user desires to achieve. In this way, the short exposure amount can be determined according to the image brightness desired by the user, so that the brightness of the captured image matches the user's request.
  • The short exposure amount can be calculated according to the preview image, and the multi-frame short exposure shooting scheme described in the first aspect executed; alternatively, the short exposure amount calculated according to the preview image may be lengthened, that is, the exposure time may be increased, to make the captured image brighter.
  • Calculating the short exposure amount according to the preview image may include: calculating the brightness average value of the feature area of the preview image; calculating a first brightness reduction ratio of the feature area according to the brightness average value of the feature area and the target brightness value; determining a second brightness reduction ratio of the feature area according to the calculated first brightness reduction ratio and a preset minimum reduction ratio for brightness compensation of the feature area; and determining the short exposure amount according to the exposure value of the preview image and the second brightness reduction ratio, where the second brightness reduction ratio is greater than or equal to the minimum reduction ratio.
  • In this way, the brightness reduction ratio used to compensate the feature area is never smaller than the preset minimum reduction ratio, which guarantees the exposure amount needed to brighten the feature area.
  • To avoid the problem that, when the preview image is severely underexposed, the multi-frame short exposure shooting scheme cannot render the details of dark areas well, when the ratio of underexposed pixels in the preview image is greater than a preset threshold, before the multi-frame noise reduction processing and local brightness adjustment are performed on the M frames of short exposure images to obtain one frame of RAW image, the method further includes: calculating the brightness average value of the over-dark area of the preview image, determining a long exposure amount according to the brightness average value of the over-dark area, the exposure value of the preview image, and the target brightness value, and controlling the camera to acquire one frame of long exposure image.
  • The M frames of short exposure images are subjected to multi-frame noise reduction processing, the locally brightness-adjusted image and the long exposure image are subjected to exposure fusion to obtain one frame of RAW image, and the RAW image is input into the ISP for processing.
  • The multi-frame short exposure image + one frame long exposure image processing scheme thus improves the noise and detail of the dark areas in the image.
  • The number of frames of long exposure images can also be set dynamically according to the degree of underexposure of the preview image (it can be set not only to one frame but also to multiple frames), and a multi-frame short exposure image + multi-frame long exposure image processing scheme can be used to further improve the noise and detail of the dark areas in the image.
  • The multi-frame noise reduction processing includes multi-frame time domain noise reduction processing, or multi-frame time domain noise reduction processing combined with spatial domain noise reduction processing.
  • A second aspect of the embodiments of the present application provides a photographing method, applied to an electronic device including a camera and an ISP, including: the electronic device calculates a long exposure amount and a number of shooting frames N according to a preview image of a high dynamic range image; sends the long exposure amount and the number of shooting frames N to the camera through the ISP; controls the camera to acquire N frames of long exposure images according to the long exposure amount and the number of shooting frames N; performs multi-frame noise reduction processing and local brightness adjustment on the N frames of long exposure images to obtain a RAW image; sends the RAW image to the ISP for processing to obtain a YUV image; and compression-encodes the YUV image to obtain a target image.
  • This scheme can be applied to the case where the preview image is severely underexposed.
  • A third aspect of the embodiments of the present application provides an electronic device including a camera and an ISP, the electronic device further including a calculating unit, a shooting control unit, and an image processing unit. The calculating unit is configured to calculate a short exposure amount and a number of shooting frames M according to the preview image; the shooting control unit is configured to send the short exposure amount and the number of shooting frames M to the camera through the ISP and to control the camera to acquire M frames of short exposure images according to the short exposure amount and the number of shooting frames M; and the image processing unit is configured to perform multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, send the RAW image to the ISP for processing to obtain a YUV image, and compression-encode the YUV image to obtain a target image.
  • The electronic device can achieve the same beneficial effects as the first aspect or any of the possible designs of the first aspect.
  • A fourth aspect of the embodiments of the present application provides an electronic device including one or more processors and one or more memories.
  • The one or more memories are coupled to the one or more processors and are used to store computer program code, the computer program code including computer instructions; when the one or more processors execute the computer instructions, the electronic device performs the photographing method in any of the possible designs of any of the above aspects.
  • a fifth aspect of the embodiments of the present application provides a computer storage medium comprising computer instructions that, when executed on an electronic device, cause the electronic device to perform a photographing method in any of the possible designs of any of the above aspects.
  • Also provided is a computer program product that, when run on a computer, causes the computer to perform the photographing method in any of the possible designs of any of the above aspects.
  • FIG. 1 is a schematic flowchart of a conventional high dynamic range image shooting method;
  • FIG. 2 is a schematic block diagram of an embodiment of the present application;
  • FIG. 3 is a structural diagram of a mobile phone according to an embodiment of the present application.
  • FIG. 4 is a flowchart of a shooting method according to an embodiment of the present application.
  • FIG. 5 is a flowchart of a shooting method according to an embodiment of the present application.
  • FIG. 5a is a gray histogram of a preview image according to an embodiment of the present application.
  • FIG. 5b is a schematic diagram of highlight recovery according to an embodiment of the present application.
  • FIG. 5c is a schematic diagram of local enhancement of an image according to an embodiment of the present application.
  • FIG. 5d is a schematic diagram of image brightening provided by an embodiment of the present application;
  • FIG. 5e is a schematic diagram of detail overlay according to an embodiment of the present application;
  • FIG. 6a is a schematic diagram of a photographing effect provided by an embodiment of the present application;
  • FIG. 6b is a schematic diagram of an effect of photographing a person according to an embodiment of the present application.
  • FIG. 7 is a flowchart of a method for photographing a multi-frame short exposure image + a long exposure image according to an embodiment of the present application
  • FIG. 7a is a schematic diagram of a shooting effect of a multi-frame short exposure image + a long exposure image according to an embodiment of the present application;
  • FIG. 7b is a schematic diagram of a weight curve provided by an embodiment of the present application.
  • FIG. 7c is a schematic diagram of exposure fusion provided by an embodiment of the present application.
  • FIG. 8 is a structural diagram of an electronic device according to an embodiment of the present application.
  • The principle block diagram of the embodiments of the present application is shown in FIG. 2: after the user opens the camera application, the camera preview function is activated, the shooting parameters (the short exposure amount and the number of shooting frames M) are determined according to the preview image, and the determined shooting parameters are delivered to the camera through the ISP configured in the shooting device.
  • A YUV image can refer to an image obtained by the YUV coding method.
  • YUV is a color coding method adopted by European television systems.
  • During acquisition, a three-tube color camera can be used to capture the image; the obtained color image signal is then color-separated and separately amplified and corrected to obtain RGB signals, which pass through a matrix conversion circuit to obtain a luminance signal Y and two color-difference signals R−Y (i.e., U) and B−Y (i.e., V); finally, the sender encodes the luminance signal and the two color-difference signals separately to obtain a YUV image.
  • The dynamic range refers to the ability of the camera to adapt to the range of illumination in the shooting scene, specifically the range of brightness it can capture.
  • an image with a large range of brightness variation may be referred to as a High Dynamic Range (HDR) image
  • an image with a small range of luminance variation may be referred to as a Low Dynamic Range (LDR) image.
  • Exposure refers to the time interval from when the camera shutter opens to when it closes; during this time, the object can leave an image on the film.
  • The exposure time is chosen as needed: the longer the exposure time, the brighter the resulting photo, and the shorter, the darker. A longer exposure time is generally required in dark conditions, and a short exposure time is suitable for bright light.
  • The short exposure amount and the long exposure amount are defined relative to the normal exposure amount: an exposure amount smaller than the normal exposure amount is referred to as a short exposure amount, and an exposure amount larger than the normal exposure amount is referred to as a long exposure amount.
  • the normal exposure can be the exposure when the camera previews the image.
  • For example, the average of the Y values of the current image can be calculated in YUV space while the exposure parameter settings are adjusted (automatically or manually); when the average falls near a target value, the exposure amount in the exposure parameters is considered to be the normal exposure.
  • The photographing method provided by the embodiments of the present application can be applied to an electronic device provided with a camera, where the electronic device can be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a palmtop computer (Personal Digital Assistant, PDA), a camera, a digital camera, a surveillance device, or other equipment.
  • the electronic device is used as the mobile phone 100 shown in FIG. 3 as an example, and the shooting method provided by the present application is introduced.
  • The illustrated mobile phone 100 is only one example of an electronic device; the mobile phone 100 may have more or fewer components than shown in the figures, may combine two or more components, or may have a different component configuration.
  • the various components shown in the figures can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.
  • The mobile phone 100 may include components such as a processor 101, a memory 102, an ISP 103, a touch screen 104, a camera 105, and the like, which may communicate through one or more communication buses or signal lines (not shown in FIG. 3). It will be understood by those skilled in the art that the hardware structure shown in FIG. 3 does not constitute a limitation on the mobile phone 100, and the mobile phone 100 may include more or fewer components than illustrated, combine some components, or use a different component arrangement.
  • The processor 101 is the control center of the mobile phone 100; it connects the various parts of the mobile phone 100 using various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing applications (Application, App) stored in the memory 102 and calling the data and instructions stored in the memory 102.
  • The processor 101 may include one or more processing units; the processor 101 may also integrate an application processor (AP), a modem processor, and a digital signal processor (DSP). The application processor mainly handles the operating system, the user interface, applications, and the like; the modem processor mainly handles wireless communication; and the DSP is mainly used to convert analog signals into digital signals and to filter noise from the digital signals.
  • the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd.
  • the memory 102 is used to store applications and data, and the processor 101 performs various functions and data processing of the mobile phone 100 by running applications and data stored in the memory 102.
  • The memory 102 mainly includes a program storage area and a data storage area, where the program storage area can store an operating system and the applications required for at least one function (such as a sound playing function or an image playing function), and the data storage area can store data created during use of the mobile phone 100 (such as audio data and a phone book).
  • the memory 102 may include a high speed random access memory, and may also include a nonvolatile memory such as a magnetic disk storage device, a flash memory device, or other volatile solid state storage device.
  • the memory 102 can store various operating systems, such as the IOS operating system developed by Apple Inc., the ANDROID operating system developed by Google Inc., and the like.
  • the ISP 103 is configured to perform processing such as dead pixel repair, white balance, gamma correction, sharpness, color interpolation, and the like on the image output by the DSP in the processor 101, and output an image required by the user application.
  • ISP 103 is a determining factor in the performance of an imaging device.
  • the ISP 103 can be integrated in the AP, or it can be a separate chip, and is not limited.
  • The touch screen 104, which can also be referred to as a touch display panel, is used to implement the input and output functions of the mobile phone 100. It can collect touch operations by the user on or near it (such as operations performed on or near the touch screen 104 with a finger, a stylus, or any other suitable object or accessory, for example the user pressing the shooting button) and drive the corresponding connection device according to a preset program; it can also be used to display information input by the user or provided to the user (for example, the images captured by the camera) and the various menus of the phone.
  • The touch screen 104 may include two parts: a touch detection device and a touch controller. The touch detection device detects the user's touch position, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, and sends them to the processor 101, and it can also receive commands from the processor 101 and execute them.
  • The camera 105 is a component having basic functions such as video capture/transmission and still image capture, and is mainly used here for image capture.
  • The camera 105 may include a lens and an image sensor, and the image sensor may be a Charge Coupled Device (CCD), a Complementary Metal Oxide Semiconductor (CMOS) image sensor, or any other type of image sensor.
  • In the embodiments of the present application, when the mobile phone 100 performs the high dynamic image capture function, the processor 101 obtains the short exposure amount and the number of shooting frames M and transmits the short exposure amount and M to the camera 105 through the ISP 103; after receiving the photographing instruction, the processor 101 controls the camera 105 to acquire M frames of short exposure images according to the short exposure amount and M; the processor 101 then performs RAW-domain processing operations such as multi-frame time domain noise reduction, spatial domain noise reduction, and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image; the processor 101 transmits the RAW image to the ISP 103, the ISP 103 converts the RAW image into a YUV image, and the processor 101 converts the YUV image into a target image, such as a JPEG image.
  • For a possible design, refer to the scheme shown in FIG. 4 or FIG. 5.
  • the handset 100 may also include a light sensor 106.
  • the light sensor 106 may include an ambient light sensor and a proximity sensor.
  • the ambient light sensor may sense the brightness of ambient light around the mobile phone 100, so that the mobile phone 100 adjusts the brightness of the display of the touch screen 104 according to the brightness of the ambient light.
  • the proximity sensor can sense the proximity of the handset 100 to the human ear, and the handset 100 can turn off the power of the display as it moves to the ear.
  • the mobile phone 100 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, and the like, and will not be described herein.
  • The mobile phone 100 may further include a power supply device 107 (such as a battery and a power management chip) that supplies power to the various components.
  • The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are handled through the power supply device 107.
  • the mobile phone 100 may further include a Bluetooth device, a positioning device, an audio circuit, a speaker, a microphone, a WI-FI device, a near field communication (NFC) device, and the like, and details are not described herein.
  • the following embodiments can all be implemented in an electronic device (e.g., mobile phone 100) having the above hardware.
  • FIG. 4 or FIG. 5 is a flowchart of a photographing method provided by an embodiment of the present application, wherein the method can be performed by the mobile phone 100 shown in FIG. 3 to capture a high dynamic image.
  • the method may include S501 to S506.
  • S501: After detecting that the mobile phone is powered on, the processor of the mobile phone automatically enables the function for capturing high dynamic range images provided by this application; or, after receiving a user operation that turns on the shooting function provided by this application, the processor of the mobile phone enables the function for capturing high dynamic range images.
  • For example, in a monitoring scenario, the shooting method provided by the embodiments of the present application can be turned on to implement monitoring.
  • S502: The processor of the mobile phone receives the user's request to open the camera application, opens the camera application, and controls the camera to start the image preview function to obtain a preview image.
  • The preview image may refer to the image displayed on the display screen of the mobile phone before the image the user intends to capture is actually taken. For example, when the processor of the mobile phone detects that the user requests to start the camera application by clicking the desktop icon or sliding the camera shortcut icon on the unlock interface, the processor of the mobile phone controls the camera to capture and focus the image and obtains the preview image. The captured preview image can also be displayed on the display screen of the mobile phone for the user to preview.
  • The preview image may include a plurality of pixels, each corresponding to a gray value; the gray value indicates the brightness of the pixel and may range from 0 to 255, where a larger gray value indicates a brighter pixel and a smaller gray value indicates a darker pixel.
  • The gray histogram of the preview image may be obtained, and the proportion of overexposed pixels and the proportion of underexposed pixels in the preview image are calculated from the gray histogram. If the proportion of overexposed pixels is greater than a first preset value and/or the proportion of underexposed pixels is greater than a second preset value, the preview image is determined to be a high dynamic range image.
  • Here, "the proportion of overexposed pixels is greater than the first preset value and/or the proportion of underexposed pixels is greater than the second preset value" may mean that the proportion of overexposed pixels is greater than the first preset value, or that the proportion of underexposed pixels is greater than the second preset value, or that both conditions hold.
  • The gray histogram is a statistic of the brightness-level distribution of the pixels in the preview image: all pixels in the preview image are counted (binned) by gray value, with the horizontal axis representing brightness (gray value increasing from left to right) and the vertical axis representing the relative number (probability) of pixels in the preview image at each brightness level.
  • the overexposed pixel may refer to a pixel whose gray value is greater than the overexposure threshold; the underexposed pixel may refer to a pixel whose gray value is smaller than the underexposure threshold, and the overexposure threshold and the underexposure threshold may be set as needed, and are not limited.
  • The proportion of overexposed pixels may refer to the ratio of the pixels in the interval [overexposure threshold T_over, 255] of the gray histogram to all pixels included in the preview image, denoted Ratio_over; the proportion of underexposed pixels may refer to the ratio of the pixels in [0, underexposure threshold] of the gray histogram to all pixels included in the preview image, denoted Ratio_under.
  • Ratio_over exceeding the first preset value (for example, 5%) indicates that the preview image contains overexposed regions, and Ratio_under exceeding the second preset value (for example, 40%) indicates that the preview image is underexposed; in either case the overall brightness contrast of the preview image is relatively large.
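  • For illustration only, the following sketch shows how these ratios could be computed with NumPy; the specific thresholds and preset values used here are assumptions, not values taken from this application.

```python
# Illustrative sketch: over-/under-exposure ratios from the gray histogram of a
# preview image. T_over, T_under and the preset limits are assumed values.
import numpy as np

def is_high_dynamic_range(preview_gray, t_over=225, t_under=30,
                          first_preset=0.05, second_preset=0.40):
    """preview_gray: uint8 array of gray values in [0, 255]."""
    hist, _ = np.histogram(preview_gray, bins=256, range=(0, 256))
    total = preview_gray.size
    ratio_over = hist[t_over:256].sum() / total    # pixels in [T_over, 255]
    ratio_under = hist[0:t_under].sum() / total    # pixels in [0, T_under)
    hdr = ratio_over > first_preset or ratio_under > second_preset
    return hdr, ratio_over, ratio_under
```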
  • the brightness average value of the highlight region of the preview image may be calculated, and the short exposure amount is determined according to the brightness average value of the highlight region, the exposure value of the preview image, and the target brightness value.
  • the average value of the brightness of the highlight region may refer to the ratio of the sum of the gray values of all the pixels in the highlight region to the number of all the pixels in the highlight region
  • the target luminance value may refer to the brightness that the user desires to achieve.
  • the highlight area may refer to an area formed by overexposed pixels in the preview image, and the area may also be referred to as an overexposed area of the preview image, and the brightness average of the highlight area is greater than the target brightness average.
  • the ratio of the target luminance value to the luminance average value of the highlight region may be calculated, and the product of the exposure value of the preview image and the ratio is used as the short exposure amount.
  • The ratio R_over of the target luminance value to the highlight-region average value M_over is used as the falling ratio of the luminance compensation to be applied to the highlight region, and the short exposure amount is E_target = E_1 × R_over, where E_1 is the exposure value of the preview image.
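  • A minimal sketch of this calculation follows; the overexposure threshold used to select the highlight region and the example target value 210 (a value that appears elsewhere in this description) are assumptions, and the highlight region is assumed to be non-empty.

```python
# Sketch: short exposure amount from the highlight-region brightness average.
import numpy as np

def short_exposure(preview_gray, exposure_value, target=210, t_over=225):
    highlight = preview_gray[preview_gray >= t_over]   # over-exposed pixels (assumed non-empty)
    m_over = highlight.mean()                          # highlight brightness average M_over
    r_over = target / m_over                           # falling ratio R_over, < 1
    return exposure_value * r_over                     # E_target = E_1 * R_over
```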
  • The brightness average value of the over-dark area in the preview image may be calculated, and the number of shooting frames M is determined according to a correspondence between the brightness average value of the over-dark area and the number of shooting frames.
  • The brightness average value of the over-dark area may refer to the ratio of the sum of the gray values of all pixels in the over-dark area to the number of pixels in that area; the over-dark area may refer to the area formed by the underexposed pixels in the preview image, which may also be called the underexposed area of the preview image.
  • The correspondence may be a preset functional relationship.
  • For example, the brightness average value M_under of the pixels whose gray values lie in the [0, 64] interval can be calculated, and M is then set according to the following rule: when M_under is less than or equal to the threshold 20, M is 6; when M_under is greater than the threshold 20 and less than or equal to 40, M is 4; and when M_under is greater than the threshold 40, a smaller value (for example, 2) is used.
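  • As a sketch only, the rule above could be written as follows, assuming the 6 / 4 / 2 mapping implied by the example frame counts mentioned below.

```python
# Sketch: choose the number of short-exposure frames M from the dark-area mean.
def shooting_frames(m_under):
    """m_under: brightness average of pixels whose gray value lies in [0, 64]."""
    if m_under <= 20:
        return 6      # very dark shadow area: more frames for noise reduction
    elif m_under <= 40:
        return 4
    else:
        return 2      # relatively bright shadow area: fewer frames suffice
```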
  • The number of shooting frames may also be determined with reference to the shooting scene, which may be any scene such as daytime shooting, night shooting, backlit shooting, night-scene shooting, dark-light shooting, or point-light-source shooting; the brighter the current shooting scene, the smaller M is set.
  • M can be set to 2, 4, or 6 frames, and so on; for example, M is 6 frames for night-scene shooting, 2 frames for daytime shooting, and 4 frames for strongly backlit scenes.
  • The current shooting scene can be determined from the ratio of the exposure time ET of the current preview image to the sensitivity ISO, together with the preview image itself. For example, when the ratio R of the exposure time ET to the ISO is less than a preset ratio (for example, 0.9), the current scene is daytime shooting, and otherwise it is night-scene shooting. If it is daytime shooting and the proportion of underexposed pixels in the preview image exceeds a certain (large) threshold, the current scene may be a backlit scene; if it is night-scene shooting and the proportion of overexposed pixels in the preview image exceeds a certain (small) threshold T_1 but is less than a certain (large) threshold T_2, the current scene may be point-light-source shooting; if it is night-scene shooting and the proportion of overexposed pixels in the preview image is smaller than T_1, it can be considered dark-light shooting.
  • The exposure time ET is generally expressed as a reciprocal value (a fraction of a second).
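  • The following sketch illustrates this decision logic; the threshold values for the under- and over-exposure proportions are placeholders, since the text leaves them unspecified, and only the 0.9 ratio is taken from the example above.

```python
# Sketch: scene classification from ET/ISO and the exposure-ratio statistics.
# t_large, t1 and t2 are placeholder thresholds (assumptions).
def classify_scene(et, iso, ratio_under, ratio_over,
                   day_ratio=0.9, t_large=0.4, t1=0.02, t2=0.2):
    if et / iso < day_ratio:
        return "backlight" if ratio_under > t_large else "daytime"
    if t1 < ratio_over < t2:
        return "point light source"
    if ratio_over <= t1:
        return "dark light"
    return "night scene"
```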
  • S504: The processor of the mobile phone sends the short exposure amount and the number of shooting frames M to the camera through the ISP, and when the processor of the mobile phone receives the photographing instruction issued by the user, it controls the camera to capture M frames of short exposure images according to the short exposure amount and the number of shooting frames M.
  • Specifically, the processor of the mobile phone controls the camera to acquire the M frames of short exposure images based on the short exposure amount and other parameters (sensitivity, aperture value, etc.).
  • S505: The processor of the mobile phone performs multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image.
  • Each process in this step is described in detail below.
  • S506: The processor of the mobile phone transmits the RAW image to the ISP of the mobile phone, the ISP converts the RAW image into a YUV image, and the processor of the mobile phone performs JPEG encoding on the ISP-processed YUV image to obtain the target image.
  • YUV is an encoding format in which Y represents luminance (brightness) and U and V represent chrominance.
  • For example, the ISP converts the RAW image into a YUV image using a conversion such as:
  Y = 0.299R + 0.587G + 0.114B
  U = −0.147R − 0.289G + 0.436B
  V = 0.615R − 0.515G − 0.100B
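  • For illustration, the conversion above can be applied per pixel as in the following sketch; the Y row is the standard BT.601 luminance formula, added here for completeness.

```python
# Sketch: RGB-to-YUV conversion with the coefficients listed above.
import numpy as np

def rgb_to_yuv(rgb):
    """rgb: float array (..., 3) with R, G, B in [0, 255]; U and V are signed."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return np.stack([y, u, v], axis=-1)
```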
  • The JPEG encoding of the YUV image may include processing such as detail adjustment and edge cropping; for the specific process, refer to the prior art, and details are not described here.
  • In summary, in the shooting method provided by the embodiments of the present application, the short exposure amount and the number of shooting frames needed to compensate the highlight region can be determined in real time from the preview image; multi-frame short exposure images are collected based on the determined short exposure amount and frame number; multi-frame noise reduction processing, local brightness adjustment, and other processing are performed on the multi-frame short exposures in the RAW domain to obtain one frame of RAW image; the resulting RAW image is then fed back into the ISP to be processed into a YUV image, and the ISP-processed YUV image is encoded to obtain a JPEG image.
  • In this way, first, the short exposure amount and the frame number M are calculated from the currently captured preview image, which ensures that the exposure can be controlled in every shooting scene; second, performing multi-frame noise reduction, local brightness adjustment, and other processing on the short exposure frames in the RAW domain not only gives better noise performance but also preserves highlight detail through the local brightness adjustment; in addition, because the processed RAW image is fed back to the ISP and the ISP's processing is fast, shooting efficiency is greatly improved.
  • In FIG. 6a, the left image is the preview image and the right image is the image captured after the scheme shown in FIG. 5 is executed; after the scheme shown in FIG. 5 is executed, the details of the highlight area and the over-dark area are both improved.
  • Specifically, S505 may include the following processes (1) to (6), where (1) to (3) are the multi-frame noise reduction processes and (4) to (6) are the local brightness adjustment processes.
  • a frame reference image R is selected from the M frame short exposure images.
  • the contrast of each frame of the M frame short exposure image can be obtained, and the image with the highest contrast is used as the reference image R in order to improve the sharpness after image fusion.
  • the contrast of the image can be used to characterize the sharpness of the image, the higher the contrast, the clearer the image.
  • For example, the average of the Laplacian gradient of the image can be calculated and used as the contrast of the image; the calculation of the average Laplacian gradient can refer to the prior art and is not described again here.
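  • A sketch of this selection, using OpenCV's Laplacian operator as one way to compute the average gradient, is shown below.

```python
# Sketch: pick the sharpest of the M short-exposure frames as the reference R.
import cv2
import numpy as np

def select_reference(frames):
    """frames: list of single-channel uint8 images (the M short-exposure frames)."""
    def contrast(img):
        lap = cv2.Laplacian(img, cv2.CV_64F)
        return np.mean(np.abs(lap))          # average Laplacian gradient
    return max(frames, key=contrast)
```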
  • The second frame image may be any image of the M frames of short exposure images other than the reference image R, and the third frame image may be any image of the M frames that has not yet been subjected to pixel-value differencing with the reference image.
  • Assume the reference image is R and the second frame image is M. The feature points in R are first detected, the feature points in M are detected, and the feature points of R and M are matched; then the warp matrix is calculated from the matched feature points, M is transformed by the warp matrix to obtain the registered image M', and R and M' are thereby aligned.
  • A feature point may be a pixel at which the gray value changes drastically, or a point of large curvature on an image edge.
  • the feature points can be SURF feature points.
  • The warp matrix may also be called a transformation matrix; it is a matrix that can transform the image (for example, rotate or scale it) to deform it as needed.
  • The pixel-wise difference diff between R and the registered image M' is then computed. Where diff exceeds a set threshold (such as 10), the location is treated as a mismatched location; these mismatched locations form a mask. The remaining (matched) locations of R and M' are then averaged.
  • That is, difference detection is performed before fusion: areas with small differences are averaged to reduce the noise, while areas with large differences (the mismatched areas) are not averaged, so the noise there remains large and must be filtered by the next noise reduction step; the mask obtained in step (2) can be used to guide the strength of the subsequent spatial-domain noise reduction.
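  • The following sketch illustrates the registration, difference mask, and temporal averaging for one frame pair. ORB features stand in here for the SURF features mentioned above (SURF is patent-encumbered and not present in all OpenCV builds), and the threshold 10 follows the example in the text.

```python
# Sketch: register one frame to the reference, mark mismatched (ghost) pixels,
# and temporally average the matched pixels.
import cv2
import numpy as np

def register_and_average(ref, frame, diff_threshold=10):
    orb = cv2.ORB_create()
    kp_r, des_r = orb.detectAndCompute(ref, None)
    kp_m, des_m = orb.detectAndCompute(frame, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_r, des_m)
    src = np.float32([kp_m[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp_r[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    warp, _ = cv2.findHomography(src, dst, cv2.RANSAC)            # "warp matrix"
    aligned = cv2.warpPerspective(frame, warp, (ref.shape[1], ref.shape[0]))
    diff = cv2.absdiff(ref, aligned)
    mask = diff > diff_threshold                                  # mismatched locations
    avg = ((ref.astype(np.float32) + aligned.astype(np.float32)) / 2).astype(np.uint8)
    avg[mask] = ref[mask]                                         # keep reference where mismatched
    return avg, mask
```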
  • The time-domain denoised image may then be spatially denoised using a commonly used spatial-domain denoising method such as Non-Local Means (NLM), which is not described again here.
  • the noise reduction intensity can be set as needed, and is not limited.
  • Step (3) may be performed to remove ghosts; alternatively, step (3) may be omitted and only steps (1) to (2) performed to achieve the multi-frame noise reduction processing.
  • Here, the case where a motion region exists in the preview image is taken as an example for description.
  • Next, the correction matrix C can be obtained, and the image I after spatial-domain noise reduction is corrected according to the correction matrix C (i.e., Lens Shading Correction (LSC) processing) to obtain the corrected image I'.
  • Image I and image I' are subjected to exposure fusion to obtain a highlight recovery image O.
  • Each element of C corresponds to a correction coefficient r (r > 1; the farther from the center, the larger the value of r). Correcting the spatially denoised image I according to the correction matrix C means multiplying each pixel I_ij by the correction coefficient of the corresponding element C_ij, which yields the corrected image I'.
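  • A sketch of such a radial correction follows; the shape of the gain profile and the maximum gain are assumptions, since the application does not specify how C is generated.

```python
# Sketch: per-pixel lens shading correction with a radial gain matrix C.
import numpy as np

def lens_shading_correct(img, max_gain=2.0):
    h, w = img.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    cy, cx = (h - 1) / 2, (w - 1) / 2
    dist = np.sqrt((yy - cy) ** 2 + (xx - cx) ** 2)
    c = 1.0 + (max_gain - 1.0) * (dist / dist.max())   # r > 1, larger away from center
    if img.ndim == 3:
        c = c[..., None]
    return np.clip(img * c, 0, 255)
```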
  • W_ij is the fusion weight; its calculation can refer to the weight calculation described below for exposure fusion, except that when W_ij is calculated here its weight center is set to 210.
  • In this way, through highlight recovery, the non-overexposed details of I are preserved, while the details of the overexposed areas are retained from I'.
  • Because the dark-area detail in the image after spatial-domain noise reduction may be insufficient, a brightening method such as a gamma curve or the Retinex method can be used.
  • The brightening method brightens the image after spatial-domain noise reduction, and the brightened image is combined with the highlight recovery image to obtain an image rich in both dark and bright detail.
  • the processing is as shown in FIG. 5c, including:
  • R, G, and B refer to the red, green, and blue channels of the image, respectively.
  • The grayscale image Gray is separated into a base layer (Base) and a detail layer D; for example, the separation can be performed by Gaussian filtering.
  • Dynamic range compression (or brightness enhancement) is performed on the Base layer to obtain an enhanced grayscale image.
  • For example, the enhancement can be performed by the Retinex method.
  • r(x, y) = log S(x, y) − log(F(x, y) * S(x, y)) yields the enhanced grayscale result, where S is the image after spatial-domain noise reduction, r is the enhanced result, and F is the center-surround function, generally defined as a Gaussian operator, such as F(x, y) = λ·exp(−(x² + y²)/c²).
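  • As a sketch, a single-scale Retinex step of this form can be written as follows; the Gaussian sigma and the final range stretch back to [0, 255] are assumptions.

```python
# Sketch: single-scale Retinex, r = log S - log(F * S), with the surround F * S
# approximated by a Gaussian blur of S.
import cv2
import numpy as np

def single_scale_retinex(gray, sigma=80):
    s = gray.astype(np.float64) + 1.0                 # avoid log(0)
    surround = cv2.GaussianBlur(s, (0, 0), sigma)     # F(x, y) * S(x, y)
    r = np.log(s) - np.log(surround)
    r = (r - r.min()) / (r.max() - r.min() + 1e-12) * 255.0   # stretch for display
    return r.astype(np.uint8)
```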
  • The brightening gain (Gain) is then determined according to the average brightness of the enhanced grayscale image and a preset brightening amplitude.
  • The preset brightening amplitude can be set as needed and is not limited here.
  • The enhanced grayscale image is brightened according to the Gain value to obtain a brightened image.
  • For example, each pixel in the enhanced grayscale image may be multiplied by the Gain value, or a gamma curve selected according to the Gain value may be applied as a point-by-point lookup table (the gamma curve is preset empirically); brightening each pixel in this way yields the brightened image G_bright.
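  • The following sketch shows both brightening options; the target mean, gain cap, and gamma value are placeholders, since the application leaves the brightening amplitude open.

```python
# Sketch: brighten by a Gain value, or by a preset gamma lookup table.
import numpy as np

def brighten_by_gain(gray, target_mean=128.0, max_gain=4.0):
    """gray: uint8 enhanced grayscale image; returns G_bright and the Gain used."""
    gain = min(max_gain, target_mean / max(float(gray.mean()), 1.0))
    g_bright = np.clip(gray.astype(np.float32) * gain, 0, 255).astype(np.uint8)
    return g_bright, gain

def gamma_lut(gamma=2.2):
    """Point-by-point lookup table for a preset gamma curve (gamma > 1 brightens)."""
    x = np.arange(256) / 255.0
    return np.clip((x ** (1.0 / gamma)) * 255.0, 0, 255).astype(np.uint8)

# usage: g_bright = gamma_lut(2.2)[gray]   # per-pixel table lookup
```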
  • the Base layer, the brightened image G_bright, and the highlight restored image O are subjected to exposure fusion to obtain a local brightness enhanced image B_enhance.
  • The detail layer is then used to enhance the detail of the local brightness enhanced image B_enhance (i.e., image addition) to obtain the final enhanced grayscale image G_enhance.
  • image addition may refer to adding the gray values of the pixels at the same position of the two images.
  • Finally, the gain coefficient of each pixel is calculated, and the corresponding gain coefficient is multiplied onto the four channels of each RGGB block of the noise-reduced image to obtain the final enhanced image I_enhance.
  • the exposure fusion technique in this step can be referred to the following.
  • the core of the detail overlay is to represent the original image as the sum of the base layer and the detail layer.
  • The detail component is enhanced separately, for example by multiplying the detail component by a coefficient, to obtain the enhanced image.
  • the key is to acquire the basic components. Specifically, as shown in Figure 5e:
  • The back-off coefficient β is an empirical value, generally greater than 1: the lower the sharpness of the image, the higher β is set; the higher the sharpness of the image, the lower β is set.
  • each operation in FIG. 5e is a point-by-point operation for each pixel of the image.
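  • A sketch of this base/detail decomposition and detail boost is given below; the Gaussian sigma and the example β value are assumptions.

```python
# Sketch: detail overlay — split into base + detail, boost the detail by beta.
import cv2
import numpy as np

def detail_overlay(gray, beta=1.5, sigma=5):
    base = cv2.GaussianBlur(gray.astype(np.float32), (0, 0), sigma)  # base layer
    detail = gray.astype(np.float32) - base                          # detail layer D
    enhanced = base + beta * detail                                   # boosted details
    return np.clip(enhanced, 0, 255).astype(np.uint8)
```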
  • When the feature area contains the subject that the user wishes to capture, the feature area needs to be brightened when imaging; that is, the short exposure amount needs to be increased (the exposure time lengthened) so that the exposure is higher than that of the preview image and the object in the feature area appears brighter.
  • For example, the average brightness value M_face of the face area can be calculated.
  • The average brightness value M_face of the face area may be the ratio of the sum of the gray values of all pixels of the face area to the number of pixels in the face area; generally, the lower the average brightness value of the face area, the larger R_over is (R_over is generally less than 1).
  • The final R_over is determined by comparing the calculated R_over with the preset minimum value R_min: if the calculated R_over is smaller than R_min, it is set to R_min and can go no lower; otherwise, the calculated R_over is used.
  • R_min can be set according to the average brightness value M_face of the face area, as shown below, where x represents the average brightness of the face:
  • The left image in FIG. 6b is the preview image, and the right image is the image obtained after brightening with the exposure value determined for the human face; the face is obviously much brighter.
  • It should be noted that when the ISP includes a spatial-domain noise reduction module, a lens shading correction (LSC) module, and a DRC module for adjusting brightness, these modules need to be turned off when the ISP performs the YUV format conversion on the RAW image, to avoid the ISP processing the image repeatedly.
  • In addition, a long exposure amount can be determined and the camera controlled to capture one long exposure image according to the long exposure amount; the M frames of short exposure images are then blended with the long exposure image to improve dark-area detail. For example, FIG. 7 shows the flowchart of the multi-frame short exposure image + long exposure image scheme.
  • the average value of the brightness of the over dark area of the preview image may be calculated, and the long exposure amount may be determined according to the average value of the brightness of the over dark area, the exposure value of the preview image, and the target brightness value.
  • the brightness average value of the over dark region may refer to a ratio of a sum of gray values of all pixels in the over dark region to a number of all pixels in the over dark region, and an average brightness value of the over dark region is smaller than the target luminance value.
  • the ratio of the target brightness value to the brightness average value of the over dark area can be calculated, and the product of the exposure value of the preview image and the ratio is used as the long exposure amount.
  • For example, if the average brightness value of the over-dark area is M_under and the target brightness value is 210, the ratio R_under of the target brightness value to the average value M_under is used as the compensation ratio for brightening the underexposed area.
  • In FIG. 7a, the left picture is the result of fusing only the multi-frame short exposures, and the right picture is the result of fusing the multi-frame short exposure images with one frame of long exposure image; after the long exposure frame is added, the details of the over-dark areas are improved.
  • The multi-frame time domain noise reduction and spatial domain noise reduction of the long exposure image may refer to the multi-frame time domain noise reduction and spatial domain noise reduction of the short exposure images in FIG. 5 and are not repeated here.
  • The exposure fusion mentioned above, and the exposure fusion in the scheme shown in FIG. 7 between the spatially denoised long exposure image and the locally brightness-adjusted image, can both refer to the following.
  • The Gaussian weight formula (pixel values are normalized by the grayscale maximum 255, so a value of 128 normalizes to 0.5) is W_ij^k = exp(−(I_ij^k − 0.5)² / (2σ²)), and the images of different exposures are weighted and summed according to these weights, for example O_ij = Σ_k W_ij^k · I_ij^k / Σ_k W_ij^k, where ij represents the pixel position and k indexes the images of different brightness.
  • In FIG. 7c, the upper-left three columns are the short-exposure, medium-exposure, and long-exposure input images, and the lower-left three columns are the corresponding short-exposure, medium-exposure, and long-exposure weight maps; the large image is the result of exposure-fusing the differently exposed images on the left according to their respective weight maps. As can be seen from FIG. 7c, after the images are fused with their weight maps, both the highlights and the shadows of the image are well represented.
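  • A per-pixel sketch of this weighted fusion follows; the sigma value 0.2 is an assumption, and the pyramid-based blending often used in practice is omitted for brevity.

```python
# Sketch: Gaussian "well-exposedness" weights centered at 0.5, normalized
# weighted sum over the differently exposed frames.
import numpy as np

def exposure_fuse(frames, sigma=0.2):
    """frames: list of single-channel uint8 images of different exposures."""
    stack = np.stack([f.astype(np.float64) / 255.0 for f in frames])  # k x H x W
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))        # W_ij^k
    weights /= weights.sum(axis=0) + 1e-12                            # normalize over k
    fused = (weights * stack).sum(axis=0)
    return (fused * 255.0).astype(np.uint8)
```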
  • In addition, when the long exposure image suffers from problems such as blur or ghosting, the long exposure image is discarded and only the multi-frame short exposure images are fused (i.e., the scheme shown in FIG. 5 is performed), so that a poor-quality long exposure image does not degrade the quality of the whole image.
  • Furthermore, if the preview image contains overexposed pixels, the multi-frame short exposure fusion scheme shown in FIG. 4 or FIG. 5 may be selected to improve the highlight details of the image; conversely, if the preview image has no overexposed pixels but is underexposed overall (such as a dark night scene), a long exposure amount can be determined and the long exposure image fusion scheme described below performed.
  • In other words, when the preview image is overexposed, the multi-frame short exposure fusion shooting scheme shown in FIG. 4 or FIG. 5 is adopted; when the preview image is too dark, the multi-frame long exposure fusion shooting scheme is adopted; and when the preview image contains both bright and dark areas, the multi-frame short exposure + one frame long exposure fusion shooting scheme shown in FIG. 7 is adopted.
  • The multi-frame long exposure fusion shooting scheme is as follows: after the user opens the camera application, the camera preview function is activated, the shooting parameters (the long exposure amount and the number of shooting frames N) are determined according to the preview image, the determined shooting parameters are sent to the camera through the image signal processor (ISP) configured in the shooting device, and the camera is controlled to capture N frames of long exposure images; subsequently, the N long exposure frames are processed in the RAW domain (e.g., multi-frame noise reduction, local brightness enhancement, etc.) to obtain a RAW image with high dynamic range and low noise, the RAW image is sent to the ISP and processed into a YUV image, and finally Joint Photographic Experts Group (JPEG) encoding is performed on the YUV image to obtain the target image.
  • The process of determining the shooting parameters (the long exposure amount and the number of shooting frames N) from the preview image is similar to S503 above.
  • In this case, the brightness average value M_over of the pixels whose brightness lies in the range [210, 225] may be less than the preset target value 210.
  • A new target value T_new can then be set, and the ratio R_over of the target value T_new to the brightness average value M_over is used as the rising ratio for the brightness compensation to be applied to the highlight region; R_over may be greater than 1, that is, the exposure time to be set will be longer than that of the preview image. Then, based on the exposure value E_1 of the preview image, multiplying by R_over gives the long exposure amount E_target: E_target = E_1 × R_over.
  • the electronic device includes corresponding hardware structures and/or software modules for performing various functions.
  • In conjunction with the algorithm steps of the examples described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is implemented in hardware or in computer software driving hardware depends on the specific application and the design constraints of the solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementations should not be considered beyond the scope of the present application.
  • the embodiments of the present application may divide the functional modules of the electronic device and the server according to the foregoing method.
  • each functional module may be divided according to each function, or two or more functions may be integrated into one processing module.
  • the above integrated modules can be implemented in the form of hardware or in the form of software functional modules. It should be noted that the division of the module in the embodiment of the present application is schematic, and is only a logical function division, and the actual implementation may have another division manner.
  • FIG. 8 is a schematic diagram of a possible composition of the electronic device involved in the foregoing embodiments.
  • As shown in FIG. 8, the electronic device 80 may include a calculation unit 801, a shooting control unit 802, and an image processing unit 803.
  • The calculation unit 801 is configured to calculate the short exposure amount and the number of shooting frames M according to the preview image, to support the electronic device in executing S503.
  • The shooting control unit 802 is configured to send the short exposure amount and the number of shooting frames M to the camera through the ISP, and to control the camera to acquire M frames of short exposure images according to the short exposure amount and the number of shooting frames M, to support the electronic device in executing S504.
  • The image processing unit 803 is configured to perform multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, send the RAW image to the ISP for processing to obtain a YUV image, and compression-encode the YUV image to obtain the target image, to support the electronic device in executing S505 and S506.
  • the electronic device provided by the embodiment of the present application is configured to perform the above-described photographing method, and thus the same effect as the above-described photographing method can be achieved.
  • It should be noted that the above calculation unit 801, shooting control unit 802, and image processing unit 803 can be integrated into a processing module, where the processing module is used to control and manage the actions of the electronic device, for example to support the electronic device in performing steps S501 to S506 in FIG. 5 and/or other processes of the techniques described herein.
  • the above electronic device may further include a display module and a storage module.
  • the storage module is used to store program code of the electronic device, recorded video, and various parameter information of the video.
  • The processing module may be a processor or a controller, such as a CPU, a graphics processing unit (GPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof, and it can implement or execute the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure.
  • The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor.
  • the display module can be a display, and can be used to display information input by the user, information provided to the user, and various menus of the terminal.
  • the display can be configured in the form of a liquid crystal display, an organic light emitting diode, or the like.
  • a touch panel can be integrated on the display for collecting touch events on or near the display, and transmitting the collected touch information to other devices (such as a processor, etc.).
  • the memory module may be a memory, which may include a high speed RAM, and may also include a nonvolatile memory such as a magnetic disk storage device, a flash memory device, or other volatile solid state storage device.
  • the electronic device can also include a communication module that can be used to support communication of the electronic device with other network entities, such as communication with a server.
  • the communication module may specifically be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, and a WIFI chip.
  • When the processing module is a processor, the display module is a touch screen, and the storage module is a memory, the electronic device in the embodiments of the present application may specifically be the mobile phone shown in FIG. 3.
  • The embodiments of the present application further provide a computer storage medium storing computer instructions.
  • When the computer instructions are run on the electronic device, the electronic device performs the foregoing method steps to implement the shooting method in the foregoing embodiments.
  • the embodiment of the present application further provides a computer program product, which, when run on a computer, causes the computer to perform the above related method steps to implement the photographing method in the above embodiments.
  • the embodiment of the present application further provides a device, which may be a chip, a component, or a module; the device may include a processor and a memory that are connected, where the memory is used to store computer-executable instructions, and when the device runs, the processor may execute the computer-executable instructions stored in the memory, so that the chip performs the photographing method in each of the above method embodiments.
  • the electronic device, the computer storage medium, the computer program product, and the chip provided by the embodiments of the present application are all used to perform the corresponding method provided above; therefore, for the beneficial effects that can be achieved, refer to the beneficial effects of the corresponding method provided above, which are not repeated here.
  • the disclosed apparatus and method may be implemented in other manners.
  • the device embodiments described above are merely illustrative.
  • the division of modules or units is only a logical function division.
  • in actual implementation there may be another division manner; for example, multiple units or components may be combined or integrated into another device, or some features may be ignored or not executed.
  • the mutual coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interface, device or unit, and may be in an electrical, mechanical or other form.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may be one physical unit or multiple physical units, that is, may be located in one place, or may be distributed to a plurality of different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of the embodiment.
  • each functional unit in each embodiment of the present application may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • An integrated unit can be stored in a readable storage medium if it is implemented as a software functional unit and sold or used as a standalone product.
  • the technical solutions of the embodiments of the present application, in essence, or the part contributing to the prior art, may be embodied in the form of a software product, and the software product is stored in a storage medium.
  • the software product includes a number of instructions to cause a device (which may be a microcontroller, a chip, or the like) or a processor to perform all or part of the steps of the methods in the embodiments of the present application.
  • the foregoing storage medium includes various media that can store program codes, such as a USB flash drive, a mobile hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

Embodiments of the present application disclose an image capture method, device, and system, and relate to the technical field of communications, to improve the quality of captured images. The method is applied to an electronic device comprising a camera and an ISP, and comprises: calculating a short exposure value and a captured frame number M according to a preview image, sending the short exposure value and the captured frame number M to the camera via the ISP, and controlling, according to the short exposure value and the number of captured frames M, the camera to acquire M frames of short exposure images; performing multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain a RAW image; sending the RAW image to the ISP for processing to obtain a YUV image; and performing compression encoding on the YUV image to obtain a target image. The image capture method provided by the present application is used for capturing high dynamic range images.

Description

Shooting method and device

Technical field

The embodiments of the present application relate to the field of communications technologies, and in particular, to a shooting method and device.

Background

With the continuous development of science and technology, electronic devices such as smart phones and personal digital assistants (PDAs) are becoming more and more powerful, and more and more users like to capture images with electronic devices such as smart phones. However, limited by the hardware of the electronic device, an image acquired when shooting a high dynamic scene usually captures only part of the information of the high dynamic range scene, or suffers from overexposure in bright areas, or from insufficient brightness in dark areas, so the overall shooting effect is poor.

To improve the shooting effect, an existing method for shooting a high dynamic image, as shown in FIG. 1, uses a long exposure image, a short exposure image, and a normal exposure image as a reference and synthesizes one image through an exposure fusion algorithm, so that dark areas in the image are brightened by the long exposure, bright areas are recovered by the short exposure, and the whole image has appropriate brightness. However, this shooting method has the following problems: feature points are inconsistent between images of different brightness, which makes registration difficult and leaves residual motion between images; the long exposure image may be blurred because of the long exposure time; and the overall brightening of the long exposure image reduces the contrast of the image. As a result, the quality of the captured image is poor.
Summary of the invention

The embodiments of the present application provide a shooting method, a device, and a system, to solve the problem that the image quality of a captured image is poor when a high dynamic image is shot.

To achieve the above objective, the embodiments of the present application adopt the following technical solutions:

A first aspect of the embodiments of the present application provides a shooting method, applied to an electronic device including a camera and an image signal processor (ISP), including: the electronic device calculates a short exposure amount and a number of shooting frames M according to a preview image of a high dynamic range image, delivers the short exposure amount and the number of shooting frames M to the camera through the ISP, controls the camera to acquire M frames of short exposure images according to the short exposure amount and the number of shooting frames M, performs multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, sends the RAW image to the ISP for processing to obtain a YUV image, and performs compression encoding on the YUV image to obtain a target image. That is, in the technical solution provided in this embodiment, shooting parameters such as the short exposure amount and the number of shooting frames M are calculated based on the currently captured preview image, multiple frames of short exposure images are shot with the calculated shooting parameters, noise reduction and local brightness adjustment are performed on the multiple frames of short exposure images to obtain a RAW image, and the ISP processes the RAW image to obtain the target image. In this way, the exposure can be controlled for every shooting scene; through the noise reduction processing, the captured image has better noise performance; through the local brightness adjustment, the highlight details of the image are preserved; in addition, the RAW image is fed back to the ISP for processing, and because ISP processing is relatively fast, the shooting efficiency is improved.
In a possible design, with reference to the first aspect, calculating the short exposure amount based on the preview image includes: calculating a brightness average value of a highlight region composed of overexposed pixels in the preview image; and determining the short exposure amount according to the brightness average value of the highlight region, the exposure value of the preview image, and a target brightness value that the user expects to reach. In this way, the short exposure amount can be determined according to the image brightness desired by the user, so that the brightness of the captured image meets the user's requirement.

It should be noted that, as long as overexposed pixels exist in the preview image, the short exposure amount can be calculated according to the preview image and the shooting scheme of multiple frames of short exposure images described in the first aspect can be executed.

In another possible design, with reference to the first aspect, in order to highlight the brightness of the subject that the user expects to shoot (such as a face, a flower, or a feature region where a scene is located), the short exposure amount calculated according to the preview image can be lengthened, that is, the exposure time is increased, so that the captured image is brighter. Specifically, if the preview image includes a feature region where the subject that the user expects to shoot is located, calculating the short exposure amount according to the preview image includes: calculating a brightness average value of the feature region of the preview image; calculating a first brightness reduction ratio of the feature region according to the brightness average value of the feature region and the target brightness value; determining a second brightness reduction ratio of the feature region according to the calculated first brightness reduction ratio and a preset minimum reduction ratio for performing brightness compensation on the feature region; and determining the short exposure amount according to the exposure value of the preview image and the second brightness reduction ratio, where the second brightness reduction ratio is greater than or equal to the minimum reduction ratio. In this way, the brightness reduction ratio used to compensate the feature region is always greater than or equal to the preset minimum reduction ratio, which guarantees the exposure amount required to increase the brightness of the feature region.

In another possible design, with reference to the first aspect or any of the above possible designs, to avoid the problem that the multi-frame short exposure shooting scheme provided in the embodiments of the present application cannot represent the details of dark areas in the image well when the preview image is severely underexposed, when the proportion of underexposed pixels in the preview image is greater than a preset threshold, before performing multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, the method further includes: calculating a brightness average value of an over dark region of the preview image; determining a long exposure amount according to the brightness average value of the over dark region, the exposure value of the preview image, and the target brightness value; controlling the camera to acquire one frame of long exposure image; performing exposure fusion on the long exposure image and the image obtained by multi-frame noise reduction processing and local brightness adjustment of the M frames of short exposure images, to obtain one frame of RAW image; and inputting the RAW image into the ISP for processing. In this way, a processing scheme of multiple frames of short exposure images plus one frame of long exposure image can be used to improve the noise details of the dark areas in the image.

It should be noted that, in practical applications, the number of frames of the long exposure image can be set dynamically (not only to one frame but also to multiple frames) in combination with the degree of underexposure of the preview image, and a processing scheme of multiple frames of multi-exposure images plus multiple frames of long exposure images can be used to improve the noise details of the dark areas in the image.

In still another possible design, with reference to the first aspect or any of the above possible designs, the multi-frame noise reduction processing includes: multi-frame temporal noise reduction processing, or multi-frame temporal noise reduction processing and spatial noise reduction processing.
A second aspect of the embodiments of the present application provides a shooting method, applied to an electronic device including a camera and an ISP, including: the electronic device calculates a long exposure amount and a number of shooting frames N according to a preview image of a high dynamic range image, delivers the long exposure amount and the number of shooting frames N to the camera through the ISP, controls the camera to acquire N frames of long exposure images according to the long exposure amount and the number of shooting frames N, performs multi-frame noise reduction processing and local brightness adjustment on the N frames of long exposure images to obtain one frame of RAW image, sends the RAW image to the ISP for processing to obtain a YUV image, and performs compression encoding on the YUV image to obtain a target image. This scheme is applicable to the case where the preview image is severely underexposed. In this way, based on the technical solution provided in this embodiment, not only can the details of the dark areas of the image be improved, but the obtained RAW image is also fed back to the ISP for processing, which is relatively fast and improves the shooting efficiency.

A third aspect of the embodiments of the present application provides an electronic device including a camera and an ISP, the electronic device further including: a calculating unit, a shooting control unit, and an image processing unit; where the calculating unit is configured to calculate a short exposure amount and a number of shooting frames M according to a preview image; the shooting control unit is configured to deliver the short exposure amount and the number of shooting frames M to the camera through the ISP, and control the camera to acquire M frames of short exposure images according to the short exposure amount and the number of shooting frames M; and the image processing unit is configured to perform multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, send the RAW image to the ISP for processing to obtain a YUV image, and perform compression encoding on the YUV image to obtain a target image.

For a specific implementation of the electronic device, reference may be made to the behavior and functions of the electronic device in the shooting method provided by the first aspect or any possible design of the above aspect, and details are not repeated here. Therefore, the electronic device provided by this aspect can achieve the same beneficial effects as the first aspect or any possible design of the first aspect.

A fourth aspect of the embodiments of the present application provides an electronic device including one or more processors and one or more memories. The one or more memories are coupled to the one or more processors and are configured to store computer program code, where the computer program code includes computer instructions that, when executed by the one or more processors, cause the electronic device to perform the shooting method in any possible design of any of the above aspects.

A fifth aspect of the embodiments of the present application provides a computer storage medium including computer instructions that, when run on an electronic device, cause the electronic device to perform the shooting method in any possible design of any of the above aspects.

A sixth aspect of the embodiments of the present application provides a computer program product that, when run on a computer, causes the computer to perform the shooting method in any possible design of any of the above aspects.

These and other aspects of the present application will be more readily apparent from the description of the following embodiments.
Brief description of drawings

FIG. 1 is a schematic flowchart of an existing method for shooting a high dynamic image;

FIG. 2 is a schematic block diagram of the principle of an embodiment of the present application;

FIG. 3 is a structural diagram of a mobile phone according to an embodiment of the present application;

FIG. 4 is a flowchart of a shooting method according to an embodiment of the present application;

FIG. 5 is a flowchart of a shooting method according to an embodiment of the present application;

FIG. 5a is a gray histogram of a preview image according to an embodiment of the present application;

FIG. 5b is a schematic diagram of highlight recovery according to an embodiment of the present application;

FIG. 5c is a schematic diagram of local enhancement of an image according to an embodiment of the present application;

FIG. 5d is a schematic diagram of image brightening according to an embodiment of the present application;

FIG. 5e is a schematic diagram of detail overlay according to an embodiment of the present application;

FIG. 6a is a schematic diagram of a shooting effect according to an embodiment of the present application;

FIG. 6b is a schematic diagram of an effect when shooting a person according to an embodiment of the present application;

FIG. 7 is a flowchart of a method for shooting multiple frames of short exposure images plus a long exposure image according to an embodiment of the present application;

FIG. 7a is a schematic diagram of the shooting effect of multiple frames of short exposure images plus a long exposure image according to an embodiment of the present application;

FIG. 7b is a schematic diagram of a weight curve according to an embodiment of the present application;

FIG. 7c is a schematic diagram of exposure fusion according to an embodiment of the present application;

FIG. 8 is a structural diagram of an electronic device according to an embodiment of the present application.
Detailed description of embodiments

The principle block diagram of the embodiments of the present application is shown in FIG. 2. After the user opens the camera application, the preview function of the camera is started, the shooting parameters (the short exposure amount and the number of shooting frames M) are determined according to the preview image, and the determined shooting parameters are transmitted to the camera through the ISP provided inside the shooting device, which controls the camera to acquire M (M is an integer greater than or equal to 2) short exposure frames. Subsequently, the M short exposure frames are processed in the RAW domain (for example, multi-frame temporal noise reduction, single-frame spatial noise reduction, and local brightness enhancement) to obtain one RAW image with a high dynamic range and low noise, and the RAW image is sent to the ISP, which processes it into a YUV image. Finally, Joint Photographic Experts Group (JPEG) encoding is performed on the YUV image to obtain the target image. In this way, a high-quality JPEG image can be obtained quickly, which improves the shooting effect, including noise and dynamic range, in a variety of application scenarios. For a specific implementation, refer to the scheme shown in FIG. 4 or FIG. 5 below.
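The flow above can be summarized in pseudocode. The following is only a minimal sketch: the helper callables (compute_short_exposure, compute_frame_count, multi_frame_denoise, local_brightness_adjust, jpeg_encode) and the camera/ISP objects are hypothetical placeholders introduced for illustration, not interfaces defined by this application.

```python
# Minimal sketch of the capture pipeline in FIG. 2; all names below are
# illustrative placeholders, not APIs of this application.

def capture_hdr_image(preview, camera, isp,
                      compute_short_exposure, compute_frame_count,
                      multi_frame_denoise, local_brightness_adjust, jpeg_encode):
    # 1. Derive the shooting parameters from the preview image (S503).
    short_exposure = compute_short_exposure(preview)
    m = compute_frame_count(preview)          # number of short exposure frames, M >= 2

    # 2. Deliver the parameters to the camera through the ISP and collect
    #    M short exposure RAW frames (S504).
    frames = camera.capture(exposure=short_exposure, count=m)

    # 3. RAW-domain processing: multi-frame noise reduction plus local
    #    brightness enhancement, yielding one low-noise RAW frame (S505).
    raw = local_brightness_adjust(multi_frame_denoise(frames))

    # 4. Feed the RAW frame back to the ISP to obtain a YUV image, then
    #    JPEG-encode it to get the target image (S506).
    yuv = isp.process(raw)
    return jpeg_encode(yuv)
```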
For ease of understanding, descriptions of some concepts related to the embodiments of the present application are given below for reference, as follows:

YUV image: an image obtained by the YUV encoding method. YUV is a color encoding method adopted by European television systems. Generally, a three-tube color camera is used to capture the image, the obtained color image signal is color-separated and separately amplified and corrected to obtain RGB, a luminance signal Y and two color difference signals R-Y (that is, U) and B-Y (that is, V) are obtained through a matrix conversion circuit, and finally the transmitting end encodes the luminance signal and the two color difference signals separately to obtain a YUV image.

Dynamic range: the ability of the camera to adapt to the light reflected by the scene being shot, specifically the range over which the brightness varies. Generally, an image with a large brightness variation range may be called a high dynamic range (HDR) image, and an image with a small brightness variation range may be called a low dynamic range (LDR) image.

Exposure amount: the time interval from when the camera shutter opens to when it closes. During this time, the object can leave an image on the film. The exposure time depends on the need: the longer the exposure time, the brighter the resulting photo, and conversely the darker it is. When the ambient light is relatively dark, a longer exposure time is generally required, while a short exposure time is suitable when the light is relatively good. Normally, the short exposure amount and the long exposure amount are divided by the normal exposure amount: an exposure time smaller than the normal exposure amount is called a short exposure amount, and an exposure time larger than the normal exposure amount is called a long exposure amount.

The normal exposure amount may be the exposure used when the camera previews the image. Generally, the mean of the Y values of the current image can be calculated in the YUV space, and the various exposure parameter settings can be adjusted (automatically or manually); when the mean falls near a target value, the exposure amount in the exposure parameters is considered the normal exposure amount.

The technical solutions in the embodiments of the present application are described below with reference to the accompanying drawings in the embodiments of the present application. In the description of the embodiments of the present application, unless otherwise stated, "/" means "or"; for example, A/B may represent A or B. "And/or" herein merely describes an association relationship between associated objects and indicates that three relationships may exist; for example, A and/or B may indicate the three cases where only A exists, both A and B exist, and only B exists. In addition, in the description of the embodiments of the present application, "a plurality of" means two or more than two.
The shooting method provided by the embodiments of the present application can be applied to an electronic device provided with a camera, where the electronic device can be a mobile phone, a tablet computer, a notebook computer, an ultra-mobile personal computer (UMPC), a netbook, a personal digital assistant (PDA), a video camera, a digital camera, a surveillance device, or the like. Specifically, the embodiments of the present application take the mobile phone 100 shown in FIG. 3 as an example of the electronic device to introduce the shooting method provided by the present application. It should be understood that the illustrated mobile phone 100 is only one example of an electronic device; the mobile phone 100 may have more or fewer components than those shown in the figure, may combine two or more components, or may have a different component configuration. The various components shown in the figure can be implemented in hardware, software, or a combination of hardware and software, including one or more signal processing and/or application specific integrated circuits.

As shown in FIG. 3, the mobile phone 100 may include components such as a processor 101, a memory 102, an ISP 103, a touch screen 104, and a camera 105, and these components may communicate through one or more communication buses or signal lines (not shown in FIG. 3). A person skilled in the art may understand that the hardware structure shown in FIG. 3 does not constitute a limitation on the mobile phone 100; the mobile phone 100 may include more or fewer components than those illustrated, or combine some components, or have a different component arrangement.

The components of the mobile phone 100 are specifically described below with reference to FIG. 3:
The processor 101 is the control center of the mobile phone 100; it connects the various parts of the mobile phone 100 using various interfaces and lines, and performs the various functions of the mobile phone 100 and processes data by running or executing applications (Apps) stored in the memory 102 and calling data and instructions stored in the memory 102. In some embodiments, the processor 101 may include one or more processing units; the processor 101 may also integrate an application processor (AP), a modem processor, and a digital signal processor (DSP), where the application processor mainly handles the operating system, the user interface, applications, and the like, the modem processor mainly handles wireless communication, and the DSP is mainly used to convert analog signals into digital signals and filter the noise of the digital signals. It can be understood that the above modem processor and image processor may also not be integrated into the processor 101. For example, the processor 101 may be a Kirin 960 chip manufactured by Huawei Technologies Co., Ltd.

The memory 102 is used to store applications and data, and the processor 101 performs the various functions and data processing of the mobile phone 100 by running the applications and data stored in the memory 102. The memory 102 mainly includes a program storage area and a data storage area, where the program storage area can store the operating system and the applications required by at least one function (such as a sound playing function and an image playing function), and the data storage area can store data created when the mobile phone 100 is used (such as audio data and a phone book). In addition, the memory 102 may include a high-speed random access memory, and may also include a non-volatile memory such as a magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. The memory 102 can store various operating systems, such as the iOS operating system developed by Apple Inc. and the Android operating system developed by Google Inc.

The ISP 103 is used to perform processing such as dead pixel correction, white balance, gamma correction, sharpness, and color interpolation on the image output by the DSP in the processor 101, and to output the image required by the user application. The ISP 103 is a determining factor in the performance of the imaging device. The ISP 103 can be integrated in the AP, or it can be a separate chip, which is not limited.

The touch screen 104, which may be called a touch display panel, is used to implement the input and output functions of the mobile phone 100. It can collect touch operations on or near it (for example, operations performed by the user on or near the touch screen 104 with a finger, a stylus, or any other suitable object or accessory, such as pressing the shooting button) and drive the corresponding connecting device according to a preset program, and it can also be used to display information input by the user or information provided to the user (such as images captured by the camera) and the various menus of the mobile phone. Optionally, the touch screen 104 may include a touch detection device and a touch controller, where the touch detection device detects the touch orientation of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into contact coordinates, sends them to the processor 101, and can receive commands sent by the processor 101 and execute them.

The camera 105 is a component with basic functions such as video capture/transmission and still image capture, and is mainly used for image acquisition. Specifically, the camera 105 may include a lens and an image sensor, and the image sensor may be a charge coupled device (CCD) image sensor, a complementary metal-oxide semiconductor (CMOS) image sensor, or any other type of image sensor.
In a possible design, when the mobile phone 100 performs the function of shooting a high dynamic image, the processor 101 obtains the short exposure amount and the number of shooting frames M, and sends the short exposure amount and M to the camera 105 through the ISP 103; after the user issues a shooting instruction, the processor 101 controls the camera 105 to acquire M frames of short exposure images according to the short exposure amount and M; then, the processor 101 performs RAW-domain processing operations such as multi-frame temporal noise reduction, spatial noise reduction, and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image; the processor 101 sends the RAW image to the ISP 103, the ISP 103 converts the RAW image into a YUV image, and the processor 101 converts the YUV image into a target image, such as a JPEG image. Specifically, for this possible design, refer to the scheme shown in FIG. 4 or FIG. 5.

In addition, in various embodiments of the present application, the mobile phone 100 may also include a light sensor 106. Specifically, the light sensor 106 may include an ambient light sensor and a proximity sensor, where the ambient light sensor can sense the brightness of the ambient light around the mobile phone 100 so that the mobile phone 100 adjusts the brightness of the display of the touch screen 104 according to the ambient light. The proximity sensor can sense how close the mobile phone 100 currently is to the human ear, and the mobile phone 100 can turn off the power of the display when it moves to the ear. In addition, the mobile phone 100 can also be configured with other sensors such as a gyroscope, a barometer, a hygrometer, a thermometer, and an infrared sensor, which are not described here.

The mobile phone 100 may also include a power supply device 107 (such as a battery and a power management chip) that supplies power to the various components. The battery may be logically connected to the processor 101 through the power management chip, so that functions such as charging, discharging, and power consumption management are implemented through the power supply device 107.

Although not shown in FIG. 3, the mobile phone 100 may also include a Bluetooth device, a positioning device, an audio circuit, a speaker, a microphone, a Wi-Fi device, a near field communication (NFC) device, and the like, which are not described here.

The following embodiments can all be implemented in an electronic device (for example, the mobile phone 100) having the above hardware.
FIG. 4 and FIG. 5 are flowcharts of a shooting method provided by an embodiment of the present application, where the method can be performed by the mobile phone 100 shown in FIG. 3 to shoot a high dynamic image. Taking FIG. 5 as an example, the method may include S501 to S506.

S501: After detecting power-on, the processor of the mobile phone automatically enables, in the mobile phone, the function provided by the present application for shooting high dynamic images; or, after receiving an operation by which the user enables the shooting function provided by the present application, the processor of the mobile phone enables, in the mobile phone, the function provided by the present application for shooting high dynamic images.

For example, the shooting method provided by the embodiments of the present application can be enabled during night surveillance to implement monitoring.

S502: The processor of the mobile phone receives a request sent by the user to open the camera application, opens the camera application, and controls the camera to start the image preview function to obtain a preview image.

The preview image may refer to the image displayed on the display screen of the mobile phone before the image that the user wants to shoot is formed. For example, when the processor of the mobile phone detects that the user requests to start the camera application by tapping a desktop icon or sliding the camera shortcut icon on the lock screen, the processor of the mobile phone controls the camera to capture and focus the image to obtain the preview image; further, the captured preview image can also be displayed on the display screen of the mobile phone for the user to preview.

The preview image may include a plurality of pixels, and each pixel corresponds to a gray value. The gray value can be used to represent the brightness of the pixel, and its value range may be 0 to 255: the larger the gray value of a pixel, the brighter the pixel, and the smaller the gray value, the darker the pixel.

S503: If the processor of the mobile phone determines that the preview image is a high dynamic image, the processor of the mobile phone calculates a short exposure amount and a number of shooting frames M according to the preview image, where M is an integer greater than or equal to 2.
Specifically, a gray histogram of the preview image may be obtained, and the proportion of overexposed pixels and the proportion of underexposed pixels in the preview image are calculated according to the gray histogram of the preview image. If the proportion of overexposed pixels is greater than a first preset value and/or the proportion of underexposed pixels is greater than a second preset value, the preview image is determined to be a high dynamic range image.

Here, "the proportion of overexposed pixels is greater than the first preset value and/or the proportion of underexposed pixels is greater than the second preset value" may mean: the proportion of overexposed pixels is greater than the first preset value, or the proportion of underexposed pixels is greater than the second preset value, or the proportion of overexposed pixels is greater than the first preset value and the proportion of underexposed pixels is greater than the second preset value.

The gray histogram is a statistic of the distribution of the brightness levels of the pixels in the preview image. It counts, for all pixels in the preview image, the probability of occurrence of each gray value, with the horizontal axis representing the brightness of the preview image and the vertical axis representing the relative number of pixels of the preview image within that brightness range. Usually, along the horizontal axis from left to right, the gray value goes from small to large.

An overexposed pixel may refer to a pixel whose gray value is greater than an overexposure threshold; an underexposed pixel may refer to a pixel whose gray value is smaller than an underexposure threshold. The overexposure threshold and the underexposure threshold can be set as needed and are not limited. Usually, for the gray value range [0, 255], the overexposure threshold can be defined as Tover = 235 and the underexposure threshold as Tunder = 35, that is, pixels with gray values of 235 to 255 are overexposed pixels and pixels with gray values of 0 to 35 are underexposed pixels.

The proportion of overexposed pixels may refer to the proportion of pixels in the gray histogram between [overexposure threshold Tover, 255] among all pixels included in the preview image, denoted as Ratio_over; the proportion of underexposed pixels may refer to the proportion of pixels between [0, underexposure threshold Tunder] among all pixels included in the preview image, denoted as Ratio_under. Usually, to improve the shooting effect, if Ratio_over exceeds the first preset value (for example, set to 5%), it indicates that the preview image is overexposed and the overall brightness contrast of the preview image is relatively large; likewise, if Ratio_under exceeds the second preset value (for example, 40%), it indicates that the preview image is underexposed and the overall brightness contrast of the preview image is relatively large. For example, FIG. 5a shows the gray histogram of a preview image provided by an embodiment of the present application, in which the proportion of pixels with gray values lower than 30 is about 0.6 + 0.18 = 0.78, significantly higher than the preset value (40%), so the image is a high dynamic image.
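As an illustration of the check just described, the sketch below computes Ratio_over and Ratio_under from an 8-bit grayscale preview and applies the example thresholds quoted in the text (Tover = 235, Tunder = 35, first preset value 5%, second preset value 40%); the function name and the use of NumPy are assumptions made for illustration.

```python
import numpy as np

def is_high_dynamic(preview, t_over=235, t_under=35,
                    ratio_over_th=0.05, ratio_under_th=0.40):
    # preview: 8-bit grayscale array; pixels in [t_over, 255] are overexposed,
    # pixels in [0, t_under] are underexposed.
    total = preview.size
    ratio_over = np.count_nonzero(preview >= t_over) / total    # Ratio_over
    ratio_under = np.count_nonzero(preview <= t_under) / total  # Ratio_under
    return ratio_over > ratio_over_th or ratio_under > ratio_under_th
```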
Specifically, the brightness average value of the highlight region of the preview image may be calculated, and the short exposure amount is determined according to the brightness average value of the highlight region, the exposure value of the preview image, and the target brightness value. The brightness average value of the highlight region may refer to the ratio of the sum of the gray values of all pixels in the highlight region to the number of all pixels in the highlight region, and the target brightness value may refer to the brightness that the user expects to reach. The highlight region may refer to the region composed of the overexposed pixels in the preview image, which may also be called the overexposed region of the preview image; the brightness average of the highlight region is greater than the target brightness average.

For example, the ratio of the target brightness value to the brightness average value of the highlight region may be calculated, and the product of the exposure value of the preview image and this ratio is used as the short exposure amount. For example, as shown in the following formulas, assuming that the brightness average value of the highlight region is M_over and the target brightness value is 210, the ratio R_over of the target brightness value to the average value M_over is used as the reduction ratio by which brightness compensation is to be performed on the highlight region; the short exposure amount E_target is then obtained by multiplying the exposure value E_1 of the preview image by R_over:
R_over = 210 / M_over

E_target = E_1 * R_over
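For illustration, a small numeric sketch of this calculation is given below, using the target brightness value 210 from the example; the function name and the sample numbers are assumptions.

```python
def short_exposure(e_preview, highlight_mean, target=210.0):
    # highlight_mean is M_over, the mean gray value of the highlight region;
    # e_preview is the exposure value E_1 of the preview image.
    r_over = target / highlight_mean   # brightness reduction ratio R_over
    return e_preview * r_over          # E_target = E_1 * R_over

# Example: a highlight mean of 245 and a preview exposure of 40 give
# E_target = 40 * 210 / 245, roughly 34.3.
```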
Specifically, the brightness average value of the over dark region in the preview image may be calculated, and the number of shooting frames M is determined according to the correspondence between the brightness average value of the over dark region and the number of shooting frames. The brightness average value of the over dark region may refer to the ratio of the sum of the gray values of all pixels in the over dark region to the number of all pixels in the over dark region, and the over dark region may refer to the region composed of the underexposed pixels in the preview image, which may also be called the underexposed region of the preview image. The correspondence may be a preset functional relationship.
For example, suppose the region composed of pixels with gray values in the interval [0, 64] is the over dark region. In this case, the brightness average value M_under of the pixels with gray values in [0, 64] can be calculated, and M is then set according to the following rule: optionally, when the average value M_under is less than or equal to the threshold 20, M is 6; when M_under is greater than the threshold 20 and less than or equal to 40, M is 4; and when M_under is greater than the threshold 40, M is 2.

M = 6 if M_under <= 20; M = 4 if 20 < M_under <= 40; M = 2 if M_under > 40
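The piecewise rule above can be written, purely for illustration, as the following sketch; the function name is hypothetical.

```python
def frame_count(m_under):
    # m_under: brightness average of the over dark region (gray values in [0, 64]).
    if m_under <= 20:
        return 6    # very dark: more frames for stronger temporal denoising
    elif m_under <= 40:
        return 4
    else:
        return 2    # relatively bright dark area: fewer frames suffice
```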
The number of shooting frames M may also be determined with reference to the shooting scene, where the shooting scene may include any of daytime shooting, night shooting, backlight shooting, night scene shooting, dark light shooting, point light source shooting, and the like. The brighter the current shooting scene, the smaller the set number of shooting frames M; conversely, the higher the noise level in the image, the larger M. Typically, M can be set to 2, 4, or 6 frames; for example, M is 6 frames for night scene shooting, 2 frames for daytime shooting, 4 frames for strong backlight shooting, and so on.
Specifically, the current shooting scene can be determined according to the ratio of the exposure time ET of the current preview image to the sensitivity ISO, together with the preview image. For example, when the ratio R of the exposure time ET to the ISO is less than a preset ratio (such as 0.9), it indicates that the current shooting scene is daytime shooting; otherwise, it is night scene shooting. If it is daytime shooting and the proportion of underexposed pixels in the preview image exceeds a certain threshold (a relatively large threshold), the current shooting scene may be a backlight scene. If it is night scene shooting and the proportion of overexposed pixels in the preview image exceeds a certain threshold T_1 (a relatively small threshold) and is smaller than a certain threshold T_2 (a relatively large threshold), the current shooting scene may be point light source shooting; if it is night scene shooting and the proportion of overexposed pixels in the preview image is smaller than T_1, it can be regarded as dark light shooting. The exposure time ET here is generally the reciprocal of the real time; for example, if the current exposure is 80 ms, then ET = 1000 ms / 80 ms ≈ 12.
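A rough sketch of this scene classification is given below. The text leaves the thresholds T_1, T_2, and the backlight threshold unspecified, so the concrete numbers here are assumptions, as are the function and parameter names.

```python
def classify_scene(et, iso, ratio_over, ratio_under,
                   r_day=0.9, t1=0.02, t2=0.05, backlight_th=0.4):
    # et is the reciprocal of the real exposure time (e.g. 80 ms -> about 12).
    if et / iso < r_day:                 # daytime shooting
        if ratio_under > backlight_th:   # severe underexposure in daytime
            return "backlight"
        return "daytime"
    # otherwise treated as a night scene
    if t1 < ratio_over < t2:
        return "point_light_source"
    if ratio_over < t1:
        return "dark_light"
    return "night"
```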
S504: The processor of the mobile phone delivers the short exposure amount and the number of shooting frames M to the camera through the ISP. After the processor of the mobile phone receives a shooting instruction from the user, the processor of the mobile phone controls the camera to shoot M frames of short exposure images according to the short exposure amount and the number of shooting frames M.

Specifically, at the moment when the user presses the camera shutter of the mobile phone, the processor of the mobile phone controls the camera to acquire M frames of short exposure images based on the short exposure amount and other parameters (sensitivity, aperture value, and the like).

S505: The processor of the mobile phone performs multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image.

Specifically, each process in this step is described below.

S506: The processor of the mobile phone transmits the RAW image to the ISP of the mobile phone, the ISP of the mobile phone performs format conversion on the RAW image to obtain a YUV image, and the processor of the mobile phone performs JPEG encoding on the YUV image processed by the ISP to obtain the target image.
YUV is an encoding format, where Y represents luminance and U and V are chrominance. The ISP performing format conversion on the RAW image to obtain a YUV image may include: the ISP first generates RGB three-channel data through demosaicing, and then performs RGB-to-YUV conversion according to the following formulas to obtain the YUV image: Y = 0.299R + 0.587G + 0.114B, U = -0.147R - 0.289G + 0.436B, V = 0.615R - 0.515G - 0.100B.
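For illustration, the conversion can be sketched as follows, assuming the demosaiced RGB data is available as a floating-point NumPy array with the channels in the last dimension; the function name is hypothetical.

```python
import numpy as np

def rgb_to_yuv(rgb):
    # Apply the BT.601-style coefficients quoted above, channel by channel.
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = -0.147 * r - 0.289 * g + 0.436 * b
    v = 0.615 * r - 0.515 * g - 0.100 * b
    return np.stack([y, u, v], axis=-1)
```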
JPEG encoding of the YUV image may include processing such as detail overlay and edge cropping. For the specific process, refer to the prior art; details are not repeated here.

Compared with the prior art, in the scheme shown in FIG. 5, when shooting a high dynamic image, the short exposure amount needed to compensate the highlight region and the number of shooting frames can be determined in real time according to the preview image; multiple frames of short exposure images are acquired based on the determined short exposure amount and number of shooting frames, and processing such as multi-frame noise reduction and local brightness adjustment is performed on the multiple short exposure frames in the RAW domain to obtain one RAW image; then, the obtained RAW image is fed back to the ISP for processing to obtain a YUV image, and the YUV image processed by the ISP is processed to obtain a JPEG image. First, in the technical solution provided by this embodiment, the short exposure amount and the frame number M are calculated based on the currently captured preview image, which ensures that the exposure can be controlled for every shooting scene. Second, performing noise reduction, local brightness adjustment, and other processing on the multiple short exposure frames in the RAW domain not only gives better noise performance but also preserves highlight details through the local brightness adjustment. In addition, in this scheme, the processed RAW image is fed back to the ISP for processing, and because ISP processing is relatively fast, the shooting efficiency is greatly improved.

For example, as shown in FIG. 6a, the left picture shows the preview image and the right picture shows the image shot after executing the scheme shown in FIG. 5. As can be seen from FIG. 6a, after the scheme shown in FIG. 5 is executed, the details of both the highlight region and the over dark region are improved.

Specifically, in the scheme shown in FIG. 5, S505 may include the processes shown in (1) to (6) below, where (1) to (3) are the multi-frame noise reduction process and (4) to (6) are the local brightness adjustment process.
(1)从M帧短曝光图像中选出一帧参考图像R。(1) A frame reference image R is selected from the M frame short exposure images.
示例性的,可以获取M帧短曝光图像中每帧图像的对比度,将对比度最高的一帧图像作为参考图像R,以便提升图像融合后的清晰度。Exemplarily, the contrast of each frame of the M frame short exposure image can be obtained, and the image with the highest contrast is used as the reference image R in order to improve the sharpness after image fusion.
The contrast of an image can be used to characterize its sharpness: the higher the contrast, the sharper the image. Typically, the average of the Laplacian gradient of the image can be calculated and used as the contrast of the image. The average Laplacian gradient can be computed with reference to the prior art and is not described further.
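As a rough sketch only (not necessarily the patent's exact measure), the reference frame could be chosen by the mean absolute Laplacian response; the helper below and its use of scipy.ndimage.laplace are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import laplace

def pick_reference(frames):
    """frames: list of HxW grayscale arrays; returns the index of the sharpest frame."""
    contrasts = [np.mean(np.abs(laplace(f.astype(np.float32)))) for f in frames]
    return int(np.argmax(contrasts))
```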
(2) Compute the pixel-value difference between the second frame image and the reference image R, average the matched pixels point by point to obtain a new reference image R1, and at the same time record the positions of the unmatched pixels and store them in a mask.
(3) Compute the pixel-value difference between the third frame image and R1, average the matched pixels point by point to obtain a new reference image R2, and at the same time record the positions of the unmatched pixels and store them in the mask.
(4) Repeat action (3) for the remaining frames until all frames have been fused, obtaining the temporally denoised image R and the mask.
Here, the second frame image may be any of the M frames of short exposure images other than the reference image R, and the third frame image may be any of the M frames whose pixel-value difference with the reference image has not yet been computed.
Exemplarily, let the reference image be R and the second frame image be M. First detect the feature points in R and the feature points in image M, and match the feature points of R and M; then calculate a warp matrix, transform M with the warp matrix to obtain the registered image M', and match the feature points of R and M'. A feature point may be a pixel whose gray value changes sharply in R or a point of large curvature on an image edge; typically, the feature points may be SURF feature points. The warp matrix may be called a transformation matrix, or any other matrix that can translate, rotate, scale or otherwise deform the image.
In the process of matching the feature points of R and M', if a moving object is present when the images are captured, the moving object is located at different positions in different frames. In this case, the moving regions need to be detected by computing pixel-wise differences between the images, forming a ghost mask. First, the brightness difference value diff is calculated pixel by pixel:
diff_xy = abs(R_xy - M'_xy)
If diff exceeds a set threshold (for example 10), the two points at the same position (x, y) in R and M' are considered not to match; their position is recorded to generate a template mask, in which the positions greater than 0 are the unmatched positions. The remaining regions of R and M' are then averaged:
O_xy = (R_xy + M'_xy) / 2
In this way, difference detection is performed before fusion: regions with small differences are averaged to reduce noise, while regions with large differences (unmatched regions) are not averaged. The noise in those regions is larger and needs to be filtered out in the next step, spatial-domain noise reduction; that is, the mask obtained in step (2) can be used to guide the strength of the subsequent spatial-domain noise reduction.
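A minimal sketch of the per-pixel ghost detection and averaging described above, assuming the two frames are already registered (the function name, array types and the default threshold of 10 are illustrative):

```python
import numpy as np

def fuse_pair(ref, reg, threshold=10):
    """Average matched pixels of two registered frames and mark mismatches in a ghost mask."""
    diff = np.abs(ref.astype(np.int32) - reg.astype(np.int32))
    ghost_mask = diff > threshold                         # True where the pixels do not match
    fused = ref.astype(np.float32).copy()
    fused[~ghost_mask] = (ref[~ghost_mask].astype(np.float32) +
                          reg[~ghost_mask].astype(np.float32)) / 2.0
    return fused, ghost_mask
```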
(3) Perform spatial-domain noise reduction on the single frame image to obtain a spatially denoised image.
Specifically, an existing spatial-domain noise reduction method such as non-local means (NLM) denoising can be used to perform spatial-domain noise reduction on the temporally denoised image; details are not repeated here.
For the ghost mask obtained in (2), since no temporal noise reduction was applied there, the residual noise is larger and the noise reduction strength needs to be increased for that region; the strength can be set as needed and is not limited.
It should be noted that when a motion region exists in the preview image, step (3) above can be performed to remove ghosts; when no motion region exists in the preview image, step (3) may be skipped and only steps (1) to (2) performed to implement the noise reduction processing. In this embodiment of the present application, the case where a motion region exists in the preview image is taken as an example for description.
(4) Perform highlight recovery processing on the spatially denoised image to obtain a highlight-recovered image.
Specifically, as shown in FIG. 5b, a correction matrix C can be obtained, and the spatially denoised image I is corrected according to the correction matrix C (that is, lens shading correction (LSC) processing) to obtain a corrected image I'; image I and image I' are then exposure-fused to obtain the highlight-recovered image O.
Each element of C corresponds to a correction coefficient r (r > 1; the farther from the center, the larger r). Correcting the spatially denoised image I according to the correction matrix C to obtain the corrected image I' may mean multiplying each pixel I_ij by the correction coefficient r at the corresponding position C_ij of C.
Exposure-fusing image I with image I' to obtain the highlight-recovered image O may include: calculating the gray value of each pixel of the highlight-recovered image O according to O_f = W_ij * I_ij + W_ij * I'_ij, where W_ij is the fusion weight. Its calculation can follow the process of calculating the fusion weight described below for exposure fusion, except that when computing the weight W_ij the weight center is set to 210.
In this way, when a point in image I has high brightness and the corresponding point in I' may be overexposed, the unoverexposed detail in I, or the detail of the overexposed region in I', can be preserved by means of highlight recovery.
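A sketch of this highlight recovery step, under the assumptions that C is a per-pixel LSC gain map of the same shape as I, that the fusion weight is a Gaussian centered at 210 as stated above, that the two weights are normalized per pixel before blending, and that sigma = 0.2 is an illustrative value:

```python
import numpy as np

def gauss_weight(img, center, sigma=0.2):
    """Gaussian well-exposedness weight; pixel values and the center are normalized by 255."""
    p = img.astype(np.float32) / 255.0
    return np.exp(-((p - center / 255.0) ** 2) / (2 * sigma ** 2))

def highlight_recovery(I, C, center=210.0):
    """Fuse the denoised image I with its lens-shading-corrected version I' = I * C."""
    I = I.astype(np.float32)
    I_corr = np.clip(I * C, 0, 255)                       # corrected image I'
    w, w_corr = gauss_weight(I, center), gauss_weight(I_corr, center)
    total = w + w_corr + 1e-6                             # per-pixel normalization
    return (w * I + w_corr * I_corr) / total
```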
(5) Perform local brightness enhancement processing on the spatially denoised image to obtain an enhanced image.
Because this embodiment of the application lowers the exposure during the exposure calculation in order to preserve detail in the overexposed parts, the dark-region detail in the spatially denoised image may be insufficient. In view of this, a brightening method such as gamma-curve brightening or the Retinex method can be used to brighten the spatially denoised image, and the brightened image is then exposure-fused with the highlight-recovered image to obtain an image rich in both dark and bright detail. Specifically, the processing is as shown in FIG. 5c and includes:
Convert the spatially denoised image into a grayscale image Gray according to the conversion formula Gray = R*0.299 + G*0.587 + B*0.114, where R\G\B denote the red/green/blue channels of the image, respectively.
Separate the detail of the grayscale image Gray to obtain a base layer (Base) and a detail layer D; for example, the grayscale image Gray can be separated into Base and the detail layer D by Gaussian filtering.
Perform dynamic range compression (also called brightness enhancement) on the Base layer to obtain an enhanced grayscale image. Specifically, the enhanced grayscale image can be obtained through enhancement with the Retinex method.
For example, the enhanced grayscale image is obtained according to the mathematical model r(x, y) = log S(x, y) - log(F(x, y) * S(x, y)), where S is the spatially denoised image, r is the enhanced result, and F is a center-surround function, generally defined as a Gaussian operator, for example of the form
F(x, y) = K * exp(-(x^2 + y^2) / c^2), where K is a normalization constant.
Determine the brightening gain (Gain) according to the average image brightness of the enhanced grayscale image and a preset brightening amplitude. The preset brightening amplitude can be set as needed and is not limited. Assuming the average image brightness is M_G and the preset brightening amplitude is 128, M_G can be compared with 128 to obtain the Gain value, for example Gain = 128 / M_G.
Brighten the enhanced grayscale image according to the Gain value to obtain a brightened image. Specifically, each pixel of the enhanced grayscale image may be multiplied by the Gain value, or a gamma curve may be derived directly from the Gain value and applied through a point-by-point lookup table (the gamma curve is preset empirically) to brighten every pixel of the image, giving the brightened image G_bright.
Exposure-fuse the Base layer, the brightened image G_bright and the highlight-recovered image O to obtain the locally brightness-enhanced image B_enhance.
Use the separated detail layer D to perform detail enhancement on the locally brightness-enhanced image B_enhance (that is, image addition) to obtain the final enhanced grayscale image Genhance. Image addition may mean adding the gray values of the pixels at the same position of the two images.
According to Gray and Genhance, calculate a gain coefficient for each pixel, and multiply the four channels of each RGGB group of the denoised image by the corresponding gain coefficient to obtain the final enhanced image Ienhance. For example, as shown in FIG. 5d, every four samples of the R\G\B channels (one RGGB group) correspond to a single Gray value; after the grayscale enhancement calculation, the gain coefficient Ratio = Genhance / Gray is obtained, and Ratio is then multiplied onto each of the R, G and B channels.
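Purely to illustrate the data flow of step (5), and not the patent's exact operators, the simplified sketch below uses Gaussian filtering for the base/detail split, a single global gain in place of the Retinex enhancement, a plain average in place of the exposure fusion, and then applies the Ratio map back to the color channels (all names and parameter values are assumptions):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def local_brightness_enhance(rgb, highlight_recovered_gray, target=128.0, sigma=15.0):
    """rgb: HxWx3 denoised image; highlight_recovered_gray: HxW image O; returns Ienhance."""
    gray = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    base = gaussian_filter(gray, sigma)                   # Base layer
    detail = gray - base                                  # detail layer D
    gain = target / max(base.mean(), 1.0)                 # brightening gain
    g_bright = np.clip(base * gain, 0, 255)               # brightened image G_bright
    b_enhance = (base + g_bright + highlight_recovered_gray) / 3.0   # stand-in for exposure fusion
    g_enhance = b_enhance + detail                        # detail added back: Genhance
    ratio = g_enhance / np.maximum(gray, 1.0)             # per-pixel gain Ratio = Genhance / Gray
    return np.clip(rgb * ratio[..., None], 0, 255)        # Ratio applied to each color channel
```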
The exposure fusion technique used in this step can be found below.
(6) Perform detail overlay on the enhanced image Ienhance.
The core of detail overlay is to express the original image as the sum of a base component (base layer) and a detail component (detail layer); on that basis, the detail component is enhanced separately and the enhanced image is obtained, for example by multiplying the detail component by a coefficient and adding it back onto the original image. The key lies in obtaining the base component. Specifically, as shown in FIG. 5e:
Gaussian-filter the Ienhance obtained in (5) above to obtain a low-frequency image I;
obtain the detail image according to the formula Idetail = Ienhance - I;
set the overlay coefficient α and obtain the detail-overlaid image Ienhance + Idetail * α.
The overlay coefficient α is an empirical value, generally greater than 1; normally, the lower the sharpness of the image, the higher the overlay coefficient α, and the higher the sharpness, the lower α.
It should be noted that every operation in FIG. 5e is performed point by point on each pixel of the image.
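This detail overlay is essentially an unsharp-mask style operation; a minimal sketch (alpha and sigma are illustrative values):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def detail_overlay(ienhance, alpha=1.5, sigma=3.0):
    """Boost the high-frequency detail of Ienhance by the overlay coefficient alpha."""
    low = gaussian_filter(ienhance.astype(np.float32), sigma)   # low-frequency image I
    detail = ienhance - low                                     # Idetail = Ienhance - I
    return np.clip(ienhance + alpha * detail, 0, 255)           # Ienhance + Idetail * alpha
```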
Further, in the solution shown in FIG. 5, if the preview image contains a feature region, such as a face region, a portrait region, a flower region, food, or another subject the user expects to capture, then in order to brighten the feature region in the final image, the short exposure amount needs to be increased, that is, the exposure time is made slightly longer, so that the exposure of the captured image is higher and the objects in the feature region are brighter.
Taking a face as the feature region as an example, the average brightness value M_face of the face region can be calculated;
calculate the ratio of the target brightness value to the average brightness of the highlight region:
R_over = T_target / M_over
Reset R_over according to the calculated R_over and the minimum reduction ratio R_min for brightness compensation of the feature region, and then multiply the exposure value E_1 of the preview image by the reset R_over to obtain the short exposure amount E_target. The average brightness value M_face of the face region may be the ratio of the sum of the gray values of all pixels of the face region to the number of all pixels of the face region; normally, the lower the average brightness of the face region, the larger R_over (R_over is generally less than 1). The final R_over is determined from the calculated R_over and the set R_min as follows: if the calculated R_over is smaller than the minimum value R_min, it is set to R_min and cannot be reduced further; if it is larger than R_min, the calculated R_over is used:
R_over = R_min if R_over < R_min, otherwise R_over; E_target = E_1 * R_over
R_min can be set according to the average brightness value M_face of the face region; it is a piecewise function of x, where x denotes the average brightness of the face (the specific piecewise values are given in the corresponding formula of the original filing).
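A compact sketch of the short-exposure calculation with the face-based floor; the mapping from face brightness to R_min below is a made-up placeholder, since the patent's piecewise definition of R_min is not reproduced here:

```python
def short_exposure_with_face(e_preview, m_highlight, m_face, target=210.0):
    """Return the short exposure E_target, floored by a face-dependent minimum ratio."""
    r_over = target / m_highlight                 # reduction ratio from the highlight region
    r_min = 0.5 if m_face < 80 else 0.3           # hypothetical stand-in for R_min(x)
    return e_preview * max(r_over, r_min)         # never reduce below the floor
```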
For example, FIG. 6b is a schematic diagram of photographing a person: the left image in FIG. 6b is the preview image, and the right image is the image obtained after raising the exposure value for the face, which is obviously much brighter.
Further, in the solution shown in FIG. 5, if the ISP includes modules such as a spatial-domain noise reduction module, a lens shading correction (LSC) module, and a DRC module for adjusting brightness, these modules also need to be turned off when the ISP performs the YUV format conversion on the RAW image, so as to avoid repeated processing of the image by the ISP.
Further, if the proportion of underexposed pixels in the preview image is greater than a preset threshold, the overly dark region in the preview image is relatively large. In this case, a long exposure amount can be determined, the camera is controlled to collect one frame of long exposure image according to the long exposure amount, and the M frames of short exposure images are fused with the long exposure image to improve the dark-region detail. For example, FIG. 7 is a schematic flowchart of fusing multiple frames of short exposure images with a long exposure image.
Specifically, the average brightness value of the overly dark region of the preview image can be calculated, and the long exposure amount is determined according to the average brightness of the overly dark region, the exposure value of the preview image, and the target brightness value. The average brightness of the overly dark region may be the ratio of the sum of the gray values of all pixels in the overly dark region to the number of all pixels in that region, and the average brightness of the overly dark region is smaller than the target brightness value.
For example, the ratio of the target brightness value to the average brightness of the overly dark region can be calculated, and the product of the exposure value of the preview image and this ratio is used as the long exposure amount. For instance, assuming the average brightness of the overly dark region is M_under and the target brightness value is 210, the ratio R_under of the target brightness value to the average M_under is used as the raising ratio by which the underexposed region is to be brightened, that is, the compensation value. The long exposure amount E_target is then obtained by multiplying the exposure value E_1 of the preview image by R_under:
R_under = 210 / M_under
E_target = E_1 * R_under
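And the corresponding long-exposure calculation, directly following the two formulas above (the function name is an assumption):

```python
def long_exposure(e_preview, m_under, target=210.0):
    """Return the long exposure E_target used to brighten the overly dark region."""
    r_under = target / m_under                    # raising ratio; > 1 because m_under < target
    return e_preview * r_under
```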
For example, as shown in FIG. 7a, the left image is the result of fusing only multiple short exposure frames, and the right image is the result of fusing multiple short exposure frames with one long exposure frame. As can be seen from FIG. 7a, after a long exposure frame is added, the detail of the overly dark regions is improved.
In the solution shown in FIG. 7, the multi-frame temporal noise reduction and spatial noise reduction of the long exposure image can refer to the multi-frame temporal and spatial noise reduction of the short exposure images in FIG. 5, and are not described again.
Specifically, the exposure fusion mentioned above, and the exposure fusion in the solution of FIG. 7 between the spatially denoised long exposure image and the locally brightness-adjusted image, can be performed as follows:
Define a weight center. Preferably, c = 128 can be set.
Calculate the weight W_ij,k of each pixel according to the Gaussian weight formula and the gray value of the pixels of each image, where the Gaussian weight formula (pixel values in the formula are normalized by the maximum gray level 255, so that 128 normalizes to 0.5) is of the form:
W_ij,k = exp(-(P_ij,k - 0.5)^2 / (2 * sigma^2)), where P_ij,k is the gray value of pixel (i, j) in the k-th image normalized by 255, and sigma is a preset width parameter.
For example, as shown in FIG. 7b, the closer the gray value is to 128, the closer the weight is to 1; the farther from 128 (for example, overexposed regions near 255 or dead-black regions near 0), the smaller the weight and the smaller its contribution to the final result.
Perform a weighted sum over the multiple images according to the weights, where ij denotes the pixel position and k denotes the images of different brightness:
O_ij = Σ_k W_ij,k * I_ij,k
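A sketch of this Gaussian-weighted exposure fusion for grayscale frames; normalizing the weights per pixel and the value sigma = 0.2 are assumptions made for illustration:

```python
import numpy as np

def exposure_fusion(frames, center=128.0, sigma=0.2):
    """frames: list of HxW images of different exposures; returns the fused HxW image."""
    stack = np.stack([f.astype(np.float32) for f in frames])        # K x H x W
    p = stack / 255.0
    weights = np.exp(-((p - center / 255.0) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0) + 1e-6                            # normalize over the K frames
    return (weights * stack).sum(axis=0)
```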
For example, as shown in FIG. 7c, the upper-left three columns are the short exposure, medium exposure and long exposure input images; the lower-left three columns are the corresponding short, medium and long exposure weight maps; and the large image on the right is the result of exposure-fusing the differently exposed images on the left according to their respective weight maps. As can be seen from FIG. 7c, after the images are fused according to their weight maps, the details of both the bright and the dark parts of the image are well presented.
Further, in the process of fusing the multiple frames of short exposure images with the one frame of long exposure image, if the long exposure image is blurred or contains ghosting, the long exposure image is discarded and only the multiple frames of short exposure images are fused (that is, the solution shown in FIG. 5 is executed), so that fusing a poor-quality long exposure image does not degrade the quality of the whole image.
It should be noted that, for the preview image, once overexposed pixels exist, or typical overexposed pixels exist, the multi-frame short exposure fusion solution shown in FIG. 4 or FIG. 5 can be selected to improve the detail of the highlight parts of the image. Conversely, if there are no overexposed pixels at all and the entire preview image is underexposed (for example, a pitch-dark night), a long exposure amount can be determined and the long exposure fusion solution described below is executed. In other words, when the preview image is too bright, the multi-frame short exposure fusion solution of FIG. 4 or FIG. 5 is used; when the preview image is too dark, the multi-frame long exposure fusion solution is used; and when the preview image has both bright and dark parts, the solution of FIG. 7 (multi-frame short exposure fusion plus fusion with one long exposure frame) can be used.
The multi-frame long exposure fusion solution is as follows: after the user opens the camera application, the preview function of the camera is started, the shooting parameters (a long exposure amount and a number of frames N) are determined from the preview image, and the determined shooting parameters are transmitted to the camera through the image signal processor (ISP) inside the shooting device, which controls the camera to collect N frames of long exposure images. Subsequently, the N long exposure frames are processed in the RAW domain (for example, multi-frame noise reduction and local brightness enhancement) to obtain one frame of RAW image with a high dynamic range and little noise, the RAW image is sent to the ISP, and a YUV image is obtained through ISP processing. Finally, the YUV image is encoded according to the Joint Photographic Experts Group (JPEG) standard to obtain the target image. In this way, a high-quality JPEG image can be obtained quickly, improving the shooting results (noise, dynamic range and so on) in a variety of application scenarios. For the multi-frame noise reduction, local brightness enhancement and other processes in this solution, reference may be made to the description above; details are not repeated.
The process of determining the shooting parameters (the long exposure amount and the number of frames N) from the preview image can refer to S503 above. For example, when the image is underexposed as a whole (such as a night scene without a light source), the average brightness M_over of the pixels whose brightness lies in [210, 225] may be smaller than the preset target value 210. In this case, a new target value T_new can be set, and the ratio R_over of the target value T_new to the average brightness M_over is used as the raising ratio for brightness compensation of the highlight region. R_over may be greater than 1, which means the set exposure will be longer than the exposure time of the preview image. Then, the long exposure amount E_target is obtained by multiplying the exposure value E_1 of the preview image by R_over:
R_over = T_new / M_over
E_target = E_1 * R_over
It can be understood that, in order to implement the above functions, the electronic device includes corresponding hardware structures and/or software modules for performing each function. Those skilled in the art will readily appreciate that, in combination with the algorithm steps of the examples described in the embodiments disclosed herein, the present application can be implemented in hardware or in a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends on the specific application and design constraints of the technical solution. A person skilled in the art may use different methods to implement the described functions for each particular application, but such implementation should not be considered to be beyond the scope of the present application.
The embodiments of the present application may divide the electronic device and the server into functional modules according to the above method examples. For example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is only a logical functional division; there may be other division manners in actual implementation.
In the case where each functional module is divided corresponding to each function, FIG. 8 shows a possible schematic composition of the electronic device involved in the above embodiments. As shown in FIG. 8, the electronic device 80 may include a calculation unit 801, a shooting control unit 802 and an image processing unit 803.
The calculation unit 801 is configured to calculate the short exposure amount and the number of frames M according to the preview image, to support the electronic device in performing S503.
The shooting control unit 802 is configured to deliver the short exposure amount and the number of frames M to the camera through the ISP, and to control the camera to collect M frames of short exposure images according to the short exposure amount and the number of frames M, to support the electronic device in performing S504.
The image processing unit 803 is configured to perform multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, send the RAW image to the ISP for processing to obtain a YUV image, and compression-encode the YUV image to obtain the target image, to support the electronic device in performing S505 and S506.
It should be noted that all relevant content of the steps involved in the above method embodiments can be cited in the functional description of the corresponding functional modules and is not repeated here. The electronic device provided by the embodiment of the present application is configured to perform the above shooting method, and therefore can achieve the same effects as the above shooting method.
In the case of an integrated unit, the above calculation unit 801, shooting control unit 802 and image processing unit 803 may be integrated into a processing module, where the processing module is configured to control and manage the actions of the electronic device, for example to support the electronic device in performing steps S501 to S506 in FIG. 5 and/or other processes of the techniques described herein. In addition, the electronic device may further include a display module and a storage module.
The storage module is configured to store the program code of the electronic device, recorded video, and parameter information of the video.
The processing module may be a processor or a controller, for example a CPU, a graphics processing unit (GPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), an FPGA or another programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various illustrative logical blocks, modules and circuits described in connection with the disclosure of the present application. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and so on.
The display module may be a display, and may be used to display information input by the user, information provided to the user, and various menus of the terminal. Specifically, the display may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. In addition, a touch panel may be integrated on the display for collecting touch events on or near it and sending the collected touch information to other devices (for example a processor).
The storage module may be a memory, which may include a high-speed RAM and may also include a non-volatile memory, such as a magnetic disk storage device, a flash memory device, or another volatile solid-state storage device.
In addition, the electronic device may further include a communication module, which may be used to support communication between the electronic device and other network entities, for example communication with a server. The communication module may specifically be a device that interacts with other electronic devices, such as a radio frequency circuit, a Bluetooth chip, or a WiFi chip.
In a specific implementation, when the processing module is a processor, the display module is a touch screen, and the storage module is a memory, the electronic device involved in the embodiment of the present application may specifically be the mobile phone shown in FIG. 3.
The embodiment of the present application further provides a computer storage medium, where the computer storage medium stores computer instructions; when the computer instructions run on an electronic device, the electronic device performs the above related method steps to implement the shooting method in the above embodiments.
The embodiment of the present application further provides a computer program product which, when run on a computer, causes the computer to perform the above related method steps to implement the shooting method in the above embodiments.
In addition, the embodiments of the present application further provide an apparatus, which may specifically be a chip, a component or a module, and which may include a connected processor and memory; the memory is configured to store computer-executable instructions, and when the apparatus runs, the processor can execute the computer-executable instructions stored in the memory, so that the chip performs the shooting method in each of the above method embodiments.
The electronic device, computer storage medium, computer program product or chip provided by the embodiments of the present application is configured to perform the corresponding method provided above; therefore, for the beneficial effects it can achieve, reference may be made to the beneficial effects of the corresponding method provided above, which are not repeated here.
Through the description of the above embodiments, those skilled in the art can understand that, for convenience and brevity of description, only the division of the above functional modules is used as an example for illustration. In practical applications, the above functions may be assigned to different functional modules as needed, that is, the internal structure of the device is divided into different functional modules to complete all or part of the functions described above.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; the division of modules or units is only a logical functional division, and there may be other division manners in actual implementation, for example multiple units or components may be combined or integrated into another apparatus, or some features may be ignored or not executed. In addition, the mutual coupling or direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses or units, and may be in electrical, mechanical or other forms.
The units described as separate components may or may not be physically separated, and the components displayed as units may be one physical unit or multiple physical units, that is, they may be located in one place or distributed in multiple different places. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated into one processing unit, each unit may exist physically separately, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit.
If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in a readable storage medium. Based on this understanding, the technical solution of the embodiments of the present application, in essence or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for enabling a device (which may be a single-chip microcomputer, a chip, or the like) or a processor to perform all or part of the steps of the methods of the embodiments of the present application. The foregoing storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disc.
The above is only a specific implementation of the present application, but the protection scope of the present application is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed in the present application shall be covered by the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (11)

  1. A shooting method, applied to an electronic device comprising a camera and an image signal processor (ISP), wherein the method comprises:
    calculating a short exposure amount and a number of frames to shoot M according to a preview image, wherein the preview image is a high dynamic range image and M is an integer greater than or equal to 2;
    delivering the short exposure amount and the number of frames M to the camera through the ISP, and controlling the camera to collect M frames of short exposure images according to the short exposure amount and the number of frames M;
    performing multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image;
    sending the RAW image to the ISP for processing to obtain a YUV image; and
    compression-encoding the YUV image to obtain a target image.
  2. The method according to claim 1, wherein calculating the short exposure amount according to the preview image comprises:
    calculating an average brightness value of a highlight region of the preview image, wherein the highlight region is the region formed by the overexposed pixels in the preview image; and
    determining the short exposure amount according to the average brightness value of the highlight region, an exposure value of the preview image, and a target brightness value, wherein the average brightness value of the highlight region is the ratio of the sum of the gray values of all pixels in the highlight region to the number of all pixels in the highlight region, the target brightness value is the brightness the user expects to achieve, and the average brightness value of the highlight region is greater than the target brightness value.
  3. The method according to claim 1, wherein, if the preview image comprises a feature region, calculating the short exposure amount according to the preview image comprises:
    calculating an average brightness value of the feature region of the preview image, wherein the feature region is the image subject that the user expects to capture in the preview image;
    calculating a first brightness reduction ratio of the feature region according to the average brightness value of the feature region and a target brightness value, wherein the target brightness value is the brightness the user expects to achieve;
    determining a second brightness reduction ratio of the feature region according to the calculated first brightness reduction ratio and a preset minimum reduction ratio for performing brightness compensation on the feature region, wherein the second brightness reduction ratio is greater than or equal to the minimum reduction ratio; and
    determining the short exposure amount according to an exposure value of the preview image and the second brightness reduction ratio.
  4. The method according to any one of claims 1 to 3, wherein, when the proportion of underexposed pixels in the preview image is greater than a preset threshold, before the multi-frame noise reduction processing and the local brightness adjustment are performed on the M frames of short exposure images to obtain one frame of RAW image, the method further comprises:
    calculating an average brightness value of an overly dark region of the preview image;
    determining a long exposure amount according to the average brightness value of the overly dark region, an exposure value of the preview image, and the target brightness value, wherein the average brightness value of the overly dark region is the ratio of the sum of the gray values of all pixels in the overly dark region to the number of all pixels in the overly dark region, and the average brightness value of the overly dark region is smaller than the target brightness value; and
    controlling the camera to collect one frame of long exposure image;
    wherein performing the multi-frame noise reduction processing and the local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image comprises:
    exposure-fusing the image obtained by performing the multi-frame noise reduction processing and the local brightness adjustment on the M frames of short exposure images with the long exposure image to obtain one frame of RAW image.
  5. An electronic device comprising a camera and an image signal processor (ISP), wherein the electronic device further comprises:
    a calculation unit, configured to calculate a short exposure amount and a number of frames to shoot M according to a preview image, wherein the preview image is a high dynamic range image and M is an integer greater than or equal to 2;
    a shooting control unit, configured to deliver the short exposure amount and the number of frames M to the camera through the ISP, and to control the camera to collect M frames of short exposure images according to the short exposure amount and the number of frames M; and
    an image processing unit, configured to perform multi-frame noise reduction processing and local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, send the RAW image to the ISP for processing to obtain a YUV image, and compression-encode the YUV image to obtain a target image.
  6. The electronic device according to claim 5, wherein the calculation unit is specifically configured to:
    calculate an average brightness value of a highlight region of the preview image, wherein the highlight region is the region formed by the overexposed pixels in the preview image; and
    determine the short exposure amount according to the average brightness value of the highlight region, an exposure value of the preview image, and a target brightness value, wherein the average brightness value of the highlight region is the ratio of the sum of the gray values of all pixels in the highlight region to the number of all pixels in the highlight region, the target brightness value is the brightness the user expects to achieve, and the average brightness value of the highlight region is greater than the target brightness value.
  7. The electronic device according to claim 5, wherein the calculation unit is specifically configured to:
    if the preview image comprises a feature region, calculate an average brightness value of the feature region of the preview image, wherein the feature region is the image subject that the user expects to capture in the preview image;
    calculate a first brightness reduction ratio of the feature region according to the average brightness value of the feature region and a target brightness value, wherein the target brightness value is the brightness the user expects to achieve;
    determine a second brightness reduction ratio of the feature region according to the calculated first brightness reduction ratio and a preset minimum reduction ratio for performing brightness compensation on the feature region, wherein the second brightness reduction ratio is greater than or equal to the minimum reduction ratio; and
    determine the short exposure amount according to an exposure value of the preview image and the second brightness reduction ratio.
  8. The electronic device according to any one of claims 5 to 7, wherein
    the calculation unit is further configured to: when the proportion of underexposed pixels in the preview image is greater than a preset threshold, before the image processing unit performs the multi-frame noise reduction processing and the local brightness adjustment on the M frames of short exposure images to obtain one frame of RAW image, calculate an average brightness value of an overly dark region of the preview image;
    the shooting control unit is further configured to determine a long exposure amount according to the average brightness value of the overly dark region, an exposure value of the preview image, and the target brightness value, wherein the average brightness value of the overly dark region is the ratio of the sum of the gray values of all pixels in the overly dark region to the number of all pixels in the overly dark region, and the average brightness value of the overly dark region is smaller than the target brightness value; and to control the camera to collect one frame of long exposure image; and
    the image processing unit is specifically configured to exposure-fuse the image obtained by performing the multi-frame noise reduction processing and the local brightness adjustment on the M frames of short exposure images with the long exposure image to obtain one frame of RAW image.
  9. An electronic device, comprising one or more processors and one or more memories, wherein the one or more memories are coupled to the one or more processors, the one or more memories are configured to store computer program code, the computer program code comprises computer instructions, and when the one or more processors execute the computer instructions, the electronic device performs the shooting method according to any one of claims 1 to 4.
  10. A computer storage medium, comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the shooting method according to any one of claims 1 to 4.
  11. A computer program product which, when run on a computer, causes the computer to perform the shooting method according to any one of claims 1 to 4.
PCT/CN2018/080734 2018-03-27 2018-03-27 Image capture method and device WO2019183813A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2018/080734 WO2019183813A1 (en) 2018-03-27 2018-03-27 Image capture method and device
CN201880077221.5A CN111418201B (en) 2018-03-27 2018-03-27 Shooting method and equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2018/080734 WO2019183813A1 (en) 2018-03-27 2018-03-27 Image capture method and device

Publications (1)

Publication Number Publication Date
WO2019183813A1 true WO2019183813A1 (en) 2019-10-03

Family

ID=68060880

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2018/080734 WO2019183813A1 (en) 2018-03-27 2018-03-27 Image capture method and device

Country Status (2)

Country Link
CN (1) CN111418201B (en)
WO (1) WO2019183813A1 (en)


Families Citing this family (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111866407A (en) * 2020-07-30 2020-10-30 深圳市阿达视高新技术有限公司 Image processing method and device based on motion digital camera
CN112381836B (en) * 2020-11-12 2023-03-31 贝壳技术有限公司 Image processing method and device, computer readable storage medium, and electronic device
CN112598609A (en) * 2020-12-09 2021-04-02 普联技术有限公司 Dynamic image processing method and device
CN112804464B (en) * 2020-12-30 2023-05-09 北京格视科技有限公司 HDR image generation method and device, electronic equipment and readable storage medium
CN114760480A (en) * 2021-01-08 2022-07-15 华为技术有限公司 Image processing method, device, equipment and storage medium
CN114827430B (en) * 2021-01-19 2024-05-21 Oppo广东移动通信有限公司 Image processing method, chip and electronic equipment
CN114979500B (en) * 2021-02-26 2023-08-08 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, electronic device, and readable storage medium
CN112969055B (en) * 2021-03-01 2022-11-08 天地伟业技术有限公司 Multi-exposure method for global monitoring
CN113473014B (en) * 2021-06-30 2022-11-18 北京紫光展锐通信技术有限公司 Image data processing method and electronic equipment
CN115696059A (en) * 2021-07-28 2023-02-03 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN113810603B (en) * 2021-08-12 2022-09-09 荣耀终端有限公司 Point light source image detection method and electronic equipment
CN115706870B (en) * 2021-08-12 2023-12-26 荣耀终端有限公司 Video processing method, device, electronic equipment and storage medium
CN116437222B (en) * 2021-12-29 2024-04-19 荣耀终端有限公司 Image processing method and electronic equipment
CN116416122B (en) * 2021-12-31 2024-04-16 荣耀终端有限公司 Image processing method and related device
CN114022484B (en) * 2022-01-10 2022-04-29 深圳金三立视频科技股份有限公司 Image definition value calculation method and terminal for point light source scene
CN117710265A (en) * 2022-01-25 2024-03-15 荣耀终端有限公司 Image processing method and related device
CN116723417B (en) * 2022-02-28 2024-04-26 荣耀终端有限公司 Image processing method and electronic equipment
CN116095517B (en) * 2022-08-31 2024-04-09 荣耀终端有限公司 Blurring method, terminal device and readable storage medium
CN115767262B (en) * 2022-10-31 2024-01-16 华为技术有限公司 Photographing method and electronic equipment
CN117692761A (en) * 2023-05-23 2024-03-12 荣耀终端有限公司 Motion snapshot method and electronic equipment
CN116847204A (en) * 2023-08-25 2023-10-03 荣耀终端有限公司 Target identification method, electronic equipment and storage medium
CN117692799A (en) * 2023-08-26 2024-03-12 荣耀终端有限公司 Shooting method and related equipment
CN117278864B (en) * 2023-11-15 2024-04-05 荣耀终端有限公司 Image capturing method, electronic device, and storage medium
CN117499779B (en) * 2023-12-27 2024-05-10 荣耀终端有限公司 Image preview method, device and storage medium
CN117714890B (en) * 2024-02-18 2024-05-31 荣耀终端有限公司 Exposure compensation method, electronic equipment and storage medium

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101399924B (en) * 2007-09-25 2010-05-19 展讯通信(上海)有限公司 Automatic exposure method and device based on brightness histogram
JP5439197B2 (en) * 2010-01-15 2014-03-12 オリンパスイメージング株式会社 Imaging apparatus, imaging method, and imaging program
US8606009B2 (en) * 2010-02-04 2013-12-10 Microsoft Corporation High dynamic range image generation and rendering
CN102075688B (en) * 2010-12-28 2012-07-25 青岛海信网络科技股份有限公司 Wide dynamic processing method for single-frame double-exposure image
JP5713752B2 (en) * 2011-03-28 2015-05-07 キヤノン株式会社 Image processing apparatus and control method thereof
US9064313B2 (en) * 2012-09-28 2015-06-23 Intel Corporation Adaptive tone map to a region of interest to yield a low dynamic range image
CN105578068B (en) * 2015-12-21 2018-09-04 广东欧珀移动通信有限公司 High dynamic range image generation method, device and mobile terminal
KR20180027047A (en) * 2016-09-05 2018-03-14 엘지전자 주식회사 High dynamic range image photographing apparatus and method for controlling the same
CN106791475B (en) * 2017-01-23 2019-08-27 上海兴芯微电子科技有限公司 Exposure adjustment method and applicable vehicle-mounted imaging apparatus
CN107809593B (en) * 2017-11-13 2019-08-16 Oppo广东移动通信有限公司 Image shooting method, apparatus, terminal and storage medium

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009153046A (en) * 2007-12-21 2009-07-09 Sanyo Electric Co Ltd Blur correcting device and method, and imaging apparatus
US8482620B2 (en) * 2008-03-11 2013-07-09 Csr Technology Inc. Image enhancement based on multiple frames and motion estimation
CN104917973A (en) * 2014-03-11 2015-09-16 宏碁股份有限公司 Dynamic exposure adjusting method and electronic apparatus
CN105812670A (en) * 2016-05-12 2016-07-27 珠海市魅族科技有限公司 Picture taking method and terminal
CN105872148A (en) * 2016-06-21 2016-08-17 维沃移动通信有限公司 Method and mobile terminal for generating high dynamic range images
CN107592453A (en) * 2017-09-08 2018-01-16 维沃移动通信有限公司 Image capture method and mobile terminal
CN107483836A (en) * 2017-09-27 2017-12-15 维沃移动通信有限公司 Image capture method and mobile terminal

Cited By (64)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112819702B (en) * 2019-11-15 2024-02-20 北京金山云网络技术有限公司 Image enhancement method, image enhancement device, electronic equipment and computer readable storage medium
CN112819702A (en) * 2019-11-15 2021-05-18 北京金山云网络技术有限公司 Image enhancement method and device, electronic equipment and computer readable storage medium
CN112907454B (en) * 2019-11-19 2023-08-08 杭州海康威视数字技术股份有限公司 Method, device, computer equipment and storage medium for acquiring image
CN112907454A (en) * 2019-11-19 2021-06-04 杭州海康威视数字技术股份有限公司 Method and device for acquiring image, computer equipment and storage medium
CN112991188A (en) * 2019-12-02 2021-06-18 RealMe重庆移动通信有限公司 Image processing method and device, storage medium and electronic equipment
CN113037988A (en) * 2019-12-09 2021-06-25 Oppo广东移动通信有限公司 Zoom method, electronic device, and computer-readable storage medium
CN111127529A (en) * 2019-12-18 2020-05-08 浙江大华技术股份有限公司 Image registration method and device, storage medium and electronic device
CN111127529B (en) * 2019-12-18 2024-02-02 浙江大华技术股份有限公司 Image registration method and device, storage medium and electronic device
CN111145151A (en) * 2019-12-23 2020-05-12 维沃移动通信有限公司 Motion area determination method and electronic equipment
CN111145151B (en) * 2019-12-23 2023-05-26 维沃移动通信有限公司 Motion area determining method and electronic equipment
CN111242860A (en) * 2020-01-07 2020-06-05 影石创新科技股份有限公司 Super night scene image generation method and device, electronic equipment and storage medium
CN111242860B (en) * 2020-01-07 2024-02-27 影石创新科技股份有限公司 Super night scene image generation method and device, electronic equipment and storage medium
CN111310727B (en) * 2020-03-13 2023-12-08 浙江大华技术股份有限公司 Object detection method and device, storage medium and electronic device
CN111310727A (en) * 2020-03-13 2020-06-19 浙江大华技术股份有限公司 Object detection method and device, storage medium and electronic device
CN111405205A (en) * 2020-03-24 2020-07-10 维沃移动通信有限公司 Image processing method and electronic device
CN111405205B (en) * 2020-03-24 2023-02-24 维沃移动通信有限公司 Image processing method and electronic device
CN111915505A (en) * 2020-06-18 2020-11-10 北京迈格威科技有限公司 Image processing method, image processing device, electronic equipment and storage medium
CN111915505B (en) * 2020-06-18 2023-10-27 北京迈格威科技有限公司 Image processing method, device, electronic equipment and storage medium
CN113873178B (en) * 2020-06-30 2024-03-22 Oppo广东移动通信有限公司 Multimedia processing chip, electronic device and image processing method
CN113873178A (en) * 2020-06-30 2021-12-31 Oppo广东移动通信有限公司 Multimedia processing chip, electronic device and image processing method
CN112017137A (en) * 2020-08-19 2020-12-01 深圳市锐尔觅移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN112017137B (en) * 2020-08-19 2024-02-27 深圳市锐尔觅移动通信有限公司 Image processing method, device, electronic equipment and computer readable storage medium
CN112580385A (en) * 2020-12-31 2021-03-30 杭州荣旗科技有限公司 Bar code decoding method based on multi-frame image fusion and non-local mean filtering
CN112580385B (en) * 2020-12-31 2022-05-10 杭州荣旗科技有限公司 Bar code decoding method based on multi-frame image fusion and non-local mean filtering
CN112950489A (en) * 2021-01-12 2021-06-11 辽宁省视讯技术研究有限公司 Three-dimensional field noise reduction method based on multiple exposures
CN112950489B (en) * 2021-01-12 2023-11-03 辽宁省视讯技术研究有限公司 Three-dimensional field noise reduction method based on multiple exposure
US11671715B2 (en) 2021-01-14 2023-06-06 Qualcomm Incorporated High dynamic range technique selection for image processing
CN112651899A (en) * 2021-01-15 2021-04-13 北京小米松果电子有限公司 Image processing method and device, electronic device and storage medium
US11989863B2 (en) 2021-01-21 2024-05-21 Beijing Xiaomi Pinecone Electronics Co., Ltd. Method and device for processing image, and storage medium
CN112785537A (en) * 2021-01-21 2021-05-11 北京小米松果电子有限公司 Image processing method, device and storage medium
CN113012081A (en) * 2021-01-28 2021-06-22 北京迈格威科技有限公司 Image processing method, device and electronic system
CN112954136A (en) * 2021-01-29 2021-06-11 中国科学院长春光学精密机械与物理研究所 Method and device for suppressing shot noise of remote sensing image of aviation squint remote imaging
CN114820404A (en) * 2021-01-29 2022-07-29 北京字节跳动网络技术有限公司 Image processing method, image processing apparatus, electronic device, and medium
CN115514876A (en) * 2021-06-23 2022-12-23 荣耀终端有限公司 Image fusion method, electronic device, storage medium, and computer program product
CN115514876B (en) * 2021-06-23 2023-09-01 荣耀终端有限公司 Image fusion method, electronic device, storage medium and computer program product
CN114095666A (en) * 2021-08-12 2022-02-25 荣耀终端有限公司 Photographing method, electronic device and computer-readable storage medium
CN114095666B (en) * 2021-08-12 2023-09-22 荣耀终端有限公司 Photographing method, electronic device, and computer-readable storage medium
CN113706495A (en) * 2021-08-23 2021-11-26 广东奥普特科技股份有限公司 Machine vision detection system for automatically detecting lithium battery parameters on conveyor belt
CN113747062B (en) * 2021-08-25 2023-05-26 Oppo广东移动通信有限公司 HDR scene detection method and device, terminal and readable storage medium
CN113747062A (en) * 2021-08-25 2021-12-03 Oppo广东移动通信有限公司 HDR scene detection method and device, terminal and readable storage medium
CN113835462A (en) * 2021-09-13 2021-12-24 星宸科技股份有限公司 Control circuit and control method of image sensor
CN113962884A (en) * 2021-10-10 2022-01-21 杭州知存智能科技有限公司 HDR video acquisition method and device, electronic equipment and storage medium
CN113962884B (en) * 2021-10-10 2023-03-24 杭州知存智能科技有限公司 HDR video acquisition method and device, electronic equipment and storage medium
CN113822819B (en) * 2021-10-15 2023-10-27 Oppo广东移动通信有限公司 HDR scene detection method and device, terminal and readable storage medium
CN113822819A (en) * 2021-10-15 2021-12-21 Oppo广东移动通信有限公司 HDR scene detection method and device, terminal and readable storage medium
CN113905185A (en) * 2021-10-27 2022-01-07 锐芯微电子股份有限公司 Image processing method and device
CN113905185B (en) * 2021-10-27 2023-10-31 锐芯微电子股份有限公司 Image processing method and device
CN114264835A (en) * 2021-12-22 2022-04-01 上海集成电路研发中心有限公司 Method, device and chip for measuring rotating speed of fan
CN114264835B (en) * 2021-12-22 2023-11-17 上海集成电路研发中心有限公司 Method, device and chip for measuring rotation speed of fan
CN114449176A (en) * 2022-01-10 2022-05-06 瑞芯微电子股份有限公司 Automatic exposure method, dynamic range identification method, device, medium, and apparatus
CN116452475B (en) * 2022-01-10 2024-05-31 荣耀终端有限公司 Image processing method and related device
CN116452475A (en) * 2022-01-10 2023-07-18 荣耀终端有限公司 Image processing method and related device
CN116723409A (en) * 2022-02-28 2023-09-08 荣耀终端有限公司 Automatic exposure method and electronic equipment
CN116723409B (en) * 2022-02-28 2024-05-24 荣耀终端有限公司 Automatic exposure method and electronic equipment
CN114666512A (en) * 2022-03-25 2022-06-24 四川创安微电子有限公司 Adjusting method and system for rapid automatic exposure
CN114666512B (en) * 2022-03-25 2023-06-27 四川创安微电子有限公司 Method and system for adjusting rapid automatic exposure
CN114511469B (en) * 2022-04-06 2022-06-21 江苏游隼微电子有限公司 Intelligent image noise reduction prior detection method
CN114511469A (en) * 2022-04-06 2022-05-17 江苏游隼微电子有限公司 Intelligent image noise reduction prior detection method
CN115278046A (en) * 2022-06-15 2022-11-01 维沃移动通信有限公司 Shooting method and device, electronic equipment and storage medium
WO2024011976A1 (en) * 2022-07-14 2024-01-18 荣耀终端有限公司 Method for expanding dynamic range of image and electronic device
CN115002356A (en) * 2022-07-19 2022-09-02 深圳市安科讯实业有限公司 Night vision method based on digital video photography
WO2024088163A1 (en) * 2022-10-24 2024-05-02 维沃移动通信有限公司 Image processing method and circuit, device, and medium
CN116389898B (en) * 2023-02-27 2024-03-19 荣耀终端有限公司 Image processing method, device and storage medium
CN116389898A (en) * 2023-02-27 2023-07-04 荣耀终端有限公司 Image processing method, device and storage medium

Also Published As

Publication number Publication date
CN111418201A (en) 2020-07-14
CN111418201B (en) 2021-10-15

Similar Documents

Publication Publication Date Title
WO2019183813A1 (en) Image capture method and device
CN108419023B (en) Method for generating high dynamic range image and related equipment
CN110445988B (en) Image processing method, image processing device, storage medium and electronic equipment
WO2020034737A1 (en) Imaging control method, apparatus, electronic device, and computer-readable storage medium
US9118841B2 (en) Determining an image capture payload burst structure based on a metering image capture sweep
CN109671106B (en) Image processing method, device and equipment
US9077913B2 (en) Simulating high dynamic range imaging with virtual long-exposure images
CN109345485B (en) Image enhancement method and device, electronic equipment and storage medium
WO2017215501A1 (en) Method and device for image noise reduction processing and computer storage medium
CN110198417A (en) Image processing method, device, storage medium and electronic equipment
CN111028189A (en) Image processing method, image processing device, storage medium and electronic equipment
US9131201B1 (en) Color correcting virtual long exposures with true long exposures
CN111684788A (en) Image processing method and device
CN110445989B (en) Image processing method, image processing device, storage medium and electronic equipment
US9087391B2 (en) Determining an image capture payload burst structure
WO2020034702A1 (en) Control method, device, electronic equipment and computer readable storage medium
EP3836532A1 (en) Control method and apparatus, electronic device, and computer readable storage medium
CN113810590A (en) Image processing method, electronic device, medium, and system
WO2023137956A1 (en) Image processing method and apparatus, electronic device, and storage medium
JP6873679B2 (en) Imaging device, control method and program of imaging device
CN110572585B (en) Image processing method, image processing device, storage medium and electronic equipment
EP3889883A1 (en) Image processing method and device, mobile terminal, and storage medium
WO2015192545A1 (en) Photographing method and apparatus and computer storage medium
CN117135293B (en) Image processing method and electronic device
US20220230283A1 (en) Method and device for processing image, and storage medium

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application
Ref document number: 18911676
Country of ref document: EP
Kind code of ref document: A1
NENP Non-entry into the national phase
Ref country code: DE
122 EP: PCT application non-entry in European phase
Ref document number: 18911676
Country of ref document: EP
Kind code of ref document: A1