CN112995490A - Image processing method, terminal photographing method, medium and system - Google Patents


Info

Publication number
CN112995490A
Authority
CN
China
Prior art keywords
images
image
exposure
noise reduction
exposure values
Prior art date
Legal status
Pending
Application number
CN201911276779.9A
Other languages
Chinese (zh)
Inventor
吴进福
赵乐
杨坤
张古强
吴天航
Current Assignee
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd
Priority application: CN201911276779.9A
Publication: CN112995490A
Legal status: Pending

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/73: Circuitry for compensating brightness variation in the scene by influencing the exposure time
    • H04N23/80: Camera processing pipelines; Components thereof
    • H04N23/90: Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H04N25/00: Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60: Noise processing, e.g. detecting, correcting, reducing or removing noise

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Image Processing (AREA)

Abstract

The present application relates to the technical field of image processing and discloses an image processing method, a terminal photographing method, a medium and a system. The image processing method includes: acquiring at least 2 first images of the same content with the same exposure value; performing multi-photo noise reduction on the at least 2 first images to obtain at least 1 second image; adjusting the exposure value of the noise-reduced second image to obtain at least 2 third images with different exposure values; and fusing the at least 2 third images to obtain a high-dynamic-range target image. The image processing method can be used for machine vision processing in artificial intelligence and can also be applied to a terminal photographing method.

Description

Image processing method, terminal photographing method, medium and system
Technical Field
The present application relates to the field of terminals, and more particularly, to a terminal photographing method and an associated image processing method, medium, and system.
Background
In the information age, mobile phones have become an indispensable part of daily life, mobile phone photography in particular. When taking a picture, a user expects a sharp, high-dynamic-range photo. In a backlit scene, however, especially an indoor backlit scene, a multi-frame noise reduction (MFNR) algorithm is typically used to obtain a sharp picture, while a high-dynamic-range picture is obtained through an HDR algorithm. The two algorithms conflict in principle: MFNR requires multiple frames with the same exposure value, whereas HDR fusion requires frames with different exposure values, so the two functions are mutually exclusive and cannot be used at the same time.
In addition, combining multi-frame noise reduction with the HDR algorithm in the RAW domain occupies a large amount of memory and makes photographing take too long, which degrades the user's photographing experience and poses a significant challenge for low- and mid-range phones.
Disclosure of Invention
The embodiments of the present application provide an image processing method, a terminal photographing method, a medium, and a system.
In a first aspect, an embodiment of the present application provides an image processing method, where the method includes:
acquiring at least 2 first images, where the at least 2 first images are of the same content and have the same exposure value; performing multi-photo noise reduction on the at least 2 first images to obtain at least 1 second image; adjusting the exposure value based on the second image to obtain at least 2 third images with different exposure values; and fusing the at least 2 third images to obtain a target image, where the target image is a high-dynamic-range image. In other words, several first images with the same exposure value are noise-reduced to obtain a sharp image, and the exposure value of the noise-reduced image is then adjusted so that the condition for HDR fusion is met, yielding a sharp high-dynamic-range image.
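As a rough illustration, this first-aspect pipeline (multi-photo noise reduction, then exposure adjustment, then fusion) can be sketched in NumPy. This is a minimal sketch, not the patented implementation: the plain temporal mean, the gamma values, and the well-exposedness weighting are all invented stand-ins, and the function and parameter names are assumptions.

```python
import numpy as np

def mfnr_then_hdr(first_images, gammas=(0.5, 1.0, 2.0)):
    """Sketch of the first aspect: MFNR first, then HDR fusion.
    All concrete choices here are illustrative assumptions."""
    # Step 1: multi-photo noise reduction -> one "second image".
    # A plain temporal mean stands in for the patent's MFNR.
    stack = np.stack([im.astype(np.float32) for im in first_images])
    second = stack.mean(axis=0)

    # Step 2: gamma-correct the second image to obtain "third
    # images" with different apparent exposure values.
    norm = second / 255.0
    thirds = [255.0 * norm ** g for g in gammas]

    # Step 3: fuse the third images into one high-dynamic-range
    # target, weighting pixels by how well exposed they are.
    weights = [np.exp(-((t / 255.0 - 0.5) ** 2) / 0.08) + 1e-6 for t in thirds]
    wsum = np.sum(weights, axis=0)
    fused = np.sum([w * t for w, t in zip(weights, thirds)], axis=0) / wsum
    return np.clip(fused, 0, 255).astype(np.uint8)
```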
In a possible implementation of the first aspect, the image processing method further includes:
the at least 2 third images include the second image itself and at least 1 image obtained by adjusting the exposure value of the second image. That is, HDR fusion may be performed between the second image itself and exposure-adjusted copies of the second image.
In a possible implementation of the first aspect, the image processing method further includes:
adjusting the exposure value based on the second image to obtain at least 2 third images with different exposure values includes:
performing gamma correction on the second image to obtain at least 1 image whose exposure value differs from that of the second image. That is, the exposure-value adjustment of the second image is performed through gamma correction.
In a possible implementation of the first aspect, the image processing method further includes:
the at least 2 third images include at least 2 images obtained by adjusting the exposure value of the second image. That is, HDR fusion may be performed among several exposure-adjusted copies of the second image.
In a possible implementation of the first aspect, the image processing method further includes:
adjusting the exposure value based on the second image to obtain at least 2 third images with different exposure values includes:
performing gamma correction on the second image to obtain at least 2 third images whose exposure values differ from each other.
In a possible implementation of the first aspect, the image processing method further includes:
the multi-photo noise reduction of the at least 2 first images includes: performing multi-frame temporal noise reduction on the at least 2 first images. That is, the first images are denoised with a multi-frame temporal noise reduction algorithm.
In a possible implementation of the first aspect, the image processing method further includes:
the method comprises the following steps of carrying out denoising processing based on a plurality of photos on at least 2 first images, and further comprising the following steps: performing multi-frame time domain noise reduction on at least 2 first images to obtain images, and performing single frame noise reduction; single frame noise reduction includes single frame spatial noise reduction. Namely, single-frame noise reduction can be further carried out on the obtained single image.
In a possible implementation of the first aspect, the image processing method further includes:
the exposure-value difference between the at least 2 third images is a preset value. That is, the difference between the adjusted exposure values is set to a specific value to ensure the best-quality high-dynamic-range image.
In a possible implementation of the first aspect, the image processing method further includes:
the first image is a YUV domain based image. Namely, the multi-frame time domain noise reduction is carried out on the YUV image, and the HDR fusion algorithm is carried out the same operation relative to the image based on the RAW domain, so that the memory can be saved.
In a second aspect, an embodiment of the present application provides an image processing method, including:
acquiring at least 2 first images, where the at least 2 first images are of the same content and have the same exposure value; adjusting the exposure value of each first image to obtain, for each first image, at least 2 associated third images with different exposure values; fusing the at least 2 third images associated with each first image, and adjusting the exposure values of the fused images to obtain at least 2 fourth images with the same exposure value, where the fourth images are high-dynamic-range images; and performing multi-photo noise reduction on the at least 2 fourth images to obtain a target image. In other words, several first images with the same exposure value are acquired and HDR fusion is performed first: each first image is exposure-adjusted into at least two third images with different exposure values, the third images of each first image are fused into a high-dynamic-range fourth image, and the exposure values of the fourth images are then aligned so that the condition for noise reduction is met, yielding a sharp high-dynamic-range image.
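The reversed ordering of this second aspect might look like the outline below. The mean-based fusion, the fixed gamma pair, and the absence of an explicit exposure re-alignment step are simplifications for illustration only; none of the names come from the patent.

```python
import numpy as np

def hdr_then_mfnr(first_images, gammas=(0.5, 2.0)):
    """Sketch of the second aspect: per-image HDR fusion first,
    then multi-frame noise reduction across the fused results."""
    fourths = []
    for img in first_images:
        norm = img.astype(np.float32) / 255.0
        # "Third images": EV-shifted variants of this first image.
        thirds = [255.0 * norm ** g for g in gammas]
        # Fuse the variants; a plain mean stands in for HDR fusion.
        # Because every first image gets the same gammas, the fused
        # "fourth images" already share one exposure value here.
        fourths.append(np.mean(thirds, axis=0))
    # Multi-photo noise reduction across the equal-EV fourth images.
    return np.clip(np.mean(fourths, axis=0), 0, 255).astype(np.uint8)
```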
In a possible implementation of the second aspect, the image processing method further includes:
the at least 2 third images associated with each first image include the first image itself and at least 1 image obtained by adjusting the exposure value of that first image. That is, HDR fusion may be performed between the first image itself and its exposure-adjusted copies.
In a possible implementation of the second aspect, the image processing method further includes:
adjusting the exposure value of each first image to obtain at least 2 third images associated with each first image includes: performing gamma correction on each first image to obtain, for each first image, at least 1 image whose exposure value differs from that of the first image.
In a possible implementation of the second aspect, the image processing method further includes:
the at least 2 third images associated with each first image include at least 2 images obtained by adjusting the exposure value of that first image. That is, HDR fusion may be performed among the exposure-adjusted third images.
In a possible implementation of the second aspect, the image processing method further includes:
adjusting the exposure value of each first image to obtain at least 2 third images associated with each first image includes: performing gamma correction on each first image to obtain, for each first image, at least 2 third images whose exposure values differ from each other.
In a possible implementation of the second aspect, the image processing method further includes:
the multi-photo noise reduction of the at least 2 fourth images includes: performing multi-frame temporal noise reduction on the at least 2 fourth images. That is, the fourth images obtained from HDR fusion are temporally denoised across frames to obtain a sharp fourth image.
In a possible implementation of the second aspect, the image processing method further includes:
the multi-photo noise reduction of the at least 2 fourth images further includes: performing single-frame noise reduction on the image obtained by the multi-frame temporal noise reduction, where the single-frame noise reduction includes single-frame spatial noise reduction.
In a possible implementation of the second aspect, the image processing method further includes:
the exposure-value difference between the at least 2 third images is a preset value.
In a possible implementation of the second aspect, the image processing method further includes:
the first images are YUV-domain images.
In a third aspect, an embodiment of the present application provides an image processing method, including:
acquiring M images, where the M images can be divided into N groups based on their exposure values, M and N are integers, M is greater than or equal to 2N, and N is greater than or equal to 2; each group contains at least 2 images, the images within a group have the same exposure value, and the exposure values of the N groups differ from one another; selecting at least one image from each group for HDR fusion to obtain at least two fifth images with the same exposure value, where the fifth images are high-dynamic-range images; and performing multi-image noise reduction on the at least 2 fifth images to obtain a target image. In other words, the M images are grouped by exposure value, with equal-exposure images in the same group, so that after cross-group HDR fusion the at least 2 fifth images share the same exposure value; the at least 2 fifth images are then noise-reduced to obtain a sharp high-dynamic-range image.
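The grouping step of this third aspect can be sketched as follows. The (image, exposure value) input format and the helper names are assumptions; fusing each returned set would yield the equal-exposure fifth images for later noise reduction.

```python
from collections import defaultdict

def group_by_ev(images_with_ev):
    """Group M (image, exposure_value) pairs into N equal-EV groups."""
    groups = defaultdict(list)
    for img, ev in images_with_ev:
        groups[ev].append(img)
    return groups

def cross_group_hdr_sets(groups):
    # Pick the i-th image from every group to form one HDR input set
    # (one image per distinct exposure value); each set, once fused,
    # gives one "fifth image", and all fifth images share an EV.
    n_sets = min(len(v) for v in groups.values())
    evs = sorted(groups)
    return [[groups[ev][i] for ev in evs] for i in range(n_sets)]
```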
In a possible implementation of the third aspect, the image processing method further includes:
the M images are obtained by adjusting the exposure values of K images, where K is smaller than M. That is, the exposure parameters of the K images are adjusted to produce images with known exposure values, after which the M images can be grouped by exposure value.
In a possible implementation of the third aspect, the image processing method further includes:
the M images are obtained by adjusting the exposure values of K images, where K is less than or equal to M/2. When M is twice K, each group contains two images with the same exposure value.
In a possible implementation of the third aspect, the image processing method further includes:
the multi-image noise reduction includes: performing multi-frame temporal noise reduction on the at least two fifth images to obtain the target image.
In a possible implementation of the third aspect, the image processing method further includes:
the multi-image noise reduction further includes: performing single-frame noise reduction on the image obtained by the multi-frame noise reduction of the at least two fifth images, where the single-frame noise reduction includes single-frame spatial noise reduction.
In a possible implementation of the third aspect, the image processing method further includes:
the images are YUV-domain images.
In a fourth aspect, an embodiment of the present application provides an image processing method, including:
acquiring M images, where the M images can be divided into N groups based on their exposure values, each group contains at least 2 images, the exposure values of the images within a group differ from one another, M and N are integers, M is greater than or equal to 2N, and N is greater than or equal to 2; selecting at least two images from each of the N groups for fusion, and adjusting the exposure values of the fused images to obtain N fifth images with the same exposure value, where the fifth images are high-dynamic-range images; and performing multi-image noise reduction on at least two of the N fifth images to obtain a target image. In other words, the M images are divided into N groups by exposure value, with the exposure values within each group differing from one another; at least two images are selected from each group for HDR fusion, giving N fused images, and the exposure values of the N images are then aligned so that the condition for noise reduction is met, yielding a sharp high-dynamic-range image.
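A sketch of this fourth-aspect flow, fusing within each mixed-exposure group and then bringing the N results to a common exposure value. The mean fusion and the gamma-based alignment are illustrative placeholders, not the patented algorithms.

```python
import numpy as np

def fuse_within_groups(groups, gamma_norm=1.0):
    """Each group holds images with mutually different EVs; fuse
    inside each group, then align the N fused results to a common
    exposure value (here via a shared gamma, as an assumption)."""
    fifths = []
    for grp in groups:
        # Within-group HDR fusion, approximated by a plain mean.
        fused = np.mean([g.astype(np.float32) for g in grp], axis=0)
        # EV alignment step so all fifth images share one exposure.
        norm = np.clip(fused / 255.0, 0.0, 1.0)
        fifths.append(255.0 * norm ** gamma_norm)
    return fifths
```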
In a possible implementation of the fourth aspect, the image processing method further includes:
the M images are obtained by adjusting the exposure values of K images, where K is smaller than M.
In a possible implementation of the fourth aspect, the image processing method further includes:
the M images are obtained by adjusting the exposure values of K images, where K is less than or equal to M/2.
In a possible implementation of the fourth aspect, the image processing method further includes:
the multi-image noise reduction includes: performing multi-frame temporal noise reduction on the at least two fifth images to obtain the target image.
In a possible implementation of the fourth aspect, the image processing method further includes:
the multi-image noise reduction further includes: performing single-frame noise reduction on the image obtained by the multi-frame noise reduction of the at least two fifth images, where the single-frame noise reduction includes single-frame spatial noise reduction.
In a possible implementation of the fourth aspect, the image processing method further includes:
the images are YUV-domain images.
In a fifth aspect, an embodiment of the present application provides a terminal-based photographing method, including:
the terminal detects the illumination environment of a photographing object;
when the lighting environment of the photographic subject is a backlit environment, acquiring three identically exposed YUV images of the subject; and performing image processing on the three identically exposed YUV images by the method of any one of claims 1 to 8 to obtain a high-dynamic-range image of the subject. In other words, when a user photographs with the terminal, the terminal detects the lighting environment of the area where the subject is located; if the scene is backlit, it captures three YUV images of the same subject content. The image processing method can then be applied to the three YUV images: multi-frame temporal noise reduction and single-frame spatial noise reduction are first performed on the three equally exposed YUV images, the exposure value of the denoised YUV image is then adjusted through gamma correction to obtain at least two YUV images with different exposure values, and HDR fusion of those images finally yields a sharp high-dynamic-range image.
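The patent says only that the terminal detects a backlit scene, not how. One plausible heuristic, shown purely as an assumption, flags a scene as backlit when the luma (Y) histogram is strongly bimodal, i.e. both very bright and very dark pixels occupy a sizeable fraction of the frame.

```python
import numpy as np

def looks_backlit(y_plane, bright_thr=220, dark_thr=40, frac=0.25):
    """Hypothetical backlight detector: backlit scenes tend to pair
    a blown-out background with an underexposed subject."""
    y = np.asarray(y_plane, dtype=np.float32)
    bright = np.mean(y >= bright_thr)  # fraction of near-white pixels
    dark = np.mean(y <= dark_thr)      # fraction of near-black pixels
    return bool(bright >= frac and dark >= frac)
```

The thresholds and fractions here are arbitrary; a real pipeline would tune them (or use exposure metadata) per device.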
In a sixth aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is configured to acquire at least 2 first images, where the at least 2 first images are of the same content and have the same exposure value;
the noise reduction module is configured to perform multi-photo noise reduction on the at least 2 first images to obtain at least 1 second image; and
the fusion module is configured to adjust the exposure value based on the second image to obtain at least 2 third images with different exposure values, and to fuse the at least 2 third images to obtain a target image, where the target image is a high-dynamic-range image.
In a seventh aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is configured to acquire at least 2 first images, where the at least 2 first images are of the same content and have the same exposure value;
the fusion module is configured to adjust the exposure value of each first image to obtain at least 2 associated third images with different exposure values, to fuse the at least 2 third images associated with each first image, and to adjust the exposure values of the fused images to obtain at least 2 fourth images with the same exposure value, where the fourth images are high-dynamic-range images; and
the noise reduction module is configured to perform multi-photo noise reduction on the at least 2 fourth images to obtain a target image.
In an eighth aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is configured to acquire M images, where the M images can be divided into N groups based on their exposure values, M and N are integers, M is greater than or equal to 2N, and N is greater than or equal to 2; each group contains at least 2 images, the images within a group have the same exposure value, and the exposure values of the N groups differ from one another;
the fusion module is configured to select at least one image from each group for HDR fusion to obtain at least two fifth images with the same exposure value, where the fifth images are high-dynamic-range images; and
the noise reduction module is configured to perform multi-image noise reduction on the at least 2 fifth images to obtain a target image.
In a ninth aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is configured to acquire M images, where the M images can be divided into N groups based on their exposure values, each group contains at least 2 images, the exposure values of the images within a group differ from one another, M and N are integers, M is greater than or equal to 2N, and N is greater than or equal to 2;
the fusion module is configured to select at least two images from each of the N groups for fusion and to adjust the exposure values of the fused images to obtain N fifth images with the same exposure value, where the fifth images are high-dynamic-range images; and
the noise reduction module is configured to perform multi-image noise reduction on at least two of the N fifth images to obtain a target image.
In a tenth aspect, an embodiment of the present application provides a terminal, including:
the detection unit is configured to detect the lighting environment of the target photographic subject;
the image acquisition unit is configured to acquire three identically exposed YUV images of the subject when the lighting environment of the area where the subject is located is a backlit environment; and
the image processing unit is configured to perform image processing on the three identically exposed YUV images by the method of any one of claims 1 to 8 to obtain a high-dynamic-range image of the subject.
In an eleventh aspect, an embodiment of the present application provides a machine-readable medium having stored thereon instructions which, when executed on a machine, cause the machine to perform the method of any one of claims 1 to 25.
In a twelfth aspect, an embodiment of the present application provides a system, including: a memory for storing instructions to be executed by one or more processors of the system; and a processor, being one of the processors of the system, for performing the method of any one of claims 1 to 25.
Drawings
Fig. 1 illustrates a system framework diagram of a handset, according to some embodiments of the present application.
Fig. 2 illustrates a usage scenario of a cell phone, according to some embodiments of the present application.
FIG. 3 illustrates three images collected by a camera module with the same exposure value, according to some embodiments of the present application.
Fig. 4 illustrates a structure of a computer vision module in the handset system of fig. 1, in accordance with some embodiments of the present application.
FIG. 5 illustrates an algorithm flow for multi-frame time-domain noise reduction, according to some embodiments of the present application.
FIG. 6 illustrates a gamma correction algorithm schematic, according to some embodiments of the present application.
Fig. 7(a) shows an original image without gamma correction, and Figs. 7(b) and 7(c) show the image corrected with different gamma values, according to some embodiments of the present application.
Fig. 8 illustrates a schematic diagram of an HDR algorithm, according to some embodiments of the present application.
FIG. 9 illustrates a flow diagram of a method of image processing, according to some embodiments of the present application.
FIG. 10 illustrates a flow diagram of a method of image processing, according to some embodiments of the present application.
FIG. 11 illustrates a flow diagram of a method of image processing, according to some embodiments of the present application.
FIG. 12 illustrates a flow diagram of a method of image processing, according to some embodiments of the present application.
Fig. 13 illustrates a flow chart for mobile phone photography, according to some embodiments of the present application.
Figs. 14-17 illustrate image processing apparatuses according to some embodiments of the present application.
Fig. 18 illustrates a terminal according to some embodiments of the present application.
Fig. 19 illustrates a block diagram of a system, according to some embodiments of the present application.
Fig. 20 illustrates a block diagram of a system on a chip (SoC), according to some embodiments of the present application.
Detailed Description
Illustrative embodiments of the present application include, but are not limited to, an image processing method, a terminal photographing method, and a medium and system.
It is to be appreciated that as used herein, the term module may refer to or include an Application Specific Integrated Circuit (ASIC), an electronic circuit, a processor (shared, dedicated, or group) and/or memory that executes one or more software or firmware programs, a combinational logic circuit, and/or other suitable hardware components that provide the described functionality, or may be part of such hardware components.
Embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It is to be understood that some embodiments of the present application disclose an image processing method in which, combining the MFNR (multi-frame noise reduction) and HDR (high dynamic range) algorithms, the images to be processed are handled according to their exposure values in one of two orders: (1) multi-frame noise reduction first, then HDR fusion; or (2) HDR fusion first, then multi-frame noise reduction. Either order yields a sharp high-dynamic-range image. The method can also operate on YUV-domain images, which saves memory.
An application scenario of the image processing method of the present application, namely a photographing technique of a terminal, is described below.
Specifically, the photographing method and the image processing method in the embodiments of the present invention may be used in a terminal with a photographing function, for example, a mobile phone, a tablet computer, or a notebook computer, as well as in other devices with a photographing function such as a digital camera or a video camera. The terminal determines whether the current scene is a backlit scene based on a backlight detection technique. The backlight detection technique may be, but is not limited to, judging whether the current scene is backlit from the brightness value of the photographic target in the scene; the embodiments of the present invention are not limited in this respect.
It is to be appreciated that in various embodiments of the present application, the processor may be a microprocessor, a digital signal processor, a microcontroller, or the like, and/or any combination thereof. According to another aspect, the processor may be a single-core processor, a multi-core processor, the like, and/or any combination thereof.
Some embodiments of the present application disclose a mobile phone system. Fig. 1 shows a block diagram of this system; it is understood that the framework also applies to other terminals and is not limited to mobile phones. As shown in fig. 1, the handset system 10 includes a software system 110 and a hardware system 120. The hardware system 120 includes a camera module 121, a sensor module 122, and a display screen 123; the software system 110 includes an operating system 111 and a computer vision module 113 at an application layer 112. The operating system 111 is a computer program integrated in the terminal that manages the hardware and software resources of the terminal device. The application layer 112 consists of programs with specific functions that run on top of the operating system 111. The camera module 121 captures video or image information, such as images taken by the user. The sensor module 122 detects the illumination of the current photographing environment, for example detecting that the current environment is a backlit one.
It is to be understood that the illustrated structure of the embodiment of the present invention is not intended to limit the handset 10. In other embodiments of the present application, the handset 10 may include more or fewer components than shown, or some components may be combined, some components may be separated, or a different arrangement of components may be used. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
As shown in fig. 1, the camera module 121 is used to capture still images or video. The object generates an optical image through the lens, which is projected onto the photosensitive element. The photosensitive element converts the optical signal into an electrical signal, which is then passed to an ISP (image signal processor) to be converted into a digital image signal. The mobile phone 10 implements its shooting function through the ISP, the camera module 121, a video codec, a GPU (graphics processing unit), the display screen 123, an application processor, and the like.
The sensor module 122 may include a proximity light sensor, a pressure sensor, a gyroscope sensor, an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, and the like.
The display screen 123 is used for displaying a human-computer interaction interface, images, videos, and the like. The display screen 123 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini-LED, a Micro-LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like.
According to some embodiments of the present application, fig. 2 illustrates a scene in which a user takes a picture with a mobile phone against backlight. In this case, the mobile phone 10 can process the captured image through an image processing technique. The specific method comprises the following steps:
when the user taps to take a picture and the sensor module 122 of the mobile phone 10 detects that the shooting scene is backlit, the camera 1211 captures three same-exposure dark-light YUV images as input (see fig. 3).
The three same-exposure dark-light input images captured by the front camera 1211 are then processed by the computer vision module 113.
As shown in fig. 4, the computer vision module 113 includes an image denoising unit 1131, an image exposure value adjusting unit 1132, and an HDR fusion unit 1133. The computer vision module 113 performs the following processing procedures through the image denoising unit 1131, the image exposure value adjusting unit 1132, and the HDR fusion unit 1133:
(1) The image denoising unit 1131 performs multi-frame temporal noise reduction and single-frame spatial noise reduction on the three captured same-exposure dark-light YUV images (see fig. 3).
Multi-frame temporal noise reduction is one of the most widely used temporal noise reduction algorithms. Its principle is that noise appears at different pixel locations in different photos, so a clean, low-noise dim-light photo can be synthesized by weighted combination. Fig. 5 shows the general flow of the multi-frame noise reduction algorithm:
First, the multiple same-exposure dark-light input images are registered, based on either an optical flow method (point-by-point motion estimation) or a block matching method (the motion estimation commonly used in video compression). Then, temporal fusion is performed on the registered images. Because continuously shot images are strongly correlated in the time domain, even a weak signal is reinforced when the images are spatially aligned and accumulated along the time axis, while the noise distribution tends toward a zero mean, thereby removing the noise.
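The temporal-fusion step above can be sketched in Python. This is a hedged illustration: the function and variable names are assumptions, registration is taken as already done, and a plain mean stands in for the weighted synthesis.

```python
import numpy as np

def temporal_fuse(frames):
    # Average a stack of registered same-exposure frames: zero-mean noise
    # cancels as frames accumulate, while the shared signal is preserved.
    stack = np.stack([f.astype(np.float64) for f in frames], axis=0)
    return stack.mean(axis=0)

# Synthetic demonstration: one clean signal plus independent noise per frame.
rng = np.random.default_rng(0)
clean = np.full((64, 64), 100.0)
frames = [clean + rng.normal(0.0, 10.0, clean.shape) for _ in range(3)]

fused = temporal_fuse(frames)
single_err = np.abs(frames[0] - clean).mean()  # error of one noisy frame
fused_err = np.abs(fused - clean).mean()       # error after temporal fusion
```

Averaging n registered frames reduces the standard deviation of zero-mean noise by roughly a factor of the square root of n, which is why `fused_err` comes out smaller than `single_err`.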
(2) The image exposure value adjusting unit 1132 obtains a short exposure frame, a middle exposure frame, and a long exposure frame of the noise-reduced image by performing gamma correction (see fig. 6) on the noise-reduced image.
As shown in fig. 6, gamma correction, also called gamma nonlinearity or gamma encoding, is a nonlinear operation (or its inverse) applied to the luminance or tristimulus values of light in a film or image system. In the simplest case, gamma correction is defined by the following equation:
V_out = A · V_in^γ
Where A is a constant and both the input and output values are non-negative real values. In the common case A = 1, the input and output values range from 0 to 1. A gamma value γ < 1 is sometimes called an encoding gamma, and the process of applying this power law is called gamma compression; conversely, a gamma value γ > 1 is sometimes called a decoding gamma, and the corresponding operation is called gamma expansion.
As shown in fig. 7(a), the image is not subjected to gamma correction, while fig. 7(b) and (c) show the image with γ = 2.2 and γ = 1/2.2, respectively, i.e., after gamma correction. In the method of the invention, images whose exposure values differ by specific amounts can be obtained by choosing different values of γ. For example, when HDR fusion is performed between two images, γ may be taken as k or 1/k to obtain a second image whose exposure value differs from the original, and HDR fusion is then performed between the original image and the γ = 1/k or γ = k image; or, when three images are to be fused, γ is taken as k and as 1/k to obtain two images whose exposure values differ from the original, and HDR fusion is then performed on the three images.
Further, by setting the difference between the exposure values of the images to a fixed value, for example 0.5, the exposure values of the images may be [ev−0.5, ev, ev+0.5].
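A minimal sketch of this gamma-based exposure bracketing, assuming normalized [0, 1] pixel values and a hypothetical k = 2.2 (the function name and k are assumptions, not values from the patent):

```python
import numpy as np

def gamma_correct(v_in, gamma, a=1.0):
    # V_out = A * V_in ** gamma, with values normalized to [0, 1].
    return a * np.power(v_in, gamma)

k = 2.2
# Stand-in for the noise-reduced frame.
mid_frame = np.linspace(0.05, 0.95, 100).reshape(10, 10)
# gamma = k darkens the frame (a pseudo short exposure);
# gamma = 1/k brightens it (a pseudo long exposure).
short_frame = gamma_correct(mid_frame, k)
long_frame = gamma_correct(mid_frame, 1.0 / k)
```

Because every normalized value lies in (0, 1), raising it to a power greater than 1 darkens it and a power less than 1 brightens it, which yields the short/medium/long bracket from a single denoised frame.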
(3) The HDR fusion unit 1133 performs HDR fusion on the short exposure frame, the intermediate exposure frame, and the long exposure frame to generate a high dynamic range image.
The HDR algorithm may use the Median Threshold Bitmap (MTB) method to quickly register multiple images at different exposures. The MTB bitmap is defined as follows: the median of the image pixels is determined, then pixels above the median are set to 255 and pixels below it are set to zero. Since the MTB does not change with exposure time, image registration can be performed without knowing the exposure times of the original images.
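The MTB construction can be sketched as follows (a hedged illustration: the function name is an assumption, and the threshold is taken at the median, matching the method's name):

```python
import numpy as np

def median_threshold_bitmap(img):
    # Binarize at the median: pixels above it become 255, the rest 0.
    # Because this bitmap is largely invariant to exposure changes, the
    # bitmaps of differently exposed frames can be aligned directly.
    m = np.median(img)
    return np.where(img > m, 255, 0).astype(np.uint8)

img = np.arange(16, dtype=np.uint8).reshape(4, 4)
bitmap = median_threshold_bitmap(img)  # median is 7.5, so 8 pixels exceed it
```

Registration then shifts the binary bitmaps against each other and picks the offset minimizing their XOR count, which is far cheaper than comparing full-intensity images.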
As shown in the HDR algorithm flow diagram of fig. 8, at least two of the three images in fig. 7(a), (b), and (c) are selected; a Laplacian pyramid is constructed for the images and a Gaussian pyramid for the exposure weights, the Laplacian pyramid levels are then weighted according to the Gaussian weight pyramid so that the best-exposed portions dominate, and a high-dynamic-range image is generated after the pyramid is reconstructed.
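The fusion can be illustrated, under stated simplifications, by the per-pixel core of exposure fusion. A real implementation (e.g., Mertens-style) blends each Laplacian pyramid level with Gaussian-pyramid weights to avoid seams; this sketch applies the normalized well-exposedness weights directly, and all names are assumptions:

```python
import numpy as np

def well_exposedness(img, sigma=0.2):
    # Weight each pixel by how close it is to mid-gray 0.5.
    return np.exp(-((img - 0.5) ** 2) / (2.0 * sigma ** 2))

def exposure_fuse(images):
    # Normalized well-exposedness weights select the best-exposed parts
    # of each differently exposed frame (values in [0, 1]).
    weights = np.stack([well_exposedness(im) for im in images])
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    return (weights * np.stack(images)).sum(axis=0)

short = np.linspace(0.0, 0.4, 64).reshape(8, 8)  # underexposed frame
mid = np.clip(short + 0.3, 0.0, 1.0)             # mid-exposure frame
long_ = np.clip(short + 0.6, 0.0, 1.0)           # overexposed frame
fused = exposure_fuse([short, mid, long_])
```

Since the result is a convex combination of the inputs, the fused image stays within the [0, 1] range while drawing shadows from the long exposure and highlights from the short one.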
Based on the above description of the photographing mode of the mobile phone 10, a specific flow of the image processing method of the present application is described below. The relevant details in the above description still apply to this flow and are not repeated here. Specifically, as shown in fig. 9, the image processing method of the present application includes:
(1) An image to be processed is acquired (901). The image to be processed may be three same-exposure dark-light YUV images.
(2) Noise reduction processing is performed on the image to be processed (902). For example, multi-frame temporal noise reduction (see fig. 5) is performed on the YUV images to be processed, followed by single-frame spatial noise reduction, to obtain a clearer image.
(3) HDR fusion is performed on the noise-reduced image (903). For example, gamma correction is performed on the noise-reduced YUV image (see fig. 7) to obtain its short, medium, and long exposure frames, and HDR fusion of these frames (see fig. 8) yields a clear high-dynamic-range version of the YUV image to be processed.
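Steps 901 to 903 can be sketched end to end (a hedged composite under assumed simplifications: frames are pre-registered, denoising is a plain temporal mean, the bracket uses a hypothetical k = 2.2, and fusion is a well-exposedness-weighted average rather than a pyramid blend):

```python
import numpy as np

def process(frames, k=2.2):
    # Fig. 9 flow in miniature: temporal denoise (902), then gamma
    # bracketing and fusion (903), on frames normalized to [0, 1].
    denoised = np.stack(frames).mean(axis=0)
    bracket = [denoised ** k, denoised, denoised ** (1.0 / k)]
    weights = np.stack([np.exp(-((b - 0.5) ** 2) / 0.08) for b in bracket])
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    return (weights * np.stack(bracket)).sum(axis=0)

rng = np.random.default_rng(1)
base = np.full((16, 16), 0.2)  # stand-in for a dim backlit scene
frames = [np.clip(base + rng.normal(0.0, 0.02, base.shape), 0.0, 1.0)
          for _ in range(3)]
result = process(frames)
```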
Further, in other embodiments, the image processing method shown in fig. 10 may be further adopted for processing, and specifically includes:
(1) An image to be processed is acquired (1001). The image to be processed may be two same-exposure dark-light YUV images.
(2) Exposure value adjustment is performed on each image to be processed (1002). For example, each image to be processed is gamma-corrected to obtain one image with a different exposure value, or two images with different exposure values.
(3) HDR fusion is performed on the images obtained after the exposure value adjustment (1003). When each image to be processed yields only one image whose exposure value differs from the original, HDR fusion is performed between each original image and its gamma-corrected counterpart, giving two fused high-dynamic-range images. When the exposure value adjustment yields two images with different exposure values for each image to be processed, HDR fusion is performed on those two images, again giving two fused high-dynamic-range images.
(4) Noise reduction processing is performed on the HDR-fused images (1004). Multi-frame temporal noise reduction and single-frame spatial noise reduction are performed on the two high-dynamic-range images obtained by HDR fusion to obtain a clear high-dynamic-range image.
Further, in other embodiments, the image processing method shown in fig. 11 may be further adopted to perform processing, which specifically includes:
(1) M images to be processed are acquired (1101). For example, exposure value adjustment is performed on 2 images to be processed so that each yields two images with different exposure values, giving M = 4 images.
(2) The M images to be processed are divided into N groups according to exposure values (1102). For example, the 4 images are divided into two groups A1[a1, b1] and B1[a2, b2] according to exposure values, where the two images within each group have the same exposure value and the exposure values of groups A1 and B1 differ.
(3) At least one image is selected from each group for HDR fusion (1103). For example, images a1 and a2 are selected from groups A1 and B1 and fused, and likewise b1 and b2, obtaining fused images C1[c1, c2], where c1 and c2 have the same exposure value.
(4) Noise reduction processing is performed on the N images after HDR fusion (1104). For example, the images c1 and c2 are subjected to multi-frame temporal and single-frame spatial noise reduction, thereby obtaining a clear high-dynamic-range image.
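The grouping of step 1102 can be sketched as a simple keyed partition; the exposure values and image identifiers below are hypothetical stand-ins for real frames:

```python
from collections import defaultdict

def group_by_exposure(records):
    # Group (exposure_value, image_id) records so that images sharing an
    # exposure value fall into the same group.
    groups = defaultdict(list)
    for ev, image_id in records:
        groups[ev].append(image_id)
    return dict(groups)

# Two originals, each gamma-adjusted to two exposure values.
records = [(-0.5, "a1"), (0.5, "a2"), (-0.5, "b1"), (0.5, "b2")]
groups = group_by_exposure(records)
```

Fusion then takes one member from each resulting group, so every fused output combines frames spanning all N exposure values.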
Further, in other embodiments, the image processing method shown in fig. 12 may be further adopted for processing, which specifically includes:
(1) M images to be processed are acquired (1201). For example, exposure value adjustment is performed on 2 images to be processed to obtain 4 images in total.
(2) The M images to be processed are divided into N groups according to exposure values (1202). For example, the 4 images are divided into two groups A2[a1, b1] and B2[a2, b2], where the exposure values of the images within each group differ from each other; or the 4 photos are divided into A2[a1, b1] and B2[a1, b1].
(3) At least two images are selected from each group for HDR fusion (1203). For example, HDR fusion is performed on the images in groups A2 and B2 to obtain 2 fused images C3[c1, c2] or C3[c1, c1].
(4) Noise reduction processing is performed on the N images after HDR fusion (1204). For example, the exposure values of the images C3[c1, c2] are adjusted to satisfy the condition for multi-frame temporal noise reduction, after which multi-frame temporal and single-frame spatial noise reduction are performed on c1 and c2; or multi-frame temporal and single-frame spatial noise reduction are performed directly on C3[c1, c1], to obtain a clear high-dynamic-range image.
In addition, as shown in fig. 13, an embodiment of the present application also provides a flowchart of a terminal photographing method, which specifically includes:
The user taps to take a picture (1301). The terminal then detects whether the current illumination environment is indoor backlight (1302); if not, the image is not processed further. If the scene is an indoor backlight scene, the terminal captures three same-exposure dark-light YUV images (1303), performs multi-frame temporal noise reduction on them (1304), and performs single-frame spatial noise reduction on the result (1305) to obtain a clear image. Gamma correction (1306) is then applied to the noise-reduced image to obtain its short, medium, and long exposure frames, and HDR fusion (1307) of these frames generates a clear high-dynamic-range image.
It should be noted that although the image processing method and the terminal photographing method of the embodiments of the present application are described with YUV images as a specific implementation, those skilled in the art will understand that pictures in RAW format are equally applicable to the present invention.
Figs. 14 to 17 show schematic structural diagrams of image processing apparatuses corresponding to the image processing methods. It can be understood that the specific technical details of the image processing methods also apply to these apparatuses and are not repeated here to avoid repetition.
As shown in fig. 14, the image processing apparatus includes:
an obtaining module 1401, configured to obtain at least 2 first images, where the at least 2 first images are based on the same content, and exposure values of the at least 2 first images are all the same;
a denoising module 1402, configured to perform denoising processing based on multiple photos on the at least 2 first images to obtain at least 1 second image;
a fusion module 1403, configured to perform exposure value adjustment on the basis of the second image to obtain at least 2 third images with different exposure values; and fusing the at least 2 third images to obtain a target image, wherein the target image is a high dynamic range image.
In addition, as another embodiment shown in fig. 15, an image processing apparatus includes:
an acquiring module 1501, configured to acquire at least 2 first images, where the at least 2 first images are based on the same content, and exposure values of the at least 2 first images are all the same;
a fusion module 1502, configured to perform exposure value adjustment on each of the first images to obtain at least 2 third images with different exposure values associated with each of the first images; at least 2 third images related to each first image are fused, and exposure value adjustment is carried out on the fused images to obtain at least 2 fourth images with the same exposure value; the fourth image is a high dynamic range image;
and the noise reduction module 1503 is configured to perform denoising processing based on multiple photos on the at least 2 fourth images to obtain a target image.
In addition, as another embodiment shown in fig. 16, an image processing apparatus includes:
an obtaining module 1601, configured to obtain M images, where the M images can be divided into N groups based on exposure values of the images, where M and N are both integers, and M is greater than or equal to 2N, and N is greater than or equal to 2; each group comprises at least 2 images, wherein the exposure values of the images in each group are the same, and the exposure values among the N groups of images are different from each other;
a fusion module 1602, configured to select at least one image from each group to perform HDR fusion, so as to obtain at least two fifth images with the same exposure value, where the fifth images are high dynamic range images;
a denoising module 1603, configured to perform denoising processing based on multiple images on the at least 2 fifth images to obtain a target image.
In addition, as another embodiment shown in fig. 17, an image processing apparatus includes:
an obtaining module 1701 for obtaining M images, wherein the M images can be divided into N groups based on exposure values of the images, each group comprises at least 2 images, and the exposure values of the images in each group are different from each other, wherein M and N are integers, M is greater than or equal to 2N, and N is greater than or equal to 2;
a fusion module 1702, configured to select at least two images from each of the N groups for fusion processing, and perform exposure value adjustment on the fused images to obtain N fifth images with the same exposure value, where the fifth images are high dynamic range images;
and a denoising module 1703, configured to perform denoising processing based on multiple images on at least two of the N fifth images to obtain a target image.
Fig. 18 shows a schematic structural diagram of a terminal corresponding to the terminal photographing method. It can be understood that the specific technical details of the terminal photographing method also apply to this apparatus and are not repeated here to avoid repetition.
Specifically, as shown in fig. 18, the terminal includes:
a detection unit 1801, configured to detect an illumination environment of a photographic subject by a terminal;
an image acquisition unit 1802, configured to acquire three co-exposure YUV images of the photographic subject when an illumination environment in which the photographic subject is located is a backlight environment;
an image processing unit 1803, which performs image processing on the three YUV images with the same exposure by using any one of the methods of claims 1 to 8, to obtain a high dynamic range image of the photographic subject.
Referring now to FIG. 19, shown is a block diagram of a system 1900 in accordance with one embodiment of the present application. Fig. 19 schematically illustrates an example system 1900 in accordance with various embodiments. In one embodiment, the system 1900 may include one or more processors 1904, system control logic 1908 coupled to at least one of the processors 1904, system memory 1912 coupled to the system control logic 1908, non-volatile memory (NVM) 1919 coupled to the system control logic 1908, and a network interface 1920 coupled to the system control logic 1908.
In some embodiments, the processor 1904 may include one or more single-core or multi-core processors. In some embodiments, the processor 1904 may include any combination of general-purpose processors and dedicated processors (e.g., graphics processors, application processors, baseband processors, etc.). In embodiments where system 1900 employs eNB (enhanced Node B) 101 or RAN (Radio Access Network) controller 102, the processor 1904 may be configured to execute the methods of the various embodiments.
In some embodiments, system control logic 1908 may include any suitable interface controllers to provide any suitable interface to at least one of processors 1904 and/or any suitable device or component in communication with system control logic 1908.
In some embodiments, system control logic 1908 may include one or more memory controllers to provide an interface to system memory 1912. System memory 1912 may be used to load and store data and/or instructions. Memory 1912 of system 1900 may include any suitable volatile memory, such as suitable Dynamic Random Access Memory (DRAM), in some embodiments.
NVM/memory 1919 may include one or more tangible, non-transitory computer-readable media for storing data and/or instructions. In some embodiments, the NVM/memory 1919 may include any suitable non-volatile memory, such as flash memory, and/or any suitable non-volatile storage device, such as at least one of a HDD (Hard Disk Drive), CD (Compact Disc) Drive, DVD (Digital Versatile Disc) Drive.
The NVM/memory 1919 may include a portion of the storage resources on the device on which the system 1900 is installed, or it may be accessible by, but not necessarily a part of, the device. The NVM/storage 1919 may be accessed over a network, for example, via a network interface 1920.
Network interface 1920 may include a transceiver to provide a radio interface for system 1900 to communicate with any other suitable devices (e.g., a front-end module, an antenna, etc.) over one or more networks. In some embodiments, network interface 1920 may be integrated with other components of system 1900. For example, the network interface 1920 may be integrated with at least one of the processor 1904, the system memory 1912, the NVM/storage 1919, and a firmware device (not shown) having instructions that, when executed by at least one of the processors 1904, cause the system 1900 to implement the methods shown in figs. 9-12.
Network interface 1920 may further include any suitable hardware and/or firmware to provide a multiple-input multiple-output radio interface. For example, network interface 1920 may be a network adapter, a wireless network adapter, a telephone modem, and/or a wireless modem.
In one embodiment, at least one of the processors 1904 may be packaged together with logic for one or more controllers of system control logic 1908 to form a System In Package (SiP). In one embodiment, at least one of the processors 1904 may be integrated on the same die with logic for one or more controllers of system control logic 1908 to form a system on a chip (SoC).
The system 1900 may further include: an input/output (I/O) device 1932. The I/O device 1932 may include a user interface to enable a user to interact with the system 1900, and a peripheral component interface designed to enable peripheral components to interact with the system 1900 as well. In some embodiments, the system 1900 further includes sensors for determining at least one of environmental conditions and location information associated with the system 1900.
In some embodiments, the user interface may include, but is not limited to, a display (e.g., a liquid crystal display, a touch screen display, etc.), a speaker, a microphone, one or more cameras (e.g., still image cameras and/or video cameras), a flashlight (e.g., a light emitting diode flash), and a keyboard.
In some embodiments, the peripheral component interfaces may include, but are not limited to, a non-volatile memory port, an audio jack, and a power interface.
In some embodiments, the sensors may include, but are not limited to, a gyroscope sensor, an accelerometer, a proximity sensor, an ambient light sensor, and a positioning unit. The positioning unit may also be part of network interface 1920 or interact with network interface 1920 to communicate with components of a positioning network, such as Global Positioning System (GPS) satellites.
Fig. 20 shows a block diagram of a SoC (System on Chip) 2000, according to an embodiment of the present application. In fig. 20, like parts have the same reference numerals, and dashed boxes denote optional features of more advanced SoCs. In fig. 20, the SoC 2000 includes: an interconnect unit 2050 coupled to the application processor 2020; a system agent unit 2070; a bus controller unit 2080; an integrated memory controller unit 2040; a set of one or more coprocessors 2020, which may include integrated graphics logic, an image processor, an audio processor, and a video processor; a static random access memory (SRAM) unit 2030; and a direct memory access (DMA) unit 2060. In one embodiment, the coprocessor 2020 includes a special-purpose processor, such as a network or communication processor, a compression engine, a GPGPU, a high-throughput MIC processor, an embedded processor, or the like.
Embodiments of the mechanisms disclosed herein may be implemented in hardware, software, firmware, or a combination of these implementations. Embodiments of the application may be implemented as computer programs or program code executing on programmable systems comprising at least one processor, a storage system (including volatile and non-volatile memory and/or storage elements), at least one input device, and at least one output device.
Program code may be applied to input instructions to perform the functions described herein and generate output information. The output information may be applied to one or more output devices in a known manner. For purposes of this application, a processing system includes any system having a processor such as, for example, a Digital Signal Processor (DSP), a microcontroller, an Application Specific Integrated Circuit (ASIC), or a microprocessor.
The program code may be implemented in a high level procedural or object oriented programming language to communicate with a processing system. The program code can also be implemented in assembly or machine language, if desired. Indeed, the mechanisms described in this application are not limited in scope to any particular programming language. In any case, the language may be a compiled or interpreted language.
In some cases, the disclosed embodiments may be implemented in hardware, firmware, software, or any combination thereof. The disclosed embodiments may also be implemented as instructions carried by or stored on one or more transitory or non-transitory machine-readable (e.g., computer-readable) storage media, which may be read and executed by one or more processors. For example, the instructions may be distributed via a network or via other computer-readable media. Thus, a machine-readable medium may include any mechanism for storing or transmitting information in a form readable by a machine (e.g., a computer), including, but not limited to, floppy diskettes, optical disks, compact disc read-only memories (CD-ROMs), magneto-optical disks, read-only memories (ROMs), random access memories (RAMs), erasable programmable read-only memories (EPROMs), electrically erasable programmable read-only memories (EEPROMs), magnetic or optical cards, flash memory, or a tangible machine-readable medium for transmitting information over the Internet via electrical, optical, acoustical, or other forms of propagated signals (e.g., carrier waves, infrared signals, digital signals, etc.). Thus, a machine-readable medium includes any type of machine-readable medium suitable for storing or transmitting electronic instructions or information in a form readable by a machine (e.g., a computer).
In the drawings, some features of the structures or methods may be shown in a particular arrangement and/or order. However, it is to be understood that such specific arrangement and/or ordering may not be required. Rather, in some embodiments, the features may be arranged in a manner and/or order different from that shown in the illustrative figures. In addition, the inclusion of a structural or methodical feature in a particular figure is not meant to imply that such feature is required in all embodiments, and in some embodiments, may not be included or may be combined with other features.
It should be noted that, in the embodiments of the apparatuses in the present application, each unit/module is a logical unit/module, and physically, one logical unit/module may be one physical unit/module, or may be a part of one physical unit/module, and may also be implemented by a combination of multiple physical units/modules, where the physical implementation manner of the logical unit/module itself is not the most important, and the combination of the functions implemented by the logical unit/module is the key to solve the technical problem provided by the present application. Furthermore, in order to highlight the innovative part of the present application, the above-mentioned device embodiments of the present application do not introduce units/modules which are not so closely related to solve the technical problems presented in the present application, which does not indicate that no other units/modules exist in the above-mentioned device embodiments.
It is noted that, in the examples and descriptions of this patent, relational terms such as first and second are used solely to distinguish one entity or action from another, without necessarily requiring or implying any actual such relationship or order between such entities or actions. Also, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element introduced by the phrase "comprising a" does not exclude the presence of additional identical elements in the process, method, article, or apparatus that comprises the element.
While the present application has been shown and described with reference to certain preferred embodiments thereof, it will be understood by those of ordinary skill in the art that various changes in form and details may be made therein without departing from the spirit and scope of the present application.

Claims (27)

1. An image processing method, comprising:
Acquiring at least 2 first images, wherein the at least 2 first images are based on the same content, and the exposure values of the at least 2 first images are the same;
denoising the at least 2 first images based on a plurality of photos to obtain at least 1 second image;
adjusting the exposure value based on the second image to obtain at least 2 third images with different exposure values; and
and fusing the at least 2 third images to obtain a target image, wherein the target image is a high dynamic range image.
2. The method of claim 1, comprising:
the at least 2 third images include the second image itself and at least 1 image obtained by adjusting the exposure value for the second image.
3. The method of claim 2, comprising:
the exposure value adjustment is carried out based on the second image, and at least 2 third images with different exposure values are obtained, and the method comprises the following steps:
and carrying out gamma correction on the second image to obtain at least 1 image with an exposure value different from that of the second image.
4. The method of claim 1, comprising:
the at least 2 third images include at least 2 images obtained by adjusting exposure values with respect to the second image.
5. The method of claim 4, comprising:
the exposure value adjustment is carried out based on the second image, and at least 2 third images with different exposure values are obtained, and the method comprises the following steps:
and carrying out gamma correction on the second image to obtain at least 2 third images with different exposure values.
6. The method of claim 1, comprising:
the denoising processing of the at least 2 first images based on multiple photos comprises:
and carrying out multi-frame time domain noise reduction on the at least 2 first images.
7. The method of claim 6, comprising:
the denoising processing of the at least 2 first images based on multiple photos comprises:
performing single frame noise reduction on the images obtained by performing multi-frame time domain noise reduction on the at least 2 first images; the single frame noise reduction comprises single frame spatial noise reduction.
8. The method according to any of claims 1-7, wherein the exposure value difference between the at least 2 third images is a preset value.
9. An image processing method, comprising:
acquiring at least 2 first images, wherein the at least 2 first images are based on the same content and have the same exposure value;
adjusting the exposure value of each first image to obtain at least 2 third images associated with each first image and having different exposure values;
fusing the at least 2 third images associated with each first image, and adjusting the exposure values of the fused images to obtain at least 2 fourth images with the same exposure value, wherein the fourth images are high dynamic range images; and
performing noise reduction on the at least 2 fourth images based on multiple photos to obtain a target image.
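Claim 9 reverses the order of claim 1: each equally exposed first image is first expanded into exposure variants and fused into an HDR-like fourth image, and only then are the fourth images denoised across frames. A hedged end-to-end sketch of that ordering (the gamma-based variants and mean-based fusion are illustrative stand-ins for the patent's unspecified operators):

```python
# Claim-9 ordering sketch: per-frame HDR fusion first, then
# multi-frame temporal noise reduction across the fused results.
def exposure_variants(pixels, gammas=(0.5, 1.0, 2.0)):
    """Synthesise brighter/original/darker versions of one frame."""
    return [[p ** g for p in pixels] for g in gammas]

def fuse(variants):
    """Naive HDR fusion: per-pixel mean of the exposure variants."""
    return [sum(px) / len(px) for px in zip(*variants)]

def temporal_denoise(frames):
    """Average co-located pixels across frames."""
    return [sum(px) / len(px) for px in zip(*frames)]

first_images = [[0.25, 0.36], [0.27, 0.34]]      # same scene, same EV
fourth_images = [fuse(exposure_variants(f)) for f in first_images]
target = temporal_denoise(fourth_images)          # claim-9 target image
```

Fusing before denoising means each fourth image already has the extended dynamic range, so the final temporal pass only has to suppress noise, not reconcile differently exposed inputs.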
10. The method according to claim 9, wherein the at least 2 third images associated with each first image comprise the first image itself and at least 1 image obtained by adjusting the exposure value of the first image.
11. The method according to claim 10, wherein the adjusting the exposure value of each first image to obtain at least 2 third images associated with each first image comprises:
performing gamma correction on each first image to obtain, for each first image, at least 1 image with an exposure value different from that of the first image.
12. The method according to claim 9, wherein the at least 2 third images associated with each first image comprise at least 2 images obtained by adjusting the exposure value of the first image.
13. The method according to claim 12, wherein the adjusting the exposure value of each first image to obtain at least 2 third images associated with each first image comprises:
performing gamma correction on each first image to obtain, for each first image, at least 2 third images with different exposure values.
14. The method according to claim 9, wherein the performing noise reduction on the at least 2 fourth images based on multiple photos comprises:
performing multi-frame temporal noise reduction on the at least 2 fourth images.
15. The method according to claim 14, wherein the performing noise reduction on the at least 2 fourth images based on multiple photos further comprises:
performing single-frame noise reduction on the image obtained by the multi-frame temporal noise reduction of the at least 2 fourth images, wherein the single-frame noise reduction comprises single-frame spatial noise reduction.
16. The method according to any one of claims 9 to 15, wherein the exposure value difference between the at least 2 third images is a preset value.
17. An image processing method, comprising:
acquiring M images, wherein the M images can be divided into N groups based on their exposure values, M and N are both integers, M is greater than or equal to 2N, and N is greater than or equal to 2;
each group comprises at least 2 images, the exposure values of the images within each group are the same, and the exposure values of the N groups differ from one another;
selecting at least one image from each group and performing HDR fusion to obtain at least two fifth images with the same exposure value, wherein the fifth images are high dynamic range images; and
performing noise reduction on the at least two fifth images based on multiple images to obtain a target image.
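The first step of claim 17 buckets M captured frames into N groups by exposure value, with every frame in a group sharing one EV and the groups' EVs mutually distinct. A minimal sketch of that grouping, where the `(ev, pixels)` tuple format is my assumption for illustration:

```python
# Group M frames into N exposure-value buckets (claim-17 setup).
from collections import defaultdict

def group_by_ev(frames):
    """frames: list of (exposure_value, pixel_data) tuples."""
    groups = defaultdict(list)
    for ev, pixels in frames:
        groups[ev].append(pixels)
    return dict(groups)

# M = 6 frames at N = 3 distinct EVs, satisfying M >= 2N and N >= 2.
frames = [(-2, "a"), (-2, "b"), (0, "c"), (0, "d"), (2, "e"), (2, "f")]
groups = group_by_ev(frames)
```

With at least 2 frames per EV, the method can pick one frame per group for HDR fusion more than once, yielding the "at least two fifth images" that feed the multi-image noise reduction.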
18. An image processing method, comprising:
acquiring M images, wherein the M images can be divided into N groups based on their exposure values, each group comprises at least 2 images, the exposure values of the images within each group are different from one another, M and N are both integers, M is greater than or equal to 2N, and N is greater than or equal to 2;
selecting at least two images from each of the N groups for fusion processing, and adjusting the exposure values of the fused images to obtain N fifth images with the same exposure value, wherein the fifth images are high dynamic range images; and
performing noise reduction on at least two of the N fifth images based on multiple images to obtain a target image.
19. The method according to claim 17 or 18, wherein the M images are obtained by adjusting exposure values of K images, and K is smaller than M.
20. The method according to claim 19, wherein K is less than or equal to M/2.
21. The method according to claim 17 or 18, wherein the noise reduction based on multiple images comprises:
performing multi-frame temporal noise reduction on the at least two fifth images to obtain the target image.
22. The method according to claim 21, wherein the noise reduction based on multiple images further comprises:
performing single-frame noise reduction on the image obtained by the multi-frame temporal noise reduction of the at least two fifth images, wherein the single-frame noise reduction comprises single-frame spatial noise reduction.
23. The method according to claim 1 or 9, wherein the first images are YUV-domain images.
24. The method according to claim 17 or 18, wherein the M images are YUV-domain images.
25. A terminal-based photographing method, characterized by comprising:
detecting, by the terminal, the illumination environment of a photographic subject;
when the illumination environment of the photographic subject is a backlit environment, acquiring three YUV images of the photographic subject with the same exposure value; and
processing the three equally exposed YUV images using the image processing method of any one of claims 1 to 8 to obtain a high dynamic range image of the photographic subject.
26. A machine-readable medium having stored thereon instructions which, when executed on a machine, cause the machine to perform the method of any one of claims 1 to 25.
27. A system, comprising:
a memory for storing instructions for execution by one or more processors of the system; and
a processor, being one of the one or more processors of the system, for performing the method of any one of claims 1 to 25.
CN201911276779.9A 2019-12-12 2019-12-12 Image processing method, terminal photographing method, medium and system Pending CN112995490A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911276779.9A CN112995490A (en) 2019-12-12 2019-12-12 Image processing method, terminal photographing method, medium and system


Publications (1)

Publication Number Publication Date
CN112995490A 2021-06-18

Family

ID=76332152

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911276779.9A Pending CN112995490A (en) 2019-12-12 2019-12-12 Image processing method, terminal photographing method, medium and system

Country Status (1)

Country Link
CN (1) CN112995490A (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150262341A1 (en) * 2014-03-17 2015-09-17 Qualcomm Incorporated System and method for multi-frame temporal de-noising using image alignment
CN107395983A (en) * 2017-08-24 2017-11-24 维沃移动通信有限公司 A kind of image processing method, mobile terminal and computer-readable recording medium
CN107888840A (en) * 2017-10-30 2018-04-06 广东欧珀移动通信有限公司 High-dynamic-range image acquisition method and device
WO2018072267A1 (en) * 2016-10-17 2018-04-26 华为技术有限公司 Photographing method for terminal, and terminal
CN108322646A (en) * 2018-01-31 2018-07-24 广东欧珀移动通信有限公司 Image processing method, device, storage medium and electronic equipment


Similar Documents

Publication Publication Date Title
US9451173B2 (en) Electronic device and control method of the same
CN110191291B (en) Image processing method and device based on multi-frame images
CN108391060B (en) Image processing method, image processing device and terminal
CN110264420B (en) Image processing method and device based on multi-frame images
CN110166709B (en) Night scene image processing method and device, electronic equipment and storage medium
CN115601244B (en) Image processing method and device and electronic equipment
CN113452898B (en) Photographing method and device
US20240119566A1 (en) Image processing method and apparatus, and electronic device
CN116744120B (en) Image processing method and electronic device
CN105391940B (en) A kind of image recommendation method and device
US20170351932A1 (en) Method, apparatus and computer program product for blur estimation
CN115633262B (en) Image processing method and electronic device
CN113259594A (en) Image processing method and device, computer readable storage medium and terminal
CN116916151B (en) Shooting method, electronic device and storage medium
CN116668862B (en) Image processing method and electronic equipment
WO2023124202A1 (en) Image processing method and electronic device
CN116709042B (en) Image processing method and electronic equipment
CN107451972B (en) Image enhancement method, device and computer readable storage medium
CN115735226B (en) Image processing method and chip
CN112995490A (en) Image processing method, terminal photographing method, medium and system
CN116723417B (en) Image processing method and electronic equipment
CN115767287B (en) Image processing method and electronic equipment
CN116668836B (en) Photographing processing method and electronic equipment
CN116012262B (en) Image processing method, model training method and electronic equipment
CN117710264B (en) Dynamic range calibration method of image and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210618