CN107395983B - Image processing method, mobile terminal and computer readable storage medium - Google Patents

Image processing method, mobile terminal and computer readable storage medium Download PDF

Info

Publication number
CN107395983B
CN107395983B (application CN201710735043.8A)
Authority
CN
China
Prior art keywords
images
group
image
brightening
fusion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710735043.8A
Other languages
Chinese (zh)
Other versions
CN107395983A (en)
Inventor
杨威
寇飞
张华琪
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201710735043.8A priority Critical patent/CN107395983B/en
Publication of CN107395983A publication Critical patent/CN107395983A/en
Application granted granted Critical
Publication of CN107395983B publication Critical patent/CN107395983B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/95: Computational photography systems, e.g. light-field imaging systems
    • H04N23/951: Computational photography systems by using two or more images to influence resolution, frame rate or aspect ratio
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00: Details of television systems
    • H04N5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N5/265: Mixing

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Computing Systems (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)
  • Studio Devices (AREA)

Abstract

The invention relates to the technical field of communications, and provides an image processing method, a mobile terminal, and a computer-readable storage medium. The method comprises the following steps: shooting a target object to acquire a first group of images and a second group of images, wherein the first group of images comprises at least one image, and the second group of images comprises at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and performing image fusion using the third group of images to obtain a target image. The invention can avoid blurring of the captured image caused by movement of the subject or camera shake, and can improve the dynamic range of the image.

Description

Image processing method, mobile terminal and computer readable storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular, to an image processing method, a mobile terminal, and a computer-readable storage medium.
Background
With the development of mobile terminals, the resolution of images captured by mobile terminals has grown steadily, and the mobile terminal has become an essential photographing tool for people on the go. To improve the photographing effect, most current mobile terminals support HDR (High Dynamic Range) imaging. The principle of this technology is that the mobile terminal captures multiple images at different exposure levels and fuses them to obtain a single image with a high dynamic range. However, while the mobile terminal acquires the multiple images at different exposure levels, factors such as movement of the subject or shaking of the mobile terminal may blur the captured images, degrading the output image.
Therefore, when an existing mobile terminal outputs a high-dynamic-range image, the image quality is often poor.
Disclosure of Invention
The embodiment of the invention provides an image processing method, a mobile terminal and a computer readable storage medium, which aim to solve the problem that the image effect is poor when the existing mobile terminal outputs an image with a high dynamic range.
In a first aspect, an embodiment of the present invention provides an image processing method, including:
shooting a target object to acquire a first group of images and a second group of images, wherein the first group of images comprise at least one image, and the second group of images comprise at least one image;
brightening the first group of images and/or the second group of images to obtain a third group of images;
and carrying out image fusion by using the third group of images to obtain a target image.
In a second aspect, an embodiment of the present invention further provides a mobile terminal, including:
a shooting module, configured to shoot a target object to acquire a first group of images and a second group of images, wherein the first group of images comprises at least one image, and the second group of images comprises at least one image;
the brightening module is used for brightening the first group of images and/or the second group of images acquired by the shooting module to acquire a third group of images;
and the fusion module is used for carrying out image fusion by utilizing the third group of images obtained by the brightening module to obtain a target image.
In a third aspect, an embodiment of the present invention further provides a mobile terminal, including: a memory, a processor and a computer program stored on the memory and executable on the processor, the processor implementing the steps in the image processing method as described above when executing the computer program.
In a fourth aspect, the embodiment of the present invention further provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the steps in the image processing method as described above.
In the embodiment of the invention, a target object is shot to obtain a first group of images and a second group of images, wherein the first group of images comprises at least one image, and the second group of images comprises at least one image; the first group of images and/or the second group of images are brightened to obtain a third group of images; and image fusion is performed using the third group of images to obtain a target image, so that blurring of the captured image caused by movement of the subject or camera shake can be avoided, and the dynamic range of the image can be improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed to be used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained according to these drawings without inventive exercise.
FIG. 1 is a flowchart of an image processing method provided by an embodiment of the present invention;
FIG. 2 is a second flowchart of an image processing method according to an embodiment of the present invention;
FIG. 3 is one of the structural diagrams of a mobile terminal according to an embodiment of the present invention;
FIG. 4 is a second structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 5 is one of the structural diagrams of a brightening module in a mobile terminal according to an embodiment of the present invention;
FIG. 6 is a second structural diagram of a brightening module in a mobile terminal according to an embodiment of the present invention;
FIG. 7 is a third structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 8 is a fourth structural diagram of a mobile terminal according to an embodiment of the present invention;
FIG. 9 is a fifth structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, fig. 1 is a flowchart of an image processing method provided by an embodiment of the present invention, which can be applied to a mobile terminal. As shown in fig. 1, the image processing method includes the steps of:
step 101, shooting a target object, and acquiring a first group of images and a second group of images, wherein the first group of images comprises at least one image, and the second group of images comprises at least one image.
In this step, the first and second sets of images are acquired simultaneously at a preset sensitivity for a preset exposure time.
When the target object is shot, the shooting can be performed under a preset shooting condition, so that the detail characteristics of the image are improved. Noise can be reduced by adopting lower light sensitivity, so that the quality of an image is improved; the lower exposure time can reduce the influence on the image effect caused by the moving of the shot image or the shaking of the camera.
The first group of images and the second group of images may be the same or different images, and each image in both groups contains the target object. After shooting the target object, the mobile terminal can acquire at least two images simultaneously. In a specific implementation, the mobile terminal may output at least two images simultaneously using the image sensor; for example, capturing with a 4in1 image sensor may output 4 images at once. In this way, image blurring caused by movement of the photographic subject or camera shake can be reduced, and the image effect can be improved.
And 102, performing brightening treatment on the first group of images and/or the second group of images to obtain a third group of images.
The brightening process can be completed in a dedicated image processing unit or the image signal processing unit of the mobile terminal. The mobile terminal can apply different brightening rules to different images; in particular, a different brightening function may be used to brighten each image.
In this step, the brightening process may be applied only to the first group of images, only to the second group of images, or to both groups. For ease of understanding, the following example assumes that only the first group of images, which comprises 4 images, is brightened.
For example, assuming that I1, I2, I3, and I4 are the four images before brightening, the four images can be brightened using the following functions to obtain the four brightened images Z1, Z2, Z3, and Z4.
Z1 = I1 (1)
Z2(p) = I2(p) * (1 + exp(-10 * I2(p))) (2)
Z3(p) = I3(p) * (1 + exp(-9 * I3(p))) * ScaleFactor1 (3)
Z4(p) = I4(p) * (1 + exp(-8 * I4(p))) * ScaleFactor2 (4)
In the above formulas, I2(p), I3(p), and I4(p) are the pixel values of I2, I3, and I4, respectively, and Z2(p), Z3(p), and Z4(p) are the pixel values of Z2, Z3, and Z4, respectively. ScaleFactor1 in formula (3) and ScaleFactor2 in formula (4) are brightness adjustment coefficients, where ScaleFactor1 is 5/8 and the value of ScaleFactor2 can be calculated according to formula (5):
[Formula (5), shown only as an image in the original, computes ScaleFactor2 from the average brightness of I4.]
In this way, the pixel values of the four brightened images can be obtained, yielding 4 images with different brightness values, namely the third group of images. After brightening, detail information in dark areas of the scene is enhanced and the definition of the image is improved.
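The brightening equations (1)-(4) can be sketched in NumPy as follows. The pixel value range [0, 1] is an assumption, and ScaleFactor2 is taken as an input parameter because equation (5) appears only as an image in the source (it is described as depending on the average brightness of I4).

```python
import numpy as np

def brighten(images, scale_factor2):
    """Brighten four simultaneously captured images per equations (1)-(4).

    images: list of four float arrays I1..I4; pixel values assumed in [0, 1]
            (an assumption; the patent does not state the value range).
    scale_factor2: the coefficient of equation (5), which the source gives
            only as an image derived from the mean brightness of I4.
    """
    i1, i2, i3, i4 = images
    scale_factor1 = 5.0 / 8.0  # stated in the description
    z1 = i1                                              # (1): first frame kept as-is
    z2 = i2 * (1.0 + np.exp(-10.0 * i2))                 # (2)
    z3 = i3 * (1.0 + np.exp(-9.0 * i3)) * scale_factor1  # (3)
    z4 = i4 * (1.0 + np.exp(-8.0 * i4)) * scale_factor2  # (4)
    return z1, z2, z3, z4
```

Because the multiplier 1 + exp(-k·I(p)) always exceeds 1 and is largest where the pixel value is small, equations (2)-(4) brighten dark pixels proportionally more than bright ones, which matches the stated goal of enhancing dark-area detail.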
And when the first group of images are subjected to brightening processing, brightening processing is carried out on each image in the first group of images according to a brightening rule corresponding to each image, and a third group of images are obtained.
And when the second group of images are subjected to brightening processing, each image in the second group of images is subjected to brightening processing according to a brightening rule corresponding to each image to obtain a third group of images.
When the first group of images and the second group of images are subjected to brightening processing, each image in the first group of images is subjected to brightening processing according to a brightening rule corresponding to each image to obtain a third group of images; and performing brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images.
In this embodiment, both the images in the first group and the images in the second group are brightened; the brightening process for the second group may be the same as that for the first group, and is not repeated here.
It should be noted that the embodiment of the present invention does not limit the sequence of the brightening processes performed on the first group of images and the second group of images.
In this way, the image is brightened, and the detail features of the image can be enhanced.
Optionally, when the first group of images is subjected to the brightening process, and the second group of images includes at least two images, after the step of shooting the target object to obtain the first group of images and the second group of images, and before the step of performing image fusion by using the third group of images to obtain the target image, the method further includes: and denoising the second group of images to obtain a sixth image.
In this embodiment, the pixels of each image in the second group of images may be acquired, and the pixels of each image may be subjected to weighted average processing to obtain the sixth image. Thus, noise of the image can be reduced, thereby improving the definition of the image.
And 103, carrying out image fusion by using the third group of images to obtain a target image.
In this step, image fusion is performed on the third group of images using an exposure fusion algorithm to obtain the target image.
The third group of images comprises at least two images with different brightness. The third group of images may be fused using algorithms such as the Laplacian pyramid and the Gaussian pyramid.
For example, the fusion weight of each pixel in each source image can be calculated from the contrast, saturation, and exposure level of that image. A Gaussian pyramid can be constructed for each weight map and a Laplacian pyramid for each color source image, and the Gaussian and Laplacian pyramids of corresponding layers are multiplied, so that each image yields a weighted Laplacian pyramid. The corresponding layers of the Laplacian pyramids of the different images are then added and normalized, and the resulting pyramid is reconstructed to obtain the fused image. The fusion process can be accelerated in a dedicated image processing unit, improving image processing speed.
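The weighting scheme described above can be illustrated with a minimal NumPy sketch. To stay short it performs only the per-pixel weighted average and omits the Gaussian/Laplacian pyramid blending step; the particular weight formulas (Laplacian-filter contrast, channel standard deviation for saturation, a Gaussian well-exposedness term with sigma = 0.2) follow the standard exposure-fusion literature and are illustrative assumptions, not formulas taken from the patent.

```python
import numpy as np

def exposure_fuse(images, eps=1e-12):
    """Simplified exposure fusion: a per-pixel weighted average of a
    bracketed stack, with weights from contrast, saturation and
    well-exposedness (pyramid blending omitted for brevity).

    images: list of HxWx3 float arrays with values in [0, 1].
    """
    weights = []
    for img in images:
        gray = img.mean(axis=2)
        # contrast: magnitude of a discrete Laplacian filter response
        # (np.roll wraps at the borders, acceptable for a sketch)
        contrast = np.abs(
            -4.0 * gray
            + np.roll(gray, 1, axis=0) + np.roll(gray, -1, axis=0)
            + np.roll(gray, 1, axis=1) + np.roll(gray, -1, axis=1)
        )
        # saturation: standard deviation across the colour channels
        saturation = img.std(axis=2)
        # well-exposedness: prefer mid-tone pixels (Gaussian around 0.5)
        exposedness = np.exp(-((img - 0.5) ** 2) / (2.0 * 0.2 ** 2)).prod(axis=2)
        weights.append(contrast * saturation * exposedness + eps)
    total = sum(weights)
    fused = sum(w[..., None] * img for w, img in zip(weights, images))
    return np.clip(fused / total[..., None], 0.0, 1.0)
```

With the pyramid step omitted, seams can appear where the weights change abruptly between neighbouring pixels; the pyramid blending described in the text exists precisely to smooth those transitions across scales.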
Therefore, the at least two images are subjected to image fusion to obtain the target image with a high dynamic range, so that the detail characteristics in the target image can be enhanced, and the visual effect of the target image is improved.
And when the first group of images are subjected to brightening processing to obtain a third group of images and each image in the second group of images is subjected to brightening processing to obtain a fourth group of images, carrying out image fusion by using the third group of images and the fourth group of images to obtain a target image. The specific image fusion mode may be the same as described above, and is not described herein again. Thus, the detail characteristics of the fused image can be enhanced, and the picture effect can be improved.
And when the first group of images are subjected to brightening processing and the second group of images are subjected to denoising processing to obtain a sixth image, carrying out image fusion by using the sixth image and the third group of images to obtain a target image. The specific image fusion mode may be the same as described above, and is not described herein again. Thus, the noise of the image can be reduced, and the definition of the picture can be improved.
The mobile terminal can display the fused target image through a display screen of the mobile terminal so as to be viewed by a user, and can also store the target image.
In the embodiment of the present invention, the image processing method may be applied to a mobile terminal, for example: a mobile phone, a tablet computer, a laptop computer, a Personal Digital Assistant (PDA), a Mobile Internet Device (MID), a wearable device, or the like.
The image processing method provided by the embodiment of the invention is used for shooting a target object to obtain a first group of images and a second group of images, wherein the first group of images comprises at least one image, and the second group of images comprises at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and carrying out image fusion by using the third group of images to obtain a target image. In this way, it is possible to avoid blurring of the captured image due to movement of the subject or camera shake, and to improve the effect of the output high dynamic range image.
Referring to fig. 2, the main difference between the present embodiment and the above-mentioned embodiments is that after the fifth image is obtained by performing denoising processing on the first group of images and the second group of images, the fifth image is subjected to brightening processing. Fig. 2 is a flowchart of an image processing method according to an embodiment of the present invention, and as shown in fig. 2, the image processing method includes the following steps:
step 201, shooting a target object to acquire a first group of images and a second group of images.
The first group of images comprises at least one image, and the second group of images comprises at least one image.
In this step, the target subject may be photographed under a preset photographing condition, which may be a condition set to enhance the details of the image, for example, a shorter exposure time and a lower sensitivity.
The first group of images and the second group of images may be the same or different images, and both groups contain the target object. After shooting the target object, the mobile terminal can acquire at least two images simultaneously. In a specific implementation, the mobile terminal may output at least two images simultaneously using the image sensor; for example, capturing with a 4in1 image sensor may output 4 images at once. In this way, image blurring caused by movement of the photographic subject or camera shake can be reduced, and the image effect can be improved.
Step 202, denoising the first group of images and the second group of images to obtain a fifth image.
In this step, weighted average processing is performed on the pixels of the first group of images and the second group of images to obtain a fifth image.
For ease of understanding, assume the first group of images and the second group of images together comprise 4 images, and the four images are denoised. For example, assuming that I1, I2, I3, and I4 are the four images before denoising, the four images may be denoised using the following function to obtain the denoised fifth image I5.
I5(p) = α*I1(p) + β*I2(p) + γ*I3(p) + δ*I4(p) (6)
In formula (6), I1(p), I2(p), I3(p), I4(p), and I5(p) are the pixel values of I1, I2, I3, I4, and I5, respectively, and α, β, γ, and δ may all be 0.25.
Therefore, after the image is denoised, the image noise can be reduced, and the definition of the image can be improved.
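Equation (6) can be sketched as a per-pixel weighted average in NumPy; the equal weights of 0.25 follow the text, while support for arbitrary normalized weights is an added generalization.

```python
import numpy as np

def denoise_average(images, weights=None):
    """Equation (6): per-pixel weighted average of simultaneously
    captured frames. Equal weights (e.g. 0.25 for four frames) match
    the coefficients given in the text."""
    if weights is None:
        weights = [1.0 / len(images)] * len(images)
    assert abs(sum(weights) - 1.0) < 1e-9, "weights should sum to 1"
    return sum(w * img for w, img in zip(weights, images))
```

Averaging N frames of independent zero-mean noise reduces its standard deviation by a factor of the square root of N, so four frames roughly halve the noise, consistent with the definition improvement described above.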
And 203, performing brightening processing on the fifth image according to different brightening rules respectively to obtain a third group of images with different image brightness.
In this step, since the fifth image has less noise than any individual image in the first or second group, noise is amplified relatively little when the fifth image is brightened. In a specific implementation, the image may be brightened according to preset brightening functions.
For ease of understanding, assume four images with different brightness are obtained by brightening the fifth image. For example, assuming that I5 is the fifth image, the fifth image may be brightened using the following functions to obtain the four brightened images Z1, Z2, Z3, and Z4.
Z1 = I5 (7)
Z2(p) = I5(p) * (1 + exp(-10 * I5(p))) (8)
Z3(p) = I5(p) * (1 + exp(-9 * I5(p))) * ScaleFactor1 (9)
Z4(p) = I5(p) * (1 + exp(-8 * I5(p))) * ScaleFactor2 (10)
In the above formulas, I5(p) is the pixel value of I5, and Z2(p), Z3(p), and Z4(p) are the pixel values of Z2, Z3, and Z4, respectively. ScaleFactor1 in formula (9) and ScaleFactor2 in formula (10) are both brightness adjustment coefficients, where ScaleFactor1 is 5/8 and the value of ScaleFactor2 can be calculated according to formula (11):
[Formula (11), shown only as an image in the original, computes ScaleFactor2 from the average brightness of I5.]
In this way, the pixel values of the four brightened images can be obtained, yielding 4 images with different brightness values, namely the third group of images. After brightening, detail information in dark areas of the scene is enhanced and the definition of the image is improved.
And step 204, carrying out image fusion by using the third group of images to obtain a target image.
The third group of images comprises at least two images with different brightness. In a specific implementation, the third group of images may be fused using algorithms such as the Laplacian pyramid and the Gaussian pyramid. Image fusion of the at least two images thus yields a target image with a high dynamic range, which enhances the detail features in the target image and improves its visual effect.
The mobile terminal can display the fused target image through a display screen of the mobile terminal so as to be viewed by a user, and can also store the target image.
According to the image processing method provided by the embodiment of the invention, after the first group of images and the second group of images are subjected to denoising processing to obtain the fifth image, the fifth image is subjected to brightening processing, noise can be reduced, and thus the image effect of the images can be improved. And the highlighted images are fused, so that the detail characteristics in the target image can be enhanced, and the visual effect of the target image is improved.
Referring to fig. 3, fig. 3 is a block diagram of a mobile terminal according to an embodiment of the present invention. As shown in fig. 3, the mobile terminal may include an image sensor, an image signal processing unit, a dedicated image processing unit, a main controller, a cache controller, a display device, and the like.
The image sensor may be configured to output at least two images per capture. For example, acquiring images through a 4in1 image sensor may yield 4 images.
The image signal processing unit mainly performs post-processing such as linear correction, noise removal, dead pixel removal, interpolation, white balance, and automatic exposure control on the signal output by the front-end image sensor, so that details of the scene can be restored under different optical conditions. The purpose of the dedicated image processing unit is mainly to increase processing speed; it may be an image processor with a shading function, a multi-core processor, or a central processing unit with a dedicated instruction set.
In the above-described embodiments, the brightening process for the image may be performed in the image signal processing unit or the dedicated image processing unit; the image fusion process may be performed in the dedicated image processing unit.
The main controller is mainly used for coordinating events among all the functional units. For example, the processed image is stored or displayed through a display device.
The mobile terminal according to the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 2 and achieve the same beneficial effects, and for avoiding repetition, details are not repeated here.
Referring to fig. 4, fig. 4 is a structural diagram of a mobile terminal according to an embodiment of the present invention, as shown in fig. 4, a mobile terminal 400 includes a shooting module 401, a brightening module 402, and a fusion module 403, where the shooting module 401 is connected to the brightening module 402, and the brightening module 402 is connected to the fusion module 403.
The shooting module 401 is configured to shoot a target object and acquire a first group of images and a second group of images, where the first group of images includes at least one image, and the second group of images includes at least one image; the brightening module 402 is configured to brighten the first group of images and/or the second group of images acquired by the shooting module 401 to obtain a third group of images; and the fusion module 403 is configured to perform image fusion on the third group of images obtained by the brightening module 402 to obtain a target image.
Optionally, as shown in fig. 5, the brightening module 402 includes: the denoising submodule 4021 is configured to perform denoising processing on the first group of images and the second group of images obtained by the shooting module to obtain a fifth image; the first brightening submodule 4022 is configured to brighten the fifth image obtained by the denoising submodule 4021 according to different brightening rules, so as to obtain the third group of images with different image brightness.
Optionally, the denoising submodule 4021 is specifically configured to perform weighted average processing on pixels of the first group of images and the second group of images to obtain a fifth image.
Optionally, as shown in fig. 6, the brightening module 402 includes: the second brightening submodule 4023 is configured to perform brightening processing on each image in the first group of images according to a brightening rule corresponding to each image, so as to obtain a third group of images; a third brightening submodule 4024, configured to perform brightening processing on each image in the second set of images according to a brightening rule corresponding to each image, to obtain a fourth set of images, where brightness of each image in the fourth set of images is different from brightness of each image in the third set of images; the fusion module 403 is specifically configured to perform image fusion by using the third group of images obtained by the second brightening submodule 4023 and the fourth group of images obtained by the third brightening submodule 4024 to obtain a target image.
Optionally, as shown in fig. 7, the second group of images includes at least two images, and the brightening module 402 is specifically configured to perform brightening processing on each image in the first group of images according to a brightening rule corresponding to each image, so as to obtain a third group of images; the mobile terminal 400 further includes: a denoising module 404, configured to perform denoising processing on the second group of images to obtain a sixth image; the fusion module 403 is specifically configured to perform image fusion by using the sixth image and the third group of images to obtain a target image.
Optionally, the fusion module 403 is specifically configured to perform image fusion by using the third group of images through an exposure fusion algorithm to obtain a target image.
Optionally, the shooting module 401 is specifically configured to acquire the first group of images and the second group of images simultaneously within a preset exposure time under a preset sensitivity condition.
The mobile terminal 400 can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 to fig. 2, and is not described herein again to avoid repetition.
The mobile terminal 400 of the embodiment of the present invention photographs a target object to obtain a first group of images and a second group of images, where the first group of images includes at least one image, and the second group of images includes at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and carrying out image fusion by using the third group of images to obtain a target image. In this way, it is possible to avoid blurring of the captured image due to movement of the subject or camera shake, and to improve the effect of the output high dynamic range image.
Referring to fig. 8, fig. 8 is a structural diagram of a mobile terminal according to an embodiment of the present invention, and as shown in fig. 8, a mobile terminal 800 includes: at least one processor 801, memory 802, at least one network interface 804, and a user interface 803. The various components in the mobile terminal 800 are coupled together by a bus system 805. It is understood that the bus system 805 is used to enable communications among the components connected. The bus system 805 includes a power bus, a control bus, and a status signal bus in addition to a data bus. For clarity of illustration, however, the various buses are labeled as bus system 805 in fig. 8.
The mobile terminal 800 also includes a camera, which may include an image sensor, that is coupled to the various components of the mobile terminal via the bus system 805.
The user interface 803 may include, among other things, a display, a keyboard, or a pointing device (e.g., a mouse, track ball, touch pad, or touch screen, etc.).
It will be appreciated that the memory 802 in embodiments of the invention may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory. The non-volatile memory may be a Read-Only Memory (ROM), a Programmable ROM (PROM), an Erasable Programmable ROM (EPROM), an Electrically Erasable Programmable ROM (EEPROM), or a flash memory. The volatile memory may be Random Access Memory (RAM), which serves as an external cache. By way of illustration, and not limitation, many forms of RAM are available, such as Static RAM (SRAM), Dynamic RAM (DRAM), Synchronous DRAM (SDRAM), Double Data Rate SDRAM (DDR SDRAM), Enhanced SDRAM (ESDRAM), Synchronous Link DRAM (SLDRAM), and Direct Rambus RAM (DRRAM). The memory 802 of the systems and methods described herein is intended to comprise, without being limited to, these and any other suitable types of memory.
In some embodiments, memory 802 stores the following elements, executable modules or data structures, or a subset thereof, or an expanded set thereof: an operating system 8021 and application programs 8022.
The operating system 8021 includes various system programs, such as a framework layer, a core library layer, a driver layer, and the like, and is used for implementing various basic services and processing hardware-based tasks. The application program 8022 includes various application programs, such as a Media Player (Media Player), a Browser (Browser), and the like, for implementing various application services. A program implementing a method according to an embodiment of the present invention may be included in application program 8022.
In the embodiment of the present invention, by calling the program or instructions stored in the memory 802, and specifically the program or instructions stored in the application program 8022, the processor 801 is configured to: shoot a target object to acquire a first group of images and a second group of images, where the first group of images comprises at least one image and the second group of images comprises at least one image; brighten the first group of images and/or the second group of images to obtain a third group of images; and carry out image fusion by using the third group of images to obtain a target image.
The methods disclosed in the embodiments of the present invention described above may be applied to the processor 801 or implemented by the processor 801. The processor 801 may be an integrated circuit chip having signal processing capabilities. In implementation, the steps of the above method may be performed by integrated logic circuits in hardware or by instructions in the form of software in the processor 801. The processor 801 may be a general-purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components, and may implement or perform the various methods, steps, and logic blocks disclosed in the embodiments of the present invention. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. The steps of the method disclosed in connection with the embodiments of the present invention may be directly embodied as being executed by a hardware decoding processor, or executed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium well known in the art, such as RAM, flash memory, ROM, PROM, EEPROM, or registers. The storage medium is located in the memory 802; the processor 801 reads the information in the memory 802 and completes the steps of the above method in combination with its hardware.
It is to be understood that the embodiments described herein may be implemented in hardware, software, firmware, middleware, microcode, or any combination thereof. For a hardware implementation, the Processing units may be implemented within one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), general purpose processors, controllers, micro-controllers, microprocessors, other electronic units configured to perform the functions described herein, or a combination thereof.
For a software implementation, the techniques described herein may be implemented with modules (e.g., procedures, functions, and so on) that perform the functions described herein. The software codes may be stored in a memory and executed by a processor. The memory may be implemented within the processor or external to the processor.
Optionally, the computer program when executed by the processor 801 may further implement the steps of: denoising the first group of images and the second group of images to obtain a fifth image; and performing brightening treatment on the fifth image according to different brightening rules respectively to obtain a third group of images with different image brightness.
Optionally, the computer program when executed by the processor 801 may further implement the steps of: carrying out weighted average processing on the pixels of the first group of images and the second group of images to obtain a fifth image.
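The weighted pixel averaging described above can be sketched as follows. This is an illustrative reading only, assuming the two groups are already-aligned frames of the same scene held as NumPy arrays; the function name and default uniform weights are hypothetical, not taken from the patent:

```python
import numpy as np

def fuse_frames_weighted(frames, weights=None):
    """Average several aligned exposures pixel-wise to suppress sensor noise.

    frames  -- list of HxW (or HxWx3) uint8 arrays captured at the same
               sensitivity and exposure time
    weights -- optional per-frame weights; defaults to a uniform average
    """
    stack = np.stack([f.astype(np.float32) for f in frames], axis=0)
    if weights is None:
        weights = np.ones(len(frames), dtype=np.float32)
    weights = np.asarray(weights, dtype=np.float32)
    weights /= weights.sum()
    # Weighted average along the frame axis: zero-mean random noise
    # partially cancels, while static scene content is preserved.
    fused = np.tensordot(weights, stack, axes=1)
    return np.clip(fused, 0, 255).astype(np.uint8)
```

Because all frames are captured at the same sensitivity and exposure time, averaging them trades no dynamic range for a lower noise floor in the resulting "fifth image".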
Optionally, the computer program when executed by the processor 801 may further implement the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; performing brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images; and carrying out image fusion by using the third group of images and the fourth group of images to obtain a target image.
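The patent leaves the per-image "brightening rule" unspecified. One common concrete choice is a linear gain followed by a gamma curve; the sketch below is a hypothetical illustration of such a rule (the function names and parameters are our own, not the patented implementation):

```python
import numpy as np

def brighten(image, gain=1.0, gamma=1.0):
    """Apply a simple brightening rule: linear gain, then a gamma curve.

    gain  > 1 lifts overall brightness; gamma < 1 lifts shadows more than
    highlights. Assigning a different (gain, gamma) pair to each image
    yields the differently-bright inputs needed for fusion.
    """
    x = image.astype(np.float32) / 255.0
    x = np.clip(x * gain, 0.0, 1.0) ** gamma
    return (x * 255.0 + 0.5).astype(np.uint8)

def brighten_group(images, rules):
    """Brighten each image with its own rule, one rule per image."""
    return [brighten(img, *rule) for img, rule in zip(images, rules)]
```

For example, `brighten_group(frames, [(1.0, 1.0), (1.5, 0.8), (2.0, 0.6)])` would turn three copies of the same exposure into a bracketed-looking set for the fusion step.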
Optionally, the computer program when executed by the processor 801 may further implement the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; denoising the second group of images to obtain a sixth image; and carrying out image fusion by using the sixth image and the third group of images to obtain a target image.
Optionally, the computer program when executed by the processor 801 may further implement the steps of: performing image fusion on the third group of images by using an exposure fusion algorithm to obtain a target image.
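The patent names an exposure fusion algorithm without detailing it. A classic choice is Mertens-style per-pixel weighting by well-exposedness; the single-scale sketch below is an assumption about what such a step could look like (a production version would typically blend with Laplacian pyramids and add contrast and saturation weights):

```python
import numpy as np

def exposure_fuse(images, sigma=0.2):
    """Single-scale exposure fusion: weight each pixel by how close it is
    to mid-gray, then blend the stack with the normalized weights."""
    stack = np.stack([im.astype(np.float32) / 255.0 for im in images], axis=0)
    # Well-exposedness weight: Gaussian centered on 0.5, so well-exposed
    # pixels dominate and clipped shadows/highlights are down-weighted.
    weights = np.exp(-((stack - 0.5) ** 2) / (2.0 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True) + 1e-12
    fused = (weights * stack).sum(axis=0)
    return np.clip(fused * 255.0 + 0.5, 0, 255).astype(np.uint8)
```

Unlike tone-mapped HDR, exposure fusion never reconstructs a radiance map; it blends the brightened inputs directly, which is why it pairs naturally with the single-exposure brightening scheme above.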
Optionally, the computer program when executed by the processor 801 may further implement the steps of: the first and second sets of images are acquired simultaneously at a preset sensitivity for a preset exposure time.
The mobile terminal 800 can implement each process implemented by the mobile terminal in the foregoing embodiments, and details are not repeated here to avoid repetition.
The mobile terminal 800 of the embodiment of the present invention photographs a target object to obtain a first group of images and a second group of images, where the first group of images includes at least one image, and the second group of images includes at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and carrying out image fusion by using the third group of images to obtain a target image. In this way, it is possible to avoid blurring of the captured image due to movement of the subject or camera shake, and to improve the effect of the output high dynamic range image.
Referring to fig. 9, fig. 9 is a block diagram of a mobile terminal according to an embodiment of the present invention. Specifically, the mobile terminal 900 in fig. 9 may be a mobile phone, a tablet computer, a Personal Digital Assistant (PDA), or a vehicle-mounted computer.
As shown in fig. 9, the mobile terminal 900 includes a Radio Frequency (RF) circuit 910, a memory 920, an input unit 930, a display unit 940, a processor 950, an audio circuit 960, a communication module 970, and a power supply 980.
The input unit 930 may be used, among other things, to receive numeric or character information input by a user and to generate signal inputs related to user settings and function control of the mobile terminal 900. Specifically, in the embodiment of the present invention, the input unit 930 may include a touch panel 931. The touch panel 931, also referred to as a touch screen, may collect a touch operation performed by a user on or near it (for example, an operation performed with a finger, a stylus, or any other suitable object or accessory) and drive a corresponding connection device according to a preset program. Optionally, the touch panel 931 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 950, and can receive and execute commands sent by the processor 950. In addition, the touch panel 931 may be implemented in various types, such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 931, the input unit 930 may also include other input devices 932, which may include, but are not limited to, one or more of a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick.
Among other things, the display unit 940 may be used to display information input by the user or information provided to the user and various menu interfaces of the mobile terminal 900. The display unit 940 may include a display panel 941, and the display panel 941 may be optionally configured in the form of an LCD or an Organic Light-Emitting Diode (OLED).
It should be noted that the touch panel 931 may cover the display panel 941 to form a touch display screen. When the touch display screen detects a touch operation on or near it, the operation is transmitted to the processor 950 to determine the type of the touch event, and the processor 950 then provides a corresponding visual output on the touch display screen according to the type of the touch event.
The touch display screen includes an application program interface display area and a common control display area. The arrangement of these two display areas is not limited; they may be arranged in any manner that distinguishes them, such as one above the other or side by side. The application interface display area may be used to display the interface of an application; each interface may contain at least one interface element, such as an application icon and/or a widget desktop control, or may be an empty interface that does not contain any content. The common control display area is used to display frequently used controls, such as setting buttons, interface numbers, scroll bars, and application icons like the phone book icon. The touch screen may be a flexible screen, with a transparent organic conductive film of carbon nanotubes laminated on each of its two surfaces.
The processor 950 is the control center of the mobile terminal 900. It connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal 900 and processes data by running or executing the software programs and/or modules stored in the first memory 921 and calling the data stored in the second memory 922, thereby monitoring the mobile terminal 900 as a whole. Optionally, the processor 950 may include one or more processing units.
The mobile terminal 900 further includes a camera, which may include an image sensor, connected to the components of the mobile terminal via a bus system 905.
In an embodiment of the present invention, by invoking the software programs and/or modules stored in the first memory 921 and the data stored in the second memory 922, the processor 950 is configured to: shoot a target object to acquire a first group of images and a second group of images, where the first group of images comprises at least one image and the second group of images comprises at least one image; brighten the first group of images and/or the second group of images to obtain a third group of images; and carry out image fusion by using the third group of images to obtain a target image.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: denoising the first group of images and the second group of images to obtain a fifth image; and performing brightening treatment on the fifth image according to different brightening rules respectively to obtain a third group of images with different image brightness.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: carrying out weighted average processing on the pixels of the first group of images and the second group of images to obtain a fifth image.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; performing brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images; and carrying out image fusion by using the third group of images and the fourth group of images to obtain a target image.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; denoising the second group of images to obtain a sixth image; and carrying out image fusion by using the sixth image and the third group of images to obtain a target image.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: performing image fusion on the third group of images by using an exposure fusion algorithm to obtain a target image.
Optionally, the computer program when executed by the processor 950 may further implement the steps of: the first and second sets of images are acquired simultaneously at a preset sensitivity for a preset exposure time.
The mobile terminal 900 can implement the processes implemented by the mobile terminal in the foregoing embodiments, and in order to avoid repetition, the details are not described here.
The mobile terminal 900 of the embodiment of the present invention photographs a target object to obtain a first group of images and a second group of images, where the first group of images includes at least one image, and the second group of images includes at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and carrying out image fusion by using the third group of images to obtain a target image. In this way, it is possible to avoid blurring of the captured image due to movement of the subject or camera shake, and to improve the effect of the output high dynamic range image.
Embodiments of the present invention also provide a computer-readable storage medium having stored thereon a computer program (instructions) which, when executed by a processor, implements the steps of: shooting a target object to acquire a first group of images and a second group of images, where the first group of images comprises at least one image and the second group of images comprises at least one image; brightening the first group of images and/or the second group of images to obtain a third group of images; and carrying out image fusion by using the third group of images to obtain a target image.
Optionally, the program (instructions), when executed by the processor, implements the steps of: denoising the first group of images and the second group of images to obtain a fifth image; and performing brightening treatment on the fifth image according to different brightening rules respectively to obtain a third group of images with different image brightness.
Optionally, the program (instructions), when executed by the processor, implements the steps of: carrying out weighted average processing on the pixels of the first group of images and the second group of images to obtain a fifth image.
Optionally, the program (instructions), when executed by the processor, implements the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; performing brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images; and carrying out image fusion by using the third group of images and the fourth group of images to obtain a target image.
Optionally, the program (instructions), when executed by the processor, implements the steps of: brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images; denoising the second group of images to obtain a sixth image; and carrying out image fusion by using the sixth image and the third group of images to obtain a target image.
Optionally, the program (instructions), when executed by the processor, implements the steps of: performing image fusion on the third group of images by using an exposure fusion algorithm to obtain a target image.
Optionally, the program (instructions), when executed by the processor, implements the steps of: acquiring the first group of images and the second group of images simultaneously at a preset sensitivity within a preset exposure time.
Computer-readable media include both permanent and non-permanent, removable and non-removable media, and may implement information storage by any method or technology. The information may be computer readable instructions, data structures, modules of a program, or other data. Examples of computer storage media include, but are not limited to, phase change memory (PRAM), Static Random Access Memory (SRAM), Dynamic Random Access Memory (DRAM), other types of Random Access Memory (RAM), Read Only Memory (ROM), Electrically Erasable Programmable Read Only Memory (EEPROM), flash memory or other memory technology, compact disc read only memory (CD-ROM), Digital Versatile Discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer readable media do not include transitory computer readable media such as modulated data signals and carrier waves.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
It is clear to those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described systems, apparatuses and units may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described apparatus embodiments are merely illustrative, and for example, the division of the units is only one logical division, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment of the present invention.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present invention may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the method according to the embodiments of the present invention. And the aforementioned storage medium includes: various media capable of storing program codes, such as a U disk, a removable hard disk, a ROM, a RAM, a magnetic disk, or an optical disk.
The above description is only for the specific embodiments of the present invention, but the scope of the present invention is not limited thereto, and any person skilled in the art can easily conceive of the changes or substitutions within the technical scope of the present invention, and all the changes or substitutions should be covered within the scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (12)

1. An image processing method applied to a mobile terminal, the method comprising:
shooting a target object to obtain a first group of images and a second group of images, wherein the first group of images comprise at least one image, and the second group of images comprise at least two images;
brightening the first group of images and/or the second group of images to obtain a third group of images;
performing image fusion by using the third group of images to obtain a target image;
the step of shooting the target object and acquiring the first group of images and the second group of images comprises the following steps:
simultaneously acquiring the first set of images and the second set of images within a preset exposure time under a preset sensitivity condition;
the first set of images and the second set of images are the same images;
the step of performing a brightening process on the first group of images and/or the second group of images to obtain a third group of images includes:
brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images;
after the step of capturing the target object and acquiring the first group of images and the second group of images, and before the step of performing image fusion by using the third group of images to obtain the target image, the method further comprises:
denoising the second group of images to obtain a sixth image;
the step of performing image fusion by using the third group of images to obtain a target image comprises the following steps:
and carrying out image fusion by using the sixth image and the third group of images to obtain a target image.
2. The image processing method according to claim 1, wherein the step of performing a brightening process on the first set of images and/or the second set of images to obtain a third set of images comprises:
denoising the first group of images and the second group of images to obtain a fifth image;
and performing brightening treatment on the fifth image according to different brightening rules respectively to obtain a third group of images with different image brightness.
3. The image processing method according to claim 2, wherein the step of denoising the first group of images and the second group of images to obtain a fifth image comprises:
and carrying out weighted average processing on the pixels of the first group of images and the second group of images to obtain a fifth image.
4. The image processing method according to claim 1, wherein the step of performing a brightening process on the first set of images and/or the second set of images to obtain a third set of images comprises:
brightening each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images;
performing brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images;
the step of performing image fusion by using the third group of images to obtain a target image comprises the following steps:
and carrying out image fusion by using the third group of images and the fourth group of images to obtain a target image.
5. The image processing method according to claim 1, wherein the step of performing image fusion by using the third group of images to obtain the target image comprises:
and carrying out image fusion by using the third group of images by adopting an exposure fusion algorithm to obtain a target image.
6. A mobile terminal, comprising:
the device comprises a shooting module, a processing module and a display module, wherein the shooting module is used for shooting a target object to acquire a first group of images and a second group of images, the first group of images comprise at least one image, and the second group of images comprise at least two images;
the brightening module is used for brightening the first group of images and/or the second group of images acquired by the shooting module to acquire a third group of images;
the fusion module is used for carrying out image fusion on the third group of images obtained by the brightening module to obtain a target image;
the shooting module is specifically used for simultaneously collecting the first group of images and the second group of images within a preset exposure time under a preset sensitivity condition;
the first set of images and the second set of images are the same images;
the brightening module is specifically configured to perform brightening processing on each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images;
the mobile terminal further includes:
the denoising module is used for denoising the second group of images to obtain a sixth image;
the fusion module is specifically configured to perform image fusion by using the sixth image and the third group of images to obtain a target image.
7. The mobile terminal of claim 6, wherein the brightening module comprises:
the denoising submodule is used for denoising the first group of images and the second group of images obtained by the shooting module to obtain a fifth image;
and the first brightening submodule is used for brightening the fifth image obtained by the denoising submodule according to different brightening rules respectively to obtain the third group of images with different image brightness.
8. The mobile terminal of claim 7, wherein the de-noising sub-module is specifically configured to perform weighted average processing on pixels of the first set of images and the second set of images to obtain a fifth image.
9. The mobile terminal of claim 6, wherein the brightening module comprises:
the second brightening submodule is used for carrying out brightening processing on each image in the first group of images according to a brightening rule corresponding to each image to obtain a third group of images;
the third brightening submodule is used for carrying out brightening processing on each image in the second group of images according to a brightening rule corresponding to each image to obtain a fourth group of images, wherein the brightness of each image in the fourth group of images is different from the brightness of each image in the third group of images;
the fusion module is specifically configured to perform image fusion by using the third group of images obtained by the second brightening submodule and the fourth group of images obtained by the third brightening submodule, so as to obtain a target image.
10. The mobile terminal according to claim 6, wherein the fusion module is specifically configured to perform image fusion by using the third group of images through an exposure fusion algorithm to obtain a target image.
11. A mobile terminal, comprising: memory, processor and computer program stored on the memory and executable on the processor, which when executed by the processor implements the steps in the image processing method according to any of claims 1 to 5.
12. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps in the image processing method according to any one of claims 1 to 5.
CN201710735043.8A 2017-08-24 2017-08-24 Image processing method, mobile terminal and computer readable storage medium Active CN107395983B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710735043.8A CN107395983B (en) 2017-08-24 2017-08-24 Image processing method, mobile terminal and computer readable storage medium


Publications (2)

Publication Number Publication Date
CN107395983A CN107395983A (en) 2017-11-24
CN107395983B true CN107395983B (en) 2020-04-07

Family

ID=60346765

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710735043.8A Active CN107395983B (en) 2017-08-24 2017-08-24 Image processing method, mobile terminal and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN107395983B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109993737A * 2019-03-29 2019-07-09 Lenovo (Beijing) Co., Ltd. Processing method, device and computer-readable storage medium
CN112995490A * 2019-12-12 2021-06-18 Huawei Technologies Co., Ltd. Image processing method, terminal photographing method, medium and system
CN111462268B * 2020-03-31 2022-11-11 Beijing SenseTime Technology Development Co., Ltd. Image reconstruction method and device, electronic equipment and storage medium

Citations (2)

Publication number Priority date Publication date Assignee Title
CN102547080A * 2010-12-31 2012-07-04 Lenovo (Beijing) Co., Ltd. Image pick-up module and information processing equipment comprising same
CN104869297A * 2015-06-15 2015-08-26 Lenovo (Beijing) Co., Ltd. Image processing method and electronic equipment

Family Cites Families (7)

Publication number Priority date Publication date Assignee Title
KR100843087B1 * 2006-09-06 2008-07-02 Samsung Electronics Co., Ltd. Image generation apparatus and method
JP5367640B2 * 2010-05-31 2013-12-11 Panasonic Corporation Imaging apparatus and imaging method
US8542288B2 * 2010-11-03 2013-09-24 Sony Corporation Camera system and imaging method using multiple lens and aperture units
JP5903979B2 * 2012-03-28 2016-04-13 Ricoh Imaging Co., Ltd. Imaging apparatus, moving image creating method, and moving image creating program
CN103873781B * 2014-03-27 2017-03-29 Chengdu Power Video Technology Co., Ltd. Wide dynamic camera implementation method and device
CN104320575B * 2014-09-30 2019-01-15 Baidu Online Network Technology (Beijing) Co., Ltd. Image processing method and image processing apparatus for portable terminal
CN105827754B * 2016-03-24 2019-07-26 Vivo Mobile Communication Co., Ltd. High dynamic range image generation method and mobile terminal

Similar Documents

Publication Publication Date Title
CN107197169B (en) high dynamic range image shooting method and mobile terminal
CN109565551B (en) Synthesizing images aligned to a reference frame
CN106060406B (en) Photographing method and mobile terminal
CN106161967B (en) Backlight scene panoramic shooting method and mobile terminal
US20200267300A1 (en) System and method for compositing high dynamic range images
KR102124604B1 (en) Method for stabilizing image and an electronic device thereof
EP3443736B1 (en) Method and apparatus for video content stabilization
CN109118430B (en) Super-resolution image reconstruction method and device, electronic equipment and storage medium
CN106454086B (en) Image processing method and mobile terminal
CN107395983B (en) Image processing method, mobile terminal and computer readable storage medium
KR102561714B1 (en) Image data processing from composite images
US9445073B2 (en) Image processing methods and systems in accordance with depth information
EP3234908A1 (en) Method, apparatus and computer program product for blur estimation
CN107705275B (en) Photographing method and mobile terminal
US20130236117A1 (en) Apparatus and method for providing blurred image
CN114390201A (en) Focusing method and device thereof
JP7025237B2 (en) Image processing equipment and its control method and program
CN115049572A (en) Image processing method, image processing device, electronic equipment and computer readable storage medium
CN114679553A (en) Video noise reduction method and device
EP3920132A1 (en) Electronic device for generating hdr image and operating method therefor
CN110545375A (en) Image processing method, image processing device, storage medium and electronic equipment
CN108810322B (en) Image processing method and related device
JP2018006803A (en) Imaging apparatus, control method for imaging apparatus, and program
CN114125296A (en) Image processing method, image processing device, electronic equipment and readable storage medium
WO2023086146A1 (en) User interface for camera focus

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant