CN111815531A - Image processing method, image processing device, terminal equipment and computer readable storage medium - Google Patents


Info

Publication number
CN111815531A
Authority
CN
China
Prior art keywords
image
images
processed
frame
sub
Prior art date
Legal status
Granted
Application number
CN202010657947.5A
Other languages
Chinese (zh)
Other versions
CN111815531B (en)
Inventor
赖泽民
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd filed Critical Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202010657947.5A priority Critical patent/CN111815531B/en
Publication of CN111815531A publication Critical patent/CN111815531A/en
Application granted granted Critical
Publication of CN111815531B publication Critical patent/CN111815531B/en
Status: Active

Classifications

    • G06T5/70
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/60Noise processing, e.g. detecting, correcting, reducing or removing noise
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20212Image combination
    • G06T2207/20221Image fusion; Image merging

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)

Abstract

The application belongs to the technical field of image processing and provides an image processing method, an image processing apparatus, a terminal device and a computer-readable storage medium. The image processing method includes: acquiring at least two frames of continuously shot images to be processed; obtaining a static area and a motion area of the at least two frames of images to be processed; performing noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed, to obtain a first sub-image; performing noise reduction processing on the motion area according to the sub-image of the motion area in one frame of candidate image to be processed, to obtain a second sub-image; and stitching the first sub-image and the second sub-image to obtain a target image. By the method and the device, the noise in the image can be reduced and a high-quality image obtained.

Description

Image processing method, image processing device, terminal equipment and computer readable storage medium
Technical Field
The present application belongs to the field of image processing technologies, and in particular, to an image processing method, an image processing apparatus, a terminal device, and a computer-readable storage medium.
Background
Image noise refers to unnecessary or redundant interference information present in an image, and the presence of noise seriously affects the quality of the image. In the process of acquiring, transmitting and storing the image, due to the influence of various factors (such as relative motion between the imaging device and the object, nonlinearity of the image sensor and the like), more noise may be generated in the image, and the image quality is reduced. Therefore, how to improve the quality of the image is an urgent technical problem to be solved in the field of image processing.
Disclosure of Invention
The application provides an image processing method, an image processing device, a terminal device and a computer readable storage medium, which are used for suppressing noise in an image and obtaining a high-quality image.
In a first aspect, the present application provides an image processing method, including:
acquiring at least two frames of continuously shot images to be processed;
obtaining a static area and a motion area of the at least two frames of images to be processed;
according to the sub-images of the static area in the at least two frames of images to be processed, noise reduction processing is carried out on the static area to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;
according to the sub-image of the motion area in one frame of candidate image to be processed, performing noise reduction processing on the motion area to obtain a second sub-image, wherein the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate image to be processed is an image, among the at least two frames of images to be processed, in which the motion area exists;
and stitching the first sub-image and the second sub-image to obtain a target image.
In a second aspect, the present application provides an image processing apparatus comprising:
the image acquisition module is used for acquiring at least two continuously shot images to be processed;
the area acquisition module is used for acquiring a static area and a motion area of the at least two frames of images to be processed;
the first denoising module is used for denoising the static area according to sub-images of the static area in the at least two frames of images to be processed respectively to obtain a first sub-image, wherein the first sub-image is a denoised sub-image corresponding to the static area;
a second denoising module, configured to perform denoising processing on the motion region according to a sub-image of the motion region in a candidate to-be-processed image of one frame, to obtain a second sub-image, where the second sub-image is a denoised sub-image corresponding to the motion region, and the candidate to-be-processed image is a to-be-processed image in which the motion region exists in the at least two frames of to-be-processed images;
and the target acquisition module is used for splicing the first sub-image and the second sub-image to obtain a target image.
In a third aspect, the present application provides a terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, wherein the processor implements the steps of the image processing method according to the first aspect when executing the computer program.
In a fourth aspect, the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the image processing method according to the first aspect.
In a fifth aspect, the present application provides a computer program product, which, when run on a terminal device, causes the terminal device to perform the steps of the image processing method according to the first aspect as described above.
As can be seen from the above, the present application obtains the static area and the motion area of at least two frames of images to be processed that are continuously photographed, obtains the first sub-image with reduced noise corresponding to the static area according to the sub-images of the static area in the at least two frames of images to be processed respectively, obtains the second sub-image with reduced noise corresponding to the motion area according to the sub-image of the motion area in a frame of candidate images to be processed, and splices the first sub-image and the second sub-image to obtain the target image with reduced noise, that is, obtain the high quality image.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description are only some embodiments of the present application, and other drawings can be obtained by those skilled in the art based on these drawings without inventive effort.
Fig. 1 is a schematic flow chart of an implementation of an image processing method according to an embodiment of the present application;
FIG. 2 is an exemplary diagram of image stitching;
fig. 3 is a schematic flow chart of an implementation of an image processing method provided in the second embodiment of the present application;
fig. 4 is a schematic flow chart of an implementation of an image processing method provided in the third embodiment of the present application;
fig. 5 is a schematic structural diagram of an image processing apparatus according to a fourth embodiment of the present application;
fig. 6 is a schematic structural diagram of a terminal device provided in the fifth embodiment of the present application;
fig. 7 is a schematic structural diagram of a terminal device according to a sixth embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
In particular implementations, the terminal devices described in the embodiments of the present application include, but are not limited to, portable devices having touch-sensitive surfaces (e.g., touch-screen displays and/or touch pads), such as mobile phones, laptop computers, or tablet computers. It should also be understood that in some embodiments the device is not a portable communication device, but a desktop computer having a touch-sensitive surface (e.g., a touch-screen display and/or a touchpad).
In the discussion that follows, a terminal device that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal device may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal device supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal device may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
It should be understood that, the sequence numbers of the steps in this embodiment do not mean the execution sequence, and the execution sequence of each process should be determined by the function and the inherent logic of the process, and should not constitute any limitation to the implementation process of the embodiment of the present application.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
Referring to fig. 1, which is a schematic diagram of an implementation flow of an image processing method provided in an embodiment of the present application, where the image processing method is applied to a terminal device, as shown in the figure, the image processing method may include the following steps:
step 101, at least two frames of images to be processed which are continuously shot are obtained.
For example, suppose five images are continuously shot by a camera integrated in a terminal device and are denoted A1, A2, A3, A4 and A5 in order of shooting time. Then A1, A2 and A3 are three continuously shot images, while A1 and A3 are not two continuously shot images, because A1 and A3 are not adjacent in shooting time.
It should be noted that, in the embodiment of the present application, the at least two frames of continuously shot images to be processed may be acquired by a camera integrated in the terminal device (for example, at least two frames of images to be processed continuously captured by the camera), or may be acquired from another device (for example, by receiving from a server at least two frames of continuously shot images to be processed). The manner of acquiring the at least two frames of continuously shot images to be processed is not limited herein.
Step 102, obtaining a static area and a motion area of at least two frames of images to be processed.
In the embodiment of the present application, the at least two frames of images to be processed may be divided into the static area and the motion area according to the differences between every two adjacent frames among the at least two frames of images to be processed. The difference between two adjacent frames of images to be processed may refer to the offset of matched pixel points in the two frames: if the offset of a pair of matched pixel points is greater than an offset threshold, the matched pixel points are determined to have moved, and an area formed by a plurality of moving pixel points is the motion area; if the offset is less than or equal to the offset threshold, the matched pixel points are determined not to have moved, and an area formed by a plurality of non-moving pixel points is the static area. Optionally, the user may set the offset threshold according to actual needs, or offset thresholds corresponding to different ambient light intensities may be preset and the offset threshold selected adaptively according to the ambient light intensity when the image is shot; this is not limited herein.
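The static/motion partitioning described above can be sketched as follows. This is a minimal illustration, not the patent's matching algorithm: instead of computing offsets of matched pixel points, it uses the per-pixel absolute difference between adjacent frames as a simple proxy, with `offset_threshold` standing in for the offset threshold; all names are hypothetical.

```python
import numpy as np

def segment_static_motion(frames, offset_threshold=10):
    # A pixel belongs to the motion area if its value changes by more
    # than the threshold between any pair of adjacent frames; all
    # remaining pixels form the static area.
    motion = np.zeros(frames[0].shape, dtype=bool)
    for prev, curr in zip(frames[:-1], frames[1:]):
        diff = np.abs(curr.astype(np.int16) - prev.astype(np.int16))
        motion |= diff > offset_threshold
    return motion

# Two 4x4 frames: one pixel "moves" (a bright object appears).
f1 = np.zeros((4, 4), dtype=np.uint8)
f2 = f1.copy()
f2[1, 2] = 200
motion_mask = segment_static_motion([f1, f2])
static_mask = ~motion_mask
```

A real implementation would first align the frames (e.g. by feature matching) before differencing, as camera shake would otherwise mark static pixels as moving.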
Taking five frames of images to be processed as an example: if the five frames, in order of shooting time, are A1, A2, A3, A4 and A5, then the adjacent pairs are (A1, A2), (A2, A3), (A3, A4) and (A4, A5); that is, the five frames of images to be processed include four groups of two adjacent images to be processed.
In the embodiment of the application, after the static area and the motion area of the at least two frames of images to be processed are obtained, for the i-th frame of image to be processed (where the i-th frame is any one of the at least two frames), the sub-image of the static area in the i-th frame can be extracted from that frame, so that at least two sub-images of the static area are obtained from the at least two frames of images to be processed. Likewise, the sub-image of the motion area in one frame of candidate image to be processed can be extracted from that candidate image, where a candidate image to be processed refers to an image, among the at least two frames of images to be processed, in which the motion area exists.
And 103, performing noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed respectively to obtain a first sub-image.
The first sub-image is a noise-reduced sub-image corresponding to the still region.
In the embodiment of the present application, the still region corresponds to the sub-image in each of the at least two frames of images to be processed, and the noise-reduced sub-image (i.e., the first sub-image) corresponding to the still region may be obtained according to the sub-images of the still region in all the images to be processed. For example, taking five frames of images to be processed as an example, the static area is the area where the sun is located, and then sun images (i.e., sub-images of the sun area in the images to be processed) exist in all the five frames of images to be processed, and according to the sun images of the areas where the sun is located in the five frames of images to be processed, the noise-reduced sun image corresponding to the area where the sun is located can be obtained.
The noise reduction processing for the static area includes, but is not limited to, the following two ways:
in the first mode, sub-images in at least two frames of images to be processed in a static area are superposed, and an image obtained after superposition is a first sub-image, namely, the quality of the first sub-image corresponding to the static area is improved by superposing at least two sub-images corresponding to the static area;
in the second mode, a preset noise reduction algorithm is first used to perform noise reduction on the sub-images of the static area in the at least two frames of images to be processed respectively, the noise-reduced sub-images are superposed, and the superposed image is the first sub-image. The preset noise reduction algorithm may be any preset noise reduction algorithm, such as a non-local means filtering algorithm, which is not limited herein.
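Mode one for the static area (superposing the aligned sub-images) can be sketched as below. The function names are hypothetical, and plain averaging is assumed as the superposition, which is the usual way zero-mean noise is cancelled across a burst.

```python
import numpy as np

def denoise_static(sub_images):
    # Mode one: superpose (here, average) the static-area sub-images
    # from all frames; zero-mean noise cancels while the static
    # content is preserved.
    stack = np.stack([s.astype(np.float32) for s in sub_images])
    return np.clip(np.rint(stack.mean(axis=0)), 0, 255).astype(np.uint8)

# Five noisy observations of the same flat static patch (true value 120).
rng = np.random.default_rng(0)
clean = np.full((8, 8), 120, dtype=np.uint8)
noisy = [
    np.clip(clean.astype(np.int16) + rng.integers(-20, 21, clean.shape),
            0, 255).astype(np.uint8)
    for _ in range(5)
]
first_sub = denoise_static(noisy)
```

The averaged result is closer to the true value than any single noisy frame, which is the quality gain the text attributes to superposition.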
And 104, performing noise reduction processing on the motion area according to the sub-image of the motion area in the candidate image to be processed of one frame to obtain a second sub-image.
The second sub-image is a noise-reduced sub-image corresponding to the motion region.
In this embodiment of the present application, since the motion region does not necessarily exist in each frame of to-be-processed image in the at least two frames of to-be-processed images, the candidate to-be-processed images may be obtained from the at least two frames of to-be-processed images, and then the noise-reduced sub-image (i.e., the second sub-image) corresponding to the motion region may be obtained according to the sub-image of the motion region in the one frame of candidate to-be-processed image. The one-frame candidate to-be-processed image in step 104 may refer to any one-frame candidate to-be-processed image of all candidate to-be-processed images in the at least two frames of to-be-processed images, and when the number of the candidate to-be-processed images is at least two, the one-frame candidate to-be-processed image in step 104 may be any one of the at least two candidate to-be-processed images, or may be the candidate to-be-processed image with the highest definition, which is not limited herein.
For example, taking five frames of images to be processed as an example, the motion region is a lightning region, two frames of images to be processed in the five frames of images to be processed have lightning images, and three frames of images to be processed do not have lightning images, and the lightning images with reduced noise can be obtained according to any one of the two frames of images to be processed having the lightning images or the image to be processed with the highest definition.
The noise reduction processing for the motion region includes, but is not limited to, the following two ways:
in the first mode, the sub-image of the motion area in the one frame of candidate image to be processed of step 104 is taken directly as the second sub-image. Using the sub-image from a single candidate frame avoids the motion blur that would be caused by superposing a plurality of sub-images, and thereby improves the quality of the second sub-image corresponding to the motion area;
in the second mode, a preset noise reduction algorithm is used to perform noise reduction processing on the sub-image of the motion area in the one frame of candidate image to be processed of step 104, and the noise-reduced sub-image is taken as the second sub-image. That is, before the sub-image of the motion area is taken as the second sub-image, noise reduction is applied to it, which can further improve the quality of the second sub-image corresponding to the motion area. The preset noise reduction algorithm may be any preset noise reduction algorithm, such as a non-local means filtering algorithm, which is not limited herein.
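Selecting "the candidate image to be processed with the highest definition" mentioned in step 104 can be sketched with a simple sharpness score. The Laplacian-variance measure used here is a common stand-in and is not specified by the text; all names are hypothetical.

```python
import numpy as np

def laplacian_variance(img):
    # Variance of a 4-neighbour Laplacian: a crude sharpness score
    # (higher = more edges/detail in the sub-image).
    img = img.astype(np.float32)
    lap = (-4.0 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]
           + img[1:-1, :-2] + img[1:-1, 2:])
    return float(lap.var())

def pick_sharpest(candidates):
    # Choose the candidate frame whose motion-area sub-image scores
    # highest, i.e. the candidate with the highest definition.
    return max(candidates, key=laplacian_variance)

flat = np.full((16, 16), 100, dtype=np.uint8)   # featureless: score 0
edged = flat.copy()
edged[::2] = 160                                # strong horizontal edges
best = pick_sharpest([flat, edged])
```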
And 105, splicing the first sub-image and the second sub-image to obtain a target image.
In the embodiment of the application, the first sub-image and the second sub-image can be stitched according to the position distribution, in the image to be processed, of the static area corresponding to the first sub-image and of the motion area corresponding to the second sub-image; the stitched image is the target image. Because the first sub-image and the second sub-image that form the target image are both noise-reduced sub-images, the target image is also an image with reduced noise, and the image quality is therefore higher.
Fig. 2 is an exemplary diagram of image stitching showing two frames of images to be processed. The area of the sun and the area of the dark clouds in fig. 2 are both static areas, and the area of the lightning is a motion area. The sun area and the dark-cloud area are obtained from the first frame of image to be processed, and both are subjected to noise reduction processing to obtain a noise-reduced sun image corresponding to the sun area and a noise-reduced dark-cloud image corresponding to the dark-cloud area. The lightning area is obtained from the second frame of image to be processed and subjected to noise reduction processing, yielding a noise-reduced lightning image corresponding to the lightning area. The sun image, the dark-cloud image and the lightning image are then stitched to obtain a high-quality image including all three. The black points in the first and second frames of images to be processed represent noise points; as can be seen from fig. 2, a high-quality image in which noise is effectively suppressed or reduced can be obtained by the present application.
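The stitching of step 105 amounts to recomposing the target image from the two noise-reduced sub-images according to the region positions. A minimal sketch, assuming the regions are represented by a boolean motion mask (all names hypothetical):

```python
import numpy as np

def stitch(first_sub, second_sub, motion_mask):
    # Static pixels come from the noise-reduced first sub-image,
    # motion pixels from the noise-reduced second sub-image.
    return np.where(motion_mask, second_sub, first_sub)

first_sub = np.full((4, 4), 50, dtype=np.uint8)    # denoised static area
second_sub = np.full((4, 4), 250, dtype=np.uint8)  # denoised motion area
motion_mask = np.zeros((4, 4), dtype=bool)
motion_mask[1:3, 1:3] = True                       # motion region in the centre
target = stitch(first_sub, second_sub, motion_mask)
```

In practice the sub-images occupy different positions in the full frame, so each would be pasted back at the coordinates of its area rather than being full-frame arrays as in this toy example.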
According to the method and the device, the static area and the motion area of at least two frames of continuously shot images to be processed are obtained. A noise-reduced first sub-image corresponding to the static area is obtained according to the sub-images of the static area in the at least two frames of images to be processed, and a noise-reduced second sub-image corresponding to the motion area is obtained according to the sub-image of the motion area in one frame of candidate image to be processed. Stitching the first sub-image and the second sub-image then yields the target image with reduced noise, that is, a high-quality image.
Referring to fig. 3, it is a schematic diagram of an implementation flow of an image processing method provided in the second embodiment of the present application, where the image processing method is applied to a terminal device integrated with a camera, and as shown in the figure, the image processing method may include the following steps:
step 301, when the camera is in the professional mode, if a photographing instruction is received, acquiring an exposure mode of the camera.
The professional mode is a mode in which a user sets a photographing parameter according to actual needs, and the photographing parameter includes, but is not limited to, a photometric method, sensitivity, exposure time, exposure compensation, a focusing method, white balance, and the like.
In the embodiment of the application, when detecting that the camera application is started, the terminal device may detect whether the camera is in the professional mode. If so, it detects whether a photographing instruction is received; if a photographing instruction is received, the exposure mode of the camera is obtained, and otherwise detection continues until a photographing instruction is received or the camera exits the professional mode. The camera may enter the professional mode by being switched from another photographing mode, or the professional mode may be the default photographing mode when the camera application starts; this is not limited herein. Likewise, the camera may exit the professional mode by being switched to another photographing mode or by the camera application being closed, which is not limited herein.
Step 302, if the exposure mode of the camera is automatic exposure or the exposure mode of the camera is manual exposure and the exposure time of the camera is less than the time threshold, controlling the camera to continuously shoot at least two frames of images to be processed.
The automatic exposure means that a camera replaces manual operation, and exposure time, aperture, sensitivity and the like are automatically adjusted to control exposure. The manual exposure means that exposure can be controlled by manually setting an aperture, exposure time, sensitivity, and the like. The time threshold may be set according to empirical values, for example 0.03 seconds.
It should be noted that the number of at least two frames of images to be processed, which are continuously captured in step 302, may be specifically set according to an empirical value, for example, six frames.
Optionally, the embodiment of the present application further includes:
and if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to the time threshold, controlling the camera to shoot a frame of image to be processed.
In the embodiment of the application, when the exposure mode of the camera is manual exposure, the exposure time of the camera can be acquired and compared with the time threshold. If the exposure time is less than the time threshold, the camera is controlled to continuously shoot at least two frames of images to be processed, and a high-quality image can be output based on them. If the exposure time is greater than or equal to the time threshold, in order to improve the user's photographing experience, the camera may be controlled to shoot a single frame of image to be processed and output that frame.
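The decision logic of steps 301-302 (and the optional single-frame branch) can be summarized as a small function. The burst size of six frames follows the empirical example given in the text, and all names are hypothetical.

```python
TIME_THRESHOLD_S = 0.03  # empirical time threshold from the text

def frames_to_capture(exposure_mode, exposure_time_s, burst_size=6):
    # Burst-shoot for auto exposure, or for manual exposure with a
    # short exposure time; otherwise shoot a single frame so the user
    # is not kept waiting through several long exposures.
    if exposure_mode == "auto":
        return burst_size
    if exposure_mode == "manual" and exposure_time_s < TIME_THRESHOLD_S:
        return burst_size
    return 1
```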
Step 303, obtaining a static area and a motion area of at least two frames of images to be processed.
The step is the same as step 102, and reference may be made to the related description of step 102, which is not repeated herein.
And 304, performing noise reduction processing on the static area according to the sub-images of the static area in the at least two frames of images to be processed respectively to obtain a first sub-image.
The step is the same as step 103, and reference may be made to the related description of step 103, which is not described herein again.
Step 305, according to the sub-image of the motion area in the candidate image to be processed in one frame, performing noise reduction processing on the motion area to obtain a second sub-image.
The step is the same as step 104, and reference may be made to the related description of step 104, which is not described herein again.
And step 306, splicing the first sub-image and the second sub-image to obtain a target image.
The step is the same as step 105, and reference may be made to the related description of step 105, which is not repeated herein.
According to the embodiment of the application, when the camera is in the professional mode, the shooting frame number of the camera is adaptively controlled according to the exposure mode and the exposure time, so that the shooting experience of a user can be improved while a high-quality image is obtained.
Referring to fig. 4, it is a schematic diagram of an implementation flow of an image processing method provided in the third embodiment of the present application, where the image processing method is applied to a terminal device integrated with a camera, and as shown in the figure, the image processing method may include the following steps:
step 401, when a continuous shooting instruction is received, controlling a camera to continuously shoot M frames of alternative images, and storing the M frames of alternative images in a preset cache region.
where M = L + N - 1, L is an integer greater than 1, and N is an odd number greater than 1. The continuous shooting instruction is used to instruct the camera to output at least two continuously shot images.
In the embodiment of the application, when a continuous shooting instruction is received, a camera is controlled to continuously shoot M frames of alternative images, and the shot alternative images are sequentially stored in a preset cache area in the shooting process.
The M frames of alternative images are obtained from the photographing data stream; compared with the preview data stream, the photographing data stream ensures that the captured images are full-size and effectively retains image information.
Step 402, dividing the M frames of candidate images into L image groups, wherein one image group includes N frames of candidate images.
The N frames of alternative images in a group are consecutive in shooting time; each image group finally outputs one frame of target image, so the L image groups finally output L frames of target images, which are the continuous shooting images output by the camera.
Optionally, dividing the M frame candidate images into L image groups includes:
taking the ((N+1)/2)-th frame of the M frames of candidate images as the reference frame, the M frames of candidate images being arranged in order of shooting time;

taking the (N-1)/2 frames of candidate images adjacent to and arranged before the reference frame, together with the (N-1)/2 frames of candidate images adjacent to and arranged after the reference frame, as reference images of the reference frame, and determining that the reference frame and the reference images form a group of images;

taking the next frame image of the reference frame in the M frames of candidate images as the reference frame, and returning to the step of taking the (N-1)/2 frames of candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frames of candidate images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form a group of images, until the M frames of candidate images are traversed, thereby obtaining L image groups.
In this embodiment of the application, the M candidate images are arranged in order of shooting time; for example, the first frame candidate image among the M candidate images is the earliest-shot frame, and the M-th frame candidate image is the last-shot frame.
Illustratively, five frame candidate images B1, B2, B3, B4 and B5 are arranged in order of shooting time, B1 being the first frame candidate image and B5 the fifth. First, B2 is used as the reference frame, with B1 and B3 as its reference images, so B1, B2 and B3 form one image group; next, B3 is used as the reference frame, with B2 and B4 as its reference images, so B2, B3 and B4 form an image group; finally, B4 is used as the reference frame, with B3 and B5 as its reference images, so B3, B4 and B5 form an image group, giving three image groups in total.
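The sliding-window grouping described above (N consecutive frames per group, with the reference frame in the middle) can be sketched as follows; the function name and dictionary representation are illustrative, not from the patent:

```python
def divide_into_groups(candidates, n):
    """Divide M candidate frames into L = M - n + 1 overlapping groups.

    Each group holds n consecutive frames; the middle frame is the
    reference frame and the (n - 1) // 2 frames on either side are its
    reference images. n must be an odd number greater than 1.
    """
    assert n % 2 == 1 and n > 1, "N must be an odd number greater than 1"
    half = (n - 1) // 2
    groups = []
    # The first reference frame is the ((n + 1) / 2)-th frame, i.e. index half.
    for ref in range(half, len(candidates) - half):
        groups.append({
            "reference_frame": candidates[ref],
            "reference_images": candidates[ref - half:ref]
                              + candidates[ref + 1:ref + half + 1],
        })
    return groups

# With M = 5 frames and N = 3 this yields L = 3 groups, matching the
# B1..B5 example: (B1,B2,B3), (B2,B3,B4), (B3,B4,B5).
frames = ["B1", "B2", "B3", "B4", "B5"]
groups = divide_into_groups(frames, 3)
```

Note that M = L + N - 1 falls out of the loop bounds: a window of N frames can start at L = M - N + 1 positions.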
Step 403, acquiring at least two frames of images to be processed from the N frames of candidate images in each image group.
For an image group, at least two to-be-processed images corresponding to the image group can be acquired from N candidate images in the image group.
Optionally, after dividing the M frame candidate images into L image groups, the method further includes:
judging whether the reference frame in each image group satisfies an image synthesis condition, wherein the reference frame satisfies the condition if an image similar to the reference frame exists among the N-1 frames of reference images of the group, and does not satisfy the condition if no image similar to the reference frame exists among the N-1 frames of reference images of the group;
correspondingly, acquiring at least two frames of images to be processed from the N frames of alternative images in each image group includes:
and if the reference frame in each image group meets the image synthesis condition, determining that the reference frame in each image group and the reference image similar to the reference frame are at least two to-be-processed images.
In this embodiment of the application, for any image group, the similarity between the reference frame and each of the N-1 frames of reference images of the group is obtained, and the N-1 similarities are compared with a similarity threshold. If at least one reference image has a similarity with the reference frame greater than the threshold, the reference frame of the group is determined to satisfy the image synthesis condition; if no reference image has a similarity with the reference frame greater than the threshold, the reference frame is determined not to satisfy the image synthesis condition.
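A minimal sketch of this synthesis-condition check follows; the similarity metric, function name, and threshold are illustrative assumptions, as the patent does not fix a particular metric:

```python
def satisfies_synthesis_condition(reference_frame, reference_images,
                                  similarity, threshold):
    """Return the reference images similar enough to the reference frame.

    `similarity` is any pairwise score (e.g. a normalized correlation
    in [0, 1]); an empty result means the image synthesis condition is
    not met, in which case the reference frame alone becomes the
    target image of the group.
    """
    return [img for img in reference_images
            if similarity(reference_frame, img) > threshold]
```

For example, with a toy similarity over scalar "frames", a group whose reference images all score below the threshold yields an empty list, signalling the single-frame fallback.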
Optionally, the embodiment of the present application further includes:
and if the reference frame in each image group does not meet the image synthesis condition, taking the reference frame in each image group as a target image of each image group.
Step 404, a static area and a motion area of at least two frames of images to be processed are obtained.
The step is the same as step 102, and reference may be made to the related description of step 102, which is not repeated herein.
Step 405, according to the sub-images of the still region in the at least two frames of images to be processed, the noise reduction processing is performed on the still region to obtain a first sub-image.
The step is the same as step 103, and reference may be made to the related description of step 103, which is not described herein again.
And step 406, performing noise reduction processing on the motion region according to the sub-image of the motion region in the candidate image to be processed in one frame to obtain a second sub-image.
The step is the same as step 104, and reference may be made to the related description of step 104, which is not described herein again.
And step 407, splicing the first sub-image and the second sub-image to obtain a target image.
The step is the same as step 105, and reference may be made to the related description of step 105, which is not repeated herein.
In this embodiment of the application, when the camera shoots images continuously, the images to be processed corresponding to each image group are obtained from the multiple frames of candidate images shot continuously by the camera, and when the images to be processed contain both a static region and a motion region, the two regions are subjected to noise reduction in different ways, so that high-quality, noise-reduced images can be obtained.
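The two-branch noise reduction and stitching can be sketched with NumPy as follows. Temporal averaging for the static region and a 3x3 box filter for the motion region are illustrative choices; the patent does not prescribe specific noise reduction algorithms, only that the static region is denoised from multiple frames and the motion region from a single candidate frame:

```python
import numpy as np

def denoise_and_stitch(frames, motion_mask, candidate_idx=0):
    """Produce a target image from continuously shot frames.

    frames:        list of HxW float arrays (the images to be processed)
    motion_mask:   HxW bool array, True inside the motion region
    candidate_idx: index of the frame supplying the motion-region sub-image

    Static region: multi-frame temporal averaging (first sub-image).
    Motion region: single-frame 3x3 box filtering of one candidate
    frame (second sub-image), avoiding ghosting from moving objects.
    """
    stack = np.stack(frames).astype(np.float64)
    first_sub = stack.mean(axis=0)          # temporal noise reduction

    cand = stack[candidate_idx]
    padded = np.pad(cand, 1, mode="edge")   # 3x3 box filter (spatial)
    second_sub = sum(padded[i:i + cand.shape[0], j:j + cand.shape[1]]
                     for i in range(3) for j in range(3)) / 9.0

    # Stitch: motion region from the second sub-image, rest from the first.
    return np.where(motion_mask, second_sub, first_sub)
```

In a real pipeline the frames would first be aligned and the motion mask derived from inter-frame differences; both steps are omitted here for brevity.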
Fig. 5 is a schematic structural diagram of an image processing apparatus according to a fourth embodiment of the present application, and only a part related to the embodiment of the present application is shown for convenience of description.
The image processing apparatus includes:
an image obtaining module 51, configured to obtain at least two frames of images to be processed that are continuously captured;
the region acquiring module 52 is configured to acquire a still region and a moving region of at least two frames of images to be processed;
the first denoising module 53 is configured to perform denoising processing on the stationary region according to sub-images of the stationary region in at least two frames of images to be processed, respectively, to obtain a first sub-image, where the first sub-image is a denoised sub-image corresponding to the stationary region;
a second denoising module 54, configured to perform denoising processing on the motion region according to a sub-image of the motion region in a candidate to-be-processed image of one frame to obtain a second sub-image, where the second sub-image is a denoised sub-image corresponding to the motion region, and the candidate to-be-processed image is a to-be-processed image with a motion region in at least two frames of to-be-processed images;
and the target obtaining module 55 is configured to splice the first sub-image and the second sub-image to obtain a target image.
Optionally, the image processing apparatus further comprises:
the mode acquisition module is used for acquiring the exposure mode of the camera if a photographing instruction is received when the camera is in the professional mode;
correspondingly, the image acquisition module is specifically configured to:
and if the exposure mode of the camera is automatic exposure or the exposure mode of the camera is manual exposure and the exposure time of the camera is less than the time threshold, controlling the camera to continuously shoot at least two frames of images to be processed.
Optionally, the image processing apparatus further comprises:
and the camera control module is used for controlling the camera to shoot a frame of image to be processed if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to a time threshold.
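As a sketch, the exposure-mode branching performed by the mode acquisition and camera control modules could look like the following; the function name, threshold value, and burst size are illustrative assumptions, not taken from the patent:

```python
def frames_to_capture(exposure_mode, exposure_time_s,
                      time_threshold_s=0.5, burst_size=2):
    """Decide how many frames to shoot in professional mode.

    Automatic exposure, or manual exposure with an exposure time below
    the threshold, triggers a burst of at least two frames for
    multi-frame noise reduction; a long manual exposure falls back to
    shooting a single frame to be processed.
    """
    if exposure_mode == "auto":
        return burst_size
    if exposure_mode == "manual" and exposure_time_s < time_threshold_s:
        return burst_size
    return 1  # manual exposure with exposure time >= threshold
```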
Optionally, the image processing apparatus further comprises:
the image storage module is used for controlling the camera to continuously shoot M frames of alternative images when a continuous shooting instruction is received, and storing the M frames of alternative images in a preset cache region, wherein M is L + N-1, L is an integer larger than 1, and N is an odd number larger than 1;
the image dividing module is used for dividing the M frames of alternative images into L image groups, wherein one image group comprises N frames of alternative images, and the N frames of alternative images are continuous in shooting time;
correspondingly, the image acquisition module is specifically configured to:
and acquiring at least two frames of images to be processed from the N frames of alternative images in each image group.
Optionally, the image dividing module is specifically configured to:
take the ((N+1)/2)-th frame of the M frames of candidate images as the reference frame, the M frames of candidate images being arranged in order of shooting time;

take the (N-1)/2 frames of candidate images adjacent to and arranged before the reference frame, together with the (N-1)/2 frames of candidate images adjacent to and arranged after the reference frame, as reference images of the reference frame, and determine that the reference frame and the reference images form a group of images;

take the next frame image of the reference frame in the M frames of candidate images as the reference frame, and return to the operation of taking the (N-1)/2 frames of candidate images adjacent to and arranged before the reference frame and the (N-1)/2 frames of candidate images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form a group of images, until the M frames of candidate images are traversed, thereby obtaining L image groups.
Optionally, the image processing apparatus further comprises:
the image synthesis device comprises a synthesis judging module, a frame matching module and a frame matching module, wherein the synthesis judging module is used for judging whether a reference frame in each image group meets an image synthesis condition or not, the reference frame in each image group meets the image synthesis condition, namely, an image similar to the reference frame exists in an N-1 frame reference image of each image group, and the reference frame in each image group does not meet the image synthesis condition, namely, an image similar to the reference frame does not exist in the N-1 frame reference image of each image group;
correspondingly, the image acquisition module is specifically configured to:
and if the reference frame in each image group meets the image synthesis condition, determining that the reference frame in each image group and the reference image similar to the reference frame are at least two to-be-processed images.
Optionally, the image processing apparatus further comprises:
and the image determining module is used for taking the reference frame in each image group as the target image of each image group if the reference frame in each image group does not meet the image synthesis condition.
The image processing apparatus provided in the embodiment of the present application can be applied to the foregoing method embodiments, and for details, refer to the description of the foregoing method embodiments, which are not described herein again.
Fig. 6 is a schematic structural diagram of a terminal device according to a fifth embodiment of the present application. The terminal device as shown in the figure may include: one or more processors 601 (only one shown); one or more input devices 602 (only one shown), one or more output devices 603 (only one shown), and memory 604. The processor 601, the input device 602, the output device 603, and the memory 604 are connected by a bus 605. The memory 604 is used for storing instructions, and the processor 601 is used for implementing the steps in the above-mentioned respective embodiments of the image processing method when executing the instructions stored in the memory 604.
It should be understood that, in this embodiment of the application, the processor 601 may be a central processing unit (CPU), or another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor, or the processor may be any conventional processor.
The input device 602 may include a touch pad, a fingerprint sensor (for collecting fingerprint information of a user and direction information of the fingerprint), a microphone, a data receiving interface, and the like. The output device 603 may include a display (LCD, etc.), speakers, a data transmission interface, and the like.
The memory 604 may include both read-only memory and random access memory, and provides instructions and data to the processor 601. A portion of the memory 604 may also include non-volatile random access memory. For example, the memory 604 may also store device type information.
In a specific implementation, the processor 601, the input device 602, the output device 603, and the memory 604 described in this embodiment of the present application may execute the implementation described in the embodiment of the image processing method provided in this embodiment of the present application, or may execute the implementation described in the image processing apparatus described in the fourth embodiment of the present application, which is not described herein again.
Fig. 7 is a schematic structural diagram of a terminal device according to a sixth embodiment of the present application. As shown in fig. 7, the terminal device 7 of this embodiment includes: a processor 70, a memory 71 and a computer program 72 stored in said memory 71 and executable on said processor 70. The processor 70, when executing the computer program 72, implements the steps in the various image processing method embodiments described above.
Illustratively, the computer program 72 may be partitioned into one or more modules/units that are stored in the memory 71 and executed by the processor 70 to accomplish the present application. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 72 in the terminal device 7. For example, the computer program 72 may be divided into an event acquisition module, a display module, an update module, and an information transmission module, and the specific functions of each module are as follows:
the terminal device 7 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 70, a memory 71. It will be appreciated by those skilled in the art that fig. 7 is merely an example of a terminal device 7 and does not constitute a limitation of the terminal device 7 and may comprise more or less components than shown, or some components may be combined, or different components, for example the terminal device may further comprise input output devices, network access devices, buses, etc.
The processor 70 may be a central processing unit CPU, but may also be other general purpose processors, digital signal processors DSP, application specific integrated circuits ASIC, off-the-shelf programmable gate arrays FPGA or other programmable logic devices, discrete gate or transistor logic devices, discrete hardware components, etc. A general purpose processor may be a microprocessor or the processor may be any conventional processor or the like.
The memory 71 may be an internal storage unit of the terminal device 7, such as a hard disk or a memory of the terminal device 7. The memory 71 may also be an external storage device of the terminal device 7, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 7. Further, the memory 71 may also include both an internal storage unit and an external storage device of the terminal device 7. The memory 71 is used for storing the computer program and other programs and data required by the terminal device. The memory 71 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as separate products, may be stored in a computer readable storage medium. Based on such understanding, all or part of the flow in the method of the embodiments described above can be realized by a computer program, which can be stored in a computer-readable storage medium and can realize the steps of the embodiments of the methods described above when the computer program is executed by a processor. Wherein the computer program comprises computer program code, which may be in the form of source code, object code, an executable file or some intermediate form, etc. The computer-readable medium may include: any entity or device capable of carrying the computer program code, recording medium, usb disk, removable hard disk, magnetic disk, optical disk, computer Memory, Read-Only Memory (ROM), Random Access Memory (RAM), electrical carrier wave signals, telecommunications signals, software distribution medium, and the like. It should be noted that the computer readable medium may contain content that is subject to appropriate increase or decrease as required by legislation and patent practice in jurisdictions, for example, in some jurisdictions, computer readable media does not include electrical carrier signals and telecommunications signals as is required by legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. An image processing method, characterized in that the image processing method comprises:
acquiring at least two continuously shot images to be processed;
obtaining a static area and a motion area of the at least two frames of images to be processed; according to the sub-images of the static area in the at least two frames of images to be processed, noise reduction processing is carried out on the static area to obtain a first sub-image, wherein the first sub-image is a noise-reduced sub-image corresponding to the static area;
according to the sub-image of the motion area in a candidate image to be processed of one frame, carrying out noise reduction processing on the motion area to obtain a second sub-image, wherein the second sub-image is a noise-reduced sub-image corresponding to the motion area, and the candidate image to be processed is an image to be processed of the motion area in the at least two frames of images to be processed;
and splicing the first sub-image and the second sub-image to obtain a target image.
2. The image processing method according to claim 1, further comprising, before acquiring at least two to-be-processed images continuously captured:
when a camera is in a professional mode, if a photographing instruction is received, acquiring an exposure mode of the camera;
correspondingly, the acquiring of the at least two frames of images to be processed which are continuously shot comprises:
and if the exposure mode of the camera is automatic exposure or the exposure mode of the camera is manual exposure and the exposure time of the camera is less than a time threshold, controlling the camera to continuously shoot at least two frames of images to be processed.
3. The image processing method according to claim 2, characterized in that the image processing method further comprises:
and if the exposure mode of the camera is manual exposure and the exposure time of the camera is greater than or equal to a time threshold, controlling the camera to shoot a frame of image to be processed.
4. The image processing method according to claim 1, further comprising, before acquiring at least two to-be-processed images continuously captured:
when a continuous shooting instruction is received, controlling a camera to continuously shoot M frames of alternative images, and storing the M frames of alternative images in a preset cache region, wherein M is L + N-1, L is an integer larger than 1, and N is an odd number larger than 1;
dividing the M frames of alternative images into L image groups, wherein one image group comprises N frames of alternative images, and the N frames of alternative images are continuous in shooting time;
correspondingly, the acquiring of the at least two frames of images to be processed which are continuously shot comprises:
and acquiring the at least two frames of images to be processed from the N frames of alternative images in each image group.
5. The image processing method of claim 4, wherein said dividing the M frame candidate images into L groups of images comprises:
taking the ((N+1)/2)-th frame of the M frames of alternative images as a reference frame, the M frames of alternative images being arranged in order of shooting time;

taking the (N-1)/2 frames of alternative images adjacent to and arranged before the reference frame, together with the (N-1)/2 frames of alternative images adjacent to and arranged after the reference frame, as reference images of the reference frame, and determining that the reference frame and the reference images form a group of images;

taking the next frame image of the reference frame in the M frames of alternative images as the reference frame, and returning to the step of taking the (N-1)/2 frames of alternative images adjacent to and arranged before the reference frame and the (N-1)/2 frames of alternative images adjacent to and arranged after the reference frame as reference images of the reference frame and determining that the reference frame and the reference images form a group of images, until the M frames of alternative images are traversed, to obtain L image groups.
6. The image processing method of claim 5, wherein after dividing the M frame candidate images into L groups of images, further comprising:
judging whether a reference frame in each image group satisfies an image synthesis condition, wherein the reference frame satisfies the condition if an image similar to the reference frame exists among the N-1 frames of reference images in the group, and does not satisfy the condition if no image similar to the reference frame exists among the N-1 frames of reference images in the group;
correspondingly, the acquiring the at least two frames of images to be processed from the N frames of alternative images in each image group comprises:
and if the reference frame in each image group meets the image synthesis condition, determining the reference frame in each image group and the reference image similar to the reference frame as the at least two images to be processed.
7. The image processing method according to claim 6, further comprising:
and if the reference frame in each image group does not meet the image synthesis condition, taking the reference frame in each image group as a target image of each image group.
8. An image processing apparatus characterized by comprising:
the image acquisition module is used for acquiring at least two continuously shot images to be processed;
the area acquisition module is used for acquiring a static area and a motion area of the at least two frames of images to be processed;
the first denoising module is used for denoising the static area according to sub-images of the static area in the at least two frames of images to be processed respectively to obtain a first sub-image, wherein the first sub-image is a denoised sub-image corresponding to the static area;
a second denoising module, configured to perform denoising processing on the motion region according to a sub-image of the motion region in a candidate to-be-processed image of one frame, to obtain a second sub-image, where the second sub-image is a denoised sub-image corresponding to the motion region, and the candidate to-be-processed image is a to-be-processed image in which the motion region exists in the at least two frames of to-be-processed images;
and the target acquisition module is used for splicing the first sub-image and the second sub-image to obtain a target image.
9. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the image processing method according to any of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 7.
CN202010657947.5A 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer readable storage medium Active CN111815531B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010657947.5A CN111815531B (en) 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010657947.5A CN111815531B (en) 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer readable storage medium

Publications (2)

Publication Number Publication Date
CN111815531A true CN111815531A (en) 2020-10-23
CN111815531B CN111815531B (en) 2024-03-01

Family

ID=72842074

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010657947.5A Active CN111815531B (en) 2020-07-09 2020-07-09 Image processing method, device, terminal equipment and computer readable storage medium

Country Status (1)

Country Link
CN (1) CN111815531B (en)

Citations (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070248330A1 (en) * 2006-04-06 2007-10-25 Pillman Bruce H Varying camera self-determination based on subject motion
US20120201426A1 (en) * 2011-02-04 2012-08-09 David Wayne Jasinski Estimating subject motion for capture setting determination
US20120201427A1 (en) * 2011-02-04 2012-08-09 David Wayne Jasinski Estimating subject motion between image frames
US9756249B1 (en) * 2016-04-27 2017-09-05 Gopro, Inc. Electronic image stabilization frequency estimator
WO2017205492A1 (en) * 2016-05-25 2017-11-30 Gopro, Inc. Three-dimensional noise reduction
CN107872623A (en) * 2017-12-22 2018-04-03 维沃移动通信有限公司 A kind of image pickup method, mobile terminal and computer-readable recording medium
CN108616687A (en) * 2018-03-23 2018-10-02 维沃移动通信有限公司 A kind of photographic method, device and mobile terminal
CN109120862A (en) * 2018-10-15 2019-01-01 Oppo广东移动通信有限公司 High-dynamic-range image acquisition method, device and mobile terminal
US20190058821A1 (en) * 2017-08-16 2019-02-21 Qualcomm Incorporated Image capture device with stabilized exposure or white balance
CN109474787A (en) * 2018-12-28 2019-03-15 维沃移动通信有限公司 A kind of photographic method, terminal device and storage medium
US20190108622A1 (en) * 2017-10-11 2019-04-11 Gopro, Inc. Non-local means denoising
CN111062881A (en) * 2019-11-20 2020-04-24 RealMe重庆移动通信有限公司 Image processing method and device, storage medium and electronic equipment
US20200134791A1 (en) * 2018-10-27 2020-04-30 BARS Imaging LLC Spatio-temporal differential synthesis of detail images for high dynamic range imaging

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
JEONHO KANG et al.: "Minimum Error Seam-Based Efficient Panorama Video Stitching Method Robust to Parallax", IEEE ACCESS *
LIU ZONG et al.: "Noise Processing for Multi-Exposure-Based High Dynamic Range Image Synthesis", Electronic Science and Technology, no. 11, 15 November 2016 (2016-11-15) *
LIU XIUJIN et al.: "Research on Moving Target Detection and Tracking Methods Based on Image Fusion", Mechanical Engineering & Automation, no. 04 *
LI BO et al.: "Dynamic Segmentation Simulation of Edge Differences in Multi-Reference-Frame 3D Motion Images", Computer Simulation, no. 06 *
PAN ZHENGRONG et al.: "Moving Target Detection Combining Improved Background Subtraction with the Five-Frame Difference Method", Automation & Instrumentation, no. 07 *

Also Published As

Publication number Publication date
CN111815531B (en) 2024-03-01

Similar Documents

Publication Publication Date Title
CN111654594B (en) Image capturing method, image capturing apparatus, mobile terminal, and storage medium
WO2020171373A1 (en) Techniques for convolutional neural network-based multi-exposure fusion of multiple image frames and for deblurring multiple image frames
CN111726533B (en) Image processing method, image processing device, mobile terminal and computer readable storage medium
CN110335216B (en) Image processing method, image processing apparatus, terminal device, and readable storage medium
CN108737739B (en) Preview picture acquisition method, preview picture acquisition device and electronic equipment
EP2569934A1 (en) Imaging apparatus, image processing method, and recording medium for recording program thereon
CN110457963B (en) Display control method, display control device, mobile terminal and computer-readable storage medium
CN109215037B (en) Target image segmentation method and device and terminal equipment
WO2021083059A1 (en) Image super-resolution reconstruction method, image super-resolution reconstruction apparatus, and electronic device
CN109118447B (en) Picture processing method, picture processing device and terminal equipment
US20220103743A1 (en) Picture focusing method, apparatus, terminal, and corresponding storage medium
CN109767401B (en) Picture optimization method, device, terminal and corresponding storage medium
US20170019615A1 (en) Image processing method, non-transitory computer-readable storage medium and electrical device thereof
CN108769419B (en) Photographing method, mobile terminal and computer-readable storage medium
CN111654637B (en) Focusing method, focusing device and terminal equipment
CN114390201A (en) Focusing method and device thereof
CN110618852B (en) View processing method, view processing device and terminal equipment
US11770603B2 (en) Image display method having visual effect of increasing size of target image, mobile terminal, and computer-readable storage medium
CN107395983B (en) Image processing method, mobile terminal and computer readable storage medium
CN112055156B (en) Preview image updating method and device, mobile terminal and storage medium
CN111754435A (en) Image processing method, image processing device, terminal equipment and computer readable storage medium
CN111815531A (en) Image processing method, image processing device, terminal equipment and computer readable storage medium
CN107360361B (en) Method and device for shooting people in backlight mode
CN110705653A (en) Image classification method, image classification device and terminal equipment
CN111861965A (en) Image backlight detection method, image backlight detection device and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant