CN116320729A - Image processing method, device, electronic equipment and readable storage medium

Publication number: CN116320729A
Application number: CN202310312173.6A
Authority: CN (China)
Original language: Chinese (zh)
Inventor: 康波
Applicant/Assignee: Vivo Mobile Communication Co Ltd
Legal status: Pending (the status listed is an assumption, not a legal conclusion)

Abstract

The application discloses an image processing method, an image processing device, an electronic device, and a readable storage medium, belonging to the technical field of image processing. The image processing method comprises the following steps: obtaining M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image; determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed; determining a back-display image from the N frames of images to be processed according to image feature information respectively corresponding to the N frames of images to be processed, wherein the image feature information comprises at least one of the following: brightness difference degree, color difference degree, and detail richness degree; storing the back-display image in the album; and, in the case of obtaining the target shooting image, replacing the back-display image in the album with the target shooting image.

Description

Image processing method, device, electronic equipment and readable storage medium
Technical Field
The application belongs to the technical field of image processing, and particularly relates to an image processing method, an image processing device, electronic equipment and a readable storage medium.
Background
With the development of intelligent terminal technology, intelligent terminal devices provide increasingly rich functions. For example, a user may capture images or videos using the photographing function provided by the terminal. During shooting, the terminal often performs a series of algorithmic processes on the image data, such as enhancement, beautification, and compression, so that after tapping the shutter the user must wait for a period of time before the image can be viewed.
At present, after the user taps the shutter, one frame that has not been processed by the algorithm is selected from the cached image frames for the user to view; after the algorithm-processed image is generated, it replaces the unprocessed frame. However, if the difference between the two frames is too large, the previously displayed frame can affect the user's judgment of shooting satisfaction.
Disclosure of Invention
An object of the embodiments of the present application is to provide an image processing method, an apparatus, an electronic device, and a readable storage medium, which can improve the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevent an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
In a first aspect, an embodiment of the present application provides an image processing method, including:
obtaining M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image;
according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed, N frames of images to be processed are determined from the M frames of images to be processed;
determining a back-display image from the N frames of images to be processed according to image feature information respectively corresponding to the N frames of images to be processed, wherein the image feature information comprises at least one of the following: brightness difference degree, color difference degree, and detail richness degree;
storing the back-display image in the album;
and in the case of obtaining the target shooting image, replacing the back-display image in the album with the target shooting image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the acquisition module is used for acquiring M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image;
the processing module is used for determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed;
the processing module is further used for determining a back-display image from the N frames of images to be processed according to the image feature information respectively corresponding to the N frames of images to be processed, wherein the image feature information comprises at least one of the following: brightness difference degree, color difference degree, and detail richness degree;
the storage module is used for storing the back-display image in the album;
and the storage module is further used for replacing the back-display image in the album with the target shooting image in the case of obtaining the target shooting image.
In a third aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the method as described in the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium having stored thereon a program or instructions which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, embodiments of the present application provide a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and where the processor is configured to execute a program or instructions to implement a method according to the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement the method according to the first aspect.
In the embodiments of the application, after the terminal obtains, during shooting, the images to be processed that are used for fusion into the target shooting image, N frames of images to be processed are determined from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames, and a back-display image is determined from the N frames according to the image feature information respectively corresponding to the N frames. Since the image feature information may include at least one of the brightness difference degree, the color difference degree, and the detail richness degree, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
Drawings
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application;
fig. 2 is a schematic diagram of a process flow for selecting a back-display image and generating a target shooting image according to an embodiment of the present application;
fig. 3 is a schematic diagram of a positional relationship between a pixel and a neighboring pixel according to an embodiment of the present application;
fig. 4 is a schematic structural view of an image processing apparatus according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
fig. 6 is a schematic hardware structure of another electronic device according to an embodiment of the present application.
Detailed Description
Technical solutions in the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments in the present application are within the scope of the protection of the present application.
The terms "first", "second", and the like in the description and in the claims are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application are capable of operating in sequences other than those illustrated or described herein, and that the objects identified by "first", "second", etc. are generally of one type, without limiting the number of objects; for example, the first object may be one or more. Furthermore, in the description and claims, "and/or" means at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
With the development of intelligent terminal technology, intelligent terminal devices provide increasingly rich functions. For example, a user may capture images or videos using the photographing function provided by the terminal. In order to improve imaging quality, the terminal often needs to perform a series of algorithmic processes on the image data, such as enhancement, beautification, and compression, resulting in the user waiting for a period of time to view the image after tapping the shutter.
At present, after the user taps the shutter, one frame that has not been processed by the algorithm is selected from the cached image frames for the user to view; after the algorithm-processed image is generated, it replaces the unprocessed frame. However, if the difference between the two frames is too large, the previously displayed frame can affect the user's judgment of shooting satisfaction.
In view of this, embodiments of the present application provide an image processing method, an apparatus, an electronic device, and a readable storage medium, which improve the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevent an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
The image processing method provided by the embodiment of the present application will be described in detail first by specific embodiments and application scenarios thereof with reference to the accompanying drawings. The image processing method in the embodiment of the application can be applied to electronic equipment such as mobile phones, tablet computers, notebook computers and the like which can be used for shooting, and is not particularly limited herein.
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present application. As shown in connection with fig. 1, the image processing method may include steps 110 to 150.
Step 110, obtaining M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image;
step 120, determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed;
step 130, determining a back-display image from the N frames of images to be processed according to the image feature information corresponding to the N frames of images to be processed, where the image feature information includes at least one of the following: brightness difference degree, color difference degree and detail richness degree;
step 140, storing the back-display image in the album;
step 150, in the case of obtaining the target shooting image, replacing the back-display image in the album with the target shooting image.
The above steps are described in detail below.
Specifically, referring to step 110 above, after the shooting function of the electronic device is turned on, a preview image of the current shooting scene can be displayed in the shooting preview interface. The electronic device can buffer multiple frames of preview images, and when an input by which the user controls the electronic device to shoot is received, M frames of images to be processed, used for fusion processing into the target shooting image, can be extracted from the buffered preview frames. It is understood that M is a positive integer greater than or equal to 1. The fusion processing of the M frames includes, for example, image registration, image denoising, image enhancement, and image fusion calculation, which is not particularly limited herein.
Referring to step 120 above, for each frame of the M frames of images to be processed, the inter-frame difference values between that frame and the other M-1 frames are calculated, so as to obtain the inter-frame difference degree corresponding to each frame. By way of example, the inter-frame difference degree corresponding to each of the M frames can be determined by means of inter-frame difference calculation, optical flow magnitude calculation, and the like. The inter-frame difference degree can represent the viewing-angle difference between each frame of image to be processed and the remaining frames among the M frames of images to be processed.
In some embodiments, a screening condition for determining the N frames of images to be processed from the M frames may be preset based on display requirements. For example, in order to keep the viewing-angle difference between the selected N frames and the finally generated target shooting image small, the preset screening condition may be set to select the N frames of images to be processed with the smallest inter-frame difference values. It is understood that N is a positive integer less than or equal to M. Therefore, after the N frames are determined by the preset screening condition, determining the back-display image from these N frames can effectively reduce the visual effect of viewing-angle jump between the back-display image and the target shooting image.
Referring to step 130, since the imaging times of the frames of images to be processed differ, the image feature information corresponding to each frame differs as well. The image feature information is, for example, the brightness difference degree, the color difference degree, or the detail richness degree. Optionally, the image feature information used to select the back-display image may be adjusted according to the actual shooting scene.
For example, when the ambient brightness is high and the light source is stable, the brightness difference degree may be omitted from the image feature information used to screen the N frames of images to be processed; and when the electronic device performing the shooting itself remains stable, the detail richness degree may be omitted from the image feature information used to screen the N frames, which is not particularly limited herein.
Referring to steps 140 to 150, after the back-display image is determined from the N frames of images to be processed based on the image feature information, the back-display image is stored in the album.
In an alternative example, if the user opens the album to view the captured image, the back-display image may be displayed directly. If the user does not open the album, and the shooting interface includes a small window for displaying the captured image, the thumbnail of the back-display image can be displayed quickly in that window, effectively providing a lag-free shoot-to-display experience. In addition, the electronic device performs image fusion processing based on the M frames of images to be processed to generate the target shooting image, and replaces the back-display image in the album with the target shooting image once the target shooting image is obtained.
Fig. 2 is a schematic diagram of a process flow for selecting a back-display image and generating a target shooting image according to an embodiment of the present application. Referring to fig. 2, the electronic device obtains M frames of images to be processed; on the one hand, it selects a back-display image from the M frames and stores it in the album; on the other hand, it performs fusion processing on the M frames, and after the target shooting image is generated, uses the target shooting image to replace the back-display image in the album.
Based on the embodiments of the application, because the visual difference between the selected back-display image and the target shooting image is small, storing the back-display image in the album and replacing it with the target shooting image once the latter is obtained effectively improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
As a specific example, according to the inter-frame difference degrees corresponding to the M-frame to-be-processed images, the N-frame to-be-processed images are determined from the M-frame to-be-processed images, and the following steps may be specifically referred to:
for each frame of to-be-processed image in the M frames of to-be-processed images, acquiring an optical flow amplitude corresponding to the to-be-processed image, wherein the optical flow amplitude is used for representing the inter-frame difference degree between the to-be-processed image and the rest of to-be-processed images in the M frames of to-be-processed images; and determining N frames of to-be-processed images from the M frames of to-be-processed images according to the optical flow amplitude corresponding to each frame of to-be-processed image, wherein the optical flow amplitudes corresponding to the N frames of to-be-processed images respectively meet the target screening condition.
For example, the optical flow amplitude may be used to characterize the degree of inter-frame difference between the image to be processed and the remaining image to be processed in the M-frame image to be processed, that is, the difference in viewing angle between each frame of image to be processed and the remaining image to be processed in the M-frame image to be processed may be represented by the degree of inter-frame difference.
Taking the m-th frame of the M frames of images to be processed as an example, the optical flow amplitude of the m-th frame can represent the inter-frame difference degree between the m-th frame and the images to be processed other than the m-th frame among the M frames. Optionally, by calculating the optical flow differential values between the m-th frame and the remaining M-1 frames of images to be processed, the inter-frame difference degree of the m-th frame can be determined from the M-1 optical flow differential values.
Specifically, for each frame of to-be-processed image in the M frames of to-be-processed images, the optical flow amplitude corresponding to the to-be-processed image is obtained, and the following steps may be specifically referred to: for each frame of to-be-processed image in the M frames of to-be-processed images, acquiring an optical flow difference value between the to-be-processed image and the rest to-be-processed images of each frame; and acquiring the average value of the optical flow difference values between the to-be-processed image and the rest to-be-processed images of each frame, and taking the average value as the optical flow amplitude corresponding to the to-be-processed image.
For example, the optical flow differential value may be used to represent the displacement, between two frames of images to be processed, of each pixel point representing the same object; optionally, the displacement may be represented by a two-dimensional vector. By calculating the displacement of each pixel representing the same object between the two frames, the optical flow differential value of the two frames may be obtained; optionally, the optical flow differential value of the two frames may be represented in matrix form, which is not particularly limited herein.
As a specific example, denote the M frames of images to be processed as I1, I2, …, Im, …, IM.
Taking the first frame I1 as an example, the optical flow differential values between I1 and each of I2 to IM are calculated, obtaining M-1 optical flow differential values; for convenience of description, the M-1 optical flow differential values of the first frame I1 are expressed in the form of an optical flow differential value set FI1.
For the second frame I2, the optical flow differential values between I2 and each of I1 and I3 to IM are calculated, likewise obtaining M-1 optical flow differential values, expressed as the set FI2.
It is understood that, for the m-th frame Im, the optical flow differential values between Im and each of I1 to I(m-1) and I(m+1) to IM are calculated and expressed as the set FIm, where m ∈ {1, …, M}; the cases are not listed one by one here. Based on this, the optical flow differential value sets FI1, FI2, …, FIm, …, FIM respectively corresponding to the M frames of images to be processed can be obtained.
Each optical flow differential value in a set can be expressed in matrix form. To facilitate calculation and speed up the selection of the back-display image, the mean value of the optical flow differential values between each frame of image to be processed and the remaining frames can be calculated, and this mean value taken as the optical flow amplitude corresponding to that frame.
Taking the first frame I1 as an example, the mean of the M-1 optical flow differential values in the set FI1 is calculated to obtain the optical flow amplitude of I1; likewise, for the second frame I2, the mean of the M-1 optical flow differential values in the set FI2 gives the optical flow amplitude of I2.
It is understood that, for the m-th frame Im, the mean of the M-1 optical flow differential values in the set FIm is calculated to obtain the optical flow amplitude of Im; the cases are not listed one by one here. Based on this, the optical flow amplitude of each frame of image to be processed can be obtained.
In some embodiments, the smaller the optical flow amplitude, the smaller the inter-frame difference degree between that image to be processed and the remaining images among the M frames, and hence the smaller the viewing-angle difference between them. As a specific example, the target screening condition may be to select the N frames of images to be processed with the smallest optical flow amplitudes from the M frames.
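As a minimal sketch of steps 110 to 120 (an illustration, not the patent's own implementation), the optical flow amplitudes and the screening of the N frames could look as follows in Python with OpenCV; the Farneback dense optical flow and the mean per-pixel magnitude as the scalar differential value are assumptions, since the embodiment does not fix a particular optical flow algorithm:

```python
import cv2
import numpy as np

def select_n_frames(frames, n):
    """Keep the n frames whose optical flow amplitude (mean optical flow
    differential value against the other frames) is smallest."""
    m = len(frames)
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    amplitudes = []
    for i in range(m):
        diffs = []
        for j in range(m):
            if i == j:
                continue
            # Dense optical flow between frames i and j: one 2-D displacement
            # vector per pixel (an H x W x 2 matrix). Farneback is an assumed
            # choice; the embodiment does not name a specific flow algorithm.
            flow = cv2.calcOpticalFlowFarneback(
                grays[i], grays[j], None, pyr_scale=0.5, levels=3, winsize=15,
                iterations=3, poly_n=5, poly_sigma=1.2, flags=0)
            # Reduce the displacement field to one scalar differential value.
            diffs.append(float(np.linalg.norm(flow, axis=2).mean()))
        # Optical flow amplitude of frame i: mean of its M-1 differential values.
        amplitudes.append(np.mean(diffs))
    keep = np.argsort(amplitudes)[:n]  # smallest amplitude = smallest difference
    return [frames[k] for k in keep]
```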
Based on the embodiments of the application, determining the back-display image from the N frames of images to be processed that satisfy the preset screening condition effectively reduces the visual effect of viewing-angle jump between the back-display image and the target shooting image, improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
As a specific example, for each of the N frames of images to be processed, the degree of brightness difference corresponding to the image to be processed is used to characterize the brightness difference between the image to be processed and the remaining image to be processed of the N frames of images to be processed.
The degree of brightness difference corresponding to the image to be processed can be determined with reference to the following steps:
acquiring a brightness value of an image to be processed; acquiring the average value of brightness values of N frames of images to be processed; and determining the brightness difference degree corresponding to the image to be processed according to the difference between the brightness value of the image to be processed and the average value of the brightness values.
Specifically, the luminance value of an image to be processed may represent its brightness level. The luminance values of the N frames of images to be processed may be denoted as A1, A2, …, An, …, AN, respectively. The calculation method of the luminance value is not particularly limited herein.
The average value of the brightness values of the N frames of images to be processed, denoted $\bar{A}$, can be expressed as formula (1):

$$\bar{A} = \frac{1}{N}\sum_{n=1}^{N} A_n \tag{1}$$
For the n-th frame of the N frames of images to be processed, where $n \in \{1, \dots, N\}$, the brightness difference degree can be obtained by calculating the difference between $A_n$ and the brightness mean $\bar{A}$, for example the absolute difference $|A_n - \bar{A}|$: the brightness difference degree of the first frame is thus obtained from $A_1$ and $\bar{A}$, that of the second frame from $A_2$ and $\bar{A}$, and so on; the calculation for each frame is not listed one by one here.
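A corresponding sketch for the brightness difference degree, assuming the brightness value A_n of a frame is taken as its mean gray level (the embodiment leaves the brightness calculation open):

```python
import cv2
import numpy as np

def brightness_difference_degrees(frames):
    # Brightness value A_n of each frame; the mean gray level is an assumed
    # metric, since the embodiment leaves the brightness calculation open.
    lum = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY).mean() for f in frames]
    mean_a = np.mean(lum)                    # formula (1)
    return [abs(a - mean_a) for a in lum]    # |A_n - mean(A)| per frame
```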
As a specific example, for each of the N-frame images to be processed, the degree of color difference corresponding to the image to be processed is used to characterize the color difference between the image to be processed and the remaining image to be processed of the N-frame images to be processed.
The degree of color difference corresponding to the image to be processed can be determined with reference to the following steps:
acquiring a tone value and saturation of an image to be processed; acquiring the tone average value of N frames of images to be processed and the saturation average value of the N frames of images to be processed; and determining the color difference degree corresponding to the image to be processed according to the difference between the tone value and the tone mean value of the image to be processed and the difference between the saturation of the image to be processed and the saturation mean value.
For example, the hue values of the N frames of images to be processed may be denoted as B1, B2, …, Bn, …, BN, and the saturations as C1, C2, …, Cn, …, CN, respectively. The calculation methods of the hue value and the saturation are not particularly limited herein.
The hue mean of the N frames of images to be processed, denoted $\bar{B}$, can be expressed as formula (2):

$$\bar{B} = \frac{1}{N}\sum_{n=1}^{N} B_n \tag{2}$$

The saturation mean of the N frames of images to be processed, denoted $\bar{C}$, can be expressed as formula (3):

$$\bar{C} = \frac{1}{N}\sum_{n=1}^{N} C_n \tag{3}$$
For the n-th frame of the N frames of images to be processed, where $n \in \{1, \dots, N\}$, the difference between the hue value $B_n$ and the hue mean $\bar{B}$, and the difference between the saturation $C_n$ and the saturation mean $\bar{C}$, are calculated: for the first frame these are the differences between $B_1$ and $\bar{B}$ and between $C_1$ and $\bar{C}$, for the second frame those between $B_2$ and $\bar{B}$ and between $C_2$ and $\bar{C}$, and so on; the calculations for each frame are not listed one by one here.
As a specific example, the color difference degree corresponding to the image to be processed may be determined in combination with formula (4):

$$D_i = |B_i - \bar{B}| + |C_i - \bar{C}| \tag{4}$$

In formula (4), $D_i$ is the color difference degree of the i-th frame of the N frames of images to be processed, $B_i$ is the hue value of the i-th frame, and $C_i$ is the saturation of the i-th frame.
As a specific example, for each of the N frames of images to be processed, the corresponding level of detail of the image to be processed is used to characterize the level of detail of the image to be processed. The level of detail richness corresponding to the image to be processed can be determined with reference to the following steps:
for each pixel point in the image to be processed, determining the corresponding dispersion of the pixel point according to the gray value of the pixel point and the gray values of the neighborhood pixel points of the preset number; and determining a dispersion mean value of the image to be processed according to the dispersion corresponding to each pixel point in the image to be processed, wherein the dispersion mean value is used for representing the detail richness of the image to be processed.
For example, the preset number of neighborhood pixel points of a pixel point in the image to be processed refers to the preset number of pixel points closest to that pixel point. The preset number may be set according to the actual application scenario and is not particularly limited herein. For example, the preset number may be 4, selecting the nearest pixel points in the up, down, left, and right directions; or the preset number may be 9, selecting, together with the pixel point itself, the nearest pixel points in the eight directions of up, down, left, right, upper-left, lower-left, upper-right, and lower-right; alternatively, when the preset number is larger, the nearest pixel points may continue to be selected in the same manner. The preset number is not particularly limited here.
As a specific example, fig. 3 is a schematic diagram of the positional relationship between a pixel point and its neighborhood pixel points according to an embodiment of the present application. In fig. 3 the preset number is 9, including the pixel point 301 itself; the neighborhood pixel points are shown as 302.
It will be appreciated that, when the dispersion needs to be calculated for a pixel point at the edge, the gray value corresponding to the position of each missing neighborhood pixel point may be set to a default value so that the neighborhood pixel points reach the preset number; for example, the gray value may be set to 128, which is not limited herein.
Optionally, the dispersion of a pixel point may be determined by calculating the variance or the standard deviation, which is not particularly limited. After the dispersion of each pixel point is determined, the mean value of the dispersions of the pixel points in each frame of image to be processed can be determined.
In the embodiments of the application, the dispersion mean value is used to characterize the detail richness of the image to be processed: the larger the dispersion mean value, the greater the detail richness, and the smaller the dispersion mean value, the lower the detail richness. Correspondingly, when the back-display image is selected based on the detail richness degree, the image to be processed with the largest dispersion mean value can be selected as the back-display image.
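A sketch of the detail richness degree, computing the per-pixel dispersion as the variance over a 3x3 neighborhood (matching the preset-number-9 example above); letting OpenCV's default border handling stand in for the default-gray-value scheme at the edges is an assumed simplification:

```python
import cv2
import numpy as np

def detail_richness(frame, ksize=3):
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY).astype(np.float32)
    local_mean = cv2.blur(gray, (ksize, ksize))           # mean of each window
    local_sq_mean = cv2.blur(gray * gray, (ksize, ksize))
    variance = local_sq_mean - local_mean * local_mean    # per-pixel dispersion
    # OpenCV's default border handling stands in for the default-gray-value
    # scheme described above for edge pixels.
    return float(variance.mean())                         # dispersion mean
```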
According to the embodiments of the application, by selecting the back-display image from the N frames of images to be processed based on the detail richness degree, the change in visual effect caused by differences in image detail when the back-display image is replaced with the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
In some embodiments, the image feature information may include: the brightness difference degree, the color difference degree, and the detail richness degree. Specifically, referring to step 130, determining the back-display image from the N frames of images to be processed according to the image feature information respectively corresponding to the N frames may refer to the following steps: determining J frames of images to be processed from the N frames according to the brightness difference degrees respectively corresponding to the N frames; determining K frames of images to be processed from the J frames according to the color difference degrees respectively corresponding to the J frames; and determining the back-display image from the K frames according to the detail richness degrees respectively corresponding to the K frames.
By way of example, when screening based on the brightness difference degree, the J frames of images to be processed with the smallest brightness difference degrees can be selected. In this way, after the J frames are selected based on the brightness difference degree and the back-display image is subsequently determined from them, the visual effect of brightness jump between the back-display image and the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
When screening based on the color difference degree, the K frames of images to be processed with the smallest color difference degrees can be selected. In this way, after the K frames are selected based on the color difference degree and the back-display image is determined from them, the color of the back-display image is kept consistent with that of the other frames, the visual effect of color jump between the back-display image and the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
When selecting based on the detail richness degree, the image to be processed with the largest detail richness degree can be selected as the back-display image. In this way, the back-display image is ensured to contain rich image detail information, the visual effect of detail jump between the back-display image and the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
Based on the embodiments of the application, selecting the back-display image from the N frames of images to be processed based on the brightness difference degree, the color difference degree, and the detail richness degree can effectively reduce the change in visual effect when the back-display image is replaced by the target shooting image, and improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards.
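Chaining the three criteria in the order just described, and reusing the helper functions from the sketches above, the cascade of step 130 could be written as follows; keeping the smallest difference degrees and the largest detail richness follows the examples given here, while the counts j and k are left as parameters:

```python
import numpy as np

def select_back_display(frames, j, k):
    bright = brightness_difference_degrees(frames)
    stage_j = [frames[i] for i in np.argsort(bright)[:j]]  # J frames kept
    color = color_difference_degrees(stage_j)
    stage_k = [stage_j[i] for i in np.argsort(color)[:k]]  # K frames kept
    detail = [detail_richness(f) for f in stage_k]
    return stage_k[int(np.argmax(detail))]                 # back-display image
```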
The image processing method provided by the embodiments of the present application may be executed by an image processing apparatus. In the embodiments of the present application, the image processing apparatus is described by taking the case where it executes the image processing method as an example.
Fig. 4 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application, and, with reference to fig. 4, the image processing apparatus may include an acquisition module 410, a processing module 420, and a storage module 430.
The acquisition module 410 is configured to acquire M frames of images to be processed, where the M frames of images to be processed are used for fusion processing into a target shooting image;
the processing module 420 is configured to determine N frames of to-be-processed images from the M frames of to-be-processed images according to the inter-frame difference degrees corresponding to the M frames of to-be-processed images, respectively;
the processing module 420 is further configured to determine a back-display image from the N frames of images to be processed according to image feature information corresponding to the N frames of images to be processed, where the image feature information includes at least one of the following: brightness difference degree, color difference degree and detail richness degree;
a storage module 430 for storing the back-display image in the album;
the storage module 430 is further configured to replace the back-display image in the album with the target shooting image in the case where the target shooting image is obtained.
Based on the embodiments of the application, the back-display image to be shown to the user can be quickly selected from the multiple frames of images to be processed, meeting the user's need to view the captured image immediately, while the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
In some embodiments, the obtaining module 410 is further configured to obtain, for each of the M frames of images to be processed, an optical flow magnitude corresponding to the image to be processed, where the optical flow magnitude is used to characterize a degree of inter-frame difference between the image to be processed and a remaining image to be processed in the M frames of images to be processed;
the processing module 420 is further configured to determine the N frames of to-be-processed images from the M frames of to-be-processed images according to optical flow magnitudes corresponding to each frame of to-be-processed images, where the optical flow magnitudes corresponding to the N frames of to-be-processed images respectively meet a target screening condition.
Based on the embodiments of the application, determining the back-display image from the N frames of images to be processed that satisfy the preset screening condition effectively reduces the visual effect of viewing-angle jump between the back-display image and the target shooting image, improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
In some embodiments, the obtaining module 410 is further configured to obtain, for each of the M frames of images to be processed, an optical flow differential value between the image to be processed and the remaining image to be processed for each frame;
The obtaining module 410 is further configured to obtain a mean value of the difference values of optical flows between the to-be-processed image and the remaining to-be-processed images of each frame, and use the mean value as an optical flow amplitude corresponding to the to-be-processed image.
Based on the embodiments of the application, determining the back-display image from the N frames of images to be processed that satisfy the preset screening condition effectively reduces the visual effect of viewing-angle jump between the back-display image and the target shooting image, improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
In some embodiments, the image characteristic information comprises: brightness difference degree, color difference degree and detail richness degree;
the processing module 420 is further configured to determine a J-frame to-be-processed image from the N-frame to-be-processed images according to the brightness difference degrees corresponding to the N-frame to-be-processed images, respectively;
the processing module 420 is further configured to determine a K frame to-be-processed image from the J frame to-be-processed images according to the color difference degrees corresponding to the J frame to-be-processed images, respectively;
The processing module 420 is further configured to determine the back-display image from the K frames of images to be processed according to the detail richness degrees respectively corresponding to the K frames of images to be processed.
Based on the embodiment of the application, based on the brightness difference degree, the color difference degree and the detail richness degree, the back-display image is selected from the N frames of to-be-processed images, so that the similarity between the back-display image displayed in front and the target shooting image displayed in back is effectively improved, and the change of visual effect after replacement between the back-display image and the target shooting image can be reduced.
In some embodiments, for each of the N frames of images to be processed, a degree of brightness difference corresponding to the image to be processed is used to characterize a brightness difference between the image to be processed and a remaining image to be processed of the N frames of images to be processed;
the obtaining module 410 is further configured to obtain a brightness value of the image to be processed;
the obtaining module 410 is further configured to obtain a mean value of luminance values of the N frames of images to be processed;
the processing module 420 is further configured to determine a brightness difference degree corresponding to the image to be processed according to a difference between the brightness value of the image to be processed and the average value of the brightness values.
In this way, the visual effect of brightness jump between the back-display image and the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
In some embodiments, for each of the N frames of images to be processed, a degree of color difference corresponding to the image to be processed is used to characterize a color difference between the image to be processed and a remaining image to be processed of the N frames of images to be processed;
an obtaining module 410, configured to obtain a hue value and saturation of the image to be processed;
the obtaining module 410 is further configured to obtain a tone average value of the N frames of images to be processed and a saturation average value of the N frames of images to be processed;
the processing module 420 is further configured to determine a degree of color difference corresponding to the image to be processed according to a difference between the hue value of the image to be processed and the hue mean value, and a difference between the saturation of the image to be processed and the saturation mean value.
In this way, the visual effect of color jump between the back-display image and the target shooting image can be effectively reduced, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
In some embodiments, for each of the N frames of images to be processed, the level of detail corresponding to the image to be processed is used to characterize the level of detail of the image to be processed;
the processing module 420 is further configured to determine, for each pixel in the image to be processed, a dispersion corresponding to the pixel according to a gray value of the pixel and a gray value of a preset number of neighboring pixels;
the processing module 420 is further configured to determine a dispersion mean value according to the dispersion corresponding to each pixel point in the image to be processed, where the dispersion mean value is used to characterize the detail richness of the image to be processed.
According to the embodiments of the application, by selecting the back-display image from the N frames of images to be processed based on the detail richness degree, when the back-display image is replaced with the target shooting image, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, the change in visual effect caused by differences in image detail is effectively reduced, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
The image processing apparatus in the embodiments of the present application may be an electronic device, or may be a component in an electronic device, for example, an integrated circuit or a chip. The electronic device may be a terminal, or a device other than a terminal. By way of example, the electronic device may be a mobile phone, tablet computer, notebook computer, palmtop computer, vehicle-mounted electronic device, mobile internet device (MID), augmented reality (AR)/virtual reality (VR) device, robot, wearable device, ultra-mobile personal computer (UMPC), netbook, or personal digital assistant (PDA), and may also be a server, network attached storage (NAS), personal computer (PC), television (TV), teller machine, or self-service machine, etc.; the embodiments of the present application are not particularly limited in this regard.
The image processing apparatus in the embodiments of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiments of the present application.
The image processing device provided in the embodiment of the present application can implement each process implemented by the embodiment of the image processing method, and in order to avoid repetition, a description is omitted here.
Optionally, as shown in fig. 5, the embodiment of the present application further provides an electronic device 500, including a processor 501 and a memory 502, where the memory 502 stores a program or an instruction that can be executed on the processor 501, and the program or the instruction implements each step of the embodiment of the image processing method when executed by the processor 501, and the steps achieve the same technical effects, so that repetition is avoided, and no further description is given here.
The electronic device in the embodiment of the application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 6 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 600 includes, but is not limited to: radio frequency unit 601, network module 602, audio output unit 603, input unit 604, sensor 605, display unit 606, user input unit 607, interface unit 608, memory 609, and processor 610.
Those skilled in the art will appreciate that the electronic device 600 may further include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 610 by a power management system to perform functions such as managing charge, discharge, and power consumption by the power management system. The electronic device structure shown in fig. 6 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than shown, or may combine certain components, or may be arranged in different components, which are not described in detail herein.
The processor 610 is configured to acquire M frames of images to be processed, where the M frames of images to be processed are used for fusion processing into a target shooting image;
the processor 610 is configured to determine N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees corresponding to the M frames of images to be processed, respectively;
the processor 610 is further configured to determine a echo image from the N frames of images to be processed according to image feature information corresponding to the N frames of images to be processed, where the image feature information includes at least one of: brightness difference degree, color difference degree and detail richness degree;
a memory 609 for storing the back-display image in the album;
the memory 609 is further used for replacing the back-display image in the album with the target shooting image in the case where the target shooting image is obtained.
Based on the embodiments of the application, the back-display image to be shown to the user can be quickly selected from the multiple frames of images to be processed, meeting the user's need to view the captured image immediately, while the similarity between the back-display image displayed first and the target shooting image displayed afterwards is improved and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
In some embodiments, the processor 610 is further configured to obtain, for each of the M frames of images to be processed, an optical flow magnitude corresponding to the image to be processed, where the optical flow magnitude is used to characterize a degree of inter-frame difference between the image to be processed and remaining images to be processed in the M frames of images to be processed;
the processor 610 is further configured to determine the N frames of to-be-processed images from the M frames of to-be-processed images according to optical flow magnitudes corresponding to each frame of to-be-processed images, where the optical flow magnitudes corresponding to the N frames of to-be-processed images respectively satisfy a target screening condition.
Based on the embodiments of the application, determining the back-display image from the N frames of images to be processed that satisfy the preset screening condition improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, effectively reduces the visual effect of viewing-angle jump between the back-display image and the target shooting image, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
In some embodiments, the processor 610 is further configured to obtain, for each frame of the M frames of images to be processed, an optical flow difference value between the image to be processed and each remaining frame of images to be processed;
the processor 610 is further configured to obtain the mean value of the optical flow difference values between the image to be processed and each remaining frame of images to be processed, and take the mean value as the optical flow amplitude corresponding to the image to be processed.
With this embodiment of the application, the back-display image determined from the N frames of images to be processed that satisfy the preset screening condition improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, effectively reduces the visual effect of a viewing-angle jump between the two, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
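As a concrete illustration of this embodiment, the sketch below computes each frame's optical flow amplitude as the mean of its optical flow difference values against every remaining frame, then keeps the N frames with the smallest amplitudes. The application does not prescribe a flow estimator or a screening condition; OpenCV's Farneback dense optical flow and the "smallest N amplitudes" rule are assumptions made here for illustration.

```python
import cv2
import numpy as np

def flow_difference(gray_a: np.ndarray, gray_b: np.ndarray) -> float:
    # Dense optical flow between two grayscale frames (Farneback method);
    # the mean magnitude of the flow field serves as the difference value.
    flow = cv2.calcOpticalFlowFarneback(gray_a, gray_b, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    return float(np.linalg.norm(flow, axis=2).mean())

def select_n_frames(frames, n):
    # frames: list of M BGR images (np.uint8 arrays).
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    m = len(frames)
    amplitudes = []
    for i in range(m):
        # Optical flow amplitude of frame i: mean of its optical flow
        # difference values against every remaining frame.
        diffs = [flow_difference(grays[i], grays[j])
                 for j in range(m) if j != i]
        amplitudes.append(sum(diffs) / len(diffs))
    # Assumed screening condition: keep the N frames with the smallest
    # amplitudes, i.e. those most consistent with the rest of the burst.
    order = np.argsort(amplitudes)
    return [frames[i] for i in order[:n]]
```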
In some embodiments, the image characteristic information includes: brightness difference degree, color difference degree and detail richness degree;
the processor 610 is further configured to determine J frames of images to be processed from the N frames of images to be processed according to the brightness difference degrees respectively corresponding to the N frames of images to be processed;
the processor 610 is further configured to determine K frames of images to be processed from the J frames of images to be processed according to the color difference degrees respectively corresponding to the J frames of images to be processed;
the processor 610 is further configured to determine the back-display image from the K frames of images to be processed according to the detail richness degrees respectively corresponding to the K frames of images to be processed.
With this embodiment of the application, selecting the back-display image from the N frames of images to be processed based on the brightness difference degree, the color difference degree and the detail richness degree improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards, effectively reduces the change in visual effect when the back-display image is replaced with the target shooting image, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
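The cascaded screening can be sketched as follows. brightness_differences, color_differences, and detail_richness are sketched after the corresponding embodiments below; the stage sizes J and K, and the ordering choices (keep the smallest brightness and color deviations, then the richest detail), are illustrative assumptions, since the application leaves them unspecified.

```python
import cv2

def pick_back_display(frames, j=3, k=2):
    # frames: the N candidate BGR frames. Scores are computed once over
    # all N frames, then the candidate set is narrowed stage by stage.
    b = brightness_differences(frames)                  # sketched below
    c = color_differences(frames)                       # sketched below
    grays = [cv2.cvtColor(f, cv2.COLOR_BGR2GRAY) for f in frames]
    d = [detail_richness(g) for g in grays]             # sketched below

    idx = list(range(len(frames)))
    idx = sorted(idx, key=lambda i: b[i])[:j]  # N -> J: smallest brightness deviation
    idx = sorted(idx, key=lambda i: c[i])[:k]  # J -> K: smallest color deviation
    best = max(idx, key=lambda i: d[i])        # K -> 1: richest detail
    return frames[best]
```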
In some embodiments, for each frame of the N frames of images to be processed, the brightness difference degree corresponding to the image to be processed is used to characterize the brightness difference between the image to be processed and the remaining images to be processed in the N frames of images to be processed;
the processor 610 is further configured to obtain the brightness value of the image to be processed;
the processor 610 is further configured to obtain the mean value of the brightness values of the N frames of images to be processed;
the processor 610 is further configured to determine the brightness difference degree corresponding to the image to be processed according to the difference between the brightness value of the image to be processed and the mean value of the brightness values.
In this way, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, the visual effect of a brightness jump between the back-display image and the target shooting image can be effectively reduced, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
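A minimal sketch of the brightness measure, assuming the mean of the Y channel in the YCrCb color space as each frame's brightness value (the application does not fix a particular luminance definition):

```python
import cv2

def brightness_differences(frames):
    # Brightness value of each frame: mean of the Y (luma) channel.
    lums = [cv2.cvtColor(f, cv2.COLOR_BGR2YCrCb)[:, :, 0].mean()
            for f in frames]
    mean_lum = sum(lums) / len(lums)
    # Brightness difference degree: deviation from the N-frame mean.
    return [abs(v - mean_lum) for v in lums]
```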
In some embodiments, for each frame of the N frames of images to be processed, the color difference degree corresponding to the image to be processed is used to characterize the color difference between the image to be processed and the remaining images to be processed in the N frames of images to be processed;
the processor 610 is further configured to obtain the hue value and the saturation of the image to be processed;
the processor 610 is further configured to obtain the hue mean value of the N frames of images to be processed and the saturation mean value of the N frames of images to be processed;
the processor 610 is further configured to determine the color difference degree corresponding to the image to be processed according to the difference between the hue value of the image to be processed and the hue mean value, and the difference between the saturation of the image to be processed and the saturation mean value.
In this way, the similarity between the back-display image displayed first and the target shooting image displayed afterwards can be improved, the visual effect of a color jump between the back-display image and the target shooting image can be effectively reduced, and an excessive difference between the two frames is prevented from affecting the user's judgment of shooting satisfaction.
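A minimal sketch of the color measure, assuming per-frame hue and saturation means in the HSV color space and an equal-weight combination of the two deviations (the weights, and ignoring hue wrap-around, are simplifications made here):

```python
import cv2

def color_differences(frames, w_hue=1.0, w_sat=1.0):
    # Hue and saturation of each frame: channel means in HSV space.
    hs = []
    for f in frames:
        hsv = cv2.cvtColor(f, cv2.COLOR_BGR2HSV)
        hs.append((hsv[:, :, 0].mean(), hsv[:, :, 1].mean()))
    hue_mean = sum(h for h, _ in hs) / len(hs)
    sat_mean = sum(s for _, s in hs) / len(hs)
    # Color difference degree: weighted deviation of each frame's hue
    # and saturation from the N-frame means (hue circularity ignored).
    return [w_hue * abs(h - hue_mean) + w_sat * abs(s - sat_mean)
            for h, s in hs]
```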
In some embodiments, for each frame of the N frames of images to be processed, the detail richness degree corresponding to the image to be processed is used to characterize how rich the details of the image to be processed are;
the processor 610 is further configured to determine, for each pixel point in the image to be processed, the dispersion corresponding to the pixel point according to the gray value of the pixel point and the gray values of a preset number of neighboring pixel points;
the processor 610 is further configured to determine a dispersion mean value according to the dispersions corresponding to the pixel points in the image to be processed, where the dispersion mean value is used to characterize the detail richness degree of the image to be processed.
With this embodiment of the application, selecting the back-display image from the N frames of images to be processed based on the detail richness degree improves the similarity between the back-display image displayed first and the target shooting image displayed afterwards when the back-display image is replaced with the target shooting image, effectively reduces the change in visual effect caused by differences in image detail, and prevents an excessive difference between the two frames from affecting the user's judgment of shooting satisfaction.
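A minimal sketch of the dispersion measure, reading "dispersion against a preset number of neighborhood pixel points" as the squared deviation of each pixel's gray value from its local neighborhood mean (i.e. a local variance); the 3x3 neighborhood size is an assumption:

```python
import cv2
import numpy as np

def detail_richness(gray: np.ndarray, ksize: int = 3) -> float:
    # Per-pixel dispersion: squared deviation of the pixel's gray value
    # from the mean of its ksize x ksize neighborhood.
    g = gray.astype(np.float32)
    local_mean = cv2.blur(g, (ksize, ksize))
    dispersion = (g - local_mean) ** 2
    # Dispersion mean over all pixels characterizes the frame's detail
    # richness degree: sharper, more textured frames score higher.
    return float(dispersion.mean())
```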
It should be understood that, in the embodiment of the present application, the input unit 604 may include a graphics processing unit (Graphics Processing Unit, GPU) 6041 and a microphone 6042, and the graphics processor 6041 processes image data of still pictures or videos obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The display unit 606 may include a display panel 6061, and the display panel 6061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 607 includes at least one of a touch panel 6071 and other input devices 6072. The touch panel 6071 is also called a touch screen. The touch panel 6071 may include two parts: a touch detection device and a touch controller. Other input devices 6072 may include, but are not limited to, a physical keyboard, function keys (such as volume control keys and switch keys), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 609 may be used to store software programs as well as various data. The memory 609 may mainly include a first storage area storing programs or instructions and a second storage area storing data, where the first storage area may store an operating system, and application programs or instructions (such as a sound playing function and an image playing function) required for at least one function. Further, the memory 609 may include volatile memory or nonvolatile memory, or the memory 609 may include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (Read-Only Memory, ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (Random Access Memory, RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synchlink DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 609 in this embodiment of the application includes, but is not limited to, these and any other suitable types of memory.
The processor 610 may include one or more processing units. Optionally, the processor 610 integrates an application processor and a modem processor, where the application processor primarily handles operations involving the operating system, the user interface, application programs, and the like, and the modem processor, such as a baseband processor, primarily handles wireless communication signals. It can be appreciated that the modem processor may alternatively not be integrated into the processor 610.
An embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium. When the program or the instruction is executed by a processor, each process of the foregoing image processing method embodiment is implemented and the same technical effects can be achieved; to avoid repetition, details are not described here again.
The processor is the processor in the electronic device described in the foregoing embodiment. The readable storage medium includes a computer-readable storage medium, such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
An embodiment of the present application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to run a program or an instruction to implement each process of the foregoing image processing method embodiment and achieve the same technical effects; to avoid repetition, details are not described here again.
It should be understood that the chip mentioned in the embodiments of the present application may also be referred to as a system-level chip, a system chip, a chip system, or a system-on-chip chip.
An embodiment of the present application provides a computer program product, where the program product is stored in a storage medium, and the program product is executed by at least one processor to implement each process of the foregoing image processing method embodiment and achieve the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises", "comprising", or any other variation thereof are intended to cover a non-exclusive inclusion, so that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but also other elements not expressly listed, or elements inherent to such a process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatuses in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; functions may also be performed in a substantially simultaneous manner or in the reverse order, depending on the functions involved. For example, the described methods may be performed in an order different from the one described, and steps may also be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or certainly by means of hardware, although in many cases the former is the preferred implementation. Based on such an understanding, the technical solutions of the present application, or the part contributing to the prior art, may essentially be embodied in the form of a computer software product. The computer software product is stored in a storage medium (such as a ROM/RAM, a magnetic disk, or an optical disc) and includes several instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above specific implementations. The above specific implementations are merely illustrative rather than restrictive, and those of ordinary skill in the art, inspired by the present application, may derive many further forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (10)

1. An image processing method, the method comprising:
obtaining M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image;
determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed;
determining a back-display image from the N frames of images to be processed according to the image characteristic information respectively corresponding to the N frames of images to be processed, wherein the image characteristic information comprises at least one of the following: brightness difference degree, color difference degree and detail richness degree;
storing the back-display image to an album;
and in the case of obtaining the target shooting image, replacing the back-display image in the album with the target shooting image.
2. The method according to claim 1, wherein the determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed comprises:
for each frame of the M frames of images to be processed, acquiring an optical flow amplitude corresponding to the image to be processed, wherein the optical flow amplitude is used for characterizing the inter-frame difference degree between the image to be processed and the remaining images to be processed in the M frames of images to be processed;
and determining N frames of images to be processed from the M frames of images to be processed according to the optical flow amplitude corresponding to each frame of images to be processed, wherein the optical flow amplitudes respectively corresponding to the N frames of images to be processed meet the target screening condition.
3. The method according to claim 2, wherein the obtaining, for each frame of the M frames of images to be processed, an optical flow amplitude corresponding to the image to be processed comprises:
for each frame of the M frames of images to be processed, acquiring an optical flow difference value between the image to be processed and each remaining frame of images to be processed;
and acquiring the mean value of the optical flow difference values between the image to be processed and each remaining frame of images to be processed, and taking the mean value as the optical flow amplitude corresponding to the image to be processed.
4. The method of claim 1, wherein the image characteristic information comprises: brightness difference degree, color difference degree and detail richness degree;
the determining a back-display image from the N frames of images to be processed according to the image characteristic information respectively corresponding to the N frames of images to be processed comprises:
determining J frames of images to be processed from the N frames of images to be processed according to the brightness difference degrees respectively corresponding to the N frames of images to be processed;
determining K frames of images to be processed from the J frames of images to be processed according to the color difference degrees respectively corresponding to the J frames of images to be processed;
and determining the back-display image from the K frames of images to be processed according to the detail richness degrees respectively corresponding to the K frames of images to be processed.
5. The method according to claim 1, wherein for each of the N frames of images to be processed, the brightness difference degree corresponding to the image to be processed is used to characterize the brightness difference between the image to be processed and the remaining images to be processed in the N frames of images to be processed; the method further comprises:
acquiring the brightness value of the image to be processed;
acquiring the mean value of the brightness values of the N frames of images to be processed;
and determining the brightness difference degree corresponding to the image to be processed according to the difference between the brightness value of the image to be processed and the mean value of the brightness values.
6. The method according to claim 1, wherein for each of the N frames of images to be processed, the color difference degree corresponding to the image to be processed is used to characterize the color difference between the image to be processed and the remaining images to be processed in the N frames of images to be processed; the method further comprises:
acquiring the hue value and the saturation of the image to be processed;
acquiring the hue mean value of the N frames of images to be processed and the saturation mean value of the N frames of images to be processed;
and determining the color difference degree corresponding to the image to be processed according to the difference between the hue value of the image to be processed and the hue mean value, and the difference between the saturation of the image to be processed and the saturation mean value.
7. The method according to claim 1, wherein for each of the N frames of images to be processed, the detail richness degree corresponding to the image to be processed is used to characterize how rich the details of the image to be processed are; the method further comprises:
for each pixel point in the image to be processed, determining the dispersion corresponding to the pixel point according to the gray value of the pixel point and the gray values of a preset number of neighborhood pixel points; and determining a dispersion mean value according to the dispersions corresponding to the pixel points in the image to be processed, wherein the dispersion mean value is used to characterize the detail richness degree of the image to be processed.
8. An image processing apparatus, characterized in that the apparatus comprises:
an acquisition module, used for obtaining M frames of images to be processed, wherein the M frames of images to be processed are used for fusion processing to obtain a target shooting image;
a processing module, used for determining N frames of images to be processed from the M frames of images to be processed according to the inter-frame difference degrees respectively corresponding to the M frames of images to be processed;
the processing module is further used for determining a back-display image from the N frames of images to be processed according to the image characteristic information respectively corresponding to the N frames of images to be processed, wherein the image characteristic information comprises at least one of the following: brightness difference degree, color difference degree and detail richness degree;
a storage module, used for storing the back-display image to the album;
the storage module is further used for replacing the back-display image in the album with the target shooting image in the case that the target shooting image is obtained.
9. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the image processing method of any of claims 1-7.
10. A readable storage medium, characterized in that the readable storage medium has stored thereon a program or instructions which, when executed by a processor, implement the steps of the image processing method according to any of claims 1-7.
CN202310312173.6A 2023-03-27 2023-03-27 Image processing method, device, electronic equipment and readable storage medium Pending CN116320729A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310312173.6A 2023-03-27 2023-03-27 Image processing method, device, electronic equipment and readable storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310312173.6A 2023-03-27 2023-03-27 Image processing method, device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN116320729A (en) 2023-06-23

Family

ID=86788553

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310312173.6A Pending CN116320729A (en) 2023-03-27 2023-03-27 Image processing method, device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN116320729A (en)

Similar Documents

Publication Publication Date Title
CN112637500B (en) Image processing method and device
CN112911147B (en) Display control method, display control device and electronic equipment
CN112437237B (en) Shooting method and device
CN112508820A (en) Image processing method and device and electronic equipment
CN111835937A (en) Image processing method and device and electronic equipment
CN111818382A (en) Screen recording method and device and electronic equipment
CN108495038B (en) Image processing method, image processing device, storage medium and electronic equipment
CN115439386A (en) Image fusion method and device, electronic equipment and storage medium
US11195247B1 (en) Camera motion aware local tone mapping
CN116320729A (en) Image processing method, device, electronic equipment and readable storage medium
CN112383708B (en) Shooting method and device, electronic equipment and readable storage medium
CN114785957A (en) Shooting method and device thereof
CN113891018A (en) Shooting method and device and electronic equipment
CN114093005A (en) Image processing method and device, electronic equipment and readable storage medium
CN112672056A (en) Image processing method and device
CN113012085A (en) Image processing method and device
CN112446848A (en) Image processing method and device and electronic equipment
CN112367464A (en) Image output method and device and electronic equipment
CN114143448B (en) Shooting method, shooting device, electronic equipment and readable storage medium
CN112367470B (en) Image processing method and device and electronic equipment
CN112637528B (en) Picture processing method and device
CN112367562B (en) Image processing method and device and electronic equipment
CN115797160A (en) Image generation method and device
CN115103119A (en) Shooting method and device and electronic equipment
CN117793513A (en) Video processing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination