CN112785488A - Image processing method and device, storage medium and terminal - Google Patents

Image processing method and device, storage medium and terminal

Info

Publication number
CN112785488A
Authority
CN
China
Prior art keywords
image
face image
historical
beautification
historical face
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN201911095013.0A
Other languages
Chinese (zh)
Inventor
刘柳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Original Assignee
Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yulong Computer Telecommunication Scientific Shenzhen Co Ltd
Priority to CN201911095013.0A
Publication of CN112785488A
Legal status: Withdrawn


Classifications

    • G06T3/04

Abstract

The embodiment of the application discloses an image processing method, an image processing device, a storage medium and a terminal. The method comprises the following steps: acquiring a target face image; acquiring image beautification parameters generated based on a historical face image set; and beautifying the target face image based on the image beautification parameters to generate a beautified image corresponding to the target face image. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.

Description

Image processing method and device, storage medium and terminal
Technical Field
The present application relates to the field of computer technologies, and in particular, to an image processing method, an image processing apparatus, a storage medium, and a terminal.
Background
With the rapid popularization of terminals with cameras, such as mobile phones and tablet computers, it has become fashionable for people to record moments of their lives by taking pictures. After taking a selfie, a user can beautify the captured face image according to his or her own preferences to meet personalized requirements.
At present, image beautification is mostly performed with image processing software, in which the user has to adjust each area that needs adjustment one by one; when there are many such areas, completing the beautification takes a large amount of time. This process is cumbersome and time-consuming, which reduces image beautification efficiency.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, a storage medium and a terminal, which can improve the beautifying efficiency of an image after a user takes a picture. The technical scheme is as follows:
in a first aspect, an embodiment of the present application provides an image processing method, including:
acquiring a target face image;
acquiring an image beautification parameter generated based on a historical face image set;
and beautifying the target face image based on the image beautifying parameters to generate a beautified image corresponding to the target face image.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the image acquisition module is used for acquiring a target face image;
the parameter acquisition module is used for acquiring an image beautifying parameter generated based on a historical face image;
and the first image generation module is used for beautifying the target face image based on the image beautifying parameters and generating a beautifying image corresponding to the target face image.
In a third aspect, embodiments of the present application provide a computer storage medium storing a plurality of instructions adapted to be loaded by a processor and to perform the above-mentioned method steps.
In a fourth aspect, an embodiment of the present application provides a terminal, which may include: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the above-mentioned method steps.
The beneficial effects brought by the technical scheme provided by some embodiments of the application at least comprise:
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present application, and for those skilled in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a scene schematic diagram of an implementation scenario provided in an embodiment of the present application;
fig. 2 is a schematic diagram illustrating an effect displayed by an image capturing interface according to an embodiment of the present disclosure;
FIG. 3 is a schematic diagram illustrating an effect of an image beautification interface display according to an embodiment of the present disclosure;
fig. 4 is a schematic flowchart of an image processing method according to an embodiment of the present application;
FIG. 5 is a schematic flowchart of another image processing method provided in the embodiments of the present application;
FIG. 6 is a schematic flowchart of another image processing method provided in the embodiments of the present application;
FIG. 7 is a schematic diagram illustrating an effect of an image manually beautifying interface display according to an embodiment of the present disclosure;
FIG. 8 is a schematic flow chart of image processing provided by an embodiment of the present application;
fig. 9 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of another image processing apparatus provided in an embodiment of the present application;
FIG. 11 is a schematic structural diagram of an image determination module provided in an embodiment of the present application;
FIG. 12 is a schematic structural diagram of another image determination module provided in an embodiment of the present application;
fig. 13 is a schematic structural diagram of a first image generation module according to an embodiment of the present disclosure;
fig. 14 is a schematic structural diagram of a terminal according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more clear, embodiments of the present application will be described in further detail below with reference to the accompanying drawings.
It should be understood that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the application, as detailed in the appended claims.
In the description of the present application, it is to be understood that the terms "first," "second," and the like are used for descriptive purposes only and are not to be construed as indicating or implying relative importance. The specific meaning of the above terms in the present application can be understood in a specific case by those of ordinary skill in the art. Further, in the description of the present application, "a plurality" means two or more unless otherwise specified. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship.
Until now, image beautification has mostly been performed with image processing software. Since such software requires the user to adjust the areas to be adjusted one by one, a great amount of time is needed to finish beautifying an image when there are many such areas, which undoubtedly reduces photographing efficiency. Therefore, the present application provides an image processing method, an image processing apparatus, a storage medium, and a terminal to solve the above-mentioned problems in the related art. In the technical scheme provided by the application, because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished, so the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency. The following exemplary embodiments describe this in detail.
Referring to fig. 1, a scene diagram of an implementation scenario shown in an embodiment of the present application is shown, where the implementation scenario includes a user 110, a user terminal 120, and a camera 130. The user terminal 120 is an electronic device with a network communication function; such electronic devices include, but are not limited to, smart phones, tablet computers, wearable devices, smart home devices, laptop computers, desktop computers, smart cameras, and the like. The user terminal 120 includes one or more processors and memories, and the processor may include one or more processing cores. The processor connects various parts within the entire image processing apparatus using various interfaces and lines, and performs the various functions of the image processing apparatus and processes data by running or executing instructions, programs, code sets, or instruction sets stored in the memory and calling data stored in the memory. Optionally, the processor may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor may integrate one or more of a Central Processing Unit (CPU), a modem, and the like.
The user terminal 120 stores a historical face image set and beautification parameters obtained by analyzing and processing the historical face images. Optionally, the historical face image set includes face images liked by the user 110 and face images adjusted by the user 110; the types of the collected face images may be various, which is not limited herein. An application program with an image processing function is installed in the user terminal 120.
In a possible implementation manner, the user 110 turns on the user terminal 120 and then opens an application program with photographing and image beautification functions installed thereon, so as to enter an image capturing interface, as shown in fig. 2, where the user can capture his or her face image through the camera 130. Specifically, the user can adjust the position of the face relative to the camera, and when the preview reaches a state the user likes, the user can tap the shooting key in fig. 2 to photograph the face image; the face image is collected once shooting is finished. When the user terminal 120 detects that the user 110 has tapped the photographing key, the current face image of the user 110 is displayed on the user terminal 120.
When the user 110 beautifies the currently displayed face image, the user 110 may click a beauty function button in fig. 2 to enter an image beautification interface, or the user terminal 120 may automatically obtain beautification parameters for beautification when detecting the face image.
For example, as shown in fig. 3, the beautification interface includes beautification parameter values and a beautification key. Specifically, after the user terminal 120 detects that the user has triggered the beauty button, it acquires the image beautification parameters through a wired or wireless network; after acquiring them, the user terminal 120 processes the beautification parameters through an internal program and displays them on its interface. For example, the beautification parameters displayed on the terminal interface in fig. 3 include brightness, contrast, saturation, and definition; the acquired image beautification parameters may be only some of these, or may include other image beautification parameters, which is not limited here. The beautification parameters are obtained by processing and analyzing the data information of each historical face image in the historical face image set, and their values can change as the historical face images change; each historical face image is stored by the user terminal 120 according to the user's operations on face images. Optionally, for example, when the user likes a face photo, the user terminal 120 stores the face image liked by the user 110, or it stores a selfie adjusted by the user, which is not limited herein.
When the beautification parameters are displayed on the interface of fig. 3, the user can beautify the face image by tapping the beautification key. After the user 110 triggers the beautification key, the generated image beautification instruction is sent to the user terminal 120; after receiving the instruction, the user terminal 120 analyzes the parameter values of the current face image, and when the analysis is finished it adjusts those parameter values using the acquired beautification parameters; when the adjusted parameter values reach the beautification parameter values, the beautified face image is generated. When the user 110 manually adjusts different areas of the beautified face image to generate an adjusted face image, the user terminal 120 stores the manually adjusted face image.
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
The image processing method provided by the embodiment of the present application will be described in detail below with reference to fig. 4 to 8. The method may be implemented by a computer program and can run on an image processing apparatus based on the von Neumann architecture. The computer program may be integrated into an application or may run as a separate tool-like application. The image processing apparatus in the embodiment of the present application may be a user terminal.
Referring to fig. 4, a flowchart of an image processing method according to an embodiment of the present application is provided. As shown in fig. 4, the method of the embodiment of the present application may include the steps of:
s101, acquiring a target face image;
An image is a picture with a visual effect and is the basis of human vision; it may be a picture on paper, a photograph, or a picture displayed on a television, a projector, or a computer screen. A face image is an image containing various kinds of facial information, including, for example, an eye region, a nose region, a mouth region, and so on. The eye region includes the pupils, eyelashes, eyebrows, and the like.
Generally, the target face image is a face image that the user wants to beautify because it does not match the user's aesthetic preference. The user can beautify it according to its shortcomings, and after the beautification is finished a face image matching the user's aesthetic preference is generated. The specific beautification processing may be one-key beautification performed by the user through image beautification software, or manual adjustment by the user of the unsatisfactory areas in the face image.
The mode of acquiring the target face image can be that a user adopts a user terminal with a camera to shoot to obtain the face image, or the user selects a certain face image shot in advance from an image set in a local image library or selects a face image downloaded from a network terminal, and the like. For the face image, the user may select different modes for obtaining, which is not specifically limited herein.
In a feasible implementation, the user enters the image capturing interface by tapping software with image capturing and image beautification functions installed on the user terminal, adjusts the position of his or her face relative to the camera, and, once the face image reaches a satisfactory state, taps the shooting function key to acquire the face image as the target face image.
In another possible implementation, the user may open the local gallery of the user terminal and select one of the historical images stored there; once the user finishes selecting a face image, it is taken as the target face image.
For convenience of description, the embodiment of the present application is described by taking as an example that software with image capturing and image beautification processing functions is installed on a user terminal, and the user terminal responds to and beautifies a target face image obtained by the software.
For example, the user Xiao Ming wants to take a selfie and beautify the resulting face image. First, Xiao Ming turns on the user terminal, finds the software with image capturing and image beautification functions, and taps to enter it; then, by adjusting the position of the face relative to the camera according to personal preference, Xiao Ming makes a series of expressions to make the face image more interesting. When the face image presented on the user terminal matches Xiao Ming's preference, Xiao Ming can tap the shooting key on the user terminal to acquire the face image. Alternatively, Xiao Ming can tap the local gallery key to enter the local gallery and select one face image from the historical images stored there; once the selection is finished, the target face image is obtained.
S102, acquiring an image beautification parameter generated based on a historical face image set;
The historical face images are face image data stored according to the user's usual aesthetic taste; when multiple pieces of face image data have been stored in this way, a historical face image set is generated. The historical face image set stores photos and videos of people actively collected by the user, photos or videos of people the user has liked on various social media, makeup videos the user has browsed, and personal photos manually adjusted in the past. The image beautification parameters are face image beautification parameters obtained by processing and analyzing each face image in the historical face image set.
Generally, in everyday life a user may, according to his or her aesthetic taste, like a portrait photo or watch a makeup video; when the user terminal detects such behavior, it assumes by default that the photo or makeup video matches the user's current aesthetic taste, so it uploads the photo or makeup video to the server for storage.
Specifically, before the historical face images are analyzed, an image analysis processing model needs to be created first. The image analysis processing model is a mathematical model capable of analyzing images, created based on at least one of a Convolutional Neural Network (CNN) model, a Deep Neural Network (DNN) model, a Recurrent Neural Network (RNN) model, an embedding model, a Long Short-Term Memory (LSTM) model, and a Gradient Boosting Decision Tree (GBDT) model. Then, user preference data such as the glossiness, smoothness, skin color, naturalness of makeup, hair color, and makeup of key parts (including but not limited to the eyes and lips) of the skin in the historical face images are obtained; finally, the model analyzes and processes the obtained data to produce the face image beautification parameters. The face image beautification parameters that can be obtained through the image analysis processing model are shown in Table 1: each beautification name corresponds to one beautification parameter, and the value of the beautification parameter is not fixed and can be obtained anew by analyzing and processing each historical face image. For example, the skin brightness may be 56% or 65%.
TABLE 1
(Table 1 is provided as an image in the original publication; it lists each beautification name together with its corresponding parameter value.)
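To make the analysis step above concrete, the following is a minimal sketch of extracting per-image preference data and collecting it for the whole historical set. It assumes a trained analysis model object exposing a predict method that returns the preference values; the patent names CNN/DNN/RNN/embedding/LSTM/GBDT as candidate model families but does not fix a concrete model or interface, so all names here are illustrative.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class FacePreferenceData:
    gloss: float               # skin glossiness
    smoothness: float          # skin smoothness
    skin_tone: float           # warmth/depth of skin color
    makeup_naturalness: float  # how natural the makeup looks

def extract_preference_data(model, image) -> FacePreferenceData:
    """Run the (hypothetical) analysis model on one historical face image."""
    gloss, smoothness, skin_tone, makeup = model.predict(image)  # assumed interface
    return FacePreferenceData(gloss, smoothness, skin_tone, makeup)

def analyze_history(model, historical_images: List) -> List[FacePreferenceData]:
    """Collect per-image preference data for the whole historical face image set."""
    return [extract_preference_data(model, img) for img in historical_images]
```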
In a possible implementation manner, when the user likes a portrait image he or she is interested in, the user terminal receives the like instruction and then uploads the liked photo to the server for storage through a wireless or wired network; or, when the user watches a makeup video, the user terminal receives the instruction to play the makeup video and then uploads the makeup video to the server for storage through a wireless or wired network. A historical face image set is generated after the storage is finished, analysis processing is then carried out through an internal image processing and analysis program, and the required face image beauty parameter values are generated after the analysis processing is finished.
S103, beautifying the target face image based on the image beautifying parameters to generate a beautified image corresponding to the target face image.
For acquiring the target face image, refer to S101; the beautification parameters are obtained by analyzing and processing the historical face images, as described in S102; and the beautified image is generated by adjusting the parameters of the target face image according to the beautification parameters.
Generally, after the user terminal acquires the target face image, it first analyzes the acquired current face image and its skin state. When the analysis is finished, each parameter value of the current skin state of the target face image is obtained; the beautification parameter values derived from the historical face images are then acquired, and finally each parameter value of the current skin state is adjusted to the corresponding beautification parameter value to generate the beautified image.
For example, when a user wants to beautify a photo after taking a selfie, the user can beautify the face by tapping the one-key beauty button; after the beautification is finished, the selfie has been adjusted to be close to the user's preference.
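As an illustration of the adjustment step described above, here is a minimal sketch that applies a set of beautification parameters to a photo using Pillow's ImageEnhance module. The patent does not specify a library or how percentage-valued parameters map to adjustments, so the use of Pillow and the factor mapping are assumptions; the attribute names follow the brightness, contrast, saturation, and definition parameters mentioned for the beautification interface.

```python
from PIL import Image, ImageEnhance

def apply_beautification(path: str, params: dict) -> Image.Image:
    """Adjust the target face image toward the stored beautification parameters.

    `params` maps attribute names to enhancement factors (1.0 = unchanged);
    the mapping from the patent's percentage values to factors is assumed.
    """
    img = Image.open(path).convert("RGB")
    img = ImageEnhance.Brightness(img).enhance(params.get("brightness", 1.0))
    img = ImageEnhance.Contrast(img).enhance(params.get("contrast", 1.0))
    img = ImageEnhance.Color(img).enhance(params.get("saturation", 1.0))     # saturation
    img = ImageEnhance.Sharpness(img).enhance(params.get("definition", 1.0))
    return img

# Usage: beautified = apply_beautification("selfie.jpg",
#            {"brightness": 1.2, "contrast": 1.1, "saturation": 1.05, "definition": 1.3})
```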
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
Fig. 5 is a schematic flow chart of an image processing method according to an embodiment of the present disclosure. The present embodiment is exemplified by applying the image processing method to a user terminal. The image processing method may include the steps of:
s201, collecting a sample face image, and generating a historical face image set;
The sample face images are face images determined according to the user's aesthetic preference; when multiple face images have been determined in this way, a historical face image set is generated. The historical face image set may contain selfies in which the user wears makeup, face images the user has liked on social networking sites, and makeup videos the user has watched.
Generally, collecting data samples is also called data acquisition. Today, with the rapid development of the Internet industry, data collection is widely applied in the Internet field, and accurately selecting the data samples to be collected has a profound influence on a product. If the collected data samples are not accurate enough, the test results may deviate significantly, causing an inestimable loss to the product. Therefore, the sample data information needs to be collected accurately.
In the embodiment of the application, when the user terminal detects that the user has operated on a face image, it assumes by default that the face image matches the user's aesthetic preference and uploads it to the server for processing and storage. The collected data are therefore face images matching the user's aesthetic preference, and the beautification parameters obtained by analyzing and processing them better satisfy the user's beautification requirements.
S202, acquiring image types corresponding to the historical face images in the historical face image set;
The image type distinguishes and represents images according to the different characteristics reflected in the image information. Image types are distinguished by quantitatively analyzing an image with a computer and classifying the image, or each pixel or area in it, into one of several categories. Image types can specifically include the BMP image format, the PCX image format, the TIFF image format, the GIF image format, the JPEG image format, and the like.
In general, each face image stored in the face image set is collected according to the user's usual aesthetic preference, so the historical face image set includes a plurality of image types. When a face image is analyzed, the type of the face image is first acquired through an acquisition program stored in the server; only after the image types have been acquired can the various types of face images be analyzed and processed.
S203, determining the weight value corresponding to each historical face image based on the corresponding relation between the image type and the image weight;
For example, the weight value of a face image manually adjusted by the user is 0.3, while the weight value of a face image liked by the user is 0.2; the weight value of a face image can be determined based on the size of this value, and the higher the priority of the face image, the higher its weight.
For example, in daily selfie taking, the user manually adjusts a selfie; after the manual adjustment, the user terminal by default assigns that photo a higher weight value. For face images the user merely likes in daily life, the user terminal by default assigns a weight value lower than that of a manually adjusted face image.
In the embodiment of the application, the weight value of each face image can be determined according to its image type in the historical face images and the different weight of each type; face images with larger weight values better match the user's aesthetic preference.
S204, acquiring the current beautification parameters of the historical face images, weighting and summing the current beautification parameters of the historical face images and the weight values of the historical face images, and averaging to generate image beautification parameters;
The skin parameter values of the historical face images in the historical face image set differ. After the weight value of each historical face image is determined as in step S203, each parameter value obtained by analyzing a historical face image is multiplied by that image's weight value to obtain new parameter values; the new values are then summed and divided by the total number of historical face images to obtain the average of each skin parameter over the historical face images. The average obtained in this way can be regarded as the image beautification parameter closest to the user's aesthetic preference.
For example, the user terminal analyzes the skin gloss of the historical face images as 0.36, 0.65, 0.52, 0.46, and 0.66. The first three historical face images were manually adjusted by the user, so their weight values are 0.3; the remaining two are face images liked by the user, so their weight values are 0.2. The gloss parameter of the face images is therefore calculated as: (0.36 · 0.3 + 0.65 · 0.3 + 0.52 · 0.3 + 0.46 · 0.2 + 0.66 · 0.2) / 5, and the result of this calculation is the gloss value among the beautification parameters.
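A minimal sketch of this weighted-sum-then-average computation, following the formula in the example above (the weighted sum of per-image values divided by the number of historical images); the function and variable names are illustrative.

```python
def weighted_parameter(values, weights):
    """Weighted sum of per-image parameter values divided by the image count,
    as in: (0.36*0.3 + 0.65*0.3 + 0.52*0.3 + 0.46*0.2 + 0.66*0.2) / 5."""
    assert len(values) == len(weights)
    return sum(v * w for v, w in zip(values, weights)) / len(values)

gloss = weighted_parameter(
    values=[0.36, 0.65, 0.52, 0.46, 0.66],   # per-image gloss values
    weights=[0.3, 0.3, 0.3, 0.2, 0.2],       # manual adjustments weigh 0.3, likes 0.2
)
print(round(gloss, 4))  # 0.1366
```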
S205, acquiring a target face image;
see S101 for details, which are not described herein.
S206, acquiring an image beautifying parameter generated based on the historical face image set;
see S102 for details, which are not described herein.
S207, acquiring parameters of the target face image;
A parameter is understood to be data that can be used as a reference, also referred to as a reference value; when a problem is studied, it is generally analyzed and solved by obtaining the values of the variables involved and the relations among them. The target face image may be the current face image photographed and collected by the user through a user terminal with a camera, or a face image stored in the user terminal device, and the parameters of the target face image are a set of reference data obtained by the user terminal through analysis by an internal program.
In the embodiment of the application, the user first takes a selfie with a user terminal equipped with a camera, and the user's face image is obtained when the selfie is finished. When the user terminal detects that the face image has been obtained, it calls an internal program to analyze it, and after the analysis it obtains each parameter of the user's current skin state. The obtained target face image parameters may include values such as the skin glossiness, skin smoothness, warmth and depth of skin color, and naturalness of makeup of the face image, and may also include other parameter values, which are not listed one by one.
S208, adjusting the parameters of the target face image to the values corresponding to the image beautification parameters, and generating beautification images corresponding to the target face image;
The skin state parameter values of the target face image are obtained through step S207, and the beautification parameters of the face image are obtained through step S204; the beautification parameter information includes the attribute values corresponding to the image beautification operations. Different image beautification operations adjust different image attribute values, or process the image according to different operation formulas, so that the image achieves the corresponding beautification effect. Image beautification operations include skin smoothing, whitening, blemish removal, color adjustment, face slimming, and the like, and performing different beautification operations requires correspondingly adjusting different attribute values of the target face image; for example, the image attribute values include the brightness, contrast, saturation, and so on of the face image.
In the embodiment of the present application, when the target face image is to be beautified, the beautification of the face image can be completed simply by adjusting the obtained skin-state reference values of the target face image to the face image beautification parameter values. For example, as shown in Table 3, each beautification name corresponds to the person's parameter value before beautification and a parameter value after beautification.
TABLE 3
(Table 3 is provided as an image in the original publication; for each beautification name it lists the parameter value before beautification and the parameter value after beautification.)
The parameters after beautification are the beautification parameters obtained from the historical face image set; for example, the skin brightness is adjusted from 36% to 56% to complete the skin brightness adjustment, and the face definition is adjusted from 78% to 97% to make the face image clearer.
S209, when receiving an editing instruction aiming at the beautified image, editing the beautified image to generate an edited corrected image;
Generally, the beautified image generated after a face image is beautified with the beauty parameters may still not match the user's current aesthetic preference. In that case the user may manually adjust the areas that need adjustment in the face image: as shown in fig. 7, the user can adjust a certain area of the beautified image using the face image area adjustment function keys displayed on the user terminal interface, and after the adjustment is completed can tap the 'save' function key to save the adjusted image.
In a possible implementation, after the user finishes taking a selfie and taps the one-key beauty key, the user may find that the beautified face shape still does not match his or her aesthetic preference; the user can then tap the face-slimming function key in the interface to adjust the face shape until it does, and tap the 'save' function key to save the manually beautified face image.
And S210, storing the edited corrected image into the historical human face image set.
After the user manually adjusts the beautified photo, the user terminal assumes by default that the adjusted photo matches the user's aesthetic taste, and uploads the beautified image adjusted by the user to the historical face images on the server through a wireless or wired connection for storage and processing.
For example, as shown in fig. 8, the core of the embodiment of the present application consists of a data storage module, an algorithm analysis module, and a processing module. The data storage module stores photos and videos of people collected in the user's daily life, photos and videos liked on social media, browsed makeup-related videos, and data manually adjusted during selfie taking. The algorithm analysis module stores the beauty parameters obtained by analyzing and processing the data in the data storage module, including skin glossiness, skin smoothness, warmth and depth of skin color, naturalness of makeup, and the like. The processing module adjusts the current skin state according to the current skin state and the beauty parameters stored in the algorithm analysis module; when the adjusted skin state does not satisfy the user and the user adjusts it manually, the processing module sends the manually adjusted face image to the data storage module for storage. The three modules cooperate with each other to form the embodiment of the application.
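A minimal sketch of how the three cooperating modules described above could be wired together; the class and method names are illustrative, and the analysis step is reduced to a plain average because the patent does not fix concrete interfaces.

```python
class DataStorageModule:
    """Holds liked photos/videos and manually adjusted selfies (as parameter dicts)."""
    def __init__(self):
        self.history = []

    def save(self, face_image_params: dict):
        self.history.append(face_image_params)


class AlgorithmAnalysisModule:
    """Derives beauty parameters from the stored history (placeholder: plain average)."""
    KEYS = ("gloss", "smoothness", "skin_tone", "makeup_naturalness")

    def analyze(self, history):
        return {k: sum(img[k] for img in history) / len(history) for k in self.KEYS}


class ProcessingModule:
    """Adjusts the current skin state using the analyzed beauty parameters."""
    def __init__(self, storage: DataStorageModule, analysis: AlgorithmAnalysisModule):
        self.storage, self.analysis = storage, analysis

    def beautify(self, current_skin_state: dict) -> dict:
        params = self.analysis.analyze(self.storage.history)
        # Replace each known parameter of the current state with the beauty value.
        return {k: params.get(k, v) for k, v in current_skin_state.items()}

    def on_manual_adjustment(self, adjusted_image_params: dict):
        # Manually adjusted results are fed back into storage for future analysis.
        self.storage.save(adjusted_image_params)
```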
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
Fig. 6 is a schematic flow chart of an image processing method according to an embodiment of the present disclosure. The present embodiment is exemplified by applying the image processing method to a user terminal. The image processing method may include the steps of:
s301, collecting sample face images and generating a historical face image set;
specifically, refer to step S201, which is not described herein again.
S302, acquiring the acquisition time indicated by each historical face image in the historical face image set;
The acquisition time indicated by a face image is the time at which the user terminal collected the image according to the user's aesthetic preference. For example, if the user collects a face image at 16:44:58 on October 14, 2019, the time corresponding to that face image is 16:44:58 on October 14, 2019. Each historical face image in the historical face image set corresponds to the time data recorded at collection, as shown for example in Table 2; when the historical face images in the set are analyzed, the acquisition time corresponding to each historical face image is acquired first.
TABLE 2
ID | Image information | Acquisition time | Number of manual adjustments
1 | Face image 1 | 2019.08.20 | 3
2 | Face image 2 | 2019.08.22 | 2
3 | Face image 3 | 2019.08.23 | 1
Further, in the historical face image set, there are face images that have been manually adjusted by the user, and the user may have performed multiple adjustments based on the same face image, for example, in the data information in table 2, the number of times of manual adjustment of the face image 1 is 3, the number of times of manual adjustment of the face image 2 is 2, and the number of times of manual adjustment of the face image 3 is 1, and in the face image analysis task, analysis processing may be performed according to the number of times of manual adjustment of the acquired face image.
S303, determining the priority of each historical face image based on the time length between the acquisition time and the current time;
The face images to be analyzed are determined by prioritizing the face images in the historical face image set and selecting them according to priority; example data are shown in Table 2.
Prioritization sorts the face images according to a certain rule; there are at least the following two priority rules:
Optionally, the acquisition time of each historical face image in the historical face image set is first acquired, then the current time is acquired, and finally the priority is determined according to the difference between the acquisition time and the current time; that is, the historical face image whose acquisition time is closest to the current time has the highest priority, the next closest has the next highest priority, and so on.
Optionally, the number of manual adjustments corresponding to each historical face image is first acquired, and the priority is then determined according to the number of adjustments; that is, the historical face image with the largest number of adjustments has the highest priority, the next largest has the next highest priority, and so on.
For example, the current system time is 09:35:20 on August 28, 2019. If the priority rule is determined according to how close the acquisition time is to the current time, the priority of face image 3 is higher than that of face image 2, which is higher than that of face image 1.
For example, if the priority rule is determined according to the adjustment times, the manual adjustment times of the face image 1 is 3, the manual adjustment times of the face image 2 is 2, and the manual adjustment times of the face image 3 is 1, so that the priority of the face image 1 is higher than that of the face image 2 and is higher than that of the face image 3.
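The two priority rules and the data in Table 2 can be sketched as follows; the dictionary layout and field names are illustrative, not taken from the patent.

```python
from datetime import datetime

history = [
    {"id": 1, "name": "Face image 1", "acquired": datetime(2019, 8, 20), "adjustments": 3},
    {"id": 2, "name": "Face image 2", "acquired": datetime(2019, 8, 22), "adjustments": 2},
    {"id": 3, "name": "Face image 3", "acquired": datetime(2019, 8, 23), "adjustments": 1},
]
now = datetime(2019, 8, 28, 9, 35, 20)

# Rule 1: the closer the acquisition time is to the current time, the higher the
# priority -> Face image 3 > Face image 2 > Face image 1.
by_recency = sorted(history, key=lambda img: now - img["acquired"])

# Rule 2: the more manual adjustments, the higher the priority
# -> Face image 1 > Face image 2 > Face image 3.
by_adjustments = sorted(history, key=lambda img: img["adjustments"], reverse=True)

print([img["name"] for img in by_recency])      # ['Face image 3', 'Face image 2', 'Face image 1']
print([img["name"] for img in by_adjustments])  # ['Face image 1', 'Face image 2', 'Face image 3']
```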
S304, determining a facial image to be analyzed in the historical facial image set based on the high-low order of the priority;
For determining the priority of each historical face image, refer to step S303, which is not repeated here. After the priorities have been determined, a number of face images with the highest priority can be selected according to the priority of each face image and used as the face images to be analyzed. These highest-priority face images best match the user's current aesthetic taste, so the image beautification parameters generated by analyzing and processing them are more accurate.
S305, generating an image beautifying parameter after analyzing and processing the face image to be analyzed;
in general, after determining the facial image to be analyzed based on step S304, the user terminal may perform an analysis process based on the determined facial image to be analyzed, and generate a specific set of image beautification parameters after the analysis process.
In a possible implementation, the user terminal selects a number of face images with the highest priority in order of priority. When the user terminal detects that the selection is complete, it calls the internal face image analysis processing program to perform the analysis; the image beautification parameters can be generated after the analysis is completed, and once generated they are stored by the user terminal so that they can be called the next time the user beautifies a face image.
S306, acquiring a target face image;
see S101 for details, which are not described herein.
S307, acquiring an image beautifying parameter generated based on the historical face image set;
see S102 for details, which are not described herein.
S308, acquiring parameters of the target face image;
see S207 specifically, and will not be described herein.
S309, adjusting the parameters of the target face image to the values corresponding to the image beautification parameters, and generating beautification images corresponding to the target face image;
see S208 for details, which are not described herein.
S310, when an editing instruction for the beautified image is received, editing the beautified image to generate an edited corrected image;
see S209 specifically, and are not described herein again.
S311, storing the edited corrected image into the historical human face image set.
See S210 for details, which are not described herein.
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
The following are embodiments of the apparatus of the present application that may be used to perform embodiments of the method of the present application. For details which are not disclosed in the embodiments of the apparatus of the present application, reference is made to the embodiments of the method of the present application.
Referring to fig. 9, a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present application is shown. The image processing apparatus may be implemented as all or a part of the terminal by software, hardware, or a combination of both. The apparatus 1 comprises an image acquisition module 10, a parameter acquisition module 20 and a first image generation module 30.
The image acquisition module 10 is used for acquiring a target face image;
a parameter obtaining module 20, configured to obtain an image beautification parameter generated based on a historical face image;
and the first image generation module 30 is configured to perform beautification processing on the target face image based on the image beautification parameter, and generate an beautified image corresponding to the target face image.
Optionally, as shown in fig. 10, the apparatus 1 further includes:
a first set generating module 70, configured to collect sample face images and generate a historical face image set;
a type obtaining module 60, configured to obtain an image type corresponding to each historical face image in the historical face image set;
a weight value determining module 50, configured to determine a weight value corresponding to each historical face image based on a correspondence between an image type and an image weight;
and the first parameter generation module 40 is configured to generate an image beautification parameter according to the weight value of each historical face image.
Optionally, the first parameter generating module 40 is specifically configured to:
and acquiring the current beautification parameters of the historical face images, weighting and summing the current beautification parameters of the historical face images and the weight values of the historical face images, and averaging to generate the image beautification parameters.
Optionally, as shown in fig. 10, the apparatus 1 further includes:
a second set generating module 80, configured to collect sample face images and generate a historical face image set;
the image determining module 90 is configured to obtain priorities of the historical face images in the historical face image set, and determine a face image to be analyzed in the historical face image set based on a high-low order of the priorities;
and the second parameter generating module 100 is configured to generate an image beautification parameter after analyzing and processing the face image to be analyzed.
Optionally, as shown in fig. 11, the image determining module 90 includes:
a time acquisition unit 901, configured to acquire acquisition time indicated by each historical face image in the historical face image set;
a first priority determining unit 902, configured to determine a priority of each historical face image based on a time length between the acquisition time and the current time.
Optionally, as shown in fig. 12, the image determining module 90 includes:
a frequency obtaining unit 903, configured to obtain adjustment frequencies of each historical face image in the historical face image set;
a second priority determining unit 904, configured to determine a priority of each historical face image based on the number of times of adjustment of each historical face image.
Optionally, as shown in fig. 13, the first image generating module 30 includes:
a parameter acquiring unit 301, configured to acquire parameters of the target face image;
an image adjusting unit 302, configured to adjust the parameter of the target face image to a value corresponding to the image beautification parameter.
Optionally, as shown in fig. 10, the apparatus 1 further includes:
a second image generating module 110, configured to, when an editing instruction for the beautified image is received, perform editing processing on the beautified image, and generate an edited modified image;
an image saving module 120, configured to save the modified beautified image to the historical face image set.
It should be noted that, when the image processing apparatus provided in the foregoing embodiment executes the image processing method, only the division of the functional modules is illustrated, and in practical applications, the above functions may be distributed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. In addition, the image processing apparatus and the image processing method provided by the above embodiments belong to the same concept, and details of implementation processes thereof are referred to in the method embodiments and are not described herein again.
The above-mentioned serial numbers of the embodiments of the present application are merely for description and do not represent the merits of the embodiments.
In the embodiment of the application, an image processing apparatus obtains a target face image, then obtains image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is finished. Because the image beautification parameters are generated in advance from the historical face image set and stored, the beautification parameters can be acquired automatically when the user beautifies an image after photographing is finished. In this way the image beautification processing can be completed quickly, which saves time and improves image beautification efficiency.
The present application also provides a computer readable medium, on which program instructions are stored, which program instructions, when executed by a processor, implement the image processing method provided by the above-mentioned various method embodiments.
The present application also provides a computer program product comprising instructions which, when run on a computer, cause the computer to perform the image processing method as described in the various method embodiments above.
Please refer to fig. 14, which provides a schematic structural diagram of a terminal according to an embodiment of the present application. As shown in fig. 14, the terminal 1000 can include: at least one processor 1001, at least one network interface 1004, a user interface 1003, memory 1005, at least one communication bus 1002.
Wherein a communication bus 1002 is used to enable connective communication between these components.
The user interface 1003 may include a Display screen (Display) and a Camera (Camera), and the optional user interface 1003 may also include a standard wired interface and a wireless interface.
The network interface 1004 may optionally include a standard wired interface, a wireless interface (e.g., WI-FI interface), among others.
Processor 1001 may include one or more processing cores, among other things. The processor 1001 interfaces various components throughout the electronic device 1000 using various interfaces and lines to perform various functions of the electronic device 1000 and to process data by executing or executing instructions, programs, code sets, or instruction sets stored in the memory 1005 and invoking data stored in the memory 1005. Alternatively, the processor 1001 may be implemented in at least one hardware form of Digital Signal Processing (DSP), Field-Programmable Gate Array (FPGA), and Programmable Logic Array (PLA). The processor 1001 may integrate one or more of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. Wherein, the CPU mainly processes an operating system, a user interface, an application program and the like; the GPU is used for rendering and drawing the content required to be displayed by the display screen; the modem is used to handle wireless communications. It is understood that the modem may not be integrated into the processor 1001, but may be implemented by a single chip.
The memory 1005 may include a random access memory (RAM) or a read-only memory (ROM). Optionally, the memory 1005 includes a non-transitory computer-readable medium. The memory 1005 may be used to store instructions, programs, code, code sets, or instruction sets. The memory 1005 may include a program storage area and a data storage area, where the program storage area may store instructions for implementing an operating system, instructions for at least one function (such as a touch function, a sound playing function, or an image playing function), instructions for implementing the foregoing method embodiments, and the like; the data storage area may store the data and the like involved in the foregoing method embodiments. Optionally, the memory 1005 may alternatively be at least one storage device located remotely from the processor 1001. As shown in fig. 14, the memory 1005, as a computer storage medium, may include an operating system, a network communication module, a user interface module, and an image processing application program.
In the terminal 1000 shown in fig. 14, the user interface 1003 is mainly used to provide an input interface for the user and to acquire data entered by the user, and the processor 1001 may be configured to invoke the image processing application stored in the memory 1005 and specifically perform the following operations:
acquiring a target face image;
acquiring an image beautification parameter generated based on a historical face image set;
and beautifying the target face image based on the image beautifying parameters to generate a beautified image corresponding to the target face image.
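For illustration only, the following is a minimal Python sketch of these three operations. The class FaceImage, the parameter names (brightness, smoothing, face_slimming) and the dictionary that stands in for terminal storage are assumptions of this sketch, not details given by the present disclosure.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class FaceImage:
    pixels: List[int]                                        # placeholder pixel data
    params: Dict[str, float] = field(default_factory=dict)   # current adjustable parameters

def get_stored_beautification_params(store: Dict[str, float]) -> Dict[str, float]:
    """Return the beautification parameters that were generated in advance
    from the historical face image set and kept in terminal storage."""
    return dict(store)

def beautify(target: FaceImage, beautification: Dict[str, float]) -> FaceImage:
    """Apply the stored beautification parameters to the target face image
    and return the beautified image."""
    result = FaceImage(pixels=list(target.pixels), params=dict(target.params))
    result.params.update(beautification)   # set each parameter to its stored value
    return result

# Example run: the stored values are purely illustrative.
stored = {"brightness": 0.62, "smoothing": 0.35, "face_slimming": 0.20}
target = FaceImage(pixels=[0] * 16, params={"brightness": 0.50, "smoothing": 0.10})
beautified = beautify(target, get_stored_beautification_params(stored))
print(beautified.params)
```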
In one embodiment, before the target face image is acquired, the processor 1001 further performs the following operations:
collecting a sample face image to generate a historical face image set;
acquiring the image type corresponding to each historical face image in the historical face image set;
determining a weight value corresponding to each historical face image based on the correspondence between image types and image weights;
and generating the image beautification parameters according to the weight value of each historical face image.
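As a non-limiting sketch of how an image type might map to an image weight, the following Python example assumes a small, hypothetical correspondence table (selfie, group_photo, other); the actual image types and weight values are not specified by the disclosure.

```python
from typing import Dict, List

# Hypothetical correspondence between image types and image weights; the actual
# types and values would be configured on the terminal and are not given here.
TYPE_WEIGHTS: Dict[str, float] = {
    "selfie": 1.0,
    "group_photo": 0.6,
    "other": 0.3,
}

def weight_of(image_type: str) -> float:
    """Determine the weight value of a historical face image from its image type."""
    return TYPE_WEIGHTS.get(image_type, TYPE_WEIGHTS["other"])

historical_types: List[str] = ["selfie", "selfie", "group_photo"]
weights = [weight_of(t) for t in historical_types]
print(weights)   # [1.0, 1.0, 0.6]
```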
In one embodiment, when generating the image beautification parameters according to the weight value of each historical face image, the processor 1001 specifically performs the following operation:
acquiring the current beautification parameters of each historical face image, weighting those parameters by the weight value of the corresponding image, summing them, and averaging the result to generate the image beautification parameters.
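The weighted summation and averaging can be sketched as follows. The disclosure does not state whether the average divides by the number of images or by the sum of the weights; this sketch divides by the number of images, and all parameter names and values are illustrative only.

```python
from typing import Dict, List

def generate_beautification_params(
    current_params: List[Dict[str, float]],   # current beautification parameters per historical image
    weights: List[float],                      # weight value per historical image (same order)
) -> Dict[str, float]:
    """Weight each image's parameters by its weight value, sum them,
    and average to obtain the image beautification parameters."""
    n = len(current_params)
    result: Dict[str, float] = {}
    for params, w in zip(current_params, weights):
        for name, value in params.items():
            result[name] = result.get(name, 0.0) + w * value
    return {name: total / n for name, total in result.items()}

history = [
    {"brightness": 0.7, "smoothing": 0.4},
    {"brightness": 0.5, "smoothing": 0.3},
    {"brightness": 0.6, "smoothing": 0.5},
]
print(generate_beautification_params(history, [1.0, 1.0, 0.6]))
```

Dividing by the sum of the weights instead would turn this into a conventional weighted mean; either reading is compatible with the wording above.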
In one embodiment, before the target face image is acquired, the processor 1001 further performs the following operations:
collecting a sample face image to generate a historical face image set;
acquiring the priority of each historical face image in the historical face image set, and determining the face image to be analyzed in the historical face image set based on the descending order of the priorities;
and analyzing the face image to be analyzed to generate the image beautification parameters.
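A minimal sketch of this priority-based variant follows. The disclosure does not say how many high-priority images are taken or what the analysis computes; this sketch takes the top_k images by priority (an assumed cut-off) and simply averages their beautification parameters as a stand-in analysis step.

```python
from typing import Dict, List, Tuple

def select_images_to_analyze(
    images_with_priority: List[Tuple[float, Dict[str, float]]],  # (priority, beautification params)
    top_k: int = 3,
) -> List[Dict[str, float]]:
    """Order the historical face images from high to low priority and take
    the ones to analyze (here: the top_k entries)."""
    ranked = sorted(images_with_priority, key=lambda item: item[0], reverse=True)
    return [params for _, params in ranked[:top_k]]

def analyze(selected: List[Dict[str, float]]) -> Dict[str, float]:
    """Stand-in analysis: average the parameters of the selected images."""
    result: Dict[str, float] = {}
    for params in selected:
        for name, value in params.items():
            result[name] = result.get(name, 0.0) + value / len(selected)
    return result

history = [(0.9, {"brightness": 0.6}), (0.2, {"brightness": 0.4}), (0.7, {"brightness": 0.7})]
print(analyze(select_images_to_analyze(history, top_k=2)))
```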
In one embodiment, when acquiring the priority of each historical face image in the historical face image set, the processor 1001 specifically performs the following operations:
acquiring the acquisition time indicated by each historical face image in the historical face image set;
and determining the priority of each historical face image based on the length of time between its acquisition time and the current time.
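One way to turn the elapsed time since acquisition into a priority is sketched below. A natural reading is that more recently acquired images receive higher priority; the specific mapping used here (1 / (1 + elapsed hours)) is an assumption of this example and is not given by the disclosure.

```python
from datetime import datetime, timedelta
from typing import List

def recency_priority(acquisition_time: datetime, now: datetime) -> float:
    """Shorter elapsed time since acquisition yields a higher priority."""
    elapsed_hours = max((now - acquisition_time).total_seconds() / 3600.0, 0.0)
    return 1.0 / (1.0 + elapsed_hours)

now = datetime(2019, 11, 11, 12, 0)
times: List[datetime] = [now - timedelta(hours=1), now - timedelta(days=30)]
print([round(recency_priority(t, now), 4) for t in times])
```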
In one embodiment, when acquiring the priority of each historical face image in the historical face image set, the processor 1001 specifically performs the following operations:
acquiring the number of times each historical face image in the historical face image set has been adjusted;
and determining the priority of each historical face image based on its number of adjustments.
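Similarly, the adjustment count can be mapped to a priority. A natural reading is that images the user has adjusted more often rank higher; the normalization by the maximum count used below is an assumption of this sketch.

```python
from typing import List

def adjustment_priority(adjustment_count: int, max_count: int) -> float:
    """More user adjustments yield a higher priority (normalized to [0, 1])."""
    return adjustment_count / max_count if max_count > 0 else 0.0

counts: List[int] = [5, 1, 0, 3]
max_count = max(counts)
print([adjustment_priority(c, max_count) for c in counts])
```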
In one embodiment, when performing the beautification processing on the target face image based on the image beautification parameters, the processor 1001 specifically performs the following operations:
acquiring parameters of the target face image;
and adjusting the parameters of the target face image to the values corresponding to the image beautification parameters.
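A sketch of these two operations follows, with the target face image reduced to a dictionary of named parameters; which parameters exist and how they are stored on the terminal are assumptions of the example.

```python
from typing import Dict

def acquire_parameters(target: Dict[str, float]) -> Dict[str, float]:
    """Read the current adjustable parameters of the target face image
    (represented here simply as a dict of named values)."""
    return dict(target)

def adjust_to_beautification(target: Dict[str, float],
                             beautification: Dict[str, float]) -> Dict[str, float]:
    """Set each parameter of the target face image to the value given by the
    image beautification parameters; parameters not covered stay unchanged."""
    adjusted = acquire_parameters(target)
    for name, value in beautification.items():
        adjusted[name] = value
    return adjusted

target_params = {"brightness": 0.50, "smoothing": 0.10, "contrast": 0.45}
print(adjust_to_beautification(target_params, {"brightness": 0.62, "smoothing": 0.35}))
```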
In one embodiment, after generating the beautified image corresponding to the target face image, the processor 1001 further performs the following operations:
when an editing instruction for the beautified image is received, editing the beautified image to generate a corrected image;
and storing the corrected image into the historical face image set.
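A sketch of this feedback step: the corrected image (represented here only by its beautification parameters) is appended to the historical face image set so that subsequently regenerated beautification parameters can follow the user's latest edits; when and how regeneration is triggered is not specified by the disclosure.

```python
from typing import Dict, List

HistoricalSet = List[Dict[str, float]]   # each entry: the beautification parameters of one image

def store_corrected_image(history: HistoricalSet,
                          corrected_params: Dict[str, float]) -> HistoricalSet:
    """Append the parameters of the user-corrected image to the historical
    face image set for use in later parameter generation."""
    history.append(dict(corrected_params))
    return history

history: HistoricalSet = [{"brightness": 0.60}, {"brightness": 0.65}]
store_corrected_image(history, {"brightness": 0.70, "smoothing": 0.40})
print(len(history))   # 3
```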
In the embodiments of the present application, the image processing apparatus acquires a target face image, then acquires image beautification parameters generated based on a historical face image set, and finally performs beautification processing on the target face image based on the image beautification parameters, generating a beautified image corresponding to the target face image once the processing is completed. Because the image beautification parameters are generated in advance from the historical face image set and stored, they can be obtained automatically when the user beautifies an image after photographing. This completes the image beautification processing quickly, saves time, and improves image beautification efficiency.
It will be understood by those skilled in the art that all or part of the procedures of the methods of the above embodiments may be implemented by a computer program, which may be stored in a computer-readable storage medium and which, when executed, may include the procedures of the above method embodiments. The storage medium may be a magnetic disk, an optical disc, a read-only memory, or a random access memory.
The above disclosure is only of the preferred embodiments of the present application and is not intended to limit the scope of the present application; the present application is therefore not limited thereto, and all equivalent variations and modifications made in accordance with the present application still fall within its scope.

Claims (18)

1. An image processing method, characterized in that the method comprises:
acquiring a target face image;
acquiring an image beautification parameter generated based on a historical face image set;
and beautifying the target face image based on the image beautifying parameters to generate a beautified image corresponding to the target face image.
2. The method of claim 1, wherein before the obtaining the target face image, further comprising:
collecting a sample face image to generate a historical face image set;
acquiring image types corresponding to all historical face images in the historical face image set;
determining a weight value corresponding to each historical face image based on the corresponding relation between the image type and the image weight;
and generating an image beautifying parameter according to the weight value of each historical face image.
3. The method of claim 2, wherein generating an image beautification parameter according to the weight value of each historical face image comprises:
acquiring the current beautification parameters of each historical face image, weighting those parameters by the weight value of the corresponding image, summing them, and averaging the result to generate the image beautification parameters.
4. The method of claim 1, wherein before the obtaining the target face image, further comprising:
collecting a sample face image to generate a historical face image set;
acquiring the priority of each historical face image in the historical face image set, and determining a face image to be analyzed in the historical face image set based on the descending order of the priorities;
and analyzing the face image to be analyzed to generate an image beautifying parameter.
5. The method according to claim 4, wherein the obtaining the priority of each historical face image in the historical face image set comprises:
acquiring the acquisition time indicated by each historical face image in the historical face image set;
and determining the priority of each historical face image based on the length of time between its acquisition time and the current time.
6. The method according to claim 4, wherein the obtaining the priority of each historical face image in the historical face image set comprises:
acquiring the number of times each historical face image in the historical face image set has been adjusted;
and determining the priority of each historical face image based on its number of adjustments.
7. The method of claim 1, wherein the beautifying the target face image based on the image beautification parameters comprises:
acquiring parameters of the target face image;
and adjusting the parameters of the target face image to the values corresponding to the image beautification parameters.
8. The method of claim 1, wherein after generating the beautified image corresponding to the target face image, further comprising:
when an editing instruction for the beautified image is received, editing the beautified image to generate a corrected image;
and storing the corrected image into the historical face image set.
9. An image processing apparatus, characterized in that the apparatus comprises:
the image acquisition module is used for acquiring a target face image;
the parameter acquisition module is used for acquiring an image beautification parameter generated based on a historical face image set;
and the first image generation module is used for beautifying the target face image based on the image beautifying parameters and generating a beautifying image corresponding to the target face image.
10. The apparatus of claim 9, further comprising:
the first set generation module is used for acquiring sample face images and generating a historical face image set;
the type acquisition module is used for acquiring the image types corresponding to the historical face images in the historical face image set;
the weight value determining module is used for determining the weight value corresponding to each historical face image based on the corresponding relation between the image type and the image weight;
and the first parameter generation module is used for generating an image beautification parameter according to the weight value of each historical face image.
11. The apparatus of claim 10, wherein the first parameter generation module is specifically configured to:
acquire the current beautification parameters of each historical face image, weight those parameters by the weight value of the corresponding image, sum them, and average the result to generate the image beautification parameters.
12. The apparatus of claim 9, further comprising:
the second set generation module is used for acquiring sample face images and generating a historical face image set;
the image determining module is used for acquiring the priority of each historical face image in the historical face image set and determining a face image to be analyzed in the historical face image set based on the descending order of the priorities;
and the second parameter generation module is used for generating an image beautification parameter after analyzing and processing the face image to be analyzed.
13. The apparatus of claim 12, wherein the image determination module comprises:
the time acquisition unit is used for acquiring the acquisition time indicated by each historical face image in the historical face image set;
and the first priority determining unit is used for determining the priority of each historical face image based on the length of time between its acquisition time and the current time.
14. The apparatus of claim 12, wherein the image determination module comprises:
the number obtaining unit is used for obtaining the number of times each historical face image in the historical face image set has been adjusted;
and the second priority determining unit is used for determining the priority of each historical face image based on its number of adjustments.
15. The apparatus of claim 9, wherein the first image generation module comprises:
the parameter acquisition unit is used for acquiring parameters of the target face image;
and the image adjusting unit is used for adjusting the parameters of the target face image to the values corresponding to the image beautification parameters.
16. The apparatus of claim 9, further comprising:
the second image generation module is used for editing the beautified image to generate a corrected image when an editing instruction for the beautified image is received;
and the image storage module is used for storing the corrected image into the historical face image set.
17. A computer storage medium, characterized in that it stores a plurality of instructions adapted to be loaded by a processor and to carry out the method steps according to any one of claims 1 to 8.
18. A terminal, comprising: a processor and a memory; wherein the memory stores a computer program adapted to be loaded by the processor and to perform the method steps of any of claims 1 to 8.
Application CN201911095013.0A — Priority date: 2019-11-11 — Filing date: 2019-11-11 — Title: Image processing method and device, storage medium and terminal — Legal status: Withdrawn — Publication: CN112785488A (en)

Priority Applications (1)

Application Number: CN201911095013.0A — Priority Date: 2019-11-11 — Filing Date: 2019-11-11 — Title: Image processing method and device, storage medium and terminal

Applications Claiming Priority (1)

Application Number: CN201911095013.0A — Priority Date: 2019-11-11 — Filing Date: 2019-11-11 — Title: Image processing method and device, storage medium and terminal

Publications (1)

Publication Number: CN112785488A — Publication Date: 2021-05-11

Family ID: 75749741

Family Applications (1)

Application Number: CN201911095013.0A (en) — Title: Image processing method and device, storage medium and terminal

Country Status (1)

Country: CN (1) — Link: CN112785488A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113344812A (en) * 2021-05-31 2021-09-03 维沃移动通信(杭州)有限公司 Image processing method and device and electronic equipment
CN113766123A (en) * 2021-08-26 2021-12-07 深圳市有方科技股份有限公司 Photographing beautifying method and terminal
CN115239576A (en) * 2022-06-15 2022-10-25 荣耀终端有限公司 Photo optimization method, electronic device and storage medium

Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107622478A (en) * 2017-09-04 2018-01-23 维沃移动通信有限公司 A kind of image processing method, mobile terminal and computer-readable recording medium
CN107862274A (en) * 2017-10-31 2018-03-30 广东欧珀移动通信有限公司 U.S. face method, apparatus, electronic equipment and computer-readable recording medium
CN107862653A (en) * 2017-11-30 2018-03-30 广东欧珀移动通信有限公司 Method for displaying image, device, storage medium and electronic equipment
CN107995415A (en) * 2017-11-09 2018-05-04 深圳市金立通信设备有限公司 A kind of image processing method, terminal and computer-readable medium
CN109035180A (en) * 2018-09-27 2018-12-18 广州酷狗计算机科技有限公司 Video broadcasting method, device, equipment and storage medium
CN109167914A (en) * 2018-09-25 2019-01-08 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN109255761A (en) * 2018-08-23 2019-01-22 北京金山安全软件有限公司 Image processing method and device and electronic equipment
CN109446993A (en) * 2018-10-30 2019-03-08 维沃移动通信有限公司 A kind of image processing method and mobile terminal
WO2019100766A1 (en) * 2017-11-22 2019-05-31 格力电器(武汉)有限公司 Image processing method and apparatus, electronic device and storage medium
WO2019109805A1 (en) * 2017-12-06 2019-06-13 Oppo广东移动通信有限公司 Method and device for processing image

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication — Application publication date: 20210511