CN112199018A - Image processing method and device

Image processing method and device

Info

Publication number
CN112199018A
Authority
CN
China
Prior art keywords
makeup, input, category, materials, user
Prior art date
Legal status
Pending
Application number
CN202011099583.XA
Other languages
Chinese (zh)
Inventor
马兴涛
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN202011099583.XA
Publication of CN112199018A

Classifications

    • G — PHYSICS
    • G06 — COMPUTING; CALCULATING OR COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 — Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845 — Interaction techniques based on graphical user interfaces [GUI] for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G06F3/04842 — Selection of displayed objects or displayed text elements
    • G06F16/00 — Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50 — Information retrieval of still image data
    • G06F16/53 — Querying
    • G06F16/532 — Query formulation, e.g. graphical querying

Abstract

The application discloses an image processing method and device. The method includes: receiving a first input of a user and, in response to the first input, displaying a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library, where an alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; receiving a second input on a makeup category identifier and displaying the makeup materials corresponding to the makeup category identifier selected by the second input; and receiving a third input on a makeup material and processing an image to be processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user match the user's preferences, the user can quickly and accurately find preferred makeup materials, the selection operation is simplified, and makeup efficiency is improved.

Description

Image processing method and device
Technical Field
The present application relates to the field of communications technologies, and in particular, to an image processing method and apparatus.
Background
Images can capture pleasant moments and record everyday life. To improve the appearance of an image, users often use a beautification function to apply virtual makeup while shooting or to retouch the image afterwards, so that the image better meets their expectations.
In the prior art, when makeup is applied, a large number of preset default makeup materials are usually displayed directly for the user to choose from. The user may therefore spend a long time picking a preferred makeup material from the many defaults, and the cumbersome selection operation makes image makeup inefficient.
Disclosure of Invention
The embodiment of the application aims to provide an image processing method and device, which can solve the problems that makeup materials cannot be recommended according to the preference of a user, and the operation of selecting the makeup materials by the user is complicated and the efficiency is low.
In order to solve the technical problem, the present application is implemented as follows:
in a first aspect, an embodiment of the present application provides an image processing method, including:
receiving a first input of a user, and responding to the first input, displaying a makeup category identification corresponding to at least one alternative makeup category in a makeup picture library; wherein the alternative makeup category is a category to which makeup materials in the makeup picture gallery belong, and is determined according to target makeup materials in which a user is interested;
receiving a second input of the makeup category identification, and responding to the second input to display makeup materials corresponding to the makeup category identification selected by the second input;
and receiving a third input of the identification corresponding to the makeup materials, and responding to the third input to process the image to be processed according to the makeup materials selected by the third input.
In a second aspect, an embodiment of the present application provides an image processing apparatus, including:
the first display module is used for receiving a first input of a user and responding to the first input to display a makeup category identifier corresponding to at least one alternative makeup category in the makeup picture library; wherein the alternative makeup category is a category to which makeup materials in the makeup picture gallery belong, and is determined according to target makeup materials in which a user is interested;
the second display module is used for receiving a second input of the makeup category identification, responding to the second input, and displaying the makeup materials corresponding to the makeup category identification selected by the second input;
and the processing module is used for receiving a third input of the identification corresponding to the makeup material and responding to the third input to process the image to be processed according to the makeup material selected by the third input.
In a third aspect, an embodiment of the present application provides an electronic device, which includes a processor, a memory, and a program or instructions stored on the memory and executable on the processor, and when executed by the processor, the program or instructions implement the steps of the method according to the first aspect.
In a fourth aspect, embodiments of the present application provide a readable storage medium, on which a program or instructions are stored, which when executed by a processor implement the steps of the method according to the first aspect.
In a fifth aspect, an embodiment of the present application provides a chip, where the chip includes a processor and a communication interface, where the communication interface is coupled to the processor, and the processor is configured to execute a program or instructions to implement the method according to the first aspect.
In the embodiment of the present application, a first input of a user is received and, in response to the first input, a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library is displayed, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; a second input on a makeup category identifier is received and, in response to the second input, the makeup materials corresponding to the makeup category identifier selected by the second input are displayed; and a third input on an identifier corresponding to a makeup material is received and, in response to the third input, the image to be processed is processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
Drawings
Fig. 1 shows a flowchart of an image processing method in an embodiment of the present application;
FIG. 2 is a schematic diagram illustrating a makeup category display interface in an embodiment of the present application;
FIG. 3 is a schematic diagram illustrating another makeup category display interface in an embodiment of the present application;
FIG. 4 is a schematic view showing a makeup material display interface in the embodiment of the present application;
fig. 5 shows a flowchart of another image processing method in the embodiment of the present application;
FIG. 6 is a schematic diagram illustrating a user entering a makeup application home page in an embodiment of the present application;
FIG. 7 illustrates a schematic view of a makeup replacement display interface in an embodiment of the present application;
FIG. 8 is a schematic view showing a makeup material display interface in the embodiment of the present application;
fig. 9 is a block diagram showing a configuration of an image processing apparatus in an embodiment of the present application;
fig. 10 is a block diagram showing a structure of an electronic device in an embodiment of the present application;
fig. 11 shows a block diagram of an electronic device in an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some, but not all, embodiments of the present application. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first", "second" and the like in the description and claims of the present application are used to distinguish between similar elements and not necessarily to describe a particular sequence or chronological order. It should be understood that terms used in this way are interchangeable under appropriate circumstances, so that the embodiments of the application can be practiced in orders other than those illustrated or described herein. Moreover, "first", "second" and the like are generally used generically and do not limit the number of the objects they qualify; for example, a first object may be one object or more than one object. In addition, "and/or" in the specification and claims denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the objects before and after it.
The image processing method provided by the embodiment of the present application is described in detail below with reference to the accompanying drawings through specific embodiments and application scenarios thereof.
Referring to fig. 1, a flowchart of an image processing method according to an embodiment of the present application is shown, which may specifically include the following steps:
step 101, receiving a first input of a user, and displaying a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library in response to the first input; wherein the alternative makeup category is a category to which makeup materials in the makeup picture gallery belong, and the alternative makeup category is determined according to target makeup materials in which a user is interested.
In the embodiment of the application, after the user enters the makeup changing application, the image to be processed is obtained by shooting or selecting from an album, and the image to be processed is displayed on the display interface.
In an embodiment of the present application, the first input includes: triggering a control on the display interface, pressing a designated key on the terminal, or speaking preset information by voice. After the first input of the user is received, makeup category identifiers are displayed on the display interface; as shown in fig. 2, identifiers of the alternative makeup categories commuter makeup and smoky makeup are displayed on the display interface. Commuter makeup and smoky makeup are makeup categories determined from the target makeup materials in which the user is interested, that is, the makeup categories the user is interested in.
In the embodiment of the application, if the makeup category in which the user is interested is at least one, the makeup category identification corresponding to at least one alternative makeup category can be displayed on the display interface.
In the embodiment of the present application, the user may also slide down the screen or otherwise complete the switching between the makeup category identifier to be selected and the other makeup category identifiers on the display interface, which is not limited herein.
In the embodiment of the present application, the target makeup material that the user is interested in refers to a makeup material that the user has used before or a makeup material that is marked as a favorite.
In the embodiment of the present application, the target makeup materials are time-sensitive. Determining the alternative makeup category according to the target makeup materials in which the user is interested includes: determining the alternative makeup category according to the target makeup materials whose storage time falls within a preset period, where the preset period is a period of a preset length counted back from the current time, for example one day, three days, or five days before the current time.
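The following Kotlin sketch illustrates this time-window rule under simple assumptions; the data class, field names and the three-day default window are hypothetical and not taken from the patent.

```kotlin
// Minimal sketch: keep only the target makeup materials whose save time falls
// within the preset period (here assumed to be three days). Names are illustrative.
data class TargetMaterial(val id: String, val category: String, val savedAtMillis: Long)

fun recentTargetMaterials(
    all: List<TargetMaterial>,
    windowMillis: Long = 3L * 24 * 60 * 60 * 1000,    // assumed preset period: three days
    nowMillis: Long = System.currentTimeMillis()
): List<TargetMaterial> =
    all.filter { nowMillis - it.savedAtMillis <= windowMillis }
```

Only the materials returned by such a filter would then be used to derive the alternative makeup categories.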
In the embodiment of the present application, the makeup picture library stores all makeup materials and makeup categories, including the makeup materials previously downloaded by the user, identifiers of makeup materials that the application has updated but the user has not yet downloaded, and identifiers of the makeup categories.
In the embodiment of the present application, makeup categories are a classification of the makeup materials, and there are far fewer categories than materials. Classifying the materials reduces the number of options: only the makeup categories in which the user is interested are displayed on the display interface, so when looking for a preferred makeup material the user does not have to choose from a large number of materials but only needs to pick the required category from a small number of recommended categories, which saves selection time.
In the embodiment of the present application, when the user wants to change makeup, the makeup category identifiers in which the user is interested are displayed on the display interface, so the user can quickly find an interesting makeup category and save the time spent searching for a preferred category. When the user wants other makeup categories, the user can find them in the makeup picture library by sliding down or triggering the "more" identifier, which improves the efficiency of finding other categories.
And 102, receiving a second input of the makeup category identification, and responding to the second input to display the makeup materials corresponding to the makeup category identification selected by the second input.
In an embodiment of the present application, the second input comprises: triggering the makeup category identification on the display interface, or triggering a preset key on the terminal, or inputting the selected makeup category identification by the user through voice.
In the embodiment of the present application, each makeup category identifier corresponds to a plurality of makeup materials. When the user triggers a makeup category identifier in fig. 2 or fig. 3, some or all of the makeup materials corresponding to that makeup category identifier are displayed.
In the embodiment of the present application, a makeup material may be a makeup material set or a single makeup material. A makeup material set covers makeup for several parts of the face as well as the skin tone; for example, a set A corresponding to the European-American makeup category includes blue eye makeup, long eyelashes, yellow-brown eyebrows, dark purple lip makeup, and a wheat skin tone, while another set B includes green eye makeup, curled eyelashes, golden eyebrows, red lip makeup, and a snow-white skin tone. A single makeup material covers a single part of the face or the skin; for example, a single makeup material C corresponding to the European-American makeup category is green eye makeup, and a single makeup material D is a snow-white skin tone.
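The distinction between a makeup material set and a single makeup material can be captured with a small data model; the Kotlin sketch below is a hypothetical illustration (type names and face parts are assumptions, not the patent's own definitions).

```kotlin
// Minimal sketch of "set" versus "single" makeup materials.
enum class FacePart { EYES, EYELASHES, EYEBROWS, LIPS, SKIN }

sealed class MakeupMaterial {
    // A single material styles exactly one part, e.g. material C: green eye makeup.
    data class Single(val part: FacePart, val style: String) : MakeupMaterial()

    // A set styles several parts at once, e.g. set A in the example above.
    data class Set(val styles: Map<FacePart, String>) : MakeupMaterial()
}

val setA = MakeupMaterial.Set(
    mapOf(
        FacePart.EYES to "blue eye makeup",
        FacePart.EYELASHES to "long eyelashes",
        FacePart.EYEBROWS to "yellow-brown eyebrows",
        FacePart.LIPS to "dark purple lip makeup",
        FacePart.SKIN to "wheat skin tone"
    )
)
val materialC = MakeupMaterial.Single(FacePart.EYES, "green eye makeup")
```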
Referring to fig. 4, the makeup materials displayed on the display interface are arranged according to the target makeup materials the user selected before, for example in descending order of the number of times the user has used them: the most-used makeup material is placed in the first position, the next most-used material in the second position, and so on.
In the embodiment of the present application, a makeup category identifier corresponds both to the makeup materials the user has downloaded historically and to download identifiers of newly updated makeup materials. The makeup materials can then be arranged on the display interface as follows: the materials whose historical use count exceeds a preset count threshold are listed first, in order of use count, followed by the download identifiers of the updated materials, and finally the remaining materials; alternatively, the download identifiers of the updated materials are placed first, followed by the historically downloaded materials in order of use count.
In the embodiment of the present application, arranging the makeup materials according to the user's historical use counts makes it easy for the user to quickly find the desired makeup material.
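A minimal Kotlin sketch of the first ordering described above is given below, assuming each gallery entry records a use count and whether it is merely a download identifier of an updated material; the field names and the threshold value are illustrative assumptions.

```kotlin
// Minimal sketch: frequently used materials first (descending by use count),
// then download identifiers of newly updated materials, then the rest.
data class GalleryEntry(
    val id: String,
    val useCount: Int,               // historical number of times the user applied it
    val isUpdateIdentifier: Boolean  // true = download identifier of an updated, not-yet-downloaded material
)

fun arrangeForDisplay(entries: List<GalleryEntry>, useCountThreshold: Int = 3): List<GalleryEntry> {
    val frequentlyUsed = entries
        .filter { !it.isUpdateIdentifier && it.useCount > useCountThreshold }
        .sortedByDescending { it.useCount }
    val updateIdentifiers = entries.filter { it.isUpdateIdentifier }
    val remaining = entries - frequentlyUsed.toSet() - updateIdentifiers.toSet()
    return frequentlyUsed + updateIdentifiers + remaining
}
```

The alternative ordering mentioned above would simply concatenate the lists in a different order, with the update identifiers first.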
And 103, receiving a third input of the identification corresponding to the makeup material, and processing the image to be processed according to the makeup material selected by the third input in response to the third input.
In an embodiment of the present application, the third input comprises: triggering the makeup material identification on the display interface, or triggering a preset button on the terminal, or inputting the selected makeup material by the user through voice.
In the embodiment of the present application, the image to be processed includes a face image to be processed.
In the embodiment of the present application, when the user selects a makeup material A, the image to be processed can be processed according to makeup material A; when the user then selects another makeup material B, the processing by makeup material A is cancelled and the image to be processed is processed with makeup material B instead.
Referring to fig. 4, the processed image and the makeup materials can be displayed on the same interface, so the user can view the processed image in time and, if satisfied, tap the confirmation key to save the processed image.
In the embodiment of the present application, when the user first selects a makeup material set A, set A is used to process the image to be processed. If the user then selects another set B, the processing by set A is cancelled and the image is processed with set B. If the user instead selects a single makeup material C, material C replaces only the corresponding part of set A; for example, the green eye makeup of material C replaces the eye makeup of set A, while the other parts of set A are kept. When the user first selects a single makeup material C, material C is used to process the image to be processed; if the user then selects a single makeup material D, it is determined whether C and D apply to the same part of the image: if they do, the processing by C is cancelled and replaced by D, and if they do not, the image is processed with C and D at the same time.
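These replacement rules can be summarised in a small Kotlin sketch; the types and the map-based representation of the currently applied makeup are assumptions made for illustration only.

```kotlin
// Minimal sketch of the replacement rules: a set replaces everything, a single
// material replaces only the matching face part and keeps the rest.
enum class Part { EYES, EYELASHES, EYEBROWS, LIPS, SKIN }

sealed class Material {
    data class Single(val part: Part, val style: String) : Material()
    data class FullSet(val styles: Map<Part, String>) : Material()
}

// The makeup currently applied to the image, as a map from face part to style.
typealias AppliedMakeup = Map<Part, String>

fun applySelection(current: AppliedMakeup, selected: Material): AppliedMakeup =
    when (selected) {
        // Selecting a set cancels the previous makeup and applies the whole set.
        is Material.FullSet -> selected.styles
        // Selecting a single material replaces only the matching part, keeping the rest,
        // e.g. material C's green eye makeup replaces the eye makeup of set A.
        is Material.Single -> current + (selected.part to selected.style)
    }
```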
In the embodiment of the application, a user can flexibly select and replace the makeup materials to obtain a satisfactory processed image to be processed.
In the embodiment of the present application, a first input of a user is received and, in response to the first input, a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library is displayed, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; a second input on a makeup category identifier is received and, in response to the second input, the makeup materials corresponding to the makeup category identifier selected by the second input are displayed; and a third input on an identifier corresponding to a makeup material is received and, in response to the third input, the image to be processed is processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
Referring to fig. 5, a flowchart of another image processing method according to the embodiment of the present application is shown, which may specifically include the following steps:
step 201, obtaining the target makeup material.
In the embodiment of the present application, the target makeup materials include: makeup materials the user has marked as favorites and/or makeup materials the user has used historically.
In this embodiment of the present application, step 201 includes: displaying a to-be-selected makeup category identifier; receiving a fourth input of the to-be-selected makeup category identifier and, in response to the fourth input, displaying to-be-selected makeup materials, the to-be-selected makeup materials being the makeup materials corresponding to the to-be-selected makeup category identifier selected by the fourth input; and receiving a fifth input of a to-be-selected makeup material and, in response to the fifth input, determining the to-be-selected makeup material selected by the fifth input as a target makeup material.
The to-be-selected makeup material selected by the fifth input is a makeup material the user marks as a favorite, and the makeup material selected by the subsequent third input is a makeup material the user has actually used.
In the embodiment of the present application, when a user enters the makeup application for the first time, the makeup category identifiers in the makeup picture library are displayed on the display interface, and the user can select at least one makeup category identifier according to his or her own preference. The to-be-selected makeup materials corresponding to the selected makeup category identifier are then displayed on the display interface, the user again selects according to preference, and the selected materials are taken as target makeup materials.
Referring to fig. 6, the user may long-press one of the makeup materials; a selection box pops up on the display interface, and the user can decide whether to take that makeup material as a target makeup material by tapping like or dislike.
In the embodiment of the present application, when it is not the first time the user enters the makeup application, referring to fig. 4, the user can tap a makeup material to change makeup and long-press a makeup material to mark it as a target makeup material, so that the next time the makeup application is used, makeup materials can be recommended according to the selected target makeup materials.
In the embodiment of the present application, the target makeup materials obtained each time are stored in a preset folder, and whenever the alternative makeup category is determined from the target makeup materials, a subset of them can be selected according to their storage time in the preset folder.
In the embodiment of the present application, the target makeup materials are determined by the user's preferences, so makeup materials can be recommended on each use according to the user's previous selections, and the recommendations are updated in time to better match the user's current preferences.
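The two sources of target makeup materials described in step 201 — materials the user marks as favorites (fifth input) and materials the user actually applies (third input) — could be recorded as sketched below; the store, method names and timestamp field are hypothetical.

```kotlin
// Minimal sketch: every favorite or used material is saved with a timestamp so
// that later steps can select target materials by storage time.
data class TargetRecord(val materialId: String, val category: String, val savedAtMillis: Long)

class TargetMaterialStore {
    private val records = mutableListOf<TargetRecord>()

    // Called when the user long-presses a material and taps "like" (fifth input).
    fun markFavorite(materialId: String, category: String) = save(materialId, category)

    // Called after the user applies a material to an image (third input).
    fun recordUsed(materialId: String, category: String) = save(materialId, category)

    private fun save(materialId: String, category: String) {
        records += TargetRecord(materialId, category, System.currentTimeMillis())
    }

    fun all(): List<TargetRecord> = records.toList()
}
```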
Step 202, determining the alternative makeup category according to the target makeup materials.
In this embodiment of the present application, step 202 includes: determining the makeup category to which each target makeup material belongs to obtain reference makeup categories; and determining a reference makeup category whose corresponding ratio is greater than a preset ratio threshold as an alternative makeup category, the ratio being the number of historical makeup materials contained in the reference makeup category divided by the total number of historical makeup materials.
In the embodiment of the present application, when the user selects at least one to-be-selected makeup category, it is not yet certain that the makeup materials of that category in the makeup picture library actually suit the user. Therefore, after the user has selected a number of target makeup materials, the mobile terminal can compute statistics on those target makeup materials to determine the user's real preference among the makeup categories.
In the embodiment of the present application, the target makeup materials are grouped by makeup category, the ratio of each category is computed, and the alternative makeup categories the user is actually interested in are determined from these ratios. For example, suppose the user selects Japanese-Korean makeup, European-American makeup, and commuter makeup among the eight makeup categories of fig. 2 and fig. 3, and of the target makeup materials selected from those categories, 70% belong to Japanese-Korean makeup, 20% to European-American makeup, and 10% to commuter makeup. If the preset ratio threshold is 50%, the alternative makeup category is Japanese-Korean makeup, which the user essentially prefers; if the preset ratio threshold is 15%, the alternative makeup categories are Japanese-Korean makeup and European-American makeup.
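The ratio-threshold rule above amounts to grouping the target materials by category and keeping the categories whose share exceeds the threshold, as in the Kotlin sketch below (function name and default threshold are assumptions).

```kotlin
// Minimal sketch: with shares of 70% / 20% / 10% and a 0.5 threshold this returns
// only Japanese-Korean makeup; with a 0.15 threshold it also returns European-American makeup.
fun alternativeCategories(
    targetMaterialCategories: List<String>,   // one category name per target makeup material
    ratioThreshold: Double = 0.5              // assumed preset ratio threshold
): List<String> {
    if (targetMaterialCategories.isEmpty()) return emptyList()
    val total = targetMaterialCategories.size.toDouble()
    return targetMaterialCategories
        .groupingBy { it }
        .eachCount()
        .filter { (_, count) -> count / total > ratioThreshold }
        .keys
        .toList()
}
```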
In the embodiment of the present application, the alternative makeup category is determined from the target makeup materials the user is interested in or has used, so it reflects the user's preferred makeup categories more faithfully and helps the user quickly find preferred makeup materials by category the next time makeup is applied.
In this embodiment of the application, after step 202, the method further includes displaying the to-be-processed image in the first preview area.
In this embodiment of the present application, before the image to be processed is displayed in the first preview area, the method further includes: receiving a trigger operation in which the user opens the makeup application; after the makeup application is opened, the mobile terminal acquires an image to be processed that is shot or selected from an album and displays the image to be processed 61, referring to fig. 7. When the user triggers the one-key makeup changing in fig. 6, a one-key makeup changing interface is entered; referring to fig. 7, the image to be processed is displayed in the first preview area X, and after the image to be processed has been processed, the processed image is displayed in the second preview area Y.
In the embodiment of the present application, the image to be processed is displayed in the first preview area, which makes it convenient for the user to compare it with the processed image.
In the embodiment of the application, when the user is not satisfied with the makeup changing effect after the previous to-be-processed image is processed, the to-be-processed image in the first preview area can be clicked or double-clicked, and the makeup changing operation can be carried out again, so that the time for the user to change the makeup again can be saved.
In the embodiment of the present application, step 202 further includes displaying at least one preset makeup material for selection by the user.
In the embodiment of the present application, a preset makeup material is a target makeup material whose use count is greater than a preset count, or a target makeup material in an alternative makeup category whose save time falls within a second preset period.
In the embodiment of the present application, referring to fig. 7, the preset makeup materials are displayed in the first preview area. When the user enters the one-key makeup changing interface, a preset makeup material can be selected directly for changing makeup; if the user is satisfied with it, the subsequent steps of opening the makeup category identifiers are unnecessary, and if the user is not satisfied after changing makeup with a preset makeup material, "more" can be tapped to carry out the subsequent steps.
In the embodiment of the present application, when the user enters the makeup application for the first time, the preset makeup material may be the currently most popular makeup material, that is, a makeup material whose download count or use count across all users of the makeup application is greater than a preset count.
When it is not the first time the user enters the makeup application, the preset makeup materials are the makeup materials whose historical use count by the user exceeds the preset count, or the most recently updated makeup materials in the makeup categories the user is interested in.
In the embodiment of the present application, a preset makeup material whose save time falls within the second preset period is the most recently updated makeup material in the target makeup category.
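The two criteria for preset makeup materials — a use count above a preset count, or a recent save/update time within the second preset period for materials in an alternative category — could be combined as sketched below; all names and default values are illustrative assumptions.

```kotlin
// Minimal sketch of selecting the preset makeup materials shown in the first preview area.
data class PresetCandidate(
    val id: String,
    val category: String,
    val useCount: Int,
    val savedAtMillis: Long
)

fun presetMaterials(
    targets: List<PresetCandidate>,
    alternativeCategories: Set<String>,
    presetCount: Int = 5,                              // assumed preset use-count threshold
    secondWindowMillis: Long = 24 * 60 * 60 * 1000L,   // assumed second preset period: one day
    nowMillis: Long = System.currentTimeMillis()
): List<PresetCandidate> =
    targets.filter { c ->
        c.useCount > presetCount ||
            (c.category in alternativeCategories &&
                nowMillis - c.savedAtMillis <= secondWindowMillis)
    }
```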
In the embodiment of the present application, when the user does not like or does not need the preset makeup material displayed in the first preview area, the preset makeup material can be deleted by double-clicking or long-pressing the preset makeup material.
In the embodiment of the present application, the preset makeup materials displayed in the first preview area are determined from the makeup materials the user has used most often or from the makeup categories the user is interested in. Recommending at least one preset makeup material directly in the first preview area lets the user quickly obtain makeup materials that are likely to be needed and saves the time the user spends searching for makeup materials.
Step 203, receiving a first input of a user, and displaying a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library in response to the first input; wherein the alternative makeup category is a category to which makeup materials in the makeup picture gallery belong, and the alternative makeup category is determined according to target makeup materials in which a user is interested.
In the embodiment of the application, when the user is not satisfied with the preset makeup materials, the first input can be sent by triggering the preset identifier, and other makeup categories are selected.
In addition, referring to step 101, the description is omitted here.
And 204, receiving a second input of the makeup category identification, and responding to the second input to display the makeup materials corresponding to the makeup category identification selected by the second input.
Referring to step 102, the detailed description is omitted here.
Step 205, receiving a third input of the identifier corresponding to the makeup material, and in response to the third input, processing the image to be processed according to the makeup material selected by the third input.
Referring to step 103, the description is omitted here.
And step 206, storing the makeup materials corresponding to the identifier selected according to the third input as the target makeup materials.
In the embodiment of the present application, the makeup material selected by the third input is the makeup material the user adopted in the current image processing and therefore reflects the user's preference, so it is saved as a target makeup material, and the alternative makeup categories the user prefers are recommended according to the target makeup materials the next time.
In the embodiment of the present application, the makeup material selected by the third input is stored in the preset folder as a target makeup material, in the same way as the makeup material selected by the fifth input, and both can be used to determine the alternative makeup category.
In the embodiment of the application, the makeup materials used by the user are stored as the target makeup materials, so that the makeup category preferred by the user can be recommended more accurately when the makeup category is recommended.
In this embodiment of the application, after step 206, the method further includes displaying the processed image to be processed in a second preview area.
In the embodiment of the application, referring to fig. 4 and 7, the processed image to be processed is displayed in the second preview area Y, so that a user can browse the processed image to be processed in time.
After the processed image to be processed is displayed in the second preview area, the user may click to save or share the processed image to be processed to perform further operations, which is not limited herein.
In the embodiment of the present application, a first input of a user is received and, in response to the first input, a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library is displayed, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; a second input on a makeup category identifier is received and, in response to the second input, the makeup materials corresponding to the makeup category identifier selected by the second input are displayed; and a third input on an identifier corresponding to a makeup material is received and, in response to the third input, the image to be processed is processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
It should be noted that, in the image processing method provided in the embodiment of the present application, the execution subject may be an image processing apparatus, or a control module for the image processing method in the image processing apparatus. The image processing apparatus provided in the embodiment of the present application is described with an example in which an image processing apparatus executes an image processing method.
Referring to fig. 9, a block diagram of an image processing apparatus 300 according to an embodiment of the present application is shown, which may specifically include:
the first display module 301 is used for receiving a first input of a user and displaying a makeup category identifier corresponding to at least one alternative makeup category in the makeup picture library in response to the first input; wherein the alternative makeup category is a category to which makeup materials in the makeup picture gallery belong, and is determined according to target makeup materials in which a user is interested;
a second display module 302, configured to receive a second input of the makeup category identifier, and in response to the second input, display a makeup material corresponding to the makeup category identifier selected by the second input;
and the processing module 303 is configured to receive a third input of the identifier corresponding to the makeup material, and process the image to be processed according to the makeup material selected by the third input in response to the third input.
Optionally, the method further includes:
the acquisition module is used for acquiring the target makeup materials;
and the determining module is used for determining the alternative makeup category according to the target makeup materials.
The acquisition module includes:
the first display unit is used for displaying a to-be-selected makeup category identifier;
the second display unit is used for receiving a fourth input of the to-be-selected makeup category identifier and, in response to the fourth input, displaying to-be-selected makeup materials; wherein the to-be-selected makeup materials are the makeup materials corresponding to the to-be-selected makeup category identifier selected by the fourth input;
the first determination unit is used for receiving a fifth input of the to-be-selected makeup materials and, in response to the fifth input, determining the to-be-selected makeup material selected by the fifth input as the target makeup material.
Further comprising:
and the storage module is used for saving the makeup materials corresponding to the identifier selected by the third input as the target makeup materials.
The determining module includes:
the second determining unit is used for determining the makeup category to which each target makeup material belongs to obtain a reference makeup category;
a third determining unit, configured to determine the reference makeup category for which the corresponding ratio is greater than the preset ratio threshold as an alternative makeup category; the ratio is a ratio of the number of the historical makeup materials contained in the reference makeup category to the total number of the historical makeup materials.
Further comprising:
the second display module is used for displaying the image to be processed in the first preview area;
and the third display module is used for displaying the processed image to be processed in a second preview area.
Further comprising:
the fourth display module is used for displaying at least one preset makeup material for the user to select;
the preset makeup material is a target makeup material whose use count is greater than a preset count, or a target makeup material in the alternative makeup category whose save time falls within a second preset period.
In the embodiment of the present application, the image processing apparatus receives a first input of a user and, in response to the first input, displays a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; receives a second input on a makeup category identifier and, in response to the second input, displays the makeup materials corresponding to the makeup category identifier selected by the second input; and receives a third input on an identifier corresponding to a makeup material and, in response to the third input, processes the image to be processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
The image processing apparatus in the embodiment of the present application may be an apparatus, or may be a component, an integrated circuit, or a chip in a terminal. The device can be mobile electronic equipment or non-mobile electronic equipment. By way of example, the mobile electronic device may be a mobile phone, a tablet computer, a notebook computer, a palm top computer, a vehicle-mounted electronic device, a wearable device, an ultra-mobile personal computer (UMPC), a netbook or a Personal Digital Assistant (PDA), and the like, and the non-mobile electronic device may be a server, a Network Attached Storage (NAS), a Personal Computer (PC), a Television (TV), a teller machine or a self-service machine, and the like, and the embodiments of the present application are not particularly limited.
The image processing apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system (Android), an iOS operating system, or other possible operating systems, which is not specifically limited in the embodiments of the present application.
The image processing apparatus provided in the embodiment of the present application can implement each process implemented by the method embodiments of fig. 1 to fig. 8, and is not described herein again to avoid repetition.
Optionally, as shown in fig. 10, an electronic device 100 is further provided in this embodiment of the present application, and includes a processor 101, a memory 102, and a program or an instruction stored in the memory 102 and executable on the processor 101, where the program or the instruction is executed by the processor 101 to implement each process of the above-mentioned embodiment of the image processing method, and can achieve the same technical effect, and no further description is provided here to avoid repetition.
It should be noted that the electronic device in the embodiment of the present application includes the mobile electronic device and the non-mobile electronic device described above.
Fig. 11 is a schematic diagram of a hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 200 includes, but is not limited to: a radio frequency unit 201, a network module 202, an audio output unit 203, an input unit 204, a sensor 205, a display unit 206, a user input unit 207, an interface unit 208, a memory 209, and a processor 210.
Those skilled in the art will appreciate that the electronic device 200 may further comprise a power source (e.g., a battery) for supplying power to various components, and the power source may be logically connected to the processor 210 through a power management system, so as to implement functions of managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 11 does not constitute a limitation of the electronic device, and the electronic device may include more or less components than those shown, or combine some components, or arrange different components, and thus, the description is not repeated here.
The input unit 204 is configured to receive a first input of a user, a second input of a makeup category identifier, and a third input of an identifier corresponding to a makeup material.
The processor 210 is configured to: in response to the first input, display a makeup category identifier corresponding to at least one alternative makeup category in the makeup picture library, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; in response to the second input, display the makeup materials corresponding to the makeup category identifier selected by the second input; and in response to the third input, process the image to be processed according to the makeup material selected by the third input.
In the embodiment of the present application, a first input of a user is received and, in response to the first input, a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library is displayed, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; a second input on a makeup category identifier is received and, in response to the second input, the makeup materials corresponding to the makeup category identifier selected by the second input are displayed; and a third input on an identifier corresponding to a makeup material is received and, in response to the third input, the image to be processed is processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
Optionally, the processor 210 is further configured to: obtain the target makeup materials; determine the alternative makeup category according to the target makeup materials; display the image to be processed in the first preview area; after the image to be processed is processed according to the makeup material selected by the third input, display the processed image in the second preview area; and display at least one preset makeup material for the user to select, where the preset makeup material is a target makeup material whose use count is greater than a preset count, or a target makeup material in the alternative makeup category whose save time falls within a second preset period.
In the embodiment of the present application, a first input of a user is received and, in response to the first input, a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library is displayed, where the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested; a second input on a makeup category identifier is received and, in response to the second input, the makeup materials corresponding to the makeup category identifier selected by the second input are displayed; and a third input on an identifier corresponding to a makeup material is received and, in response to the third input, the image to be processed is processed according to the makeup material selected by the third input. In this way, the makeup materials corresponding to the makeup category identifier selected by the user can be guaranteed, to a certain extent, to match the user's preferences, the user can quickly and accurately find makeup materials that suit those preferences, the selection operation is simplified, and makeup efficiency is improved.
It should be understood that in the embodiment of the present application, the input unit 204 may include a Graphics Processing Unit (GPU) 2041 and a microphone 2042, and the graphics processing unit 2041 processes image data of a still picture or a video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The display unit 206 may include a display panel 2061, and the display panel 2061 may be configured in the form of a liquid crystal display, an organic light-emitting diode, or the like. The user input unit 207 includes a touch panel 2071 and other input devices 2072. The touch panel 2071 is also referred to as a touch screen. The touch panel 2071 may include two parts: a touch detection device and a touch controller. Other input devices 2072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein. The memory 209 may be used to store software programs as well as various data, including but not limited to applications and operating systems. The processor 210 may integrate an application processor, which primarily handles the operating system, user interfaces, and applications, and a modem processor, which primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 210.
The embodiment of the present application further provides a readable storage medium, where a program or an instruction is stored on the readable storage medium, and when the program or the instruction is executed by a processor, the program or the instruction implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer readable storage medium, such as a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk, and so on.
The embodiment of the present application further provides a chip, where the chip includes a processor and a communication interface, the communication interface is coupled to the processor, and the processor is configured to execute a program or an instruction to implement each process of the embodiment of the image processing method, and can achieve the same technical effect, and the details are not repeated here to avoid repetition.
It should be understood that the chips mentioned in the embodiments of the present application may also be referred to as system-on-chip, system-on-chip or system-on-chip, etc.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element. Further, it should be noted that the scope of the methods and apparatus of the embodiments of the present application is not limited to performing the functions in the order illustrated or discussed, but may include performing the functions in a substantially simultaneous manner or in a reverse order based on the functions involved, e.g., the methods described may be performed in an order different than that described, and various steps may be added, omitted, or combined. In addition, features described with reference to certain examples may be combined in other examples.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present application may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present application.
While the present embodiments have been described with reference to the accompanying drawings, it is to be understood that the invention is not limited to the precise embodiments described above, which are meant to be illustrative and not restrictive, and that various changes may be made therein by those skilled in the art without departing from the spirit and scope of the invention as defined by the appended claims.

Claims (10)

1. An image processing method, characterized in that the method comprises:
receiving a first input of a user, and in response to the first input, displaying a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library; wherein the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested;
receiving a second input on the makeup category identifier, and in response to the second input, displaying makeup materials corresponding to the makeup category identifier selected by the second input;
and receiving a third input on an identifier corresponding to the makeup materials, and in response to the third input, processing an image to be processed according to the makeup material selected by the third input.
2. The method of claim 1, wherein before the receiving a first input of a user and, in response to the first input, displaying a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library, the method further comprises:
obtaining the target makeup materials;
and determining the alternative makeup category according to the target makeup materials.
3. The method of claim 2, wherein the obtaining the target makeup materials comprises:
displaying an identifier of a makeup category to be selected;
receiving a fourth input on the identifier of the makeup category to be selected, and in response to the fourth input, displaying makeup materials to be selected; wherein the makeup materials to be selected are makeup materials corresponding to the makeup category identifier selected by the fourth input;
and receiving a fifth input on the makeup materials to be selected, and in response to the fifth input, determining the makeup material selected by the fifth input as the target makeup material.
4. The method of claim 2, wherein after the receiving a third input on the identifier corresponding to the makeup materials and, in response to the third input, processing the image to be processed according to the makeup material selected by the third input, the method further comprises:
saving the makeup material corresponding to the identifier selected by the third input as a target makeup material.
5. The method of claim 3 or 4, wherein the determining the alternative makeup category according to the target makeup materials comprises:
determining the makeup category to which each target makeup material belongs, to obtain a reference makeup category;
and determining a reference makeup category whose corresponding ratio is greater than a preset ratio threshold as the alternative makeup category; wherein the ratio is a ratio of the number of historical makeup materials contained in the reference makeup category to the total number of historical makeup materials.
6. An image processing apparatus, characterized in that the apparatus comprises:
a first display module, configured to receive a first input of a user and, in response to the first input, display a makeup category identifier corresponding to at least one alternative makeup category in a makeup picture library; wherein the alternative makeup category is a category to which makeup materials in the makeup picture library belong and is determined according to target makeup materials in which the user is interested;
a second display module, configured to receive a second input on the makeup category identifier and, in response to the second input, display makeup materials corresponding to the makeup category identifier selected by the second input;
and a processing module, configured to receive a third input on an identifier corresponding to the makeup materials and, in response to the third input, process an image to be processed according to the makeup material selected by the third input.
7. The apparatus of claim 6, further comprising:
an obtaining module, configured to obtain the target makeup materials;
and a determining module, configured to determine the alternative makeup category according to the target makeup materials.
8. The apparatus of claim 7, wherein the obtaining module comprises:
a first display unit, configured to display an identifier of a makeup category to be selected;
a second display unit, configured to receive a fourth input on the identifier of the makeup category to be selected and, in response to the fourth input, display makeup materials to be selected; wherein the makeup materials to be selected are makeup materials corresponding to the makeup category identifier selected by the fourth input;
and a first determining unit, configured to receive a fifth input on the makeup materials to be selected and, in response to the fifth input, determine the makeup material selected by the fifth input as the target makeup material.
9. The apparatus of claim 7, further comprising:
and a saving module, configured to save the makeup material corresponding to the identifier selected by the third input as a target makeup material.
10. The apparatus of claim 8 or 9, wherein the determining module comprises:
a second determining unit, configured to determine the makeup category to which each target makeup material belongs, to obtain a reference makeup category;
and a third determining unit, configured to determine a reference makeup category whose corresponding ratio is greater than a preset ratio threshold as the alternative makeup category; wherein the ratio is a ratio of the number of historical makeup materials contained in the reference makeup category to the total number of historical makeup materials.
CN202011099583.XA 2020-10-14 2020-10-14 Image processing method and device Pending CN112199018A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011099583.XA CN112199018A (en) 2020-10-14 2020-10-14 Image processing method and device

Publications (1)

Publication Number Publication Date
CN112199018A true CN112199018A (en) 2021-01-08

Family

ID=74008985

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011099583.XA Pending CN112199018A (en) 2020-10-14 2020-10-14 Image processing method and device

Country Status (1)

Country Link
CN (1) CN112199018A (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170358116A1 (en) * 2016-06-14 2017-12-14 Asustek Computer Inc. Method of establishing virtual makeup data and electronic device using the same
CN108121957A (en) * 2017-12-19 2018-06-05 北京麒麟合盛网络技术有限公司 The method for pushing and device of U.S. face material
CN109272473A (en) * 2018-10-26 2019-01-25 维沃移动通信(杭州)有限公司 A kind of image processing method and mobile terminal

Similar Documents

Publication Publication Date Title
CN109189986B (en) Information recommendation method and device, electronic equipment and readable storage medium
CN112612391B (en) Message processing method and device and electronic equipment
CN111857460A (en) Split screen processing method, split screen processing device, electronic equipment and readable storage medium
CN113067983B (en) Video processing method and device, electronic equipment and storage medium
CN112083854A (en) Application program running method and device
CN113835580A (en) Application icon display method and device, electronic equipment and storage medium
CN113590008A (en) Chat message display method and device and electronic equipment
CN114040248A (en) Video processing method and device and electronic equipment
CN113037925A (en) Information processing method, information processing apparatus, electronic device, and readable storage medium
CN112083863A (en) Image processing method and device, electronic equipment and readable storage medium
CN112199018A (en) Image processing method and device
CN113905125B (en) Video display method and device, electronic equipment and storage medium
CN113271379B (en) Image processing method and device and electronic equipment
CN113779293A (en) Image downloading method, device, electronic equipment and medium
CN112256976B (en) Matching method and related device
CN113835811A (en) Display method and device
CN112818147A (en) Picture processing method, device, equipment and storage medium
CN113282780A (en) Picture management method and device, electronic equipment and readable storage medium
CN112732961A (en) Image classification method and device
CN112818094A (en) Chat content processing method and device and electronic equipment
CN112084151A (en) File processing method and device and electronic equipment
CN111813285B (en) Floating window management method and device, electronic equipment and readable storage medium
CN113691729B (en) Image processing method and device
CN112764632B (en) Image sharing method and device and electronic equipment
CN111813298B (en) Timing method and device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20210108