CN111324274A - Virtual makeup trial method, device, equipment and storage medium - Google Patents
- Publication number
- CN111324274A (application number CN201811525712.XA)
- Authority
- CN
- China
- Prior art keywords
- makeup
- virtual
- operation information
- type
- trial
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G06F3/0488 — Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/011 — Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/04845 — Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
- G06T19/00 — Manipulating 3D models or images for computer graphics
- G06T3/04 — Context-preserving transformations, e.g. by using an importance map

(All classes fall under G—Physics; G06F—Electric digital data processing; G06T—Image data processing or generation, in general.)
Abstract
The application provides a virtual makeup trial method, device, equipment and storage medium, wherein the method comprises the following steps: acquiring operation information of a user touching the virtual fitting three-dimensional model; and if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquiring a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing the corresponding operation according to the function instruction. By touching the three-dimensional model on the display screen in the virtual fitting room, the user can start the virtual makeup trial function, so that the user can conveniently and quickly enter virtual makeup trial from virtual fitting, which effectively improves the user experience.
Description
Technical Field
The application relates to the technical field of electronic commerce, in particular to a virtual makeup trying method, device, equipment and storage medium.
Background
Selecting suitable clothes is an indispensable part of daily life for many women, and with the development of computer simulation, methods of virtual fitting using image processing technology have begun to appear.
However, prior-art fitting methods yield a low order conversion rate, so how to effectively improve the order conversion rate has become an urgent technical problem.
Disclosure of Invention
The application provides a virtual makeup trial method, a virtual makeup trial device, virtual makeup trial equipment and a virtual makeup trial storage medium, which aim to overcome the defects of low order conversion rate and the like in the prior art.
The application provides a virtual makeup trying method in a first aspect, which comprises the following steps:
acquiring operation information of a user touching the virtual fitting three-dimensional model;
and if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquiring a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing the corresponding operation according to the function instruction.
In a second aspect, the present application provides a virtual makeup trial device, comprising:
the acquisition module is used for acquiring operation information of a user touching the three-dimensional model of the virtual fitting;
and the processing module is used for, if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquiring a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing the corresponding operation according to the function instruction.
A third aspect of the present application provides a computer device comprising: at least one processor and memory;
the memory stores a computer program; the at least one processor executes the computer program stored by the memory to implement the method provided by the first aspect.
A fourth aspect of the present application provides a computer-readable storage medium having stored thereon a computer program which, when executed, implements the method provided by the first aspect.
The virtual makeup trial method, device, equipment, and storage medium can start the virtual makeup trial function by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can conveniently and quickly enter the virtual makeup trial function from virtual fitting, effectively improving user experience.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings needed to be used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present application, and for those skilled in the art, other drawings can be obtained according to these drawings without inventive exercise.
Fig. 1 is a schematic flow chart of a virtual makeup trying method according to an embodiment of the present disclosure;
FIG. 2 is a schematic flow chart illustrating a virtual makeup trying method according to another embodiment of the present disclosure;
FIG. 3 is a schematic diagram of a virtual makeup system according to an embodiment of the present disclosure;
fig. 4 is a schematic structural diagram of a virtual makeup trial device according to an embodiment of the present disclosure;
fig. 5 is a schematic structural diagram of a computer device according to an embodiment of the present application.
With the above figures, there are shown specific embodiments of the present application, which will be described in more detail below. These drawings and written description are not intended to limit the scope of the disclosed concepts in any way, but rather to illustrate the concepts of the disclosure to those skilled in the art by reference to specific embodiments.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present application clearer, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms referred to in this application are explained first:
virtual fitting: the method is a technical application for realizing the effect of changing the clothes and checking the clothes without taking off the clothes of a user through a virtual technical means.
Virtual makeup trial: the method is a technical means for carrying out virtual makeup by utilizing an image processing technology and enabling a user to check the makeup effect. For example, the virtual makeup trial is realized in a non-contact simulation mode by analyzing the collected images, detecting facial images of a makeup trial user according to the feature points of the human face, selecting a proper image containing the human face as a makeup trial image, and then performing image processing on the makeup trial image according to the actual operation and the processing parameters based on the virtual makeup. In the embodiment of the application, the virtual makeup may be AR (augmented reality) makeup.
The virtual makeup trial method provided by the embodiments of the application is suitable for a virtual fitting system with a virtual fitting function, and the virtual fitting system may include computer equipment such as a mobile phone, desktop computer, notebook computer, tablet computer, or server (with a display terminal). The user enters the virtual fitting room through the computer equipment, and a three-dimensional fitting model is shown on the display screen. The user clicks a part of the three-dimensional model to enter the corresponding virtual makeup trial interface and perform the virtual makeup trial. For example, clicking the eyebrows of the three-dimensional model may open the virtual eyebrow-drawing interface. The specific correspondence between parts of the three-dimensional model and makeup trial types can be set according to actual requirements.
Furthermore, the terms "first", "second", etc. are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. In the description of the following examples, "plurality" means two or more unless specifically limited otherwise.
The following specific embodiments may be combined with each other, and details of the same or similar concepts or processes may not be repeated in some embodiments. Embodiments of the present application will be described below with reference to the accompanying drawings.
Example one
This embodiment provides a virtual makeup trial method for quickly entering the virtual makeup trial function from within a virtual fitting room. The execution subject of this embodiment is a virtual makeup trial device, which can be arranged in computer equipment.
As shown in fig. 1, a schematic flow chart of a virtual makeup trying method provided in this embodiment is shown, where the method includes:
Step 101: acquiring operation information of the user touching the virtual fitting three-dimensional model.
Specifically, after the user enters the virtual fitting room, the three-dimensional model is displayed on the display screen. The computer equipment acquires the user's head image through a camera and synthesizes it with the original three-dimensional model, so that the head of the three-dimensional model becomes the user's head. The computer equipment may also acquire related information such as the user's weight and height, or the user may actively input this information, so that the three-dimensional model conforms more accurately to the user's actual body shape. The specific virtual fitting operation is prior art and is not described again here.
The user can touch different parts of the three-dimensional model on the display screen, and the computer equipment can acquire operation information of the user, such as clicking operation, sliding operation and the like.
Step 102: if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquiring a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing the corresponding operation according to the function instruction.
Specifically, after the operation information of the user's touch is acquired, whether to start the virtual makeup trial can be judged from it. The operation information may include an operation type, such as a click operation or a slide operation. It may further include position information, such as a click position or touch position. By comparing this position information with the position information of the three-dimensional model, the part of the model that the user clicked or touched can be determined. The correspondence between the parts of the three-dimensional model and the operation types and purposes can be pre-configured, so that once the user's click or touch position is determined, the user's operation purpose is known. For example, the user clicking the eyebrow, eye, cheek, or lip part of the three-dimensional model indicates that the purpose of the operation is virtual makeup trial. Alternatively, a click on any part of the face may indicate that the purpose is virtual makeup trial. How to confirm that the user's operation information corresponds to starting the virtual makeup trial can be set according to actual requirements; this embodiment is not limited.
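The position check described above amounts to a hit test of the click position against pre-configured regions of the three-dimensional model. The sketch below is one plausible implementation; the region names, coordinates, and function names are illustrative assumptions, not part of the patent:

```python
# Hypothetical sketch of the pre-configured position-to-region correspondence.
# Regions are axis-aligned bounding boxes in screen coordinates (assumed values).
FACE_REGIONS = {
    "eyebrow": (120, 80, 200, 110),   # (x_min, y_min, x_max, y_max)
    "eye":     (110, 111, 210, 150),
    "cheek":   (100, 151, 220, 200),
    "lip":     (140, 201, 190, 230),
}

def region_at(x, y):
    """Return the face region containing the click position, or None."""
    for name, (x0, y0, x1, y1) in FACE_REGIONS.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

def starts_makeup_trial(operation):
    """A click on any configured face region is taken to start the trial."""
    if operation.get("type") != "click":
        return False
    return region_at(*operation["position"]) is not None
```

In practice the hit test would run against the projected 3D mesh rather than fixed boxes; the lookup structure above only illustrates the pre-configured correspondence the text describes.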
Optionally, the makeup trial type to be started may be an eyebrow type, eye type, cheek type, lip type, or others. The specific makeup trial types can be set according to actual needs; this embodiment is not limited.
After it is confirmed according to the user's operation information that the virtual makeup trial should be started, a function instruction corresponding to the virtual makeup trial can be acquired according to the operation information, and the corresponding operation executed according to the function instruction to start the virtual makeup trial function.
Optionally, starting the virtual makeup trial function may mean entering a virtual makeup trial initial page where the user selects a specific makeup trial type, or determining the makeup trial type from the click position in the operation information and directly entering the makeup trial interface for that type. This can be set according to actual requirements; this embodiment is not limited. For example, clicking anywhere on the three-dimensional model may open the virtual makeup trial initial page, or a click position corresponding to the eyebrow part of the model may directly open the eyebrow-type virtual makeup trial interface.
According to the virtual makeup trial method provided by this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can conveniently and quickly enter the virtual makeup trial function from virtual fitting, effectively improving user experience.
Example two
The present embodiment further supplements the method provided in the first embodiment.
Fig. 2 is a schematic flow chart of the virtual makeup trying method provided in this embodiment.
As an implementable manner, on the basis of the first embodiment, optionally, the operation information includes click position information, and the step 102 specifically includes:
Step 1021: determining, according to the click position information, the makeup trial type for which the virtual makeup trial is started corresponding to the operation information.
Specifically, the operation information includes click position information; that is, the user's operation is a click on a certain part of the display screen. After the click position information is acquired, the makeup trial type started corresponding to the user's operation information can be determined from it. The computer equipment is pre-configured with the correspondence between the position information of each part of the three-dimensional model and the makeup trial types. The click position can be compared with the position information of the three-dimensional model to determine which part the user clicked, and thus the makeup trial type corresponding to the operation information.
It is understood that different parts of the three-dimensional model can be associated with other functions, not limited to the virtual makeup trial function; for example, the feet of the three-dimensional model may be associated with a shoe try-on function, making it convenient for the user to enter that function, and so on. This embodiment is not limited.
Step 1022: acquiring a function instruction for starting the virtual makeup trial corresponding to the makeup trial type according to the operation information, and executing the corresponding operation according to the function instruction.
Specifically, after the makeup trying type corresponding to the user operation is determined, a function instruction for starting the virtual makeup trying corresponding to the makeup trying type can be obtained according to the operation information of the user, and corresponding operation is executed according to the function instruction.
Illustratively, if the makeup trial type is the eyebrow type, a function instruction for starting the eyebrow pencil function is acquired, and the eyebrow-pencil virtual makeup trial interface is entered, so that the user can select the desired eyebrow shape, color, and so on there. The specific virtual makeup trial interface can be set according to actual requirements; this embodiment is not limited.
Optionally, the makeup trial types include an eyebrow type, an eye type, a cheek type, a lip type, and others.
Optionally, according to the functional instruction, performing corresponding operations, including:
starting and displaying the virtual makeup trial function corresponding to the makeup trial type.
Illustratively, the virtual makeup trial function of the eye type is started and the eye-type makeup trial interface is displayed; the user can select cosmetic contact lenses, eyeliner, false eyelashes, mascara, and so on, apply trial makeup to the eyes, and view the makeup trial effect. The specific trial operations and effect display are not limited.
Optionally, step 1022 specifically includes:
if the makeup trial type corresponding to the operation information is determined to be the eyebrow type according to the click position information, acquiring an eyebrow pencil function instruction for starting the virtual makeup trial, and executing the corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the eye type according to the click position information, acquiring an eye makeup function instruction for starting the virtual makeup trial, and executing the corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the cheek type according to the click position information, acquiring a blush function instruction for starting the virtual makeup trial, and executing the corresponding operation according to the function instruction;
and if the makeup trial type corresponding to the operation information is determined to be the lip type according to the click position information, acquiring a lipstick function instruction for starting the virtual makeup trial, and executing the corresponding operation according to the function instruction.
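The four branches above reduce to a lookup from makeup trial type to function instruction. A minimal sketch, with instruction names invented for illustration (the patent does not specify how instructions are encoded):

```python
# Assumed dispatch table from makeup-trial type to the function instruction
# that opens the corresponding trial interface.
FUNCTION_INSTRUCTIONS = {
    "eyebrow": "open_eyebrow_pencil",
    "eye":     "open_eye_makeup",
    "cheek":   "open_blush",
    "lip":     "open_lipstick",
}

def function_instruction_for(trial_type):
    # Fall back to a generic initial page for unmapped types ("others").
    return FUNCTION_INSTRUCTIONS.get(trial_type, "open_trial_home")
```

A table keeps the type-to-function correspondence configurable, matching the text's point that the correspondence can be set according to actual requirements.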
Optionally, other makeup-type functionality may also be included.
As another implementable manner, on the basis of the first embodiment, optionally, the method may further include:
Step 2011: acquiring a makeup trial instruction of the user, where the makeup trial instruction includes a makeup trial type and makeup trial content.
Specifically, after entering the virtual makeup trial interface, the user may select the corresponding makeup trial content. For example, under the eyebrow type, the user may select the eyebrow shape, eyebrow pencil color, and so on; under the eye type, cosmetic contact lenses, eyeliner, eye shadow, eyelashes, and so on; under the cheek type, a blush color; under the lip type, a lipstick color, and so on. After the user makes a selection, the computer equipment can acquire the user's makeup trial instruction, which includes the makeup trial type and the makeup trial content.
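As described, the makeup trial instruction carries two pieces of data: the type and the content selected for that type. A minimal sketch of such an instruction; the class and field names are assumptions for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class MakeupTrialInstruction:
    """Assumed shape of the instruction acquired from the user's selections."""
    trial_type: str                 # e.g. "eyebrow", "eye", "cheek", "lip"
    content: dict = field(default_factory=dict)  # per-type selections

# Example: the user picked a lipstick color on the lip-type interface.
instruction = MakeupTrialInstruction(
    trial_type="lip",
    content={"lipstick_color": "#B03050"},
)
```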
Step 2012: after the user's makeup trial instruction is acquired, displaying the makeup trial effect on the corresponding part of the three-dimensional model according to the makeup trial type and makeup trial content included in the instruction.
By showing the makeup trial effect on the three-dimensional model, the user can view the fitting effect and the makeup trial effect in combination, conveniently learning which clothing suits them and which makeup suits that clothing. This further improves user experience and can effectively increase the number of products users add to cart and the number of orders.
Optionally, the makeup trial effect may instead be displayed on the virtual makeup trial interface rather than on the virtual fitting three-dimensional model; the specific display mode can be set according to actual requirements.
As another implementable manner, on the basis of the first embodiment, optionally, the method further includes:
and step 202, if the operation information is confirmed to correspond to the amplified three-dimensional model, amplifying the three-dimensional model according to the operation information.
Specifically, because the virtual fitting three-dimensional model is usually a full-body model, the user's operation may also be a sliding operation, such as a zoom-in gesture. The zoom-in gesture can be used to locate the head of the model, so that the user can click the corresponding part of the head for the makeup trial.
Optionally, the operation information may include touch trajectory start point information and touch trajectory end point information, the touch trajectory start point information may include start point positions of two fingers in the zoom-in gesture, and the touch trajectory end point information may include end point positions of two finger sliding trajectories in the zoom-in gesture.
Step 202 may specifically include:
step 2021, if it is determined that the operation information corresponds to the enlarged three-dimensional model, enlarging and displaying a portion of the three-dimensional model corresponding to the operation information according to the touch trajectory start point information and the touch trajectory end point information.
Specifically, after the touch trajectory start point information and the touch trajectory end point information of the user are acquired, the part of the three-dimensional model corresponding to the operation information may be displayed in an enlarged manner according to the touch trajectory start point information and the touch trajectory end point information.
Illustratively, when the starting point of the user's zoom-in gesture is at the head of the three-dimensional model, the head is enlarged and displayed. The specific magnification ratio can be determined from the distance between the end points and the start points of the zoom-in gesture. For example, the distance between the two start points and the distance between the two end points of the gesture are a first distance and a second distance, respectively; a target distance value can be calculated from the first and second distances, and the magnification ratio determined according to the target value. After enlargement, the enlarged part is displayed in the middle of the display screen. This is only an exemplary illustration and not a limitation; any other implementable manner of enlargement may be adopted and is not described again here.
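The patent leaves the exact calculation of the target value open. One plausible choice, sketched below purely as an assumption, is to take the ratio of the second (end) finger distance to the first (start) finger distance, clamped to a maximum:

```python
import math

def pinch_scale(start_a, start_b, end_a, end_b, max_scale=4.0):
    """Magnification ratio for a two-finger zoom-in gesture.

    start_a/start_b: start positions of the two fingers (the first distance).
    end_a/end_b: end positions of the two finger trajectories (the second
    distance). The ratio of the two distances is one possible target value;
    max_scale is an assumed clamp, not specified by the patent.
    """
    d_start = math.dist(start_a, start_b)
    d_end = math.dist(end_a, end_b)
    if d_start == 0:
        return 1.0  # degenerate gesture: leave the model unscaled
    return min(d_end / d_start, max_scale)
```

With this choice, spreading the fingers from 10 px apart to 20 px apart doubles the displayed size of the touched part.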
In some embodiments, optionally, as shown in fig. 3, an architecture diagram of the virtual makeup trial system provided in this embodiment is illustrated. The user enters the virtual fitting room and sees the display of the three-dimensional model; the model has clickable areas, and the user clicks them to enter the corresponding virtual makeup trial function.
It should be noted that the respective implementable modes in the present embodiment may be implemented individually, or may be implemented in combination in any combination without conflict, and the present application is not limited thereto.
According to the virtual makeup trial method provided by this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can conveniently and quickly enter the virtual makeup trial function from virtual fitting, effectively improving user experience. By showing the makeup trial effect on the three-dimensional model, the user can view the fitting effect and the makeup trial effect in combination, conveniently learning which clothing suits them and which makeup suits that clothing, further improving user experience and effectively increasing the number of products added to cart and the number of orders. In addition, the head of the model can be located through the zoom-in gesture, making it convenient for the user to click the corresponding part of the head for the makeup trial.
EXAMPLE III
This embodiment provides a virtual makeup trial device for performing the method of the first embodiment.
As shown in fig. 4, a schematic structural diagram of the virtual makeup trying device provided in this embodiment is shown. The virtual makeup trial device 30 includes an acquisition module 31 and a processing module 32.
The obtaining module 31 is configured to obtain operation information of the user touching the three-dimensional model of the virtual fitting; the processing module 32 is configured to, if it is determined that the operation information corresponds to the start of the virtual makeup trial, obtain a function instruction corresponding to the start of the virtual makeup trial according to the operation information, and execute a corresponding operation according to the function instruction.
The specific manner in which the respective modules perform operations has been described in detail in relation to the apparatus in this embodiment, and will not be elaborated upon here.
According to the virtual makeup trial device provided by this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can conveniently and quickly enter the virtual makeup trial function from virtual fitting, effectively improving user experience.
Example IV
The present embodiment further supplements the description of the apparatus provided in the third embodiment.
As one implementable manner, on the basis of the third embodiment, optionally, the operation information includes click position information; the processing module is specifically configured to:
determine, according to the click position information, a makeup trial type for starting the virtual makeup trial corresponding to the operation information;
and acquire, according to the operation information, a function instruction for starting the virtual makeup trial corresponding to the makeup trial type, and execute a corresponding operation according to the function instruction.
Optionally, the processing module is specifically configured to:
start a virtual makeup trial function corresponding to the makeup trial type, and display it.
Optionally, the processing module is specifically configured to:
if the makeup trial type corresponding to the operation information is determined to be the eyebrow type according to the click position information, acquire an eyebrow pencil function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the eye type according to the click position information, acquire an eye makeup function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the cheek type according to the click position information, acquire a blush function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
and if the makeup trial type corresponding to the operation information is determined to be the lip type according to the click position information, acquire a lipstick function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction.
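The four-branch dispatch above amounts to mapping click position information to a makeup trial type, and the type to a function instruction. A minimal sketch, assuming normalized vertical coordinates on the model's head; the region bounds and instruction names are hypothetical, as the patent does not specify them:

```python
# Hypothetical vertical bands of the model's head in normalized coordinates
# (0.0 = top of head, 1.0 = chin). The bounds are assumptions for this sketch.
REGIONS = {
    "eyebrow": (0.20, 0.30),
    "eye":     (0.30, 0.42),
    "cheek":   (0.48, 0.62),
    "lip":     (0.65, 0.75),
}

# Makeup trial type -> function instruction that starts the corresponding trial.
TYPE_TO_INSTRUCTION = {
    "eyebrow": "eyebrow_pencil",
    "eye":     "eye_makeup",
    "cheek":   "blush",
    "lip":     "lipstick",
}

def resolve_instruction(click_y):
    """Map click position information to the function instruction to start."""
    for makeup_type, (top, bottom) in REGIONS.items():
        if top <= click_y < bottom:
            return TYPE_TO_INSTRUCTION[makeup_type]
    return None  # click outside every makeup region: start nothing
```

A click at 0.25 would thus start the eyebrow pencil function, while a click on the neck or shoulders resolves to no instruction.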
As another implementable manner, on the basis of the third embodiment, optionally, the acquisition module is further configured to acquire a makeup trial instruction of the user, where the makeup trial instruction comprises a makeup trial type and makeup trial content;
and the processing module is further configured to display a makeup trial effect on the corresponding part of the three-dimensional model according to the makeup trial type and the makeup trial content.
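A makeup trial instruction carrying a type and content can then be applied to the corresponding part of the model, as in this sketch; the dictionary-based model and the part names are assumptions of the illustration, not details from the patent:

```python
# Hypothetical mapping from makeup trial type to the model part on which the
# effect is displayed; the dictionary-based "model" is an assumption.
TYPE_TO_PART = {"eyebrow": "brows", "eye": "eyes", "cheek": "cheeks", "lip": "lips"}

def apply_makeup_instruction(model, makeup_type, content):
    """Record the makeup trial content on the corresponding part of the model,
    so that the renderer can display the effect there."""
    part = TYPE_TO_PART[makeup_type]
    model.setdefault("makeup", {})[part] = content
    return model
```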
As another implementable manner, on the basis of the third embodiment, optionally, the processing module is further configured to:
if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, enlarge the three-dimensional model according to the operation information.
Optionally, the operation information includes touch track start point information and touch track end point information; the processing module is specifically configured to:
if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, enlarge and display the part of the three-dimensional model corresponding to the operation information according to the touch track start point information and the touch track end point information.
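The enlargement step can be sketched as computing, from the touch track start and end points, the rectangle of the model view to magnify and the scale that fits it to the screen. The coordinate conventions and the uniform-scale choice are assumptions of this illustration:

```python
def zoom_region(start, end, view_size):
    """From the touch track start and end points, compute the sub-rectangle of
    the model view to enlarge, plus the uniform scale that fits it on screen."""
    x0, y0 = min(start[0], end[0]), min(start[1], end[1])
    x1, y1 = max(start[0], end[0]), max(start[1], end[1])
    width, height = max(x1 - x0, 1), max(y1 - y0, 1)  # guard against zero size
    scale = min(view_size[0] / width, view_size[1] / height)
    return (x0, y0, x1, y1), scale
```

For example, a track from (10, 10) to (110, 60) on a 400-by-200 view selects that 100-by-50 rectangle, such as the model's head, and magnifies it fourfold.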
The specific manner in which the respective modules perform operations has been described in detail in the embodiments of the related method, and will not be elaborated upon here.
It should be noted that the implementable manners in this embodiment may be implemented individually or combined in any manner without conflict; the present application is not limited in this respect.
According to the virtual makeup trial device of this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can quickly and conveniently enter the virtual makeup trial from the virtual fitting, which effectively improves the user experience. By displaying the makeup trial effect on the three-dimensional model, the user can view the fitting effect and the makeup trial effect together and conveniently learn which clothing suits him or her and which makeup suits that clothing; this further improves the user experience and can effectively increase the number of products the user adds to the cart and the number of orders placed. The head of the model can be located through a magnifying gesture, so that the user can conveniently click the corresponding part of the head to try on makeup.
Example V
The present embodiment provides a computer device for performing the virtual makeup trial method provided by the above embodiments.
Fig. 5 is a schematic structural diagram of the computer device provided in this embodiment. The computer device 50 includes: at least one processor 51 and memory 52;
the memory stores a computer program; at least one processor executes the computer program stored in the memory to implement the methods provided by the above-described embodiments.
According to the computer device of this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can quickly and conveniently enter the virtual makeup trial from the virtual fitting, which effectively improves the user experience. By displaying the makeup trial effect on the three-dimensional model, the user can view the fitting effect and the makeup trial effect together and conveniently learn which clothing suits him or her and which makeup suits that clothing; this further improves the user experience and can effectively increase the number of products the user adds to the cart and the number of orders placed. The head of the model can be located through a magnifying gesture, so that the user can conveniently click the corresponding part of the head to try on makeup.
Example VI
The present embodiment provides a computer-readable storage medium, in which a computer program is stored, and when the computer program is executed, the method provided by any one of the above embodiments is implemented.
According to the computer-readable storage medium of this embodiment, the virtual makeup trial function can be started by touching the three-dimensional model on the display screen in the virtual fitting room, so that the user can quickly and conveniently enter the virtual makeup trial from the virtual fitting, which effectively improves the user experience. By displaying the makeup trial effect on the three-dimensional model, the user can view the fitting effect and the makeup trial effect together and conveniently learn which clothing suits him or her and which makeup suits that clothing; this further improves the user experience and can effectively increase the number of products the user adds to the cart and the number of orders placed. The head of the model can be located through a magnifying gesture, so that the user can conveniently click the corresponding part of the head to try on makeup.
In the several embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only one kind of logical division, and other divisions are possible in practice; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be an indirect coupling or communication connection through interfaces, devices, or units, and may be electrical, mechanical, or in another form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, or in a form of hardware plus a software functional unit.
An integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions that enable a computer device (which may be a personal computer, a server, or a network device) or a processor to perform some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes any medium capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
It is obvious to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be performed by different functional modules according to needs, that is, the internal structure of the device is divided into different functional modules to perform all or part of the above described functions. For the specific working process of the device described above, reference may be made to the corresponding process in the foregoing method embodiment, which is not described herein again.
Finally, it should be noted that: the above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some or all of the technical features may be equivalently replaced; and the modifications or the substitutions do not make the essence of the corresponding technical solutions depart from the scope of the technical solutions of the embodiments of the present application.
Claims (17)
1. A virtual makeup trial method, comprising:
acquiring operation information of a user touching the three-dimensional model of the virtual fitting;
and if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquiring a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing a corresponding operation according to the function instruction.
2. The method of claim 1, wherein the operation information includes click position information;
the acquiring, if it is confirmed that the operation information corresponds to starting the virtual makeup trial, a function instruction corresponding to starting the virtual makeup trial according to the operation information, and executing a corresponding operation according to the function instruction comprises:
determining, according to the click position information, a makeup trial type for starting the virtual makeup trial corresponding to the operation information;
and acquiring, according to the operation information, a function instruction for starting the virtual makeup trial corresponding to the makeup trial type, and executing a corresponding operation according to the function instruction.
3. The method of claim 2, wherein the makeup trial types comprise an eyebrow type, an eye type, a cheek type, and a lip type.
4. The method of claim 2, wherein the executing a corresponding operation according to the function instruction comprises:
starting a virtual makeup trial function corresponding to the makeup trial type, and displaying it.
5. The method according to claim 2, wherein the acquiring, according to the operation information, a function instruction for starting the virtual makeup trial corresponding to the makeup trial type, and executing a corresponding operation according to the function instruction comprises:
if the makeup trial type corresponding to the operation information is determined to be the eyebrow type according to the click position information, acquiring an eyebrow pencil function instruction for starting the virtual makeup trial, and executing a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the eye type according to the click position information, acquiring an eye makeup function instruction for starting the virtual makeup trial, and executing a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the cheek type according to the click position information, acquiring a blush function instruction for starting the virtual makeup trial, and executing a corresponding operation according to the function instruction;
and if the makeup trial type corresponding to the operation information is determined to be the lip type according to the click position information, acquiring a lipstick function instruction for starting the virtual makeup trial, and executing a corresponding operation according to the function instruction.
6. The method of claim 1, further comprising:
acquiring a makeup trial instruction of a user, wherein the makeup trial instruction comprises a makeup trial type and makeup trial content;
and displaying a makeup trial effect on a corresponding part of the three-dimensional model according to the makeup trial type and the makeup trial content.
7. The method of claim 1, wherein if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, the three-dimensional model is enlarged according to the operation information.
8. The method of claim 7, wherein the operation information comprises touch track start point information and touch track end point information;
the enlarging the three-dimensional model according to the operation information, if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, comprises:
if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, enlarging and displaying a part of the three-dimensional model corresponding to the operation information according to the touch track start point information and the touch track end point information.
9. A virtual makeup trial device, comprising:
an acquisition module, configured to acquire operation information of a user touching a three-dimensional model of a virtual fitting;
and a processing module, configured to, if it is confirmed that the operation information corresponds to starting the virtual makeup trial, acquire a function instruction corresponding to starting the virtual makeup trial according to the operation information, and execute a corresponding operation according to the function instruction.
10. The device of claim 9, wherein the operation information comprises click position information; the processing module is specifically configured to:
determine, according to the click position information, a makeup trial type for starting the virtual makeup trial corresponding to the operation information;
and acquire, according to the operation information, a function instruction for starting the virtual makeup trial corresponding to the makeup trial type, and execute a corresponding operation according to the function instruction.
11. The device of claim 10, wherein the processing module is specifically configured to:
start a virtual makeup trial function corresponding to the makeup trial type, and display it.
12. The device of claim 10, wherein the processing module is specifically configured to:
if the makeup trial type corresponding to the operation information is determined to be the eyebrow type according to the click position information, acquire an eyebrow pencil function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the eye type according to the click position information, acquire an eye makeup function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
if the makeup trial type corresponding to the operation information is determined to be the cheek type according to the click position information, acquire a blush function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction;
and if the makeup trial type corresponding to the operation information is determined to be the lip type according to the click position information, acquire a lipstick function instruction for starting the virtual makeup trial, and execute a corresponding operation according to the function instruction.
13. The device of claim 9, wherein the acquisition module is further configured to acquire a makeup trial instruction of the user, where the makeup trial instruction comprises a makeup trial type and makeup trial content;
and the processing module is further configured to display a makeup trial effect on a corresponding part of the three-dimensional model according to the makeup trial type and the makeup trial content.
14. The device of claim 9, wherein the processing module is further configured to:
if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, enlarge the three-dimensional model according to the operation information.
15. The device of claim 14, wherein the operation information comprises touch track start point information and touch track end point information; the processing module is specifically configured to:
if it is confirmed that the operation information corresponds to enlarging the three-dimensional model, enlarge and display the part of the three-dimensional model corresponding to the operation information according to the touch track start point information and the touch track end point information.
16. A computer device, comprising: at least one processor and memory;
the memory stores a computer program; the at least one processor executes the memory-stored computer program to implement the method of any of claims 1-8.
17. A computer-readable storage medium, characterized in that a computer program is stored in the computer-readable storage medium, which computer program, when executed, implements the method of any one of claims 1-8.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201811525712.XA CN111324274A (en) | 2018-12-13 | 2018-12-13 | Virtual makeup trial method, device, equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111324274A true CN111324274A (en) | 2020-06-23 |
Family
ID=71166514
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201811525712.XA Pending CN111324274A (en) | 2018-12-13 | 2018-12-13 | Virtual makeup trial method, device, equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111324274A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112163920A (en) * | 2020-08-18 | 2021-01-01 | 广州市美图现电子有限公司 | Using method and device of skin-measuring makeup system, storage medium and computer equipment |
WO2023020196A1 (en) * | 2021-08-17 | 2023-02-23 | 北京字跳网络技术有限公司 | Virtual character image replacement method and apparatus, and computer storage medium |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103597519A (en) * | 2011-02-17 | 2014-02-19 | 麦特尔有限公司 | Computer implemented methods and systems for generating virtual body models for garment fit visualization |
CN104091269A (en) * | 2014-06-30 | 2014-10-08 | 京东方科技集团股份有限公司 | Virtual fitting method and virtual fitting system |
CN105164999A (en) * | 2013-04-17 | 2015-12-16 | 松下知识产权经营株式会社 | Image processing method and image processing device |
CN105229673A (en) * | 2013-04-03 | 2016-01-06 | 诺基亚技术有限公司 | A kind of device and the method be associated |
US20160042557A1 (en) * | 2014-08-08 | 2016-02-11 | Asustek Computer Inc. | Method of applying virtual makeup, virtual makeup electronic system, and electronic device having virtual makeup electronic system |
WO2017212878A1 (en) * | 2016-06-10 | 2017-12-14 | パナソニックIpマネジメント株式会社 | Virtual makeup device, and virtual makeup method |
WO2018005884A1 (en) * | 2016-06-29 | 2018-01-04 | EyesMatch Ltd. | System and method for digital makeup mirror |
CN108227922A (en) * | 2017-12-31 | 2018-06-29 | 广州二元科技有限公司 | Cosmetic method on a kind of real-time digital image of virtual reality |
CN108885794A (en) * | 2016-01-27 | 2018-11-23 | 尼廷·文斯 | The virtually trying clothes on the realistic human body model of user |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | |
SE01 | Entry into force of request for substantive examination | |