CN111402115B - Image processing method and electronic equipment - Google Patents

Image processing method and electronic equipment

Info

Publication number
CN111402115B
CN111402115B CN202010163229.2A
Authority
CN
China
Prior art date
Legal status
Active
Application number
CN202010163229.2A
Other languages
Chinese (zh)
Other versions
CN111402115A (en)
Inventor
赵鑫 (Zhao Xin)
Current Assignee
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010163229.2A priority Critical patent/CN111402115B/en
Publication of CN111402115A publication Critical patent/CN111402115A/en
Application granted granted Critical
Publication of CN111402115B publication Critical patent/CN111402115B/en


Classifications

    • G06T3/04
    • G06T5/77
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172Classification, e.g. identification
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30196Human being; Person
    • G06T2207/30201Face

Abstract

An embodiment of the invention provides an image processing method and an electronic device, where the method comprises the following steps: modeling a target face in a first image to obtain a face model; determining the hairline position and the central axis of the face in the face model; determining a target beauty tip (a "widow's peak"); and adding the target beauty tip to a preset area in the face model, where the preset area is determined according to the hairline position and the central axis of the face. The image processing method provided by the embodiment of the invention can thus add a beauty tip to a face image.

Description

Image processing method and electronic equipment
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to an image processing method and an electronic device.
Background
With the increasing popularity of shooting culture, more and more people shoot images while travelling, at meetings, in daily life, at parties and the like to record their lives. With the continuous progress of photographing software on electronic devices and users' rising demands on photographing effects, the face in a photographed portrait image can be subjected to beautifying processing by face-beautifying software.
A beauty tip (widow's peak) is the part of the hairline (where the hair joins the forehead) that protrudes downwards in a V shape at its middle; it makes the left and right temple sections of the hairline symmetrical and arc-shaped, and can set off a person's appearance. However, existing face-beautifying technology mainly adjusts the facial features and cannot add a beauty tip to a face image.
Disclosure of Invention
The embodiment of the invention provides an image processing method and electronic equipment, which are used for solving the problem that a beauty tip cannot be added to a face image in the prior art.
In order to solve the technical problems, the embodiment of the invention is realized as follows:
In a first aspect, an embodiment of the present invention provides an image processing method applied to an electronic device, where the method includes: modeling a target face in a first image to obtain a face model; determining the hairline position and the central axis of the face in the face model; determining a target beauty tip; and adding the target beauty tip to a preset area in the face model, where the preset area is determined according to the hairline position and the central axis of the face.
In a second aspect, an embodiment of the present invention provides an electronic device, including: a modeling module, configured to model the target face in the first image to obtain a face model; a first determining module, configured to determine the hairline position and the central axis of the face in the face model; a second determining module, configured to determine the target beauty tip; and an adding module, configured to add the target beauty tip to a preset area in the face model, where the preset area is determined according to the hairline position and the central axis of the face.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program implements the steps of the image processing method described above when executed by the processor.
In a fourth aspect, an embodiment of the present invention provides a computer readable storage medium, where a computer program is stored on the computer readable storage medium, and the computer program when executed by a processor implements the steps of the image processing method described above.
In the embodiment of the invention, a face model is obtained by modeling a target face in a first image; the hairline position and the central axis of the face in the face model are determined; a target beauty tip is determined; and the target beauty tip is added to a preset area in the face model. A beauty tip can thus be added to the face image.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings that are needed in the description of the embodiments of the present invention will be briefly described below, it being obvious that the drawings in the following description are only some embodiments of the present invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is a flowchart of steps of an image processing method according to an embodiment of the present invention;
FIG. 2 is a flowchart illustrating steps of another image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic illustration of a face model;
fig. 4 is a block diagram of an electronic device according to an embodiment of the present invention;
fig. 5 is a schematic hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the order of execution, and the order of execution of the processes should be determined by the functions and internal logic, and should not constitute any limitation on the implementation process of the embodiments of the present invention.
Example 1
Referring to fig. 1, a flowchart of steps of an image processing method according to a first embodiment of the present invention is shown.
The image processing method of the embodiment of the invention is applied to the electronic equipment and comprises the following steps:
step 101: modeling a target face in the first image to obtain a face model.
The image processing method provided by the embodiment of the invention is suitable for processing both images and videos. When performing face modeling, a camera may be used to collect a preview image; the face in the preview image is scanned and feature points are extracted, and the face is then modeled based on the resulting data to obtain a face model.
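As a rough illustration of the kind of data such a face model might hold, the following sketch defines a minimal model record and a naive central-axis estimate. All field names, landmark names, and coordinates are illustrative assumptions, not structures taken from the patent:

```python
from dataclasses import dataclass

# Hypothetical minimal face-model record; coordinates are (x, y) pixels.
@dataclass
class FaceModel:
    landmarks: dict          # named feature points, e.g. {"left_eye": (x, y)}
    hairline: list           # polyline of (x, y) points along the hairline
    face_shape: str = "standard"

    def central_axis_x(self):
        # Crude estimate of the longitudinal central axis: mean x of the
        # landmark points (works when landmarks are roughly symmetric).
        xs = [p[0] for p in self.landmarks.values()]
        return sum(xs) / len(xs)

model = FaceModel(
    landmarks={"left_eye": (80, 120), "right_eye": (160, 120), "nose": (120, 170)},
    hairline=[(60, 60), (120, 50), (180, 60)],
)
print(model.central_axis_x())  # (80 + 160 + 120) / 3 = 120.0
```

A production system would populate such a record from a landmark-detection step; the sketch only fixes the data shape the later steps operate on.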
For the specific ways of scanning the face and extracting the feature points, refer to the related art; the embodiment of the invention does not specifically limit them.
Step 102: and determining the position of the hairline in the face model and the central axis of the face.
When determining the hairline position in the face model, the hairline can be identified and located according to indicators such as hair texture and hair color at the junction of the hair and the skin, completing the division between the hair and the face. The hairline position and the central axis of the face are the two key factors that determine where the target beauty tip is added, so both need to be determined in the face model. The central axis of the face is its longitudinal axis.
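The hairline-localization idea described above can be caricatured as a per-column scan for the first non-hair pixel. The grayscale threshold and toy image below are purely illustrative assumptions; a real implementation would use texture and color segmentation as the text notes:

```python
# Toy hairline localization by brightness contrast: scan each column
# top-down and record the first pixel that is not "hair". Grayscale
# values below hair_threshold are treated as hair (an assumption).
def find_hairline(gray, hair_threshold=80):
    """gray: 2-D list of grayscale values; returns [(col, row), ...]."""
    hairline = []
    for col in range(len(gray[0])):
        for row in range(len(gray)):
            if gray[row][col] >= hair_threshold:   # first skin-like pixel
                hairline.append((col, row))
                break
    return hairline

# 4x4 toy image: dark "hair" covers two rows in cols 0-1, one row in cols 2-3
img = [
    [10, 10, 10, 10],
    [10, 10, 200, 200],
    [200, 200, 200, 200],
    [200, 200, 200, 200],
]
print(find_hairline(img))  # [(0, 2), (1, 2), (2, 1), (3, 1)]
```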
Step 103: determining the target beauty tip.
Optionally, the target beauty tip may be automatically generated by the electronic device according to the face model.
The target beauty tip can also be manually selected by a user from a plurality of preset beauty tips, and the electronic equipment determines the target beauty tip according to the user selection; or the electronic equipment automatically matches the target beauty tip from a plurality of preset beauty tips according to the face model.
Step 104: and adding the target beauty tip to a preset area in the face model.
The preset area is determined according to the hairline position and the central axis of the face; when the target beauty tip is added, its upper edge can connect with the hairline, and the tip coincides with the central axis of the face.
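The alignment just described, upper edge on the hairline and tip centered on the central axis, can be sketched as translating a small polygon to an anchor point. All names and coordinates are illustrative assumptions:

```python
# Translate a V-shaped tip polygon so the midpoint of its top edge lands
# on the hairline at the face's central axis.
def place_tip(tip_polygon, axis_x, hairline_y):
    xs = [x for x, _ in tip_polygon]
    top_y = min(y for _, y in tip_polygon)          # highest point of the tip
    top_mid_x = (min(xs) + max(xs)) / 2             # horizontal midpoint
    dx, dy = axis_x - top_mid_x, hairline_y - top_y
    return [(x + dx, y + dy) for x, y in tip_polygon]

# V-shaped tip: two upper corners and the lower vertex
tip = [(-10, 0), (10, 0), (0, 15)]
print(place_tip(tip, axis_x=120, hairline_y=50))
# [(110.0, 50.0), (130.0, 50.0), (120.0, 65.0)]
```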
The target beauty tip is added to the face model to obtain a target face model, and a beauty tip can then be added to each image to be processed according to the target face model.
Optionally, during image shooting, the electronic device adds a beauty tip to each frame of the preview image according to the target face model, and the preview image with the beauty tip added allows the user to preview the effect. If the user is satisfied with the effect, a confirmation instruction can be issued, and the electronic device then adds the beauty tip to each subsequently collected preview frame according to the face model. In addition, during shooting the user can issue a beauty-tip adjustment instruction at any time; the electronic device responds by displaying an adjustment interface, in which the user adjusts parameters such as the position and shape of the beauty tip.
According to the image processing method provided by the embodiment of the invention, a face model is obtained by modeling the target face in the first image; the hairline position and the central axis of the face in the face model are determined; a target beauty tip is determined; and the target beauty tip is added to a preset area in the face model, so that a beauty tip can be added to the face image.
Example two
Referring to fig. 2, a flowchart of steps of an image processing method according to a second embodiment of the present invention is shown.
The image processing method of the embodiment of the invention is applied to the electronic equipment and comprises the following steps:
step 201: modeling a target face in the first image to obtain a face model.
For the specific way of modeling the face, refer to the related art; the embodiment of the invention does not limit it. A schematic diagram of the constructed face model is shown in fig. 3; the face model includes: face shape, facial feature point information, hairline position, hairline shape, and the like.
Step 202: and determining the position of the hairline in the face model and the central axis of the face.
The central axis is the longitudinal axis in the face model. The central axis is used to align the beauty tip, and the hairline position serves as the reference for where the beauty tip is added.
Step 203: and determining a target face shape corresponding to the face model.
Face shape refers to the outline of the face. The upper half of the face is an arc-shaped structure formed by the maxilla, zygomatic bone, temporal bone, frontal bone and parietal bone, while the lower half depends on the shape of the mandible; these are the important factors affecting the face shape.
Face shapes may include, but are not limited to: square face, round face, standard face, '申'-character (diamond-shaped) face, '甲'-character face, and so on.
Step 204: searching a first beauty tip matched with the target face from preset beauty tips.
A plurality of beauty tips, each corresponding to at least one face shape, are preset in the electronic device.
The target face shape corresponding to the face model can be determined first, and the matched first beauty tip can then be found from the plurality of preset beauty tips according to the target face shape.
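The face-shape-to-preset lookup described above might be sketched as a simple table. The shape names and tip parameters below are invented for illustration; the patent does not specify the preset format:

```python
# Hypothetical preset table mapping face shapes to beauty-tip parameters.
PRESET_TIPS = {
    "square":   {"length": 12, "width": 20},
    "round":    {"length": 10, "width": 16},
    "standard": {"length": 11, "width": 18},
}

def match_first_tip(face_shape):
    # Fall back to the "standard" preset for shapes without a dedicated tip.
    return PRESET_TIPS.get(face_shape, PRESET_TIPS["standard"])

print(match_first_tip("round"))    # {'length': 10, 'width': 16}
print(match_first_tip("diamond"))  # no dedicated entry -> standard preset
```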
Step 205: and determining a target beauty tip according to the first beauty tip.
In a specific implementation, the first beauty tip can be directly determined as the target beauty tip; alternatively, extended beauty tips can be generated from the first beauty tip and the target beauty tip determined from among them. An optional way is:
searching the preset beauty tips for a first beauty tip matched with the target face shape; generating extended beauty tips according to the first beauty tip; displaying the first beauty tip and the extended beauty tips; receiving a user's selection operation on the first beauty tip and the extended beauty tips; and determining the selected beauty tip as the target beauty tip.
The extended beauty tips can be obtained by extending the size and color of the first beauty tip. When extending the size, the size can be changed by changing the length value of the beauty tip; when extending the color, colors from a preset color template can be applied directly to the beauty tip for the user to choose from. An optional way of extending the color is to extract, from the preset color template, each target color matching the color system of the user's hair as an extended color.
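A hedged sketch of the color-extension step: keep only template colors whose hue is close to the user's hair color, as a crude stand-in for "matching the color system". The hue metric, tolerance, and template values are assumptions, not details from the patent:

```python
import colorsys

def matching_colors(hair_rgb, template, hue_tol=0.08):
    """Return template colors whose hue is within hue_tol of the hair hue."""
    def hue(rgb):
        r, g, b = (c / 255 for c in rgb)
        return colorsys.rgb_to_hsv(r, g, b)[0]   # hue in [0, 1)
    target = hue(hair_rgb)
    def hue_dist(h):
        d = abs(h - target)
        return min(d, 1 - d)                     # hue wraps around the circle
    return [c for c in template if hue_dist(hue(c)) <= hue_tol]

template = [(90, 60, 30), (60, 40, 20), (30, 60, 200)]  # two browns, one blue
print(matching_colors((80, 50, 25), template))
# [(90, 60, 30), (60, 40, 20)] -- the blue is rejected
```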
Providing extended beauty tips for the user to select from can meet the user's personalized requirements for the beauty tip and adds interest.
Step 206: and adding the target beauty tip into a preset area in the face model.
The preset area is determined by the hairline position and the central axis of the face: the upper edge of the target beauty tip connects with the hairline, and the target beauty tip coincides with the central axis of the face.
An optional way to add the target beauty tip to the preset area in the face model is:
determining the eyebrow position in the face model; determining the distance between the eyebrow position and the hairline position; and adding the target beauty tip at the position one fifth of that distance from the hairline, on the hairline side.
This optional way of determining the adding position of the target beauty tip based on the hairline position and the eyebrow position makes the added beauty tip look better and more natural.
The above is only one optional adding manner. As shown in fig. 3, in a specific implementation the length of the beauty tip may only occupy 0 to 1/4 of the distance from the hairline to the eyebrows, so the specific adding position of the target beauty tip can be adjusted by the user within a specific range. Assuming the length of the beauty tip is n and the distance from the hairline to the eyebrow position is y, the specific range is 0 < n < y/4.
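The two placement rules above, a default position at one fifth of the eyebrow-hairline distance and a tip length n constrained to 0 < n < y/4, can be sketched as follows. The coordinate convention (y grows downward, hairline above eyebrows) is an assumption for illustration:

```python
# Variable names follow the text: n = tip length,
# y = distance from hairline to eyebrow position.
def default_tip_position(hairline_y, eyebrow_y):
    y = eyebrow_y - hairline_y
    return hairline_y + y / 5          # one fifth below the hairline

def tip_length_valid(n, hairline_y, eyebrow_y):
    y = eyebrow_y - hairline_y
    return 0 < n < y / 4               # length must stay under y/4

print(default_tip_position(50, 150))   # 70.0
print(tip_length_valid(20, 50, 150))   # True  (20 < 100/4 = 25)
print(tip_length_valid(30, 50, 150))   # False (30 >= 25)
```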
After the target beauty tip is added at the position one fifth of the distance between the eyebrow position and the hairline position, on the hairline side, the user can flexibly adjust the adding position of the target beauty tip as needed, meeting the user's personalized requirements. The specific adjustment is as follows:
the electronic device receives the user's position adjustment operation on the target beauty tip; adjusts the position of the target beauty tip according to the position adjustment operation; and deletes the portion of the upper edge of the target beauty tip that extends beyond the hairline.
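The deletion step can be caricatured as clamping tip vertices to the hairline row. Real clipping of a polygon against a curved hairline would be more involved; this is only an illustrative simplification with assumed coordinates:

```python
# Clamp each vertex of the tip polygon to lie at or below the hairline row
# (y grows downward), discarding whatever rose above it after a user drag.
def clip_to_hairline(tip_polygon, hairline_y):
    return [(x, max(y, hairline_y)) for x, y in tip_polygon]

moved_tip = [(110, 45), (130, 45), (120, 62)]   # top edge above hairline_y=50
print(clip_to_hairline(moved_tip, 50))  # [(110, 50), (130, 50), (120, 62)]
```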
And adding the target beauty tip into the face model to obtain the target face model.
Step 207: and adding a beauty tip for the target face in each frame of the image to be processed according to the target face model.
Facial feature point information, hairline position, hairline shape and other information are identified in each frame of the image to be processed according to the target face model, and the target beauty tip is accurately attached at the target position, ensuring that the user obtains a target image with the beauty tip added.
Optionally, the image to be processed may be a shot image, or each frame of the preview image during shooting. When a beauty tip is added to each preview frame, the preview image with the beauty tip added allows the user to preview the effect. If the user is satisfied with the effect, a confirmation instruction can be issued, and the electronic device then adds the beauty tip to subsequently collected preview frames according to the target face model. In addition, during shooting the user can issue a beauty-tip adjustment instruction at any time; the electronic device responds by displaying an adjustment interface, in which the user adjusts parameters such as the position and shape of the beauty tip.
According to the image processing method provided by the embodiment of the invention, a face model is obtained by modeling the target face in the first image; the hairline position and the central axis of the face in the face model are determined; a target beauty tip is determined; and the target beauty tip is added to a preset area in the face model to obtain a target face model. A beauty tip is then added to the target face in each preview frame in real time according to the target face model, so a beauty tip can be added to the face image, achieving adjustment of details beyond the facial features. In addition, because the target beauty tip is determined based on the face model, the added beauty tip better matches the face in the face image and the adding effect is more natural.
Example III
Referring to fig. 4, a block diagram of an electronic device according to a third embodiment of the present invention is shown.
The electronic device 400 of the embodiment of the invention includes:
the modeling module 401 is configured to model a target face in the first image to obtain a face model;
a first determining module 402, configured to determine a hairline position and a central axis of a face in the face model;
a second determining module 403, configured to determine a target beauty tip;
and an adding module 404, configured to add the target beauty tip to a preset area in the face model, where the preset area is determined according to the hairline position and the central axis of the face.
Optionally, the second determining module includes:
the first sub-module is used for determining a target face shape corresponding to the face model;
the second sub-module is used for searching a first beauty tip matched with the target face shape from preset beauty tips; and determining a target beauty tip according to the first beauty tip.
Optionally, the second submodule includes:
the first unit is used for searching a first beauty tip matched with the target face shape from preset beauty tips;
a second unit for generating an extended beauty tip according to the first beauty tip;
a third unit for displaying the first beauty tip and the extended beauty tip;
a fourth unit for receiving a selection operation of the first beauty tip and the extended beauty tip by a user;
and a fifth unit for determining the selected beauty tip as a target beauty tip.
Optionally, the adding module includes:
the position determining submodule is used for determining the position of the eyebrow in the face model;
a distance determination sub-module for determining a distance between the eyebrow position and the hairline position;
an adding sub-module, configured to add the target beauty tip at the position one fifth of the distance from the hairline, on the hairline side; where the target beauty tip coincides with the central axis of the face.
Optionally, the electronic device further includes:
a receiving module, configured to receive the user's position adjustment operation on the target beauty tip after the adding module adds it at the position one fifth of the distance from the hairline;
the position adjusting module is used for adjusting the position of the target beauty tip according to the position adjusting operation;
and a deleting module, configured to delete the portion of the upper edge of the target beauty tip that extends beyond the hairline.
Optionally, the electronic device may further include a real-time adjustment module, configured to add a beauty tip to the target face in each frame of preview image in real time according to the target face model.
The electronic device provided by the embodiment of the present invention can implement each process of the image processing method in each method embodiment, and in order to avoid repetition, a description is omitted here.
According to the electronic device provided by the embodiment of the invention, a face model is obtained by modeling the target face in the first image; the hairline position and the central axis of the face in the face model are determined; a target beauty tip is determined; and the target beauty tip is added to a preset area in the face model, so that a beauty tip can be added to the face image.
Example IV
Referring to fig. 5, a block diagram of an electronic device according to a fourth embodiment of the present invention is shown.
Fig. 5 is a schematic hardware structure of an electronic device implementing various embodiments of the present invention, where the electronic device 500 includes, but is not limited to: radio frequency unit 501, network module 502, audio output unit 503, input unit 504, sensor 505, display unit 506, user input unit 507, interface unit 508, memory 509, processor 510, and power source 511. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 5 is not limiting of the electronic device and that the electronic device may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. In the embodiment of the invention, the electronic equipment comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
The processor 510 may be configured to model a target face in the first image to obtain a face model; determining the position of a hairline in the face model and the central axis of the face; determining a target beauty tip; and adding the target beauty tip to a preset area in the face model, wherein the preset area is determined according to the hairline position and the central axis of the face.
According to the electronic device provided by the embodiment of the invention, a face model is obtained by modeling the target face in the first image; the hairline position and the central axis of the face in the face model are determined; a target beauty tip is determined; and the target beauty tip is added to a preset area in the face model, so that a beauty tip can be added to the face image.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used to receive and send information or signals during a call, specifically, receive downlink data from a base station, and then process the downlink data with the processor 510; and, the uplink data is transmitted to the base station. Typically, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 may also communicate with networks and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 502, such as helping the user to send and receive e-mail, browse web pages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the electronic device 500. The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used for receiving an audio or video signal. The input unit 504 may include a graphics processor (Graphics Processing Unit, GPU) 5041 and a microphone 5042, the graphics processor 5041 processing image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. Microphone 5042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501 and output.
The electronic device 500 also includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or the backlight when the electronic device 500 is moved to the ear. The display panel 5061 may be a flexible display screen, which includes a screen base, a liftable module array and a flexible screen that are sequentially stacked. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for recognizing the gesture of the electronic device (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; the sensor 505 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 506 is used to display information input by a user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. Touch panel 5071, also referred to as a touch screen, may collect touch operations thereon or thereabout by a user (e.g., operations of the user on touch panel 5071 or thereabout using any suitable object or accessory such as a finger, stylus, etc.). Touch panel 5071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch azimuth of a user, detects a signal brought by touch operation and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, physical keyboards, function keys (e.g., volume control keys, switch keys, etc.), trackballs, mice, joysticks, and so forth, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or thereabout, the touch operation is transmitted to the processor 510 to determine a type of touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of touch event. Although in fig. 5, the touch panel 5071 and the display panel 5061 are two independent components for implementing the input and output functions of the electronic device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and an external device.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a program storage area and a data storage area, wherein the program storage area may store an operating system, an application program required for at least one function (such as a sound playing function or an image playing function), and the like; the data storage area may store data (such as audio data or a phonebook) created according to the use of the handset. In addition, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509, and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 510.
The electronic device 500 may also include a power supply 511 (e.g., a battery) for powering the various components, and preferably the power supply 511 may be logically connected to the processor 510 via a power management system that performs functions such as managing charging, discharging, and power consumption.
In addition, the electronic device 500 includes some functional modules, which are not shown, and will not be described herein.
Preferably, an embodiment of the present invention further provides an electronic device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510. When executed by the processor 510, the computer program implements each process of the above embodiment of the image processing method and can achieve the same technical effects; to avoid repetition, the description is omitted here.
An embodiment of the present invention also provides a computer-readable storage medium on which a computer program is stored. When executed by a processor, the computer program implements the processes of the above embodiment of the image processing method and can achieve the same technical effects; to avoid repetition, the description is omitted here. The computer-readable storage medium may be, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed, or elements inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the method of the above embodiments may be implemented by means of software plus a necessary general-purpose hardware platform, or by means of hardware alone, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g., ROM/RAM, a magnetic disk, or an optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive. Enlightened by the present invention, those having ordinary skill in the art may make many other forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.

Claims (6)

1. An image processing method applied to an electronic device, the method comprising:
modeling a target face in a first image to obtain a face model;
determining the position of a hairline in the face model and the central axis of the face;
determining a target beauty tip;
adding the target beauty tip to a preset area in the face model, wherein the preset area is determined according to the hairline position and the central axis of the face;
the determining the target beauty tip includes:
determining a target face shape corresponding to the face model;
searching a first beauty tip matched with the target face shape from preset beauty tips;
determining a target beauty tip according to the first beauty tip;
the searching a first beauty tip matched with the target face shape from preset beauty tips, and determining a target beauty tip according to the first beauty tip, includes:
searching a first beauty tip matched with the target face shape from preset beauty tips;
generating an extended beauty tip according to the first beauty tip;
displaying the first beauty tip and the extended beauty tip;
receiving selection operations of a user on the first beauty tip and the extended beauty tip;
the selected beauty tip is determined as the target beauty tip.
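The selection flow recited in claim 1 can be sketched as follows. This is an illustrative sketch only, not the patented implementation: the template table, the shape labels, and the `extend_template` variants are all hypothetical names invented for the example.

```python
# Hypothetical mapping from a detected face shape to a preset
# "beauty tip" (widow's-peak) template identifier.
FACE_SHAPE_TEMPLATES = {
    "oval": "tip_oval_v1",
    "round": "tip_round_v1",
    "square": "tip_square_v1",
}

def extend_template(first_tip):
    # Generate extended beauty tips from the matched first beauty tip,
    # e.g. reshaped variants (placeholder: suffix the identifier).
    return [first_tip + "_narrow", first_tip + "_wide"]

def determine_target_beauty_tip(target_face_shape, choose):
    # Step 1: search the first beauty tip matched with the target face shape.
    first_tip = FACE_SHAPE_TEMPLATES[target_face_shape]
    # Step 2: generate extended beauty tips from the first one.
    candidates = [first_tip] + extend_template(first_tip)
    # Steps 3-5: display the candidates, receive the user's selection,
    # and determine the selected one as the target beauty tip.
    return choose(candidates)

# A simulated "user" who picks the second displayed candidate.
picked = determine_target_beauty_tip("oval", lambda options: options[1])
print(picked)
```

The `choose` callback stands in for the display-and-select steps, which in the claimed method involve the device's user interface rather than a function call.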
2. The image processing method according to claim 1, wherein adding the target beauty tip to a preset area in the face model includes:
determining the position of the eyebrow in the face model;
determining a distance between the eyebrow position and the hairline position;
adding the target beauty tip at a position located at one-fifth of the distance, on the side close to the hairline position; wherein the central axis of the target beauty tip coincides with the central axis of the face.
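The placement rule of claim 2 can be expressed as a small geometric computation. The sketch below is a reading of the claim under stated assumptions, not the patented implementation: it assumes image coordinates with y increasing downward (so the hairline lies above the eyebrows), and the function name and parameters are hypothetical.

```python
def beauty_tip_anchor(eyebrow_y, hairline_y, axis_x):
    """Return the (x, y) anchor for the target beauty tip.

    The anchor lies on the face's central axis (x = axis_x), at one-fifth
    of the eyebrow-to-hairline distance, measured from the hairline side.
    Assumes hairline_y < eyebrow_y (y grows downward in image space).
    """
    distance = eyebrow_y - hairline_y
    # One-fifth of the distance, on the side close to the hairline position.
    anchor_y = hairline_y + distance / 5.0
    return (axis_x, anchor_y)

# Example: eyebrows at y=300, hairline at y=200, central axis at x=250.
print(beauty_tip_anchor(eyebrow_y=300.0, hairline_y=200.0, axis_x=250.0))
```

Aligning the template's own central axis with `axis_x` realizes the claim's requirement that the two central axes coincide.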
3. An electronic device, the electronic device comprising:
the modeling module is used for modeling the target face in the first image to obtain a face model;
the first determining module is used for determining the position of the hairline in the face model and the central axis of the face;
the second determining module is used for determining the target beauty tip;
the adding module is used for adding the target beauty tip to a preset area in the face model, wherein the preset area is determined according to the hairline position and the central axis of the face;
the second determining module includes:
the first sub-module is used for determining a target face shape corresponding to the face model;
the second sub-module is used for searching a first beauty tip matched with the target face shape from preset beauty tips; determining a target beauty tip according to the first beauty tip;
the second sub-module includes:
the first unit is used for searching a first beauty tip matched with the target face shape from preset beauty tips;
a second unit for generating an extended beauty tip according to the first beauty tip;
a third unit for displaying the first beauty tip and the extended beauty tip;
a fourth unit for receiving a selection operation of the first beauty tip and the extended beauty tip by a user;
and a fifth unit for determining the selected beauty tip as a target beauty tip.
4. The electronic device of claim 3, wherein the add-on module comprises:
the position determining submodule is used for determining the position of the eyebrow in the face model;
a distance determination sub-module for determining a distance between the eyebrow position and the hairline position;
an adding sub-module for adding the target beauty tip at a position located at one-fifth of the distance, on the side close to the hairline position; wherein the central axis of the target beauty tip coincides with the central axis of the face.
5. An electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, which when executed by the processor, implements the steps of the image processing method according to any of claims 1-2.
6. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, implements the steps of the image processing method according to any of claims 1-2.
CN202010163229.2A 2020-03-10 2020-03-10 Image processing method and electronic equipment Active CN111402115B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010163229.2A CN111402115B (en) 2020-03-10 2020-03-10 Image processing method and electronic equipment

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010163229.2A CN111402115B (en) 2020-03-10 2020-03-10 Image processing method and electronic equipment

Publications (2)

Publication Number Publication Date
CN111402115A CN111402115A (en) 2020-07-10
CN111402115B true CN111402115B (en) 2024-02-20

Family

ID=71430838

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010163229.2A Active CN111402115B (en) 2020-03-10 2020-03-10 Image processing method and electronic equipment

Country Status (1)

Country Link
CN (1) CN111402115B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780658A (en) * 2016-11-16 2017-05-31 北京旷视科技有限公司 face characteristic adding method, device and equipment
CN107730444A (en) * 2017-10-31 2018-02-23 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and computer equipment
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN107833177A (en) * 2017-10-31 2018-03-23 维沃移动通信有限公司 A kind of image processing method and mobile terminal
WO2019024751A1 (en) * 2017-07-31 2019-02-07 腾讯科技(深圳)有限公司 Facial expression synthesis method and apparatus, electronic device, and storage medium
CN109544445A (en) * 2018-12-11 2019-03-29 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106780658A (en) * 2016-11-16 2017-05-31 北京旷视科技有限公司 face characteristic adding method, device and equipment
WO2019024751A1 (en) * 2017-07-31 2019-02-07 腾讯科技(深圳)有限公司 Facial expression synthesis method and apparatus, electronic device, and storage medium
CN107730444A (en) * 2017-10-31 2018-02-23 广东欧珀移动通信有限公司 Image processing method, device, readable storage medium storing program for executing and computer equipment
CN107833177A (en) * 2017-10-31 2018-03-23 维沃移动通信有限公司 A kind of image processing method and mobile terminal
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN109544445A (en) * 2018-12-11 2019-03-29 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Le Yan. "Infatuated with the 'beauty tip' (widow's peak)." Medical Aesthetics and Cosmetology, 2007, (01), full text. *
Fan Yan et al. "Trimming the beauty tip: a discussion of hairline surgery." China Scientific Beauty, 2007, (3), pp. 60-61. *

Also Published As

Publication number Publication date
CN111402115A (en) 2020-07-10

Similar Documents

Publication Publication Date Title
CN108184050B (en) Photographing method and mobile terminal
CN109461117B (en) Image processing method and mobile terminal
CN108712603B (en) Image processing method and mobile terminal
CN109819167B (en) Image processing method and device and mobile terminal
CN108062400A (en) Examination cosmetic method, smart mirror and storage medium based on smart mirror
CN109685915B (en) Image processing method and device and mobile terminal
CN110706179A (en) Image processing method and electronic equipment
CN111047511A (en) Image processing method and electronic equipment
CN111031253B (en) Shooting method and electronic equipment
CN111031234B (en) Image processing method and electronic equipment
CN109671034B (en) Image processing method and terminal equipment
CN108881782B (en) Video call method and terminal equipment
CN109448069B (en) Template generation method and mobile terminal
CN109272473B (en) Image processing method and mobile terminal
CN109461124A (en) A kind of image processing method and terminal device
CN108984143B (en) Display control method and terminal equipment
US20230014409A1 (en) Detection result output method, electronic device and medium
CN111091519B (en) Image processing method and device
CN111080747B (en) Face image processing method and electronic equipment
CN107563353B (en) Image processing method and device and mobile terminal
CN113255396A (en) Training method and device of image processing model, and image processing method and device
CN110908517A (en) Image editing method, image editing device, electronic equipment and medium
CN108156386B (en) Panoramic photographing method and mobile terminal
CN109005337A (en) A kind of photographic method and terminal
CN109903218B (en) Image processing method and terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant