CN111210491B - Image processing method, electronic device, and storage medium - Google Patents

Image processing method, electronic device, and storage medium

Info

Publication number
CN111210491B
Authority
CN
China
Prior art keywords
hair
target area
preset
filling
target
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201911416461.6A
Other languages
Chinese (zh)
Other versions
CN111210491A (en)
Inventor
周楚瑶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911416461.6A priority Critical patent/CN111210491B/en
Publication of CN111210491A publication Critical patent/CN111210491A/en
Application granted granted Critical
Publication of CN111210491B publication Critical patent/CN111210491B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/40Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The embodiment of the invention discloses an image processing method, an electronic device, and a storage medium. The image processing method includes: acquiring a target image; determining a region of a target object from the target image; and, for a target area in the region of the target object, filling hair textures into the target area if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold. With the embodiment of the invention, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.

Description

Image processing method, electronic device, and storage medium
Technical Field
The embodiment of the invention relates to the technical field of image processing, in particular to an image processing method, electronic equipment and a storage medium.
Background
Nowadays, electronic devices are becoming more and more popular, and more users use electronic devices for photographing, especially portrait photographing.
Currently, when a portrait is photographed, the electronic device provides various makeup functions, such as an "eyebrow beautifying" function. The eyebrow beautifying function first identifies the user's eyebrow region and then directly overlays a standard eyebrow filling template on it, which achieves the beautifying effect of deepening the user's eyebrow color. However, the standard eyebrow filling template is generally fixed and cannot meet every user's requirements, resulting in a poor user experience. In addition, the user's eyebrow region contains both dense hair areas and sparse hair areas; if the standard eyebrow filling template is directly overlaid on the eyebrow region, the eyebrow color in the dense areas becomes too dark while the color in the sparse areas remains too light, so the beautified eyebrows look abrupt and the user experience is poor.
Disclosure of Invention
The embodiment of the invention provides an image processing method, an electronic device, and a storage medium, to solve the problem that directly overlaying a standard filling template during image beautification distorts the beautifying effect and degrades the user experience.
In order to solve the technical problems, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method, applied to an electronic device, where the image processing method includes:
acquiring a target image;
determining a region of a target object from the target image;
and, for a target area in the region of the target object, filling hair textures into the target area if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
the acquisition module is used for acquiring a target image;
the determining module is used for determining the area of the target object from the target image;
the filling module is used for filling hair textures into a target area in the region of the target object if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program is executed by the processor to implement the image processing method according to the first aspect.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium, on which a computer program is stored, which when executed by a processor implements the image processing method according to the first aspect.
In the embodiment of the invention, the region of the target object is determined from the target image; for a target area in the region of the target object, if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, hair textures are filled into the target area so that the density of the hair textures in the filled target area meets the first preset density threshold. In this way, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.
Drawings
The invention will be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings in which like or similar reference characters designate like or similar features.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic illustration of the contour of an eyebrow provided by an embodiment of the present invention;
FIG. 3 is a schematic view of eyelashes according to an embodiment of the present invention;
FIG. 4 is a schematic view of an eyebrow with a missing region according to an embodiment of the present invention;
FIG. 5 is a schematic view of a right angle distance provided by an embodiment of the present invention;
FIG. 6 is a schematic illustration of a filled hair texture provided by an embodiment of the present invention;
fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present invention;
fig. 8 is a schematic diagram of another electronic device according to an embodiment of the present invention.
Detailed Description
The following describes the embodiments of the present invention clearly and completely with reference to the accompanying drawings. It is evident that the described embodiments are some, but not all, of the embodiments of the invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention. As shown in fig. 1, the image processing method includes:
step 101: acquiring a target image;
step 102: determining a region of a target object from the target image;
step 103: for a target area in the region of the target object, filling hair textures into the target area if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold.
In the embodiment of the invention, the region of the target object is determined from the target image; for a target area in the region of the target object, if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, hair textures are filled into the target area so that the density of the hair textures in the filled target area meets the first preset density threshold. In this way, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.
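For illustration, a minimal Python sketch of the flow of steps 101 to 103 is given below; the helper names (detect_target_areas, hair_density, first_density_threshold_fn, fill_hair) and the use of OpenCV are assumptions made for the sketch, not details taken from the embodiment.

```python
import cv2

def process_image(image_path, detect_target_areas, hair_density,
                  first_density_threshold_fn, fill_hair):
    # Step 101: acquire the target image (here, read from a file).
    image = cv2.imread(image_path)

    # Step 102: determine the region of the target object (e.g. an eyebrow),
    # returned here as a list of per-target-area masks.
    target_area_masks = detect_target_areas(image)

    # Step 103: for each target area, fill hair textures when the density of
    # the hair textures does not meet that area's first preset density threshold.
    for area_mask in target_area_masks:
        threshold = first_density_threshold_fn(image, area_mask)
        if hair_density(image, area_mask) < threshold:
            image = fill_hair(image, area_mask, threshold)
    return image
```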
In an embodiment of the invention, the target image is a preview image containing the target object or a captured photograph of the target object.
In one example, when the target image is an image previewed by the electronic device, the acquiring the target image in step 101 includes:
and acquiring a preview image of the target object through a camera of the electronic equipment.
In the embodiment of the invention, when the user shoots with the electronic device, the effect of the filled target object can be previewed on the preview interface, which makes shooting more engaging; when the user is satisfied with the filling effect, the user can tap to shoot and capture an image of the filled target object, which enhances the interaction between the user and the electronic device.
In one example, when the target image is a photograph of the target object, the acquiring the target image in step 101 includes:
receiving a selection operation of a user on a plurality of images stored in the electronic equipment;
in response to the selection operation, a target image is acquired.
In the related art, the user has to manually adjust the filling effect for a target object in an image; since the manual filling effect depends on the user's skill, the filling effect of the target object does not necessarily match the user's preference, which reduces the user experience.
In some embodiments of the present invention, the target object described in step 102 comprises an eyebrow or an eyelash.
In one example, determining the region of the target object from the target image as described in step 102 includes:
acquiring a plurality of contour feature points of the target object from the target image;
and determining the contour of the target object according to the contour feature points, where the region enclosed by the contour is the region of the target object.
In fig. 2, the plurality of contour feature points are "o", and the range surrounded by the plurality of "o" is an area of the eyebrow.
In one example, the plurality of contour feature points may include: the brow head, brow belly, brow waist, brow peak, brow slope, brow tail, and brow apex.
In fig. 3, the plurality of contour feature points are "o", and the range surrounded by the plurality of "o" is the region of the eyelashes.
In the embodiment of the invention, the area of the eyebrow or the eyelash in the target image can be determined from the plurality of contour feature points of the eyebrow or the eyelash, and the area to be filled can be determined, so that the operation is convenient and quick.
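As an illustrative sketch of how the enclosed region can be obtained from the contour feature points, the snippet below builds a binary mask with OpenCV; the assumption that the feature points arrive as ordered (x, y) pixel coordinates (for example from a facial-landmark detector) is made for the example only.

```python
import numpy as np
import cv2

def region_from_feature_points(image_shape, feature_points):
    # Build a binary mask of the target object's region from its contour
    # feature points (the "o" points in fig. 2 / fig. 3). The points are
    # assumed to be (x, y) pixel coordinates ordered along the contour.
    mask = np.zeros(image_shape[:2], dtype=np.uint8)
    pts = np.asarray(feature_points, dtype=np.int32).reshape(-1, 1, 2)
    cv2.fillPoly(mask, [pts], 255)  # region enclosed by the contour
    return mask

# Example with an invented set of eyebrow contour points on a 640x480 image.
mask = region_from_feature_points(
    (480, 640, 3),
    [(100, 200), (150, 185), (200, 180), (250, 190), (240, 210), (160, 215)])
```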
For convenience of description, an eyebrow is described as an example hereinafter.
In one example, the region of the target object includes a plurality of target areas, which are divided according to the growth trend of the hair texture.
For example, in fig. 2, the target object has 3 regions.
It should be noted that a corresponding number of target areas may be selected according to the required beautification precision, which is not described in detail herein.
In one example, the density of the hair texture is the degree of sparseness of the eyebrow distribution. In this case, for a target area in the region of the target object in step 103, filling hair textures into the target area if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold, includes:
when the degree of sparseness of the hair distribution in the target area does not meet the corresponding first preset density threshold, filling hair into the target area so that the eyebrow distribution in the target area changes from sparse to dense, and the distribution sparseness of the eyebrows in the filled target area meets the corresponding first preset density threshold.
It should be noted that the first preset density threshold may be determined according to the distribution sparseness of the user's whole eyebrow, so that the filled eyebrow is more uniform and does not become abrupt because a certain area contains too much hair texture; this increases the overall aesthetic appeal of the target object and improves the user experience.
In one example, besides the distribution sparseness, the eyebrow further involves brightness information, that is, the overall color depth of the eyebrow. In this case, for a target area in the region of the target object, filling hair textures into the target area if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold, includes:
filling hair textures into the target area if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not meet a preset brightness threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold and the brightness information of the hair texture in the filled target area meets the preset brightness threshold.
Taking thick black eyebrows as an example: if, in a target area of the eyebrow (which can be any area in fig. 2), the hair distribution is sparse and the color is light, hair textures need to be filled into the area so that the eyebrow distribution in the target area changes from sparse to dense and the color changes from light to dark; in this way, the distribution sparseness of the eyebrows in the filled target area meets the corresponding first preset density threshold, and the darkness of the eyebrows in the filled target area meets the preset brightness threshold.
It should be noted that the preset brightness threshold may be determined according to the color depth of the user's whole eyebrow, so that the color depth of the filled eyebrow is more uniform and the filled eyebrow does not become abrupt because some area has not been filled with dark hair; this increases the overall aesthetic appeal of the target object and improves the user experience.
In addition, when the color depth of the eyebrow in the target area is inconsistent with that of the other areas, for example when the eyebrow color in the target area is lighter than in the other areas, the color depth of the eyebrow in the target area can first be adjusted so that it matches the other areas as closely as possible and the whole eyebrow is uniform, which suits the user's preference. After the color depth of the target area has been adjusted, the area is filled with hair, so that the filled eyebrow shows neither uneven distribution nor inconsistent color depth, the overall eyebrow effect is attractive, the user's preference is met, and the user experience is improved.
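A compact sketch of the combined density and brightness check is shown below; measuring density as the fraction of dark pixels inside the area and brightness as their mean gray level, as well as the constant hair_gray_max, are assumptions for illustration rather than requirements of the embodiment.

```python
import numpy as np
import cv2

def needs_filling(image_bgr, area_mask, density_thresh, brightness_thresh,
                  hair_gray_max=90):
    # Density of hair texture: fraction of dark ("hair") pixels in the area.
    # Brightness of hair texture: mean gray level of those pixels.
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    inside = area_mask > 0
    hair_pixels = inside & (gray < hair_gray_max)

    density = hair_pixels.sum() / max(int(inside.sum()), 1)
    mean_gray = float(gray[hair_pixels].mean()) if hair_pixels.any() else 255.0

    density_fails = density < density_thresh          # too sparse
    brightness_fails = mean_gray > brightness_thresh  # too light
    return density_fails and brightness_fails
```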
In one example, if there is a missing area within the target area, the missing portion needs to be filled based on a standard target object, which specifically includes:
when the density of the hair textures in the target area does not meet a second preset density threshold value, determining that a missing area exists in the target area; wherein the second preset density threshold is less than the first preset density threshold;
filling a hair texture in a target area, comprising:
filling the missing region according to the hair texture in the region corresponding to the missing region in the standard target object; the standard target object refers to a target object in which the density of the hair texture in each target area meets a first preset density threshold corresponding to each target area, and the brightness information of the hair texture in each target area meets a preset brightness threshold corresponding to each target area.
It should be noted that the standard target object may be an eyebrow that has been filled according to the overall sparseness and color depth of the user's eyebrow; that is, before the user's eyebrow is filled, a standard eyebrow may first be determined based on the overall sparseness and color depth of the user's eyebrow, and the eyebrow with the missing area is then filled accordingly. After the missing area has been filled from the standard eyebrow, the overall sparseness is uniform and the color depth is appropriate, so the filling effect of the missing area is consistent with the other areas of the eyebrow instead of looking abrupt, and a poor result is avoided.
In fig. 4, there is a missing area between contour feature point 1 and contour feature point 2. The missing area may be filled from the region of the standard target object corresponding to the missing area between contour feature point 1 and contour feature point 2, depending on how the minimum length of the missing area compares with the length between contour feature point 1 and contour feature point 2; for example, the minimum length of the missing area is 1/3 of the length between contour feature point 1 and contour feature point 2.
For filling of the missing area, a part of the hair can also be clipped from the target area itself for filling, so that the filled eyebrow better matches the characteristics of the user's own eyebrow; that is, the result after filling differs from user to user and retains each user's own characteristics, giving a better user experience.
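The sketch below illustrates the missing-area case: a target area whose hair-texture density falls below the second preset density threshold is treated as missing and filled from the corresponding area of the standard target object. Representing the areas as image patches and copying the standard patch wholesale are simplifying assumptions for the example.

```python
def fill_missing_area(user_patch, standard_patch, density,
                      first_threshold, second_threshold):
    # The second preset density threshold is smaller than the first one;
    # falling below it means the area is missing, not merely sparse.
    assert second_threshold < first_threshold
    if density < second_threshold:
        # Missing area: fill from the corresponding area of the standard
        # target object (assumed to have the same patch shape).
        return standard_patch.copy()
    return user_patch

# Usage (hypothetical values):
# patch = fill_missing_area(user_patch, std_patch, 0.05, 0.40, 0.10)
```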
In an embodiment of the present invention, optionally, filling the target area with the hair texture in step 103 includes:
acquiring size information of a first hair and a second hair in a target area;
determining the shortest distance between the first hair and the second hair according to the size information of the first hair and the second hair, and determining the same-side longest distance between the first endpoint of the first hair and the second hair;
and filling the hair textures in the target area according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold value and the same-side longest distance is larger than a second preset distance threshold value.
In this way, whether hair should be filled can be judged according to the distance between hairs, so that the filled hair is more natural and attractive and meets the user's requirements.
In fig. 5, size information of a first hair A1 and a second hair A2 within the second target area (counting from left to right) is acquired; next, the shortest distance (i.e., the minimum right-angle distance) between the first end point D of the first hair A1 and the second hair A2 is determined, and the same-side distance between the first end point D of the first hair A1 and the first end point of the second hair A2 is determined (the first same-side distance and the second same-side distance may be the same or different; the longest same-side distance is taken as the example); hair textures are then filled into the target area according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold and the longest same-side distance between the end points is larger than a second preset distance threshold.
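A small sketch of this distance test is given below. Each hair is represented as a polyline of (x, y) points; approximating the shortest distance as the minimum point-to-point distance between the two polylines, and the same-side longest distance as the larger of the two corresponding end-point distances, are assumptions made for illustration.

```python
import numpy as np

def should_fill_between(hair1_pts, hair2_pts,
                        first_dist_thresh, second_dist_thresh):
    # hair1_pts / hair2_pts: (x, y) points sampled along each hair.
    a = np.asarray(hair1_pts, dtype=float)
    b = np.asarray(hair2_pts, dtype=float)

    # Shortest distance: minimum distance between any point of hair 1
    # and any point of hair 2.
    diffs = a[:, None, :] - b[None, :, :]
    shortest = np.sqrt((diffs ** 2).sum(axis=-1)).min()

    # Same-side end-point distances (head-to-head and tail-to-tail);
    # the longest of the two is compared with the second threshold.
    same_side_longest = max(np.linalg.norm(a[0] - b[0]),
                            np.linalg.norm(a[-1] - b[-1]))

    return (shortest < first_dist_thresh
            and same_side_longest > second_dist_thresh)
```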
In one example, filling hair textures within a target region from a first hair and a second hair comprises:
the first hair and/or the second hair is added within the target area.
In the embodiment of the invention, the nearest hair in the area to be filled is preferably used for filling, so that the filling effect is best, the integral aesthetic feeling of eyebrows is improved, and the user experience is better.
In one example, adding the first hair and/or the second hair within the target area includes:
taking the first hair and the second hair as two sides of a quadrilateral, adding a third hair on the line connecting the midpoints of the other two sides of the quadrilateral; wherein the third hair is a hair in the target area.
In fig. 6, the first hair A1 and the second hair A2 are two sides of a quadrilateral whose other two sides are e and f. After the line connecting the midpoints of sides e and f is determined, a third hair following the growth trend of that connecting line is determined in the target area, and filling is then carried out according to the third hair, so that the filled result better matches the hair growth in the target area and the overall aesthetic appeal of the eyebrows is increased.
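As a sketch of this construction, the snippet below places the third hair on the mid-line of the quadrilateral by averaging corresponding points of the two hairs; this point-wise averaging, and the assumption that both hairs are sampled with the same number of points, are illustration choices rather than details from the embodiment.

```python
import numpy as np

def third_hair_on_midline(hair1_pts, hair2_pts):
    # The first and second hair form two opposite sides of a quadrilateral;
    # the other two sides (e and f in fig. 6) join their corresponding end
    # points. Averaging corresponding points yields the line through the
    # midpoints of e and f, which is where the third hair is placed.
    a = np.asarray(hair1_pts, dtype=float)
    b = np.asarray(hair2_pts, dtype=float)
    assert a.shape == b.shape, "both hairs must have equal point counts"
    return (a + b) / 2.0

# Usage: third = third_hair_on_midline([(0, 0), (10, 2)], [(0, 6), (10, 8)])
```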
In one example, the size information of the first hair includes a diameter of the first hair and the size information of the second hair includes a diameter of the second hair;
before filling the hair texture in the target area according to the first hair and the second hair, the image processing method further comprises:
determining the first preset distance threshold and the second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
It should be noted that, the step of determining the first preset distance threshold and the second preset distance threshold may be performed simultaneously with the step of determining the shortest distance and the longest distance on the same side, or performed before, or performed after, which will not be described herein.
In one example, the first preset distance threshold is: k1(b1+b2)/k2; the second preset distance threshold is: (b1+b2)/k2 × k1;
where b1 is the diameter of the first hair, b2 is the diameter of the second hair, and k1 and k2 are positive numbers with k1 > k2.
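The sketch below evaluates these threshold expressions as printed (note that, as written, both expressions reduce to the same value k1·(b1+b2)/k2); the example values of k1 and k2 are placeholders, since the text only requires that both are positive and that k1 > k2.

```python
def distance_thresholds(b1, b2, k1=3.0, k2=2.0):
    # b1, b2: diameters of the first and second hair; k1 > k2 > 0.
    assert k1 > k2 > 0
    first_threshold = k1 * (b1 + b2) / k2   # first preset distance threshold
    second_threshold = (b1 + b2) / k2 * k1  # second preset distance threshold
    return first_threshold, second_threshold

# Usage (hypothetical diameters in pixels): t1, t2 = distance_thresholds(2.0, 1.5)
```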
Fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 7, the electronic device 700 includes:
an acquisition module 701, configured to acquire a target image;
a determining module 702, configured to determine a region of the target object from the target image;
and a filling module 703, configured to fill hair textures into a target area in the region of the target object if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold.
In the embodiment of the invention, the region of the target object is determined from the target image; for a target area in the region of the target object, if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, hair textures are filled into the target area so that the density of the hair textures in the filled target area meets the first preset density threshold. In this way, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.
The filling module 703 is further configured to:
filling hair textures into the target area if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not meet the preset brightness threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold and the brightness information of the hair texture in the filled target area meets the preset brightness threshold.
Optionally, the acquiring module 701 is further configured to acquire size information of the first hair and the second hair in the target area;
a determining module 702, configured to determine a shortest distance between the first hair and the second hair, and determine a ipsilateral longest distance between the first hair and the second hair, based on the size information of the first hair and the second hair;
the filling module 703 is further configured to fill the hair texture in the target area according to the first hair and the second hair when the shortest distance is less than the first preset distance threshold and the same-side longest distance is greater than the second preset distance threshold.
Optionally, the filling module 703 is further configured to:
and adding a third hair on the line connecting the midpoints of the other two sides of the quadrilateral, wherein the third hair is a hair in the target area.
Optionally, the size information of the first hair comprises a diameter of the first hair and the size information of the second hair comprises a diameter of the second hair;
the determining module 702 is further configured to determine a first preset distance threshold and a second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the method embodiment of fig. 1, and in order to avoid repetition, details are not repeated here.
In the embodiment of the invention, the region of the target object is determined from the target image; for a target area in the region of the target object, if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, hair textures are filled into the target area so that the density of the hair textures in the filled target area meets the first preset density threshold. In this way, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 800 includes, but is not limited to: radio frequency unit 801, network module 802, audio output unit 803, input unit 804, sensor 805, display unit 806, user input unit 807, interface unit 808, memory 809, processor 810, and power supply 811. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 8 is not limiting of the electronic device and that the electronic device may include more or fewer components than shown, or may combine certain components, or a different arrangement of components. In the embodiment of the invention, the electronic equipment comprises, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer and the like.
A processor 810 for acquiring a target image;
determining a region of a target object from the target image;
and, for a target area in the region of the target object, filling hair textures into the target area if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold corresponding to the target area.
In the embodiment of the invention, the region of the target object is determined from the target image; for a target area in the region of the target object, if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, hair textures are filled into the target area so that the density of the hair textures in the filled target area meets the first preset density threshold corresponding to the target area. In this way, hair filling tailored to the user's requirements can be achieved, and the user experience is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used to receive and transmit signals during information transmission and reception or during a call; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 801 may also communicate with networks and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 802, such as helping the user to send and receive e-mail, browse web pages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the electronic device 800. The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used to receive an audio or video signal. The input unit 804 may include a graphics processor (Graphics Processing Unit, GPU) 8041 and a microphone 8042. The graphics processor 8041 processes image data of still pictures or video obtained by an image capturing apparatus (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or other storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 may receive sound and process it into audio data. In the case of a telephone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 801 and output.
The electronic device 800 also includes at least one sensor 805 such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor and a proximity sensor, wherein the ambient light sensor can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and the proximity sensor can turn off the display panel 8061 and/or the backlight when the electronic device 800 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the acceleration in all directions (generally three axes), and can detect the gravity and direction when stationary, and can be used for recognizing the gesture of the electronic equipment (such as horizontal and vertical screen switching, related games, magnetometer gesture calibration), vibration recognition related functions (such as pedometer and knocking), and the like; the sensor 805 may also include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described herein.
The display unit 806 is used to display information input by a user or information provided to the user. The display unit 806 may include a display panel 8061, and the display panel 8061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. In particular, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations of the user on or near the touch panel 8071 using any suitable object or accessory such as a finger or stylus). The touch panel 8071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch orientation of the user, detects a signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the touch point coordinates to the processor 810, and receives and executes commands sent from the processor 810. In addition, the touch panel 8071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. The user input unit 807 may include other input devices 8072 in addition to the touch panel 8071. In particular, other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 may be overlaid on the display panel 8061. When the touch panel 8071 detects a touch operation on or near it, the touch operation is transferred to the processor 810 to determine the type of touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of touch event. Although in fig. 8 the touch panel 8071 and the display panel 8061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 808 is an interface to which an external device is connected to the electronic apparatus 800. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 800 or may be used to transmit data between the electronic apparatus 800 and an external device.
The memory 809 can be used to store software programs as well as various data. The memory 809 may mainly include a storage program area and a storage data area, where the storage program area may store an operating system, application programs required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. In addition, the memory 809 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or other non-volatile solid-state storage device.
The processor 810 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 809, and invoking data stored in the memory 809, thereby performing overall monitoring of the electronic device. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 810.
The electronic device 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and the power supply 811 may preferably be logically coupled to the processor 810 through a power management system that provides for managing charge, discharge, and power consumption.
In addition, the electronic device 800 includes some functional modules, which are not shown, and will not be described herein.
Preferably, the embodiment of the present invention further provides an electronic device, including a processor 810, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810. When executed by the processor 810, the computer program implements each process of the above embodiment of the image processing method and can achieve the same technical effects; to avoid repetition, details are not repeated here.
The embodiment of the invention also provides a computer readable storage medium, on which a computer program is stored, which when executed by a processor, implements the processes of the above-mentioned image processing method embodiment, and can achieve the same technical effects, so that repetition is avoided, and no further description is given here. Wherein the computer readable storage medium is selected from Read-Only Memory (ROM), random access Memory (Random Access Memory, RAM), magnetic disk or optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the above-described embodiment method may be implemented by means of software plus a necessary general hardware platform, but of course may also be implemented by means of hardware, but in many cases the former is a preferred embodiment. Based on such understanding, the technical solution of the present invention may be embodied essentially or in a part contributing to the prior art in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, or a network device, etc.) to perform the method according to the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive, and many forms may be made by those having ordinary skill in the art without departing from the spirit of the present invention and the scope of the claims, which are to be protected by the present invention.

Claims (10)

1. An image processing method applied to an electronic device, the method comprising:
acquiring a target image;
determining a region of a target object from the target image;
and, for a target area in the region of the target object, filling hair textures into the target area if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold, wherein the first preset density threshold is determined according to the overall distribution sparseness of the target object.
2. The method according to claim 1, wherein for a target area in the area of the target object, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, filling the hair texture in the target area so that the density of the hair texture in the filled target area meets the first preset density threshold, comprising:
and filling the hair texture in the target area if the density of the hair texture in the target area does not meet a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not meet a preset brightness threshold corresponding to the target area, so that the density of the hair texture in the target area after filling meets the first preset density threshold and the brightness information of the hair texture in the target area after filling meets the preset brightness threshold.
3. The method according to claim 1 or 2, wherein said filling hair texture in said target area comprises:
acquiring size information of a first hair and a second hair in the target area;
determining a shortest distance between the first hair and the second hair, and determining an ipsilateral longest distance between the first hair and the second hair, according to the size information of the first hair and the second hair;
and filling hair textures in the target area according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold and the same-side longest distance is larger than a second preset distance threshold.
4. A method according to claim 3, wherein said filling hair texture in said target area from said first hair and said second hair comprises:
and adding a third hair on a line connecting the midpoints of the other two sides of the quadrilateral, wherein the first hair and the second hair are respectively two sides of the quadrilateral, and the third hair is a hair in the target area.
5. The method of claim 4, wherein the size information of the first hair comprises a diameter of the first hair and the size information of the second hair comprises a diameter of the second hair, the method further comprising, prior to filling hair textures within the target region from the first hair and the second hair:
and determining the first preset distance threshold and the second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
6. An electronic device, comprising:
the acquisition module is used for acquiring a target image;
the determining module is used for determining the area of the target object from the target image;
and the filling module is used for filling hair textures into a target area in the region of the target object if the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the filled target area meets the first preset density threshold, wherein the first preset density threshold is determined according to the overall distribution sparseness of the target object.
7. The electronic device of claim 6, wherein the filling module is further configured to:
and filling the hair texture in the target area if the density of the hair texture in the target area does not meet a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not meet a preset brightness threshold corresponding to the target area, so that the density of the hair texture in the target area after filling meets the first preset density threshold and the brightness information of the hair texture in the target area after filling meets the preset brightness threshold.
8. The electronic device according to claim 6 or 7, characterized in that,
the acquisition module is also used for acquiring the size information of the first hair and the second hair in the target area;
a determining module, further configured to determine a shortest distance between the first hair and the second hair, and determine a same-side longest distance between the first hair and the second hair, according to the size information of the first hair and the second hair;
the filling module is further configured to fill hair textures in the target area according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold and the same-side longest distance is larger than a second preset distance threshold.
9. The electronic device of claim 8, wherein the filling module is further configured to:
and adding a third hair on a line connecting the midpoints of the other two sides of the quadrilateral, wherein the first hair and the second hair are respectively two sides of the quadrilateral, and the third hair is a hair in the target area.
10. The electronic device of claim 9, wherein the size information of the first hair comprises a diameter of the first hair and the size information of the second hair comprises a diameter of the second hair;
the determining module is further configured to determine the first preset distance threshold and the second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
CN201911416461.6A 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium Active CN111210491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911416461.6A CN111210491B (en) 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201911416461.6A CN111210491B (en) 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN111210491A CN111210491A (en) 2020-05-29
CN111210491B true CN111210491B (en) 2023-07-07

Family

ID=70784123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911416461.6A Active CN111210491B (en) 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN111210491B (en)

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015029371A1 (en) * 2013-08-30 2015-03-05 Panasonic Intellectual Property Management Co., Ltd. Makeup assistance device, makeup assistance method, and makeup assistance program

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090151741A1 (en) * 2007-12-14 2009-06-18 Diem Ngo A cosmetic template and method for creating a cosmetic template
CN106296605B (en) * 2016-08-05 2019-03-26 腾讯科技(深圳)有限公司 A kind of image mending method and device
CN107547797A (en) * 2017-07-27 2018-01-05 努比亚技术有限公司 A kind of image pickup method, terminal and computer-readable recording medium
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN108573527B (en) * 2018-04-18 2020-02-18 腾讯科技(深圳)有限公司 Expression picture generation method and equipment and storage medium thereof
CN109859115A (en) * 2018-12-28 2019-06-07 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015029371A1 (en) * 2013-08-30 2015-03-05 Panasonic Intellectual Property Management Co., Ltd. Makeup assistance device, makeup assistance method, and makeup assistance program

Also Published As

Publication number Publication date
CN111210491A (en) 2020-05-29


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant