CN111210491A - Image processing method, electronic device, and storage medium - Google Patents

Image processing method, electronic device, and storage medium

Info

Publication number
CN111210491A
Authority
CN
China
Prior art keywords
hair
target area
target
preset
density
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911416461.6A
Other languages
Chinese (zh)
Other versions
CN111210491B (en
Inventor
周楚瑶
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201911416461.6A priority Critical patent/CN111210491B/en
Publication of CN111210491A publication Critical patent/CN111210491A/en
Application granted granted Critical
Publication of CN111210491B publication Critical patent/CN111210491B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation
    • G06T11/40 Filling a planar surface by adding surface attributes, e.g. colour or texture

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

An embodiment of the invention discloses an image processing method, an electronic device, and a storage medium. The image processing method comprises the following steps: acquiring a target image; determining a region of a target object from the target image; and filling hair texture in a target area if the density of the hair texture in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold. By using the embodiment of the invention, hair filling that matches the user's needs can be achieved, and the user experience is improved.

Description

Image processing method, electronic device, and storage medium
Technical Field
Embodiments of the present invention relate to the field of image processing technologies, and in particular, to an image processing method, an electronic device, and a storage medium.
Background
Electronic devices have become increasingly popular, and more and more users use them to take pictures, especially portrait photos.
Current electronic devices offer a number of makeup functions when taking photos, such as an eyebrow-beautifying function. This function first identifies the user's eyebrow region and then directly overlays a standard eyebrow filling template on it, which deepens the color of the user's eyebrows. However, the standard eyebrow filling template is generally fixed and does not match every user's needs, which results in a poor user experience. In addition, the user's eyebrow region contains both hair-dense areas and hair-sparse areas; if the standard template is overlaid directly on the whole eyebrow region, the eyebrow color becomes too dark in the dense areas and too light in the sparse areas, so the beautified eyebrows look abrupt and the user experience is poor.
Disclosure of Invention
The embodiments of the invention provide an image processing method, an electronic device, and a storage medium, aiming to solve the problems of distorted beautification effects and poor user experience caused by directly overlaying a standard filling template during image beautification.
In order to solve the technical problem, the invention is realized as follows:
in a first aspect, an embodiment of the present invention provides an image processing method applied to an electronic device, where the image processing method includes:
acquiring a target image;
determining a region of a target object from the target image;
and filling the hair texture in the target area if the density of the hair texture in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold.
In a second aspect, an embodiment of the present invention provides an electronic device, including:
the acquisition module is used for acquiring a target image;
the determining module is used for determining the area of the target object from the target image;
and the filling module is used for filling the hair texture in the target area under the condition that the density of the hair texture in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a memory, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the image processing method according to the first aspect.
In a fourth aspect, the present invention provides a computer-readable storage medium, on which a computer program is stored, where the computer program, when executed by a processor, implements the image processing method according to the first aspect.
In the embodiments of the present invention, the region of the target object is determined from the target image; for a target area within that region, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, hair texture is filled into the target area so that the density of the hair texture in the filled target area meets the first preset density threshold. Hair filling that matches the user's needs can thus be achieved, improving the user experience.
Drawings
The present invention will be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings, in which like or similar reference characters designate like or similar features.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a schematic view of an eyebrow contour provided by an embodiment of the present invention;
FIG. 3 is a schematic view of eyelashes provided by an embodiment of the present invention;
FIG. 4 is a schematic diagram of an eyebrow with missing regions according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of a right angle distance provided by an embodiment of the present invention;
FIG. 6 is a schematic illustration of filling hair texture provided by an embodiment of the present invention;
fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present invention;
FIG. 8 is a schematic diagram of another electronic device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Fig. 1 is a flowchart of an image processing method according to an embodiment of the present invention. As shown in fig. 1, the image processing method includes:
step 101: acquiring a target image;
step 102: determining a region of a target object from the target image;
step 103: and for a target area in the area of the target object, filling hair textures in the target area under the condition that the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the target area after filling meets the first preset density threshold.
In the embodiments of the present invention, the region of the target object is determined from the target image; for a target area within that region, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, hair texture is filled into the target area so that the density of the hair texture in the filled target area meets the first preset density threshold. Hair filling that matches the user's needs can thus be achieved, improving the user experience.
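For illustration only, a minimal Python sketch of steps 101 to 103 follows; it is not part of the patent. It assumes OpenCV is available, that region_mask is the output of step 102, and that the dark-pixel density measure, the 0.35 threshold, and the fill_fn placeholder for the filling routine are all illustrative choices.

```python
import cv2


def hair_density(gray, mask, dark_thresh=90):
    """Illustrative density measure: fraction of masked pixels dark enough to count as hair."""
    region = gray[mask > 0]
    return float((region < dark_thresh).mean()) if region.size else 0.0


def process(image_bgr, region_mask, first_density_threshold=0.35, fill_fn=None):
    """Steps 101-103: fill hair texture only when the target area's density falls short."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)      # grayscale copy of the acquired target image
    if hair_density(gray, region_mask) < first_density_threshold and fill_fn is not None:
        image_bgr = fill_fn(image_bgr, region_mask)          # step 103: delegate to a filling routine
    return image_bgr
```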
In an embodiment of the invention, the target image is a preview image containing the target object or a stored photo containing the target object.
In one example, when the target image is an image previewed by the electronic device, the acquiring the target image in step 101 includes:
and acquiring a preview image of the target object through a camera of the electronic equipment.
In the embodiment of the invention, when the user shoots with the electronic device, the filled effect of the target object can be previewed on the preview interface, which makes shooting more engaging; when the user is satisfied with the filling effect, the user can tap to shoot, and the filled effect image of the target object is captured, which enhances the interaction between the user and the electronic device.
In one example, when the target image is an image photo of the target object, the acquiring of the target image in step 101 includes:
receiving a selection operation of a user on a plurality of images stored in the electronic equipment;
in response to the selection operation, a target image is acquired.
In the related art, the user needs to manually adjust the filling effect for a target object in an image; because the result depends on the user's manual skill, the filling effect of the target object does not necessarily match the user's preference, which reduces the user experience.
In some embodiments of the present invention, the target object of step 102 comprises an eyebrow or an eyelash.
In one example, the determining the region of the target object from the target image in step 102 includes:
acquiring a plurality of contour feature points of a target object from a target image;
and determining the contour of the target object according to the plurality of contour feature points, wherein the area enclosed by the contour is the region of the target object.
In fig. 2, the plurality of contour feature points are "○", and the range surrounded by the plurality of "○" is an area of the eyebrow.
In one example, the plurality of contour feature points may include: the eyebrow head, eyebrow belly, eyebrow waist, eyebrow peak, eyebrow slope, eyebrow tail, and eyebrow tip.
In fig. 3, the plurality of contour feature points are "○", and the range surrounded by the plurality of "○" is the region of the eyelash object.
In the embodiment of the invention, the region of the eyebrows or eyelashes in the target image can be determined from the plurality of contour feature points of the eyebrows or eyelashes, which in turn determines the area that needs to be filled, so the operation is convenient and quick.
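As an illustrative sketch (not part of the patent), the area enclosed by the contour feature points can be turned into a binary mask with OpenCV's fillPoly; the landmark coordinates are assumed to come from any face-landmark detector, and the sample coordinates in the usage comment are made up.

```python
import cv2
import numpy as np


def region_from_landmarks(image_shape, contour_points):
    """Binary mask of the area enclosed by the contour feature points (the "○" marks in Fig. 2 and 3)."""
    mask = np.zeros(image_shape[:2], dtype=np.uint8)
    pts = np.asarray(contour_points, dtype=np.int32).reshape(-1, 1, 2)
    cv2.fillPoly(mask, [pts], 255)   # the enclosed polygon is the region of the target object
    return mask


# Usage sketch with made-up landmark coordinates (x, y):
# eyebrow_mask = region_from_landmarks(image.shape,
#                                      [(120, 80), (150, 70), (185, 72), (210, 85), (190, 95), (150, 92)])
```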
For convenience of description, the eyebrow will be explained as an example hereinafter.
In one example, the region of the target object comprises a plurality of target areas, which are divided according to the growth tendency of the hair texture.
For example, in fig. 2, the number of regions of the target object is 3.
It should be noted that the number of target areas may be chosen according to the desired beautification precision, which is not described in detail here.
In an example, the filling of the hair texture in the target region in step 103, if the density of the hair texture in the target region does not satisfy the first preset density threshold corresponding to the target region, so that the density of the hair texture in the filled target region satisfies the first preset density threshold, includes:
when the degree of sparseness of the hair distribution in the target area does not meet the corresponding first preset density threshold, hair texture needs to be filled into the target area, so that the distribution of the eyebrows in the target area changes from sparse to dense and the distribution sparseness of the eyebrows in the filled target area meets the corresponding first preset density threshold.
It should be noted that the first preset density threshold may be determined according to the distribution sparseness of the user's eyebrows as a whole, so that the filled eyebrows are more uniform and no area looks abrupt because too much hair texture was filled into it, which improves the overall aesthetics of the target object and the user experience.
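For illustration only, one way to derive such a threshold from the user's own eyebrow is sketched below, reusing the hypothetical hair_density helper from the earlier sketch; the per-area masks and the 0.9 relaxation factor are assumptions, not values from the patent.

```python
def first_density_threshold(gray, target_area_masks, relax=0.9):
    """Illustrative: derive the first preset density threshold from the whole eyebrow."""
    densities = [hair_density(gray, m) for m in target_area_masks]   # one mask per target area (cf. Fig. 2)
    overall = sum(densities) / len(densities)
    return relax * overall   # sparse areas are filled toward the eyebrow's own overall level
```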
In one example, besides its distribution sparseness, the eyebrow also involves brightness information, that is, the color depth of the eyebrow as a whole. In this case, the filling of the hair texture in the target area in step 103, if the density of the hair texture in the target area does not satisfy the first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area satisfies the first preset density threshold, includes:
for a target area in the area of the target object, if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not satisfy a preset brightness threshold corresponding to the target area, filling the hair texture in the target area so that the density of the hair texture in the filled target area satisfies the first preset density threshold and the brightness information of the hair texture in the filled target area satisfies the preset brightness threshold.
Taking dense black eyebrows as an example: for a target area of the eyebrow (which may be any area in fig. 2), if the hair distribution in the target area is sparse and its color is light, hair texture needs to be filled into the area, so that the eyebrow distribution in the target area changes from sparse to dense and its color from light to dark, until the distribution sparseness of the eyebrows in the filled target area meets the corresponding first preset density threshold and the color depth of the eyebrows in the filled target area meets the preset brightness threshold.
It should be noted that the preset brightness threshold may be determined according to the color depth of the user's eyebrows as a whole, so that the color depth of the filled eyebrows is more uniform and no area looks abrupt because darker eyebrow hair was filled into it, which improves the overall aesthetics of the target object and the user experience.
In addition, when the color depth of the eyebrows in the target area is inconsistent with that of the other areas, for example, when it is lighter than the other areas, the color depth of the eyebrows in the target area can be adjusted first so that it matches the other areas as closely as possible, making the whole eyebrow relatively uniform and in line with the user's preference. After the color depth of the target area is adjusted, hair filling is carried out on the area, so that the filled eyebrows are free of uneven distribution and inconsistent color depth, the overall effect of the eyebrows is attractive, the user's preference is met, and the user experience is improved.
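The patent does not give an implementation of this color-depth adjustment; as an illustrative sketch only, it can be approximated by scaling the gray levels of the target area toward the preset brightness threshold, where the scaling rule and the grayscale representation are assumptions.

```python
import numpy as np


def adjust_color_depth(gray, mask, preset_brightness_threshold):
    """Illustrative: darken a target area whose mean gray level is above the threshold (i.e. too light)."""
    region_mean = float(gray[mask > 0].mean())
    if region_mean > preset_brightness_threshold:
        scale = preset_brightness_threshold / region_mean
        out = gray.copy()
        out[mask > 0] = np.clip(out[mask > 0] * scale, 0, 255).astype(np.uint8)
        return out
    return gray
```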
In one example, if there is a missing region in the target region, the missing part needs to be filled based on the standard target object, which specifically includes:
when the density of the hair texture in the target area does not meet a second preset density threshold, determining that a missing area exists in the target area; wherein the second preset density threshold is smaller than the first preset density threshold;
filling a hair texture within a target region, comprising:
filling the missing region according to the hair texture in the region corresponding to the missing region in the standard target object; the standard target object is a target object in which the density of the hair texture in each target area meets a first preset density threshold corresponding to each target area, and the brightness information of the hair texture in each target area meets a preset brightness threshold corresponding to each target area.
It should be noted that the standard target object may be an eyebrow constructed according to the overall sparseness and color depth of the user's own eyebrows; that is, before the user's eyebrows are filled, a standard eyebrow may be determined based on the overall sparseness and color depth of the user's eyebrows, and the eyebrow with the missing region is then filled from it. After the missing region is filled from this standard eyebrow, the sparseness of the result is uniform and the color depth is appropriate, so the filled missing region does not look inconsistent with the other regions of the eyebrow, which would otherwise make the whole eyebrow appear abrupt and worsen the effect.
In fig. 4, a missing region exists between contour feature point 1 and contour feature point 2. Because the length of the missing region is smaller than the distance between contour feature point 1 and contour feature point 2 (for example, the minimum length of the missing region is 1/3 of the distance between contour feature point 1 and contour feature point 2), the missing region can be filled from the region corresponding to it between contour feature point 1 and contour feature point 2 in the standard target object.
For filling the missing region, a portion of hair can alternatively be cut from the target region and used for filling, so that the filled eyebrow better matches the characteristics of the user's own eyebrows; in other words, the filling result differs from user to user and carries the user's personal characteristics, which gives a better user experience.
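For illustration only, the missing-region branch can be sketched as follows, reusing the hypothetical hair_density helper from the earlier sketch; the standard eyebrow image is assumed to be already aligned to the user's eyebrow, and copying gray values is a simplification of the texture filling described above.

```python
def fill_missing_region(user_gray, standard_gray, area_mask, second_density_threshold):
    """Illustrative: patch a 'missing' target area from the corresponding area of the standard eyebrow."""
    if hair_density(user_gray, area_mask) < second_density_threshold:   # below the second (lower) threshold
        out = user_gray.copy()
        out[area_mask > 0] = standard_gray[area_mask > 0]               # copy the corresponding hair texture
        return out
    return user_gray
```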
In this embodiment of the present invention, optionally, the filling the hair texture in the target region in step 103 includes:
acquiring size information of a first hair and a second hair in a target area;
determining the shortest distance between the first hair and the second hair and the longest distance on the same side of the first hair and the second hair according to the size information of the first hair and the second hair;
and under the condition that the shortest distance is smaller than a first preset distance threshold value and the longest distance on the same side is larger than a second preset distance threshold value, filling hair textures in the target area according to the first hair and the second hair.
In this way, whether to fill hair is judged according to the distances between the two hairs, so that the hair filling is more natural and attractive and meets the user's needs.
In fig. 5, the size information of a first hair a1 and a second hair a2 in the second target area (from left to right) is obtained. Next, the shortest distance (i.e., the smallest right-angle distance) between the first end point D of the first hair a1 and the second hair a2 is determined, as well as the same-side distances between the end points of the first hair a1 and the corresponding end points of the second hair a2 (the first same-side distance and the second same-side distance, which may be equal or different; the larger of the two is the longest same-side distance). When the shortest distance is smaller than the first preset distance threshold and the longest same-side distance is larger than the second preset distance threshold, hair texture is filled into the target area according to the first hair and the second hair.
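The geometric test itself can be sketched as follows; this is illustrative, not the patent's implementation. Each hair is reduced to a line segment between its two end points, and the end points of h1 and h2 are assumed to be listed in a consistent order so that h1[0] and h2[0] lie on the same side.

```python
import numpy as np


def point_segment_distance(p, a, b):
    """Right-angle distance from point p to the segment a-b (clamped to the segment)."""
    p, a, b = (np.asarray(v, dtype=float) for v in (p, a, b))
    ab = b - a
    t = np.clip(np.dot(p - a, ab) / np.dot(ab, ab), 0.0, 1.0)
    return float(np.linalg.norm(p - (a + t * ab)))


def should_fill_between(h1, h2, first_dist_threshold, second_dist_threshold):
    """h1, h2: ((x1, y1), (x2, y2)) end points of the first and second hair."""
    shortest = min(min(point_segment_distance(p, *h2) for p in h1),
                   min(point_segment_distance(q, *h1) for q in h2))
    same_side_longest = max(float(np.linalg.norm(np.subtract(h1[0], h2[0]))),
                            float(np.linalg.norm(np.subtract(h1[1], h2[1]))))
    return shortest < first_dist_threshold and same_side_longest > second_dist_threshold
```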
In one example, filling a hair texture within a target region according to a first hair and a second hair, comprises:
a first hair and/or a second hair is added to the target area.
In the embodiment of the invention, the hairs closest to the area to be filled are preferentially used for filling, which gives the best filling effect, improves the overall aesthetics of the eyebrow, and provides a better user experience.
In one example, adding a first hair and/or a second hair within a target area includes:
taking the first hair and the second hair respectively as two sides of a quadrangle, and adding a third hair on the line connecting the midpoints of the other two sides of the quadrangle; wherein the third hair is a hair in the target area.
In fig. 6, the first hair a1 and the second hair a2 are two sides of the quadrangle, and e and f are the other two sides. A third hair whose growth direction and length match the line connecting the midpoints of e and f is filled along that line; that is, after the line connecting the midpoints of sides e and f is determined, a third hair with the same growth direction and length as that line is selected from the target area and used for the filling. The filled result therefore conforms better to the hair growth in the target area and improves the overall appearance of the eyebrow.
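A hedged sketch of this step is given below; drawing an anti-aliased straight stroke between the two midpoints is a simplification of copying a matching hair from the target area, and the color and thickness values are illustrative assumptions.

```python
import cv2
import numpy as np


def add_third_hair(image_bgr, h1, h2, color=(40, 30, 30), thickness=1):
    """Illustrative: draw a third hair along the line joining the midpoints of sides e and f."""
    (p1, p2), (q1, q2) = h1, h2                                                  # end points of the two hairs
    mid_e = (np.asarray(p1, dtype=float) + np.asarray(q1, dtype=float)) / 2.0    # midpoint of side e (p1-q1)
    mid_f = (np.asarray(p2, dtype=float) + np.asarray(q2, dtype=float)) / 2.0    # midpoint of side f (p2-q2)
    pt1 = tuple(int(round(v)) for v in mid_e)
    pt2 = tuple(int(round(v)) for v in mid_f)
    cv2.line(image_bgr, pt1, pt2, color, thickness, lineType=cv2.LINE_AA)
    return image_bgr
```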
In one example, the size information of the first hair comprises a diameter of the first hair, and the size information of the second hair comprises a diameter of the second hair;
before filling the hair texture in the target region according to the first hair and the second hair, the method further comprises:
determining a first preset distance threshold value and a second preset distance threshold value according to the diameter of the first hair and the diameter of the second hair.
It should be noted that the step of determining the first preset distance threshold and the second preset distance threshold may be performed simultaneously with the step of determining the shortest distance and the longest distance on the same side, or performed first, or performed later, and will not be described herein again.
In one example, the first preset distance threshold is: k1(b1 + b2)/k2; the second preset distance threshold is: (b1 + b2)/k2 × k1;
wherein b1 is the diameter of the first hair, b2 is the diameter of the second hair, k1 and k2 are positive numbers, and k1 > k2.
Fig. 7 is a schematic diagram of an electronic device according to an embodiment of the present invention. As shown in fig. 7, the electronic device 700 includes:
an obtaining module 701, configured to obtain a target image;
a determining module 702, configured to determine a region of the target object from the target image;
the filling module 703 is configured to, for a target area in an area of a target object, fill a hair texture in the target area if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area satisfies the first preset density threshold.
In the embodiments of the present invention, the region of the target object is determined from the target image; for a target area within that region, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, hair texture is filled into the target area so that the density of the hair texture in the filled target area meets the first preset density threshold. Hair filling that matches the user's needs can thus be achieved, improving the user experience.
The filling module 703 is further configured to:
for a target area in the area of the target object, if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not satisfy a preset brightness threshold corresponding to the target area, filling the hair texture in the target area so that the density of the hair texture in the filled target area satisfies the first preset density threshold and the brightness information of the hair texture in the filled target area satisfies the preset brightness threshold.
Optionally, the obtaining module 701 is further configured to obtain size information of the first hair and the second hair in the target area;
a determining module 702, further configured to determine a shortest distance between the first hair and the second hair and a longest distance on the same side between the first hair and the second hair according to the size information of the first hair and the second hair;
the filling module 703 is further configured to fill hair textures in the target area according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold and the longest distance on the same side is greater than a second preset distance threshold.
Optionally, the filling module 703 is further configured to:
and adding a third hair on the line connecting the midpoints of the other two sides of the quadrangle, wherein the first hair and the second hair are respectively two sides of the quadrangle, and the third hair is a hair in the target area.
Optionally, the size information of the first hair includes a diameter of the first hair, and the size information of the second hair includes a diameter of the second hair;
the determining module 702 is further configured to determine a first preset distance threshold and a second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
The electronic device provided in the embodiment of the present invention can implement each process implemented by the electronic device in the method embodiment of fig. 1, and is not described herein again to avoid repetition.
In the embodiments of the present invention, the region of the target object is determined from the target image; for a target area within that region, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, hair texture is filled into the target area so that the density of the hair texture in the filled target area meets the first preset density threshold. Hair filling that matches the user's needs can thus be achieved, improving the user experience.
Fig. 8 is a schematic diagram of a hardware structure of an electronic device implementing various embodiments of the present invention.
The electronic device 800 includes, but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the electronic device configuration shown in fig. 8 does not constitute a limitation of the electronic device, and that the electronic device may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
A processor 810 for obtaining a target image;
determining a region of a target object from the target image;
and filling the hair texture in the target area if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area meets the first preset density threshold corresponding to the target area.
In the embodiments of the present invention, the region of the target object is determined from the target image; for a target area within that region, if the density of the hair texture in the target area does not meet the first preset density threshold corresponding to the target area, hair texture is filled into the target area so that the density of the hair texture in the filled target area meets the first preset density threshold. Hair filling that matches the user's needs can thus be achieved, improving the user experience.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used for receiving and sending signals during a message sending and receiving process or a call process, and specifically, receives downlink data from a base station and then processes the received downlink data to the processor 810; in addition, the uplink data is transmitted to the base station. In general, radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user via the network module 802, such as to assist the user in sending and receiving e-mails, browsing web pages, and accessing streaming media.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the electronic apparatus 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving an audio or video signal. The input unit 804 may include a Graphics Processing Unit (GPU) 8041 and a microphone 8042, and the graphics processor 8041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or other storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 may receive sound and may be capable of processing such sound into audio data. In the case of a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 801 and output.
The electronic device 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the electronic device 800 is moved to the ear. As one type of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of an electronic device (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 806 is used to display information input by the user or information provided to the user. The Display unit 806 may include a Display panel 8061, and the Display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the electronic apparatus. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 8071 (e.g., operations by a user on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two portions: a touch detection device and a touch controller. The touch detection device detects the position touched by the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts the touch information into touch point coordinates, sends the touch point coordinates to the processor 810, receives a command from the processor 810, and executes the command. In addition, the touch panel 8071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 8071, the user input unit 807 may include other input devices 8072. In particular, other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near it, the touch operation can be transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 can provide a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8 the touch panel 8071 and the display panel 8061 are two independent components implementing the input and output functions of the electronic device, in some embodiments the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the electronic device, which is not limited here.
The interface unit 808 is an interface for connecting an external device to the electronic apparatus 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the electronic device 800 or may be used to transmit data between the electronic device 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 809 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 810 is a control center of the electronic device, connects various parts of the whole electronic device by using various interfaces and lines, performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby monitoring the whole electronic device. Processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 810.
The electronic device 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and preferably, the power supply 811 may be logically coupled to the processor 810 via a power management system to manage charging, discharging, and power consumption management functions via the power management system.
In addition, the electronic device 800 includes some functional modules that are not shown, and are not described in detail herein.
Preferably, an embodiment of the present invention further provides an electronic device, which includes a processor 810, a memory 809, and a computer program stored in the memory 809 and capable of running on the processor 810, where the computer program, when executed by the processor 810, implements each process of the above-mentioned embodiment of the image processing method and can achieve the same technical effect; to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising an … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image processing method applied to an electronic device, the method comprising:
acquiring a target image;
determining a region of a target object from the target image;
and for a target area in the area of the target object, filling hair textures in the target area under the condition that the density of the hair textures in the target area does not meet a first preset density threshold corresponding to the target area, so that the density of the hair textures in the target area after filling meets the first preset density threshold.
2. The method according to claim 1, wherein the filling, for a target area in the target object area, hair texture in the target area if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area, so that the density of the hair texture in the target area after filling satisfies the first preset density threshold comprises:
for a target area in the area of the target object, if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not satisfy a preset brightness threshold corresponding to the target area, filling the hair texture in the target area so that the density of the hair texture in the filled target area satisfies the first preset density threshold and the brightness information of the hair texture in the filled target area satisfies the preset brightness threshold.
3. The method of claim 1 or 2, wherein the filling of the hair texture within the target region comprises:
acquiring size information of a first hair and a second hair in the target area;
determining the shortest distance between the first hair and the second hair and the longest distance on the same side of the first hair and the second hair according to the size information of the first hair and the second hair;
and under the condition that the shortest distance is smaller than a first preset distance threshold value and the longest distance on the same side is larger than a second preset distance threshold value, filling hair textures in the target area according to the first hair and the second hair.
4. The method of claim 3, wherein said filling the hair texture within the target region based on the first hair and the second hair comprises:
and adding a third hair on the line connecting the midpoints of the other two sides of a quadrangle whose two sides are respectively the first hair and the second hair, wherein the third hair is a hair in the target area.
5. The method of claim 4, wherein the size information of the first hair comprises a diameter of the first hair, the size information of the second hair comprises a diameter of the second hair, and the method further comprises, before filling the target area with the hair texture based on the first hair and the second hair:
determining the first preset distance threshold and the second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
6. An electronic device, comprising:
the acquisition module is used for acquiring a target image;
a determining module, configured to determine a region of a target object from the target image;
and a filling module, configured to, for a target area in the area of the target object, fill a hair texture in the target area if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area, so that the density of the hair texture in the filled target area satisfies the first preset density threshold.
7. The electronic device of claim 6, wherein the fill module is further configured to:
for a target area in the area of the target object, if the density of the hair texture in the target area does not satisfy a first preset density threshold corresponding to the target area and the brightness information of the hair texture in the target area does not satisfy a preset brightness threshold corresponding to the target area, filling the hair texture in the target area so that the density of the hair texture in the filled target area satisfies the first preset density threshold and the brightness information of the hair texture in the filled target area satisfies the preset brightness threshold.
8. The electronic device of claim 6 or 7,
the acquisition module is further used for acquiring the size information of the first hair and the second hair in the target area;
the determining module is further used for determining the shortest distance between the first hair and the second hair and the longest distance between the first hair and the second hair on the same side according to the size information of the first hair and the second hair;
the filling module is further configured to fill hair textures in the target region according to the first hair and the second hair when the shortest distance is smaller than a first preset distance threshold and the longest distance on the same side is greater than a second preset distance threshold.
9. The electronic device of claim 8, wherein the fill module is further configured to:
and adding a third hair on the line connecting the midpoints of the other two sides of a quadrangle whose two sides are respectively the first hair and the second hair, wherein the third hair is a hair in the target area.
10. The electronic device of claim 9, wherein the size information of the first hair comprises a diameter of the first hair, and the size information of the second hair comprises a diameter of the second hair;
the determining module is further configured to determine the first preset distance threshold and the second preset distance threshold according to the diameter of the first hair and the diameter of the second hair.
CN201911416461.6A 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium Active CN111210491B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911416461.6A CN111210491B (en) 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium

Publications (2)

Publication Number Publication Date
CN111210491A true CN111210491A (en) 2020-05-29
CN111210491B CN111210491B (en) 2023-07-07

Family

ID=70784123

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911416461.6A Active CN111210491B (en) 2019-12-31 2019-12-31 Image processing method, electronic device, and storage medium

Country Status (1)

Country Link
CN (1) CN111210491B (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090151741A1 (en) * 2007-12-14 2009-06-18 Diem Ngo A cosmetic template and method for creating a cosmetic template
WO2015029371A1 (en) * 2013-08-30 2015-03-05 パナソニックIpマネジメント株式会社 Makeup assistance device, makeup assistance method, and makeup assistance program
CN106296605A (en) * 2016-08-05 2017-01-04 腾讯科技(深圳)有限公司 A kind of image mending method and device
CN107547797A (en) * 2017-07-27 2018-01-05 努比亚技术有限公司 A kind of image pickup method, terminal and computer-readable recording medium
CN107835367A (en) * 2017-11-14 2018-03-23 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN108573527A (en) * 2018-04-18 2018-09-25 腾讯科技(深圳)有限公司 A kind of expression picture generation method and its equipment, storage medium
CN109859115A (en) * 2018-12-28 2019-06-07 努比亚技术有限公司 A kind of image processing method, terminal and computer readable storage medium

Also Published As

Publication number Publication date
CN111210491B (en) 2023-07-07

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant