CN111260600B - Image processing method, electronic equipment and medium - Google Patents

Image processing method, electronic equipment and medium

Info

Publication number
CN111260600B
CN111260600B CN202010070763.9A
Authority
CN
China
Prior art keywords
target
area
image
sticker image
brightness
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202010070763.9A
Other languages
Chinese (zh)
Other versions
CN111260600A (en)
Inventor
李阳勤
石小周
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN202010070763.9A priority Critical patent/CN111260600B/en
Publication of CN111260600A publication Critical patent/CN111260600A/en
Application granted granted Critical
Publication of CN111260600B publication Critical patent/CN111260600B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/50Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02DCLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
    • Y02D10/00Energy efficient computing, e.g. low power processors, power management or thermal management

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The embodiment of the invention discloses an image processing method, electronic equipment and a medium. The method comprises the following steps: determining a target area in a target sticker image according to the brightness of a first sticker image, the brightness of a first area in a target image into which the sticker is to be fused, and the curvature of a first edge curve of the first sticker image, where the first edge curve is the outermost curve of the first sticker image and the target sticker image is the first sticker image or an image obtained from the first sticker image; and performing brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and covering the part of the first area outside the first sub-area with the part of the target sticker image outside the target area. The embodiment of the invention can solve the problem that an image looks unnatural after a sticker is added.

Description

Image processing method, electronic equipment and medium
Technical Field
The embodiment of the invention relates to the field of image processing, in particular to an image processing method, electronic equipment and a medium.
Background
Many applications can now add various kinds of stickers or beauty effects to photos taken by the user to create different special effects.
However, a sticker is currently added by directly overlaying a camera's built-in sticker on a target area of the target image to be processed. With this method, the sticker can look unnatural against the target image once applied, which degrades the image processing result.
Disclosure of Invention
The embodiment of the invention provides an image processing method, electronic equipment and a medium, which are used to solve the problem that an image looks unnatural after a sticker is added.
In a first aspect, an embodiment of the present invention provides an image processing method applied to an electronic device, including:
determining a target area in a target sticker image according to the brightness of a first sticker image, the brightness of a first area in a target image into which the sticker is to be fused, and the curvature of a first edge curve of the first sticker image, where the first edge curve is the outermost curve of the first sticker image and the target sticker image is the first sticker image or an image obtained from the first sticker image;
and performing brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and covering the part of the first area outside the first sub-area with the part of the target sticker image outside the target area.
In a second aspect, an embodiment of the present invention further provides an electronic device, including:
an area determining module for determining a target area in the target sticker image according to the brightness of the first sticker image, the brightness of the first area of the target image into which the sticker is to be fused, and the curvature of the first edge curve of the first sticker image, where the first edge curve is the outermost curve of the first sticker image and the target sticker image is the first sticker image or an image obtained from the first sticker image;
and a brightness fusion module for performing brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and covering the part of the first area outside the first sub-area with the part of the target sticker image outside the target area.
In a third aspect, an embodiment of the present invention provides an electronic device comprising a processor, a memory and a computer program stored on the memory and executable on the processor, the computer program implementing the steps of the image processing method as in the first aspect when executed by the processor.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, implements the steps of the image processing method as in the first aspect.
In the embodiment of the invention, the target area in the target sticker image is determined according to the brightness of the first sticker image, the brightness of the corresponding area in the target image, and the curvature of the first edge curve of the first sticker image; the target area is the edge band of the target sticker image to be fused with the target image, and the target sticker image is the first sticker image or an image obtained from the first sticker image. Brightness-weighted fusion of the target area with the corresponding area in the target image then blends the brightness of the target sticker image and the target image within this transition band, so a smoother transition appears between the sticker and the target image after the sticker is applied, the unnatural look of the pasted edge is reduced, and the image processing effect is improved.
Drawings
The invention will be better understood from the following description of specific embodiments thereof taken in conjunction with the accompanying drawings in which like or similar reference characters designate like or similar features.
Fig. 1 is a schematic flow chart of an image processing method according to an embodiment of the present invention;
FIG. 2 is a flowchart of another image processing method according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a target sticker image according to an embodiment of the present invention;
FIG. 4 is a flowchart of another image processing method according to an embodiment of the present invention;
fig. 5 is a schematic structural diagram of an electronic device according to an embodiment of the present invention;
fig. 6 is a schematic hardware structure of an electronic device according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present invention will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are some, but not all embodiments of the invention. All other embodiments, which can be made by those skilled in the art based on the embodiments of the invention without making any inventive effort, are intended to be within the scope of the invention.
In existing image processing methods, differences in color, texture, and other attributes between the sticker and the target area can make the result look abrupt and unrealistic; obvious seams readily appear along the edge between the sticker and the target area, and the image processing effect is poor.
In order to solve the defect, an embodiment of the present invention provides an image processing method, which is applied to an electronic device, referring to fig. 1, fig. 1 shows a flow diagram of the image processing method provided by the embodiment of the present invention; the method comprises the following steps:
s101, determining a target area in a target sticker image according to the brightness of the first sticker image, the brightness of a first area fused with the brightness of the target image and the curvature of a first edge curve of the first sticker image;
the first edge curve is the outermost curve of the first sticker image, and the target sticker image is the first sticker image or an image obtained according to the first sticker image.
If there is a large brightness difference between the first sticker image and the first area, an unnatural situation exists at the edge of the attached area after the mapping, so that the target area needing to perform brightness fusion needs to be determined according to the similarity degree between the brightness of the first sticker image and the brightness of the first area. The shape of the first sticker image can be determined according to the curvature of the first edge region, and when the target region to be fused is determined, the brightness and the curvature are mainly integrated to perform calculation, so that the region needing brightness fusion is accurately determined.
S102, performing brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and covering the part of the first area outside the first sub-area with the part of the target sticker image outside the target area.
The part of the first area other than the first sub-area is the second sub-area.
Brightness-weighted fusion means that the brightness of the target area and the brightness of the first sub-area are each multiplied by a corresponding weight and then summed, the sum being taken as the fused brightness value. Covering means that the second sub-area takes on the brightness of the part of the target sticker image outside the target area. For a natural fusion, the weight coefficients applied across the target area change gradually from the inside outward.
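The brightness-weighted fusion above can be sketched as follows; this is a minimal illustration, and the per-pixel weight map that grades from the inner boundary outward is assumed to be supplied by the caller (the patent does not fix how the weights are generated):

```python
import numpy as np

def fuse_luminance(sticker_y, target_y, weights):
    """Weighted fusion of two luminance (Y-channel) arrays.

    sticker_y, target_y: arrays of identical shape (the target area of
    the sticker and the corresponding first sub-area of the target image).
    weights: per-pixel weight of the sticker, graded from ~1 at the inner
    boundary to ~0 at the outer edge so the transition is smooth.
    """
    sticker_y = sticker_y.astype(np.float64)
    target_y = target_y.astype(np.float64)
    # each pixel: w * sticker + (1 - w) * target, summed as the fused value
    return weights * sticker_y + (1.0 - weights) * target_y
```

With a weight of 0.5 everywhere this reduces to a plain average of the two luminance values; a radially graded weight map reproduces the inside-to-outside transition the patent describes.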
In the embodiment of the invention, the brightness of the target sticker image and the brightness of the target image are blended in the transition area, so a smoother transition appears between the target area and the target image after the sticker is applied, the unnatural look of the pasted edge is reduced, and the image processing effect is improved.
In some embodiments of the present invention, as shown in fig. 2, fig. 2 is a schematic flow chart of another image processing method according to an embodiment of the present invention. The method may include:
s201, dividing a first sticker image into a preset number of first grids, and dividing a first area into second grids which are in one-to-one correspondence with the first grids;
s202, respectively deforming each first grid into the shape of a second grid corresponding to the first grid to obtain a target sticker image;
s203, determining a target area in the target sticker image according to the brightness of the target sticker image, the brightness of the first area and the curvature of the first edge curve of the target sticker image;
s204, carrying out brightness weighted fusion on the target area and a first subarea corresponding to the target area in the first area, and covering the area except the first subarea in the first area in the target sticker image. S204 is similar to S102 in fig. 1, and will not be described again.
In this embodiment, the first sticker image is divided into grids and adjusted cell by cell, so that the shape of the target sticker image matches the shape of the first area to be fused in the target image as closely as possible, improving the naturalness after pasting. Adjusting the image shape per grid cell also improves the accuracy and convenience of the adjustment.
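The grid division and per-cell deformation of S201-S202 can be sketched as follows. For illustration this assumes a fixed 2x2 grid, even image dimensions, nearest-neighbour resampling, and target cell shapes that are consistent per row and column — none of which the patent fixes:

```python
import numpy as np

def warp_by_grid(sticker, cell_shapes):
    """Split `sticker` into a 2x2 grid of first grids and deform each cell
    to the shape of its corresponding second grid.

    cell_shapes: 2x2 nested list of (rows, cols) for the second grids;
    shapes must agree along each row and column so cells tile cleanly.
    """
    h, w = sticker.shape[:2]
    half_h, half_w = h // 2, w // 2
    rows_out = []
    for i in range(2):
        cells = []
        for j in range(2):
            cell = sticker[i * half_h:(i + 1) * half_h,
                           j * half_w:(j + 1) * half_w]
            th, tw = cell_shapes[i][j]
            # nearest-neighbour index maps from target cell back to source cell
            ys = (np.arange(th) * cell.shape[0] / th).astype(int)
            xs = (np.arange(tw) * cell.shape[1] / tw).astype(int)
            cells.append(cell[np.ix_(ys, xs)])
        rows_out.append(np.hstack(cells))
    return np.vstack(rows_out)
```

A production implementation would use a finer grid and smooth interpolation, but the structure — divide, deform cell by cell, reassemble — is the same.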
In other embodiments of the present invention, before S201, the method may further include:
extracting a face region in the target image;
detecting key points in the face area;
determining a facial-feature region (eyes, nose, mouth, etc.) in the face region according to the key points;
matting out the facial-feature region;
correspondingly, step S201 is adjusted to: dividing the first region of the target image, with the facial features matted out, into second grids corresponding one-to-one to the first grids.
In this embodiment, the facial features are matted out so that their brightness is unaffected by the subsequent brightness adjustment, which avoids distortion of the facial features and improves the naturalness of the final image. The face region here need not be limited to a human face. Note that the facial-feature portion must be restored after the brightness adjustment is completed.
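A minimal sketch of the mat-out-and-restore step above. Rectangular feature boxes stand in for regions derived from detected key points; the patent names no specific detector, so the box representation is an assumption:

```python
import numpy as np

def mask_out_features(region_y, feature_boxes):
    """Exclude facial-feature areas from the luminance channel before
    fusion, saving the patches so they can be restored afterwards.

    feature_boxes: list of (y0, y1, x0, x1) boxes, a stand-in for
    regions derived from detected facial key points.
    Returns (mask, saved): mask is True where fusion may act.
    """
    mask = np.ones_like(region_y, dtype=bool)
    saved = []
    for (y0, y1, x0, x1) in feature_boxes:
        saved.append(((y0, y1, x0, x1), region_y[y0:y1, x0:x1].copy()))
        mask[y0:y1, x0:x1] = False
    return mask, saved

def restore_features(region_y, saved):
    """Put the matted-out feature patches back after luminance fusion."""
    for (y0, y1, x0, x1), patch in saved:
        region_y[y0:y1, x0:x1] = patch
    return region_y
```

The mask limits where brightness adjustment applies, and `restore_features` implements the note that the feature portion must be put back once the adjustment is done.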
In still other embodiments of the present invention, the step S203 may include:
determining a geometric center point of the target sticker image;
calculating a distance set from a geometric center point to each pixel point on a second edge curve of the target area according to the brightness of the target sticker image, the brightness of the first area and the curvature of the first edge curve of the target sticker image;
and determining a second edge curve according to the distance set, and taking the area between the second edge curve and the first edge curve as a target area.
Referring to fig. 3, fig. 3 is a schematic diagram of a target sticker image according to an embodiment of the present invention. R1 and R2 are distances from the geometric center point to pixel points on the second edge curve, and L1 and L2 are distances from the geometric center point to pixel points on the first edge curve; region I is the part of the target sticker image outside the target area, and region II is the target area. Since the target area to be determined in this embodiment is the edge band of the target sticker image to be fused with the target image, the band necessarily extends from the first edge curve at the outermost side of the target sticker image inward to the second edge curve. Therefore, the position of the second edge curve, and hence the target area, can be determined from the brightness similarity between the target sticker image and the first area to be fused, together with the curvature of the first edge curve.
In a further embodiment, as shown in fig. 4, fig. 4 is a schematic flow chart of still another image processing method according to an embodiment of the present invention. The method may include:
s301, dividing a first sticker image into a preset number of first grids, and dividing a first area into second grids which are in one-to-one correspondence with the first grids;
s302, respectively deforming each first grid into the shape of a second grid corresponding to the first grid to obtain a target sticker image;
s301 to S302 are similar to S201 to S202 in fig. 2, and are not described here again.
S303, determining a geometric center point O of the target sticker image. Since this embodiment determines the set of distances from the geometric center point to the pixel points on the second edge curve, the geometric center point of the target sticker image must be determined first.
S304, calculating the curvature K of each pixel point on the first edge curve of the target sticker image;
since the first edge curve is not a regular curve, the curvature of each pixel point on the first edge curve is typically different.
S305, calculating a first luminance histogram H1 of the target sticker image and a second luminance histogram H2 of the first region;
The luminance histogram is a quantitative tool that intuitively reflects the brightness of each part of an image; this embodiment does not limit the specific algorithm used to calculate the luminance histogram.
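A minimal luminance-histogram sketch. Normalizing by pixel count is an assumption made here so that histograms of differently sized regions remain comparable:

```python
import numpy as np

def luminance_histogram(y, bins=256):
    """Normalized luminance histogram of a Y-channel image with values
    in 0..255. Each bin holds the fraction of pixels at that level, so
    the histogram sums to 1 regardless of region size.
    """
    hist, _ = np.histogram(y.ravel(), bins=bins, range=(0, 256))
    return hist.astype(np.float64) / max(y.size, 1)
```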
S306, calculating the similarity d(H1, H2) between the first luminance histogram and the second luminance histogram;
Alternatively, the similarity may be calculated by any one of the following algorithms: the chi-square comparison algorithm, the Bhattacharyya distance algorithm, or the correlation algorithm.
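Two of the listed measures can be sketched as follows. The correlation form mirrors the Pearson-style formula used by common computer-vision libraries; the exact formulas are assumptions, since the patent does not reproduce them:

```python
import numpy as np

def histogram_correlation(h1, h2):
    """Correlation similarity between two histograms: 1 for identical
    shapes, negative for strongly anti-correlated ones."""
    h1 = np.asarray(h1, dtype=np.float64)
    h2 = np.asarray(h2, dtype=np.float64)
    a = h1 - h1.mean()
    b = h2 - h2.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 1.0

def bhattacharyya_distance(h1, h2):
    """Bhattacharyya distance between two histograms after normalizing
    them to sum to 1: 0 for identical distributions, 1 for disjoint."""
    h1 = np.asarray(h1, dtype=np.float64)
    h2 = np.asarray(h2, dtype=np.float64)
    h1 = h1 / h1.sum()
    h2 = h2 / h2.sum()
    bc = np.sqrt(h1 * h2).sum()          # Bhattacharyya coefficient
    return float(np.sqrt(max(0.0, 1.0 - bc)))
```

Note the two measures point in opposite directions: correlation grows with similarity while the Bhattacharyya distance shrinks, so whichever is chosen must be mapped consistently onto d(H1, H2) before use in the fusion-area relation.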
The sequence of S303, S304, and S305 to S306 is not limited, and the three may be executed in parallel.
S307, calculating a distance set R from the geometric center point to each pixel point on the second edge curve according to the curvature and the similarity;
and S308, determining a second edge curve according to the distance set, and taking the area between the second edge curve and the first edge curve as a target area. S308 is similar to the above embodiment, and will not be described again.
S309, performing brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and covering the part of the first area outside the first sub-area with the part of the target sticker image outside the target area. S309 is similar to S102 in fig. 1 and S204 in fig. 2 and is not described again.
In this embodiment, the geometric center point O of the target sticker image, the curvature K of each pixel point on the first edge curve, the first luminance histogram H1 of the target sticker image and the second luminance histogram H2 of the first region are calculated as the parameters needed for the distance set R. From the distance set R the target area to be fused is determined, and brightness-weighted fusion of the target area with the first sub-area of the target image improves the naturalness of the transition between the sticker and the target image and avoids obvious seams as far as possible.
In an embodiment of the present invention, optionally, the step S307 may specifically include:
determining a target area according to the fusion area relation; wherein, the fusion area relation formula is:
where K is the curvature of the first edge curve; d(H1, H2) is the similarity described above; L is the set of distances from the geometric center point to each pixel point on the first edge curve — that is, rays emitted radially outward from the geometric center point O intersect the pixel points on the first edge curve, and the set of ray lengths is L; R is the set of distances from the geometric center point to each pixel point on the second edge curve of the target area; α is the weight coefficient of the similarity, and β is the weight coefficient of the curvature.
The physical meaning of the fusion-area relation is that the higher the similarity between the target sticker image and the target image, the smaller the target area that needs to be fused; likewise, the larger the curvature, the smaller the target area that needs to be fused. The relation involves little computation yet allows the distance set R to be calculated accurately, thereby determining the target area.
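The formula image itself is not reproduced in this text, so the following sketch ASSUMES a functional form that merely matches the stated behaviour — higher similarity d(H1, H2) and larger |K| both push R toward L, shrinking the fused band between the two edge curves — and should not be read as the patented relation:

```python
import numpy as np

def distance_set_R(L, K, d, alpha=0.7, beta=0.3):
    """Assumed stand-in for the fusion-area relation.

    L: distances from O to the first edge curve; K: per-pixel curvature
    (negative values are taken as |K|, per the sign-handling step);
    d: histogram similarity in [0, 1]; alpha, beta: weight coefficients
    of similarity and curvature. Returns R with R <= L elementwise, so
    the band L - R narrows as d or |K| grows.
    """
    L = np.asarray(L, dtype=np.float64)
    K = np.abs(np.asarray(K, dtype=np.float64))
    k_norm = K / (1.0 + K)                     # squash curvature into [0, 1)
    factor = np.clip(alpha * d + beta * k_norm, 0.0, 1.0)
    return L * factor
```

Any implementation of the real relation should satisfy the same sanity checks: R never exceeds L, and R is monotonically non-decreasing in both d and |K|.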
In another embodiment, optionally, after S304, before S307, the method may further include:
and in the case that the curvature of the target pixel point on the first edge curve is less than zero, adjusting the curvature of the target pixel point to its opposite value; that is, |K| is used in place of K in the formula.
When the curvature K is less than 0, the curvature direction at the current target pixel is opposite to the chosen positive direction. For curvature in the opposite direction to receive the same weight, its value must be negated, ensuring that curvature in either direction is weighted identically.
In another embodiment, optionally, before S305, the method may further include:
the target decal image and the first region are normalized to the same scale space.
In this embodiment, once the target sticker image and the first area share the same scale, the subsequently obtained first and second histograms are computed on that common scale. This improves the accuracy of the similarity computed between them and avoids errors caused by the two histograms having different scales.
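A sketch of the normalization step. Resampling both regions to their common minimum shape by nearest-neighbour indexing is an assumption, since the patent only requires a shared scale space, not a specific resampling method:

```python
import numpy as np

def normalize_to_same_scale(a, b):
    """Resample two 2-D luminance arrays to a common shape (here the
    element-wise minimum of the two shapes) before histogram comparison.
    """
    th = min(a.shape[0], b.shape[0])
    tw = min(a.shape[1], b.shape[1])

    def resize(img):
        # nearest-neighbour index map from the target grid to the source
        ys = (np.arange(th) * img.shape[0] / th).astype(int)
        xs = (np.arange(tw) * img.shape[1] / tw).astype(int)
        return img[np.ix_(ys, xs)]

    return resize(a), resize(b)
```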
Based on the above method embodiment, correspondingly, the embodiment of the invention also provides an electronic device, as shown in fig. 5, and fig. 5 shows a schematic structural diagram of the electronic device provided by the embodiment of the invention.
The electronic device includes:
an area determining module 401, configured to determine a target area in the target sticker image according to the brightness of the first sticker image, the brightness of the first area of the target image into which the sticker is to be fused, and the curvature of the first edge curve of the first sticker image, where the first edge curve is the outermost curve of the first sticker image and the target sticker image is the first sticker image or an image obtained from the first sticker image;
and a brightness fusion module 402, configured to perform brightness-weighted fusion of the target area with a first sub-area of the first area corresponding to the target area, and to cover the part of the first area outside the first sub-area with the part of the target sticker image outside the target area.
In the embodiment of the invention, the brightness of the target sticker image and the brightness of the target image are blended in the transition area, so a smoother transition appears between the target area and the target image after the sticker is applied, the unnatural look of the pasted edge is reduced, and the image processing effect is improved.
In some embodiments of the present invention, the area determining module may include:
the first area is divided into second grids corresponding to the first grids one by one;
the grid adjusting unit is used for respectively deforming each first grid into the shape of a second grid corresponding to the first grid to obtain a target sticker image;
and the target determining unit is used for determining a target area in the target sticker image according to the brightness of the target sticker image, the brightness of the first area and the curvature of the first edge curve of the target sticker image.
According to this embodiment, the first sticker image is divided into grids and adjusted cell by cell, so that the shape of the adjusted target sticker image matches the shape of the first area to be fused in the target image as closely as possible, improving the naturalness after pasting. Adjusting the image shape per grid cell also improves the accuracy and convenience of the adjustment.
In other embodiments of the present invention, the electronic device may further include:
a facial-feature matting module for extracting a face region in the target image, detecting key points in the face region, determining a facial-feature region in the face region according to the key points, and matting out the facial-feature region;
correspondingly, the grid division unit is specifically configured to divide the first region of the target image, with the facial features matted out, into second grids corresponding one-to-one to the first grids.
According to this embodiment, the facial features are matted out so that their brightness is unaffected by the subsequent brightness adjustment, which avoids distortion of the facial features and improves the naturalness of the final image. The face region here need not be limited to a human face. Note that the facial-feature portion must be restored after the brightness adjustment is completed.
In still other embodiments of the present invention, the above-mentioned target determining unit specifically includes:
a center point determining unit for determining a geometric center point of the target sticker image;
a distance set determining unit, configured to calculate a distance set from a geometric center point to each pixel point on a second edge curve of the target area according to the brightness of the target sticker image, the brightness of the first area, and the curvature of the first edge curve of the target sticker image;
and the target area determining unit is used for determining a second edge curve according to the distance set, and taking an area between the second edge curve and the first edge curve as a target area.
According to the brightness similarity between the target sticker image and the first area to be fused and the curvature of the first edge curve, the position of the second edge curve can be determined, and therefore the target area is determined.
In a further embodiment, the distance set determining unit may specifically include:
a curvature calculation unit for calculating the curvature K of each pixel point on the first edge curve of the target sticker image;
a histogram calculation unit for calculating a first luminance histogram H1 of the target sticker image and a second luminance histogram H2 of the first region;
a similarity calculation unit for calculating the similarity d(H1, H2) between the first luminance histogram and the second luminance histogram;
and a set calculation unit for calculating the distance set R from the geometric center point to each pixel point on the second edge curve according to the curvature and the similarity.
In this embodiment, the geometric center point O of the target sticker image, the curvature K of each pixel point on the first edge curve, the first luminance histogram H1 of the target sticker image and the second luminance histogram H2 of the first region are calculated as the parameters needed for the distance set R. From the distance set R the target area to be fused is determined, and brightness-weighted fusion of the target area with the first sub-area of the target image improves the naturalness of the transition between the sticker and the target image and avoids obvious seams as far as possible.
In an embodiment of the present invention, optionally, the foregoing set calculating unit may be configured to:
determining a target area according to the fusion area relation; wherein, the fusion area relation formula is:
where K is the curvature of the first edge curve; d(H1, H2) is the similarity described above; L is the set of distances from the geometric center point to each pixel point on the first edge curve — that is, rays emitted radially outward from the geometric center point O intersect the pixel points on the first edge curve, and the set of ray lengths is L; R is the set of distances from the geometric center point to each pixel point on the second edge curve of the target area; α is the weight coefficient of the similarity, and β is the weight coefficient of the curvature.
The physical meaning of the fusion-area relation is that the higher the similarity between the target sticker image and the target image, the smaller the target area that needs to be fused; likewise, the larger the curvature, the smaller the target area that needs to be fused. The relation involves little computation yet allows the distance set R to be calculated accurately, thereby determining the target area.
In another embodiment, optionally, the above-mentioned set calculation unit may be further configured to:
and in the case that the curvature of the target pixel point on the first edge curve is less than zero, adjusting the curvature of the target pixel point to its opposite value.
When the curvature K is less than 0, the curvature direction at the current target pixel is opposite to the chosen positive direction. For curvature in the opposite direction to receive the same weight, its value must be negated, ensuring that curvature in either direction is weighted identically.
In another embodiment, optionally, the distance set determining unit may further include:
a normalization unit for normalizing the target sticker image and the first region to the same scale space, and then triggering the histogram calculation unit.
In this embodiment, once the target sticker image and the first area share the same scale, the subsequently obtained first and second histograms are computed on that common scale. This improves the accuracy of the similarity computed between them and avoids errors caused by the two histograms having different scales.
The electronic device provided in the embodiment of the present invention can implement each method step implemented in the method embodiments of fig. 1, fig. 2, and fig. 4, and in order to avoid repetition, a description is omitted here.
Fig. 6 shows a schematic hardware structure of an electronic device according to an embodiment of the present invention.
The electronic device 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. It will be appreciated by those skilled in the art that the electronic device structure shown in fig. 6 does not constitute a limitation on the electronic device, and that the electronic device may include more or fewer components than shown, combine certain components, or arrange the components differently. In the embodiment of the invention, the electronic device includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The processor 510 is configured to determine a target area in the target sticker image according to the brightness of the first sticker image, the brightness of a first area in the target image into which the first sticker image is fused, and the curvature of a first edge curve of the first sticker image, wherein the first edge curve is the outermost curve of the first sticker image, and the target sticker image is the first sticker image or an image obtained from the first sticker image; and to perform brightness-weighted fusion on the target area and a first sub-area of the first area corresponding to the target area, and to cover, with the area of the target sticker image other than the target area, the area of the first area other than the first sub-area.
In the embodiment of the invention, the brightness of the target sticker image and the brightness of the target image are blended in the transition area, so that after the sticker is applied a better transition is presented between the target area and the target image, the unnatural appearance at the edge of the pasted region is reduced, and the image processing effect is improved.
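As a hedged sketch, the brightness-weighted fusion in the transition band could look like the following, operating on luminance (Y) channels. The function name, the band mask, and the constant weight `alpha` are illustrative assumptions; the patent's weighting may vary spatially across the band rather than being constant.

```python
import numpy as np

def blend_luminance(sticker_y, target_y, band_mask, alpha=0.5):
    # Inside the transition band, mix sticker and target brightness;
    # outside the band the sticker's own brightness is kept (the
    # sticker simply covers the target there).
    out = sticker_y.astype(np.float32)
    out[band_mask] = (alpha * sticker_y[band_mask].astype(np.float32)
                      + (1.0 - alpha) * target_y[band_mask].astype(np.float32))
    return out.astype(np.uint8)
```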
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used to receive and send signals during information transmission or a call; specifically, it receives downlink data from a base station and passes it to the processor 510 for processing, and it sends uplink data to the base station. Typically, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 may also communicate with networks and other devices through a wireless communication system.
The electronic device provides wireless broadband internet access to the user through the network module 502, such as helping the user to send and receive e-mail, browse web pages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502 or stored in the memory 509 into an audio signal and output as sound. Also, the audio output unit 503 may also provide audio output (e.g., a call signal reception sound, a message reception sound, etc.) related to a specific function performed by the electronic device 500. The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used for receiving an audio or video signal. The input unit 504 may include a graphics processor (Graphics Processing Unit, GPU) 5041 and a microphone 5042. The graphics processor 5041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sound and process it into audio data. In a phone call mode, the processed audio data may be converted into a format that can be transmitted to a mobile communication base station via the radio frequency unit 501, and then output.
The electronic device 500 also includes at least one sensor 505, such as a light sensor, a motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 5061 and/or the backlight when the electronic device 500 is moved to the ear. As one kind of motion sensor, an accelerometer sensor can detect the magnitude of acceleration in various directions (generally three axes), and can detect the magnitude and direction of gravity when stationary; it can be used for recognizing the attitude of the electronic device (such as horizontal/vertical screen switching, related games, and magnetometer attitude calibration), vibration-recognition-related functions (such as a pedometer and tap detection), and the like. The sensor 505 may further include a fingerprint sensor, a pressure sensor, an iris sensor, a molecular sensor, a gyroscope, a barometer, a hygrometer, a thermometer, an infrared sensor, etc., which are not described here.
The display unit 506 is used to display information input by a user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a liquid crystal display (Liquid Crystal Display, LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 507 is operable to receive input numeric or character information and to generate key signal inputs related to user settings and function control of the electronic device. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations on or near it by a user (e.g., operations performed on or near the touch panel 5071 using any suitable object or accessory such as a finger or stylus). The touch panel 5071 may include two parts, a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal brought by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. Besides the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail here.
Further, the touch panel 5071 may be overlaid on the display panel 5061, and when the touch panel 5071 detects a touch operation thereon or thereabout, the touch operation is transmitted to the processor 510 to determine a type of touch event, and then the processor 510 provides a corresponding visual output on the display panel 5061 according to the type of touch event. Although in fig. 6, the touch panel 5071 and the display panel 5061 are two independent components for implementing the input and output functions of the electronic device, in some embodiments, the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the electronic device, which is not limited herein.
The interface unit 508 is an interface for connecting an external device to the electronic apparatus 500. For example, the external devices may include a wired or wireless headset port, an external power (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from an external device and transmit the received input to one or more elements within the electronic apparatus 500 or may be used to transmit data between the electronic apparatus 500 and an external device.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, phonebook, etc.) created according to the use of the handset, etc. In addition, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid-state storage device.
The processor 510 is a control center of the electronic device, connects various parts of the entire electronic device using various interfaces and lines, and performs various functions of the electronic device and processes data by running or executing software programs and/or modules stored in the memory 509, and calling data stored in the memory 509, thereby performing overall monitoring of the electronic device. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor that primarily handles operating systems, user interfaces, applications, etc., with a modem processor that primarily handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 510.
The electronic device 500 may also include a power supply 511 (e.g., a battery) for powering the various components, and preferably the power supply 511 may be logically connected to the processor 510 via a power management system that performs functions such as managing charging, discharging, and power consumption.
In addition, the electronic device 500 includes some functional modules, which are not shown, and will not be described herein.
Preferably, the embodiment of the present invention further provides an electronic device, including a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510, where the computer program when executed by the processor 510 implements each process of the above embodiment of the image processing method, and the same technical effects can be achieved, and for avoiding repetition, a description is omitted herein.
The embodiment of the invention also provides a computer readable storage medium on which a computer program is stored; when executed by a processor, the computer program implements the processes of the above-mentioned image processing method embodiment and can achieve the same technical effects. To avoid repetition, details are not repeated here. The computer readable storage medium is, for example, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by means of software plus a necessary general hardware platform, or of course by hardware, although in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present invention, in essence or in the part contributing to the prior art, may be embodied in the form of a software product stored in a storage medium (e.g. ROM/RAM, magnetic disk, optical disk) comprising several instructions for causing a terminal (which may be a mobile phone, a computer, a server, an air conditioner, a network device, or the like) to perform the methods of the embodiments of the present invention.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present invention, those of ordinary skill in the art may make many further forms without departing from the spirit of the present invention and the scope of the claims, all of which fall within the protection of the present invention.

Claims (8)

1. An image processing method applied to an electronic device, the method comprising:
determining a target area in the target sticker image according to the brightness of the first sticker image, the brightness of a first area in the target image into which the first sticker image is fused, and the curvature of a first edge curve of the first sticker image; the first edge curve is the outermost curve of the first sticker image, and the target sticker image is the first sticker image or an image obtained according to the first sticker image; the target area is an area needing brightness fusion;
performing brightness-weighted fusion on the target area and a first sub-area of the first area corresponding to the target area, and covering, with the area of the target sticker image other than the target area, the area of the first area other than the first sub-area;
the determining the target area in the target sticker image according to the brightness of the first sticker image, the brightness of the first area in the target image into which the first sticker image is fused, and the curvature of the first edge curve of the first sticker image comprises:
determining a geometric center point of the target sticker image;
calculating a distance set from the geometric center point to each pixel point on a second edge curve of the target area according to the brightness of the target sticker image, the brightness of the first area and the curvature of the first edge curve of the target sticker image;
and determining the second edge curve according to the distance set, and taking the area between the second edge curve and the first edge curve as the target area.
2. The method of claim 1, wherein determining the target area in the target sticker image according to the brightness of the first sticker image, the brightness of the first area in the target image into which the first sticker image is fused, and the curvature of the first edge curve of the first sticker image further comprises:
dividing the first sticker image into a preset number of first grids, and dividing the first area into second grids corresponding to the first grids one by one;
and respectively deforming each first grid into the shape of the corresponding second grid to obtain the target sticker image.
3. The method of claim 2, wherein before dividing the first sticker image into a preset number of first grids and dividing the first area into second grids corresponding to the first grids one by one, further comprises:
extracting a face area in the target image;
detecting key points in the face area;
determining a facial-feature area in the face area according to the key points;
matting out the facial-feature area;
the dividing the first area into second grids corresponding to the first grids one by one comprises:
dividing the first area in the target image from which the facial features have been matted out into second grids corresponding to the first grids one by one.
4. The method of claim 1, wherein calculating the distance set from the geometric center point to each pixel point on the second edge curve of the target area according to the brightness of the target sticker image, the brightness of the first area, and the curvature of the first edge curve of the target sticker image comprises:
calculating the curvature of each pixel point on a first edge curve of the target sticker image;
calculating a first luminance histogram of the target sticker image and a second luminance histogram of the first area;
calculating a similarity between the first luminance histogram and the second luminance histogram;
and calculating a distance set from the geometric center point to each pixel point on the second edge curve according to the curvature and the similarity.
5. An electronic device, comprising:
the area determining module is used for determining a target area in the target sticker image according to the brightness of the first sticker image, the brightness of a first area in the target image into which the first sticker image is fused, and the curvature of a first edge curve of the first sticker image; the first edge curve is the outermost curve of the first sticker image, and the target sticker image is the first sticker image or an image obtained according to the first sticker image; the target area is an area needing brightness fusion;
the brightness fusion module is used for performing brightness-weighted fusion on the target area and a first sub-area of the first area corresponding to the target area, and covering, with the area of the target sticker image other than the target area, the area of the first area other than the first sub-area;
the area determination module includes:
a center point determining unit configured to determine a geometric center point of the target sticker image;
a distance set determining unit, configured to calculate a distance set from the geometric center point to each pixel point on a second edge curve of the target area according to the brightness of the target sticker image, the brightness of the first area, and the curvature of the first edge curve of the target sticker image;
and the target area determining unit is used for determining the second edge curve according to the distance set, and taking the area between the second edge curve and the first edge curve as the target area.
6. The electronic device of claim 5, wherein the region determination module further comprises:
the grid dividing unit is used for dividing the first sticker image into a preset number of first grids and dividing the first area into second grids corresponding to the first grids one by one;
and the grid adjustment unit is used for respectively deforming each first grid into the shape of the second grid corresponding to the first grid to obtain the target sticker image.
7. The electronic device of claim 6, wherein the electronic device further comprises:
the facial-feature matting module is used for extracting a face area in the target image; detecting key points in the face area; determining a facial-feature area in the face area according to the key points; and matting out the facial-feature area;
correspondingly, the grid dividing unit is specifically configured to: divide the first area in the target image from which the facial features have been matted out into second grids corresponding to the first grids one by one.
8. The electronic device of claim 5, wherein the distance set determination unit comprises:
a curvature calculating unit for calculating the curvature of each pixel point on the first edge curve of the target sticker image;
a histogram calculation unit configured to calculate a first luminance histogram of the target sticker image and a second luminance histogram of the first area;
a similarity calculation unit configured to calculate a similarity between the first luminance histogram and the second luminance histogram;
and the set calculation unit is used for calculating a distance set from the geometric center point to each pixel point on the second edge curve according to the curvature and the similarity.
CN202010070763.9A 2020-01-21 2020-01-21 Image processing method, electronic equipment and medium Active CN111260600B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010070763.9A CN111260600B (en) 2020-01-21 2020-01-21 Image processing method, electronic equipment and medium


Publications (2)

Publication Number Publication Date
CN111260600A CN111260600A (en) 2020-06-09
CN111260600B true CN111260600B (en) 2023-08-22

Family

ID=70954423

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010070763.9A Active CN111260600B (en) 2020-01-21 2020-01-21 Image processing method, electronic equipment and medium

Country Status (1)

Country Link
CN (1) CN111260600B (en)

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2014016886A (en) * 2012-07-10 2014-01-30 Furyu Kk Image processor and image processing method
CN108495049A (en) * 2018-06-15 2018-09-04 Oppo广东移动通信有限公司 Filming control method and Related product
CN108833779A (en) * 2018-06-15 2018-11-16 Oppo广东移动通信有限公司 Filming control method and Related product
CN108876886A (en) * 2017-05-09 2018-11-23 腾讯科技(深圳)有限公司 Image processing method, device and computer equipment
CN108898551A (en) * 2018-06-14 2018-11-27 北京微播视界科技有限公司 The method and apparatus that image merges
CN109147007A (en) * 2018-08-01 2019-01-04 Oppo(重庆)智能科技有限公司 Paster loading method, device, terminal and computer readable storage medium
CN109784301A (en) * 2019-01-28 2019-05-21 广州酷狗计算机科技有限公司 Image processing method, device, computer equipment and storage medium
CN110136071A (en) * 2018-02-02 2019-08-16 杭州海康威视数字技术股份有限公司 A kind of image processing method, device, electronic equipment and storage medium

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4636146B2 (en) * 2008-09-05 2011-02-23 ソニー株式会社 Image processing method, image processing apparatus, program, and image processing system
CN108898546B (en) * 2018-06-15 2022-08-16 北京小米移动软件有限公司 Face image processing method, device and equipment and readable storage medium




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant