CN114004732A - Image editing prompting method and device, electronic equipment and readable storage medium

Image editing prompting method and device, electronic equipment and readable storage medium

Info

Publication number
CN114004732A
CN114004732A (application CN202110693749.9A)
Authority
CN
China
Prior art keywords
area
image
deformation
user
contour data
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110693749.9A
Other languages
Chinese (zh)
Inventor
刘秋冶
林焕鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Honor Device Co Ltd
Original Assignee
Honor Device Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Honor Device Co Ltd filed Critical Honor Device Co Ltd
Priority to CN202110693749.9A
Publication of CN114004732A
Legal status: Pending

Classifications

    • G06T3/04
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00Indexing scheme for image data processing or generation, in general
    • G06T2200/24Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Abstract

The application provides an image editing prompting method and apparatus, an electronic device, and a readable storage medium, and relates to image processing technologies. The method comprises the following steps: the electronic device may change the shape of an image region including a first region of a first object of a plurality of objects in response to a first deformation operation by a user on the first region. In a case where second contour data of a second object, among the plurality of objects other than the first object, is changed with respect to the corresponding first contour data, the electronic device determines that a second region of the second object is deformed before and after the first deformation operation. The electronic device displays first prompt information near the second region of the second object to prompt the user that the second region of the second object has been deformed by mis-operation. Further, the electronic device repairs the shape of the second region of the second object in response to a repair operation by the user on the second region, thereby eliminating the adverse effect of the deformation of the second region on the quality of the image to be edited.

Description

Image editing prompting method and device, electronic equipment and readable storage medium
Technical Field
The present application relates to image processing technologies, and in particular, to an image editing prompting method and apparatus, an electronic device, and a readable storage medium.
Background
The user can input deformation operation on the target object in the image to be edited on the image editing interface of the electronic equipment, and the electronic equipment can respond to the deformation operation to change the shape of the target object, so that the visual effect of the target object is better. For example, the electronic device may reduce the face area of the human body in response to a user inputting a face-thinning operation to the face area of the human body in the image to be edited, so that the face area of the human body is more beautiful; for another example, the electronic device may reduce the leg region of the human body in response to the user inputting a leg-thinning operation to the leg region of the human body in the image to be edited, so as to make the leg region of the human body more beautiful.
Currently, when the shape of a target object in an image to be edited is changed on an electronic device, the shape of other objects adjacent to the target object may also be changed. However, changing the shape of these adjacent objects is not the user's intention; that is, the objects whose shapes are changed are mis-operated objects. As a result, the quality of the image may be degraded.
Disclosure of Invention
The embodiments of the present application provide an image editing prompting method and apparatus, an electronic device, and a readable storage medium, so that the quality of an image to be edited is not affected when the electronic device changes a first region of a first object in the image to be edited in response to a first deformation operation by a user on the first region.
In a first aspect, an embodiment of the present application provides an image editing prompting method, including: the electronic equipment displays an image editing interface, wherein the image editing interface displays an image to be edited; the electronic equipment acquires first contour data of each object in a plurality of objects included in the image to be edited; the electronic equipment responds to a first deformation operation of a user on a first region of a first object in the plurality of objects, and changes the shape of an image region including the first region; the electronic equipment acquires second contour data of each object in the image to be edited after the first deformation operation; the electronic equipment determines, according to the second contour data and the first contour data, that a second area of a second object is deformed before and after the first deformation operation, wherein the second object is located near the first object, and the second area is adjacent to the first region; the electronic equipment displays first prompt information near the second area of the second object, wherein the first prompt information is used for indicating that the second area is deformed; alternatively, the first prompt information is used for indicating that the second area of the second object has been restored to its shape before the first deformation operation.
In this way, the electronic device displays the first prompt message near the second area of the second object, and can prompt the user that the second area of the second object is deformed due to misoperation. When the user feels that the second object is deformed and the quality of the image to be edited is greatly affected, the user may input a repairing operation to the second area of the second object. Further, the electronic device may display the second object by replacing the second contour data of the second object with the first contour data of the second object in response to a repair operation of the second region of the second object by the user. Therefore, the shape of the second area of the second object is repaired, and the adverse effect of deformation of the second area of the second object on the quality of the image to be edited is eliminated. Or, the electronic device may further replace the second contour data of the second object with the first contour data of the second object to display the second object, so that automatic repair of the second region of the deformed second object is achieved, and further, the quality of the image to be edited is not affected. The user can also perceive the automatic repair of the second area of the deformed second object through the first prompt message.
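For illustration only, the detection step described above can be sketched in Python as follows. The type and function names (Contour, find_misoperated_objects, is_deformed) are assumptions made for this sketch and are not part of the application; the sketch only shows the idea of comparing each object's contour data captured before and after the first deformation operation and flagging every object, other than the first object, whose contour changed.

```python
from typing import Callable, Dict, List, Tuple

# A contour is assumed to be a closed list of (x, y) points.
Contour = List[Tuple[float, float]]

def find_misoperated_objects(
    first_contours: Dict[int, Contour],    # contour data before the first deformation
    second_contours: Dict[int, Contour],   # contour data after the first deformation
    first_object_id: int,                  # the object the user intended to edit
    is_deformed: Callable[[Contour, Contour], bool],
) -> List[int]:
    """Return the ids of 'second objects' whose areas were deformed by mistake:
    every object other than the first object whose second contour data differs
    (as decided by is_deformed) from its first contour data."""
    deformed_ids = []
    for obj_id, before in first_contours.items():
        if obj_id == first_object_id:
            continue                       # the deliberate edit, not a mis-operation
        after = second_contours.get(obj_id, before)
        if is_deformed(before, after):
            deformed_ids.append(obj_id)    # first prompt information is displayed near these
    return deformed_ids
```

In a real implementation, the is_deformed decision would be the deformation-degree comparison against a deformation threshold discussed later in this application.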
In one possible implementation, the first object is a person, the first region is a face region, and the second object is a non-person object; alternatively, both the first object and the second object are persons.
In one possible implementation manner, after the electronic device displays the first prompt information near the second area of the second object, the image editing prompting method further includes: the electronic device displays the second object with the first contour data of the second object in response to a repair operation by the user on the second area of the second object. In this way, the electronic device may display the second object by replacing the second contour data of the second object with the first contour data of the second object in response to the repair operation of the user on the second area of the second object.
Therefore, the shape of the second area of the second object is repaired, and the adverse effect of deformation of the second area of the second object on the quality of the image to be edited is eliminated.
Further, the electronic device displays the second object with the first contour data of the second object in response to a user's repair operation on the second area of the second object, including: the electronic equipment responds to a trigger operation of a user on a second area of the second object, and displays a first repair button in the vicinity of the second area of the second object; the electronic device responds to the triggering operation of the first repairing button by the user, and displays the second object with the first outline data of the second object.
Therefore, by triggering the first repair button displayed near the second area, the user can have the second object redisplayed, that is, the shape of the second area repaired, which is convenient and quick. The first repair button is the repair button in the following embodiments.
Or, further, the number of the second objects is multiple, and the electronic device displays the second objects with the first contour data of the second objects in response to the user's repair operation on the second areas of the second objects, including: the electronic equipment displays a second repair button on the image editing interface; the electronic device displays each of the plurality of second objects with the first contour data of the plurality of second objects in response to a user's trigger operation of the second repair button.
Therefore, the user can restore the shapes of the second areas of the plurality of second objects with one key simply by triggering the second repair button, which is convenient and quick. The second repair button is the one-key repair button in the following embodiments.
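As a rough sketch (again with assumed names, not code disclosed in the application), the one-key repair amounts to restoring, for every mis-operated object, its first contour data in place of its second contour data:

```python
from typing import Dict, List, Tuple

Contour = List[Tuple[float, float]]

def one_key_repair(first_contours: Dict[int, Contour],
                   second_contours: Dict[int, Contour],
                   deformed_ids: List[int]) -> Dict[int, Contour]:
    """Replace the second contour data of every deformed second object with its
    first contour data, so that all mis-operated second areas are restored at once."""
    for obj_id in deformed_ids:
        second_contours[obj_id] = first_contours[obj_id]
    return second_contours   # the objects are then redisplayed with the restored contours
```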
In another possible implementation manner, the first prompt information is used to indicate that the second area is deformed, and after the electronic device displays the first prompt information near the second area of the second object, the image editing prompt method provided by the present application further includes: and the electronic equipment cancels the display of the first prompt message in response to the unrepaired operation of the user on the second area of the second object.
Therefore, after the first prompt message is cancelled to be displayed, the interference of the first prompt message to the user for browsing the image to be edited can be eliminated. The unrepaired operation is a trigger operation for the unrepaired button in the following embodiments.
In one possible implementation manner, the number of the second areas is multiple, and the electronic device displays the first prompt message near the second area of the second object, including: the electronic equipment calculates the importance degree of each second area in the plurality of second areas; the importance degrees of the plurality of second areas are ranked by the electronic equipment; and the electronic equipment displays first prompt information near the second area of each second object according to the sorting result, wherein the first prompt information displayed near different second areas is marked with different marks.
In this way, the user can determine different processing modes for different second areas according to different marks marked by the first prompt information near the different second areas.
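A minimal sketch of this ranking step follows; the names are assumptions, and the way the importance degree itself is computed is left to an injected callable, since the application describes it elsewhere (for example via the second mapping relation table).

```python
from typing import Callable, Dict, List

def mark_second_areas(area_ids: List[int],
                      importance_of: Callable[[int], float]) -> Dict[int, str]:
    """Rank the deformed second areas by importance and assign each a mark
    (here simply '1', '2', ...) used to label its first prompt information."""
    ordered = sorted(area_ids, key=importance_of, reverse=True)
    return {area_id: str(rank + 1) for rank, area_id in enumerate(ordered)}
```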
In a possible implementation manner, after the electronic device displays the first prompt information near the second area of the second object, the image editing prompting method provided by the present application further includes: the electronic device changes the shape of an image area including a third area of the first object in the plurality of objects in response to a second deformation operation of the user on the third area; the electronic equipment acquires third contour data of each object in the image to be edited after the second deformation operation; the electronic equipment determines that a fourth area of a third object is deformed before and after the first deformation operation according to the third contour data and the first contour data, wherein the third object is located near the first object, and the fourth area is adjacent to the third area; the electronic equipment displays second prompt information near the fourth area of the third object, wherein the second prompt information is used for indicating that the fourth area is deformed; alternatively, the second prompt information is used for indicating that the fourth area of the third object has been restored to its shape before the deformation operation.
Therefore, the electronic device can repair the shape of the fourth area of the third object, and the adverse effect of the deformation of the fourth area of the third object on the quality of the image to be edited is eliminated. Or, the electronic device may replace the third contour data of the third object with the first contour data of the third object to display the third object, so that the fourth area of the deformed third object is automatically repaired and the quality of the image to be edited is not affected. The user can also perceive the automatic repair of the fourth area of the deformed third object through the second prompt information.
In one possible implementation manner, determining that the second region of the second object is deformed before and after the first deformation operation according to the second contour data and the first contour data includes: the electronic device calculates a deformation degree value of the second contour data of the second object relative to the first contour data of the second object; and in a case that the deformation degree value is greater than a deformation threshold, the electronic device determines that the second area of the second object is deformed before and after the first deformation operation.
If the deformation degree value is greater than the deformation threshold, the user can visually perceive that the second area of the second object is deformed. Therefore, when the electronic device determines on this basis that the second area of the second object is deformed before and after the first deformation operation, the determination better matches the user's visual perception.
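For illustration, the deformation degree value and the threshold comparison could be computed from the contour data as in the following sketch. The function names and the polygon representation are assumptions; the area difference and perimeter difference correspond to the two alternatives described later in this section.

```python
import math
from typing import List, Tuple

Contour = List[Tuple[float, float]]

def contour_area(contour: Contour) -> float:
    """Shoelace area of a closed contour given as [(x, y), ...]."""
    area = 0.0
    for (x1, y1), (x2, y2) in zip(contour, contour[1:] + contour[:1]):
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

def contour_perimeter(contour: Contour) -> float:
    """Total length of the closed contour."""
    return sum(math.dist(p, q) for p, q in zip(contour, contour[1:] + contour[:1]))

def deformation_degree(first: Contour, second: Contour, metric: str = "area") -> float:
    """Deformation degree value of the second contour data relative to the first contour data."""
    if metric == "area":
        return abs(contour_area(second) - contour_area(first))
    return abs(contour_perimeter(second) - contour_perimeter(first))

def is_deformed(first: Contour, second: Contour, threshold: float) -> bool:
    """The second area is treated as deformed when the degree value exceeds the threshold."""
    return deformation_degree(first, second) > threshold
```

With the threshold fixed (for example via functools.partial), this is_deformed check could serve as the callable used in the earlier detection sketch.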
In one possible implementation, the first deformation operation is input by the user on the first object to change the shape of the first object. Further, the first object is an object to which the user pays high attention, and the closer the second object is to the first object, the higher the possibility that the second object is also focused on by the user; conversely, the farther the second object is from the first object, the lower that possibility. Therefore, before determining that the second area of the second object is deformed before and after the first deformation operation, the image editing prompting method provided by the present application further includes: the electronic device calculates a first distance between the geometric center of the second object and the geometric center of the first object, and finds the deformation threshold from a pre-stored first mapping relation table according to the calculated first distance. In the pre-stored first mapping relation table, the first distance is in one-to-one correspondence with, and positively correlated with, the deformation threshold.
Therefore, when the electronic device uses the deformation threshold found according to the first distance to determine that the second area of the second object is deformed before and after the first deformation operation, the determination better matches the user's visual perception.
Or, in another possible implementation manner, in the case that the area of the second object is large, even if the deformation degree value of the second object is large, the user may visually consider that the second object is not deformed; in the case where the area of the second object is small, the user visually recognizes that the second object is deformed even if the value of the degree of deformation of the second object is small. Therefore, the electronic device may calculate the area of the second object, and find the deformation threshold from the pre-stored first mapping relation table according to the calculated area. In the pre-stored first mapping relation table, the size of the area and the deformation threshold value have a one-to-one correspondence and positive correlation.
Therefore, when the electronic device uses the deformation threshold found according to the area of the second object to determine that the second area of the second object is deformed before and after the first deformation operation, the determination better matches the user's visual perception.
Or, in another possible implementation, in general, the farther an object is from the edge of the image (that is, the closer it is to the center of the image), the higher the user's attention to it; conversely, the closer an object is to the edge of the image, the lower the user's attention to it. In this way, in the case where the second object is close to the edge of the image, the user visually considers that the second object is not deformed even if the deformation degree value is large; conversely, in the case where the second object is far from the edge of the image, the user visually considers that the second object is deformed even if the deformation degree value is small. Accordingly, the electronic device may calculate a second distance between the geometric center of the second object and the edge of the image, and find the deformation threshold from the pre-stored first mapping relation table according to the calculated second distance. In the pre-stored first mapping relation table, the second distance from the image edge is in one-to-one correspondence with, and negatively correlated with, the deformation threshold.
Therefore, when the electronic device uses the deformation threshold found according to the second distance to determine that the second area of the second object is deformed before and after the first deformation operation, the determination better matches the user's visual perception.
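The three variants above all reduce to looking up a deformation threshold in a pre-stored mapping relation table keyed by a measured quantity. A minimal sketch follows; the table names, bucket boundaries, and threshold values are purely illustrative assumptions, not values disclosed in the application.

```python
import bisect
from typing import List, Tuple

# Illustrative first mapping relation tables (keys and thresholds are made up):
# first distance to the first object -> threshold (positive correlation),
# area of the second object          -> threshold (positive correlation),
# second distance to the image edge  -> threshold (negative correlation).
FIRST_DISTANCE_TABLE: List[Tuple[float, float]] = [(50, 5.0), (150, 10.0), (300, 20.0)]
OBJECT_AREA_TABLE:    List[Tuple[float, float]] = [(1_000, 5.0), (10_000, 15.0), (50_000, 30.0)]
EDGE_DISTANCE_TABLE:  List[Tuple[float, float]] = [(50, 25.0), (150, 15.0), (300, 5.0)]

def lookup_threshold(table: List[Tuple[float, float]], key: float) -> float:
    """Return the threshold of the first bucket whose upper bound is not below
    'key' (the last bucket is used for larger keys)."""
    upper_bounds = [bound for bound, _ in table]
    idx = min(bisect.bisect_left(upper_bounds, key), len(table) - 1)
    return table[idx][1]
```

For example, lookup_threshold(FIRST_DISTANCE_TABLE, 120) returns 10.0: a second object this far from the first object needs a larger contour change before it is flagged as deformed, which is the positive correlation described above.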
In one possible implementation, the deformation degree value is a difference between an area of the second contour data and an area of the first contour data; alternatively, the deformation degree value is a difference between the circumference of the second contour data and the circumference of the first contour data.
In a second aspect, the present application further provides an image editing prompting apparatus, including: a display unit, configured to display an image editing interface, wherein the image editing interface displays an image to be edited; a processing unit, configured to acquire first contour data of each of a plurality of objects included in the image to be edited; the processing unit, further configured to change a shape of an image area including a first area of a first object among the plurality of objects in response to a first deformation operation by a user on the first area; the processing unit is further used for acquiring second contour data of each object in the image to be edited after the first deformation operation; the processing unit is further used for determining that a second area of the second object is deformed before and after the first deformation operation according to the second contour data and the first contour data, wherein the second object is located near the first object, and the second area is adjacent to the first area; the display unit is further used for displaying first prompt information near the second area of the second object, and the first prompt information is used for indicating that the second area is deformed; alternatively, the first prompt information is used for indicating that the second area of the second object has been restored to its shape before the first deformation operation.
In one possible implementation, the first object is a person, the first region is a face region, and the second object is a non-person object; alternatively, both the first object and the second object are persons.
In one possible implementation, the processing unit is further configured to display the second object with the first contour data of the second object in response to a repair operation of the second area of the second object by the user.
In a possible implementation manner, the processing unit is further configured to respond to a trigger operation of the user on the second area of the second object, and the display unit is further configured to display a first repair button in the vicinity of the second area of the second object; the processing unit is further configured to respond to a trigger operation of the user on the first repair button, and the display unit is further configured to display the second object with the first contour data of the second object.
In a possible implementation manner, the number of the second objects is multiple, and the display unit is specifically configured to display a second repair button on the image editing interface; and the display unit is further configured to display each of the plurality of second objects with the first contour data of the plurality of second objects in response to a trigger operation of the user on the second repair button.
In a possible implementation manner, the processing unit is further configured to respond to an unrepaired operation of the user on the second area of the second object, and the display unit is further configured to cancel displaying the first prompt information.
In a possible implementation manner, the number of the second areas is multiple, and the processing unit is further configured to calculate the importance degree of each of the plurality of second areas and rank the importance degrees of the plurality of second areas; and the display unit is further configured to display, according to the ranking result, first prompt information near the second area of each second object, wherein the first prompt information displayed near different second areas carries different marks.
In one possible implementation, the processing unit is further configured to change a shape of an image region including the third region in response to a second deformation operation of the user on a third region of the first object among the plurality of objects; the processing unit is further used for acquiring third contour data of each object in the image to be edited after the second deformation operation; the processing unit is further used for determining that a fourth area of a third object is deformed before and after the first deformation operation according to the third contour data and the first contour data, wherein the third object is located near the first object, and the fourth area is adjacent to the third area; the display unit is further used for displaying second prompt information near the fourth area of the third object, and the second prompt information is used for indicating that the fourth area is deformed; alternatively, the second prompt information is used for indicating that the fourth area of the third object has been restored to its shape before the deformation operation.
In a possible implementation, the processing unit is specifically configured to calculate a deformation degree value of the second contour data of the second object relative to the first contour data of the second object; and in a case that the deformation degree value is greater than a deformation threshold, determine that the second area of the second object is deformed before and after the first deformation operation.
In a possible implementation manner, the processing unit is further configured to calculate a first distance between a geometric center of the second object and a geometric center of the first object, and find the deformation threshold from a pre-stored first mapping relation table according to the first distance, where in the pre-stored first mapping relation table, a one-to-one correspondence relationship exists between a size of the first distance and the deformation threshold, and the size of the first distance and the deformation threshold are positively correlated.
Or, the processing unit is further configured to calculate an area of the second object, and find the deformation threshold from a pre-stored first mapping relation table according to the area, where in the pre-stored first mapping relation table, the size of the area and the deformation threshold have a one-to-one correspondence and a positive correlation.
Or, the processing unit is further configured to calculate a second distance between the geometric center of the second object and the edge of the image, and find the deformation threshold from a pre-stored first mapping relation table according to the second distance, where in the pre-stored first mapping relation table, the size of the second distance is in one-to-one correspondence with and is inversely related to the deformation threshold.
In one possible implementation, the deformation degree value is a difference between an area of the second contour data and an area of the first contour data; alternatively, the deformation degree value is a difference between the circumference of the second contour data and the circumference of the first contour data.
In a third aspect, the present application further provides an electronic device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to execute the instructions to perform the image editing prompting method as described in the first aspect or any implementation manner of the first aspect.
In a fourth aspect, the present application also provides a computer-readable storage medium, where instructions of the computer-readable storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the image editing prompting method as described in the first aspect or any implementation manner of the first aspect.
In a fifth aspect, the present application further provides a computer program product comprising a computer program or instructions which, when executed by a processor, implements the image editing prompt method as described in the first aspect or any implementation manner of the first aspect.
It should be understood that the second aspect to the fifth aspect of the present application correspond to the technical solutions of the first aspect of the present application, and the beneficial effects achieved by the aspects and the corresponding possible implementations are similar and will not be described again.
Drawings
FIG. 1 is a schematic interface diagram showing that an arm area of male b is deformed while an arm area of woman a is being slimmed;
FIG. 2 is a diagram of a hardware system architecture of a terminal device;
FIG. 3 is a diagram of a software system architecture of a terminal device;
fig. 4 is a flowchart of an image editing prompting method according to an embodiment of the present application;
FIG. 5 is a schematic view of an interface for prompting a user whether to repair an arm area of male b according to an embodiment of the present application;
FIG. 6 is a schematic interface diagram of a mobile phone provided in the embodiment of the present application for repairing an arm area of a male b in response to a trigger operation of a repair button;
FIG. 7 is a schematic interface diagram of a mobile phone according to an embodiment of the present application, responding to a trigger operation of a no-repair button, and not repairing an arm area of male b;
FIG. 8 is a schematic diagram of an interface showing a hand-held area of a horizontal bar c deformed while the face area of a woman a is thinned;
FIG. 9 is an interface schematic diagram of a mobile phone provided by an embodiment of the present application repairing an arm area of a male b and a hand-held area of a horizontal bar c in response to a trigger operation of a one-touch repair button;
FIG. 10 is a schematic interface diagram of a cell phone according to an embodiment of the present application, in response to a trigger operation of a one-touch unrepaired button, not repairing an arm area of male b and a hand-held area of horizontal bar c;
FIGS. 11-12 are schematic interface diagrams of a mobile phone repairing the arm area of male b in response to a trigger operation of a repair button, and of the mobile phone not repairing the arm area of male b in response to a trigger operation of a non-repair button;
fig. 13 is a second flowchart of an image editing prompting method according to the embodiment of the present application;
fig. 14 is a schematic structural diagram of an image editing prompting apparatus according to an embodiment of the present application;
fig. 15 is a schematic hardware structure diagram of an electronic device according to an embodiment of the present disclosure;
fig. 16 is a schematic structural diagram of a chip according to an embodiment of the present application.
Detailed Description
In the embodiments of the present application, terms such as "first" and "second" are used to distinguish identical or similar items having substantially the same function and effect. For example, a first value and a second value are only used to distinguish different values, and the order of the values is not limited. Those skilled in the art will appreciate that the terms "first", "second", and the like do not limit quantity or execution order, and do not indicate a difference in importance.
It is noted that, in the present application, words such as "exemplary" or "for example" are used to mean exemplary, illustrative, or descriptive. Any embodiment or design described herein as "exemplary" or "e.g.," is not necessarily to be construed as preferred or advantageous over other embodiments or designs. Rather, use of the word "exemplary" or "such as" is intended to present concepts related in a concrete fashion.
In the present application, "at least one" means one or more, "a plurality" means two or more. "and/or" describes the association relationship of the associated objects, meaning that there may be three relationships, e.g., a and/or B, which may mean: a exists alone, A and B exist simultaneously, and B exists alone, wherein A and B can be singular or plural. The character "/" generally indicates that the former and latter associated objects are in an "or" relationship. "at least one of the following" or similar expressions refer to any combination of these items, including any combination of the singular or plural items. For example, at least one (one) of a, b, or c, may represent: a, b, c, a-b, a-c, b-c, or a-b-c, wherein a, b, c may be single or multiple.
At present, a user can input a deformation operation on a target object in an image to be edited in an image editing interface of electronic equipment, and the electronic equipment can change the shape of the target object in response to the deformation operation, so that the visual effect of the target object is better. For example, as shown in fig. 1, the electronic device may enter an album preview interface in which thumbnails of a plurality of photos that have been taken (not shown in fig. 1) are displayed in response to an operation of the user to open an album application. The electronic equipment can enter the first operation interface in response to the triggering operation of the user on the thumbnail of one of the taken photos.
As shown in (a) of fig. 1, the photograph in the first operation interface 201 includes woman a and man b. When the user observes that the arm of woman a looks thick in the photograph, the electronic device may enter the second operation interface 203 in response to the user's trigger operation on the return button 202 in the first operation interface 201. As shown in (b) of fig. 1, an icon 204 (shown as an image in the original publication and not reproduced here) is displayed on the second operation interface 203. The electronic device may enter a third operation interface (not shown in fig. 1), in which a photo add button is displayed, in response to the user's trigger operation on the icon 204. The electronic device may enter the image editing interface 205 in response to the user's trigger operation on the photo add button.
As shown in (c) of fig. 1, the photograph from the first operation interface 201 and a slimming button 206 are displayed on the image editing interface 205. In response to the user's trigger operation on the slimming button 206, the electronic device may display, when the photograph on the image editing interface 205 is tapped, an adjustment ring 206 indicating the slimming area. The electronic device may decrease the area of the adjustment ring 206 in response to a user operation on the adjustment ring 206, and may also increase the area of the adjustment ring 206 in response to a user operation on the adjustment ring 206. Further, as shown in (d) of fig. 1, the electronic device may determine the slimming area in response to the user's trigger operation on the adjusted adjustment ring 206. As such, as shown in (e) of fig. 1, the electronic device may slim the arm of woman a according to the determined slimming area in response to the user's trigger operation on the arm of woman a. Referring to (f) of fig. 1, the contour of the arm of woman a changes from the dotted line to the solid line in the rectangular frame P, and it can be seen that the arm of woman a has been slimmed.
However, while the arm of woman a is being slimmed, the contour of the arm of man b on the side adjacent to woman a's arm changes from the dotted line to the solid line in the rectangular frame q, and it can be seen that the arm of man b is also deformed. It is understood that changing the shape of the arm of man b is not the user's intention; that is, the shape of the arm of man b is deformed by a mis-operation. As such, the quality of the photograph may be affected.
In view of this, an embodiment of the present application provides an image editing prompting method, in which an electronic device obtains first contour data of each object in an image to be edited; after the electronic equipment responds to a first deformation operation of a first region of a first object in an image to be edited by a user, second contour data of each object after the first deformation operation is obtained; and the electronic equipment determines that the second area of the second object deforms before and after the first deformation operation according to the second contour data and the first contour data. Therefore, the electronic equipment displays first prompt information near the second area of the second object, and the first prompt information is used for indicating the second area to deform so as to prompt a user that the second area deforms. After the user perceives the first prompt message, the electronic device can be triggered to restore the shape of the second area before the first deformation operation, so that the quality of the image to be edited is not affected. Or, the first prompt information is used to indicate that the second area of the second object recovers the shape before the first deformation operation, so as to remind the user that the electronic device recovers the shape of the second area before the first deformation operation, and the quality of the image to be edited may also be unaffected.
It is understood that the electronic device may be a mobile phone (mobile phone), a smart tv, a wearable device, a tablet computer (Pad), a computer with wireless transceiving function, a Virtual Reality (VR) terminal device, an Augmented Reality (AR) terminal device, a wireless terminal in industrial control (industrial control), a wireless terminal in self-driving (self-driving), a wireless terminal in remote surgery (remote medical surgery), a wireless terminal in smart grid (smart grid), a wireless terminal in transportation safety (transportation safety), a wireless terminal in smart city (smart city), a wireless terminal in smart home (smart home), and so on. The embodiment of the present application does not limit the specific technology and the specific device form adopted by the terminal device.
In order to better understand the embodiments of the present application, the following describes the structure of the electronic device according to the embodiments of the present application. Fig. 2 is a schematic structural diagram of an electronic device to which the embodiment of the present disclosure is applied. As shown in fig. 2, the electronic device 100 may include: the mobile terminal includes a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charging management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor 180, a button 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identity Module (SIM) card interface 195, and the like. It is to be understood that the illustrated structure of the present embodiment does not constitute a specific limitation to the electronic apparatus 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, a Display Processing Unit (DPU), and/or a neural-Network Processing Unit (NPU), etc. The different processing units may be separate devices or may be integrated into one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The processor may be, among other things, a neural center and a command center of the electronic device 100. The processor can generate an operation control signal according to the instruction operation code and the timing signal to complete the control of instruction fetching and instruction execution. A memory may also be provided in processor 110 for storing instructions and data. In some embodiments, the memory in the processor 110 is a cache memory. The memory may hold instructions or data that have been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from memory. This avoids repeated accesses, reduces the latency of the processor 110, and thus increases the efficiency of the electronic device 100.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an integrated circuit (I2C) interface, an integrated circuit built-in audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a Subscriber Identity Module (SIM) interface, and/or a Universal Serial Bus (USB) interface, etc. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. And the earphone can also be used for connecting an earphone and playing audio through the earphone.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is an illustrative description, and does not limit the structure of the electronic apparatus 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device 100 through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like. The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, switch, power amplifier, low noise amplifier, etc. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The modem processor may include a modulator and a demodulator. The modulator is used for modulating a low-frequency baseband signal to be transmitted into a medium-high frequency signal. The demodulator is used for demodulating the received electromagnetic wave signal into a low-frequency baseband signal. The demodulator then passes the demodulated low frequency baseband signal to a baseband processor for processing. The low frequency baseband signal is processed by the baseband processor and then transferred to the application processor. The application processor outputs a sound signal through an audio device (not limited to the speaker 170A, the receiver 170B, etc.) or displays an image or video through the display screen 194. In some embodiments, the modem processor may be a stand-alone device. In other embodiments, the modem processor may be provided in the same device as the mobile communication module 150 or other functional modules, independent of the processor 110.
The wireless communication module 160 may provide a solution for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLAN), bluetooth, Global Navigation Satellite System (GNSS), Frequency Modulation (FM), NFC, Infrared (IR), and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on the signal, and convert the signal into electromagnetic waves through the antenna 2 to radiate the electromagnetic waves.
In some embodiments, antenna 1 of electronic device 100 is coupled to mobile communication module 150 and antenna 2 is coupled to wireless communication module 160 so that electronic device 100 can communicate with networks and other devices through wireless communication techniques. The wireless communication technologies may include GSM, GPRS, CDMA, WCDMA, TD-SCDMA, LTE, GNSS, WLAN, NFC, FM, and/or IR technologies, among others. The GNSS may include a Global Positioning System (GPS), a global navigation satellite system (GLONASS), a BeiDou navigation satellite system (BDS), a quasi-zenith satellite system (QZSS), and/or a Satellite Based Augmentation System (SBAS).
The electronic device 100 may implement display functions via the GPU, the display screen 194, and the application processor. The application processor may comprise an NPU, a DPU. The GPU is a microprocessor for image processing, and is connected to the display screen 194 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute instructions to generate or change display information. The NPU is a neural-network (NN) computing processor that processes input information quickly by using a biological neural network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. Applications such as intelligent recognition of the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like. The DPU is also called a Display Sub-System (DSS), and is used for adjusting the color of the Display screen 194, and the DPU may adjust the color of the Display screen through a three-dimensional look-up table (3D LUT). The DPU may also perform scaling, noise reduction, contrast enhancement, backlight brightness management, hdr processing, display parameter Gamma adjustment, and the like on the picture.
The display screen 194 is used to display images, video, and the like. The display screen 194 includes a display panel. The display panel may adopt a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a Mini LED, a Micro LED, a Micro-OLED, a quantum dot light-emitting diode (QLED), and the like. In some embodiments, the electronic device 100 may include 1 or N display screens 194, with N being a positive integer greater than 1.
The electronic device 100 may implement a capture function via the ISP, one or more cameras 193, video codec, GPU, one or more display screens 194, and application processor, among others.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, data files such as music, photos, videos, and the like are saved in the external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may cause the electronic device 100 to execute various functional applications, data processing, and the like by executing the above-described instructions stored in the internal memory 121. The internal memory 121 may include a program storage area and a data storage area. Wherein, the storage program area can store an operating system; the storage area may also store one or more application programs (e.g., gallery, contacts, etc.), etc. The storage data area may store data (e.g., photos, contacts, etc.) created during use of the electronic device 100, and the like. In addition, the internal memory 121 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk storage device, a flash memory device, a universal flash memory (UFS), and the like. In some embodiments, the processor 110 may cause the electronic device 100 to execute various functional applications and data processing by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110.
The internal memory 121 is configured to store first contour data of each of the plurality of objects before the deformation occurs and second contour data of each of the plurality of objects after the deformation occurs in the image to be edited in the embodiment of the present application. In addition, the internal memory 121 may also be used to store the first mapping relationship table and the second mapping relationship table in the embodiment of the present application. The first mapping relation table may include a one-to-one correspondence relationship between a first distance and a deformation threshold, where the first distance is a distance between a geometric center of an object that does not receive a deformation operation and a geometric center of an object that has undergone a deformation operation; the area of the object which does not receive the deformation operation is in one-to-one correspondence with the deformation threshold; and the one-to-one correspondence relationship between the second distance and the deformation threshold value, wherein the second distance is the distance between the object which does not receive the deformation operation and the edge of the image. The second mapping relation table may include a one-to-one correspondence relationship between a first distance and an importance value, where the first distance is a distance between a geometric center of an object that does not receive a deformation operation and a geometric center of an object that has undergone a deformation operation; and the second distance is in one-to-one correspondence with the importance degree value, wherein the second distance is the distance between the object which does not receive the deformation operation and the edge of the image.
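For illustration, the two tables could be held in the internal memory 121 as simple key-value structures; the key names and all numeric values below are assumptions for this sketch, not values disclosed in the application.

```python
# First mapping relation table: each entry maps a measured quantity to a deformation threshold.
first_mapping_table = {
    "first_distance": {50: 5.0, 150: 10.0, 300: 20.0},   # distance to the object that received the deformation operation
    "object_area":    {1_000: 5.0, 10_000: 15.0},         # area of the object that did not receive the operation
    "edge_distance":  {50: 25.0, 150: 15.0, 300: 5.0},    # distance to the image edge
}

# Second mapping relation table: the same kinds of distances map to importance degree values,
# used to rank deformed second areas when several prompts must be displayed.
second_mapping_table = {
    "first_distance": {50: 0.9, 150: 0.6, 300: 0.3},      # example values only
    "edge_distance":  {50: 0.3, 150: 0.6, 300: 0.9},      # example values only
}
```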
The electronic device 100 may implement audio functions via the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the headphone interface 170D, and the application processor. Such as music playing, recording, etc. The audio module 170 is configured to convert digital audio information into an analog audio signal for output, and also configured to convert an analog audio input into a digital audio signal. The audio module 170 may also be used to encode and decode audio signals. In some embodiments, the audio module 170 may be disposed in the processor 110, or some functional modules of the audio module 170 may be disposed in the processor 110. The speaker 170A, also called a "horn", is used to convert the audio electrical signal into an acoustic signal. The electronic apparatus 100 can listen to music through the speaker 170A or listen to a handsfree call. The receiver 170B, also called "earpiece", is used to convert the electrical audio signal into an acoustic signal. When the electronic apparatus 100 receives a call or voice information, it can receive voice by placing the receiver 170B close to the ear of the person. The microphone 170C, also referred to as a "microphone", is used to convert a sound signal into an electrical signal. When making a call or transmitting voice information, the user can input a voice signal to the microphone 170C by speaking the user's mouth near the microphone 170C. The electronic device 100 may be provided with at least one microphone 170C. In other embodiments, the electronic device 100 may be provided with two microphones 170C to achieve a noise reduction function in addition to collecting sound signals. In other embodiments, the electronic device 100 may further include three, four or more microphones 170C to collect sound signals, reduce noise, identify sound sources, perform directional recording, and so on. The headphone interface 170D is used to connect a wired headphone. The headset interface 170D may be the USB interface 130, may be an open mobile electronic device platform (OMTP) standard interface of 3.5mm, and may also be a CTIA (cellular telecommunications industry association) standard interface.
The sensors 180 may include a pressure sensor 180A, a gyroscope sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used for sensing a pressure signal, and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. The pressure sensor 180A can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like. The capacitive pressure sensor may be a sensor comprising at least two parallel plates having an electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes. The electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic apparatus 100 detects the intensity of the touch operation according to the pressure sensor 180A. The electronic apparatus 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, the touch operations that are applied to the same touch position but different touch operation intensities may correspond to different operation instructions. For example: and when the touch operation with the touch operation intensity smaller than the first pressure threshold value acts on the short message application icon, executing an instruction for viewing the short message. And when the touch operation with the touch operation intensity larger than or equal to the first pressure threshold value acts on the short message application icon, executing an instruction of newly building the short message.
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of electronic device 100 about three axes (i.e., the x, y, and z axes) may be determined by gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects a shake angle of the electronic device 100, calculates a distance to be compensated for by the lens module according to the shake angle, and allows the lens to counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyro sensor 180B may also be used for navigation, body sensing game scenes, and the like.
The acceleration sensor 180E may detect the magnitude of acceleration of the electronic device 100 in various directions (typically along three axes), and may detect the magnitude and direction of gravity when the electronic device 100 is stationary. It may also be used to recognize the posture of the electronic device, and is applied to landscape/portrait switching, pedometers, and other applications.
The distance sensor 180F is used to measure a distance. The electronic device 100 may measure the distance by infrared or laser. In some embodiments, in a photographing scenario, the electronic device 100 may use the distance sensor 180F to measure the distance to achieve fast focusing.
The keys 190 include a power-on key, a volume key, and the like. The keys 190 may be mechanical keys or touch keys. The electronic apparatus 100 may receive a key input, and generate a key signal input related to user setting and function control of the electronic apparatus 100.
The SIM card interface 195 is used to connect a SIM card. The SIM card can be brought into and out of contact with the electronic device 100 by being inserted into the SIM card interface 195 or being pulled out of the SIM card interface 195. The electronic device 100 may support 1 or N SIM card interfaces, where N is a positive integer greater than 1. The SIM card interface 195 may support a Nano SIM card, a Micro SIM card, a SIM card, and the like. Multiple cards can be inserted into the same SIM card interface 195 at the same time, and the types of the cards may be the same or different. The SIM card interface 195 may also be compatible with different types of SIM cards and with external memory cards. The electronic device 100 interacts with the network through the SIM card to implement functions such as calling and data communication. In some embodiments, the electronic device 100 employs an eSIM, namely an embedded SIM card, which may be embedded in the electronic device 100.
The software system of the electronic device 100 may employ a layered architecture, an event-driven architecture, a micro-core architecture, a micro-service architecture, or a cloud architecture. The embodiment of the present application takes an Android system with a layered architecture as an example to exemplarily illustrate the software structure of the electronic device 100. Fig. 3 is a block diagram of a software structure of an electronic device to which the embodiment of the present application is applied. The layered architecture divides the software system of the electronic device 100 into several layers, each layer having a clear role and division of labor, and the layers communicate with each other through software interfaces. In some embodiments, the Android system may be divided into five layers, which are, from top to bottom, an application layer (applications), an application framework layer (application framework), an Android runtime (Android runtime) and system library layer, a hardware abstraction layer (HAL), and a kernel layer (kernel).
The application layer may include a series of application packages, and the application layer runs an application by calling an application programming interface (API) provided by the application framework layer. As shown in fig. 3, the application packages may include applications such as a photo beautification application, camera, gallery, calendar, phone, map, navigation, WLAN, Bluetooth, music, video, and short message.
The application framework layer provides an API and programming framework for the applications of the application layer. The application framework layer includes a number of predefined functions. As shown in FIG. 3, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether there is a status bar, lock the screen, take screenshots, and the like. The content provider is used to store and retrieve data and make the data accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, and the like. The view system includes visual controls, such as controls for displaying text and controls for displaying pictures, and may be used to build applications. A display interface may be composed of one or more views. For example, a display interface including a short message notification icon may include a view for displaying text and a view for displaying a picture. The phone manager is used to provide communication functions of the electronic device 100, for example, management of call status (including connected, hung up, and the like). The resource manager provides various resources, such as localized strings, icons, pictures, layout files, and video files, to the applications. The notification manager enables an application to display notification information in the status bar, and can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction. For example, the notification manager is used to notify download completion, message alerts, and the like. The notification manager may also present notifications in the form of a chart or scroll-bar text in the status bar at the top of the system, such as notifications of applications running in the background, or present notifications on the screen in the form of a dialog window, for example, prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device 100, or flashing an indicator light.
The Android runtime includes a core library and a virtual machine, and is responsible for scheduling and managing the Android system. The core library includes two parts: one part is functions that need to be called by the java language, and the other part is the core library of Android. The application layer and the application framework layer run in the virtual machine. The virtual machine executes the java files of the application layer and the application framework layer as binary files, and is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection. The system library may include a plurality of functional modules, for example: a surface manager (surface manager), media libraries (Media Libraries), a three-dimensional graphics processing library (e.g., OpenGL ES), and a 2D graphics engine (e.g., SGL).
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications. The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc. The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, composition, layer processing and the like. The 2D graphics engine is a drawing engine for 2D drawing.
The hardware abstraction layer can contain a plurality of library modules, and the library modules can be camera library modules, motor library modules and the like. The Android system can load corresponding library modules for the equipment hardware, and then the purpose that the application program framework layer accesses the equipment hardware is achieved. The device hardware may include, for example, a motor, a camera, etc. in the electronic device.
The kernel layer is a layer between hardware and software, and is used to drive the hardware so that the hardware works. The kernel layer includes at least a display driver, a camera driver, an audio driver, a sensor driver, a motor driver, and the like, which are not limited in the embodiment of the present application.
The following describes an image editing prompting method provided in the embodiment of the present application by taking the case where the electronic device is the mobile phone 200 as an example, which is not intended to limit the embodiment of the present application. The following embodiments may be combined with each other, and the same or similar concepts or processes are not described repeatedly. Fig. 4 is a flowchart illustrating an embodiment of an image editing prompting method according to an embodiment of the present application. It should be noted that, for ease of reading, the application interfaces that may be involved in the steps of fig. 4 are described in detail in subsequent embodiments, and are not repeated in the embodiment corresponding to fig. 4. As shown in fig. 4, the image editing prompting method provided in the embodiment of the present application may include:
s401: the mobile phone 200 displays an image editing interface. The image editing interface displays an image to be edited.
The image to be edited may be an image immediately shot by a camera application of the mobile phone 200, or an image stored in an album application of the mobile phone 200, which is not limited herein. The image to be edited comprises a plurality of objects, and the objects are different things. For example, the plurality of objects may include lady a, men b, sun, horizontal bar, and the like; for another example, the plurality of objects may include the character 1, a platform, a blackboard, a projector, and the like, which is not limited herein.
Illustratively, when the image to be edited is an image instantly captured by the camera application, after the image is captured, the mobile phone 200 enters an image browsing interface of the camera application, and the captured image is displayed on the image browsing interface. The mobile phone 200 may display the image editing interface in response to an editing operation input by the user on the image browsing interface. The image editing interface displays the captured image, that is, the image to be edited.
For example, when the image to be edited is an image saved in an album application, the mobile phone 200 may enter the home page of the image editing application in response to a trigger operation of the image editing application by the user. A photo add button is displayed in the main page of the image editing application. The mobile phone 200 can add a photo from the photo album application and display an image editing interface in response to a user's trigger operation of the photo addition button. The image editing interface displays the added photos from the photo album application, and the added photos are the images to be edited.
S402: the cell phone 200 acquires first contour data of each of a plurality of objects included in an image to be edited.
The mobile phone 200 may segment each of the plurality of objects in the image to be edited by using an image segmentation algorithm, and the mobile phone 200 detects edge pixel points of each of the plurality of objects by using an edge detection algorithm. The mobile phone 200 connects the pixel coordinates of the edge pixel points to form geometric data, which is the first contour data. Alternatively, the image segmentation algorithm described above may be a semantic segmentation algorithm.
For example, in the case where the plurality of objects include lady a, men b, sun, and horizontal bar, the mobile phone 200 may segment the lady a, men b, sun, and horizontal bar in the image to be edited using an image segmentation algorithm. Further, the mobile phone 200 may detect first contour data corresponding to each of the woman a, the man b, the sun, and the horizontal bar by using an edge detection algorithm.
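For reference, the following is a minimal sketch of how the first contour data could be obtained with an off-the-shelf image-processing library such as OpenCV. The semantic segmentation step is represented by a hypothetical segment_objects() helper, and tracing the boundary of each segmentation mask stands in for the separate edge-detection step described above; the embodiment does not prescribe a particular segmentation model or edge detector.

```python
import cv2
import numpy as np

def extract_first_contour_data(image_bgr, segment_objects):
    """Return {object_id: contour} for every object in the image to be edited.

    segment_objects(image_bgr) is an assumed helper that returns one binary
    mask per object (e.g. from a semantic segmentation algorithm).
    """
    first_contour_data = {}
    for object_id, mask in segment_objects(image_bgr).items():
        mask_u8 = (mask > 0).astype(np.uint8) * 255
        # Tracing the mask boundary detects the object's edge pixel points and
        # connects their pixel coordinates into a closed geometric contour.
        contours, _ = cv2.findContours(mask_u8, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_NONE)
        if contours:
            first_contour_data[object_id] = max(contours, key=cv2.contourArea)
    return first_contour_data
```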
S403: the mobile phone 200 changes the shape of an image area including a first area of a first object among the plurality of objects in response to a first morphing operation by a user on the first area.
The user may input a first morphing operation to a first region of a first object of the plurality of objects, and the cell phone 200 changes a shape of the first region of the first object in response to the first morphing operation of the first region of the first object. The cell phone 200 may change the shape of a second region of a second object located near the first region while changing the first region of the first object. It should be noted that the first deformation operation may be a one-time triggering operation, and may also include a plurality of consecutive triggering operations with an interval duration less than a preset duration, which is not limited herein.
Illustratively, in the case where the plurality of objects include woman a, man b, the sun, and a horizontal bar, how to change the shape of the first region is described by taking as an example that the first region of the first object is the arm area of woman a and the second region of the second object is the arm area of man b. The mobile phone 200 may respond to one trigger operation of the user on the arm area of woman a and contract the outline of the arm area of woman a, so that a slimming operation on the arm area of woman a is realized. The trigger operation is the first deformation operation. In addition, while the mobile phone 200 contracts the outline of the arm area of woman a, the outline of the arm area of man b near the arm area of woman a may also be deformed; that is, the arm area of man b may be deformed by misoperation. It is understood that in this example the first object is a person, the first region is an arm region, and the second object is also a person.
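To make the deformation operation concrete, the following is a hedged sketch of one way such a local slimming warp could be implemented, as a simple radial pinch around a user-selected point using OpenCV's remap; the actual warping algorithm used by the mobile phone 200 is not specified in this embodiment. Note that every pixel inside the warp radius is moved, which is precisely why an adjacent object such as the arm area of man b can be deformed by misoperation.

```python
import cv2
import numpy as np

def shrink_region(image, center, radius, strength=0.3):
    """Pull content inside `radius` of `center` toward the center (slimming warp)."""
    h, w = image.shape[:2]
    map_x, map_y = np.meshgrid(np.arange(w, dtype=np.float32),
                               np.arange(h, dtype=np.float32))
    dx = map_x - center[0]
    dy = map_y - center[1]
    dist = np.sqrt(dx * dx + dy * dy)
    # Inside the radius, each output pixel samples from a point farther out,
    # so the contour of anything in that circle contracts toward the center.
    factor = np.where(dist < radius, 1.0 + strength * (1.0 - dist / radius), 1.0)
    src_x = (center[0] + dx * factor).astype(np.float32)
    src_y = (center[1] + dy * factor).astype(np.float32)
    return cv2.remap(image, src_x, src_y, cv2.INTER_LINEAR,
                     borderMode=cv2.BORDER_REPLICATE)
```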
S404: the mobile phone 200 obtains second contour data of each object in the image to be edited after the first deformation operation.
The manner in which the mobile phone 200 acquires the second contour data of each object in the image to be edited after the first deformation operation may be the same as the manner in which the mobile phone 200 acquires the first contour data of each object in the plurality of objects in S402, which may specifically refer to the description in S402 and is not described herein again.
It should be noted that, since the first region of the first object is deformed, the second contour data of the first object is different from the first contour data of the first object. In addition, when a second region of a second object located near the first region of the first object is not deformed, the second contour data of the second object is the same as the first contour data of the second object; when a second region of the second object located near the first region of the first object is deformed, the second contour data of the second object is different from the first contour data of the second object.
Illustratively, on the basis of the example in S403 described above, the second contour data of woman a is different from the first contour data of woman a due to the deformation of the arm region of woman a. In addition, when the arm area of the male b near the arm area of the female a is not deformed, the second profile data of the male b is the same as the first profile data of the male b; when the arm area of the male b near the arm area of the female a is deformed, the second profile data of the male b is different from the first profile data of the male b.
S405: the mobile phone 200 determines that the second region of the second object is deformed before and after the first deformation operation according to the second contour data and the first contour data. Wherein the second object is located in the vicinity of the first object and the second area is adjacent to the first area.
The mobile phone 200 may calculate a deformation degree value of the second contour data of each of the plurality of objects other than the first object in the image to be edited relative to the corresponding first contour data. When the deformation degree value of the second contour data of the second object among the plurality of objects other than the first object relative to the first contour data is greater than a deformation threshold, the mobile phone 200 determines that the second object is deformed before and after the first deformation operation. Further, the mobile phone 200 may determine the area where the second contour data of the second object differs from the first contour data as the deformed second area. When the deformation degree value is greater than the deformation threshold, the user can visually perceive that the second region of the second object is deformed. Therefore, the determination made by the electronic device that the second area of the second object is deformed before and after the first deformation operation more accurately fits the user's visual perception.
The above-mentioned deformation degree value may be a difference between the area of the second contour data and the area of the first contour data, a difference between the perimeter of the second contour data and the perimeter of the first contour data, or the like, and is not limited herein.
Illustratively, on the basis of the example in S403, the mobile phone 200 respectively compares whether the deformation degree value of the second contour data of man b relative to the first contour data of man b is greater than the deformation threshold, whether the deformation degree value of the second contour data of the sun relative to the first contour data of the sun is greater than the deformation threshold, and whether the deformation degree value of the second contour data of the horizontal bar relative to the first contour data of the horizontal bar is greater than the deformation threshold. When the deformation degree value of the second contour data of man b relative to the first contour data of man b is greater than the deformation threshold, the mobile phone 200 determines that man b is deformed before and after the first deformation operation. Further, the mobile phone 200 may determine the area where the second contour data of man b differs from the first contour data as the deformed second area.
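As an illustration only, the deformation degree value and threshold comparison described above could be computed as follows, assuming contours in the format produced by the earlier sketch; the area difference and the perimeter difference are the two options named in this embodiment, and the threshold value comes from the threshold-setting modes discussed later.

```python
import cv2

def deformation_degree(first_contour, second_contour, use_perimeter=False):
    """Deformation degree value: area difference or perimeter difference."""
    if use_perimeter:
        return abs(cv2.arcLength(second_contour, True) -
                   cv2.arcLength(first_contour, True))
    return abs(cv2.contourArea(second_contour) - cv2.contourArea(first_contour))

def is_deformed(first_contour, second_contour, deformation_threshold):
    """True if the object is considered deformed before and after the operation."""
    return deformation_degree(first_contour, second_contour) > deformation_threshold
```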
S406: the mobile phone 200 displays first prompt information near the second area of the second object, where the first prompt information is used to indicate that the second area is deformed.
Illustratively, the first prompt message may be a text prompt message such as "deformation occurs", "editing error" and "misoperation occurs here". Thus, the user can perceive that the second region of the second object is deformed through the first prompt message displayed by the mobile phone 200. Further, the mobile phone 200 may process the second region of the second object according to the operation of the user to ensure the quality of the image to be edited.
It should be noted that, when there are multiple second areas where deformation occurs in the image to be edited, the first prompt information may be displayed in the vicinity of each second area. In this way, the user can be separately prompted that the different second areas are deformed. The second areas where deformation occurs may be located in the same object or in different objects, which is not limited herein.
In addition, the mobile phone 200 may also calculate the importance degree of each of the multiple second areas, rank the importance degrees of the multiple second areas, and then display the first prompt information near the second area of each second object according to the ranking result, wherein the first prompt information displayed near different second areas carries different marks, and the different marks are used to indicate the importance degrees of the corresponding second areas. For example, the first prompt information displayed near different second areas is marked with different colors, or with different color depths, and so on, which is not limited herein. In this way, the user can determine different processing modes for different second areas according to the different marks carried by the first prompt information near the different second areas.
In one aspect, the specific way for the mobile phone 200 to determine the importance degree may be as follows: the mobile phone 200 calculates a first distance between the geometric center of each second area and the geometric center of the first object, and looks up the importance degree value in a pre-stored second mapping relation table according to the calculated first distance. In the second mapping relation table, the magnitude of the first distance and the importance degree value are in one-to-one correspondence and are positively correlated.
In another aspect, the specific way for the mobile phone 200 to calculate the importance degree may be as follows: the mobile phone 200 calculates a second distance between the geometric center of each second area and the edge of the image, and looks up the importance degree value in a pre-stored second mapping relation table according to the calculated second distance. In this second mapping relation table, the magnitude of the second distance and the importance degree value are in one-to-one correspondence and are negatively correlated.
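The two importance-degree lookups might be sketched as follows; the contents of the pre-stored mapping tables are assumptions, represented here as sorted (upper bound, importance value) pairs, and the distance to the image edge is taken as the distance to the nearest border.

```python
def lookup(mapping_table, value):
    """Return the value of the first bucket whose upper bound covers `value`."""
    for upper_bound, mapped_value in mapping_table:
        if value <= upper_bound:
            return mapped_value
    return mapping_table[-1][1]

def importance_by_first_distance(second_area_center, first_object_center, table):
    # First distance: second area's geometric center to the first object's center.
    dx = second_area_center[0] - first_object_center[0]
    dy = second_area_center[1] - first_object_center[1]
    first_distance = (dx * dx + dy * dy) ** 0.5
    return lookup(table, first_distance)      # table is positively correlated

def importance_by_edge_distance(second_area_center, image_width, image_height, table):
    # Second distance: second area's geometric center to the nearest image edge.
    x, y = second_area_center
    second_distance = min(x, y, image_width - x, image_height - y)
    return lookup(table, second_distance)     # table is negatively correlated
```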
Optionally, the image editing prompting method provided in the embodiment of the present application may further include:
s407: the mobile phone 200 displays the second object with the first contour data of the second object in response to the user's repair operation on the second area of the second object.
When the user feels that the deformation of the second object greatly affects the quality of the image to be edited, the user may input a repair operation on the second area of the second object. Further, in response to the user's repair operation on the second area of the second object, the mobile phone 200 replaces the second contour data of the second object with the first contour data of the second object to display the second object. In this way, the shape of the second area of the second object is repaired, and the adverse effect of the deformation of the second area of the second object on the quality of the image to be edited is eliminated. Moreover, the user can display the second object, that is, repair the shape of the second area, simply by triggering a first repair button displayed near the second area, which is convenient and fast.
Illustratively, in the case that the user feels that the deformation of the male b has a great influence on the quality of the image to be edited, the user may input the repair operation to the arm area of the male b. Further, the mobile phone 200 displays the male b by replacing the second profile data of the male b with the first profile data of the male b in response to the user's repair work on the arm area of the male b. Therefore, the shape of the arm area of the male b is repaired, and the adverse effect of deformation of the arm area of the male b on the quality of the image to be edited is eliminated.
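One possible way to apply the replacement of the second contour data with the first contour data is sketched below, under the assumption that the mobile phone 200 keeps a cached copy of the image as it was before the first deformation operation; copying back the pixels covered by the first and second contours of the second object restores its pre-deformation shape. This is only an illustrative realization of the repair step, not the only way the replacement could be rendered.

```python
import cv2
import numpy as np

def repair_second_object(edited_image, original_image, first_contour, second_contour):
    """Restore the second object's region from the cached pre-deformation image."""
    mask = np.zeros(edited_image.shape[:2], dtype=np.uint8)
    # Cover both the original and the deformed region so no artefacts remain.
    cv2.drawContours(mask, [first_contour, second_contour], -1, 255, cv2.FILLED)
    repaired = edited_image.copy()
    repaired[mask == 255] = original_image[mask == 255]
    return repaired
```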
To sum up, in the image editing prompting method provided in the embodiment of the present application, the mobile phone 200 may change the shape of the image area including the first area in response to the first deformation operation of the user on the first area of the first object in the plurality of objects. The mobile phone 200 identifies whether second contour data of the plurality of objects except the first object after the first morphing operation is changed from first contour data before the corresponding first morphing operation. In the case where second contour data of a second object among the plurality of objects other than the first object is changed with respect to the corresponding first contour data, the mobile phone 200 determines that a second region of the second object is deformed before and after the first deforming operation. Further, the mobile phone 200 displays the first prompt message near the second area of the second object to prompt the user that the second area of the second object is deformed by misoperation. When the user feels that the second object is deformed and the quality of the image to be edited is greatly affected, the user may input a repairing operation to the second area of the second object. Further, the mobile phone 200 replaces the second contour data of the second object with the first contour data of the second object to display the second object in response to the user's repair operation on the second area of the second object. Therefore, the shape of the second area of the second object is repaired, and the adverse effect of deformation of the second area of the second object on the quality of the image to be edited is eliminated.
Optionally, after S407, in a case that the user further feels that the third area of the first object in the multiple objects needs to change the shape, the image editing prompting method provided in the embodiment of the present application may further include:
s408: the mobile phone 200 changes the shape of the image area including the third area in response to a second deforming operation of the third area of the first object among the plurality of objects by the user.
Similarly to S403 above, while changing the third area of the first object, the mobile phone 200 may also change the shape of a fourth area of a third object located near the third area.
Illustratively, in the case where the plurality of objects include woman a, man b, the sun, and a horizontal bar, how to change the shape of the third region is described by taking as an example that the third region of the first object is the face area of woman a and the fourth region of the third object is the hand-held area of the horizontal bar. The mobile phone 200 may contract the outline of the face area of woman a in response to a trigger operation of the user on the face area of woman a, so that a face-slimming operation on the face area of woman a is realized. The trigger operation is the second deformation operation described above. In addition, while the mobile phone 200 contracts the outline of the face area of woman a, the hand-held area of the horizontal bar above the face area of woman a may be bent, so that the hand-held area of the horizontal bar is deformed; that is, the hand-held area of the horizontal bar is deformed by misoperation. It is to be understood that in this example the first object is a person, the third area is a face area, and the third object is an inanimate object.
S409: the mobile phone 200 acquires third contour data of each object in the image to be edited after the second morphing operation.
The manner in which the mobile phone 200 acquires the third contour data of each object in the image to be edited after the second morphing operation is the same as the manner in which the mobile phone 200 acquires the first contour data of each object in the plurality of objects in S402, which may specifically refer to the description in S402 and is not repeated herein.
S410-1: the mobile phone 200 determines that a fourth area of a third object is deformed before and after the first deformation operation according to the third contour data and the first contour data, wherein the third object is located near the first object, and the fourth area is adjacent to the third area.
The method for determining, by the mobile phone 200, the deformation of the fourth area of the third object before and after the second deformation operation according to the third profile data and the first profile data is the same as the method for determining, by the mobile phone 200, the deformation of the second area of the second object before and after the first deformation operation according to the second profile data and the first profile data in S405, and details thereof are not repeated here.
Illustratively, on the basis of the example of S408, when the third profile data of the horizontal bar is changed relative to the first profile data of the horizontal bar, the cell phone 200 determines that the horizontal bar is deformed before and after the second deforming operation. Further, the mobile phone 200 may determine an area where the third profile data of the horizontal bar is different from the first profile data as a fourth area where the deformation occurs.
In addition, S410-1 described above can also be replaced with:
s410-2: the mobile phone 200 determines that a fourth area of a third object is deformed before and after the second deforming operation according to the third contour data and the second contour data, wherein the third object is located near the first object, and the fourth area is adjacent to the third area. It is understood that the implementation principle of S4011-2 is the same as that of S4011-1, and is not described herein again.
S411: the mobile phone 200 displays second prompt information near the fourth area of the third object, where the second prompt information is used to indicate that the fourth area is deformed.
The second prompt information may also be a text prompt such as "deformation occurs", "editing error", or "misoperation occurs here". Illustratively, based on the embodiment of S410-1 or S410-2 above, the mobile phone 200 may display "deformation occurs" near the hand-held area of the horizontal bar to prompt the user that the hand-held area of the horizontal bar is deformed. In this way, the user can also perceive, through the second prompt information displayed by the mobile phone 200, that the hand-held area of the horizontal bar is deformed. Further, the mobile phone 200 may process the deformation of the hand-held area of the horizontal bar according to the operation of the user, so as to ensure the quality of the image to be edited.
In addition, in the embodiment corresponding to S405 described above, the degrees of attention paid by the user to different objects in the image to be edited are different, and therefore the criteria by which the user visually judges whether different objects are deformed also differ. For example, for an object with high user attention, the user visually perceives that deformation occurs even if the deformation degree value is small; for an object with low user attention, the user visually perceives that no deformation occurs even if the deformation degree value is large. Therefore, different deformation thresholds need to be set for different objects, so that the mobile phone 200 can determine whether each object is deformed in a way that more accurately fits the user's visual perception. Three ways in which the mobile phone 200 may specifically set the deformation threshold corresponding to the second object are described below.
The first setting mode is as follows: since the user inputs the first deformation operation to the first object, the first object can be considered an object with high user attention, and the closer the second object is to the first object, the higher the possibility that it is focused on by the user; conversely, the farther the second object is from the first object, the lower the possibility that it is focused on by the user. Therefore, the mobile phone 200 may calculate a first distance between the geometric center of the second object and the geometric center of the first object, and look up the deformation threshold in a pre-stored first mapping relation table according to the calculated first distance. In the pre-stored first mapping relation table, the magnitude of the first distance and the deformation threshold are in one-to-one correspondence and are positively correlated. Therefore, when the electronic device determines, with the deformation threshold found from the first distance, that the second area of the second object is deformed before and after the first deformation operation, the determination more accurately fits the user's visual perception.
The second setting mode is as follows: when the area of the second object is large, the user visually perceives that the second object is not deformed even if the deformation degree value of the second object is relatively large; when the area of the second object is small, the user visually perceives that the second object is deformed even if the deformation degree value of the second object is relatively small. Therefore, the mobile phone 200 may calculate the area of the second object, and look up the deformation threshold in the pre-stored first mapping relation table according to the calculated area. In the pre-stored first mapping relation table, the magnitude of the area and the deformation threshold are in one-to-one correspondence and are positively correlated. Therefore, when the electronic device determines, with the deformation threshold found from the area of the second object, that the second area of the second object is deformed before and after the first deformation operation, the determination more accurately fits the user's visual perception.
The third setting mode is as follows: in general, the farther an object is from the edge of the image (that is, the closer it is to the center of the image), the higher the user's attention to it; conversely, the closer the object is to the edge of the image, the lower the user's attention to it. In this way, when the second object is close to the edge of the image, the user visually perceives that the second object is not deformed even if the deformation degree value is large; conversely, when the second object is far from the edge of the image, the user visually perceives that the second object is deformed even if the deformation degree value is small. Therefore, the mobile phone 200 may calculate a second distance between the geometric center of the second object and the edge of the image, and look up the deformation threshold in the pre-stored first mapping relation table according to the calculated second distance. In the pre-stored first mapping relation table, the magnitude of the second distance from the image edge and the deformation threshold are in one-to-one correspondence and are negatively correlated. Therefore, when the electronic device determines, with the deformation threshold found from the second distance, that the second area of the second object is deformed before and after the first deformation operation, the determination more accurately fits the user's visual perception.
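The three threshold-setting modes could be sketched as follows, reusing the lookup() helper and the (upper bound, value) table format assumed in the earlier importance-degree sketch; the comments mirror the correlations stated above, and the concrete table contents remain assumptions.

```python
import cv2

def threshold_by_first_distance(second_center, first_center, table):
    # Larger distance from the first object -> larger threshold (less sensitive).
    dx = second_center[0] - first_center[0]
    dy = second_center[1] - first_center[1]
    return lookup(table, (dx * dx + dy * dy) ** 0.5)

def threshold_by_area(second_contour, table):
    # Larger object area -> larger threshold (less sensitive).
    return lookup(table, cv2.contourArea(second_contour))

def threshold_by_edge_distance(second_center, image_width, image_height, table):
    # Larger distance from the image edge -> smaller threshold (more sensitive).
    x, y = second_center
    edge_distance = min(x, y, image_width - x, image_height - y)
    return lookup(table, edge_distance)
```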
Taking a plurality of objects in the image to be edited including a woman a, a man b, the sun and a horizontal bar, a first area of the first object being an arm area of the woman a, a second area of the second object being an arm area of the man b, a third area of the first object being a face area of the woman a, and a fourth area of the third object being a handheld area of the horizontal bar as an example, with reference to the interface operation diagrams of fig. 1 and fig. 5-12, how to implement the image editing prompting method according to the embodiment corresponding to fig. 4 is described.
For example, how to prompt the user that the arm area of the male b is deformed by being misoperated after the first deformation operation of the mobile phone 200 on the arm area of the female a in the above-mentioned S401-S408 is described with reference to fig. 1 and 5-7.
Referring to (a)-(e) of fig. 1 and the above description of (a)-(e) of fig. 1, when the user observes that the arm area of woman a appears thick in the image to be edited, the mobile phone 200 may slim the arm of woman a in response to the user's trigger operation on the arm of woman a. While the arm of woman a is slimmed, the shape of the arm of man b is deformed by misoperation.
Furthermore, the mobile phone 200 may recognize that the deformation degree value of the second contour data of man b after responding to the trigger operation, relative to the first contour data of man b before responding to the trigger operation, is greater than the deformation threshold. Thus, the mobile phone 200 determines that man b is deformed before and after responding to the above trigger operation. The mobile phone 200 may determine the arm area where the second contour data of man b differs from the first contour data as the deformed area.
Further, as shown in fig. 5(a), the mobile phone 200 may display a first prompt message 410 of "deformation occurs" near the arm area of male b to prompt the user that the arm area of male b is deformed. In order to help the user locate the area where the deformation occurs, the mobile phone 200 surrounds the arm area of the male b with a rectangular frame to prompt the user that the area in the rectangular frame is the area where the deformation occurs.
As shown in (b) and (c) of fig. 5, the cellular phone 200 may display a repair button 502 and a not-repair button 503 in response to a user's trigger operation on the arm area of male b. In the case where the user feels that the arm area of male b is deformed, which affects the quality of the image to be edited, the user may click a repair button 502 as shown in fig. 6 (a). As such, as shown in (b) of fig. 6, the mobile phone 200 displays the male b by replacing the second profile data of the male b with the first profile data of the male b in response to the user's trigger operation of the repair button 502. Therefore, the shape of the arm area of the male b is repaired, and the adverse effect of deformation of the arm area of the male b on the quality of the image to be edited is eliminated.
In addition, optionally, in the case where the user feels that, even though man b is deformed, the quality of the image to be edited is not affected or even becomes higher, the user may trigger the not-repair button 503 as shown in (a) in fig. 7. Further, the mobile phone 200 may still display man b with the second contour data of man b in response to the user's trigger operation on the not-repair button 503. In this way, even though the arm area of man b is deformed, the quality of the image to be edited is not affected. In addition, the mobile phone 200 may also cancel the display of the first prompt information to eliminate interference with the user's browsing of the image to be edited.
Next, how to prompt the user that the hand-held area of the horizontal bar is deformed by misoperation after the mobile phone 200 performs the second deformation operation on the face area of woman a in S409-S411 above is described with reference to fig. 8.
On the basis of the image editing interface 205 shown in (a) of fig. 5, if the user observes that the face area of woman a in the image to be edited appears too large, as shown in (a) and (b) of fig. 8, the mobile phone 200 responds to a trigger operation of the user on the face area of woman a and contracts the face contour of woman a from the solid line to the dotted line, so that a face-slimming operation on the face area of woman a is realized.
Referring to (b) of fig. 8, while the mobile phone 200 contracts the outline of the face area of woman a, the hand-held area of the horizontal bar c above the face area of woman a is bent from the solid line to the dotted line; that is, the hand-held area of the horizontal bar c is deformed by misoperation. In this way, the mobile phone 200 can recognize that the second contour data of the horizontal bar c after responding to the above trigger operation is changed relative to the first contour data of the horizontal bar c before responding to the above trigger operation. Further, the mobile phone 200 determines that the horizontal bar c is deformed before and after responding to the above trigger operation, and may determine the hand-held area where the second contour data of the horizontal bar c differs from the first contour data as the deformed area.
Further, as shown in (a) of fig. 8, the mobile phone 200 may display a second prompt message 802 of "deformation occurs" near the hand-held area of the horizontal bar c to prompt the user that the hand-held area of the horizontal bar c is deformed. To help the user locate the area where the deformation occurs, the mobile phone 200 surrounds the hand-held area of the horizontal bar c with the rectangular box 801 to prompt the user that the area inside the rectangular box 801 is the area where the deformation occurs.
On the basis of the embodiment corresponding to fig. 8, three different processing modes of the mobile phone for the image to be edited in the case that the arm area of the male b in the image to be edited is deformed by misoperation and the hand-held area of the horizontal bar c is deformed by misoperation are respectively described below with reference to fig. 9 to 12.
Fig. 9 is an interface schematic diagram of a mobile phone 200 for repairing an arm area of a male b and a hand-held area of a horizontal bar c by one key according to an embodiment of the present application. On the basis of the image editing interface 205 corresponding to fig. 8, as shown in (a) in fig. 9, the mobile phone 200 may display a one-key repair button 901 and a one-key no-repair button 902 in the image to be edited in response to a user's trigger operation at any position of the image to be edited. If the user feels that the deformation of the arm area of the male b and the hand-held area of the horizontal bar c respectively affects the quality of the image to be edited, as shown in (b) of fig. 9, the one-touch repair button 901 may be triggered. The mobile phone 200 replaces the second profile data of the male b with the first profile data of the male b to display the male b in response to the user's trigger operation of the one-key repair button 901; and replacing the second contour data of the horizontal bar c with the first contour data of the horizontal bar c to display the horizontal bar c.
Therefore, the shape of the arm area of the male b is repaired, and the adverse effect of deformation of the arm area of the male b on the quality of the image to be edited is eliminated; and the shape of the handheld area of the horizontal bar c is repaired, so that the adverse effect of the deformation of the handheld area of the horizontal bar c on the quality of the image to be edited is eliminated. In addition, the user only needs to trigger the one-touch repair button 901 once, and the mobile phone 200 can complete shape repair of the arm area of the male b and the hand-held area of the horizontal bar c, so that the efficiency is high, and convenience and rapidness are achieved. In addition, the mobile phone 200 cancels the display of the first prompt message and the second prompt message, and can eliminate the interference to the user in browsing the image to be edited.
Fig. 10 is an interface schematic diagram of the mobile phone 200 provided in the embodiment of the present application, which does not repair the arm area of the male b and the hand-held area of the horizontal bar c by one key. On the basis of the image editing interface 205 corresponding to fig. 9(a), if the user feels that the deformation of the arm area of the male b and the hand-held area of the horizontal bar c respectively does not affect the quality of the image to be edited, as shown in fig. 10 (a), the one-touch no-repair button 902 may be triggered. The mobile phone 200 responds to the triggering operation of the one-key non-repair button 902 by the user, and still displays the male b with the second profile data of the male b; and still displaying the horizontal bar c with the second contour data for the horizontal bar c. Thus, even if the arm area of the male b and the hand-held area of the horizontal bar c are deformed, the quality of the image to be edited is not affected. In addition, the mobile phone 200 cancels the display of the first prompt message and the second prompt message, so that the interference caused to the user browsing the image to be edited can be eliminated.
Fig. 11-12 are schematic interface diagrams of the mobile phone 200 provided in the embodiment of the present application for repairing the hand-held area of the horizontal bar c and not repairing the arm area of man b. On the basis of the image editing interface 205 corresponding to (a) of fig. 8, when the user feels that the hand-held area of the horizontal bar c is deformed and the quality of the image to be edited is affected, as shown in (a) of fig. 11, the user may trigger the hand-held area of the horizontal bar c. Further, in response to the user's trigger operation on the hand-held area of the horizontal bar c, the mobile phone 200 cancels the display of the "deformation occurs" prompt information near the hand-held area of the horizontal bar c, and displays a repair button 1101 and a non-repair button 1102 near the hand-held area of the horizontal bar c, as shown in (b) of fig. 11. Canceling the display of the prompt information eliminates interference with the user's browsing of the image to be edited. As shown in (c) and (d) of fig. 11, the mobile phone 200 displays the horizontal bar c by replacing the second contour data of the horizontal bar c with the first contour data of the horizontal bar c in response to the user's trigger operation on the repair button 1101. Therefore, the shape of the hand-held area of the horizontal bar c is repaired, and the adverse effect of the deformation of the hand-held area of the horizontal bar c on the quality of the image to be edited is eliminated.
As shown in (a) of fig. 12, the mobile phone 200 may display a repair button 1201 and a non-repair button 1202 in response to the user's trigger operation on the arm area of man b. When the user feels that the deformation of the arm area of man b does not affect the quality of the image to be edited, the user may click the non-repair button 1202 as shown in (a) of fig. 12. In this way, as shown in (c) of fig. 12, the mobile phone 200 still displays man b with the second contour data of man b in response to the user's trigger operation on the non-repair button 1202. Therefore, even though the arm area of man b is deformed, the quality of the image to be edited is not affected. In addition, the mobile phone 200 cancels the display of the first prompt information, so that interference with the user's browsing of the image to be edited can be eliminated.
It is to be understood that, with fig. 11 to 12 described above, the mobile phone 200 can selectively repair the shape of an object that a user desires to repair, and selectively not repair the shape of an object that a user does not desire to repair, from among a plurality of objects that are deformed by a malfunction. The image editing flexibility is high, and the quality of the image to be edited is further ensured.
Fig. 13 is a flowchart illustrating an image editing prompting method according to another embodiment of the present application. It should be noted that the basic principle and the generated technical effect of the image editing prompting method provided by the embodiment of the present application are the same as those of the above embodiment, and for brief description, no mention is made in the embodiment of the present application, and reference may be made to the corresponding contents in the above embodiment. As shown in fig. 13, an image editing prompting method according to another embodiment provided in the present application may include:
s131: the mobile phone 200 displays an image editing interface, wherein the image editing interface displays an image to be edited.
The implementation principle of S131 is the same as that of S401 provided in fig. 4, and is not described herein again.
S132: the cell phone 200 acquires first contour data of each of a plurality of objects included in an image to be edited.
The implementation principle of S132 is the same as that of S402 provided in fig. 4, and is not described herein again.
S133: the mobile phone 200 changes the shape of an image area including a first area in response to a first morphing operation of the user on the first area of a first object among the plurality of objects.
The implementation principle of S133 is the same as that of S403 provided in fig. 4, and is not described herein again.
S134: the mobile phone 200 obtains second contour data of each object in the image to be edited after the first deformation operation.
The implementation principle of S134 is the same as that of S404 provided in fig. 4, and is not described herein again.
S135: the mobile phone 200 determines that the second region of the second object is deformed before and after the first deformation operation according to the second contour data and the first contour data. Wherein the second object is located in the vicinity of the first object and the second area is adjacent to the first area.
The implementation principle of S135 is the same as that of S405 provided in fig. 4, and is not described herein again.
S136: the cell phone 200 replaces the second contour data of the second object with the first contour data of the second object to display the second object.
Different from the image editing prompting method provided in fig. 4, after determining that the second region of the second object is deformed before and after the first deformation operation, the mobile phone 200 replaces the second contour data of the second object with the first contour data of the second object to display the second object. Furthermore, automatic repair of the second region of the deformed second object is achieved. Furthermore, the quality of the image to be edited is not affected.
S137: the cell phone 200 displays the first prompt message near the second area of the second object.
Unlike the image editing prompting method provided in fig. 4, the first prompt information here is used to indicate that the second region of the second object has been restored to its shape before the first deformation operation. For example, the first prompt information may be a text prompt such as "repaired after deformation" or "successfully repaired". In this way, through the first prompt information, the user can perceive that the second area of the second object was deformed and has been repaired, and that the quality of the image to be edited is not affected.
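Combining the earlier sketches, the automatic repair and prompting of S135-S137 could look roughly like the following; is_deformed() and repair_second_object() are the assumed helpers defined above, and show_prompt() stands in for whatever interface element displays the first prompt information. This is an illustrative flow, not a prescribed implementation.

```python
def auto_repair_after_deformation(edited_image, original_image,
                                  first_contours, second_contours,
                                  first_object_id, thresholds, show_prompt):
    """Restore every bystander object whose contour changed beyond its threshold."""
    for object_id, first_contour in first_contours.items():
        if object_id == first_object_id:
            continue  # the object the user intentionally deformed is kept as edited
        second_contour = second_contours[object_id]
        if is_deformed(first_contour, second_contour, thresholds[object_id]):
            edited_image = repair_second_object(edited_image, original_image,
                                                first_contour, second_contour)
            show_prompt(object_id, "Deformed area has been restored")
    return edited_image
```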
In addition, in the image editing prompting method provided by the above-mentioned introduction of the embodiment of the present application, the trigger operation mentioned may include: a click operation, a long-press operation, a gesture trigger operation, and the like, which are not limited herein.
In addition, in the image editing prompting method provided in the above embodiments of the present application, the image to be edited may also be replaced by a process part image. After the mobile phone 200 changes the shape of a first part in the process part image in response to a first deformation operation of the user on a first region of the first part, a second region of a second part may also be deformed by misoperation. At this time, the first prompt information may be displayed near the second region. In this way, the user may input a repair operation on the second region of the second part. Further, in response to the user's repair operation on the second region of the second part, the mobile phone 200 replaces the second contour data of the second part with the first contour data of the second part to display the second part. Therefore, the shape of the second region of the second part is repaired, and the adverse effect of the deformation of the second region of the second part on the quality of the process part image is eliminated. Furthermore, the quality of the process part manufactured according to the process part image is ensured.
As shown in fig. 14, fig. 14 is a schematic structural diagram illustrating an image editing and prompting apparatus provided in an embodiment of the present application, where the image editing and prompting apparatus may be an electronic device in the embodiment of the present application, or may be an electronic chip or a chip system.
Illustratively, taking the image editing prompting apparatus as an electronic device or a chip system applied in the electronic device as an example, the display unit 1401 is used to support the image editing prompting apparatus to execute the display steps in the above embodiments, and the processing unit 1402 is used to support the image editing prompting apparatus to execute the processing steps in the above embodiments.
The processing unit 1402 may be integrated with the display unit 1401, and the processing unit 1402 and the display unit 1401 may communicate.
In a possible implementation manner, the image editing prompting apparatus may further include: a storage unit 1403. The storage unit 1403 may include one or more memories, which may be devices in one or more devices or circuits for storing programs or data.
The storage unit 1403 may be independent and connected to the processing unit 1402 through a communication bus. The storage unit 1403 may also be integrated with the processing unit 1402.
Taking an example that the image editing prompting apparatus may be a chip or a chip system of the electronic device in the embodiment of the present application, the storage unit 1403 may store computer-executable instructions of a method of the electronic device, so that the processing unit 1402 executes the method of the electronic device in the above embodiment. The storage unit 1403 may be a register, a cache, a Random Access Memory (RAM), or the like, and the storage unit 1403 may be integrated with the processing unit 1402. Storage unit 1403 may be a read-only memory (ROM) or other type of static storage device that may store static information and instructions, and storage unit 1403 may be separate from processing unit 1402.
An embodiment of the present application provides an image editing prompt apparatus, including: a display unit 1401 configured to display an image editing interface, where the image editing interface displays an image to be edited; a processing unit 1402 for acquiring first contour data of each of a plurality of objects included in an image to be edited; a processing unit 1402, further configured to change a shape of an image area including a first area of a first object of the plurality of objects in response to a first morphing operation by a user on the first area; the processing unit 1402 is further configured to obtain second contour data of each object in the image to be edited after the first deformation operation; the processing unit 1402 is further configured to determine, according to the second contour data and the first contour data, that a second region of the second object is deformed before and after the first deformation operation, where the second object is located near the first object, and the second region is adjacent to the first region; a display unit 1401, further configured to display first prompt information near a second region of the second object, where the first prompt information is used to indicate that the second region is deformed; alternatively, the first prompt information is used to instruct the second region of the second object to restore the shape before the first morphing operation.
In one possible implementation, the first object is a person, the first region is a face region, and the second object is a non-person object; alternatively, both the first object and the second object are persons.
In a possible implementation, the processing unit 1402 is further configured to display the second object with the first contour data of the second object in response to a repair operation by the user on the second region of the second object.
In a possible implementation manner, the processing unit 1402 is further configured to respond to a trigger operation by the user on the second region of the second object, and the display unit 1401 is further configured to display a first repair button in the vicinity of the second region of the second object; the processing unit 1402 is further configured to respond to a trigger operation by the user on the first repair button, and the display unit 1401 is further configured to display the second object with the first contour data of the second object.
In a possible implementation manner, there are a plurality of second objects; the display unit 1401 is specifically configured to display a second repair button on the image editing interface; the processing unit 1402 is further configured to respond to a trigger operation by the user on the second repair button, and the display unit 1401 is further configured to display each of the plurality of second objects with its first contour data.
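As a minimal sketch of the repair behaviour described in the two implementations above (restore one second object via the first repair button, or every second object via the second repair button), the following Python snippet keeps the first contour data and copies it back on repair. The EditSession class and its method names are hypothetical and introduced only for this example.

import numpy as np

class EditSession:
    def __init__(self, first_contours):
        # first_contours: {object_id: (N, 2) point list} captured before any deformation operation.
        self.first_contours = {k: np.asarray(v, dtype=float) for k, v in first_contours.items()}
        self.current_contours = {k: v.copy() for k, v in self.first_contours.items()}

    def apply_deformation(self, obj_id, deformed_contour):
        self.current_contours[obj_id] = np.asarray(deformed_contour, dtype=float)

    def repair_one(self, obj_id):
        # First repair button: restore a single accidentally deformed second object.
        self.current_contours[obj_id] = self.first_contours[obj_id].copy()

    def repair_all(self, obj_ids):
        # Second repair button: restore every listed second object in one operation.
        for obj_id in obj_ids:
            self.repair_one(obj_id)

session = EditSession({"lamp_2": [[12, 0], [16, 0], [16, 20], [12, 20]]})
session.apply_deformation("lamp_2", [[12, 0], [13, 0], [13, 20], [12, 20]])  # unintended side effect
session.repair_one("lamp_2")
print(session.current_contours["lamp_2"].tolist())  # back to the first contour data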
In a possible implementation manner, the processing unit 1402 is further configured to cancel the display of the first prompt information in response to an operation by the user indicating that the second region of the second object is not to be repaired.
In a possible implementation manner, there are a plurality of second regions; the processing unit 1402 is further configured to calculate an importance degree of each of the plurality of second regions and rank the plurality of second regions by importance degree; the display unit 1401 is further configured to display, according to the ranking result, first prompt information near the second region of each second object, where the first prompt information displayed near different second regions carries different marks.
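The embodiments do not fix how the importance degree is computed, so the following Python sketch is only one plausible reading: it scores each deformed second region by its contour area and its distance to the image centre, then assigns a distinct mark per rank. Both the scoring formula and the mark labels are assumptions.

import numpy as np

def importance(contour, image_center):
    # Assumed heuristic: larger regions closer to the image centre matter more.
    c = np.asarray(contour, dtype=float)
    area = 0.5 * abs(np.dot(c[:, 0], np.roll(c[:, 1], 1)) - np.dot(c[:, 1], np.roll(c[:, 0], 1)))
    dist_to_center = np.linalg.norm(c.mean(axis=0) - np.asarray(image_center, dtype=float))
    return area / (1.0 + dist_to_center)

def rank_regions(regions, image_center, marks=("mark_1", "mark_2", "mark_3")):
    # Sort the deformed second regions by importance and attach a distinct mark to each.
    ordered = sorted(regions.items(), key=lambda kv: importance(kv[1], image_center), reverse=True)
    return [(obj_id, marks[i % len(marks)]) for i, (obj_id, _) in enumerate(ordered)]

regions = {
    "lamp_2": [[12, 0], [16, 0], [16, 20], [12, 20]],
    "cup_3": [[30, 5], [33, 5], [33, 8], [30, 8]],
}
print(rank_regions(regions, image_center=(15.0, 15.0)))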
In one possible implementation, the processing unit 1402 is further configured to change a shape of an image region including a third region of the first object in response to a second deformation operation by the user on the third region; the processing unit 1402 is further configured to obtain third contour data of each object in the image to be edited after the second deformation operation; the processing unit 1402 is further configured to determine, according to the third contour data and the first contour data, that a fourth region of a third object is deformed before and after the first deformation operation, where the third object is located near the first object, and the fourth region is adjacent to the third region; the display unit 1401 is further configured to display second prompt information near the fourth region of the third object, where the second prompt information is used to indicate that the fourth region is deformed; or, the second prompt information is used to indicate that the fourth region of the third object has restored the shape it had before the first deformation operation.
In a possible implementation, the processing unit 1402 is specifically configured to calculate a deformation degree value of the second contour data of the second object relative to the first contour data of the second object, and to determine, in a case where the deformation degree value is greater than a deformation threshold, that the second region of the second object is deformed before and after the first deformation operation.
In a possible implementation manner, the processing unit 1402 is further configured to calculate a first distance between a geometric center of the second object and a geometric center of the first object, and look up the deformation threshold in a pre-stored first mapping relation table according to the first distance, where in the pre-stored first mapping relation table, first distances are in one-to-one correspondence with, and positively correlated with, deformation thresholds.
Alternatively, the processing unit 1402 is further configured to calculate an area of the second object, and look up the deformation threshold in the pre-stored first mapping relation table according to the area, where in the pre-stored first mapping relation table, areas are in one-to-one correspondence with, and positively correlated with, deformation thresholds.
Alternatively, the processing unit 1402 is further configured to calculate a second distance between the geometric center of the second object and an edge of the image, and look up the deformation threshold in the pre-stored first mapping relation table according to the second distance, where in the pre-stored first mapping relation table, second distances are in one-to-one correspondence with, and negatively correlated with, deformation thresholds. A minimal lookup sketch covering these alternatives is given below.
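The following Python sketch illustrates looking up a deformation threshold from a pre-stored mapping table for two of the alternatives above (first distance with positive correlation, second distance with negative correlation). The breakpoints and threshold values are invented for illustration and are not taken from the patent.

import bisect

# First distance (between the geometric centres of the second and first objects) -> threshold.
# Positive correlation: the farther the second object is from the edited object, the larger
# the change must be before the user is prompted.
DISTANCE_BREAKPOINTS = [50, 100, 200, 400]   # pixels (invented)
DISTANCE_THRESHOLDS = [5.0, 10.0, 20.0, 40.0, 80.0]

def threshold_from_first_distance(first_distance):
    return DISTANCE_THRESHOLDS[bisect.bisect_right(DISTANCE_BREAKPOINTS, first_distance)]

# Second distance (from the geometric centre of the second object to the image edge) -> threshold.
# Negative correlation: the farther the object sits from the edge, the smaller the threshold.
EDGE_BREAKPOINTS = [20, 60, 120]             # pixels (invented)
EDGE_THRESHOLDS = [40.0, 20.0, 10.0, 5.0]

def threshold_from_second_distance(second_distance):
    return EDGE_THRESHOLDS[bisect.bisect_right(EDGE_BREAKPOINTS, second_distance)]

print(threshold_from_first_distance(150))    # 20.0
print(threshold_from_second_distance(150))   # 5.0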
In one possible implementation, the deformation degree value is a difference between an area enclosed by the second contour data and an area enclosed by the first contour data; alternatively, the deformation degree value is a difference between a perimeter of the second contour data and a perimeter of the first contour data.
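The two deformation degree values named above translate directly into code. The following Python sketch computes both the area difference and the perimeter difference for a contour given as an ordered list of (x, y) points, which is an assumed representation.

import numpy as np

def contour_area(contour):
    # Shoelace formula for the area enclosed by a closed contour.
    c = np.asarray(contour, dtype=float)
    x, y = c[:, 0], c[:, 1]
    return 0.5 * abs(np.dot(x, np.roll(y, 1)) - np.dot(y, np.roll(x, 1)))

def contour_perimeter(contour):
    # Sum of edge lengths of the closed polygon (the last point connects back to the first).
    c = np.asarray(contour, dtype=float)
    return float(np.sum(np.linalg.norm(np.roll(c, -1, axis=0) - c, axis=1)))

def deformation_degree(first_contour, second_contour, mode="area"):
    if mode == "area":
        return abs(contour_area(second_contour) - contour_area(first_contour))
    return abs(contour_perimeter(second_contour) - contour_perimeter(first_contour))

before = [[12, 0], [16, 0], [16, 20], [12, 20]]
after = [[12, 0], [13, 0], [13, 20], [12, 20]]
print(deformation_degree(before, after, mode="area"))       # 60.0
print(deformation_degree(before, after, mode="perimeter"))  # 6.0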
Fig. 15 is a schematic diagram of a hardware structure of an electronic device according to an embodiment of the present application. As shown in fig. 15, the electronic device includes a processor 1501, a communication line 1504, and at least one communication interface (fig. 15 uses the communication interface 1503 as an example).
The processor 1501 may be a general-purpose central processing unit (CPU), a microprocessor, an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to control execution of the programs of the solution of the present application.
The communication lines 1504 may include circuitry to transfer information between the above-described components.
Communication interface 1503 may use any device such as a transceiver for communicating with other devices or communication networks, such as an ethernet, a Wireless Local Area Network (WLAN), etc.
Possibly, the electronic device may further comprise a memory 1502.
The memory 1502 may be, but is not limited to, a read-only memory (ROM) or other type of static storage device that can store static information and instructions, a Random Access Memory (RAM) or other type of dynamic storage device that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disk storage, optical disk storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer. The memory may be separate and coupled to the processor via communication line 1504. The memory may also be integral to the processor.
The memory 1502 is used for storing computer-executable instructions for executing the present application, and is controlled by the processor 1501 to execute the computer-executable instructions. The processor 1501 is configured to execute computer-executable instructions stored in the memory 1502, so as to implement the image editing prompting method provided in the embodiment of the present application.
Possibly, the computer-executable instructions in the embodiments of the present application may also be referred to as application program code, which is not specifically limited in the embodiments of the present application.
In a specific implementation, the processor 1501 may include one or more CPUs, for example, CPU0 and CPU1 in fig. 15.
In a specific implementation, the electronic device may include a plurality of processors, for example, the processor 1501 and the processor 1505 in fig. 15. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores configured to process data (for example, computer program instructions).
Exemplarily, fig. 16 is a schematic structural diagram of a chip provided in an embodiment of the present application. Chip 160 includes one or more (including two) processors 1610 and a communication interface 1630.
In some embodiments, memory 1640 stores the following elements: an executable module or a data structure, or a subset thereof, or an expanded set thereof.
In an embodiment of the present application, memory 1640 may comprise read-only memory and random access memory and provides instructions and data to processor 1610. A portion of memory 1640 may also include non-volatile random access memory (NVRAM).
In the illustrated embodiment, the processor 1610, the communication interface 1630, and the memory 1640 are coupled together through a bus system 1620. The bus system 1620 may include a power bus, a control bus, a status signal bus, and the like, in addition to a data bus. For ease of description, the various buses are labeled in fig. 16 as the bus system 1620.
The methods described in the embodiments of the present application may be applied to the processor 1610 or implemented by the processor 1610. The processor 1610 may be an integrated circuit chip having signal processing capabilities. In an implementation process, the steps of the above method may be completed by an integrated logic circuit of hardware in the processor 1610 or by instructions in the form of software. The processor 1610 may be a general-purpose processor (for example, a microprocessor or a conventional processor), a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or another programmable logic device, a discrete gate, a transistor logic device, or a discrete hardware component, and the processor 1610 may implement or execute the methods, steps, and logical blocks disclosed in the embodiments of the present application.
The steps of the method disclosed with reference to the embodiments of the present application may be directly performed by a hardware decoding processor, or performed by a combination of hardware and software modules in a decoding processor. The software module may be located in a storage medium mature in the art, such as a random access memory, a read-only memory, a programmable read-only memory, or an electrically erasable programmable read-only memory (EEPROM). The storage medium is located in the memory 1640, and the processor 1610 reads information in the memory 1640 and completes the steps of the above method in combination with its hardware.
In the above embodiments, the instructions stored by the memory for execution by the processor may be implemented in the form of a computer program product. The computer program product may be written in the memory in advance, or may be downloaded in the form of software and installed in the memory.
The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions according to the embodiments of the present application are all or partially generated. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another computer-readable storage medium. For example, the computer instructions may be transmitted from one website, computer, server, or data center to another website, computer, server, or data center in a wired manner (for example, coaxial cable, optical fiber, or digital subscriber line (DSL)) or a wireless manner (for example, infrared, radio, or microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device, such as a server or a data center, integrating one or more available media. For example, the available media may include magnetic media (for example, floppy disks, hard disks, or magnetic tapes), optical media (for example, digital versatile discs (DVDs)), or semiconductor media (for example, solid state disks (SSDs)).
The embodiment of the application also provides a computer readable storage medium. The methods described in the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Computer-readable media may include computer storage media and communication media, and may include any medium that can communicate a computer program from one place to another. A storage medium may be any target medium that can be accessed by a computer.
As one possible design, the computer-readable medium may include a compact disc read-only memory (CD-ROM), RAM, ROM, EEPROM, or other optical disk storage; the computer-readable medium may include a disk memory or other disk storage device. Also, any connecting line may properly be termed a computer-readable medium. For example, if the software is transmitted from a website, server, or other remote source using a coaxial cable, a fiber optic cable, a twisted pair, a DSL, or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, the fiber optic cable, the twisted pair, the DSL, or the wireless technologies such as infrared, radio, and microwave are included in the definition of medium. Disk and disc, as used herein, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
Combinations of the above should also be included within the scope of computer-readable media.

The above description is only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any change or replacement readily figured out by a person skilled in the art within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (14)

1. An image editing prompting method, comprising:
an electronic device displays an image editing interface, wherein the image editing interface displays an image to be edited;
the electronic device acquires first contour data of each object in a plurality of objects included in the image to be edited;
the electronic device changes, in response to a first deformation operation by a user on a first region of a first object in the plurality of objects, a shape of an image region comprising the first region;
the electronic device acquires second contour data of each object in the image to be edited after the first deformation operation;
the electronic device determines, according to the second contour data and the first contour data, that a second region of a second object is deformed before and after the first deformation operation, wherein the second object is located near the first object, and the second region is adjacent to the first region;
the electronic device displays first prompt information near the second region of the second object, wherein the first prompt information is used for indicating that the second region is deformed; or, the first prompt information is used for indicating that the second region of the second object restores the shape it had before the first deformation operation.
2. The method of claim 1, wherein the first object is a person, the first region is a face region, and the second object is a non-person object; or, the first object is a person, and the second object is a person.
3. The method of claim 1, wherein after the electronic device displays the first prompt information near the second region of the second object, the method further comprises:
the electronic device displays the second object with the first contour data of the second object in response to a repair operation by the user on the second region of the second object.
4. The method of claim 3, wherein the displaying, by the electronic device in response to a repair operation by the user on the second region of the second object, the second object with the first contour data of the second object comprises:
the electronic device displays, in response to a trigger operation by the user on the second region of the second object, a first repair button in the vicinity of the second region of the second object;
the electronic device displays, in response to a trigger operation by the user on the first repair button, the second object with the first contour data of the second object.
5. The method of claim 3, wherein there are a plurality of second objects, and the displaying, by the electronic device in response to a repair operation by the user on the second region of the second object, the second object with the first contour data of the second object comprises:
the electronic device displays a second repair button on the image editing interface;
the electronic device displays, in response to a trigger operation by the user on the second repair button, each second object with the first contour data of the plurality of second objects.
6. The method of claim 1, wherein the first prompt information is used for indicating that the second region is deformed, and after the electronic device displays the first prompt information near the second region of the second object, the method further comprises:
the electronic device cancels the display of the first prompt information in response to an operation by the user indicating that the second region of the second object is not to be repaired.
7. The method of claim 1, wherein there are a plurality of second regions, and the displaying, by the electronic device, first prompt information near the second region of the second object comprises:
the electronic device calculates an importance degree of each of the plurality of second regions;
the electronic device ranks the importance degrees of the plurality of second regions;
the electronic device displays, according to the ranking result, first prompt information near the second region of each second object, wherein the first prompt information displayed near different second regions carries different marks.
8. The method of claim 1, wherein after the electronic device displays the first prompt information near the second region of the second object, the method further comprises:
the electronic device changes, in response to a second deformation operation by the user on a third region of the first object in the plurality of objects, a shape of an image region comprising the third region;
the electronic device acquires third contour data of each object in the image to be edited after the second deformation operation;
the electronic device determines, according to the third contour data and the first contour data, that a fourth region of a third object is deformed before and after the first deformation operation, wherein the third object is located near the first object, and the fourth region is adjacent to the third region;
the electronic device displays second prompt information near the fourth region of the third object, wherein the second prompt information is used for indicating that the fourth region is deformed; or, the second prompt information is used for indicating that the fourth region of the third object restores the shape it had before the first deformation operation.
9. The method of claim 1, wherein the determining, according to the second contour data and the first contour data, that the second region of the second object is deformed before and after the first deformation operation comprises:
the electronic device calculates a deformation degree value of the second contour data of the second object relative to the first contour data of the second object;
the electronic device determines, in a case where the deformation degree value is greater than a deformation threshold, that the second region of the second object is deformed before and after the first deformation operation.
10. The method of claim 9, wherein before the determining that the second region of the second object is deformed before and after the first deformation operation, the method further comprises:
the electronic device calculates a first distance between a geometric center of the second object and a geometric center of the first object, and looks up the deformation threshold in a pre-stored first mapping relation table according to the first distance, wherein in the pre-stored first mapping relation table, first distances are in one-to-one correspondence with, and positively correlated with, deformation thresholds;
or, the electronic device calculates an area of the second object, and looks up the deformation threshold in the pre-stored first mapping relation table according to the area, wherein in the pre-stored first mapping relation table, areas are in one-to-one correspondence with, and positively correlated with, deformation thresholds;
or, the electronic device calculates a second distance between the geometric center of the second object and an edge of the image, and looks up the deformation threshold in the pre-stored first mapping relation table according to the second distance, wherein in the pre-stored first mapping relation table, second distances are in one-to-one correspondence with, and negatively correlated with, deformation thresholds.
11. The method of claim 9,
the deformation degree value is the difference value of the area of the second contour data and the area of the first contour data;
or, the deformation degree value is a difference between a perimeter of the second contour data and a perimeter of the first contour data.
12. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image editing prompt method of any one of claims 1 to 11.
13. A computer-readable storage medium, wherein when instructions in the computer-readable storage medium are executed by a processor of an electronic device, the electronic device is enabled to perform the image editing prompting method of any one of claims 1 to 11.
14. A computer program product comprising a computer program or instructions which, when executed by a processor, implements the image editing prompting method of any one of claims 1 to 11.
CN202110693749.9A 2021-06-22 2021-06-22 Image editing prompting method and device, electronic equipment and readable storage medium Pending CN114004732A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110693749.9A CN114004732A (en) 2021-06-22 2021-06-22 Image editing prompting method and device, electronic equipment and readable storage medium

Publications (1)

Publication Number Publication Date
CN114004732A true CN114004732A (en) 2022-02-01

Family

ID=79921033

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110693749.9A Pending CN114004732A (en) 2021-06-22 2021-06-22 Image editing prompting method and device, electronic equipment and readable storage medium

Country Status (1)

Country Link
CN (1) CN114004732A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114913549A (en) * 2022-05-25 2022-08-16 北京百度网讯科技有限公司 Image processing method, apparatus, device and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140270544A1 (en) * 2013-03-14 2014-09-18 Cyberlink Corp. Image Editing Method and System
CN108287872A (en) * 2017-12-28 2018-07-17 百度在线网络技术(北京)有限公司 A kind of building change detecting method, device, server and storage medium
CN108492246A (en) * 2018-03-12 2018-09-04 维沃移动通信有限公司 A kind of image processing method, device and mobile terminal
CN108875780A (en) * 2018-05-07 2018-11-23 广东省电信规划设计院有限公司 The acquisition methods and device of difference object between image based on view data
CN109859211A (en) * 2018-12-28 2019-06-07 努比亚技术有限公司 A kind of image processing method, mobile terminal and computer readable storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination