CN115619902A - Image processing method, device, equipment and medium - Google Patents

Image processing method, device, equipment and medium

Info

Publication number
CN115619902A
CN115619902A
Authority
CN
China
Prior art keywords
ith
resource
material resource
image processing
material resources
Prior art date
Legal status
Pending
Application number
CN202110811845.9A
Other languages
Chinese (zh)
Inventor
卢欣琪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd filed Critical Tencent Technology Shenzhen Co Ltd
Priority to CN202110811845.9A priority Critical patent/CN115619902A/en
Publication of CN115619902A publication Critical patent/CN115619902A/en
Pending legal-status Critical Current

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00: 2D [Two Dimensional] image generation
    • G06T11/60: Editing figures and text; Combining figures or text
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00: Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/50: Information retrieval; Database structures therefor; File system structures therefor of still image data
    • G06F16/58: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually
    • G06F16/583: Retrieval characterised by using metadata, e.g. metadata not derived from the content or metadata generated manually, using metadata automatically derived from the content

Abstract

The embodiments of this application disclose an image processing method, apparatus, device, and medium. The method includes: displaying an image processing interface in which a target image is displayed, the target image comprising N objects, where N is a positive integer; and, based on the object features of the N objects and the processing area associated with each of the N objects in the target image, displaying N material resources overlaid on the target image, the N material resources corresponding one-to-one to the N objects. The method can process multiple objects in an image at once, is simple and convenient to operate, and improves the convenience of image processing.

Description

Image processing method, device, equipment and medium
Technical Field
The present application relates to the field of computer technologies, in particular to image processing technologies, and more specifically to an image processing method, an image processing apparatus, an image processing device, and a computer-readable storage medium.
Background
With the rapid development of image processing technology within computer technology, image processing methods have become richer and more interesting. For example, a user can add material resources to an image, such as adding virtual decorations to a person in the image, attaching an animated-character image to a person in the image, or applying a mosaic to objects in the image that should not be revealed. Practice shows, however, that conventional image processing must be completed manually by the user, and the operation flow is complex, highly repetitive, and inconvenient.
Disclosure of Invention
The embodiment of the application provides an image processing method, an image processing device, an image processing apparatus and an image processing medium, which can simplify an image processing flow and improve convenience of image processing.
In one aspect, an embodiment of the present application provides an image processing method, including:
displaying an image processing interface, wherein a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer;
based on the object characteristics of the N objects and the processing area associated with each object in the N objects in the target image, superposing and displaying N material resources in the target image; and the N material resources correspond to the N objects one to one.
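The claimed flow (detect N objects, match each one-to-one to a material resource by feature similarity, and record where each resource should be overlaid) might be sketched as follows. This is a minimal illustrative sketch, not the patent's actual implementation: the numeric feature vectors, the cosine-similarity measure, and all names are assumptions.

```python
# Hypothetical sketch: match each object to exactly one material resource by
# comparing object features to resource features, then record where each
# resource should be overlaid. Assumes at least as many resources as objects.

def cosine(a, b):
    """Toy similarity between two equal-length numeric feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(x * x for x in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def match_resources(objects, resources):
    """objects: list of dicts with 'features' and 'region'.
    resources: list of dicts with 'features' and 'name'.
    Returns one (resource name, region) pair per object."""
    placements = []
    used = set()
    for obj in objects:
        best, best_score = None, -1.0
        for idx, res in enumerate(resources):
            if idx in used:
                continue  # keep the object-to-resource mapping one-to-one
            score = cosine(obj["features"], res["features"])
            if score > best_score:
                best, best_score = idx, score
        used.add(best)
        placements.append((resources[best]["name"], obj["region"]))
    return placements
```

Each returned pair says which resource to overlay at which processing region; the actual overlay step is a separate rendering concern.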
In one aspect, an embodiment of the present application provides an image processing apparatus, including:
the display unit is used for displaying an image processing interface, a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer;
the processing unit is used for displaying N material resources in the target image in an overlapping mode based on the object characteristics of the N objects and the processing area, associated with each object in the N objects, in the target image; and the N material resources correspond to the N objects one to one.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; the corresponding means that: the object characteristics of the ith object are matched with the resource characteristics of the ith material resource; i is a positive integer and i is less than or equal to N; the processing unit is specifically configured to:
and at the processing area of the ith object in the target image, overlapping and displaying the ith material resource.
In one implementation, the transparency of the ith material resource is less than a transparent threshold; when the ith material resource is displayed in the target image in an overlapping mode, the ith material resource shields the processing area of the ith object in the target image.
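The occlusion behavior described here (a near-opaque resource hides the processing area beneath it) can be illustrated with a toy per-pixel compositing sketch. The pixel model, the 0.05 threshold, and the function names are assumptions for illustration only.

```python
# Illustrative alpha-compositing sketch: when a material resource's
# transparency is below some threshold, compositing it over the processing
# area leaves none of the original pixels visible.

def composite_pixel(base, overlay, alpha):
    """Blend one RGB pixel; alpha = 1.0 means the overlay fully occludes."""
    return tuple(round(o * alpha + b * (1 - alpha)) for b, o in zip(base, overlay))

def overlay_region(image, region, sticker, transparency, transparent_threshold=0.05):
    """image: dict (x, y) -> RGB; region: iterable of (x, y) coordinates.
    If the sticker's transparency is below the threshold it is treated as
    opaque and simply replaces the covered pixels."""
    alpha = 1.0 - transparency
    for xy in region:
        if transparency < transparent_threshold:
            image[xy] = sticker          # fully occludes the processing area
        else:
            image[xy] = composite_pixel(image[xy], sticker, alpha)
    return image
```

Pixels outside the processing region are untouched, matching the claim that only the associated area is occluded.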
In one implementation, N material resources are located in a target material resource set; the processing unit is further configured to:
and matching N material resources for the N objects from the target material resource set based on the object characteristics of the N objects.
In one implementation mode, the image processing interface is associated with a material library, the material library comprises a plurality of material resource sets, and each material resource set contains a plurality of material resources;
the target material resource set comprises any one of the following: the target material resource set is any set randomly selected from the material library; or the target material resource set is any set in the material library, wherein the heat value is higher than the heat threshold value; or the target material resource set is a set with the highest use heat value in the material library; or the target material resource set is a set selected by a request user of image processing in the material library; alternatively, the target material resource set is a set in the material library that is adapted to the user habits of the requesting user.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the target material resource set comprises a first sub-set and a second sub-set, wherein the first sub-set contains material resources of a first type, and the second sub-set contains material resources of a second type; the processing unit is further configured to:
if the ith material resource is the material resource of the first type, outputting prompt information, wherein the prompt information is used for prompting that the ith object is successfully matched with the ith material resource.
In one implementation, the prompt information includes prompt text or prompt audio; the processing unit may be specifically configured to:
if the prompt information comprises a prompt text, displaying the prompt text in a non-image display area of the image processing interface; or displaying a prompt text around a processing area associated with the ith object in the target image;
and if the prompt information comprises prompt audio, playing the prompt audio.
In one implementation, the hint information includes a hint animation, and the processing unit is further configured to:
playing a prompt animation in the image processing interface; or,
displaying a playing window on the image processing interface and playing the prompt animation in the playing window; or,
displaying a floating window on the image processing interface and playing the prompt animation in the floating window;
after the prompt animation finishes playing, displaying the ith material resource overlaid in the target image; the end of playback means either that the prompt animation has played to completion or that its playing duration has reached a duration threshold.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the ith object is a person, the object features of the ith object comprise action features and biological features, and the action features comprise action amplitude features and action angle features; the biometric features include gender features, facial features, and head features; the target material resource set comprises a first subset and a second subset, at least one first type of material resource and the resource characteristics of each first type of material resource are recorded in the first subset, and at least one second type of material resource and the resource characteristics of each second type of material resource are recorded in the second subset; the processing unit is further configured to:
calculating a first matching degree between the resource characteristics of each first type material resource in the first subset and the action characteristics of the ith object;
and if the material resources with the first matching degree larger than the first matching threshold exist in the first subset, determining the material resources with the first matching degree larger than the first matching threshold as the ith material resource corresponding to the ith object.
In one implementation, the processing unit is further configured to:
if the material resources with the first matching degree larger than the first matching threshold do not exist in the first subset, calculating a second matching degree between the resource characteristics of each second type of material resource in the second subset and the biological characteristics of the ith object;
and if the material resources with the second matching degree larger than the second matching threshold exist in the second subset, determining the material resources with the second matching degree larger than the second matching threshold as the ith material resource corresponding to the ith object.
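The two-stage matching above (action features against the first subset, then a fallback to biological features against the second subset) can be sketched as follows. The similarity measure and the 0.8 thresholds are placeholders, not values from the application.

```python
# Sketch of the two-stage match: try action features against the first
# subset; if nothing clears the first threshold, fall back to biological
# features against the second subset.

def similarity(a, b):
    """Toy matching degree in [0, 1] over equal-length numeric features."""
    diff = sum(abs(x - y) for x, y in zip(a, b)) / len(a)
    return max(0.0, 1.0 - diff)

def match_two_stage(action_feat, bio_feat, first_subset, second_subset,
                    first_threshold=0.8, second_threshold=0.8):
    for res in first_subset:
        if similarity(action_feat, res["features"]) > first_threshold:
            return res["name"], "first"   # Easter-egg resource: prompt the user
    for res in second_subset:
        if similarity(bio_feat, res["features"]) > second_threshold:
            return res["name"], "second"  # ordinary resource: no prompt
    return None, None
```

The returned subset label lets the caller decide whether to output the prompt information described for first-type resources.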
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; the processing unit is further configured to:
if the ith object is matched with M material resources from the target material resource set, selecting one material resource from the M material resources to determine the ith material resource;
wherein M is a positive integer, and the selecting includes any of: random selection; selection in descending order of matching degree; or selection in descending order of weight.
In one implementation, the selecting includes selecting in descending order of weight, and the processing unit is further configured to:
outputting a guess prompt, which prompts the user to guess among the M material resources;
obtaining a guess result, which includes the number of times each of the M material resources was guessed;
setting the weight of each of the M material resources according to the guess result, where the more times a material resource is guessed, the greater its weight.
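The guess-based weighting described above might look like the following sketch, in which each candidate's weight is proportional to how often it was guessed and the highest-weight candidate is selected first; the tie-breaking and normalization are assumptions.

```python
# Illustrative weighting step for the "guess" flow: each of the M candidate
# resources gets a weight proportional to how often users guessed it, and
# the most-guessed candidate becomes the i-th material resource.
from collections import Counter

def weight_by_guesses(candidates, guesses):
    """candidates: list of resource names; guesses: list of guessed names."""
    counts = Counter(g for g in guesses if g in candidates)
    total = sum(counts.values()) or 1
    weights = {name: counts.get(name, 0) / total for name in candidates}
    # select in order of weight, highest first
    chosen = max(candidates, key=lambda n: weights[n])
    return weights, chosen
```

With ties, `max` keeps the first candidate in list order; the application does not specify a tie-break.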
In one implementation, the processing unit is further configured to:
after the ith material resource is displayed in the target image in a superposed manner, if a switching trigger event aiming at the ith material resource exists, removing the ith material resource from the target image;
acquiring a kth material resource from the M material resources, and displaying the kth material resource in a superposition manner in a processing area of the ith object in the target image; k is a positive integer, k is less than or equal to M, and k is not equal to i;
wherein the switching trigger event comprises either: an event generated by calling up a menu in the display area of the ith material resource and selecting a switch option in that menu; or an event generated by performing a switch trigger operation in the display area of the ith material resource.
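The switching flow (remove the ith resource on a switch trigger, then overlay a kth candidate with k not equal to i) can be sketched as follows; the "next unused candidate" policy is an assumption, since the application only requires that the replacement differ from the current resource.

```python
# Minimal sketch of the switch flow: on a switch trigger for the current
# resource, remove it from the overlay map and show another of the M
# matched candidates in the same processing region instead.

def switch_resource(overlays, region, candidates, current):
    """overlays: dict region -> resource name currently displayed.
    candidates: the M resources matched for this object."""
    assert overlays[region] == current
    del overlays[region]                  # remove the current resource
    for cand in candidates:
        if cand != current:
            overlays[region] = cand       # overlay the replacement resource
            return cand
    return None                           # no alternative available
```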
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; the processing unit is further configured to:
after the ith material resource is displayed in the target image in a superposed manner, if an editing trigger event aiming at the ith material resource exists, editing the ith material resource;
the editing trigger event comprises an event of dragging the ith material resource, and the editing comprises changing the display position of the ith material resource in the target image according to the drag; or,
the editing trigger event comprises an event generated by performing a zoom operation on the ith material resource, and the editing comprises adjusting the size of the display area occupied by the ith material resource in the target image according to the zoom; or,
the editing trigger event comprises an event of stopping the display of the ith material resource, and the editing comprises removing the ith material resource from the target image.
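The three editing triggers (drag, zoom, stop-display) might be handled by a single dispatcher over a placed resource, as in this sketch; the event shape and the placement record are illustrative assumptions.

```python
# Sketch of the three edit triggers as one dispatcher over a placed
# material resource's mutable placement record.

def apply_edit(placement, event):
    """placement: {"pos": (x, y), "size": (w, h), "visible": bool}."""
    if event["type"] == "drag":
        dx, dy = event["delta"]
        x, y = placement["pos"]
        placement["pos"] = (x + dx, y + dy)     # move with the drag
    elif event["type"] == "zoom":
        w, h = placement["size"]
        s = event["scale"]
        placement["size"] = (round(w * s), round(h * s))
    elif event["type"] == "stop":
        placement["visible"] = False            # remove from the target image
    return placement
```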
In one implementation, a sharing entry is arranged in the image processing interface; the display unit is further configured to:
when the sharing entry is triggered, displaying a sharing object list;
the processing unit is further configured to: when a target sharing object in the sharing object list is selected, sharing the target image on which the N material resources are superposed and displayed to the target sharing object.
In one implementation, the object is a person; the treatment area includes areas of various parts of a person, including any of: head region, face region, limb region and body region; any two of the N objects are represented as an ith object and a jth object, i and j are positive integers, i is less than or equal to N, and j is less than or equal to N;
the processing area associated with the ith object in the target image and the processing area associated with the jth object in the target image are areas of the same part or areas of different parts.
In one aspect, an embodiment of the present application provides an image processing apparatus, including:
a processor adapted to execute a computer program;
a computer-readable storage medium, in which a computer program is stored which, when executed by a processor, implements the image processing method as described above.
In one aspect, the present application provides a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program, and the computer program is adapted to be loaded by a processor and execute the image processing method described above.
In one aspect, embodiments of the present application provide a computer program product or a computer program, which includes computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the image processing method described above.
In the embodiments of this application, the target image displayed on the image processing interface includes a plurality of objects; a material resource is matched to the ith object by acquiring the object features of the ith object, and the matched material resource is displayed overlaid at the processing area associated with the ith object in the target image. In this way, image processing can be applied to multiple objects at once, which simplifies the operation flow and improves the convenience of image processing.
Drawings
To describe the technical solutions in the embodiments of this application more clearly, the drawings needed for describing the embodiments are briefly introduced below. Evidently, the drawings described below show only some embodiments of this application, and a person of ordinary skill in the art may derive other drawings from them without creative effort.
FIG. 1 is a block diagram of an image processing system according to an exemplary embodiment of the present application;
FIG. 2 is a schematic illustration of a target image provided by an exemplary embodiment of the present application;
FIG. 3a is a schematic diagram of an application scenario provided by an exemplary embodiment of the present application;
FIG. 3b is a schematic diagram of another application scenario provided in an exemplary embodiment of the present application;
FIG. 4 is a flowchart illustrating an image processing method according to an exemplary embodiment of the present application;
fig. 5a is a schematic flowchart of a method for displaying material resources in an overlapping manner at the same position of different objects according to an exemplary embodiment of the present application;
FIG. 5b is a schematic flow chart illustrating a process of displaying material resources in different parts of different objects in an overlapping manner according to an exemplary embodiment of the present application;
FIG. 6a is a schematic flowchart illustrating a process of dragging an ith material resource according to an exemplary embodiment of the present application;
FIG. 6b is a flowchart illustrating a method for stopping displaying an ith material resource according to an exemplary embodiment of the present application;
fig. 6c is a schematic flowchart of a process of narrowing down the ith material resource according to an exemplary embodiment of the present application;
FIG. 6d is a schematic flowchart illustrating an enlargement process for the ith material resource according to an exemplary embodiment of the present application;
fig. 6e is a schematic flowchart of sharing, overlapping and displaying an image with N material resources according to an exemplary embodiment of the present application;
FIG. 7a is a flow diagram illustrating a process for determining a set of target material resources provided by an exemplary embodiment of the present application;
FIG. 7b is a schematic illustration of a prompt message display location provided by an exemplary embodiment of the present application;
FIG. 8 is a flowchart illustrating matching an ith material resource to an ith object according to an exemplary embodiment of the present application;
FIG. 9 is a schematic representation of a character action and skeleton node provided by an exemplary embodiment of the present application;
FIG. 10 is a diagram illustrating a method for matching assets according to an exemplary embodiment of the present application;
FIG. 11 is a diagrammatic illustration of a method of outputting a guess prompt in accordance with an exemplary embodiment of the present application;
FIG. 12a is a schematic diagram of a process for switching material resources according to an exemplary embodiment of the present application;
FIG. 12b is a schematic diagram of a process for switching material resources according to an exemplary embodiment of the present application;
FIG. 13 is a schematic flow chart diagram illustrating yet another image processing method provided by an exemplary embodiment of the present application;
FIG. 14a is a schematic diagram illustrating a method for triggering display of an image processing interface according to an exemplary embodiment of the present application;
FIG. 14b is a schematic diagram of a prompt message output based on a target set of material assets and a target image according to an exemplary embodiment of the present application;
FIG. 14c is a schematic diagram of still another method for outputting a prompt based on a target set of material assets and a target image according to an exemplary embodiment of the present application;
FIG. 14d is a diagrammatic illustration of a target material resource set as provided by an exemplary embodiment of the present application;
FIG. 15 is a diagram illustrating a method for determining a processing region and label identification information of an ith object according to an exemplary embodiment of the present application;
fig. 16 is a schematic diagram of a standby material resource according to an exemplary embodiment of the present application;
fig. 17 is a schematic flowchart of an angle adjustment of material resources according to an exemplary embodiment of the present application;
fig. 18 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present application;
fig. 19 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below clearly and completely with reference to the accompanying drawings in the embodiments of the present application, and it is obvious that the embodiments described in the present application are only some embodiments of the present application, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
Reference will now be made to terms involved in embodiments of the present application:
1. Image processing interface:
The image processing interface is a user interface for performing image processing. In it, a user can display a target image to be processed, apply processing such as adding material resources to the target image or adjusting its size or color, and preview or share the processed target image.
2. Image:
the image is a generic term for various figures and images. The images may include, but are not limited to, photographs, drawings, clip art, maps, frame images in video, movie pictures, and the like.
3. Object:
an object refers to something in an image; objects may include, but are not limited to: humans, animals (e.g., cats, dogs, etc.), plants (e.g., trees, mountains, flowers, etc.), articles (e.g., books, balls, clothing, etc.), and so forth. For convenience of explanation, the following examples of the present application are all exemplified by taking a subject as a person.
4. Object characteristics:
An object feature is information representing a characteristic of an object, and each object has its own object features. For example, when the object is a person, the object features may include, but are not limited to, action features and biological features. Action features may include motion-amplitude features (such as arm-lifting amplitude or leg-striding amplitude) and motion-angle features (such as palm rotation angle or body inclination angle). Biological features may include gender features (male or female), facial features (such as facial contours or facial expressions), and head features (such as head rotation angle, head offset angle, hairstyle, or hair color). As another example, when the object is a plant, the object features may include color features (such as red, white, or red-white mixing), shape features (such as sawtooth, drop, or fan shapes), and growth-environment features (such as humidity, shade, or sufficient light). For convenience of explanation, the following embodiments take the object as a person, with the object features including action features, biological features, or both.
5. A processing area:
The processing area is a region in the target image where image processing needs to be performed. Processing areas are associated with objects, and one object may be associated with one or more processing areas. For example, if an object is a person, its associated processing area may be a face region, head region, limb region, or body region in the image. Different objects may have the same or different associated processing areas. For example, if the target image includes two persons A and B, the processing area associated with A may be A's face region in the target image and the processing area associated with B may likewise be B's face region; in that case both A's face region and B's face region need to be processed (for example, each face is occluded with some material resource). As another example, the processing area associated with A may be A's face region while the processing area associated with B is B's body region; in that case A's face region is processed (for example, occluded with a material resource) and B's body region is processed (for example, re-dressed using a material resource).
6. Material resources:
Material resources are resources that can be overlaid onto the target image. Divided by resource type, material resources can include, but are not limited to, icons, photos, animations, and text. Divided by spatial dimension, they can include 2D (two-dimensional) material resources (such as comic-character stickers, 2D plant stickers, or 2D animal stickers) and 3D (three-dimensional) material resources (such as 3D animated avatars, 3D hairstyles, or 3D pendants). In the embodiments of this application, material resources can further be divided by importance into a first type and a second type. First-type material resources are those that should draw the user's special attention; they can be understood as "Easter-egg" resources that can surprise the user, such as classic cartoon characters, frequently used cartoon characters, or highly influential cartoon characters. Second-type material resources are ordinary resources that need no special attention from the user, such as minor bit-part characters in a cartoon or cartoon characters with low popularity.
In practical applications, the user may be reminded to pay attention to a first-type material resource through prompt information, which may include, but is not limited to: prompt text (such as "Congratulations on unlocking character XX"), prompt audio (such as a classic line spoken by the character corresponding to the material resource, or background music from the film or television work in which that character appears), or a prompt animation (such as a classic film or television clip featuring that character).
7. Resource characteristics:
the resource characteristics refer to information for representing characteristics of material resources; resource characteristics include, but are not limited to: gender characteristics, material offset angle characteristics (such as garment offset angle, cartoon head portrait offset angle, hair accessory offset angle, and the like), action characteristics (such as action of cartoon characters), and the like.
An embodiment of the present application provides an image processing system, which can be exemplarily shown in fig. 1, and includes: the terminal device 10 and the server 11, and a communication connection is established between the terminal device 10 and the server 11. The terminal device 10 may include, but is not limited to: smart phones, tablet computers, notebook computers, desktop computers, smart televisions, and the like; a variety of clients (APPs) may run in the terminal device 10, such as a multimedia playing client, a social client, a browser client, an information flow client, an education client, and so on. Applications may also be run in the terminal device 10, which may include but are not limited to: instant messaging applications, social applications, photo applications, and the like. The server 11 may be an independent physical server, a server cluster or a distributed system formed by a plurality of physical servers, a cloud server providing basic cloud computing services such as cloud service, a cloud database, cloud computing, a cloud function, cloud storage, network service, cloud communication, middleware service, domain name service, security service, content Delivery Network (CDN), big data, and an artificial intelligence platform, and the server 11 may also be a backend server corresponding to an application program (or a client) running in the terminal device 10, and the like, which is not limited in the present application.
In one embodiment, based on the system shown in fig. 1, the image processing method provided in the embodiments of this application may be executed by the terminal device 10 in cooperation with the server 11. The specific flow may be as follows: (1) The terminal device 10 displays an image processing interface showing a target image, which may be an image uploaded to the interface by the user or an image captured in real time through the interface. The target image may include N objects of the same or different types; for example, all N objects may be people, or some may be people and some animals. Illustratively, the target image may be as shown in fig. 2, which includes an object 201, an object 202, an object 203, and an object 204, where object 201, object 202, and object 203 are all people and object 204 is a plant. (2) The terminal device 10 may send the target image to the server 11, and the server 11 matches a material resource for each of the N objects in the target image; the server 11 then returns the matched material resources to the terminal device 10, which displays each matched resource overlaid at the processing area associated with the corresponding object in the target image. Alternatively, after matching material resources to the N objects, the server 11 may itself superimpose the material resources onto the processing areas associated with the objects and return the composited target image, so that the terminal device 10 only needs to display it.
In this embodiment, the image processing load can be executed uniformly by a high-performance server, which is advantageous for reducing the processing load of the terminal device.
In another embodiment, based on the system shown in fig. 1, the image processing method proposed in the embodiment of the present application may also be executed by the terminal device 10 independently. In this case, the specific flow of the image processing method may be as follows: (1) displaying an image processing interface in the terminal device 10, the image processing interface displaying a target image; (2) the terminal device 10 may match material resources for each object according to the object features of the N objects in the target image, and display the matched material resources in an overlapping manner in the processing area associated with each object in the target image. This embodiment avoids interaction between the terminal device 10 and the server 11, and improves the efficiency of image processing.
The image processing method provided by the present application can be applied to a variety of image processing scenarios; the following describes the image processing method provided in the embodiment of the present application in detail with reference to two example scenarios.
(1) A photo taking scene.
The image processing interface may be a photo taking interface in a terminal device (e.g., a smart phone), and the target image may be a preview image (e.g., a person image, a landscape image, etc.) displayed in the photo taking interface. The preview image may be understood as: the picture captured by the photo taking interface before the user triggers the shooting button; this picture is not frozen in the photo taking interface, and can change in real time as the smart phone lens moves. Illustratively, taking the preview image in the photo taking interface as a person image, the specific application mode is as follows: the user may enter the photo taking interface by opening a camera program in the smart phone, a preview image to be shot may be displayed in the photo taking interface, and the smart phone may perform feature recognition on each person appearing in the preview image to obtain the object features of each person; then, material resources are matched for each person based on the object features of each person. A material library may be stored in advance in the storage space of the smart phone, and the smart phone may match material resources for each person from this pre-stored material library; certainly, the smart phone may also pull material resources in real time from the network or a cloud material library to match each person, which is not specifically limited in the present application. After matching material resources for each person, the smart phone displays the matched material resources in an overlapping manner at the processing area associated with each person in the preview image. Referring to fig. 3a, fig. 3a is an application schematic diagram of a photo taking scene according to an exemplary embodiment of the present application. When the smart phone recognizes that a person exists in the preview image 311 of the photo taking interface 31, the smart phone matches a corresponding material resource (for example, the hair accessory resource shown in fig. 3a) for the person, and displays the matched material resource in the processing area (for example, the head area in fig. 3a) corresponding to the person in the preview image 311, so as to obtain a preview image with the material resource superimposed as shown in 321. It should be noted that when the position of the person in the preview image changes, the material resource changes along with the position change of the person, so that the material resource is always displayed in the head area of the person; in addition, when the angle of the person (e.g., the face angle) in the preview image changes, the display angle of the material resource also changes accordingly. It can be understood that, when the user clicks the shooting key to shoot an image, the shot freeze-frame image is the preview image on which the material resources are displayed in an overlapping manner.
(2) A scene of using an application program to perform occlusion processing on objects in an image.
Referring to fig. 3b, an image processing entry 331 is disposed in the conversation window 33 of the social application program, and selecting the image processing entry 331 can trigger display of the image selection interface 34; the user may then determine the target image 341 to be processed in the image selection interface 34 and click the OK button 342 to trigger display of the image processing interface 35. Further, the selected target image 341 may be displayed in the image processing interface 35, and the target image 341 includes the object 3411 and the object 3412. The image processing interface 35 is further provided with an operation control (for example, the face blocking control 351 in the image processing interface 35). When the user selects the face blocking control 351 in the image processing interface 35, the social application program, or the terminal device on which it runs, may identify the object features of the plurality of objects in the target image 341, match material resources for each object, and then superimpose the matched material resources on the face regions in the target image 341 to implement face blocking, so as to display the target image 36 with the material resources superimposed in the image processing interface 35. Further, the user may also share the target image 36, i.e., the target image superimposed with the material resources, with other users in the social application.
The image processing method provided by the embodiment of the application can automatically and uniformly process each object in the target image, including face occlusion processing, outfit-changing processing, and the like, thereby simplifying the image processing flow and improving the user experience.
The following describes in detail the image processing method provided in the embodiment of the present application, and it should be noted that the image processing method may be executed by an image processing device, where the image processing device may be a terminal device in the image processing system shown in fig. 1, and the terminal device may independently complete the flow of the image processing method; of course, the terminal device may also complete the flow of the image processing method with the assistance of the server shown in fig. 1.
Referring to fig. 4, fig. 4 is a flowchart illustrating an image processing method according to an exemplary embodiment of the present application. The image processing method may include steps S401 to S402:
S401, displaying an image processing interface, wherein a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer.
The image processing interface may be an interface provided by any application; for example, the image processing interface may be an image capture interface in a camera application as shown in fig. 3a, or the image processing interface 35 provided by a social application as shown in fig. 3b. The number of objects included in the target image may be greater than or equal to N. When the number of objects in the target image is exactly N, all the objects in the target image need to be processed; for example, the target image 41 in fig. 4 includes 2 objects in total, each of which has a face image, and if the user needs to perform occlusion processing on all the face images in the target image 41, then N=2. When the number of objects included in the target image is greater than N, only some (i.e., N) objects in the target image need to be processed, and the rest do not. For example, in fig. 2 the target image contains four objects 201-204 in total; if only the people in the target image need to be processed, and the objects 201-203 are all people, then N=3; it can be understood that the object 204, being a plant, does not need processing. In a specific application, the value of N and the selection of the N objects may be determined according to the requirements of the user requesting image processing.
S402, displaying the N material resources in the target image in an overlapping manner, based on the object features of the N objects and the processing area associated with each of the N objects in the target image. The N material resources may correspond one-to-one to the N objects.
The one-to-one correspondence between the N objects and the N material resources means: one object corresponds to one material resource. Any one of the N objects can be denoted as the ith object, where i is a positive integer and i is less than or equal to N; the ith object corresponds to the ith material resource among the N material resources. The correspondence can be understood as follows: the object features of the ith object match the resource features of the ith material resource. That is to say, for the N objects in the target image, the image processing apparatus may match one material resource for each object, and then superimpose and display the matched N material resources in the target image, thereby completing the processing of the N objects. For example, in fig. 4, when the image processing apparatus determines that the value of N is 2, and detects that a user has triggered a display entry (e.g., the display entry 42) of the image processing interface, the image processing apparatus may perform image processing on the 2 objects included in the target image 41; the material resources 43 required for processing the 2 objects thus include 2 material resources, one material resource corresponding to one object, and the image processing apparatus may obtain the image 44 after performing image processing on the target image 41.
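The one-to-one matching of objects to material resources described above can be sketched as follows. This is a minimal illustration only, not the patent's actual algorithm: the feature vectors, the cosine-similarity metric, and all class and function names are hypothetical placeholders for whatever object and resource features the image processing device actually extracts.

```python
from dataclasses import dataclass

@dataclass
class DetectedObject:
    object_id: int
    features: list  # object features extracted from the target image (hypothetical)

@dataclass
class MaterialResource:
    resource_id: int
    features: list  # resource features of the material resource (hypothetical)

def cosine_similarity(a, b):
    """One possible way to score how well object features match resource features."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def match_resources(objects, resources):
    """Return a dict mapping each object's id to its best-matching resource id,
    so that the ith object corresponds to the ith material resource."""
    matches = {}
    for obj in objects:
        best = max(resources, key=lambda r: cosine_similarity(obj.features, r.features))
        matches[obj.object_id] = best.resource_id
    return matches
```

In this sketch every object independently picks its highest-scoring resource; a real system could additionally enforce that two objects do not receive the same resource.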
In the embodiment of the application, for a target image displayed in an image processing interface, N material resources corresponding to N objects (for example, N persons) can be automatically displayed in an overlaid manner in the target image based on object features of the N objects (for example, N persons) in the target image and based on processing areas (for example, face areas) of the N objects in the target image; for example: the method provided by the embodiment of the application can be used for carrying out multi-face portrait recognition on the static multi-person photo, and matching the face blocking image in each face area in the photo based on the recognized recognition information, so that a user does not need to manually drag the face blocking image to the face area. Through the image processing process, the N objects in the target image can be uniformly and automatically processed, the image processing flow is effectively simplified, and the convenience of image processing is improved.
The following describes a specific embodiment of displaying N material resources in a target image in a superimposed manner.
Based on the method embodiment shown in fig. 4, a specific flow in which the image processing apparatus displays the N material resources in the target image in an overlapping manner may include the following step: the image processing apparatus displays the ith material resource in an overlapping manner at the processing area associated with the ith object in the target image, where the transparency of the ith material resource may be smaller than a transparency threshold, so that when the ith material resource is displayed in the target image in an overlapping manner, it occludes the processing area associated with the ith object in the target image. Illustratively, referring to fig. 5a, if the material resource matched by the image processing apparatus according to the object features of the object 51 is the material resource 531, the material resource 531 is displayed in an overlapping manner in the processing area 511; if the material resource matched according to the object features of the object 52 is the material resource 532, the material resource 532 is displayed in an overlapping manner in the processing area 521, finally yielding the target image 54 in which the material resources 531 and 532 are displayed in an overlapping manner.
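The occluding overlay described above can be sketched as simple per-pixel alpha compositing: a low-transparency (high-alpha) material resource fully covers the processing area. This is an illustrative sketch under assumed conventions; the function name, the list-of-RGB-tuples image representation, and the `alpha` parameter are assumptions, not the patent's implementation.

```python
def overlay_resource(target, resource, top, left, alpha=1.0):
    """Alpha-composite `resource` onto `target` at pixel position (top, left).

    `target` and `resource` are 2-D lists of (R, G, B) tuples. An `alpha`
    close to 1.0 (i.e., low transparency) makes the material resource
    occlude the underlying processing area of the target image.
    """
    for r, row in enumerate(resource):
        for c, (pr, pg, pb) in enumerate(row):
            tr, tg, tb = target[top + r][left + c]
            # standard "over" blend: result = alpha * resource + (1 - alpha) * target
            target[top + r][left + c] = (
                round(alpha * pr + (1 - alpha) * tr),
                round(alpha * pg + (1 - alpha) * tg),
                round(alpha * pb + (1 - alpha) * tb),
            )
    return target
```

With `alpha=1.0` the resource pixels replace the target pixels entirely, matching the "occlusion" behavior; smaller alpha values would let the original region show through.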
In one embodiment, assuming that any two objects in the N objects are represented as the ith object and the jth object (i and j are positive integers, i is less than or equal to N, j is less than or equal to N), and the ith object and the jth object are both human, then the processing area associated with the ith object in the target image and the processing area associated with the jth object in the target image may be areas of the same part or areas of different parts; illustratively, with continued reference to fig. 5a, the processing region 511 of the object 51 and the processing region 521 of the object 52 are both header regions, and it can be seen that the processing regions of the two objects are regions of the same location. Referring to fig. 5b, the treatment area of the subject 51 is a body area 512, the treatment area of the subject 52 is a head area 521, and the treatment areas of the two subjects are areas of different parts. Further, assuming that in fig. 5b, the image processing apparatus matches the object 51 to the material resources 533 and matches the object 52 to the material resources 532, after the image processing apparatus superposes and displays the respective matched material resources at the processing areas of the object 51 and the object 52, an image 55 as shown in fig. 5b can be obtained.
In one embodiment, the material resources displayed in a superimposed manner in the target image also support editing. After the ith material resource is displayed in a superimposed manner in the target image, the image processing apparatus may further perform the following operation: if an editing trigger event for the ith material resource exists, the ith material resource may be edited. The editing trigger event may include any one of the following events: (1) An event of dragging the ith material resource. It can be understood that dragging the ith material resource displayed in a superimposed manner in the target image can change its display position; therefore, the edit corresponding to dragging is: changing the display position of the ith material resource in the target image according to the drag. Illustratively, as shown in fig. 6a, if the image processing apparatus displays the material resource 612 in a superimposed manner in the head region of the object included in the target image 611, and there is a drag operation (e.g., a drag to the right) on the material resource 612, the dragged material resource 612 will move following the drag operation (e.g., move to the right), so that the display position of the material resource 612 changes (e.g., as shown in the reference image 613). (2) An event of stopping displaying the ith material resource. The edit corresponding to this editing trigger event may be: removing the ith material resource from the target image. In one embodiment, after the material resource is displayed in a superimposed manner, a stop-display component may be arranged in the display area corresponding to the material resource so that the ith material resource can be removed from the target image. Illustratively, as shown in fig. 6b, a stop-display component 614 is disposed in the display area of the material resource 612; the user can click the stop-display component 614 to generate an event of stopping displaying the material resource 612, and the image processing apparatus can remove the material resource 612 from the target image according to the event, thereby restoring the target image 611. (3) An event generated by performing a zoom operation on the ith material resource. It can be understood that the edit corresponding to this event is: adjusting the size of the display area occupied by the ith material resource in the target image according to the zoom operation. As an example, fig. 6c illustrates a process in which the image processing apparatus performs reduction processing on the material resource 612; after the reduction processing, an image 615 with the reduced material resource displayed in a superimposed manner can be obtained. Fig. 6d shows a process in which the image processing apparatus performs enlargement processing on the material resource 612; after the enlargement processing, an image 616 displaying the enlarged material resource is obtained.
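The three editing trigger events above (drag, stop display, zoom) can be sketched as handlers on a small overlay-state object. All names here are hypothetical illustrations; a real implementation would hook these handlers into the UI framework's gesture events.

```python
class OverlayState:
    """Tracks the display state of one material resource overlaid on the target image."""

    def __init__(self, x, y, width, height):
        self.x, self.y = x, y
        self.width, self.height = width, height
        self.visible = True

    def on_drag(self, dx, dy):
        # Event (1): dragging changes the display position of the material resource.
        self.x += dx
        self.y += dy

    def on_stop_display(self):
        # Event (2): stop-display removes the material resource from the target image.
        self.visible = False

    def on_zoom(self, factor):
        # Event (3): zooming adjusts the size of the occupied display area.
        self.width = round(self.width * factor)
        self.height = round(self.height * factor)
```

Re-rendering the target image after each handler call would reproduce the behavior shown in figs. 6a-6d.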
According to the image processing method described above, the image processing device obtains the object features of the objects in the target image, matches material resources for the objects according to those features, and then automatically displays the material resources in an overlapping manner in the processing area associated with each object in the target image, achieving fully automatic photo synthesis. Because the superimposed material resources can still be edited, a user can adjust the material resources superimposed on an object according to actual needs, which preserves the user's control over image processing and improves user experience to a certain extent.
In addition, after the N material resources are displayed in the target image in an overlapping manner, sharing the resulting target image is also supported. As shown in fig. 6e, for example, a sharing entry (e.g., the sharing entry 621 in fig. 6e) may be set in the image processing interface, where the sharing entry is a component that can be used to implement the sharing function. The ways of triggering the sharing entry include, but are not limited to: (1) clicking the share button; (2) long-pressing the target image on which the N material resources are displayed in a superimposed manner and sliding it in a specified direction (such as sliding toward the top of the image processing interface, sliding toward the right side of the image processing interface, etc.). The sharing object list means: a list (for example, the list 622 in fig. 6e) containing a plurality of sharing objects with whom images can be shared, where a sharing object is: an object with whom images can be shared, or which can receive images, such as a friend in a chat application (e.g., the sharing object 6221 in the list 622), an XX dynamic publishing platform, and the like. It can be understood that, by sharing the target image on which the N material resources are displayed in an overlapping manner with a sharing object (for example, sharing it to the session page 623 of the sharing object 6221), the dissemination capability of the image processing method provided by the embodiment of the present application can be enhanced to a certain extent.
The following describes a specific embodiment for matching material resources for objects in a target image.
After the image processing device displays the image processing interface, it matches the N material resources for the N objects from a target material resource set based on the object features of the N objects, so that the matched N material resources can be displayed in the target image in an overlapping manner. The image processing interface may be associated with a material library, where the material library includes a plurality of material resource sets, and each material resource set includes a plurality of material resources. Illustratively, the material resources can be sticker images, such as: cartoon character stickers, Q-version animal stickers, clothing stickers, and the like; the material resource set may be a sticker theme pack, such as an XX cartoon sticker pack containing a plurality of cartoon character stickers, or a cute-animal sticker pack containing a plurality of Q-version animal stickers, etc.
The image processing device may select the set of target material resources from the material library in a manner that may include, but is not limited to, any of the following:
(1) A random selection mode. The target material resource set is any one randomly selected from the material library.
(2) Selection using the usage heat value. In one implementation, the target material resource set is any set in the material library whose usage heat value is above a heat threshold. Each material resource set in the material library corresponds to a usage heat value, which reflects how much that material resource set is used, and the usage heat value can be set according to one or more dimensions. For example, it can be set according to the number of users: if a material resource set has been used by 100 users in total, its usage heat value can be set to 100; if another material resource set has been used by 200 users in total, its usage heat value can be set to 200. As can be seen, a higher usage heat value indicates that the material resource set is more popular with users. Alternatively, it can be set according to the number of times each material resource in the set is used within a period of time: for example, if a material resource set contains 3 material resources a, b, and c in total, and material resource a is used 3 times, material resource b is used 4 times, and material resource c is used 6 times, the usage heat value of the material resource set can be the sum of the use counts of its material resources, that is, 3+4+6=13. The heat threshold may be set according to actual conditions, for example, 100, 50, etc. The present application can screen out the candidate material resource sets whose usage heat values are greater than the heat threshold from the material library, and randomly select one candidate material resource set as the target material resource set.
In another implementation, the target material resource set is the set in the material library with the highest usage heat value. The material resource sets in the material library can be sorted in descending order of usage heat value, and the top-ranked material resource set is then determined as the target material resource set.
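The two heat-value selection strategies above can be sketched as follows, assuming (as in the example) that a set's usage heat value is the sum of its resources' use counts. The function names and the dict-based representation of the material library are illustrative assumptions, not the patent's data model.

```python
import random

def usage_heat(resource_use_counts):
    """Heat value of a set = sum of its resources' use counts (one possible dimension)."""
    return sum(resource_use_counts)

def select_target_set(library, heat_threshold=None):
    """Select the target material resource set from a material library.

    `library` maps a set name to the list of per-resource use counts.
    With no threshold, the set with the highest usage heat value is chosen;
    with a threshold, a candidate set above the threshold is chosen at random.
    """
    heats = {name: usage_heat(counts) for name, counts in library.items()}
    if heat_threshold is None:
        return max(heats, key=heats.get)
    candidates = [name for name, h in heats.items() if h > heat_threshold]
    return random.choice(candidates) if candidates else None
```

For the example in the text, a set with resources used 3, 4, and 6 times would get a usage heat value of 13.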
(3) Selected by the user. As shown in fig. 7a, the image processing device may output the identifier of each material resource set in the material library 701 in the image processing interface, and the requesting user may select one of the identifiers (e.g., the identifier 702); the image processing device then determines the material resource set corresponding to the selected identifier as the target material resource set, and matches material resources for each object according to the target material resource set.
(4) And selecting according to the habit of the user. The target material resource set is a set in the material library that is adapted to the user habits of the requesting user. For example, assuming that the requesting user frequently browses some animation websites, a material resource set containing animation-related content in the material library may be determined as the target material resource set.
The target material resource set may include a first subset containing material resources of a first type, and a second subset containing material resources of a second type. In this embodiment of the present application, if the material resource matched to the ith object by the image processing device is a first type material resource, the image processing device may further output prompt information, where the prompt information is used to prompt that the ith object has been successfully matched with the ith material resource. In one implementation, the prompt information may be prompt text, and the image processing device may display the prompt text in a non-image display area of the image processing interface, where the non-image display area can be understood as: the areas of the image processing interface other than the area used for displaying the target image. Alternatively, the prompt text may be displayed around the processing area associated with the ith object in the target image of the image processing interface. The latter case is taken as an example here, as shown in fig. 7b: assuming that the material resource matched to the object 71 in fig. 7b is a first type material resource, the image processing apparatus may display the prompt text at the display position 721 in the image processing interface 72.
In another implementation, the prompting message may also be prompting audio (such as classical background music in a certain cartoon work, classical lines of a certain character, etc.), and then the image processing device may also play the prompting audio in the image processing device. In another implementation, the hint information can also be a hint animation, and the way the image processing device outputs the hint animation can include, but is not limited to, any of the following: (1) playing a prompt animation in an image processing interface; (2) Displaying a playing window on the image processing interface, and playing a prompt animation in the playing window; (3) And displaying the floating window on the image processing interface, and playing the prompt animation in the floating window.
Further, the ith material resource may be displayed in the target image after the prompt animation finishes playing, where "finishes playing" means: the prompt animation has been played in full; or the playing duration of the prompt animation has reached a duration threshold, for example: if the total duration of the prompt animation is 10 seconds and the duration threshold is 3 seconds, then after the prompt animation has played for 3 seconds, the ith material resource is displayed in the target image in an overlapping manner.
According to the image processing method provided by the embodiment of the application, after an object is matched with a first type material resource, prompt information corresponding to that material resource is output, thereby making image processing more engaging for the user.
In one embodiment, the image processing apparatus may match the ith material resource for the ith object from the target material resource set based on object features of the ith object. As can be seen from the foregoing, when the ith object is a person, the object characteristics include an action feature and a biological feature, and then, the method for the image processing apparatus to match the ith material resource for the ith object can be seen in fig. 8; as shown in fig. 8, the method may specifically include the following steps:
S801, calculating a first matching degree between the resource features of each first type material resource in the first subset and the action features of the ith object.
The first matching degree can be used to represent the similarity between a first type material resource and the ith object. When calculating the first matching degree, the image processing apparatus may perform motion recognition on the ith object to obtain the motion feature of the ith object. Specifically, the motion feature may be, for example, the motion skeleton information (two-dimensional mapping information) of the ith object; the motion skeleton information is generated by regarding each part of the ith object as a skeleton node and acquiring the coordinate corresponding to each skeleton node, and the coordinates corresponding to the skeleton nodes may include: the coordinates of the head node, the coordinates of the left palm node, the coordinates of the right palm node, the coordinates of the waist node, and the like. For example, for the portrait motion shown in fig. 9, the image processing apparatus may perform motion recognition on the portrait motion and mark a point at each part that can be used to depict the motion, such as: marking the palm of the portrait to obtain the palm node 91, and marking the head of the portrait to obtain the head node 92. Further, the image processing apparatus may generate the two-dimensional mapping information of each skeleton node, which reflects the coordinates of the skeleton node in the target image, such as: the two-dimensional mapping information of the palm node 91 is recorded as (x1, y1), and that of the head node 92 is recorded as (x2, y2). On this basis, the two-dimensional mapping information of all skeleton nodes of the ith object can be obtained, and the action skeleton data of the ith object can be generated from it; the action skeleton data is a coordinate matrix formed by the coordinates of each skeleton node of the ith object. For example, if the action skeleton data is represented by a matrix G, the action skeleton data G corresponding to the portrait action in fig. 9 may be expressed as: G = [(x1, y1), (x2, y2), ..., (xk, yk), ...]. It can be understood that, in order to calculate the first matching degree between the motion feature of the ith object and each first type material resource, each first type material resource may also correspond to a set of motion skeleton data, and the motion skeleton data corresponding to a first type material resource may be recorded as the resource feature of that first type material resource.
Illustratively, the action skeleton data of a first type material resource may be represented by a. The image processing apparatus may compare the motion skeleton data G of the ith object with the motion skeleton data a corresponding to each first type material resource, and then calculate a distance variation matrix of the relative positions of the skeleton nodes of the two sets of motion skeleton data; the distance variation matrix may be represented by D, and D may be used to represent the first matching degree.
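The comparison between G and a can be sketched as below. As a simplifying assumption (the patent does not specify the exact distance computation), the "distance variation matrix" D is reduced here to one Euclidean offset per skeleton node, and a resource counts as a match when every offset stays below the deviation threshold; both function names are hypothetical.

```python
def distance_variation(g, a):
    """Per-node distance between two sets of skeleton-node coordinates.

    `g` and `a` are lists of (x, y) tuples, one per skeleton node, in the
    same node order; the result stands in for the distance variation matrix D.
    """
    return [((xg - xa) ** 2 + (yg - ya) ** 2) ** 0.5
            for (xg, yg), (xa, ya) in zip(g, a)]

def first_matching(g, a, deviation_threshold):
    """A first type material resource matches the ith object's action when
    every skeleton-node offset in D is below the set deviation threshold."""
    return all(d < deviation_threshold for d in distance_variation(g, a))
```

A fuller implementation might first normalize both skeletons for scale and translation, so that the same pose at a different image position still matches.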
S802, judging whether a material resource whose first matching degree is greater than the first matching threshold exists in the first subset.
If the first subset contains any first type of material resource whose action skeleton data A yields a distance variation matrix D, relative to the action skeleton data G of the ith object, that is smaller than the set deviation threshold, then the first matching degree between the resource characteristics of that material resource and the action characteristics of the ith object is considered to be greater than the first matching threshold, that is: a material resource with a first matching degree greater than the first matching threshold exists in the first subset, and step S806 may then be executed. If, for all first type material resources in the first subset, the distance variation matrix D between the corresponding action skeleton data A and the action skeleton data G of the ith object is greater than or equal to the set deviation threshold, it is determined that no material resource in the first subset has a first matching degree with the motion feature of the ith object greater than the first matching threshold, and step S803 may be executed.
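For illustration only, steps S801–S802 may be sketched as follows. The per-node Euclidean metric, the function names and the example coordinates are assumptions; the embodiment only specifies that a distance variation matrix D of relative node positions is computed and compared against a set deviation threshold.

```python
import math

def distance_variation(G, A):
    """Per-node distances between the object's action skeleton data G and a
    resource's action skeleton data A (lists of (x, y) node coordinates).
    The Euclidean per-node metric is an assumption."""
    return [math.dist(g, a) for g, a in zip(G, A)]

def first_type_matches(G, A, deviation_threshold):
    """S802 decision: the match succeeds when D stays below the set
    deviation threshold (interpreted element-wise here)."""
    return all(d < deviation_threshold for d in distance_variation(G, A))

G = [(10.0, 20.0), (30.0, 40.0)]        # e.g. palm node 91, head node 92
A_close = [(10.5, 20.5), (30.2, 39.8)]  # a very similar pose
A_far = [(80.0, 90.0), (5.0, 5.0)]      # a very different pose
hit = first_type_matches(G, A_close, deviation_threshold=2.0)
miss = first_type_matches(G, A_far, deviation_threshold=2.0)
```

When `hit` is true, step S806 would be taken; when no resource in the first subset yields a hit, the flow falls through to step S803.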
And S803, calculating a second matching degree between the resource characteristics of each second type material resource in the second subset and the biological characteristics of the ith object.
The image processing apparatus may obtain the biological features of the ith object (gender information, head rotation information, two-dimensional face offset angle information, and the like) by performing portrait recognition processing on the ith object; further, a second matching degree between the biological features of the ith object and the resource characteristics of each second type of material resource in the second subset may be calculated from the biological features of the ith object and the resource characteristics of each second type of material resource in the second subset.
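A minimal sketch of step S803 follows. The embodiment does not specify the scoring formula; treating gender as a hard constraint (as in the fig. 10 example) and mapping the angle differences onto a 0..1 score are assumptions, as are the dictionary keys.

```python
def second_matching_degree(obj_bio, res_feat):
    """Hypothetical second matching degree between an object's biological
    features and a second type resource's features. Gender mismatch is
    treated as a hard failure; head-rotation and face-offset angle
    differences (in degrees) reduce the score linearly."""
    if obj_bio["gender"] != res_feat["gender"]:
        return 0.0
    rot_diff = abs(obj_bio["head_rotation"] - res_feat["head_rotation"])
    off_diff = abs(obj_bio["face_offset"] - res_feat["face_offset"])
    return max(0.0, 1.0 - (rot_diff + off_diff) / 180.0)

obj = {"gender": "female", "head_rotation": 10.0, "face_offset": 5.0}
res = {"gender": "female", "head_rotation": 0.0, "face_offset": 0.0}
score = second_matching_degree(obj, res)
```

In step S804 the resulting score would simply be compared against the second matching threshold.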
S804, whether the material resources with the second matching degree larger than the second matching threshold exist in the second subset is judged.
In one embodiment, if there is no material resource in the second subset whose second matching degree is greater than the second matching threshold, ending the process of matching the material resource for the ith object; if there are material resources in the second subset having a second matching degree greater than the second matching threshold, the image processing apparatus executes step S805.
And S805, determining the material resource with the second matching degree larger than the second matching threshold as the ith material resource corresponding to the ith object. The flow ends.
S806, determining the material resource with the first matching degree larger than the first matching threshold as the ith material resource corresponding to the ith object. The flow ends.
Step S801 to step S806 are exemplarily explained below with reference to fig. 10. As shown in fig. 10, the first subset 101 includes two first type material resources, namely material resource 1011 and material resource 1012, and the second subset 102 includes three second type material resources, namely material resource 1021, material resource 1022 and material resource 1023. To match material resources for the object 1032 in the target image 103, motion recognition may first be performed on the object 1032 to obtain its motion characteristics, and a material resource whose first matching degree with those motion characteristics is greater than the first matching threshold may then be sought in the first subset 101. In this example it can be seen that no such material resource exists in the first subset 101, so the image processing apparatus performs portrait recognition on the object 1032 to obtain its biological features and matches a material resource from the second subset 102 based on those features. Since the material resources 1021 and 1023 do not conform to the gender feature of the object 1032 and only the material resource 1022 does, the second subset can be considered to contain one second type material resource whose second matching degree with the biological features of the object 1032 is greater than the second matching threshold, and the image processing apparatus may display the material resource 1022 in an overlapping manner in the processing area (i.e., the head) associated with the object 1032, finally obtaining the object 1042. If material resources are to be matched for the object 1031, motion recognition may similarly be performed on the object 1031 to obtain its motion features, and a material resource may then be matched from the first subset 101. In this example the resource characteristics (specifically, the action characteristics) of the material resource 1011 are very similar to the motion features of the object 1031, so the image processing apparatus may directly use the material resource 1011 as the material resource whose first matching degree with the motion features of the object 1031 is greater than the first matching threshold, and display the material resource 1011 in an overlapping manner in the processing area of the object 1031 to finally obtain the object 1041.
According to the image processing method, when material resources are matched for each object, first type material resources are matched preferentially according to the action characteristics; therefore, when a user hopes to be matched with a certain first type of material resource, the user can increase the possibility of matching that material resource by making the action associated with it, which improves the interest of shooting the target image and of performing image processing on the shot target image.
In one embodiment, M (M is a positive integer) material resources matching the ith object may exist in the target material resource set, and the image processing apparatus may then determine one of the M material resources as the ith material resource corresponding to the ith object. The manner of determining the ith material resource may include, but is not limited to, any of the following: (1) Determining any one of the M material resources as the ith material resource; (2) Selecting the ith material resource from the M material resources in descending order of matching degree, for example: determining the material resource with the highest matching degree with the ith object as the ith material resource; (3) Selecting the ith material resource from the M material resources in descending order of weight, for example: determining the material resource with the largest weight as the ith material resource.
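The three selection manners above can be sketched as follows; the function name, the dictionary-based scores/weights, and the resource identifiers are illustrative assumptions.

```python
import random

def select_ith(resources, mode, scores=None, weights=None, rng=None):
    """Select one of the M matched material resources.
    mode (1) 'random': any resource; (2) 'matching': highest matching
    degree; (3) 'weight': largest weight. `scores` and `weights` map a
    resource id to its value."""
    if mode == "random":
        return (rng or random).choice(resources)
    if mode == "matching":
        return max(resources, key=lambda r: scores[r])
    if mode == "weight":
        return max(resources, key=lambda r: weights[r])
    raise ValueError(mode)

resources = ["m1", "m2", "m3"]
best = select_ith(resources, "matching", scores={"m1": 0.4, "m2": 0.9, "m3": 0.7})
heaviest = select_ith(resources, "weight", weights={"m1": 3, "m2": 1, "m3": 2})
```

`max` over the score/weight mapping directly realizes the "descending order" selections, since only the top element is ever used.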
The weights of the M material resources may be set by guessing, for example: the user guesses which of the M cartoon material resources best matches the object in the target image, and the weights of the M cartoon material resources are set according to the user's guess results. Specifically, if the image processing device determines the ith material resource from the M material resources in descending order of weight, determining the ith material resource may include the following steps: (1) Outputting a guess prompt, where the guess prompt is used to prompt the user to guess among the M material resources and may be, for example: a prompt text, a prompt audio, and the like; so-called guessing may be understood as: the user selecting, from the M material resources, the one or more material resources that in the user's mind best match the ith object. For example, the guess prompt may be output on the image processing interface, or in a notification bar of the terminal device where the image processing interface is located; the application is described by taking output of the guess prompt on the image processing interface as an example. As shown in fig. 11, assuming that the image processing device has matched 3 material resources for the object 1103, the image processing interface may output the prompt information 1101 shown in fig. 11, where the prompt information may be used to prompt the user to select, from the 3 material resources included in the material resource set 1102, the material resource that best matches the object 1103 in the user's mind; (2) Obtaining a guess result, where the guess result includes the number of times each of the M material resources was guessed (i.e., selected); (3) Setting the weight of each of the M material resources according to its number of guesses, where the more times a material resource was guessed, the larger the weight set for it. Setting the weights by guessing allows image processing to be combined with a guessing game, which can effectively strengthen the spreading power of the image.
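Steps (2) and (3) above can be sketched as follows; normalizing the guess counts into weights is an assumption — the embodiment only requires that more guesses yield a larger weight.

```python
from collections import Counter

def weights_from_guesses(resource_ids, guesses):
    """Derive weights from the guess result: each resource's weight is
    its share of all guesses, so more guesses => larger weight."""
    counts = Counter(guesses)
    total = sum(counts[r] for r in resource_ids) or 1
    return {r: counts[r] / total for r in resource_ids}

def pick_by_weight(weights):
    """Select the material resource with the largest weight."""
    return max(weights, key=weights.get)

ids = ["res1", "res2", "res3"]
guesses = ["res2", "res2", "res1", "res2", "res3"]  # users' selections
w = weights_from_guesses(ids, guesses)
chosen = pick_by_weight(w)
```

Any monotone mapping from guess counts to weights would satisfy the stated requirement; proportional shares are simply the most direct choice.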
Since M material resources matching the ith object may exist in the target material resource set, the image processing apparatus may determine one of the M material resources as the ith material resource corresponding to the ith object, and after the ith material resource is displayed in the target image in an overlapping manner, switching the ith material resource corresponding to the ith object in the target image is also supported. If there is a switching trigger event for the ith material resource, the image processing apparatus may further perform the following steps: (1) Removing the ith material resource from the target image; (2) Acquiring a kth material resource (k is less than or equal to M and k is not equal to i) from the M material resources and displaying the kth material resource in an overlapping manner in the processing area associated with the ith object in the target image, where the kth material resource is any one of the M material resources other than the ith material resource. Optionally, the switching trigger event includes, but is not limited to, any of the following: (1) An event generated by calling out a menu in the display area of the ith material resource and selecting a switching option in the menu, where the operation of calling out the menu may include long-pressing the display area of the ith material resource, clicking a menu component, and the like. Illustratively, taking calling out a menu by long-pressing the display area of the ith material resource as an example, please refer to fig. 12a: the image processing apparatus matches the 3 material resources shown in the material set 1201 for the object; the user can call out the menu 1203 by long-pressing the display area 1202, and can then switch the material resource currently displayed in the display area 1202 in an overlapping manner by clicking the switching option 12031, obtaining the image shown in the image processing interface 1204. (2) An event generated by performing a switching trigger operation in the display area of the ith material resource, where the switching trigger operation may include a clicking operation, a screen-sliding operation, and the like. For example, taking the switching trigger operation as a clicking operation, please refer to fig. 12b: the user may click on the material resource 1205 to switch it to another material resource matching the object (e.g., the material resource 1206); if the user continues to click on the material resource 1206, it may be switched further to yet another matching material resource (e.g., the material resource 1207).
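The switching behaviour of fig. 12b can be sketched as a simple cycle through the M matched resources; the class name and the cycling order are assumptions — the embodiment only requires that the kth resource be any matched resource other than the current one.

```python
class MaterialSwitcher:
    """Holds the M material resources matched for one object and
    advances to the next one on each switch trigger event."""

    def __init__(self, matched):
        self.matched = list(matched)
        self.index = 0  # index of the currently overlaid resource

    @property
    def current(self):
        return self.matched[self.index]

    def on_switch_event(self):
        """Remove the current resource and overlay the next one;
        wraps around after the last resource."""
        self.index = (self.index + 1) % len(self.matched)
        return self.current

s = MaterialSwitcher(["1205", "1206", "1207"])
first = s.on_switch_event()   # click: 1205 -> 1206
second = s.on_switch_event()  # click: 1206 -> 1207
```

Repeated clicks thus reproduce the 1205 → 1206 → 1207 progression described above, returning to 1205 after the last resource.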
According to the image processing method, the material resources displayed in an overlapping manner can be switched after they have been displayed in the target image, which meets the user's need for matching diversified material resources and improves the user experience.
FIG. 13 is a schematic flow chart diagram of yet another image processing method provided in an exemplary embodiment of the present application; in this embodiment, the target image includes N objects, and the N objects are all people. The material resources corresponding to any two of the N objects may be different, that is: the N objects correspond to N different material resources. The flow of the image processing method comprises the following steps:
and S1301, displaying a target image, wherein the target image comprises a plurality of portraits.
As shown in fig. 14a, a user may click on a display entry 140 of an image processing interface at a first interface (e.g., a friend dynamic viewing interface, an XX topic discussion interface) to trigger display of the image processing interface, a target image uploaded by the user may be displayed in the image processing interface, and further, the user may select a target material resource set that is desired to be used from a plurality of material resource sets displayed in the image processing interface.
S1302, a portrait recognition process is started.
After the target material resource set and the target image are determined, portrait recognition processing may be performed on the target image. Specifically, the image processing apparatus may preprocess the target image, such as: performing information noise reduction processing on the target image, counting the number of portraits, and the like. In addition, during portrait recognition the image processing device may also extract key biological features of each portrait in the target image, such as: gender features, face angle features, and the like, so that the image processing device can subsequently match corresponding material resources for each portrait from the target material resource set based on the extracted key biological features. It should be noted that if the image processing apparatus does not recognize, in the target image, any portrait to which the material resources in the target material resource set are applicable, the image processing apparatus may execute step S1305. For example, assuming that each material resource in the target material resource set is a face sticker material, step S1305 may be executed when the image processing apparatus does not recognize a face image in the target image.
And S1303, judging whether the number of the figures in the target image exceeds the number of the material resources in the target material resource set.
The image processing apparatus may determine, based on the obtained number of portraits, whether the number of portraits in the target image exceeds the number of material resources in the target material resource set. Since the material resources corresponding to any two objects are different, when the number of portraits in the target image exceeds the number of material resources in the target material resource set, the image processing apparatus may execute step S1305. It should be noted that if the number of portraits in the target image is less than or equal to the number of material resources in the target material resource set, the image processing apparatus may execute step S1304.
And S1304, judging whether the portrait in the target image accords with the action characteristics of the first type of material resources.
The image processing device performs motion recognition on each portrait in the target image to determine whether the target image contains one or more target portraits whose motion characteristics have a matching degree with the resource characteristics of a first type of material resource greater than the first matching threshold; if so, step S1306 is executed; if not, step S1307 is executed.
And S1305, outputting prompt information and ending the process.
In one embodiment, if the number of portraits in the target image exceeds the number of material resources in the target material resource set, the image processing apparatus may output prompt information in the manner shown in fig. 14b. For example, assuming that the number of material resources in the target material resource set 143 is 2 and the number of portraits in the target image 142 is 4, then since 4 > 2 the number of portraits in the target image exceeds the number of material resources in the target material resource set; the image processing apparatus may output the prompt information 141 in the image processing interface to prompt the user that the currently selected target material resource set cannot be fully applied to the target image, and may then end the current image processing flow. In yet another embodiment, referring to fig. 14c, if each material resource in the target material resource set 143 is a face sticker, then when the image processing device does not recognize a face image in the target image 144, prompt information 145 may be displayed on the image processing interface.
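The routing of steps S1302–S1305 can be sketched as follows; the function name and the returned decision strings are illustrative assumptions.

```python
def check_applicability(num_portraits, num_resources, recognized_any):
    """Routing decision after portrait recognition: a prompt (S1305) is
    emitted when no applicable portrait was recognized, or when the
    portraits outnumber the pairwise-distinct material resources;
    otherwise the flow proceeds to the matching steps."""
    if not recognized_any:
        return "prompt: no applicable portrait recognized"
    if num_portraits > num_resources:
        return "prompt: resource set cannot cover all portraits"
    return "proceed"

# fig. 14b example: 4 portraits, 2 material resources
decision = check_applicability(4, 2, recognized_any=True)
```

The fig. 14c case (face stickers but no face recognized) corresponds to `recognized_any=False`.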
And S1306, marking the portrait meeting the action characteristics of the first type of material resources as a color egg attribute, and marking other objects as common attributes.
Other objects refer to: all objects in the target image except the portraits that meet the action characteristics of the first type of material resources. If the material resource matching the ith object is a first type material resource, the image processing apparatus may perform step S1309 after performing step S1306; if the material resource matching the ith object is a second type material resource, the image processing apparatus may execute step S1307 after executing step S1306.
S1307, it is determined whether each material resource in the target material resource set has a gender attribute.
If the material resources in the target material resource set have a gender attribute, the image processing apparatus activates the portrait gender determination device to execute step S1308; for example, a target material resource set whose material resources have a gender attribute can be shown as the material resource set 146 in fig. 14d, in which it is easily seen that the gender attribute of the material resources 1461 and 1463 is "male" and that of the material resources 1462 and 1464 is "female". If the material resources in the target material resource set do not have a gender attribute, the image processing device starts the face angle determination device to execute step S1309; for example, a target material resource set whose material resources have no gender attribute can be shown as the material resource set 147 in fig. 14d, and it can be seen that none of the material resources included in the material resource set 147 has a gender attribute.
And S1308, judging the sex of the portrait of each object in the target image.
S1309, a face angle determination is performed on each object in the target image.
After step S1308, the image processing apparatus may obtain the gender determination result of each portrait (i.e., the gender feature, which may be denoted by S), and after step S1309 the image processing apparatus may obtain the face angle determination result, which includes: a head rotation feature (which may be denoted by R) and a two-dimensional face offset angle feature (which may be denoted by P); the gender determination result and the face angle determination result of each portrait may constitute the portrait identification information of that portrait. Therefore, after executing steps S1308 to S1309, the image processing apparatus may frame the processing area of each portrait in the target image based on the acquired gender determination result and face angle determination result; for example, as shown in fig. 15, the image processing apparatus may frame the processing area 152 of the portrait based on the portrait identification information 151.
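The portrait identification information (S, R, P) and the framing of the processing area can be sketched as follows; the field types, the padding factor and the face-box representation are assumptions — the embodiment only states that the region is framed from the recognition results.

```python
from dataclasses import dataclass

@dataclass
class PortraitInfo:
    """Portrait identification info as named in the embodiment:
    gender S, head rotation R, two-dimensional face offset angle P."""
    S: str     # gender determination result
    R: float   # head rotation feature (degrees, assumed unit)
    P: float   # two-dimensional face offset angle feature (degrees)

def frame_processing_region(info, face_box, pad=0.25):
    """Hypothetical framing step: expand the detected face box
    (x, y, width, height) by a padding fraction on every side."""
    x, y, w, h = face_box
    return (x - w * pad, y - h * pad, w * (1 + 2 * pad), h * (1 + 2 * pad))

info = PortraitInfo(S="female", R=15.0, P=5.0)
region = frame_processing_region(info, (100, 100, 50, 50))
```

In the full flow, S decides which gender-attributed resources are eligible, while R and P later drive the standby-resource selection and angle correction.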
S1310, drawing material resources of all the portrait in the target image.
After the image processing device acquires the portrait identification information of the ith portrait in the target image, the object features of the ith portrait can be obtained from that information, so that the image processing device can match material resources for the ith portrait according to the object features. The image processing device can match material resources for all the portraits in the target image in this way, and after doing so it can draw the ImageViews (image views, which can be understood as the images corresponding to the material resources) of all the portraits on the target image, and draw the ControlViews (control view components, which can be understood as views that display the material resources in an overlapping manner matched to the face angle of the object, and which can be edited by the user) containing the material resources in the target image. If the material resource matched with the ith portrait is a first type material resource, the image processing device may execute step S1311 to render the material resource for the ith portrait; if the material resource of the ith portrait is a second type material resource, the image processing apparatus may execute step S1312 to render the material resource for the ith object.
S1311, according to the face angle judgment result, a first type of material resource is called for the color egg attribute object.
In one embodiment, each first type of material resource may be associated with a plurality of standby material resources, where the display angles of any two standby material resources differ. It should be noted that a standby material resource differs from its first type material resource only in the display angle. As shown in fig. 16, the material resources 162 and 163 are standby material resources of the material resource 161, where the material resource 162 is suited to an oblique-side portrait and the material resource 163 is suited to a full-side portrait; it is to be understood that although the display angles of the 3 material resources in fig. 16 differ, all 3 correspond to the same role. Therefore, after obtaining the face angle of the ith object, the image processing apparatus may determine, from a first type of material resource and its plurality of standby material resources, the material resource to be displayed in an overlapping manner in the processing area of the ith object.
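Choosing among a resource and its standby variants by face angle can be sketched as a nearest-angle lookup; the angle buckets (front ≈ 0°, oblique ≈ 45°, full side ≈ 90°) and the identifiers are assumptions drawn from fig. 16.

```python
def pick_variant(face_angle, variants):
    """variants: list of (display_angle_degrees, resource_id) covering a
    material resource and its standby resources. Returns the id of the
    variant whose display angle is closest to the detected face angle."""
    return min(variants, key=lambda v: abs(v[0] - face_angle))[1]

# The three displays of the same role in fig. 16 (angles assumed):
variants = [(0, "res161_front"), (45, "res162_oblique"), (90, "res163_side")]
choice = pick_variant(50, variants)  # an oblique-side portrait
```

Because every variant corresponds to the same role, the lookup changes only the display angle, never the matched character.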
And S1312, calling a second type of material resource for the object with the common attribute according to the gender judgment result and the face angle judgment result.
Each second type of material resource is likewise associated with a plurality of standby material resources, and the display angles of any two standby material resources differ.
S1313, rendering and synthesizing the whole.
In one embodiment, the image processing device may scale the material resource called for the ith portrait proportionally so that the material resource coincides with the processing area of the ith portrait; further, the image processing device may correct the angle of the material resource according to the two-dimensional face offset angle information P of the ith portrait. For example, the angle correction process may refer to fig. 17: the image processing device may first display the material resource without any two-dimensional face offset angle in an overlapping manner in the processing area (as shown at 171 in fig. 17), and then correct the angle of the material resource according to the two-dimensional face offset angle information of the portrait (as shown at 172 in fig. 17). It can be understood that after the image processing device completes the angle correction of the material resources of each object in the target image, the image with image processing completed (i.e., the image with the N material resources displayed in an overlapping manner) can be rendered.
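The scale-then-rotate correction of step S1313 can be sketched as computing a uniform scale factor and a 2×2 rotation matrix; the cover-fit scaling rule and degree units are assumptions.

```python
import math

def fit_and_correct(material_size, region_size, face_offset_angle):
    """Proportionally scale a material resource so it covers the
    processing region, then build the rotation matrix for the
    two-dimensional face offset angle P (degrees). Returns the uniform
    scale factor and the 2x2 rotation matrix to apply to the
    material's corner points."""
    mw, mh = material_size
    rw, rh = region_size
    scale = max(rw / mw, rh / mh)  # proportional scaling, covers the region
    a = math.radians(face_offset_angle)
    rot = ((math.cos(a), -math.sin(a)),
           (math.sin(a),  math.cos(a)))
    return scale, rot

scale, rot = fit_and_correct((100, 100), (50, 75), 30.0)
```

Applying `rot` after scaling reproduces the two stages of fig. 17: first the unrotated overlay (171), then the angle-corrected overlay (172).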
According to the image processing method provided by the application, gender identification, head rotation angle identification and two-dimensional face offset angle identification can be performed on all the portraits in the target image through portrait recognition, and corresponding material resources can be matched for each object based on the information obtained by portrait recognition, so that a user can conveniently process the target image by means of the image processing equipment, the image processing process is full of playability and exploration, and the user is encouraged to create more interesting shooting content by combining different target material resource sets.
While the above embodiments illustrate the method of the embodiments of the present application in detail, in order to better implement the above aspects of the embodiments of the present application, the following provides the apparatus of the embodiments of the present application.
Referring to fig. 18, fig. 18 is a schematic structural diagram of an image processing apparatus according to an exemplary embodiment of the present application; the image processing apparatus may be a computer program (including program code) running in an image processing device, for example, if the image processing device is a terminal device, the image processing apparatus may be a photographing application in the terminal device; the image processing apparatus may be adapted to perform some or all of the steps in the method embodiments shown in fig. 4, 8 and 13. Referring to fig. 18, the image processing apparatus includes the following units:
a display unit 1801, configured to display an image processing interface, where a target image is displayed in the image processing interface, where the target image includes N objects, and N is a positive integer;
a processing unit 1802, configured to display N material resources in a target image in an overlapping manner based on object features of the N objects and a processing region associated with each object in the N objects in the target image; and the N material resources correspond to the N objects one to one.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; the corresponding means that: the object characteristics of the ith object are matched with the resource characteristics of the ith material resource; i is a positive integer and i is less than or equal to N; the processing unit 1802 is specifically configured to:
and at the processing area of the ith object in the target image, overlapping and displaying the ith material resource.
In one implementation, the transparency of the ith material resource is less than a transparency threshold; when the ith material resource is displayed in the target image in an overlapping mode, the ith material resource shields the processing area of the ith object in the target image.
In one implementation, N material resources are located in a target material resource set; the processing unit 1802 is further configured to:
and matching N material resources for the N objects from the target material resource set based on the object characteristics of the N objects.
In one implementation mode, the image processing interface is associated with a material library, the material library comprises a plurality of material resource sets, and each material resource set comprises a plurality of material resources;
the target material resource set comprises any one of the following: the target material resource set is any set randomly selected from the material library; or the target material resource set is any set in the material library, wherein the use heat value is higher than the heat threshold value; or the target material resource set is a set with the highest use heat value in the material library; or the target material resource set is a set selected by a request user of image processing in the material library; alternatively, the target material resource set is a set in the material library that is adapted to the user habits of the requesting user.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the target material resource set comprises a first sub-set and a second sub-set, wherein the first sub-set contains material resources of a first type, and the second sub-set contains material resources of a second type; the processing unit 1802 is further configured to:
and if the ith material resource is the material resource of the first type, outputting prompt information, wherein the prompt information is used for prompting that the ith object is successfully matched with the ith material resource.
In one implementation, the prompt information includes prompt text or prompt audio; the processing unit 1802 may be specifically configured to:
if the prompt information comprises a prompt text, displaying the prompt text in a non-image display area of the image processing interface; or displaying a prompt text around a processing area associated with the ith object in the target image;
and if the prompt information comprises a prompt audio, playing the prompt audio.
In one implementation, the hint information includes a hint animation, and the processing unit 1802 is further configured to:
playing a prompt animation in the image processing interface; or,
displaying a playing window on the image processing interface, and playing the prompt animation in the playing window; or,
displaying a floating window on the image processing interface, and playing a prompt animation in the floating window;
after the prompt animation finishes playing, the ith material resource is displayed in the target image in an overlapping manner; the end of playing includes the prompt animation having finished playing, or the playing duration of the prompt animation reaching a duration threshold.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the ith object is a person, the object features of the ith object comprise action features and biological features, and the action features comprise action amplitude features and action angle features; the biometric features include gender features, facial features, and head features; the target material resource set comprises a first sub-set and a second sub-set, at least one first type of material resource and the resource characteristics of each first type of material resource are recorded in the first sub-set, and at least one second type of material resource and the resource characteristics of each second type of material resource are recorded in the second sub-set; the processing unit 1802 is further configured to:
calculating a first matching degree between the resource characteristics of each first type of material resource in the first subset and the action characteristics of the ith object;
and if the material resources with the first matching degree larger than the first matching threshold exist in the first subset, determining the material resources with the first matching degree larger than the first matching threshold as the ith material resource corresponding to the ith object.
In one implementation, the processing unit 1802 is further configured to:
if the material resources with the first matching degree larger than the first matching threshold value do not exist in the first subset, calculating a second matching degree between the resource characteristics of each second type of material resource in the second subset and the biological characteristics of the ith object;
and if the material resources with the second matching degree larger than the second matching threshold exist in the second subset, determining the material resources with the second matching degree larger than the second matching threshold as the ith material resource corresponding to the ith object.
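The two-stage fallback described above — first matching degrees computed from action features against first-type resources, then, failing that, second matching degrees computed from biometric features against second-type resources — can be sketched in Python. Cosine similarity as the matching degree, the feature-vector representation, and the threshold values are illustrative assumptions, not fixed by this disclosure:

```python
import math

def cosine_match(a, b):
    # Matching degree taken here as cosine similarity between feature vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def match_material(action_feat, bio_feat, first_subset, second_subset,
                   first_threshold=0.8, second_threshold=0.6):
    """Return the ith material resource for one object, or None.

    first_subset / second_subset: lists of (resource_id, feature_vector).
    """
    # Stage 1: first matching degree against first-type (action) resources.
    for res_id, res_feat in first_subset:
        if cosine_match(res_feat, action_feat) > first_threshold:
            return res_id
    # Stage 2: fall back to the second matching degree against
    # second-type (biometric) resources.
    for res_id, res_feat in second_subset:
        if cosine_match(res_feat, bio_feat) > second_threshold:
            return res_id
    return None
```

When no resource in either subset exceeds its threshold, the sketch returns `None`; how an unmatched object is handled is left open by the disclosure.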
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; the processing unit 1802 is further configured to:
if the ith object is matched with M material resources from the target material resource set, selecting one material resource from the M material resources as the ith material resource;
wherein M is a positive integer; the selecting includes any one of: random selection, selection in descending order of matching degree, or selection in descending order of weight.
In one implementation, the selecting includes selecting in descending order of weight, and the processing unit 1802 is further configured to:
outputting a guess prompt used to prompt the user to guess among the M material resources;
obtaining a guess result, wherein the guess result comprises the guess times of each material resource in the M material resources;
setting the weight of each material resource in the M material resources according to the guessing result; the more times any one of the M material resources is guessed, the greater the weight is.
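The guess-based weighting above can be sketched as follows; normalizing guess counts into weights and picking the highest-weight candidate are illustrative choices, the disclosure only requiring that a resource guessed more times receives a greater weight:

```python
from collections import Counter

def set_weights_from_guesses(guesses):
    """Weight each of the M material resources by how often it was guessed.

    guesses: iterable of guessed resource ids; more guesses -> greater weight.
    """
    counts = Counter(guesses)
    total = sum(counts.values())
    return {res: counts[res] / total for res in counts}

def select_by_weight(candidates, weights):
    # Selecting in descending order of weight: take the greatest-weight
    # candidate; unguessed resources default to weight 0.
    return max(candidates, key=lambda r: weights.get(r, 0.0))
```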
In one implementation, the processing unit 1802 is further configured to:
after the ith material resource is overlapped and displayed in the target image, if a switching trigger event aiming at the ith material resource exists, removing the ith material resource from the target image;
acquiring a kth material resource from the M material resources, and displaying the kth material resource in a superposition manner in a processing area of the ith object in the target image; k is a positive integer, k is less than or equal to M, and k is not equal to i;
wherein the switching trigger event comprises any one of: an event generated by calling out a menu in the display area of the ith material resource and selecting a switching option in the menu; or, an event generated by performing a switching trigger operation in the display area of the ith material resource.
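A minimal sketch of the switching behaviour: on a switching trigger event, the ith resource is removed and a kth of the M matched resources (k ≠ i) is shown instead. Cycling to the next index is an assumption; the disclosure only requires k ≤ M and k ≠ i:

```python
def handle_switch(current_index, m):
    """Advance from the currently displayed material resource to the next
    of the M matched resources, skipping the one already shown (k != i).
    Indices are 0-based here for convenience."""
    if m < 2:
        return current_index  # nothing else to switch to
    return (current_index + 1) % m
```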
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; the processing unit 1802 is further configured to:
after the ith material resource is overlapped and displayed in the target image, if an editing trigger event aiming at the ith material resource exists, editing the ith material resource;
the editing trigger event comprises an event of dragging the ith material resource, and the editing comprises changing the display position of the ith material resource in the target image according to the dragging; or,
the editing trigger event comprises an event generated by performing a zooming operation on the ith material resource, and the editing comprises adjusting the size of the display area occupied by the ith material resource in the target image according to the zooming; or,
the editing trigger event comprises an event of stopping display of the ith material resource, and the editing comprises removing the ith material resource from the target image.
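The three editing trigger events above (drag, zoom, stop display) can be dispatched as in the following sketch; the event names and payload shapes (`delta`, `scale`) are hypothetical, introduced only for illustration:

```python
def apply_edit(overlay, event):
    """Apply one editing trigger event to an overlaid material resource.

    overlay: dict with 'pos' (x, y) and 'size' (w, h). Returns the updated
    overlay, or None when the resource is removed from the target image.
    """
    kind = event["type"]
    if kind == "drag":            # change the display position per the drag
        dx, dy = event["delta"]
        x, y = overlay["pos"]
        overlay["pos"] = (x + dx, y + dy)
    elif kind == "zoom":          # adjust the occupied display area per the zoom
        w, h = overlay["size"]
        s = event["scale"]
        overlay["size"] = (w * s, h * s)
    elif kind == "stop_display":  # remove the resource from the target image
        return None
    return overlay
```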
In one implementation, a sharing entry is arranged in the image processing interface; the display unit 1801 is further configured to:
when the sharing entry is triggered, displaying a sharing object list;
the processing unit 1802 is further configured to: when a target sharing object in the sharing object list is selected, sharing the target image on which the N material resources are superposed and displayed to the target sharing object.
In one implementation, the object is a person; the treatment area includes areas of various parts of a person, including any of: head region, face region, limb region and body region; any two of the N objects are represented as an ith object and a jth object, i and j are positive integers, i is less than or equal to N, and j is less than or equal to N;
the processing area associated with the ith object in the target image and the processing area associated with the jth object in the target image are areas of the same part or areas of different parts.
According to an embodiment of the present application, the units in the image processing apparatus shown in fig. 18 may be combined, individually or in whole, into one or several other units, or some unit(s) may be further split into multiple functionally smaller units; either arrangement implements the same operations without affecting the technical effects of the embodiments of the present application. The units are divided based on logical function; in practical applications, the function of one unit may be realized by multiple units, or the functions of multiple units may be realized by one unit. In other embodiments of the present application, the image processing apparatus may also include other units, and these functions may be implemented with the assistance of other units or through the cooperation of multiple units. According to another embodiment of the present application, the image processing apparatus shown in fig. 18 may be constructed, and the image processing method of the embodiments of the present application implemented, by running a computer program (including program code) capable of executing the steps of the methods shown in fig. 4, fig. 8, and fig. 13 on a general-purpose computing device, such as a computer comprising a central processing unit (CPU), a random access memory (RAM), a read-only memory (ROM), and other processing and storage elements. The computer program may be recorded on, for example, a computer-readable recording medium, and loaded into and executed by the above-described computing device via that medium.
In the image processing apparatus provided by the embodiments of the present application, material resources are matched to each of the multiple objects in the target image according to that object's features, so the material resource superposed on the processing area associated with each object fits that object's characteristics more closely. Moreover, because material matching and superposed display are performed for every object in this manner, the user is spared from manually superposing material resources onto multiple objects one by one, improving the convenience of image processing.
Fig. 19 is a schematic structural diagram of an image processing device according to an exemplary embodiment of the present application. Referring to fig. 19, the image processing device includes a processor 1901, a communication interface 1902, and a computer-readable storage medium 1903, which may be connected by a bus or in other ways. The communication interface 1902 is used to receive and transmit data. The computer-readable storage medium 1903 may be stored in the memory of the image processing device and is used to store a computer program comprising program instructions; the processor 1901 executes the program instructions stored in the computer-readable storage medium 1903. The processor 1901 (or CPU, central processing unit) is the computing core and control core of the image processing device and is adapted to load and execute one or more instructions to implement the corresponding method flows or functions.
An embodiment of the present application also provides a computer-readable storage medium (memory), which is a memory device in the image processing device and is used to store programs and data. The computer-readable storage medium here may include a built-in storage medium of the image processing device and may also include an extended storage medium supported by the image processing device. The computer-readable storage medium provides storage space that stores the processing system of the image processing device, as well as one or more instructions, which may be one or more computer programs (including program code), suitable for loading and execution by the processor 1901. The computer-readable storage medium may be a high-speed RAM memory, or a non-volatile memory, such as at least one disk memory; optionally, it may also be at least one computer-readable storage medium located remotely from the aforementioned processor.
In one embodiment, the image processing apparatus may be the terminal apparatus mentioned in the foregoing embodiments; the computer-readable storage medium has one or more instructions stored therein; one or more instructions stored in the computer-readable storage medium are loaded and executed by the processor 1901 to implement the corresponding steps in the above-described image processing method embodiments; in particular implementations, one or more instructions in the computer-readable storage medium are loaded and executed by the processor 1901 to perform the steps of:
displaying an image processing interface, wherein a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer;
based on the object characteristics of the N objects and the processing area associated with each object in the N objects in the target image, superposing and displaying N material resources in the target image; and the N material resources correspond to the N objects one to one.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; here, correspondence means that the object features of the ith object match the resource features of the ith material resource; i is a positive integer and i is less than or equal to N. One or more instructions in the computer-readable storage medium are loaded by the processor 1901 and, when the N material resources are displayed in the target image in an overlapping manner based on the object features of the N objects and the processing area associated with each of the N objects in the target image, specifically perform the following step:
and at the processing area of the ith object in the target image, overlapping and displaying the ith material resource.
In one implementation, the transparency of the ith material resource is less than a transparency threshold; when the ith material resource is displayed in the target image in an overlapping manner, it occludes the processing area of the ith object in the target image.
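The low-transparency occlusion above amounts to alpha compositing the material resource over the processing area, with alpha near 1.0. The pure-Python nested-list representation below is a sketch only; a real implementation would use an image library:

```python
def overlay_material(image, material, top_left, alpha=1.0):
    """Superpose a material resource onto the processing area of a target
    image (both given as nested lists of RGB tuples). With alpha near 1.0
    the material occludes the region beneath it, matching the
    low-transparency condition described above.
    """
    x0, y0 = top_left
    for dy, row in enumerate(material):
        for dx, (mr, mg, mb) in enumerate(row):
            ir, ig, ib = image[y0 + dy][x0 + dx]
            # Standard alpha blend: material over image.
            image[y0 + dy][x0 + dx] = (
                round(alpha * mr + (1 - alpha) * ir),
                round(alpha * mg + (1 - alpha) * ig),
                round(alpha * mb + (1 - alpha) * ib),
            )
    return image
```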
In one implementation, N material resources are located in a target material resource set; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the steps of:
and matching N material resources for the N objects from the target material resource set based on the object characteristics of the N objects.
In one implementation mode, the image processing interface is associated with a material library, the material library comprises a plurality of material resource sets, and each material resource set contains a plurality of material resources;
the target material resource sets include any of: the target material resource set is any one set randomly selected from the material library; or the target material resource set is any set in the material library, wherein the heat value is higher than the heat threshold value; or the target material resource set is a set with the highest use heat value in the material library; or the target material resource set is a set selected by a request user of image processing in the material library; alternatively, the target material resource set is a set in the material library that is adapted to the user habits of the requesting user.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the target material resource set comprises a first sub-set and a second sub-set, wherein the first sub-set contains material resources of a first type, and the second sub-set contains material resources of a second type; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the steps of:
if the ith material resource is the material resource of the first type, outputting prompt information, wherein the prompt information is used for prompting that the ith object is successfully matched with the ith material resource.
In one implementation, the prompt information includes prompt text or prompt audio; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and when outputting the hint information, perform the following steps:
if the prompt information comprises a prompt text, displaying the prompt text in a non-image display area of the image processing interface; or displaying a prompt text around a processing area associated with the ith object in the target image;
and if the prompt information comprises prompt audio, playing the prompt audio.
In one implementation, where the hint information comprises a hint animation, one or more instructions in a computer-readable storage medium are loaded by the processor 1901 and when outputting the hint information, perform the following steps:
playing a prompt animation in the image processing interface; or,
displaying a playing window on the image processing interface, and playing the prompt animation in the playing window; or,
displaying a floating window on the image processing interface, and playing the prompt animation in the floating window;
after the prompt animation finishes playing, the ith material resource is displayed in the target image in a superposed manner; the end of playing means that the prompt animation has been played in full, or that the playing duration of the prompt animation has reached a duration threshold.
In one implementation, any one of the N objects is represented as an ith object, and the ith object corresponds to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the ith object is a person, the object features of the ith object comprise action features and biological features, and the action features comprise action amplitude features and action angle features; the biometric features include gender features, facial features, and head features; the target material resource set comprises a first subset and a second subset, at least one first type of material resource and the resource characteristics of each first type of material resource are recorded in the first subset, and at least one second type of material resource and the resource characteristics of each second type of material resource are recorded in the second subset; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and when matching N material resources for N objects from the target set of material resources based on the object features of the N objects, perform the following steps:
calculating a first matching degree between the resource characteristics of each first type of material resource in the first subset and the action characteristics of the ith object;
and if the material resources with the first matching degree larger than the first matching threshold exist in the first subset, determining the material resources with the first matching degree larger than the first matching threshold as the ith material resources corresponding to the ith object.
In one implementation, one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and when matching N material assets for N objects from the target set of material assets based on the object characteristics of the N objects, perform the following steps:
if the material resources with the first matching degree larger than the first matching threshold value do not exist in the first subset, calculating a second matching degree between the resource characteristics of each second type of material resource in the second subset and the biological characteristics of the ith object;
and if the material resources with the second matching degree larger than the second matching threshold exist in the second subset, determining the material resources with the second matching degree larger than the second matching threshold as the ith material resource corresponding to the ith object.
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the following steps:
if the ith object is matched with M material resources from the target material resource set, selecting one material resource from the M material resources as the ith material resource;
wherein M is a positive integer; the selecting includes any one of: random selection, selection in descending order of matching degree, or selection in descending order of weight.
In one implementation, the selecting includes selecting in order of decreasing weight, and one or more instructions in the computer readable storage medium are loaded by the processor 1901 and specifically perform the following steps:
outputting a guess prompt used to prompt the user to guess among the M material resources;
obtaining a guess result, wherein the guess result comprises the guess times of each material resource in the M material resources;
setting the weight of each material resource in the M material resources according to the guessing result; the more times any one of the M material resources is guessed, the greater the weight is.
In one implementation, one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the following steps:
after the ith material resource is overlapped and displayed in the target image, if a switching trigger event aiming at the ith material resource exists, removing the ith material resource from the target image;
acquiring a kth material resource from the M material resources, and displaying the kth material resource in a superposed manner in a processing area of the ith object in the target image; k is a positive integer, k is less than or equal to M, and k is not equal to i;
wherein the switching trigger event comprises any one of: an event generated by calling out a menu in the display area of the ith material resource and selecting a switching option in the menu; or, an event generated by performing a switching trigger operation in the display area of the ith material resource.
In one implementation, any one of the N objects is represented as an ith object, the ith object corresponds to an ith material resource of the N material resources, i is a positive integer and i is not greater than N; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the following steps:
after the ith material resource is displayed in the target image in a superposed manner, if an editing trigger event aiming at the ith material resource exists, editing the ith material resource;
the editing trigger event comprises an event of dragging the ith material resource, and the editing comprises changing the display position of the ith material resource in the target image according to the dragging; or,
the editing trigger event comprises an event generated by performing a zooming operation on the ith material resource, and the editing comprises adjusting the size of the display area occupied by the ith material resource in the target image according to the zooming; or,
the editing trigger event comprises an event of stopping display of the ith material resource, and the editing comprises removing the ith material resource from the target image.
In one implementation, a sharing entry is arranged in the image processing interface; one or more instructions in the computer-readable storage medium are loaded by the processor 1901 and perform the following steps:
when the sharing entry is triggered, displaying a sharing object list;
and when a target sharing object in the sharing object list is selected, sharing the target image on which the N material resources are displayed in a superposed manner to the target sharing object.
In one implementation, the object is a person; the treatment area includes areas of various parts of a person, including any of: head region, face region, limb region and body region; any two of the N objects are represented as an ith object and a jth object, i and j are positive integers, i is less than or equal to N, and j is less than or equal to N;
the processing area of the ith object in the target image is the same as or different from the processing area of the jth object in the target image.
The image processing device provided in the embodiments of the present application may display a target image that includes multiple objects. When these objects need to be processed, for any one object, the processor 1901 may match a material resource to the object according to its object features; on this basis, the device matches material resources to every object in the target image, so that the material resource superposed at the processing area associated with each object better fits that object's characteristics. In addition, after matching a material resource to an object, the device superposes it at the processing area associated with that object on the target image, sparing the user from manually dragging material resources to occlude the processing area and thereby lowering the threshold for image processing. It can further be understood that, because the material resource matched to each object is related to that object's features, the image processing device may also be used for image-processing-related activities such as "guessing a character" or "finding XXX (cartoon character) around you", which can to some extent enhance the shareability and explorability of image processing.
Embodiments of the present application also provide a computer program product or a computer program comprising computer instructions stored in a computer-readable storage medium. The processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device executes the image processing method.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the above embodiments, the implementation may be realized, in whole or in part, by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product, which includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the procedures or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored on, or transmitted via, a computer-readable storage medium, and may be transmitted from one website, computer, server, or data center to another by wired (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wireless (e.g., infrared, radio, microwave) means. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or data center integrating one or more available media. The available medium may be a magnetic medium (e.g., a floppy disk, a hard disk, a magnetic tape), an optical medium (e.g., a DVD), or a semiconductor medium (e.g., a solid-state drive (SSD)), among others.
The above description covers only specific embodiments of the present application, but the protection scope of the present application is not limited thereto. Any changes or substitutions that a person skilled in the art could readily conceive of within the technical scope disclosed in the present application shall fall within the protection scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (19)

1. An image processing method, comprising:
displaying an image processing interface, wherein a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer;
based on the object characteristics of the N objects and the processing area associated with each object in the N objects in the target image, N material resources are displayed in the target image in an overlapping mode; and the N material resources correspond to the N objects one to one.
2. The method of claim 1, wherein any of the N objects is represented as an ith object, the ith object corresponding to an ith material resource of the N material resources; the correspondence means: the object characteristics of the ith object are matched with the resource characteristics of the ith material resource; i is a positive integer and i is not more than N;
the displaying, in an overlapping manner, N material resources in the target image based on the object features of the N objects and the processing area associated with each object of the N objects in the target image includes:
and displaying the ith material resource in an overlapping way at the processing area of the ith object in the target image.
3. The method of claim 2, wherein the transparency of the ith material resource is less than a transparency threshold; when the ith material resource is displayed in the target image in an overlapping mode, the ith material resource forms shielding on a processing area which is associated with the ith object in the target image.
4. The method of claim 1, wherein the N material resources are located in a set of target material resources; the method further comprises the following steps:
matching the N material resources for the N objects from the target material resource set based on the object characteristics of the N objects.
5. The method of claim 4, wherein the image processing interface is associated with a materials library, the materials library comprising a plurality of materials resource sets, each materials resource set comprising a plurality of materials resources;
the set of target material resources includes any one of: the target material resource set is any one set randomly selected from the material library; or the target material resource set is any set of the material library, wherein the usage heat value is higher than a heat threshold value; or the target material resource set is a set with the highest use heat value in the material library; or the target material resource set is a set selected by a request user of image processing in the material library; or the target material resource set is a set which is matched with the user habit of the requesting user in the material library.
6. The method according to claim 4, wherein any one of the N objects is represented as an ith object, the ith object corresponding to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the target material resource set comprises a first subset and a second subset, the first subset includes material resources of a first type, and the second subset includes material resources of a second type; the method further comprises the following steps:
if the ith material resource is a first type material resource, outputting prompt information, wherein the prompt information is used for prompting that the ith object is successfully matched with the ith material resource.
7. The method of claim 6, wherein the prompt information comprises prompt text or prompt audio; the outputting the prompt message comprises:
if the prompt information comprises a prompt text, displaying the prompt text in a non-image display area of the image processing interface; or, displaying the prompt text around the processing area associated with the ith object in the target image;
and if the prompt information comprises prompt audio, playing the prompt audio.
8. The method of claim 6, wherein the prompt information comprises a prompt animation; the outputting the prompt information comprises any one of the following:
playing the prompt animation in the image processing interface; or,
displaying a playing window on the image processing interface, and playing the prompt animation in the playing window; or,
displaying a floating window on the image processing interface, and playing the prompt animation in the floating window;
wherein, after the playing of the prompt animation ends, the ith material resource is displayed in the target image in an overlapping manner; the end of playing comprises the prompt animation finishing playing, or the playing time of the prompt animation reaching a time threshold.
9. The method of claim 4, wherein any one of the N objects is represented as an ith object, the ith object corresponding to an ith material resource of the N material resources; i is a positive integer and i is not more than N; the ith object is a person, the object features of the ith object comprise action features and biological features, the action features comprise action amplitude features and action angle features, and the biological features comprise gender features, facial features, and head features; the target material resource set comprises a first subset and a second subset, the first subset containing at least one first type material resource and the resource features of each first type material resource, and the second subset containing at least one second type material resource and the resource features of each second type material resource;
the matching the N material resources for the N objects from the target material resource set based on the object features of the N objects comprises:
calculating a first matching degree between the resource features of each first type material resource in the first subset and the action features of the ith object;
and if material resources with the first matching degree greater than a first matching threshold exist in the first subset, determining the material resources with the first matching degree greater than the first matching threshold as the ith material resource corresponding to the ith object.
10. The method of claim 9, wherein the matching the N material resources for the N objects from the target material resource set based on the object features of the N objects further comprises:
if no material resource with a first matching degree greater than the first matching threshold exists in the first subset, calculating a second matching degree between the resource features of each second type material resource in the second subset and the biological features of the ith object;
and if material resources with the second matching degree greater than a second matching threshold exist in the second subset, determining the material resources with the second matching degree greater than the second matching threshold as the ith material resource corresponding to the ith object.
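The two-stage matching described in claims 9-10 (first-type resources against action features, falling back to second-type resources against biological features) can be sketched roughly as below. The patent does not specify how the matching degree is computed, so `similarity` is an assumed scoring function, and the sketch returns a single best candidate even though the claims allow multiple matches (handled by claim 11).

```python
def match_resource(obj, first_subset, second_subset, t1, t2, similarity):
    """Return a material resource for `obj`, or None if nothing matches.
    `obj` is a hypothetical dict with "action_features" and "bio_features";
    each subset entry is a dict with a "features" field."""
    # Stage 1: first matching degree against first-type resources.
    best = None
    for res in first_subset:
        score = similarity(res["features"], obj["action_features"])
        if score > t1 and (best is None or score > best[1]):
            best = (res, score)
    if best is not None:
        return best[0]
    # Stage 2: no first-type match, so compare biological features
    # against second-type resources.
    for res in second_subset:
        score = similarity(res["features"], obj["bio_features"])
        if score > t2:
            return res
    return None
```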
11. The method according to claim 4, wherein any one of the N objects is represented as an ith object, the ith object corresponding to an ith material resource of the N material resources, i being a positive integer and i ≤ N; the method further comprises:
if M material resources are matched for the ith object from the target material resource set, selecting one material resource from the M material resources and determining the material resource as the ith material resource;
wherein M is a positive integer; the selecting includes any one of: randomly selecting, selecting according to the sequence of the matching degree from high to low, and selecting according to the sequence of the weight from high to low.
12. The method of claim 11, wherein the selecting comprises selecting according to the sequence of the weight from high to low; the method further comprises:
outputting a guess prompt, wherein the guess prompt is used for prompting to guess the M material resources;
obtaining a guessing result, wherein the guessing result comprises the guessing times of each material resource in the M material resources;
setting the weight of each material resource in the M material resources according to the guessing result; wherein, the more times any one of the M material resources is guessed, the greater the weight.
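Claims 11-12 can be combined into a small sketch: when M resources match, one is picked by strategy, and the weight strategy derives its weights from how often each resource was guessed. The field names (`name`, `score`, `weight`) are assumptions for illustration.

```python
import random

def set_weights_from_guesses(resources, guess_counts):
    # Claim 12: the more often a resource is guessed, the greater its weight.
    for res in resources:
        res["weight"] = guess_counts.get(res["name"], 0)
    return resources

def select_resource(resources, strategy):
    # Claim 11: random, by matching degree (high to low), or by weight.
    if strategy == "random":
        return random.choice(resources)
    if strategy == "by_matching_degree":
        return max(resources, key=lambda r: r["score"])
    if strategy == "by_weight":
        return max(resources, key=lambda r: r["weight"])
    raise ValueError(f"unknown strategy: {strategy}")
```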
13. The method of claim 11, wherein the method further comprises:
after the ith material resource is displayed in the target image in an overlapping manner, if a switching trigger event for the ith material resource exists, removing the ith material resource from the target image;
acquiring a kth material resource from the M material resources, and displaying the kth material resource in an overlapping manner at the processing area of the ith object in the target image; k is a positive integer, k is less than or equal to M, and k is not equal to i;
wherein the switching trigger event comprises any one of the following: an event generated by calling out a menu in the display area of the ith material resource and selecting a switching option in the menu; or, an event generated by performing a switching trigger operation in the display area of the ith material resource.
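A minimal sketch of the switch in claim 13: on a switch trigger event, the current resource is replaced by another of the M matched resources. The claim only requires that the new index differ from the current one; cycling in order is an assumption of this sketch.

```python
def switch_resource(matched, current_index):
    """Given the list of M matched resources and the index currently
    displayed, return the next resource and its index (k != current)."""
    if len(matched) < 2:
        raise ValueError("switching needs at least two matched resources")
    k = (current_index + 1) % len(matched)  # assumed cycling order
    return matched[k], k
```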
14. The method according to claim 1, wherein any one of the N objects is represented as an ith object, the ith object corresponding to an ith material resource of the N material resources, i being a positive integer and i ≤ N; the method further comprises:
after the ith material resource is displayed in the target image in an overlapping manner, if an editing trigger event for the ith material resource exists, editing the ith material resource;
wherein the editing trigger event comprises an event of dragging the ith material resource, and the editing comprises changing the display position of the ith material resource in the target image according to the dragging; or,
the editing trigger event comprises an event generated by performing a zooming operation on the ith material resource, and the editing comprises adjusting the size of the display area occupied by the ith material resource in the target image according to the zooming; or,
the editing trigger event comprises an event of stopping displaying the ith material resource, and the editing comprises removing the ith material resource from the target image.
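The three editing trigger events of claim 14 (drag, zoom, stop displaying) amount to a simple event dispatch over the overlay's display state. The overlay schema (`x`, `y`, `w`, `h`, `visible`) and event shapes here are assumptions for illustration.

```python
def apply_edit(overlay, event):
    """Apply one editing trigger event to an overlay dict
    {"x", "y", "w", "h", "visible"} (hypothetical schema)."""
    if event["type"] == "drag":
        # Change the display position according to the drag destination.
        overlay["x"], overlay["y"] = event["to"]
    elif event["type"] == "zoom":
        # Adjust the occupied display area according to the zoom factor.
        overlay["w"] = int(overlay["w"] * event["factor"])
        overlay["h"] = int(overlay["h"] * event["factor"])
    elif event["type"] == "remove":
        # Stop displaying: remove the resource from the target image.
        overlay["visible"] = False
    return overlay
```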
15. The method of claim 1, wherein a sharing entry is provided in the image processing interface; the method further comprises the following steps:
when the sharing entry is triggered, displaying a sharing object list;
when a target sharing object in the sharing object list is selected, sharing the target image on which the N material resources are superposed and displayed to the target sharing object.
16. The method of claim 1, wherein the object is a person; the processing area comprises areas of various parts of the person, including any of the following: a head area, a face area, a limb area, and a body area; any two of the N objects are represented as an ith object and a jth object, i and j are positive integers, i is less than or equal to N, and j is less than or equal to N;
the processing area of the ith object in the target image is the same as or different from the processing area of the jth object in the target image.
17. An image processing apparatus characterized by comprising:
a display unit, configured to display an image processing interface, wherein a target image is displayed in the image processing interface, the target image comprises N objects, and N is a positive integer;
a processing unit, configured to display N material resources in the target image in an overlapping manner based on the object features of the N objects and the processing area associated with each of the N objects in the target image; wherein the N material resources correspond to the N objects one by one.
18. An image processing apparatus characterized by comprising:
a processor adapted to execute a computer program;
a computer-readable storage medium, in which a computer program is stored which, when executed by the processor, implements the image processing method according to any one of claims 1 to 16.
19. A computer-readable storage medium, characterized in that the computer-readable storage medium stores a computer program adapted to be loaded by a processor and to perform the image processing method according to any of claims 1-16.
CN202110811845.9A 2021-07-16 2021-07-16 Image processing method, device, equipment and medium Pending CN115619902A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110811845.9A CN115619902A (en) 2021-07-16 2021-07-16 Image processing method, device, equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110811845.9A CN115619902A (en) 2021-07-16 2021-07-16 Image processing method, device, equipment and medium

Publications (1)

Publication Number Publication Date
CN115619902A true CN115619902A (en) 2023-01-17

Family

ID=84854533

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110811845.9A Pending CN115619902A (en) 2021-07-16 2021-07-16 Image processing method, device, equipment and medium

Country Status (1)

Country Link
CN (1) CN115619902A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116777940A (en) * 2023-08-18 2023-09-19 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium
CN116777940B (en) * 2023-08-18 2023-11-21 腾讯科技(深圳)有限公司 Data processing method, device, equipment and storage medium

Similar Documents

Publication Publication Date Title
CN115735229A (en) Updating avatar garments in messaging systems
US11893301B2 (en) Colocated shared augmented reality without shared backend
CN117157667A (en) Garment segmentation
CN116547717A (en) Facial animation synthesis
US20230269345A1 (en) Recorded sound thumbnail
CN117940962A (en) Facial expression based control interactive fashion
CN117321622A (en) Portal shopping for AR-based connections
CN116710881A (en) Selecting audio for multiple video clip capture
CN116261850A (en) Bone tracking for real-time virtual effects
KR20230019927A (en) Context transfer menu
CN117642762A (en) Custom advertising with virtual changing room
CN117957043A (en) Controlling AR games on fashion items
CN115699716A (en) Message interface extension system
CN117501675A (en) Rendering content received by a messaging application from a third party resource
CN117337442A (en) VR-based connection portal shopping
US20240073373A1 (en) Sharing social augmented reality experiences in video calls
CN116648895A (en) Image pickup apparatus mode for capturing a plurality of video clips
CN116457821A (en) Object re-illumination using neural networks
US11868676B2 (en) Augmenting image content with sound
KR20230017348A (en) context application menu
CN115619902A (en) Image processing method, device, equipment and medium
KR20230022234A (en) context action bar
WO2023211688A1 (en) Shared augmented reality experience in video chat
WO2023220163A1 (en) Multi-modal human interaction controlled augmented reality
US11768587B2 (en) Electronic transaction activated augmented reality experiences

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code

Ref country code: HK

Ref legal event code: DE

Ref document number: 40082643

Country of ref document: HK