CN114390193B - Image processing method, device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114390193B
Authority
CN
China
Prior art keywords
image
target
image processing
flow
materials
Prior art date
Legal status
Active
Application number
CN202111481292.1A
Other languages
Chinese (zh)
Other versions
CN114390193A (en)
Inventor
赵伟 (Zhao Wei)
Current Assignee
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd filed Critical Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202111481292.1A priority Critical patent/CN114390193B/en
Publication of CN114390193A publication Critical patent/CN114390193A/en
Application granted granted Critical
Publication of CN114390193B publication Critical patent/CN114390193B/en


Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/80: Camera processing pipelines; Components thereof
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on GUIs for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on GUIs for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • H04N 21/00: Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N 21/40: Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N 21/43: Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N 21/431: Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N 21/4312: Generation of visual interfaces involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • H04N 21/44: Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream, rendering scenes according to MPEG-4 scene graphs
    • H04N 21/44012: Processing of video elementary streams involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs

Abstract

The disclosure provides an image processing method, an image processing apparatus, an electronic device, and a storage medium. The method includes: in response to a selection instruction for a target effect, acquiring material data corresponding to the target effect, where the material data includes materials and a matching relationship between the materials and image processing flows; in response to a shooting instruction, acquiring an initial image containing a shooting object; determining, from the materials according to the matching relationship, a target material for processing the initial image in the current image processing flow; and processing the initial image with the target material to obtain a target image corresponding to that flow. By matching materials to image processing flows, the method applies different materials in different flows, so each flow outputs an image that meets its actual requirements, which greatly improves image processing efficiency and the image display effect.

Description

Image processing method, device, electronic equipment and storage medium
Technical Field
The disclosure relates to the technical field of image processing, and in particular relates to an image processing method, an image processing device, electronic equipment and a storage medium.
Background
To enrich the final visual impression of image works such as short videos or emoji packs, a user can add various image effects to the work during shooting, such as finger-heart gestures, mouth-opening triggers, and bubble-blowing effects.
In the related art, prompt information is added to the video shooting interface so that the corresponding image effect is triggered when the user shoots, and so that the user is guided to strike a gesture or expression matching that effect. Taking magic expressions as an example, the prompt information is usually bound into the magic-expression material and delivered to the client together with it. As a result, the prompt information is recorded into the finally displayed short video while the client records it, which degrades the display effect and the image processing efficiency of the short video.
Disclosure of Invention
The disclosure provides an image processing method, an image processing apparatus, an electronic device, and a storage medium, so as to at least solve the problem in the related art that prompt information interferes with image shooting, and to improve image processing efficiency and the image display effect.
According to a first aspect of embodiments of the present disclosure, the present disclosure provides an image processing method, including:
responding to a selection instruction of a target effect, and acquiring material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
acquiring an initial image containing a shooting object in response to a shooting instruction;
determining a target material for processing an initial image in the current image processing flow from the materials according to the matching relation;
and processing the initial image by adopting the target material to obtain a target image corresponding to the image processing flow.
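The four steps above can be sketched as a minimal data model. This is an illustrative assumption, not code from the patent: the class and function names (Material, MaterialData, select_target_materials) and the flow identifier strings are invented for clarity.

```python
from dataclasses import dataclass

# Hypothetical flow identifiers; the disclosure names a "preview flow"
# and a "back-end processing flow".
PREVIEW = "preview"
BACKEND = "backend"

@dataclass
class Material:
    name: str
    kind: str            # "image" or "text"
    flows: frozenset     # flow identifiers this material matches

@dataclass
class MaterialData:
    materials: list      # the materials plus their matching relation

def select_target_materials(data: MaterialData, current_flow: str) -> list:
    """Step 3: keep only the materials whose flow identifiers
    match the current image processing flow."""
    return [m for m in data.materials if current_flow in m.flows]
```

Given material data with an image sticker matched to both flows and a text hint matched only to the preview flow, `select_target_materials` returns both materials for the preview flow but only the sticker for the back-end flow.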
In an optional embodiment, the material data further includes a flow identifier for matching the material, where the flow identifier is used to indicate an image processing flow matched by the material.
Determining, from the materials according to the matching relationship, the target material used for processing the initial image in the current image processing flow includes: determining the image processing flow in which the client is currently located; and determining, from the materials according to the flow identifiers, the material matching that image processing flow as the target material.
In an alternative embodiment, the material includes image material for composing the target image effect, and text material for interacting with the photographic subject.
The matching relationship between the materials and the image processing flows includes: the image material matches the back-end processing flow and the preview flow, and the text material matches the preview flow.
In an optional embodiment, selecting, from the materials according to the flow identifiers, the material matching the current image processing flow as the target material includes: if the current image processing flow is the back-end processing flow, determining the image material whose flow identifier is the back-end processing flow identifier as the target material.
Processing the initial image by using the target material to obtain a target image corresponding to the image processing flow, including: and rendering the image material into the initial image to obtain a first image containing the target image effect.
In an optional embodiment, selecting, from the materials according to the flow identifiers, the material matching the current image processing flow as the target material includes: if the current image processing flow is the preview flow, determining the image material and the text material whose flow identifiers are the preview flow identifier as the target materials.
Processing the initial image by using the target material to obtain a target image corresponding to the image processing flow, including: and rendering the image material and the text material into the initial image to obtain a second image containing the target image effect and the text material.
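The two branches above can be condensed into one hedged sketch. The function below is an assumption about how a renderer might dispatch; real rendering would composite pixels, while here layers are just collected by name.

```python
def render(initial_image, current_flow, image_materials, text_materials):
    """Back-end flow: composite only image material (the first image).
    Preview flow: composite image and text material (the second image)."""
    layers = list(image_materials)
    if current_flow == "preview":
        layers += list(text_materials)
    return {"base": initial_image, "layers": layers}
```

The back-end output therefore never carries text material, so it can feed editing and publishing directly, while the preview output shows the user both the effect and the interaction hints.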
In an alternative embodiment, an editing interface and/or a publishing interface comprising the first image is displayed.
In an alternative embodiment, a capture interface is displayed containing the second image.
In an alternative embodiment, a shooting interface containing a shooting object is displayed, and a plurality of effect controls are included in the shooting interface.
Responding to the selection instruction of the target effect, acquiring the material data corresponding to the target effect, including: receiving a selection instruction of a target effect control in a plurality of effect controls; and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
According to a second aspect of embodiments of the present disclosure, the present disclosure provides an image processing apparatus including:
the first acquisition module is configured to respond to a selection instruction of a target effect and acquire material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
a second acquisition module configured to acquire an initial image containing a photographic subject in response to a photographic instruction;
the determining module is configured to determine target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation;
the image processing module is configured to process the initial image by adopting the target material to obtain a target image corresponding to the image processing flow.
In an optional embodiment, the material data further includes a flow identifier for matching the material, where the flow identifier is used to indicate an image processing flow matched by the material.
The determination module is specifically configured to: determining the current image processing flow of the client; and determining the material matched with the image processing flow from the materials according to the flow identification as a target material.
In an alternative embodiment, the material includes image material for composing the target image effect, and text material for interacting with the photographic subject.
The matching relationship between the materials and the image processing flows includes: the image material matches the back-end processing flow and the preview flow, and the text material matches the preview flow.
In an optional embodiment, when selecting, from the materials according to the flow identifiers, the material matching the current image processing flow as the target material, the determining module is specifically configured to: if the current image processing flow is the back-end processing flow, determine the image material whose flow identifier is the back-end processing flow identifier as the target material.
The image processing module processes the initial image by adopting the target material, and is specifically configured to: and rendering the image material into the initial image to obtain a first image containing the target image effect.
In an optional embodiment, when selecting, from the materials according to the flow identifiers, the material matching the current image processing flow as the target material, the determining module is specifically configured to: if the current image processing flow is the preview flow, determine the image material and the text material whose flow identifiers are the preview flow identifier as the target materials.
The image processing module processes the initial image by adopting the target material, and is specifically configured to: and rendering the image material and the text material into the initial image to obtain a second image containing the target image effect and the text material.
In an alternative embodiment, the apparatus further includes a display module configured to display an editing interface and/or a publishing interface containing the first image.
In an alternative embodiment, the apparatus further includes a display module configured to display a shooting interface containing the second image.
In an alternative embodiment, the apparatus further includes a display module configured to display a shooting interface containing a shooting object, where the shooting interface includes a plurality of effect controls.
The first obtaining module is specifically configured to, when responding to a selection instruction of a target effect and obtaining material data corresponding to the target effect: receiving a selection instruction of a target effect control in a plurality of effect controls; and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
According to a third aspect of embodiments of the present disclosure, there is provided an electronic device, including a processor and a memory, wherein the memory has executable code stored thereon, which when executed by the processor, causes the processor to implement at least the image processing method in the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium storing instructions which, when executed by a processor of an electronic device, enable the electronic device to perform at least the image processing method of the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the image processing method in the first aspect.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
in the disclosure, material data corresponding to a target effect is acquired in response to a selection instruction for the target effect, where the material data includes materials and a matching relationship between the materials and image processing flows. When an initial image containing a shooting object is acquired in response to a shooting instruction, the target material for processing the initial image in the current image processing flow can be determined from the materials according to the matching relationship. The initial image is then processed with the target material to obtain a target image corresponding to that flow, so that the corresponding material is applied in the current image processing flow and the output image meets that flow's actual requirements (such as editing, publishing, or previewing). Because the matching relationship clearly distinguishes the materials required by different image processing flows, materials such as prompt information are no longer recorded into the finally displayed image; this avoids degrading the image display effect, greatly improves image processing efficiency, and improves the display effect. In addition, materials such as prompt information can be updated in the material data according to the image processing flow each of them matches, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure and do not constitute an undue limitation on the disclosure.
Fig. 1 is a schematic diagram of a photographing interface according to the related art.
Fig. 2 is a flow chart illustrating an image processing method according to an exemplary embodiment.
Fig. 3 is a schematic diagram of a photographic interface, shown according to an exemplary embodiment.
FIG. 4 is a schematic diagram of an editing interface shown in accordance with an exemplary embodiment.
FIG. 5 is a schematic diagram illustrating a publishing interface according to an example embodiment.
Fig. 6 is a schematic diagram of a photographic interface, shown according to an example embodiment.
Fig. 7 is a schematic structural view of an image processing apparatus according to an exemplary embodiment.
Fig. 8 is a schematic diagram of an electronic device according to an exemplary embodiment.
Detailed Description
In order to enable those skilled in the art to better understand the technical solutions of the present disclosure, the technical solutions of the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the foregoing figures are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that the embodiments of the disclosure described herein may be capable of operation in sequences other than those illustrated or described herein. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
As described above, in the related art, prompt information is added to the video capturing interface so that the corresponding image effect is triggered when the user shoots, and so that the user is guided to strike a gesture or expression matching that effect. Taking the shooting interface shown in Fig. 1 as an example, a rock-paper-scissors magic expression can be added when shooting a short video; in this case, prompt information such as "Call your friends to play together" can be added to prompt the user to invite others to join the shoot. In this disclosure, a magic expression may also be referred to as a magic effect or a special effect.
However, in one related technology, the prompt information is bound into the magic-expression material and delivered to the client together with it, so the prompt information is recorded into the finally displayed short video while the client records it, which degrades the display effect and the image processing efficiency of the short video.
In another related technology, when the client displays the preview effect of a magic expression, the prompt information is displayed in a layer placed above the layer containing the video stream, so that the prompt information is not recorded into the final short video. However, these prompts are fixed strings built into the client in advance, such as "face not recognized" or "please raise your head". The number of such prompts is limited, and it is difficult for them to keep up with magic-expression materials that are updated frequently. Moreover, because the prompts are updated only with client releases, updating them is difficult and their maintenance efficiency is low.
In order to solve at least one technical problem existing in the related art, the disclosure provides an image processing method, an image processing device, an electronic device and a storage medium.
The core idea of the technical scheme is as follows: in response to a selection instruction for a target effect, material data corresponding to the target effect is acquired, where the material data includes materials and matching relationships between the materials and image processing flows. When an initial image containing a shooting object is acquired in response to a shooting instruction, the target material for processing the initial image in the current image processing flow can be determined from the materials according to the matching relationship. The initial image is then processed with the target material to obtain a target image corresponding to that flow, so that the corresponding material is applied in the current image processing flow and the output image meets the actual requirements of the different flows (such as editing, publishing, and previewing). In this scheme, the matching relationship clearly distinguishes the materials required by different image processing flows, which prevents materials such as prompt information from being recorded into the finally displayed image, greatly improves image processing efficiency, and improves the image display effect. In addition, materials such as prompt information can be updated in the material data according to the image processing flow each of them matches, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
The actual requirements of the different image processing flows can be understood as the output effects each flow needs in a given application scenario. In practice, these effects include image effects and text effects.
For example, assume the materials include image material for constituting the target image effect and text material for interacting with the shooting object; the text material here includes the prompt information introduced above. Assume the preview flow needs to present to the user both the target image effect and the text material used for interacting with the user (i.e., the shooting object), while the image output by the back-end processing flow mainly feeds the subsequent editing and publishing flows, so displaying text material there must be avoided.
Under these assumptions, the target image corresponding to the back-end processing flow (for distinction, this disclosure calls it the first image) contains the target image effect but carries no text effect, so the first image can be applied directly to subsequent editing and publishing. The target image corresponding to the preview flow (the second image) contains both the target image effect and the text effect. Through the second image output by the preview flow, the user can synchronously adjust the shooting state, such as gesture, expression, or number of participants, according to the text material while previewing the target effect, so as to cooperatively achieve the target effect.
Based on the core ideas described above, the embodiment of the present disclosure provides an image processing method, and fig. 2 is a schematic flow chart of the image processing method provided in an exemplary embodiment of the present disclosure. As shown in fig. 2, the method includes:
201. responding to a selection instruction of a target effect, and acquiring material data corresponding to the target effect;
202. acquiring an initial image containing a shooting object in response to a shooting instruction;
203. determining a target material for processing an initial image in the current image processing flow from the materials according to the matching relation;
204. and processing the initial image by adopting the target material to obtain a target image corresponding to the image processing flow.
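Steps 201 to 204 can be strung together in a minimal end-to-end sketch. All names here are illustrative assumptions; the material data is modeled as name/flow-id pairs standing in for the matching relationship of step 201.

```python
def run_flow(material_data, initial_image, current_flow):
    # 203: select target materials via the matching relation.
    targets = [name for name, flows in material_data["materials"]
               if current_flow in flows]
    # 204: "process" the initial image with the target materials
    # (a real client would render; here we just record what applies).
    return {"flow": current_flow, "image": initial_image, "applied": targets}

material_data = {  # 201: material data fetched for the selected effect
    "materials": [
        ("sticker.png", {"preview", "backend"}),
        ("hint-text", {"preview"}),
    ],
}

# 202: a shooting instruction supplies the raw frame for each flow.
preview_out = run_flow(material_data, "raw-frame", "preview")
backend_out = run_flow(material_data, "raw-frame", "backend")
```

Running both flows over the same raw frame shows the key property of the method: the preview output carries the hint text, while the back-end output does not.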
In this method, the matching relationship between materials and image processing flows clearly distinguishes the materials each flow requires, enabling a new image processing mode in which different materials are applied in different flows. Different flows can therefore output images with different image effects that match their respective output requirements, which greatly improves image processing efficiency and the image display effect. In addition, materials such as prompt information can be updated in the material data according to the image processing flow each of them matches, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
In practical applications, each step of the method may be implemented by an electronic device, which may be a terminal device such as a mobile phone, a smart band, a tablet computer, a PC, or a notebook computer. Taking a mobile phone as an example, the method can be realized by a dedicated application on the phone, by an applet embedded in an instant-messaging or other application, or by the phone application calling a cloud server. The steps may also be realized by several electronic devices in cooperation; for example, a server may send an execution result to a terminal device, which renders and displays it. The server may be a physical server with an independent host, a virtual server borne by a host cluster, or a cloud server, which this disclosure does not limit.
The following describes steps in an image processing method in connection with specific embodiments.
First, in 201, material data corresponding to a target effect is acquired in response to a selection instruction for the target effect.
In the related art, a layer containing text material can be superimposed on the video recording interface so that the text material in that layer is not recorded into the short video. However, the text material in the layer is usually preset in the client, so adjusting its settings or types requires updating the client, which is cumbersome; moreover, the available text material is limited in variety and hard to update in time to match the wide range of effects.
In view of this problem, in the present embodiment the material data corresponding to the target effect includes the materials and the matching relationship between the materials and the image processing flows. Through this matching relationship, materials corresponding to different image processing flows can be set in the material data to match each flow's output requirements. This makes the image processing flows more targeted, greatly reduces the difficulty of setting and updating materials such as text and image material, and makes it easier to add different materials (such as different text prompts) to the various effects.
In practical applications, the image processing flows include, but are not limited to, a preview flow and a back-end processing flow. The different flows may run one after another, or simultaneously in separate code branches. For example, the preview flow is triggered automatically after entering the shooting interface, so that an image previewing the target effect is displayed in real time for the user. The back-end processing flow, by contrast, is triggered by the shooting control in the shooting interface: once shooting is triggered, the client starts background processing of the captured image, such as image-effect rendering and encoding, and this background processing is regarded as the back-end processing flow.
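The triggering described above can be sketched as a tiny client state machine. The class and method names are assumptions made for illustration; the only point carried over from the text is when each flow starts.

```python
class Client:
    """Sketch: which flows are active at each UI stage."""
    def __init__(self):
        self.active_flows = set()

    def enter_capture_interface(self):
        # The preview flow is triggered automatically on entering
        # the shooting interface.
        self.active_flows.add("preview")

    def press_shutter(self):
        # The back-end processing flow is triggered by the shooting
        # control; it runs in the client background after capture.
        self.active_flows.add("backend")
```

Before the shutter is pressed only the preview flow runs; afterwards both flows may be active at once, each selecting its own target materials.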
Optionally, the materials include, but are not limited to: image material for constituting the target image effect, and text material for interacting with the shooting object. In practice, the text material may be plain text or an animation containing text; this disclosure does not limit its form. Whatever the form, its core purpose is to guide the shooting object (the photographer or the photographed subject) to adjust the shooting state according to the content it indicates. Taking a magic expression as an example, the text material may read "no shooting object detected", in which case the photographer can move the device so that the shooting object enters the viewfinder and shooting of the magic expression can be completed. As another example, the text material may be an animation, matched with the magic expression, that contains a reference shooting action, prompting the shooting object to complete the shot according to that action.
Taking a magic table as an example, the material data corresponding to the magic table includes the image materials constituting the magic table, the text materials associated with the magic table, the matching relationship between the image materials and the image processing flows, and the matching relationship between the text materials and the image processing flows. In practical applications, the image materials constituting the magic table are, for example, various special-effect stickers and special-effect setting parameters. Depending on actual requirements, the special-effect setting parameters include: the display order and duration of the special effects, the sticker position, the sticker shape and size, the font type, and so on.
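An illustrative material package for one magic table effect, showing the special-effect setting parameters just listed (display order, duration, sticker position and size, font), might look like the following. The field names and values are assumptions made for this sketch only.

```python
# Hypothetical on-disk/in-memory shape of a magic table material packet.
magic_table_package = {
    "effect_id": "magic_table_01",
    "image_materials": [
        {"file": "sparkle.png", "order": 1, "duration_ms": 800,
         "position": (0.5, 0.3), "size": (120, 120)},   # normalized position, pixel size
        {"file": "cat_ears.png", "order": 2, "duration_ms": 1500,
         "position": (0.5, 0.1), "size": (200, 90)},
    ],
    "text_materials": [
        {"text": "No shooting object detected", "font": "sans-serif"},
    ],
    # matching relationship between materials and image processing flows
    "matching": {"image_materials": ["preview", "backend"],
                 "text_materials": ["preview"]},
}
```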
Optionally, before 201, a shooting interface containing the shooting object is displayed, where the shooting interface includes a plurality of effect controls, for example, magic table selection controls corresponding to a plurality of magic tables. The effect controls may float above the shooting interface or be placed in a designated area of it, and their layout may be set according to the shooting interface. For example, a plurality of magic table selection controls may be evoked by a magic table function wake-up control and laid out in a designated area, such as the lower half of the shooting interface. As another example, the magic table selection controls may be arranged on the right or left side of the interface according to the user's habits (such as left- or right-handed use), facilitating one-handed operation; or they may be divided into two groups placed on the left and right sides of the interface, facilitating two-handed operation. It is to be understood that the layout of the effect controls in the shooting interface is not limited to the above examples.
Based on this, in 201, in response to a selection instruction for a target effect, an alternative embodiment of obtaining material data corresponding to the target effect may be implemented as: receiving a selection instruction of a target effect control in a plurality of effect controls; and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
In the above steps, it is assumed that the material data corresponding to the various effects are magic table material packets corresponding to various magic tables. Based on this assumption, an operation instruction from the user is detected, and it is determined from the instruction that the user has selected a target magic table selection control among the plurality of magic table selection controls. In response to the detected operation instruction, the magic table material packet corresponding to the target magic table selection control is acquired. A magic table material packet may be selected and downloaded by the user to update the client, or may be pre-stored on the client; for example, a plurality of magic table material packets may be downloaded to the client as part of the update flow at client start-up.
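The two acquisition paths above (pre-stored packets versus on-demand download) can be sketched as a simple cache lookup. `download_packet` is a stand-in for the real network call; all names are hypothetical.

```python
def get_material_packet(effect_id, local_cache, download_packet):
    """Return the packet from the client-side store, downloading it on a miss."""
    if effect_id in local_cache:
        return local_cache[effect_id]       # pre-stored on the client
    packet = download_packet(effect_id)     # e.g. fetched as part of client start-up update
    local_cache[effect_id] = packet         # store so later selections hit the cache
    return packet
```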
Further, in 202, an initial image including a subject is acquired in response to a photographing instruction.
The initial image may be a picture, a video stream, or another form, which is not limited in this disclosure. The shooting object contained in the initial image may be determined by the shooting scene: for example, people, pets, or landscapes in daily life; anchors and spectators in live-streaming scenes; athletes, spectators, and referees in sporting events; or actors, anchors, and spectators in entertainment performances. Of course, in addition to the people in the above examples, the present disclosure may also take scene elements in various scenes as shooting objects. The scene elements are, for example, commodities and the live-scene panorama in live-streaming scenes, or stage scenery elements in entertainment performances.
It should be noted that there are various ways of selecting the shooting object. The shooting object to be displayed may be selected in the following ways:
In an alternative embodiment, the shooting object in the shooting interface may be selected manually, that is, the shooting object is selected from a plurality of candidate objects. For example, one or more candidate objects in the shooting interface are identified and marked with bounding boxes, and the user clicks the box corresponding to the desired shooting object. Optionally, visual features of the candidate object may also be entered in advance; for example, before recording a magic table effect, the user's visual features are entered, such as a facial image or a clothing image of the user.
In another alternative embodiment, the shooting device may be moved so that the candidate object moves to a specified position in the shooting interface, thereby identifying the shooting object. For example, when the focus mode is set to center focus, the candidate object at the center of the shooting interface is taken as the shooting object.
After the shooting object is selected in the above manner, in 202, it may be detected whether a shooting instruction is received, and after the shooting instruction is received, the initial image is acquired by an image acquisition sensor. In practical applications, the shooting instruction may be a selection instruction on a shooting control, may be generated after the shooting object is recognized as performing a specified trigger action, or may be implemented in other forms, which is not limited in this disclosure. Taking the magic table shooting scene shown in fig. 3 as an example, according to the prompt text "nod to aim and fire", if a nod of the shooting object is recognized, acquisition of the initial image containing the shooting object may be triggered.
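The two trigger paths just described (a tap on the shooting control, or a recognized trigger action such as a nod) can be sketched as one predicate. The gesture label is an assumed output of some upstream action detector, not part of the patent.

```python
def should_capture(control_tapped, detected_action=None, trigger_action="nod"):
    """Return True when either trigger path fires, starting image acquisition."""
    # Path 1: the user tapped the shooting control.
    # Path 2: the shooting object performed the specified trigger action.
    return control_tapped or detected_action == trigger_action
```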
After the initial image is acquired through the above steps, in 203, the target materials for processing the initial image in the current image processing flow are determined from the materials according to the matching relationship. In 204, the initial image is processed with the target materials to obtain a target image corresponding to the image processing flow.
In the above steps, it is assumed that the material data further includes a process identifier for matching the material, where the process identifier is used to indicate an image processing process matched with the material.
Based on the above assumption, in 203, an alternative embodiment of determining the target materials used for processing the initial image in the current image processing flow according to the matching relationship is specifically: determining the image processing flow in which the client is currently located; and selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials. In this way, the materials required by different image processing flows can be effectively distinguished through the flow identifiers, so that the output images meet the actual requirements of the different flows, improving image processing efficiency.
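A minimal sketch of this selection step, assuming each material is a plain record with a `flows` field holding its flow identifiers (an assumed representation):

```python
def select_target_materials(materials, current_flow):
    """Filter the materials down to those whose flow identifiers include current_flow."""
    return [m for m in materials if current_flow in m["flows"]]
```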
Continuing the above assumption, it is further assumed that the materials include image materials for constituting the target image effect and text materials for interacting with the shooting object. Based on this, optionally, the matching relationship between the materials and the image processing flows may be implemented as: the image materials are matched with both the back-end processing flow and the preview flow, while the text materials are matched with the preview flow only. For example, assume that the material data is a magic table material packet corresponding to a magic table. On this assumption, the magic table material packet includes, for example: the image materials and text materials constituting the magic table; the preview flow identifier and the back-end processing flow identifier (namely, the flow identifiers) matched with the image materials; and the preview flow identifier (namely, the flow identifier) matched with the text materials. Through this matching relationship, different materials can be applied in different image processing flows, so that the output images meet the actual requirements of the different flows, improving image processing efficiency and image display effect.
There are various ways of determining the image processing flow. The image processing flow in which the client is currently located may be determined in the following ways:
In an alternative embodiment, the image processing flow in which the client is currently located is identified according to the most recent operation instruction. For example, if the most recent operation instruction is an instruction to jump to the shooting interface, it may be determined that the client is currently in the preview flow; if the most recent operation instruction is a shooting instruction for the shooting object, it may be determined that the client is currently in the back-end processing flow. In another alternative embodiment, the execution order of the preview flow and the back-end processing flow may be preset, and the flow in which the client is currently located is determined according to that order.
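The first option above, inferring the current flow from the most recent operation instruction, can be sketched as follows. The instruction names are illustrative stand-ins for the client's real event types.

```python
def current_flow(last_instruction):
    """Map the most recent operation instruction to the flow the client is in."""
    if last_instruction == "enter_shooting_interface":
        return "preview"   # preview flow auto-starts on entering the shooting interface
    if last_instruction == "shoot":
        return "backend"   # shooting hands the image to back-end processing
    return None            # unknown instruction: the flow cannot be inferred this way
```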
Further, the target material may be selected from the materials by:
In an optional embodiment, after the image processing flow in which the client is currently located is determined, if the current flow is the back-end processing flow, the image materials whose flow identifier indicates the back-end processing flow are determined as the target materials in 203. Further, in 204, an optional way of processing the initial image with the target materials to obtain the target image corresponding to the image processing flow is: rendering the image materials into the initial image to obtain a first image containing the target image effect. Taking a magic table as an example, the image materials constituting the magic table include, for example: sparkle effects, props, and beauty stickers. Based on this, in 204, the sparkle effects, props, and beauty stickers are rendered into the initial image to obtain a first image containing the magic table. In this way, the target image obtained through the back-end processing flow does not contain the text materials, preventing the effects corresponding to the text materials from affecting the image display.
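The back-end branch above can be sketched by modeling the image as a list of named layers (an illustrative simplification): only image materials are composited, and text materials are excluded so they cannot leak into the saved result.

```python
def render_backend(initial_image_layers, materials):
    """Composite image materials only, producing the first image (no text effects)."""
    image_layers = [m["name"] for m in materials if m["kind"] == "image"]
    return initial_image_layers + image_layers
```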
Optionally, after the first image containing the target image effect is obtained, an editing interface and/or a publishing interface containing the first image may also be displayed. The editing interface is mainly used for editing the first image; for example, the first image containing the magic table effect may be imported into the editing interface shown in fig. 4. Editing operations on the first image include, for example: clipping, transcoding, watermarking, post-effect editing, and adding background music. The publishing interface is mainly used for publishing the first image; for example, the first image containing the magic table effect may be imported into the publishing interface shown in fig. 5 for publishing operations, such as editing the accompanying text, selecting the current location, choosing whether to publish publicly, and other settings.
In another alternative embodiment, in 203, after the image processing flow in which the client is currently located is determined, if the current flow is the preview flow, the image materials and text materials whose flow identifier indicates the preview flow are determined as the target materials. Further, in 204, an optional way of processing the initial image with the target materials to obtain the target image corresponding to the image processing flow is: rendering the image materials and the text materials into the initial image to obtain a second image containing the target image effect and the text effect. Taking a magic table as an example, the image materials constituting the magic table include, for example: sparkle effects, props, and beauty stickers. Based on this, in 204, the sparkle effects, props, beauty stickers, and text materials are rendered into the initial image to obtain a second image containing the magic table and the text effect. In this way, the target image obtained through the preview flow contains the text materials, so the user can be prompted to perform related operations through the corresponding text effects, improving the interactive experience and the image display effect.
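The preview branch can be sketched with the same layer model: both image and text materials are composited, so the preview shows the effect plus the interactive hint text.

```python
def render_preview(initial_image_layers, materials):
    """Composite both image and text materials, producing the second image."""
    return initial_image_layers + [m["name"] for m in materials
                                   if m["kind"] in ("image", "text")]
```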
Of course, in practical application, the second image may be rendered based on the first image. In an alternative embodiment, after the first image is obtained, if it is detected that the current image processing flow is switched to the preview flow, the text material is rendered into the first image, so as to obtain a second image including the target image effect and the text effect.
In practical applications, optionally, a plurality of text materials corresponding to the magic table may be set, so that the scene of the shooting object can be recognized and the text material matching that scene selected. For example, in the shooting scene shown in fig. 6, if no shooting object is recognized, the text material "no person detected" may be selected. Optionally, a reference shooting action may also be prompted to the shooting object through the text effect corresponding to a text material; for example, in the shooting scene shown in fig. 6, the text effect paired with the cat-face effect may guide the user to turn in coordination with the cat ears.
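Choosing among several text materials by recognized scene can be sketched as a lookup table. The scene labels and prompt strings here are assumptions for illustration, standing in for the outputs of a real scene recognizer and for localized material text.

```python
def pick_text_material(scene):
    """Return the text material matched with the recognized scene, or "" if none applies."""
    prompts = {
        "no_subject": "No person detected",
        "face_found": "Turn your head to match the cat ears",
    }
    return prompts.get(scene, "")
```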
Optionally, after obtaining the second image including the target image effect and the text effect, a photographing interface including the second image may also be displayed. In this way, the target image effect and the text effect in the second image can be synchronously previewed in the shooting interface.
In the image processing method shown in fig. 2, the materials required by different image processing flows can be clearly distinguished through the matching relationship between the materials and the image processing flows, realizing a new image processing mode in which different materials are applied in different flows. Different image processing flows can thus output images with different effects, matching the output requirements of each flow, which greatly improves image processing efficiency and image display effect. In addition, materials such as prompt information can be updated into the material data according to their respective matched image processing flows, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
Fig. 7 is an image processing apparatus according to an embodiment of the present disclosure. As shown in fig. 7, wherein the image processing apparatus includes:
a first obtaining module 701, configured to obtain, in response to a selection instruction for a target effect, material data corresponding to the target effect, where the material data includes a material and a matching relationship between the material and an image processing flow;
a second acquisition module 702 configured to acquire an initial image containing a subject in response to a photographing instruction;
a determining module 703 configured to determine, from the materials according to the matching relationship, the target materials used for processing the initial image in the image processing flow currently in progress;
the image processing module 704 is configured to process the initial image by using the target material to obtain a target image corresponding to the image processing flow.
Optionally, the material data further includes a process identifier for matching the material, where the process identifier is used to indicate an image processing process matched with the material.
Optionally, the determining module 703, when determining, according to the matching relationship, the target material for processing the initial image in the current image processing flow, is specifically configured to:
determining the image processing flow in which the client is currently located; and selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials.
Optionally, the material includes image material for constituting a target image effect, and text material for interacting with a photographic subject.
The matching relation between the materials and the image processing flow comprises the following steps: the image material is matched with the back-end processing flow and the preview flow, and the text material is matched with the preview flow.
Optionally, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module 703 is specifically configured to: if the current image processing flow is the back-end processing flow, determine the image materials whose flow identifier indicates the back-end processing flow as the target materials.
The image processing module 704, when processing the initial image by using the target material to obtain a target image corresponding to the image processing flow, is specifically configured to: and rendering the image material into the initial image to obtain a first image containing the target image effect.
Optionally, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module 703 is specifically configured to: if the current image processing flow is the preview flow, determine the image materials and text materials whose flow identifier indicates the preview flow as the target materials.
The image processing module 704, when processing the initial image by using the target material to obtain a target image corresponding to the image processing flow, is specifically configured to:
and rendering the image material and the text material into the initial image to obtain a second image containing the target image effect and the text material.
Optionally, the apparatus further comprises a display module specifically configured to: an editing interface and/or a publishing interface including the first image is displayed.
Optionally, the display module is further specifically configured to: a capture interface is displayed that includes a second image.
Optionally, the display module is further specifically configured to: and displaying a shooting interface containing a shooting object, wherein the shooting interface comprises a plurality of effect controls.
The first obtaining module 701, in response to a selection instruction for a target effect, obtains material data corresponding to the target effect, and is specifically configured to: receiving a selection instruction of a target effect control in a plurality of effect controls; and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
The image processing apparatus may perform the methods provided in the foregoing embodiments; for the parts of this embodiment that are not described in detail, reference may be made to the related descriptions of the foregoing embodiments, which are not repeated here.
In one possible design, the structure of the image processing apparatus may be implemented as an electronic device. As shown in fig. 8, the electronic device may include: a processor 21, and a memory 22. Wherein the memory 22 has stored thereon executable code which, when executed by the processor 21, at least enables the processor 21 to implement the image processing method as provided in the previous embodiments.
The electronic device may further include a communication interface 23 for communicating with other devices or a communication network.
In addition, the present disclosure also provides a computer-readable storage medium having executable code stored thereon, which, when executed by a processor of an electronic device, causes the processor to perform the methods provided in the foregoing embodiments. Alternatively, the computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided, including a computer program which, when executed by a processor, implements the methods provided in the foregoing embodiments. The computer program may run on a terminal or a server.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any adaptations, uses, or adaptations of the disclosure following the general principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (16)

1. An image processing method, comprising:
responding to a selection instruction of a target effect, and acquiring material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
Acquiring an initial image containing a shooting object in response to a shooting instruction;
determining target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation;
processing the initial image by adopting the target material to obtain a target image corresponding to the image processing flow;
the materials comprise image materials used for forming target image effects and text materials used for interacting with shooting objects;
the matching relation between the material and the image processing flow comprises the following steps:
the image material is matched with a back-end processing flow and a preview flow, and the text material is matched with the preview flow.
2. The method according to claim 1, wherein the material data further includes a process identifier for matching the material, and the process identifier is used for indicating an image processing process matched by the material;
determining target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation, wherein the target materials comprise:
determining the current image processing flow of the client;
and determining the material matched with the image processing flow from the materials according to the flow identification, and taking the material as the target material.
3. The method according to claim 2, wherein selecting, from the materials, materials matching a current target image processing procedure as the target materials according to the procedure identifier includes:
if the current image processing flow is the back-end processing flow, identifying the flow as the image material identified by the back-end processing flow, and determining the image material as the target material;
the step of processing the initial image by using the target material to obtain a target image corresponding to the image processing flow comprises the following steps:
and rendering the image material into the initial image to obtain a first image containing the target image effect.
4. The method according to claim 2, wherein selecting, from the image materials, materials matching a current target image processing procedure as the target materials according to the procedure identifier includes:
if the current image processing flow is the preview flow, identifying the flow as the image material and the text material identified by the preview flow, and determining the flow as the target material;
the step of processing the initial image by using the target material to obtain a target image corresponding to the image processing flow comprises the following steps:
And rendering the image material and the text material into the initial image to obtain a second image containing the target image effect and the text material.
5. A method according to claim 3, further comprising: displaying an editing interface and/or a publishing interface containing the first image.
6. The method as recited in claim 4, further comprising: and displaying a shooting interface containing the second image.
7. The method as recited in claim 1, further comprising:
displaying a shooting interface containing the shooting object, wherein the shooting interface comprises a plurality of effect controls;
the responding to the selection instruction of the target effect, obtaining the material data corresponding to the target effect, comprises the following steps:
receiving a selection instruction of a target effect control in the effect controls;
and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
8. An image processing apparatus, comprising:
the first acquisition module is configured to respond to a selection instruction of a target effect and acquire material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
A second acquisition module configured to acquire an initial image containing a photographic subject in response to a photographic instruction;
the determining module is configured to determine target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation;
the image processing module is configured to process the initial image by adopting the target material to obtain a target image corresponding to the image processing flow;
the materials comprise image materials used for forming target image effects and text materials used for interacting with shooting objects;
the matching relation between the material and the image processing flow comprises the following steps:
the image material is matched with a back-end processing flow and a preview flow, and the text material is matched with the preview flow.
9. The apparatus of claim 8, wherein the material data further includes a process identifier for matching the material, the process identifier being used to indicate an image processing process for which the material is matched;
the determination module is specifically configured to:
determining the current image processing flow of the client;
and determining the material matched with the image processing flow from the materials according to the flow identification, and taking the material as the target material.
10. The apparatus of claim 9, wherein the determining module is specifically configured to, when selecting, from the materials, a material matching a current target image processing procedure as the target material according to the procedure identifier:
if the current image processing flow is the back-end processing flow, identifying the flow as the image material identified by the back-end processing flow, and determining the image material as the target material;
the image processing module processes the initial image by adopting the target material, and is specifically configured to:
and rendering the image material into the initial image to obtain a first image containing the target image effect.
11. The apparatus of claim 9, wherein the determining module is specifically configured to, when selecting, from the materials, a material matching a current target image processing procedure as the target material according to the procedure identifier:
if the current image processing flow is the preview flow, identifying the flow as the image material and the text material identified by the preview flow, and determining the flow as the target material;
The image processing module processes the initial image by adopting the target material, and is specifically configured to:
and rendering the image material and the text material into the initial image to obtain a second image containing the target image effect and the text material.
12. The apparatus of claim 10, further comprising a display module configured to display an editing interface and/or a publishing interface containing the first image.
13. The apparatus of claim 11, further comprising a display module configured to display a capture interface containing the second image.
14. The apparatus of claim 8, further comprising a display module configured to
Displaying a shooting interface containing the shooting object, wherein the shooting interface comprises a plurality of effect controls;
the first obtaining module is specifically configured to, when responding to a selection instruction of a target effect and obtaining material data corresponding to the target effect:
receiving a selection instruction of a target effect control in the effect controls;
and responding to the received selection instruction, and acquiring material data corresponding to the target effect control.
15. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of any of claims 1 to 7.
16. A computer readable storage medium, which when executed by an electronic device, causes the electronic device to perform the image processing method of any of claims 1 to 7.
CN202111481292.1A 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium Active CN114390193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111481292.1A CN114390193B (en) 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111481292.1A CN114390193B (en) 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114390193A CN114390193A (en) 2022-04-22
CN114390193B true CN114390193B (en) 2023-12-19

Family

ID=81195646

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111481292.1A Active CN114390193B (en) 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114390193B (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979471B (en) * 2022-05-12 2023-10-10 北京达佳互联信息技术有限公司 Interface display method, device, electronic equipment and computer readable storage medium
WO2024001513A1 (en) * 2022-06-28 2024-01-04 北京字跳网络技术有限公司 Content photographing method and apparatus, device, and storage medium
CN115291772A (en) * 2022-08-16 2022-11-04 北京字跳网络技术有限公司 Page display method, device, equipment and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017016030A1 (en) * 2015-07-30 2017-02-02 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN107635104A (en) * 2017-08-11 2018-01-26 光锐恒宇(北京)科技有限公司 A kind of method and apparatus of special display effect in the application
CN111899192A (en) * 2020-07-23 2020-11-06 北京字节跳动网络技术有限公司 Interaction method, interaction device, electronic equipment and computer-readable storage medium


Also Published As

Publication number Publication date
CN114390193A (en) 2022-04-22

Similar Documents

Publication Publication Date Title
CN114390193B (en) Image processing method, device, electronic equipment and storage medium
CN109889914B (en) Video picture pushing method and device, computer equipment and storage medium
CN108377334B (en) Short video shooting method and device and electronic terminal
CN108986192B (en) Data processing method and device for live broadcast
CN106730815B (en) Somatosensory interaction method and system easy to realize
WO2015001437A1 (en) Image processing method and apparatus, and electronic device
CN112905074B (en) Interactive interface display method, interactive interface generation method and device and electronic equipment
CN106792147A (en) A kind of image replacement method and device
CN113507621A (en) Live broadcast method, device, system, computer equipment and storage medium
CN109327737A (en) TV programme suggesting method, terminal, system and storage medium
JP2017064161A (en) Game system, imaging apparatus, game apparatus, and program
CN110162667A (en) Video generation method, device and storage medium
WO2016202024A1 (en) 3d animation presentation method and device
WO2019062571A1 (en) Dynamic image synthesis method and device, terminal and storage medium
US9881086B2 (en) Image shooting device, image shooting method, and recording medium
CN113709545A (en) Video processing method and device, computer equipment and storage medium
CN113392690A (en) Video semantic annotation method, device, equipment and storage medium
CN115362474A (en) Scoods and hairstyles in modifiable video for custom multimedia messaging applications
CN114598819A (en) Video recording method and device and electronic equipment
KR20190133211A (en) Server device, and computer program used for it
KR20130142315A (en) Character service system and character service providing method thereof
CN113989424A (en) Three-dimensional virtual image generation method and device and electronic equipment
CN116962748A (en) Live video image rendering method and device and live video system
CN114612637A (en) Scene picture display method and device, computer equipment and storage medium
KR20160128900A (en) Method and apparatus for generating moving photograph based on moving effect

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant