CN114390193A - Image processing method, image processing device, electronic equipment and storage medium - Google Patents


Info

Publication number
CN114390193A
Authority
CN
China
Prior art keywords
image
image processing
target
flow
materials
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111481292.1A
Other languages
Chinese (zh)
Other versions
CN114390193B (en)
Inventor
赵伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Dajia Internet Information Technology Co Ltd
Original Assignee
Beijing Dajia Internet Information Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Dajia Internet Information Technology Co Ltd
Priority to CN202111481292.1A
Publication of CN114390193A
Application granted
Publication of CN114390193B
Legal status: Active


Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/04845Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4312Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44012Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving rendering scenes according to scene graphs, e.g. MPEG-4 scene graphs

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The disclosure provides an image processing method, an image processing apparatus, an electronic device, and a storage medium. The method includes: in response to a selection instruction for a target effect, acquiring material data corresponding to the target effect, where the material data includes materials and a matching relationship between the materials and image processing flows; acquiring an initial image containing a photographic subject in response to a photographing instruction; determining, from the materials according to the matching relationship, target materials used for processing the initial image in the current image processing flow; and processing the initial image with the target materials to obtain a target image corresponding to that image processing flow. By applying different materials to different image processing flows through the matching relationship between materials and flows, the output image meets the actual requirements of each flow, which greatly improves image processing efficiency and the image display effect.

Description

Image processing method, image processing device, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image processing method and apparatus, an electronic device, and a storage medium.
Background
In order to enrich the final visual appearance of image works such as short videos or sticker packs, a user can add various image effects to the work during shooting, for example a finger-heart effect or an open-mouth bubble-blowing effect.
In the related art, prompt information is added to the video shooting interface so that the user triggers the corresponding image effect when shooting and is guided to strike a pose or expression matched with the effect. Taking magic expressions (referred to in this disclosure as "magic watch" for short) as an example, the prompt information is usually bound to the magic watch material and delivered to the client together with that material, so that when the client records a short video, the prompt information is also recorded into the finally displayed short video, affecting both the display effect of the short video and image processing efficiency.
Disclosure of Invention
The present disclosure provides an image processing method, an image processing apparatus, an electronic device, and a storage medium, which at least solve the problem in the related art that prompt information interferes with image capturing, improve image processing efficiency, and improve the image display effect.
According to a first aspect of embodiments of the present disclosure, the present disclosure provides an image processing method, including:
in response to a selection instruction for a target effect, acquiring material data corresponding to the target effect, where the material data includes materials and a matching relationship between the materials and image processing flows;
acquiring an initial image containing a photographic subject in response to a photographing instruction;
determining, from the materials according to the matching relationship, target materials used for processing the initial image in the current image processing flow;
and processing the initial image with the target materials to obtain a target image corresponding to the image processing flow.
In an optional embodiment, the material data further includes flow identifiers matched with the materials, where a flow identifier indicates the image processing flow to which a material is matched.
Determining, from the materials according to the matching relationship, the target materials used for processing the initial image in the current image processing flow includes: determining the image processing flow in which the client is currently located; and determining, from the materials according to the flow identifiers, the materials matched with that image processing flow as the target materials.
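The selection step described above can be sketched as a simple filter over the material list, keyed on each material's flow identifier. This is an illustrative reconstruction, not the patent's actual implementation; the field names (`flow_ids`, `kind`) and flow names are assumptions.

```python
# Hypothetical sketch of selecting target materials by flow identifier.
# The field names (flow_ids, kind) and flow names are assumptions, not
# taken from the patent itself.

PREVIEW = "preview"
BACKEND = "backend"

def select_target_materials(materials, current_flow):
    """Return the materials whose flow identifiers match the current flow."""
    return [m for m in materials if current_flow in m["flow_ids"]]

materials = [
    {"name": "heart_sticker", "kind": "image", "flow_ids": {PREVIEW, BACKEND}},
    {"name": "invite_prompt", "kind": "text",  "flow_ids": {PREVIEW}},
]

# Back-end flow: only the image material is selected.
backend_targets = select_target_materials(materials, BACKEND)
# Preview flow: both the image and the text material are selected.
preview_targets = select_target_materials(materials, PREVIEW)
```

Because the filter only reads the flow identifier carried in the material data, new prompt materials can be shipped with the effect itself rather than baked into the client.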
In an alternative embodiment, the materials include image materials for constituting the target image effect and text materials for interacting with the photographic subject.
The matching relationship between the materials and the image processing flows includes: the image materials are matched with the back-end processing flow and the preview flow, and the text materials are matched with the preview flow.
In an optional embodiment, selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials includes: if the current image processing flow is the back-end processing flow, determining the image materials whose flow identifiers indicate the back-end processing flow as the target materials.
Processing the initial image with the target materials to obtain the target image corresponding to the image processing flow includes: rendering the image materials into the initial image to obtain a first image containing the target image effect.
In an optional embodiment, selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials includes: if the current image processing flow is the preview flow, determining the image materials and text materials whose flow identifiers indicate the preview flow as the target materials.
Processing the initial image with the target materials to obtain the target image corresponding to the image processing flow includes: rendering the image materials and the text materials into the initial image to obtain a second image containing both the target image effect and the text materials.
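The two rendering branches can be illustrated with a minimal sketch in which "rendering" is reduced to compositing named layers onto the initial image; a real client would draw sticker and text layers onto pixel data. All names here are assumptions for illustration.

```python
# Minimal illustration of the two rendering branches. "Rendering" is
# reduced to appending layer names to a list; a real client would draw
# sticker and text layers onto the image pixels. All names are assumed.

def render(initial_image, target_materials):
    """Composite each target material onto the image as a named layer."""
    layers = list(initial_image)
    for m in target_materials:
        layers.append(m["name"])
    return layers

initial = ["camera_frame"]

# Back-end flow: image materials only -> first image (no text overlay),
# safe to pass on to the editing and publishing flows.
first_image = render(initial, [{"name": "heart_sticker", "kind": "image"}])

# Preview flow: image + text materials -> second image (with the prompt),
# shown to the user while shooting.
second_image = render(initial, [
    {"name": "heart_sticker", "kind": "image"},
    {"name": "invite_prompt", "kind": "text"},
])
```

The key point the sketch captures is that the text layer exists only in the preview output, so it can never be recorded into the published work.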
In an alternative embodiment, an editing interface and/or a publishing interface is displayed that includes the first image.
In an alternative embodiment, a capture interface is displayed that includes the second image.
In an optional embodiment, a shooting interface containing a photographic subject is displayed, where the shooting interface includes a plurality of effect controls.
Acquiring the material data corresponding to the target effect in response to a selection instruction for the target effect includes: receiving a selection instruction for a target effect control among the plurality of effect controls; and acquiring, in response to the received selection instruction, the material data corresponding to the target effect control.
According to a second aspect of the embodiments of the present disclosure, there is provided an image processing apparatus comprising:
a first acquisition module configured to acquire, in response to a selection instruction for a target effect, material data corresponding to the target effect, where the material data includes materials and a matching relationship between the materials and image processing flows;
a second acquisition module configured to acquire an initial image containing a photographic subject in response to a photographing instruction;
a determining module configured to determine, from the materials according to the matching relationship, target materials used for processing the initial image in the current image processing flow;
and an image processing module configured to process the initial image with the target materials to obtain a target image corresponding to the image processing flow.
In an optional embodiment, the material data further includes flow identifiers matched with the materials, where a flow identifier indicates the image processing flow to which a material is matched.
The determining module is specifically configured to: determine the image processing flow in which the client is currently located; and determine, from the materials according to the flow identifiers, the materials matched with that image processing flow as the target materials.
In an alternative embodiment, the materials include image materials for constituting the target image effect and text materials for interacting with the photographic subject.
The matching relationship between the materials and the image processing flows includes: the image materials are matched with the back-end processing flow and the preview flow, and the text materials are matched with the preview flow.
In an optional embodiment, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module is specifically configured to: if the current image processing flow is the back-end processing flow, determine the image materials whose flow identifiers indicate the back-end processing flow as the target materials.
When processing the initial image with the target materials to obtain the target image corresponding to the image processing flow, the image processing module is specifically configured to: render the image materials into the initial image to obtain a first image containing the target image effect.
In an optional embodiment, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module is specifically configured to: if the current image processing flow is the preview flow, determine the image materials and text materials whose flow identifiers indicate the preview flow as the target materials.
When processing the initial image with the target materials to obtain the target image corresponding to the image processing flow, the image processing module is specifically configured to: render the image materials and the text materials into the initial image to obtain a second image containing both the target image effect and the text materials.
In an optional embodiment, the display module is further configured to display an editing interface and/or a publishing interface including the first image.
In an optional embodiment, the display module is further configured to display a shooting interface including the second image.
In an optional embodiment, the apparatus further includes a display module configured to display a shooting interface containing a photographic subject, where the shooting interface includes a plurality of effect controls.
When acquiring the material data corresponding to the target effect in response to the selection instruction for the target effect, the first acquisition module is specifically configured to: receive a selection instruction for a target effect control among the plurality of effect controls; and acquire, in response to the received selection instruction, the material data corresponding to the target effect control.
According to a third aspect of the embodiments of the present disclosure, there is provided an electronic device, which includes a processor and a memory, where the memory stores executable code that, when executed by the processor, causes the processor to implement at least the image processing method of the first aspect.
According to a fourth aspect of embodiments of the present disclosure, there is provided a computer-readable storage medium, in which instructions, when executed by an electronic device, enable the electronic device to perform at least the image processing method of the first aspect.
According to a fifth aspect of embodiments of the present disclosure, there is provided a computer program product comprising a computer program which, when executed by a processor, implements the image processing method of the first aspect.
The technical scheme provided by the embodiment of the disclosure at least brings the following beneficial effects:
in the disclosure, in response to a selection instruction for a target effect, material data corresponding to the target effect is acquired, where the material data includes materials and a matching relationship between the materials and image processing flows. Further, when an initial image containing the photographic subject is acquired in response to a photographing instruction, the target materials for processing the initial image in the current image processing flow can be determined from the materials according to the matching relationship. The initial image is then processed with the target materials to obtain the target image corresponding to that flow, so that each image processing flow applies its corresponding materials and the output image meets the actual requirements of the current flow (such as editing, publishing, or previewing). Because the materials required by different image processing flows are clearly distinguished through the matching relationship, materials such as prompt information are no longer recorded into the finally displayed image, which avoids degrading the display effect, greatly improves image processing efficiency, and improves the image display effect. In addition, materials such as prompt information can be updated in the material data according to their matched image processing flows, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure and are not to be construed as limiting the disclosure.
Fig. 1 is a schematic view of a photographing interface according to a related art.
FIG. 2 is a flow diagram illustrating an image processing method according to an exemplary embodiment.
FIG. 3 is a schematic diagram illustrating a capture interface in accordance with an exemplary embodiment.
FIG. 4 is a schematic diagram illustrating an editing interface in accordance with an exemplary embodiment.
FIG. 5 is a diagram illustrating a publication interface, according to an example embodiment.
FIG. 6 is a schematic diagram illustrating a capture interface in accordance with an exemplary embodiment.
Fig. 7 is a schematic configuration diagram illustrating an image processing apparatus according to an exemplary embodiment.
Fig. 8 is a schematic structural diagram of an electronic device according to an exemplary embodiment.
Detailed Description
In order to make the technical solutions of the present disclosure better understood by those of ordinary skill in the art, the technical solutions in the embodiments of the present disclosure will be clearly and completely described below with reference to the accompanying drawings.
It should be noted that the terms "first," "second," and the like in the description and claims of the present disclosure and in the above-described drawings are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used is interchangeable under appropriate circumstances such that the embodiments of the disclosure described herein are capable of operation in sequences other than those illustrated or otherwise described herein. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
As described above, in the related art, prompt information is added to the video shooting interface so that the user triggers the corresponding image effect when shooting and is guided to strike a pose or expression matched with the effect. Taking the shooting interface shown in fig. 1 as an example, a rock-paper-scissors magic watch can be added when shooting a short video, and a prompt message such as "invite your friends to play together" can be displayed to prompt the user to invite others to participate in the shooting. In the present disclosure, a magic watch may also be referred to as a magic expression, a magic special effect, a special effect, or the like.
However, in one related art, the prompt information is bound to the magic watch material and delivered to the client together with that material, so that when the client records a short video, the prompt information is also recorded into the finally displayed short video, affecting the display effect of the short video and image processing efficiency.
In another related art, when the client displays the magic watch preview effect, the prompt information can be displayed on a layer placed above the layer where the video stream is located, preventing the prompt information from being recorded into the final short video. However, these prompt messages are fixed strings pre-built into the client, such as "no face recognized" or "please tap the head". They are limited in number and can hardly keep up with magic watch materials, which are updated frequently. Moreover, because the prompt messages are only updated together with the client version, updating them in the client is difficult and maintenance efficiency is low.
To solve at least one technical problem in the related art, the present disclosure provides an image processing method, an apparatus, an electronic device, and a storage medium.
The core idea of the technical solution is as follows: in response to a selection instruction for a target effect, material data corresponding to the target effect is acquired, where the material data includes materials and a matching relationship between the materials and image processing flows. When an initial image containing the photographic subject is acquired in response to a photographing instruction, the target materials for processing the initial image in the current image processing flow can then be determined from the materials according to the matching relationship. The initial image is processed with the target materials to obtain the target image corresponding to that flow, so that each image processing flow applies its corresponding materials and the output image meets the actual requirements of the different flows (such as editing, publishing, or previewing). In this solution, the materials required by different image processing flows are clearly distinguished through the matching relationship, so that materials such as prompt information are not recorded into the finally displayed image, which avoids degrading the image display effect, greatly improves image processing efficiency, and improves the display effect. In addition, materials such as prompt information can be updated in the material data according to their matched image processing flows, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
In combination with specific application scenarios, the actual requirements of different image processing flows can be understood as the effects that each flow needs to output. In practical applications, these effects include image effects and text effects.
For example, assume that the materials include image materials for constituting the target image effect and text materials for interacting with the photographic subject, where the text materials include the prompt information described above. Assume that the preview flow needs to present to the user both the target image effect and the text materials for interacting with the user (i.e., the photographic subject), while the images output by the back-end processing flow are mainly used for the subsequent editing and publishing flows, so displaying text materials there must be avoided.
Under these assumptions, the target image corresponding to the back-end processing flow (referred to in this disclosure as the first image for distinction) may be an image that contains the target image effect but carries no text effect, so that the first image can be applied directly to subsequent editing and publishing. The target image corresponding to the preview flow (referred to as the second image) may contain both the target image effect and the text effect. The second image output by the preview flow thus lets the user adjust the shooting state, such as posture, expression, or the number of participants, according to the text materials while previewing the target effect, so as to achieve the target effect.
Based on the core ideas introduced in the foregoing, an embodiment of the present disclosure provides an image processing method, and fig. 2 is a flowchart illustrating the image processing method according to an exemplary embodiment of the present disclosure. As shown in fig. 2, the method includes:
201. in response to a selection instruction for a target effect, acquire material data corresponding to the target effect;
202. acquire an initial image containing a photographic subject in response to a photographing instruction;
203. determine, from the materials according to the matching relationship, the target materials used for processing the initial image in the current image processing flow;
204. process the initial image with the target materials to obtain a target image corresponding to the image processing flow.
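Steps 201 to 204 can be strung together as one hedged end-to-end sketch. The material-data layout, the flow names, and the toy "rendering" (appending layer names to a list) are illustrative assumptions, not the patent's implementation.

```python
# Hedged end-to-end sketch of steps 201-204. The material-data layout,
# flow names, and the toy "rendering" are assumptions for illustration.

def on_effect_selected(effect_library, target_effect):
    # 201: fetch material data (materials + flow matching) for the effect.
    return effect_library[target_effect]

def process(material_data, initial_image, current_flow):
    # 203: pick the materials matched to the current flow.
    targets = [m for m in material_data["materials"]
               if current_flow in m["flow_ids"]]
    # 204: "render" them onto the initial image (toy composition).
    return initial_image + [m["name"] for m in targets]

effect_library = {
    "bubble_magic": {
        "materials": [
            {"name": "bubble_sticker", "flow_ids": {"preview", "backend"}},
            {"name": "open_mouth_prompt", "flow_ids": {"preview"}},
        ],
    },
}

data = on_effect_selected(effect_library, "bubble_magic")
initial_image = ["camera_frame"]          # 202: captured initial image
preview_out = process(data, initial_image, "preview")
backend_out = process(data, initial_image, "backend")
```

Running both flows over the same initial image shows the intended divergence: the preview output carries the prompt, while the back-end output does not.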
According to the method, the materials required by different image processing flows are clearly distinguished through the matching relationship between materials and flows, realizing a new image processing mode in which different materials are applied in different flows. Different image processing flows can therefore output images with different image effects that match their respective output requirements, which greatly improves image processing efficiency and the image display effect. In addition, materials such as prompt information can be updated in the material data according to their matched image processing flows, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
In practical applications, each step of the method may be implemented by a single electronic device, which may be a terminal device such as a mobile phone, a smart band, a tablet computer, a PC, or a notebook computer. Taking a mobile phone as an example, the method can be implemented by a dedicated application installed on the phone, by a mini-program embedded in an instant messaging application or another type of application, or by a phone application calling a cloud server. The steps of the method can also be implemented by multiple electronic devices in cooperation. For example, a server may send an execution result to the terminal device, which renders and displays it. The server may be a physical server with an independent host, a virtual server borne by a host cluster, or a cloud server, which is not limited in the present disclosure.
The following describes each step in the image processing method with reference to a specific embodiment.
First, in 201, in response to a selection instruction for a target effect, material data corresponding to the target effect is obtained.
In the related art, a layer containing text materials can also be superimposed on the video recording interface, so that the text materials in that layer are not recorded into the short video. However, such text materials are usually preset in the client, so their settings or types can only be adjusted by updating the client; the operation is cumbersome, the available text materials are few, and they can hardly be updated in time to match various effects.
To address this problem, in this embodiment the material data corresponding to the target effect includes the materials and the matching relationship between the materials and the image processing flows. Through this matching relationship, materials corresponding to different image processing flows can be set in the material data according to each flow's output requirements, making the processing targeted, greatly reducing the difficulty of setting and updating materials such as text and image materials, and facilitating the targeted addition of different materials (such as different text materials) to various effects.
In practical applications, the image processing flows include, but are not limited to, a preview flow and a back-end processing flow. Different image processing flows may be executed sequentially or concurrently in different code branches. For example, the preview flow is triggered automatically after the shooting interface is entered, so that an image previewing the target effect is displayed in real time for the user. The back-end processing flow is triggered by the shooting control in the shooting interface: after image shooting is triggered, a processing flow that runs in the background of the client and performs effect rendering, encoding, and the like on the shot image is started, and this is regarded as the back-end processing flow.
Optionally, the materials include, but are not limited to: image materials for constituting a target image effect, and text materials for interacting with the photographic subject. In practical applications, a text material may be text information or an animation containing text information, which is not limited in this disclosure. Whatever its form, the core intent of a text material is to guide the photographic subject (such as the photographer or the person being photographed) to adjust the shooting state according to the content it indicates. Taking a magic expression as an example, the text material may indicate that no photographic subject is detected; in this case, the photographer may move the device so that the subject enters the view finder and the magic expression shooting can be completed. As another example, the text material may be an animation matched with the magic expression that demonstrates a reference shooting action, prompting the subject to complete the shooting according to that action.
Taking a magic expression as an example, its material data includes the image materials constituting the magic expression, the text materials related to it, the matching relationship between the image materials and the image processing flows, and the matching relationship between the text materials and the image processing flows. In practical applications, the image materials constituting the magic expression are, for example, various effect maps and effect setting parameters. Depending on actual requirements, the effect setting parameters include: effect display order, duration, map position, map shape and size, font type, and the like.
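To make the shape of such a material package concrete, it could be modeled as follows. This is a hypothetical sketch in Python — the patent prescribes no data format, and all field names, flow names, and example materials here are assumptions:

```python
from dataclasses import dataclass, field

# Flow identifiers used to match materials to image processing flows
# (the names are assumptions; the patent only requires a distinguishing mark).
PREVIEW_FLOW = "preview"
BACKEND_FLOW = "backend"

@dataclass
class Material:
    name: str
    kind: str                     # "image" or "text"
    flow_ids: list                # flows this material is matched with
    params: dict = field(default_factory=dict)  # display order, duration, map position, ...

# A hypothetical magic expression material package: image materials match
# both flows, text materials match only the preview flow.
magic_expression_package = [
    Material("flash", "image", [PREVIEW_FLOW, BACKEND_FLOW]),
    Material("prop", "image", [PREVIEW_FLOW, BACKEND_FLOW]),
    Material("makeup_sticker", "image", [PREVIEW_FLOW, BACKEND_FLOW]),
    Material("no_subject_hint", "text", [PREVIEW_FLOW],
             params={"text": "No shooting object detected"}),
]
```

Under this shape, the matching relationship of claim 3 is simply which flow identifiers each material carries.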
Optionally, before 201, a shooting interface containing the photographic subject is displayed, where the shooting interface includes a plurality of effect controls. For example, each magic expression corresponds to a magic expression selection control. The effect controls may float above the shooting interface or be arranged in a designated area of it, and their layout may be set according to the shooting interface. For example, a plurality of magic expression selection controls can be called up by a magic expression function wake-up control and arranged in a designated area, such as the lower half of the shooting interface. They may be arranged on the right or left side of the interface according to the user's habits (such as the dominant hand) for easy one-handed operation, or split into two groups arranged on the left and right sides for two-handed operation. Of course, the layout of the effect controls in the shooting interface is not limited to the above examples.
Based on this, in 201, in response to the selection instruction for the target effect, an optional embodiment of obtaining the material data corresponding to the target effect may be implemented as: receiving a selection instruction of a target effect control in a plurality of effect controls; and responding to the received selection instruction, and acquiring the material data corresponding to the target effect control.
In the above steps, assume that the material data corresponding to the multiple effects are magic expression material packages corresponding to multiple magic expressions. Based on this assumption, an operation instruction of the user is detected, and it is determined from the detected instruction that the user has selected a target magic expression selection control from the magic expression selection controls. In response to the detected instruction, the magic expression material package corresponding to the target control is obtained. The material packages may be downloaded and updated to the client after being selected by the user, or stored in the client in advance; for example, they may be downloaded to the client during the client's startup update process.
Further, in 202, in response to the photographing instruction, an initial image including the photographic subject is acquired.
The initial image may be a picture, a video stream, or another form, which is not limited in this disclosure. The photographic subject contained in the initial image may be determined by the shooting scene: for example, people, pets, or landscapes in daily life; anchors or audiences in live-streaming scenes; athletes, spectators, or referees in sporting events; actors, presenters, or audiences in entertainment shows. Of course, besides the moving persons in the above examples, the present disclosure may also set scene elements in various scenes as photographic subjects, such as commodities in a live-streaming scene, the panorama and scene elements of a live-streaming site, or the panorama and scene elements of a performance venue.
It should be noted that there are various ways to select the photographic subject, including the following:
in an optional embodiment, the photographic subject can be selected manually in the shooting interface, i.e., chosen from a plurality of candidate objects. For example, one or more candidate objects in the shooting interface are identified and framed with boxes, and the user taps the box corresponding to the desired subject. Optionally, the visual features of a candidate object can be entered in advance. For example, before a magic expression is recorded, the user's visual features are entered, such as an image of the user's face or clothing.
In another optional embodiment, the shooting device can be moved so that the candidate object moves to a specified position in the shooting interface, where it is identified as the photographic subject. For example, when the focus mode is center focus, the candidate object at the center of the shooting interface is set as the photographic subject.
After the photographic subject is selected in the above manner, in 202 it may be detected whether a shooting instruction is received, and after it is received, the initial image is acquired by the image sensor. In practical applications, the shooting instruction may be a selection instruction on the shooting control, may be generated after the photographic subject is recognized to perform a specified trigger action, or may take other forms, which is not limited in this disclosure. Taking the magic expression shooting scene shown in fig. 3 as an example, following the prompt "nod to aim and fire", if a nod of the photographic subject is recognized, acquisition of an initial image containing the subject can be triggered.
After the initial image is obtained through the above steps, in 203, the target materials for processing the initial image in the current image processing flow are determined from the materials according to the matching relationship. In 204, the initial image is processed with the target materials to obtain a target image corresponding to the image processing flow.
In the above steps, assume that the material data further includes flow identifiers matched with the materials, each flow identifier indicating the image processing flow matched with a material.
Based on this assumption, in 203, an optional embodiment of determining the target materials according to the matching relationship is: determining the image processing flow in which the client is currently located; and selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials. In this way, the materials required by different image processing flows can be effectively distinguished through the flow identifiers, the output image can meet the actual requirements of each flow, and image processing efficiency is improved.
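This selection step amounts to filtering the materials by their flow identifiers. The sketch below illustrates the idea; the dict shape and flow names are assumptions for illustration, not a prescribed format:

```python
def select_target_materials(materials, current_flow):
    """Return the materials whose flow identifiers match the current
    image processing flow (the selection step of 203)."""
    return [m for m in materials if current_flow in m["flow_ids"]]

materials = [
    {"name": "makeup_sticker", "kind": "image", "flow_ids": ["preview", "backend"]},
    {"name": "hint_text", "kind": "text", "flow_ids": ["preview"]},
]

# Back-end processing flow: only the image material is selected.
backend_targets = select_target_materials(materials, "backend")
# Preview flow: both the image material and the text material are selected.
preview_targets = select_target_materials(materials, "preview")
```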
Continuing the above assumption, further assume that the materials include image materials for constituting the target image effect and text materials for interacting with the photographic subject. Based on this, optionally, the matching relationship can be implemented as: the image materials are matched with both the back-end processing flow and the preview flow, while the text materials are matched with the preview flow only. For example, assume the material data is a magic expression material package. The package then includes: the image materials constituting the magic expression, the text materials, the preview flow identifier and back-end processing flow identifier matched with the image materials, and the preview flow identifier matched with the text materials (i.e., the flow identifiers described above). Through this matching relationship, different materials are applied in different image processing flows, so that the output image meets the actual requirements of each flow, improving both image processing efficiency and the image display effect.
First, there are various ways to determine the image processing flow in which the client is currently located, including the following:
in an optional embodiment, the image processing flow in which the client is currently located is identified according to the latest operation instruction. For example, if the latest operation instruction is an instruction to jump to the shooting interface, it may be determined that the client is currently in the preview flow; if the latest operation instruction is a shooting instruction for the photographic subject, it may be determined that the client is currently in the back-end processing flow. In another optional embodiment, the execution order of the preview flow and the back-end processing flow may be preset, and the current image processing flow determined from that order.
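The first of these embodiments — inferring the current flow from the latest operation instruction — can be sketched as a simple mapping. The instruction names below are assumptions:

```python
def infer_current_flow(latest_instruction):
    """Map the latest operation instruction to the image processing flow
    the client is currently in."""
    flow_by_instruction = {
        "jump_to_shooting_interface": "preview",  # entering the shooting interface
        "shoot": "backend",                       # shooting instruction for the subject
    }
    try:
        return flow_by_instruction[latest_instruction]
    except KeyError:
        raise ValueError(f"unrecognized operation instruction: {latest_instruction!r}")
```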
Furthermore, the target materials can be selected from the materials in the following ways:
in an optional embodiment, in 203, after the image processing flow in which the client is currently located is determined, if it is the back-end processing flow, the image materials whose flow identifier is the back-end processing flow identifier are determined as the target materials. Then, in 204, an optional way of processing the initial image with the target materials is: rendering the image materials into the initial image to obtain a first image containing the target image effect. Taking a magic expression as an example, the image materials constituting it include a flash effect, props, and makeup stickers. Based on this, in 204, the flash effect, props, and makeup stickers are rendered into the initial image to obtain a first image containing the magic expression. In this way, the target image obtained through the back-end processing flow contains no text materials, avoiding any influence of the corresponding text effects on image display.
Optionally, after the first image containing the target image effect is obtained, an editing interface and/or a publishing interface containing the first image may be displayed. The editing interface is mainly used for editing the first image. For example, the first image containing the magic expression effect may be imported into the editing interface shown in fig. 4. Editing operations on the first image include, for example: clipping, transcoding, watermarking, post-effect editing, and dubbing. The publishing interface is mainly used for publishing the first image. For example, the first image may be imported into the publishing interface shown in fig. 5 for publishing operations, such as editing the accompanying text, selecting the current location, choosing whether to publish publicly, and other settings.
In another optional embodiment, in 203, after the image processing flow in which the client is currently located is determined, if it is the preview flow, the image materials and text materials whose flow identifier is the preview flow identifier are determined as the target materials. Then, in 204, an optional way of processing the initial image with the target materials is: rendering the image materials and the text materials into the initial image to obtain a second image containing the target image effect and the text effect. Taking a magic expression as an example, the image materials constituting it include a flash effect, props, and makeup stickers. Based on this, in 204, the flash effect, props, makeup stickers, and text materials are rendered into the initial image to obtain a second image containing the magic expression and the text effect. In this way, the target image obtained through the preview flow contains the text materials, so the user can be prompted to perform related operations through the corresponding text effect, improving user interaction experience and the image display effect.
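The two branches of 204 can be sketched together as follows. Real effect rendering is left by the patent to the client's graphics stack, so rendering is mocked here by tagging a frame list; all names are assumptions:

```python
def process(initial_image, target_materials):
    """Render the target materials onto a copy of the initial image.
    Rendering is mocked: each material's name is appended to the frame."""
    image = list(initial_image)  # copy, so the initial frame stays untouched
    image.extend(m["name"] for m in target_materials)
    return image

frame = ["raw_frame"]
image_materials = [{"name": "flash"}, {"name": "prop"}, {"name": "makeup_sticker"}]
text_materials = [{"name": "hint_text"}]

# Back-end processing flow: the first image carries only the image effects.
first_image = process(frame, image_materials)
# Preview flow: the second image carries the image effects plus the text effect.
second_image = process(frame, image_materials + text_materials)
```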
Of course, in practical applications, the second image may also be rendered on the basis of the first image. In an optional embodiment, after the first image is obtained, if it is detected that the current image processing flow has switched to the preview flow, the text materials are rendered into the first image to obtain a second image containing the target image effect and the text effect.
In practical applications, optionally, a plurality of text materials corresponding to a magic expression can be set, so that the scene in which the photographic subject is located can be recognized and the text material matching that scene selected. For example, in the shooting scene shown in fig. 6, no photographic subject is recognized, so the text material "no human image detected" may be selected. Optionally, a reference shooting action can be prompted to the photographic subject through the text effect corresponding to a text material. For example, in the shooting scene shown in fig. 6, in cooperation with the cat-face map, the text effect can guide the user to rotate along with the cat ears.
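Scene-matched selection of a text material can be sketched as a lookup keyed by the recognized scene. The scene tags below are assumptions; the texts follow the examples above:

```python
def pick_text_material(text_materials, recognized_scene):
    """Return the text of the material matched to the recognized scene,
    or None if no material matches."""
    for material in text_materials:
        if material["scene"] == recognized_scene:
            return material["text"]
    return None

text_materials = [
    {"scene": "no_subject", "text": "No human image detected"},
    {"scene": "cat_face", "text": "Rotate along with the cat ears"},
]
```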
Optionally, after obtaining the second image containing the target image effect and the text effect, a shooting interface containing the second image may also be displayed. In this way, the target image effect and the text effect in the second image can be previewed synchronously in the shooting interface.
In the image processing method shown in fig. 2, the materials required by different image processing flows can be clearly distinguished through the matching relationship between the materials and the image processing flows. This realizes a new image processing mode in which different materials are applied in different flows, so that each flow outputs an image with the effects matching its output requirements, greatly improving image processing efficiency and the image display effect. In addition, materials such as prompt messages can be updated into the material data according to their matched image processing flows, which greatly reduces the difficulty of updating the material data and improves its maintenance efficiency.
Fig. 7 shows an image processing apparatus according to an embodiment of the present disclosure. As shown in fig. 7, the image processing apparatus includes:
the first obtaining module 701 is configured to respond to a selection instruction of a target effect, and obtain material data corresponding to the target effect, where the material data includes a material and a matching relationship between the material and an image processing flow;
a second acquisition module 702 configured to acquire an initial image containing a photographic subject in response to a photographing instruction;
a determining module 703 configured to determine, from the materials, a target material for processing the initial image in the current image processing flow according to the matching relationship;
and the image processing module 704 is configured to process the initial image by using the target material to obtain a target image corresponding to the image processing flow.
Optionally, the material data further includes flow identifiers matched with the materials, each flow identifier indicating the image processing flow matched with a material.
Optionally, when determining, according to the matching relationship, the target materials for processing the initial image in the current image processing flow, the determining module 703 is specifically configured to:
determine the image processing flow in which the client is currently located; and select, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials.
Alternatively, the material includes image material for constituting the target image effect, and text material for interacting with the photographic subject.
The matching relationship between the materials and the image processing flows includes: the image materials are matched with the back-end processing flow and the preview flow, and the text materials are matched with the preview flow.
Optionally, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module 703 is specifically configured to: if the current image processing flow is the back-end processing flow, determine the image materials whose flow identifier is the back-end processing flow identifier as the target materials.
When processing the initial image with the target materials, the image processing module 704 is specifically configured to: render the image materials into the initial image to obtain a first image containing the target image effect.
Optionally, when selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials, the determining module 703 is specifically configured to:
if the current image processing flow is the preview flow, determine the image materials and text materials whose flow identifier is the preview flow identifier as the target materials.
When processing the initial image with the target materials, the image processing module 704 is specifically configured to:
render the image materials and the text materials into the initial image to obtain a second image containing the target image effect and the text effect.
Optionally, the apparatus further comprises a display module, configured specifically to: and displaying an editing interface and/or a publishing interface containing the first image.
Optionally, the display module is further specifically configured to: and displaying a shooting interface containing the second image.
Optionally, the display module is further specifically configured to: and displaying a shooting interface containing a shooting object, wherein the shooting interface comprises a plurality of effect controls.
When obtaining the material data corresponding to the target effect in response to the selection instruction, the first obtaining module 701 is specifically configured to: receive a selection instruction for a target effect control among the plurality of effect controls; and, in response to the received selection instruction, obtain the material data corresponding to the target effect control.
The image processing apparatus may perform the methods provided in the foregoing embodiments; for details, refer to the related descriptions of those embodiments, which are not repeated here.
In one possible design, the image processing apparatus may be implemented as an electronic device. As shown in fig. 8, the electronic device may include a processor 21 and a memory 22, where the memory 22 stores executable code which, when executed by the processor 21, at least enables the processor 21 to implement the image processing method provided in the foregoing embodiments.
The electronic device may further include a communication interface 23 for communicating with other devices or a communication network.
In addition, the present disclosure also provides a computer-readable storage medium storing executable code which, when executed by a processor of an electronic device, causes the processor to perform the methods provided in the foregoing embodiments. The computer-readable storage medium may be a ROM, a random access memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
In an exemplary embodiment, a computer program product is also provided, including a computer program that, when executed by a processor, implements the methods provided in the foregoing embodiments. The computer program may run on a terminal or a server.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image processing method, comprising:
responding to a selection instruction of a target effect, and acquiring material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
acquiring an initial image containing a photographic object in response to a photographing instruction;
determining target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation;
and processing the initial image by adopting the target material to obtain a target image corresponding to the image processing flow.
2. The method according to claim 1, wherein the material data further comprises flow identifiers matched with the materials, each flow identifier being used for indicating the image processing flow matched with the corresponding material;
determining target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation, wherein the target materials comprise:
determining the image processing flow where the client is currently located;
and selecting, from the materials according to the flow identifiers, the materials matched with the image processing flow as the target materials.
3. The method according to claim 1 or 2, wherein the material includes image material for constituting a target image effect and text material for interacting with a photographic subject;
the matching relationship between the material and the image processing flow comprises the following steps:
the image material is matched with a back-end processing flow and a preview flow, and the text material is matched with the preview flow.
4. The method according to claim 3, wherein the selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials comprises:
if the current image processing flow is the back-end processing flow, determining the image materials whose flow identifier is the back-end processing flow identifier as the target materials;
the processing the initial image by using the target material to obtain the target image corresponding to the image processing flow comprises the following steps:
and rendering the image material to the initial image to obtain a first image containing the target image effect.
5. The method according to claim 3, wherein the selecting, from the materials according to the flow identifiers, the materials matched with the current image processing flow as the target materials comprises:
if the current image processing flow is the preview flow, determining the image materials and text materials whose flow identifier is the preview flow identifier as the target materials;
the processing the initial image by using the target material to obtain the target image corresponding to the image processing flow comprises the following steps:
and rendering the image materials and the text materials into the initial image to obtain a second image containing the target image effect and the text effect.
6. The method of claim 4, further comprising: and displaying an editing interface and/or a publishing interface containing the first image.
7. An image processing apparatus characterized by comprising:
the first acquisition module is configured to respond to a selection instruction of a target effect and acquire material data corresponding to the target effect, wherein the material data comprises materials and a matching relation between the materials and an image processing flow;
a second acquisition module configured to acquire an initial image containing a photographic subject in response to a photographing instruction;
the determining module is configured to determine target materials used for processing the initial image in the current image processing flow from the materials according to the matching relation;
and the image processing module is configured to process the initial image by adopting the target material to obtain a target image corresponding to the image processing flow.
8. An electronic device, comprising:
a processor;
a memory for storing the processor-executable instructions;
wherein the processor is configured to execute the instructions to implement the image processing method of any of claims 1 to 6.
9. A computer-readable storage medium in which instructions, when executed by an electronic device, enable the electronic device to perform the image processing method of any one of claims 1 to 6.
10. A computer program product comprising a computer program, characterized in that the computer program realizes the image processing method of any one of claims 1 to 6 when executed by a processor.
CN202111481292.1A 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium Active CN114390193B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111481292.1A CN114390193B (en) 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium


Publications (2)

Publication Number Publication Date
CN114390193A true CN114390193A (en) 2022-04-22
CN114390193B CN114390193B (en) 2023-12-19

Family

ID=81195646

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111481292.1A Active CN114390193B (en) 2021-12-06 2021-12-06 Image processing method, device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114390193B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979471A (en) * 2022-05-12 2022-08-30 北京达佳互联信息技术有限公司 Interface display method and device, electronic equipment and computer readable storage medium
CN115291772A (en) * 2022-08-16 2022-11-04 北京字跳网络技术有限公司 Page display method, device, equipment and storage medium
WO2024001513A1 (en) * 2022-06-28 2024-01-04 北京字跳网络技术有限公司 Content photographing method and apparatus, device, and storage medium

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2017016030A1 (en) * 2015-07-30 2017-02-02 宇龙计算机通信科技(深圳)有限公司 Image processing method and terminal
CN107635104A (en) * 2017-08-11 2018-01-26 光锐恒宇(北京)科技有限公司 A kind of method and apparatus of special display effect in the application
CN111899192A (en) * 2020-07-23 2020-11-06 北京字节跳动网络技术有限公司 Interaction method, interaction device, electronic equipment and computer-readable storage medium


Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114979471A (en) * 2022-05-12 2022-08-30 北京达佳互联信息技术有限公司 Interface display method and device, electronic equipment and computer readable storage medium
CN114979471B (en) * 2022-05-12 2023-10-10 北京达佳互联信息技术有限公司 Interface display method, device, electronic equipment and computer readable storage medium
WO2024001513A1 (en) * 2022-06-28 2024-01-04 北京字跳网络技术有限公司 Content photographing method and apparatus, device, and storage medium
CN115291772A (en) * 2022-08-16 2022-11-04 北京字跳网络技术有限公司 Page display method, device, equipment and storage medium

Also Published As

Publication number Publication date
CN114390193B (en) 2023-12-19

Similar Documents

Publication Publication Date Title
CN114390193B (en) Image processing method, device, electronic equipment and storage medium
CN109068081A (en) Video generation method, device, electronic equipment and storage medium
WO2020138107A1 (en) Video streaming system, video streaming method, and video streaming program for live streaming of video including animation of character object generated on basis of motion of streaming user
CN108986192B (en) Data processing method and device for live broadcast
CN106730815B (en) Easy-to-implement somatosensory interaction method and system
CN113507621A (en) Live broadcast method, device, system, computer equipment and storage medium
CN110691279A (en) Virtual live broadcast method and device, electronic equipment and storage medium
WO2015001437A1 (en) Image processing method and apparatus, and electronic device
CN110062271A (en) Method for changing scenes, device, terminal and storage medium
EP4047938A1 (en) Method for displaying interactive interface and apparatus thereof, method for generating interactive interface
CN109327737A (en) TV program recommendation method, terminal, system and storage medium
CN107948702B (en) Method, device, terminal and storage medium for synchronizing application status
CN112218154B (en) Video acquisition method and device, storage medium and electronic device
JP2018113616A (en) Information processing unit, information processing method, and program
JP2017064161A (en) Game system, imaging apparatus, game apparatus, and program
WO2019227324A1 (en) Method and device for controlling video playback speed and motion camera
WO2022198934A1 (en) Method and apparatus for generating video synchronized to beat of music
CN110162667A (en) Video generation method, device and storage medium
JP2020074041A (en) Imaging device for gaming, image processing device, and image processing method
JP2005136841A (en) Device, method and program of image output processing, image distribution server and image distribution processing program
CN108833740B (en) Real-time prompter method and device based on three-dimensional animation live broadcast
WO2024007290A1 (en) Video acquisition method, electronic device, storage medium, and program product
CN113473224B (en) Video processing method, video processing device, electronic equipment and computer readable storage medium
CN113989424A (en) Three-dimensional virtual image generation method and device and electronic equipment
CN114363549B (en) Intelligent script running and showing recording processing method, device and system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant