CN111986076A - Image processing method and device, interactive display device and electronic equipment - Google Patents


Info

Publication number
CN111986076A
CN111986076A (application CN202010850797.XA)
Authority
CN
China
Prior art keywords
stylized
image
template
target object
processed
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202010850797.XA
Other languages
Chinese (zh)
Inventor
田济源
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen TetrasAI Technology Co Ltd
Original Assignee
Shenzhen TetrasAI Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen TetrasAI Technology Co Ltd filed Critical Shenzhen TetrasAI Technology Co Ltd
Priority to CN202010850797.XA priority Critical patent/CN111986076A/en
Publication of CN111986076A publication Critical patent/CN111986076A/en
Priority to PCT/CN2021/090249 priority patent/WO2022037111A1/en
Pending legal-status Critical Current

Classifications

    • G06T3/04
    • G06F18/22 Matching criteria, e.g. proximity measures (under G06F18/00 Pattern recognition; G06F18/20 Analysing)
    • G06T5/50 Image enhancement or restoration by the use of more than one image, e.g. averaging, subtraction (under G06T5/00 Image enhancement or restoration)
    • G06T2207/20081 Training; Learning (under G06T2207/20 Special algorithmic details)
    • G06T2207/20084 Artificial neural networks [ANN] (under G06T2207/20 Special algorithmic details)
    • G06T2207/20221 Image fusion; Image merging (under G06T2207/20212 Image combination)

Abstract

The disclosure relates to an image processing method and device, an interactive display device, and electronic equipment. The method comprises the following steps: acquiring an image to be processed; determining a stylized template for stylizing the image to be processed; performing stylization processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, where the stylized target object and the stylized template have the same style; and fusing the stylized target object with the stylized template to obtain a stylized image. The embodiments of the disclosure can enrich the ways in which images are processed.

Description

Image processing method and device, interactive display device and electronic equipment
Technical Field
The present disclosure relates to the field of computer technologies, and in particular, to an image processing method and apparatus, an interactive display apparatus, and an electronic device.
Background
With the development and progress of technology, multimedia data is processed in increasingly many ways, for example by beautification, special effects, and filters, and the requirements on both the richness and the quality of multimedia processing continue to rise.
Disclosure of Invention
The present disclosure proposes an image processing solution for processing and fusing images.
According to an aspect of the present disclosure, there is provided an image processing method including:
acquiring an image to be processed;
determining a stylized template for stylizing the image to be processed;
performing stylization processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, wherein the stylized target object and the stylized template have the same style;
and fusing the stylized target object and the stylized template to obtain a stylized image.
In a possible implementation manner, the determining a stylized template used for stylizing the image to be processed includes:
carrying out target object identification on the image to be processed to obtain a target object identification result of the image to be processed, wherein the target object identification result comprises a target object attribute;
and determining a stylized template for stylizing the image to be processed according to the target object attribute.
In a possible implementation manner, the determining a stylized template used for stylizing the image to be processed includes:
determining the similarity between the image to be processed and each reference stylized template, and determining the stylized template for stylizing the image to be processed from the reference stylized templates according to the similarity; or,
and in response to the selection operation of the reference stylized template, determining the selected reference stylized template as the stylized template for stylizing the image to be processed.
In a possible implementation manner, the fusing the stylized target object and the stylized template to obtain a stylized image includes:
and in a case that the size of the stylized target object satisfies a size adjustment condition, adjusting the stylized target object according to a preset adjustment rule, and fusing the adjusted stylized target object with the stylized template to obtain a stylized image.
In a possible implementation manner, in a case that the size of the stylized target object satisfies the size adjustment condition, the adjusting the stylized target object according to the preset adjustment rule includes:
detecting a size of a template object in the stylized template;
and under the condition that the size of the stylized target object does not match the size of the template object, adjusting the size of the stylized target object according to the size of the template object.
In a possible implementation manner, the fusing the stylized target object and the stylized template to obtain a stylized image includes:
determining the corresponding position of the stylized target object in the stylized template according to the size of the stylized target object;
and fusing the stylized target object at the position in the stylized template to obtain the stylized image.
In one possible implementation, the method further includes:
acquiring a template image;
and sending the template image to a server so that the server performs image processing on the template image to obtain a reference stylized template.
In a possible implementation manner, fusing the stylized target object and the stylized template to obtain a stylized image includes:
fusing the stylized target object and the stylized template to obtain a preliminary stylized image;
and responding to an adjusting operation for the stylized target object in the preliminary stylized image to obtain the stylized image, where the adjusting operation includes a size adjusting operation and/or a position adjusting operation for the stylized target object.
According to an aspect of the present disclosure, there is provided an interactive display device, including: an image processing module and a display module, where the image processing module includes an image acquisition unit and an image processing unit; wherein:
the image acquisition unit is used for acquiring an image to be processed;
the image processing unit is used for determining a stylized template used for stylizing the image to be processed, and carrying out stylized processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, wherein the stylized target object has the same style as the stylized template;
the image processing unit is further configured to fuse the stylized target object with the stylized template to obtain a stylized image;
the display module is used for displaying the stylized image.
According to an aspect of the present disclosure, there is provided an image processing apparatus including:
the first acquisition module is used for acquiring an image to be processed;
the determining module is used for determining a stylized template used for stylizing the image to be processed;
the processing module is used for carrying out stylization processing on the target object in the image to be processed according to the stylized template to obtain a stylized target object, and the stylized target object and the stylized template have the same style;
and the fusion module is used for fusing the stylized target object and the stylized template to obtain a stylized image.
In a possible implementation manner, the determining module is further configured to:
carrying out target object identification on the image to be processed to obtain a target object identification result of the image to be processed, wherein the target object identification result comprises a target object attribute;
and determining a stylized template for stylizing the image to be processed according to the target object attribute.
In a possible implementation manner, the determining module is further configured to:
determining the similarity between the image to be processed and each reference stylized template, and determining the stylized template for stylizing the image to be processed from the reference stylized templates according to the similarity; or,
and in response to the selection operation of the reference stylized template, determining the selected reference stylized template as the stylized template for stylizing the image to be processed.
In a possible implementation manner, the fusion module is further configured to:
and in a case that the size of the stylized target object satisfies a size adjustment condition, adjusting the stylized target object according to a preset adjustment rule, and fusing the adjusted stylized target object with the stylized template to obtain a stylized image.
In a possible implementation manner, the fusion module is further configured to:
detecting a size of a template object in the stylized template;
and under the condition that the size of the stylized target object does not match the size of the template object, adjusting the size of the stylized target object according to the size of the template object.
In a possible implementation manner, the fusion module is further configured to:
determining the corresponding position of the stylized target object in the stylized template according to the size of the stylized target object;
and fusing the stylized target object at the position in the stylized template to obtain the stylized image.
In one possible implementation, the apparatus further includes:
and the second acquisition module is used for acquiring a template image and sending the template image to a server, so that the server performs image processing on the template image to obtain a reference stylized template.
In a possible implementation manner, the fusion module is further configured to:
fusing the stylized target object and the stylized template to obtain a preliminary stylized image;
and responding to an adjusting operation for the stylized target object in the preliminary stylized image to obtain the stylized image, where the adjusting operation includes a size adjusting operation and/or a position adjusting operation for the stylized target object.
According to an aspect of the present disclosure, there is provided an electronic device including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the memory-stored instructions to perform the above-described method.
According to an aspect of the present disclosure, there is provided a computer readable storage medium having stored thereon computer program instructions which, when executed by a processor, implement the above-described method.
In the embodiment of the present disclosure, an image to be processed may be obtained, and after a stylized template used for stylizing the image to be processed is determined, a stylized target object is obtained by stylizing a target object in the image to be processed according to the stylized template, where the stylized target object has the same style as the stylized template. The stylized target object and the stylized template may be fused to obtain a stylized image. The image processing method and device, the interactive display device, and the electronic equipment provided by the embodiments of the disclosure enrich the ways in which images can be processed, and alleviate the hard, unnatural fusion edges caused by imperfect segmentation of the target object.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure. Other features and aspects of the present disclosure will become apparent from the following detailed description of exemplary embodiments, which proceeds with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and, together with the description, serve to explain the principles of the disclosure.
FIG. 1 shows a flow diagram of an image processing method according to an embodiment of the present disclosure;
FIGS. 2a-2c show schematic diagrams of an image processing method according to an embodiment of the disclosure;
FIG. 3 shows a schematic diagram of an image processing method according to an embodiment of the present disclosure;
FIG. 4 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure;
FIG. 5 shows a block diagram of an electronic device 800 in accordance with an embodiment of the disclosure;
FIG. 6 illustrates a block diagram of an electronic device 1900 in accordance with an embodiment of the disclosure.
Detailed Description
Various exemplary embodiments, features and aspects of the present disclosure will be described in detail below with reference to the accompanying drawings. In the drawings, like reference numbers can indicate functionally identical or similar elements. While the various aspects of the embodiments are presented in drawings, the drawings are not necessarily drawn to scale unless specifically indicated.
The word "exemplary" is used herein to mean "serving as an example, embodiment, or illustration." Any embodiment described herein as "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments.
The term "and/or" herein merely describes an association between associated objects and indicates that three relationships may exist; for example, A and/or B may mean: A exists alone, A and B exist simultaneously, or B exists alone. In addition, the term "at least one" herein means any one of a plurality or any combination of at least two of a plurality; for example, "including at least one of A, B, and C" may mean including any one or more elements selected from the group consisting of A, B, and C.
Furthermore, in the following detailed description, numerous specific details are set forth in order to provide a better understanding of the present disclosure. It will be understood by those skilled in the art that the present disclosure may be practiced without some of these specific details. In some instances, methods, means, elements and circuits that are well known to those skilled in the art have not been described in detail so as not to obscure the present disclosure.
Fig. 1 shows a flowchart of an image processing method according to an embodiment of the present disclosure. The method may be performed by an electronic device such as a terminal device or a server; the terminal device may be a User Equipment (UE), a mobile device, a user terminal, a cellular phone, a cordless phone, a Personal Digital Assistant (PDA), a handheld device, a computing device, a vehicle-mounted device, a wearable device, or the like. The method may be implemented by a processor calling computer-readable instructions stored in a memory; alternatively, the method may be performed by a server.
As shown in fig. 1, the image processing method may include:
in step S11, an image to be processed is acquired.
For example, the to-be-processed image may be acquired by an image acquisition device (e.g., a device with an image acquisition function such as a mobile phone, a camera, a monitoring device, etc.), or the to-be-processed image may be acquired by uploading or downloading through a terminal, and the manner of acquiring the to-be-processed image is not particularly limited in the embodiments of the present disclosure.
In step S12, a stylized template for stylizing the image to be processed is determined.
For example, the stylized template may be an image of a work of art, such as an image corresponding to an oil painting, a sketch, an ink-wash painting, a sculpture, or a carving; accordingly, the stylized template may present an oil-painting style, a sketch style, an ink-wash style, a sculpture style, a carving style, and so on. Alternatively, the stylized template may be an image of another style, for example an image that presents a comic style, or an image that presents a movie style, and so forth.
For example, a stylized template for stylizing an image to be processed may be determined in response to a user's selection operation for the stylized template; or, a stylized template for stylizing the image to be processed may be determined according to the image to be processed.
For example, when the embodiment of the present disclosure is executed by a terminal device, the terminal device may obtain and store at least one stylized template from a server, and after obtaining an image to be processed, determine, from the stored at least one stylized template, a stylized template for stylizing the image to be processed.
For example: the terminal equipment can determine a stylized template for stylizing the image to be processed according to the image to be processed; or the terminal device may display each stylized template on the display interface, the user may select the stylized template, and the terminal device determines the selected stylized template as the stylized template for stylizing the image to be processed in response to a selection operation of the user on the stylized template.
For example, when the embodiment of the present disclosure is executed by a server, the server may receive a stylized processing instruction from a terminal device, the stylized processing instruction may include a to-be-processed image, and the server may determine, according to the stylized processing instruction, a stylized template for stylizing the to-be-processed image from the locally stored stylized templates.
For example: the server can determine a stylized template used for stylizing the image to be processed according to the image to be processed; or, the stylized processing instruction further includes template image identification information of a stylized template selected by the user, and the server may determine the stylized template identified by the template image identification information, as the stylized template for performing the stylized processing on the image to be processed.
In step S13, performing stylization processing on the target object in the image to be processed according to the stylized template to obtain a stylized target object, where the stylized target object has the same style as the stylized template.
For example, the image to be processed may be stylized through a stylized processing network corresponding to the pre-trained stylized template. The stylized processing network is a neural network that can correspond to one or more stylized templates with the same style, and the style of the image data processed by the stylized processing network is the same as that of the stylized templates, or the stylized processing network can correspond to different stylized templates, which is not limited in the embodiment of the disclosure. Style may refer to a representative appearance that an image presents as a whole. Different images can present different styles, for example, the styles of the images can be reflected by the extracted features in the images, and whether the styles are the same or different can be judged by the similarity of the features between the images. For example, the training process of the stylized processing network may be performed at the server side, that is, the server side may train the stylized processing network corresponding to each stylized template according to each stylized template. The processing efficiency and precision of the image can be improved by performing stylization processing on the image to be processed through the stylization processing network.
For example, after determining a stylized template for stylizing an image to be processed, a stylized processing network corresponding to the stylized template may be determined, and a target object in the image to be processed may be stylized through the stylized processing network to obtain a stylized target object, where the stylized target object has the same style as that of the stylized template image, for example: and if the stylized template is an image corresponding to the oil painting work, the stylized target object is the oil painting texture style.
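The idea that each stylized template corresponds to its own pre-trained stylization network can be sketched as follows. This is an illustrative toy, not the disclosed implementation: the real networks would be trained neural models, whereas here `oil_painting_net`, `sketch_net`, and the `STYLE_NETWORKS` registry are hypothetical stand-in pixel transforms.

```python
def oil_painting_net(pixel):
    """Stand-in for an oil-painting style network: a warm, saturated tint."""
    r, g, b = pixel
    return (min(255, int(r * 1.2)), g, int(b * 0.8))

def sketch_net(pixel):
    """Stand-in for a pencil-sketch style network: reduce to grayscale."""
    r, g, b = pixel
    gray = (r + g + b) // 3
    return (gray, gray, gray)

# Each template identifier maps to the network trained for that style.
STYLE_NETWORKS = {"oil_painting": oil_painting_net, "sketch": sketch_net}

def stylize(image, template_id):
    """Apply the stylization network associated with the chosen template
    to every pixel of the image (a list of rows of (r, g, b) tuples)."""
    net = STYLE_NETWORKS[template_id]
    return [[net(px) for px in row] for row in image]
```

For instance, `stylize(image, "oil_painting")` would return an image whose style matches the oil-painting template.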
Exemplarily, the image to be processed may first be stylized through the stylized template to obtain a stylized image to be processed, and target object recognition and segmentation may then be performed on the stylized image to obtain the stylized target object. Alternatively, the target object in the image to be processed may first be obtained through target object recognition and segmentation, and the stylized target object may then be obtained by stylizing the segmented target object through the stylized template.
The target object recognition and segmentation process may be performed by using a pre-trained neural network for recognition and segmentation, and the neural network for recognition and segmentation of the target object is not limited in the embodiments of the present disclosure.
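Once the recognition-and-segmentation network has produced a binary mask, separating the target object from its background reduces to a masked copy. A minimal sketch, assuming the mask is already available (the network itself is out of scope here):

```python
def extract_target(image, mask, background=(0, 0, 0)):
    """Keep the pixels where the segmentation mask is 1 (the target object)
    and blank out everything else. `image` is rows of (r, g, b) tuples and
    `mask` is rows of 0/1 values of the same shape."""
    return [
        [px if keep else background for px, keep in zip(img_row, mask_row)]
        for img_row, mask_row in zip(image, mask)
    ]
```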
In step S14, the stylized target object and the stylized template are fused to obtain a stylized image.
For example, after the stylized target object is obtained, the stylized target object may be fused into a stylized template to obtain a stylized image. For example: the stylized target object can be added to the preset position of the stylized template after being subjected to size adjustment according to the preset size, or the position of the stylized target object in the stylized template can be determined according to the position of the target object corresponding to the stylized target object in the image to be processed, and the stylized target object is added to the position.
Illustratively, the image to be processed is image data collected for a user a, the stylized template is an oil painting, the image data is processed by a stylized processing network corresponding to the oil painting, the human body image of the user a in the image data is processed into the texture style of the oil painting, and the finally obtained stylized image data is the human body image including the texture style of the oil painting of the user a in the oil painting.
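The fusion step described above, placing the stylized target object at a chosen position in the stylized template, can be sketched as a masked paste. This is an assumed minimal form of the operation; the position `(top, left)` would come from either the preset position or the target object's original location:

```python
def fuse(template, obj, mask, top, left):
    """Paste the masked pixels of a stylized target object into a copy of the
    stylized template, with the object's top-left corner at (top, left)."""
    out = [row[:] for row in template]  # copy, so the template stays untouched
    for i, (obj_row, mask_row) in enumerate(zip(obj, mask)):
        for j, (px, keep) in enumerate(zip(obj_row, mask_row)):
            if keep:
                out[top + i][left + j] = px
    return out
```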
It should be noted that, in the embodiment of the present disclosure, the stylized processing may also be performed on the video data, and when the video data is processed, the stylized processing may be performed on each video frame of the video data by using the above image processing method, so as to obtain a stylized video frame corresponding to each video frame, where the stylized video frames corresponding to all the video frames constitute the stylized video data corresponding to the video data.
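The frame-by-frame extension to video reduces to mapping the single-image pipeline over the frame sequence; a trivial sketch, with `stylize_frame` standing in for the whole per-image method:

```python
def stylize_video(frames, stylize_frame):
    """Run the single-image stylization pipeline on every frame; the
    resulting stylized frames together form the stylized video."""
    return [stylize_frame(frame) for frame in frames]
```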
In this way, an image to be processed can be obtained; after a stylized template used for stylizing the image to be processed is determined, stylization processing is performed on a target object in the image to be processed according to the stylized template to obtain a stylized target object, where the stylized target object has the same style as the stylized template. The stylized target object and the stylized template may then be fused to obtain a stylized image. The image processing method provided by the embodiment of the disclosure enriches the ways in which images can be processed and alleviates the hard, unnatural fusion edges caused by imperfect segmentation of the target object.
In a possible implementation manner, the determining a stylized template used for stylizing the image to be processed may include:
carrying out target object identification on the image to be processed to obtain a target object identification result of the image to be processed, wherein the target object identification result comprises a target object attribute;
and determining a stylized template for stylizing the image to be processed according to the target object attribute.
For example, the target object recognition result of the image to be processed may be obtained by performing target object recognition on the image to be processed (e.g., performing target object recognition on the image to be processed through a recognition network for recognizing the target object). The target object identification result may include a target object attribute, and the target object attribute may include at least one attribute feature of the target object.
The stylized template matched with the target object attribute can be determined and is used for stylizing the image to be processed. For example, the stylized template may have at least one attribute tag information, and in the case that the stylized template has the attribute tag information matching the attribute of the target object, the stylized template may be determined to be the stylized template for stylizing the image to be processed.
In one possible implementation, the target object attribute includes at least one of an age and/or a gender of a target object, a size of the target object in the image to be processed, a facial expression of the target object, a body posture of the target object, and a color tone of the target object.
For example, if the target object attribute includes the age and/or gender of the target object, a stylized template whose attribute tags match the age and/or gender may be determined as the stylized template for stylizing the image to be processed. For example: when the target object is 0-6 years old, stylized templates whose attribute tags include "infant" or "0-6 years old" may be searched for; such a template may be an image corresponding to an artwork that includes infants or animals and has a warm, lovely style.
Alternatively, if the target object attribute includes the size of the target object in the image to be processed, a stylized template whose attribute tags match that size may be determined as the stylized template for stylizing the image to be processed. For example: when the size of the target object in the image to be processed is a × b, stylized templates with a size tag of c × d may be searched for, where c may be any value from (a - x) to (a + x) and d may be any value from (b - y) to (b + y) (x and y may be the same or different). Such a template may be an image corresponding to an artwork containing figures of a size similar to a × b, or an image corresponding to an artwork whose visual effect remains harmonious when a target object of size a × b is added at a preset position.
Alternatively, if the target object attribute includes the facial expression of the target object, a stylized template whose attribute tags match the facial expression may be determined as the stylized template for stylizing the image to be processed. For example: when the facial expression of the target object is a happy smile, a stylized template whose attribute tags match "happy" may be searched for, such as an image corresponding to an artwork with a light, bright style, or an image that includes a smiling figure.
Alternatively, if the target object attribute includes the body posture of the target object, a stylized template whose attribute tags match the body posture may be determined as the stylized template for stylizing the image to be processed. For example: when the body posture of the target object is sitting, a stylized template whose attribute tags match sitting (e.g., tags including "sitting" or items related to sitting, such as a chair or sofa) may be searched for; such a template may include an image of a person in a sitting posture, or an image of household items related to sitting.
Alternatively, if the target object attribute includes the color tone of the target object, a stylized template whose attribute tags match the color tone may be determined as the stylized template for stylizing the image to be processed. For example: when the tone of the target object is dark, a stylized template whose attribute tags match "dark" may be searched for, and the tone of the stylized template is dark.
Therefore, the corresponding stylized template can be matched in a self-adaptive manner according to the target object attribute, and the processing efficiency and the interestingness of the image to be processed can be improved.
It should be noted that when a plurality of stylized templates are matched according to the target object attributes, the stylized template for stylizing the image to be processed can be determined in response to a selection operation of the user; when no stylized template is matched according to the target object attributes, the stylized template for stylizing the image to be processed can be determined from all stylized templates in response to a selection operation of the user, or stylized templates with higher usage rates can be recommended to the user according to the historical usage data of all stylized templates.
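The attribute-tag matching and the size-tolerance window described above can be sketched as follows. The data layout (templates as dicts with a `tags` set) and both function names are illustrative assumptions:

```python
def size_matches(target_size, template_size, tol):
    """Check whether a template object's size (c, d) lies in the tolerance
    window [(a - x, a + x), (b - y, b + y)] around the target size (a, b)."""
    (a, b), (c, d), (x, y) = target_size, template_size, tol
    return (a - x) <= c <= (a + x) and (b - y) <= d <= (b + y)

def match_templates(target_tags, templates):
    """Return every template whose attribute tags overlap the target
    object's attribute set (age, expression, posture, tone, ...)."""
    return [t for t in templates if target_tags & t["tags"]]
```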
In a possible implementation manner, the determining a stylized template used for stylizing the image to be processed may include:
determining the similarity between the image to be processed and each reference stylized template, and determining, from the reference stylized templates according to the similarity, the stylized template for stylizing the image to be processed; or, in response to a selection operation on a reference stylized template, determining the selected reference stylized template as the stylized template for stylizing the image to be processed.
For example, the similarity (including content similarity, color similarity, and the like) between the image to be processed and each reference stylized template may be determined, and the stylized template matching the image to be processed may be determined accordingly. For example: the reference stylized template with the highest similarity can be determined as the stylized template for stylizing the image to be processed.
Alternatively, the reference stylized templates can be sorted from high to low according to their similarity to the image to be processed and displayed in that order on the terminal device side for the user to select; the reference stylized template selected by the user is then the stylized template for stylizing the image to be processed.
Alternatively, the image to be processed can be analyzed to obtain its content, the associated content of the image can be determined, and a stylized template matching the content or the associated content can be determined accordingly. For example: if the content of the image to be processed is the target object in a sitting posture, and the associated content is articles related to sitting (a chair, a sofa, etc.), a stylized template including a seated figure or such articles can be matched. When more than one stylized template is matched, the plurality of stylized templates can be displayed on the terminal device side for the user to select, and the template selected by the user is the finally determined stylized template for stylizing the image to be processed.
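The similarity-based ranking described above might be sketched, for instance, with a coarse color-histogram similarity; the specific measure is an assumption for illustration, since the embodiments do not prescribe one:

```python
# Hedged sketch of color-similarity ranking; pixels are modeled as flat lists
# of 0-255 grayscale intensities to keep the example self-contained.
def color_histogram(pixels, bins=4):
    """Coarse, normalized intensity histogram."""
    hist = [0] * bins
    for p in pixels:
        hist[min(p * bins // 256, bins - 1)] += 1
    total = len(pixels)
    return [h / total for h in hist]

def similarity(hist_a, hist_b):
    """Histogram intersection: 1.0 means identical distributions."""
    return sum(min(a, b) for a, b in zip(hist_a, hist_b))

def rank_templates(image_pixels, templates):
    """Sort reference templates from most to least similar, for display order."""
    img_hist = color_histogram(image_pixels)
    return sorted(templates,
                  key=lambda t: similarity(img_hist, color_histogram(t["pixels"])),
                  reverse=True)
```

The first element of the ranked list corresponds to the highest-similarity template mentioned above; the full list corresponds to the sorted display for user selection.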
Therefore, a corresponding stylized template can be matched adaptively according to the similarity or the content of the image to be processed, which improves both the processing efficiency and the interest of processing the image to be processed.
For example, each reference stylized template may be presented in a presentation area of a display interface of the terminal device. The user may select a stylized template for stylizing the image to be processed by manually selecting a reference stylized template according to preferences. The user may select the reference stylized template by a touch or click, etc. The terminal equipment can respond to the selection operation of the user for the reference stylized template, and the selected reference stylized template is determined as the stylized template for stylizing the image to be processed.
In a possible implementation manner, the fusing the stylized target object and the stylized template to obtain a stylized image may include:
and in the case that the size of the stylized target object satisfies a size adjustment condition, adjusting the stylized target object according to a preset adjustment rule, and fusing the adjusted stylized target object with the stylized template to obtain a stylized image.
For example, the resizing condition may be a restriction condition for resizing the stylized target object, and the preset resizing rule may be a rule for resizing the stylized target object corresponding to the resizing condition.
For example: a first size threshold and a second size threshold are preset, with the first size threshold larger than the second. When the size of the stylized target object is greater than the first size threshold or less than the second size threshold, it may be adjusted to any value between the two thresholds. Alternatively, in that case, the size of the stylized target object may be adjusted according to the size of a person or article in the stylized template so as to match it, and the adjusted stylized target object is fused with the stylized template to obtain the stylized image.
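A minimal sketch of the first adjustment rule above is simply a clamp between the two preset thresholds; the threshold values and units below are illustrative assumptions:

```python
# Hedged sketch: clamp the stylized target object's size into the preset range.
FIRST_SIZE_THRESHOLD = 400   # upper bound (assumed units, e.g. pixels)
SECOND_SIZE_THRESHOLD = 100  # lower bound; first threshold > second threshold

def adjust_size(size):
    """Return the size unchanged when it is within range, else clamp it into
    [second threshold, first threshold], satisfying the adjustment rule."""
    if size > FIRST_SIZE_THRESHOLD:
        return FIRST_SIZE_THRESHOLD
    if size < SECOND_SIZE_THRESHOLD:
        return SECOND_SIZE_THRESHOLD
    return size
```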
In a possible implementation manner, in the case that the size of the stylized target object satisfies the size adjustment condition, adjusting the target object according to a preset adjustment rule may include:
detecting a size of a template object in the stylized template;
and under the condition that the size of the stylized target object does not match the size of the template object, adjusting the size of the stylized target object according to the size of the template object.
For example, the size of a template object in the stylized template may be detected, where the template object may be a person, an animal, or an article in the stylized template. The matching relationship between the template object and the stylized target object may be preset. For example: when the template object is a person and the difference between the size of the person and the size of the stylized target object is smaller than a first difference threshold, the sizes of the stylized target object and the template object can be determined to match; otherwise, they are determined not to match. The first difference threshold may be a value determined according to the age attribute of the person, for example: when the template object and the stylized target object have the same age attribute (both children, both adults, etc.), the first difference threshold is smaller; when they have different age attributes, the first difference threshold is larger.
Or, when the template object is an animal and the difference between the size of the animal and the size of the stylized target object is smaller than a second difference threshold, the sizes of the stylized target object and the template object are determined to match; otherwise, they are determined not to match. The second difference threshold may be a value determined according to the animal's body size, for example: when the animal is small, the second difference threshold is set larger, and when the animal is large, it is set smaller.
Or, when the template object is an article and the difference between the size of the article and the size of the stylized target object is smaller than a third difference threshold, the sizes of the stylized target object and the template object are determined to match; otherwise, they are determined not to match. The third difference threshold may be a value determined according to the type of the article, with different article types using different thresholds, for example: the third difference threshold corresponding to a wardrobe is greater than that corresponding to a chair.
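The three type-dependent difference thresholds above could be collected in a lookup table; the numeric values below are illustrative assumptions only, not values from the embodiments:

```python
# Hedged sketch of the type-dependent size-matching check.
DIFF_THRESHOLDS = {
    "person_same_age": 30,   # first threshold: smaller for matching age attributes
    "person_diff_age": 80,   # larger when age attributes differ
    "animal_small": 120,     # second threshold: larger for small animals
    "animal_large": 40,      # smaller for large animals
    "wardrobe": 150,         # third threshold, per article type
    "chair": 60,             # wardrobe threshold > chair threshold
}

def sizes_match(template_object_kind, template_size, target_size):
    """True when |template size - stylized target size| is below the kind's threshold."""
    threshold = DIFF_THRESHOLDS[template_object_kind]
    return abs(template_size - target_size) < threshold
```

When `sizes_match` returns False, the stylized target object's size would be adjusted toward the template object's size before fusion.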
Therefore, the size of the stylized target object can be adjusted adaptively, which improves user experience and yields a more naturally fused stylized image.
In a possible implementation manner, the fusing the stylized target object and the stylized template to obtain a stylized image may include:
determining the corresponding position of the stylized target object in the stylized template according to the size of the stylized target object;
and fusing the stylized target object at the position in the stylized template to obtain the stylized image.
For example, for the stylized template, the corresponding relationship between the sizes and the positions of different stylized target objects may be preset, and then the corresponding positions of the stylized target objects in the stylized template may be determined according to the sizes of the stylized target objects.
For example: for a landscape-class stylized template, positions presenting near content in the template correspond to larger stylized target object sizes, and positions presenting distant content correspond to smaller sizes. Alternatively, for a stylized template including a person, the smaller the difference between the size of the person in the template and the size of the stylized target object, the closer the position corresponding to the stylized target object is to the position of that person.
The position of the stylized target object can then be determined according to the correspondence between positions in the stylized template and stylized target object sizes, and the stylized target object can be fused into the template at that position.
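One hedged way to realize the size-to-position correspondence for a landscape-class template is a linear mapping in which larger objects are placed nearer, i.e. lower in the frame; the linear form and the mid-height horizon are assumptions for illustration:

```python
# Hedged sketch: map the stylized target object's size to a vertical placement.
def position_for_size(size, template_height, min_size=50, max_size=400):
    """Larger sizes map toward the bottom (near content), smaller sizes toward
    an assumed horizon at mid-height (distant content)."""
    clamped = max(min_size, min(size, max_size))
    ratio = (clamped - min_size) / (max_size - min_size)
    horizon_y = template_height // 2  # assumption: horizon at half the template height
    return int(horizon_y + ratio * (template_height - horizon_y))
```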
Therefore, the position of the stylized target object in the stylized image can be adjusted adaptively according to its size, which improves user experience and yields a more naturally fused stylized image.
In one possible implementation, the method may further include:
acquiring a template image;
and sending the template image to a server so that the server performs image processing on the template image to obtain a reference stylized template.
For example, the user may acquire the template image through an image acquisition device, or obtain it through a terminal by uploading, downloading, or other means. The terminal device can respond to the user's customization operation for a stylized template (a trigger operation on a customization control, a corresponding voice control instruction, or the like) by sending a template customization instruction to the server, where the template customization instruction may include the template image. The server may respond to the template customization instruction by performing image processing on the template image to obtain a reference stylized template, for example: training the stylized processing network corresponding to the template image according to a preset training set.
For example, the server may process a sample image in the training set through the stylized processing network to obtain a stylized sample image, obtain the content loss of the network from the sample image and the stylized sample image, and obtain the style loss of the network from the stylized sample image and the template image. The network loss of the stylized processing network can be obtained from the content loss and the style loss (for example, by a weighted summation), and the network is then trained according to the network loss. The content loss and style loss may be computed with any loss function, and the variables in the function may include feature information reflecting the style of the image, which is not specifically limited in this disclosure.
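The weighted combination of content loss and style loss might be sketched as follows, with the network's feature extraction abstracted into plain feature lists; the mean-squared-error form and the weight values are illustrative assumptions, since the disclosure does not fix a particular loss function:

```python
# Hedged numeric sketch of the weighted network loss described above.
def mse(a, b):
    """Mean squared error between two equal-length feature lists."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def network_loss(sample_feats, stylized_feats, template_style_feats,
                 content_weight=1.0, style_weight=10.0):
    """Weighted sum of the content loss (sample vs. stylized sample) and the
    style loss (stylized sample vs. template image), as the text describes."""
    content_loss = mse(sample_feats, stylized_feats)
    style_loss = mse(stylized_feats, template_style_feats)
    return content_weight * content_loss + style_weight * style_loss
```

In a real stylized processing network the feature lists would come from intermediate network layers, and the loss would drive gradient-based training.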
After the training of the stylized processing network for the template image is completed, the server may take the template image as a reference stylized template and stylize the image to be processed through the stylized processing network of that reference stylized template. Alternatively, the template customization instruction may further include information such as the user's account information, and the server may send the stylized processing network of the reference stylized template to the terminal device corresponding to that account information (or send a network-training-completion instruction to that terminal device), so that the user can stylize and fuse the image to be processed with the reference stylized template through the corresponding stylized processing network.
For example: referring to fig. 2a to 2c and fig. 3, suppose a user comes across an oil painting while out shopping and wants to blend his or her portrait into it. An image of the oil painting may be collected as a template image and sent to the server through the terminal device for training of the corresponding stylized processing network. After the training is completed, the template image serves as a reference stylized template and is displayed in the stylized-template display area of the display interface of the user's terminal device. After an image to be processed including the user's portrait is obtained through the terminal device, the terminal device may respond to the user's selection of the reference stylized template, stylize the user's portrait in the image to be processed, and obtain a stylized portrait that carries the oil-painting texture style of the reference stylized template. The stylized portrait is then combined with the reference stylized template to obtain the stylized image.
Therefore, the user can customize the reference stylized template, the individual customization requirements of the user can be met, and the user experience is improved.
In a possible implementation manner, fusing the stylized target object and the stylized template to obtain a stylized image includes:
fusing the stylized target object and the stylized template to obtain a preliminary stylized image;
and responding to an adjustment operation on the stylized target object in the preliminary stylized image to obtain a stylized image, wherein the adjustment operation includes a size adjustment operation and/or a position adjustment operation on the stylized target object.
For example, the preliminary stylized image may be obtained by placing the stylized target object at a preset position of the stylized template, or by placing it in the stylized template according to the position of the corresponding target object in the image to be processed. The user can adjust the position of the stylized target object in the preliminary stylized image through a position adjustment operation (for example, a drag operation on the stylized target object), and can adjust its size through a size adjustment operation (an enlargement or reduction operation), to obtain the stylized image.
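The position and size adjustment operations could be modeled, purely for illustration, as simple transformations of the stylized target object's bounding box; the operation tuples and field names below are assumptions:

```python
# Hedged sketch: the stylized target object is a dict holding its position
# (x, y) and size (w, h) within the preliminary stylized image.
def apply_adjustment(obj, op):
    """op is ('move', dx, dy) for a drag, or ('scale', factor) for an
    enlargement/reduction; returns an adjusted copy of the object."""
    adjusted = dict(obj)
    kind = op[0]
    if kind == "move":            # position adjustment operation
        adjusted["x"] += op[1]
        adjusted["y"] += op[2]
    elif kind == "scale":         # size adjustment operation
        adjusted["w"] = int(obj["w"] * op[1])
        adjusted["h"] = int(obj["h"] * op[1])
    return adjusted
```

Applying a sequence of such operations to the preliminary stylized image would produce the final stylized image.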
Illustratively, if the stylized template includes a figure, the user can move the stylized target object to the figure's position through a position adjustment operation and resize it to approximately the figure's size through a size adjustment operation, thereby replacing the figure in the stylized template with the user's own stylized figure. Or, if the stylized template includes a chair, the user can move the stylized target object to the chair's position and, through a size adjustment operation, adjust the ratio of its size to the chair's size to match the normal ratio of a human body to a chair, achieving the visual effect of the stylized target object sitting on the chair. This enables interaction between the user and the stylized template and increases the interest of image processing.
The disclosed embodiments also provide an interactive display device, including an image processing module and a display module, where the image processing module includes an image acquisition unit and an image processing unit; wherein:
the image acquisition unit is used for acquiring an image to be processed;
the image processing unit is used for determining a stylized template used for stylizing the image to be processed, and carrying out stylized processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, wherein the stylized target object has the same style as the stylized template;
the image processing unit is further configured to fuse the stylized target object with the stylized template to obtain a stylized image;
the display module is used for displaying the stylized image.
For example, the interactive display device may be a display device in a public place, such as an art exhibition venue for paintings or sculptures, or a museum. Through the interactive display device, visiting users can be interactively fused with images corresponding to the exhibits.
Taking the exhibition of a painting as an example, the interactive display device can acquire the image to be processed through an image acquisition unit, such as: the user stands in front of the interactive display device and the image acquisition unit starts to acquire images for the user.
The image processing unit may determine, according to the acquired image to be processed, the stylized template for stylizing the image to be processed. Alternatively, all stylized templates (or the images corresponding to all displayed paintings) may be shown on the display module, and, in response to the user's selection operation on the display module, the selected stylized template is determined as the stylized template for stylizing the image to be processed; or the stylized template may be determined from the displayed templates according to the acquired image to be processed. For the specific process, reference may be made to the foregoing embodiments, and details are not repeated here.
The image processing unit may perform stylization processing on a target object in the acquired image to be processed through the stylization template to obtain a stylized target object, and fuse the stylized target object with the stylized template to obtain a stylized image, and display the stylized image through the display module.
Therefore, according to the interactive display device provided by the embodiment of the disclosure, the interactive mode between the user and the artwork can be enriched, and the user experience and the interestingness are improved.
The stylization of the image to be processed according to the stylized template can be implemented by the stylized processing network corresponding to the stylized template; details are not repeated here, and reference may be made to the foregoing embodiments. Performing stylization through a stylized processing network can improve the processing efficiency and precision of the image.
In one possible implementation, the interactive display apparatus may further include a communication module,
the communication module may be configured to establish a connection with a target terminal device, receive an image to be processed sent by the target terminal device, or send the stylized image to the target terminal device.
For example, the interactive display device can establish a connection with the target terminal device through the communication module. For example: two-dimensional code information is displayed on the interactive display device, and the user can scan it with the target terminal device to establish a connection with the communication module. The user can then upload the image to be processed to the interactive display device through the target terminal device for stylization, and after the stylization is completed, the interactive display device can send the stylized image back to the target terminal device.
In one possible implementation, the interactive display apparatus may further include a customization module,
the customization module is used for acquiring a template image through the image acquisition unit or the communication module;
the communication module is further configured to send the first template image to a server, so that the server performs image processing on the template image to obtain a stylized template corresponding to the template image. For example, the user may customize the stylized template through a customization module. For example: the user likes a displayed picture, but the interactive display device does not have a stylized template corresponding to the picture, the user can acquire an image of the picture, the image is used as a template image and is sent to the server through the interactive display device, so that the server trains a stylized processing model corresponding to the template image according to a preset training set and the template image, and further obtains a stylized template corresponding to the template image, the to-be-processed image acquired by the user can be processed through the stylized processing model corresponding to the stylized template, a stylized target object is obtained, and the stylized target object and the stylized template are fused, and the stylized image is obtained.
Illustratively, permission settings may also be applied to the customization module, for example: the customization module can be opened only when an administrator is logged in, which prevents users from customizing templates arbitrarily and occupying too many resources.
It can be understood that the above method embodiments of the present disclosure may be combined with one another to form combined embodiments without departing from principle and logic; for brevity, details are not described in this disclosure. Those skilled in the art will appreciate that, in the above methods of the specific embodiments, the specific execution order of the steps should be determined by their functions and possible inherent logic.
In addition, the present disclosure also provides an image processing apparatus, an electronic device, a computer-readable storage medium, and a program, all of which can be used to implement any image processing method provided by the present disclosure; for the corresponding technical solutions and descriptions, reference may be made to the method section, and details are not repeated for brevity.
Fig. 4 shows a block diagram of an image processing apparatus according to an embodiment of the present disclosure, which includes, as shown in fig. 4:
a first obtaining module 41, configured to obtain an image to be processed;
a determining module 42, configured to determine a stylized template for stylizing the image to be processed;
the processing module 43 may be configured to perform stylization processing on the target object in the image to be processed according to the stylized template to obtain a stylized target object, where the stylized target object has the same style as the stylized template;
the fusion module 44 may be configured to fuse the stylized target object and the stylized template to obtain a stylized image.
In this way, an image to be processed can be obtained; after a stylized template for stylizing the image to be processed is determined, stylization is performed on a target object in the image according to the stylized template to obtain a stylized target object having the same style as the stylized template. The stylized target object and the stylized template may then be fused to obtain a stylized image. According to the image processing apparatus provided by the embodiments of the present disclosure, the image processing modes are enriched, and the stiff fusion effect caused by unnatural segmentation edges of the target object is alleviated.
In a possible implementation manner, the determining module may be further configured to:
carrying out target object identification on the image to be processed to obtain a target object identification result of the image to be processed, wherein the target object identification result comprises a target object attribute;
and determining a stylized template for stylizing the image to be processed according to the target object attribute.
In a possible implementation manner, the determining module may be further configured to:
determining the similarity between the image to be processed and each reference stylized template, and determining, from the reference stylized templates according to the similarity, the stylized template for stylizing the image to be processed; or,
and in response to the selection operation of the reference stylized template, determining the selected reference stylized template as the stylized template for stylizing the image to be processed.
In a possible implementation manner, the fusion module may be further configured to:
and under the condition that the size of the stylized target object meets the size adjustment condition, adjusting the target object according to a preset adjustment rule, and fusing the adjusted stylized target object and the stylized template to obtain a stylized image.
In a possible implementation manner, the fusion module may be further configured to:
detecting a size of a template object in the stylized template;
and under the condition that the size of the stylized target object does not match the size of the template object, adjusting the size of the stylized target object according to the size of the template object.
In a possible implementation manner, the fusion module may be further configured to:
determining the corresponding position of the stylized target object in the stylized template according to the size of the stylized target object;
and fusing the stylized target object at the position in the stylized template to obtain the stylized image.
In one possible implementation, the apparatus may further include:
and the second acquisition module is configured to acquire a template image and send the template image to a server, so that the server performs image processing on the template image to obtain a reference stylized template.
In a possible implementation manner, the fusion module may be further configured to:
fusing the stylized target object and the stylized template to obtain a preliminary stylized image;
and responding to an adjustment operation on the stylized target object in the preliminary stylized image to obtain a stylized image, wherein the adjustment operation includes a size adjustment operation and/or a position adjustment operation on the stylized target object.
In some embodiments, functions of or modules included in the apparatus provided in the embodiments of the present disclosure may be used to execute the method described in the above method embodiments, and specific implementation thereof may refer to the description of the above method embodiments, and for brevity, will not be described again here.
Embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon computer program instructions, which when executed by a processor, implement the above-mentioned method. The computer readable storage medium may be a non-volatile computer readable storage medium.
An embodiment of the present disclosure further provides an electronic device, including: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to invoke the memory-stored instructions to perform the above-described method.
The embodiments of the present disclosure also provide a computer program product, which includes computer readable code, and when the computer readable code runs on a device, a processor in the device executes instructions for implementing the image processing method provided in any one of the above embodiments.
The embodiments of the present disclosure also provide another computer program product for storing computer readable instructions, which when executed cause a computer to perform the operations of the image processing method provided in any of the above embodiments.
The electronic device may be provided as a terminal, server, or other form of device.
Fig. 5 illustrates a block diagram of an electronic device 800 according to an embodiment of the present disclosure. For example, the electronic device 800 may be a terminal such as a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, a fitness device, or a personal digital assistant.
Referring to fig. 5, electronic device 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the electronic device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the electronic device 800. Examples of such data include instructions for any application or method operating on the electronic device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
The power supply component 806 provides power to the various components of the electronic device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the electronic device 800.
The multimedia component 808 includes a screen that provides an output interface between the electronic device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the electronic device 800 is in an operation mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the electronic device 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing various aspects of state assessment for the electronic device 800. For example, the sensor assembly 814 may detect an open/closed state of the electronic device 800 and the relative positioning of components, such as the display and keypad of the electronic device 800. The sensor assembly 814 may also detect a change in position of the electronic device 800 or a component thereof, the presence or absence of user contact with the electronic device 800, the orientation or acceleration/deceleration of the electronic device 800, and a change in the temperature of the electronic device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a Complementary Metal Oxide Semiconductor (CMOS) or Charge Coupled Device (CCD) image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate wired or wireless communication between the electronic device 800 and other devices. The electronic device 800 may access a wireless network based on a communication standard, such as WiFi, second-generation (2G) or third-generation (3G) mobile communication technology, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast-related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the electronic device 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium, such as the memory 804, is also provided that includes computer program instructions executable by the processor 820 of the electronic device 800 to perform the above-described methods.
Fig. 6 illustrates a block diagram of an electronic device 1900 in accordance with an embodiment of the disclosure. For example, the electronic device 1900 may be provided as a server. Referring to fig. 6, electronic device 1900 includes a processing component 1922 further including one or more processors and memory resources, represented by memory 1932, for storing instructions, e.g., applications, executable by processing component 1922. The application programs stored in memory 1932 may include one or more modules that each correspond to a set of instructions. Further, the processing component 1922 is configured to execute instructions to perform the above-described method.
The electronic device 1900 may also include a power component 1926 configured to perform power management of the electronic device 1900, a wired or wireless network interface 1950 configured to connect the electronic device 1900 to a network, and an input/output (I/O) interface 1958. The electronic device 1900 may operate based on an operating system stored in the memory 1932, such as Windows Server™, Mac OS X™, Unix™, Linux™, FreeBSD™, or the like.
In an exemplary embodiment, a non-transitory computer readable storage medium, such as the memory 1932, is also provided that includes computer program instructions executable by the processing component 1922 of the electronic device 1900 to perform the above-described methods.
The present disclosure may be systems, methods, and/or computer program products. The computer program product may include a computer-readable storage medium having computer-readable program instructions embodied thereon for causing a processor to implement various aspects of the present disclosure.
The computer readable storage medium may be a tangible device that can hold and store instructions for use by an instruction execution device. The computer readable storage medium may be, for example, but is not limited to, an electronic storage device, a magnetic storage device, an optical storage device, an electromagnetic storage device, a semiconductor storage device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium include the following: a portable computer diskette, a hard disk, a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), a Static Random Access Memory (SRAM), a portable Compact Disc Read-Only Memory (CD-ROM), a Digital Versatile Disc (DVD), a memory stick, a floppy disk, a mechanically encoded device such as punch cards or raised structures in a groove having instructions recorded thereon, and any suitable combination of the foregoing. A computer-readable storage medium, as used herein, is not to be construed as being a transitory signal per se, such as a radio wave or other freely propagating electromagnetic wave, an electromagnetic wave propagating through a waveguide or other transmission medium (e.g., light pulses passing through a fiber-optic cable), or an electrical signal transmitted through a wire.
The computer-readable program instructions described herein may be downloaded from a computer-readable storage medium to a respective computing/processing device, or to an external computer or external storage device via a network, such as the internet, a local area network, a wide area network, and/or a wireless network. The network may include copper transmission cables, fiber optic transmission, wireless transmission, routers, firewalls, switches, gateway computers and/or edge servers. The network adapter card or network interface in each computing/processing device receives computer-readable program instructions from the network and forwards the computer-readable program instructions for storage in a computer-readable storage medium in the respective computing/processing device.
The computer program instructions for carrying out operations of the present disclosure may be assembler instructions, Instruction Set Architecture (ISA) instructions, machine instructions, microcode, firmware instructions, state-setting data, or source or object code written in any combination of one or more programming languages, including an object-oriented programming language such as Smalltalk or C++, and conventional procedural programming languages such as the "C" programming language or similar programming languages. The computer-readable program instructions may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider). In some embodiments, electronic circuitry, such as a programmable logic circuit, a Field Programmable Gate Array (FPGA), or a Programmable Logic Array (PLA), can execute the computer-readable program instructions by utilizing state information of the instructions to personalize the electronic circuitry, thereby implementing aspects of the present disclosure.
Various aspects of the present disclosure are described herein with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the disclosure. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer-readable program instructions.
These computer-readable program instructions may be provided to a processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks. These computer-readable program instructions may also be stored in a computer-readable storage medium that can direct a computer, programmable data processing apparatus, and/or other devices to function in a particular manner, such that the computer-readable medium storing the instructions comprises an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer readable program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer, other programmable apparatus or other devices implement the functions/acts specified in the flowchart and/or block diagram block or blocks.
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of instructions, which comprises one or more executable instructions for implementing the specified logical function(s). In some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The computer program product may be embodied in hardware, software, or a combination thereof. In one alternative embodiment, the computer program product is embodied in a computer storage medium; in another alternative embodiment, it is embodied in a software product, such as a Software Development Kit (SDK).
Having described embodiments of the present disclosure, the foregoing description is intended to be exemplary, not exhaustive, and is not limited to the disclosed embodiments. Many modifications and variations will be apparent to those of ordinary skill in the art without departing from the scope and spirit of the described embodiments. The terminology used herein was chosen to best explain the principles of the embodiments, the practical application, or the technical improvement over technologies found in the marketplace, or to enable others of ordinary skill in the art to understand the embodiments disclosed herein.

Claims (12)

1. An image processing method, characterized in that the method comprises:
acquiring an image to be processed;
determining a stylized template for stylizing the image to be processed;
performing stylization processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, wherein the stylized target object and the stylized template have the same style;
and fusing the stylized target object and the stylized template to obtain a stylized image.
2. The method of claim 1, wherein determining a stylized template for stylizing the image to be processed comprises:
carrying out target object identification on the image to be processed to obtain a target object identification result of the image to be processed, wherein the target object identification result comprises a target object attribute;
and determining a stylized template for stylizing the image to be processed according to the target object attribute.
3. The method of claim 1, wherein determining a stylized template for stylizing the image to be processed comprises:
determining a similarity between the image to be processed and each reference stylized template, and determining, from the reference stylized templates according to the similarity, the stylized template for stylizing the image to be processed; or,
in response to a selection operation on a reference stylized template, determining the selected reference stylized template as the stylized template for stylizing the image to be processed.
4. The method according to any one of claims 1 to 3, wherein the fusing the stylized target object with the stylized template to obtain a stylized image comprises:
and under the condition that the size of the stylized target object meets the size adjustment condition, adjusting the target object according to a preset adjustment rule, and fusing the adjusted stylized target object and the stylized template to obtain a stylized image.
5. The method according to claim 4, wherein the adjusting the target object according to a preset adjustment rule in the case that the size of the stylized target object satisfies a size adjustment condition comprises:
detecting a size of a template object in the stylized template;
and under the condition that the size of the stylized target object does not match the size of the template object, adjusting the size of the stylized target object according to the size of the template object.
6. The method according to any one of claims 1 to 3, wherein the fusing the stylized target object with the stylized template to obtain a stylized image comprises:
determining the corresponding position of the stylized target object in the stylized template according to the size of the stylized target object;
and fusing the stylized target object at the position in the stylized template to obtain the stylized image.
7. The method of claim 3, further comprising:
acquiring a template image;
and sending the template image to a server so that the server performs image processing on the template image to obtain a reference stylized template.
8. The method according to any one of claims 1 to 7, wherein fusing the stylized target object with the stylized template to obtain a stylized image comprises:
fusing the stylized target object and the stylized template to obtain a preliminary stylized image;
and obtaining the stylized image in response to an adjustment operation on the stylized target object in the preliminary stylized image, wherein the adjustment operation comprises a size adjustment operation and/or a position adjustment operation on the stylized target object.
9. An interactive display device, characterized in that the interactive display device comprises an image processing module and a display module, the image processing module comprising an image acquisition unit and an image processing unit; wherein:
the image acquisition unit is used for acquiring an image to be processed;
the image processing unit is used for determining a stylized template used for stylizing the image to be processed, and carrying out stylized processing on a target object in the image to be processed according to the stylized template to obtain a stylized target object, wherein the stylized target object has the same style as the stylized template;
the image processing unit is further configured to fuse the stylized target object with the stylized template to obtain a stylized image;
the display module is used for displaying the stylized image.
10. An image processing apparatus, characterized in that the apparatus comprises:
the first acquisition module is used for acquiring an image to be processed;
the determining module is used for determining a stylized template used for stylizing the image to be processed;
the processing module is used for carrying out stylization processing on the target object in the image to be processed according to the stylized template to obtain a stylized target object, and the stylized target object and the stylized template have the same style;
and the fusion module is used for fusing the stylized target object and the stylized template to obtain a stylized image.
11. An electronic device, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to invoke the memory-stored instructions to perform the method of any one of claims 1 to 8.
12. A computer readable storage medium having computer program instructions stored thereon, which when executed by a processor implement the method of any one of claims 1 to 8.
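The pipeline of claims 1 to 6 can be illustrated with a toy sketch. Everything below is hypothetical: the patent does not specify a similarity measure, a style-transfer model, or a fusion rule, so simple stand-ins are used (mean-intensity similarity for template selection, a mean-intensity shift as the "stylization", nearest-neighbour resizing, and centre placement for fusion). Images are modeled as 2D lists of grayscale values in [0, 255]; all function names are invented for illustration.

```python
# Hypothetical sketch of the claimed method; not the patent's actual implementation.

def mean(img):
    """Average intensity of a 2D grayscale image (list of rows)."""
    vals = [p for row in img for p in row]
    return sum(vals) / len(vals)

def similarity(img, template):
    # Claim 3 stand-in: higher similarity = closer mean intensity.
    return -abs(mean(img) - mean(template))

def select_template(img, references):
    # Claim 3: pick the reference stylized template most similar to the image.
    return max(references, key=lambda t: similarity(img, t))

def stylize(obj, template):
    # Claim 1 stand-in for a style-transfer model: shift the target object's
    # mean intensity to match the template's, so both share a "style".
    delta = mean(template) - mean(obj)
    return [[min(255, max(0, int(p + delta))) for p in row] for row in obj]

def resize(obj, h, w):
    # Claim 5: rescale the stylized object when its size does not match the
    # template object's size (nearest-neighbour sampling).
    src_h, src_w = len(obj), len(obj[0])
    return [[obj[i * src_h // h][j * src_w // w] for j in range(w)]
            for i in range(h)]

def fuse(template, obj, top, left):
    # Claims 1/6: paste the stylized object into the stylized template at a
    # position determined from its size, yielding the stylized image.
    out = [row[:] for row in template]
    for i, row in enumerate(obj):
        for j, p in enumerate(row):
            out[top + i][left + j] = p
    return out

def process(img, target_obj, references, obj_size=(2, 2)):
    template = select_template(img, references)   # claim 3: choose template
    styled = stylize(target_obj, template)        # claim 1: stylize object
    styled = resize(styled, *obj_size)            # claim 5: match sizes
    top = (len(template) - obj_size[0]) // 2      # claim 6: centre placement
    left = (len(template[0]) - obj_size[1]) // 2
    return fuse(template, styled, top, left)      # claim 1: fuse into image
```

A real system would replace `stylize` with a trained style-transfer network and `fuse` with blended compositing, but the control flow (select template, stylize target, resize, position, fuse) follows the order of the claims.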
CN202010850797.XA 2020-08-21 2020-08-21 Image processing method and device, interactive display device and electronic equipment Pending CN111986076A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202010850797.XA CN111986076A (en) 2020-08-21 2020-08-21 Image processing method and device, interactive display device and electronic equipment
PCT/CN2021/090249 WO2022037111A1 (en) 2020-08-21 2021-04-27 Image processing method and apparatus, interactive display apparatus, and electronic device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010850797.XA CN111986076A (en) 2020-08-21 2020-08-21 Image processing method and device, interactive display device and electronic equipment

Publications (1)

Publication Number Publication Date
CN111986076A true CN111986076A (en) 2020-11-24

Family

ID=73442432

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010850797.XA Pending CN111986076A (en) 2020-08-21 2020-08-21 Image processing method and device, interactive display device and electronic equipment

Country Status (2)

Country Link
CN (1) CN111986076A (en)
WO (1) WO2022037111A1 (en)

Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112488085A (en) * 2020-12-28 2021-03-12 深圳市慧鲤科技有限公司 Face fusion method, device, equipment and storage medium
CN113012082A (en) * 2021-02-09 2021-06-22 北京字跳网络技术有限公司 Image display method, apparatus, device and medium
CN113160039A (en) * 2021-04-28 2021-07-23 北京达佳互联信息技术有限公司 Image style migration method and device, electronic equipment and storage medium
CN114037599A (en) * 2021-09-16 2022-02-11 福建大娱号信息科技股份有限公司 Intelligent image matting method and device based on natural environment scene information and storage medium
WO2022037111A1 (en) * 2020-08-21 2022-02-24 深圳市慧鲤科技有限公司 Image processing method and apparatus, interactive display apparatus, and electronic device
CN115145442A (en) * 2022-06-07 2022-10-04 杭州海康汽车软件有限公司 Environment image display method and device, vehicle-mounted terminal and storage medium
CN115272146A (en) * 2022-07-27 2022-11-01 天翼爱音乐文化科技有限公司 Stylized image generation method, system, device and medium
CN117315148A (en) * 2023-09-26 2023-12-29 北京智象未来科技有限公司 Three-dimensional object stylization method, device, equipment and storage medium

Citations (8)

Publication number Priority date Publication date Assignee Title
CN105184249A (en) * 2015-08-28 2015-12-23 百度在线网络技术(北京)有限公司 Method and device for processing face image
US20180082715A1 (en) * 2016-09-22 2018-03-22 Apple Inc. Artistic style transfer for videos
CN109523460A (en) * 2018-10-29 2019-03-26 北京达佳互联信息技术有限公司 Moving method, moving apparatus and the computer readable storage medium of image style
CN110189246A (en) * 2019-05-15 2019-08-30 北京字节跳动网络技术有限公司 Image stylization generation method, device and electronic equipment
CN110458918A (en) * 2019-08-16 2019-11-15 北京百度网讯科技有限公司 Method and apparatus for output information
US10552977B1 (en) * 2017-04-18 2020-02-04 Twitter, Inc. Fast face-morphing using neural networks
CN111028137A (en) * 2018-10-10 2020-04-17 Oppo广东移动通信有限公司 Image processing method, image processing device, electronic equipment and computer readable storage medium
CN111127378A (en) * 2019-12-23 2020-05-08 Oppo广东移动通信有限公司 Image processing method, image processing device, computer equipment and storage medium

Family Cites Families (3)

Publication number Priority date Publication date Assignee Title
CN108805803B (en) * 2018-06-13 2020-03-13 衡阳师范学院 Portrait style migration method based on semantic segmentation and deep convolution neural network
CN110310222A (en) * 2019-06-20 2019-10-08 北京奇艺世纪科技有限公司 A kind of image Style Transfer method, apparatus, electronic equipment and storage medium
CN111986076A (en) * 2020-08-21 2020-11-24 深圳市慧鲤科技有限公司 Image processing method and device, interactive display device and electronic equipment

Non-Patent Citations (2)

Title
Zhao Min: "Application of deep-learning convolutional neural networks in image stylization", Computer Products and Circulation *
Chen Chao: "Design and implementation of an image style transfer system separating foreground and background", Information & Communications *

Cited By (12)

Publication number Priority date Publication date Assignee Title
WO2022037111A1 (en) * 2020-08-21 2022-02-24 深圳市慧鲤科技有限公司 Image processing method and apparatus, interactive display apparatus, and electronic device
CN112488085A (en) * 2020-12-28 2021-03-12 深圳市慧鲤科技有限公司 Face fusion method, device, equipment and storage medium
CN113012082A (en) * 2021-02-09 2021-06-22 北京字跳网络技术有限公司 Image display method, apparatus, device and medium
WO2022171024A1 (en) * 2021-02-09 2022-08-18 北京字跳网络技术有限公司 Image display method and apparatus, and device and medium
EP4276738A4 (en) * 2021-02-09 2023-11-29 Beijing Zitiao Network Technology Co., Ltd. Image display method and apparatus, and device and medium
CN113160039A (en) * 2021-04-28 2021-07-23 北京达佳互联信息技术有限公司 Image style migration method and device, electronic equipment and storage medium
CN113160039B (en) * 2021-04-28 2024-03-26 北京达佳互联信息技术有限公司 Image style migration method and device, electronic equipment and storage medium
CN114037599A (en) * 2021-09-16 2022-02-11 福建大娱号信息科技股份有限公司 Intelligent image matting method and device based on natural environment scene information and storage medium
CN115145442A (en) * 2022-06-07 2022-10-04 杭州海康汽车软件有限公司 Environment image display method and device, vehicle-mounted terminal and storage medium
CN115272146A (en) * 2022-07-27 2022-11-01 天翼爱音乐文化科技有限公司 Stylized image generation method, system, device and medium
CN115272146B (en) * 2022-07-27 2023-04-07 天翼爱音乐文化科技有限公司 Stylized image generation method, system, device and medium
CN117315148A (en) * 2023-09-26 2023-12-29 北京智象未来科技有限公司 Three-dimensional object stylization method, device, equipment and storage medium

Also Published As

Publication number Publication date
WO2022037111A1 (en) 2022-02-24

Similar Documents

Publication Publication Date Title
CN111986076A (en) Image processing method and device, interactive display device and electronic equipment
EP4309024A1 (en) Activating a hands-free mode of operating an electronic mirroring device
KR20180057366A (en) Mobile terminal and method for controlling the same
WO2022179025A1 (en) Image processing method and apparatus, electronic device, and storage medium
KR101944112B1 (en) Method and apparatus for creating user-created sticker, system for sharing user-created sticker
US11790614B2 (en) Inferring intent from pose and speech input
WO2020093798A1 (en) Method and apparatus for displaying target image, terminal, and storage medium
WO2022198934A1 (en) Method and apparatus for generating video synchronized to beat of music
CN113194254A (en) Image shooting method and device, electronic equipment and storage medium
US20220210337A1 (en) Trimming video in association with multi-video clip capture
CN106200917A (en) The content display method of a kind of augmented reality, device and mobile terminal
WO2018098968A9 (en) Photographing method, apparatus, and terminal device
CN111626183A (en) Target object display method and device, electronic equipment and storage medium
CN111506758A (en) Method and device for determining article name, computer equipment and storage medium
WO2023141146A1 (en) Object replacement system
CN112783316A (en) Augmented reality-based control method and apparatus, electronic device, and storage medium
EP4272209A1 (en) Trimming video for multi-video clip capture
WO2024011181A1 (en) Dynamically switching between rgb and ir capture
US20220319059A1 (en) User-defined contextual spaces
CN111462279B (en) Image display method, device, equipment and readable storage medium
CN113794799A (en) Video processing method and device
CN112035705A (en) Label generation method and device, electronic equipment and storage medium
KR20180108541A (en) Method and apparatus for creating user-created sticker, system for sharing user-created sticker
US11825276B2 (en) Selector input device to transmit audio signals
US11960653B2 (en) Controlling augmented reality effects through multi-modal human interaction

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
REG Reference to a national code (ref country code: HK; ref legal event code: DE; ref document number: 40039502)

RJ01 Rejection of invention patent application after publication

Application publication date: 20201124