CN116363260B - Image generation method and device and electronic equipment - Google Patents


Info

Publication number: CN116363260B
Authority: CN (China)
Prior art keywords: image, superposition, generated, interface, generation
Legal status: Active
Application number: CN202310343529.2A
Other languages: Chinese (zh)
Other versions: CN116363260A (en)
Inventors: 曹溪语, 陈璇, 辛永正, 张久金, 苏文嗣, 王展鹏, 李国豪, 李伟, 佘俏俏, 刘红星
Current assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Original assignee: Beijing Baidu Netcom Science and Technology Co Ltd
Application filed by Beijing Baidu Netcom Science and Technology Co Ltd
Priority to CN202310343529.2A
Publication of CN116363260A
Application granted
Publication of CN116363260B

Classifications

    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06F 3/0481: Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F 3/0482: Interaction with lists of selectable items, e.g. menus
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04847: Interaction techniques to control parameter settings, e.g. interaction with sliders or dials


Abstract

The disclosure provides an image generation method, an image generation apparatus, and an electronic device, and relates to the technical field of artificial intelligence, in particular to the technical fields of deep learning, natural language processing, and computer vision. The specific implementation scheme is as follows: an image superposition request is acquired, the image superposition request including a base image; an image superposition interface is displayed according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; the superposition image and the superposition parameters set by the object are determined according to the operation of the object in the setting area; and, when an image generation request is received, image generation processing is performed according to the base image, the superposition image, and the superposition parameters to obtain a generated image. The requirements on the user are low: no drawing foundation or image design capability is needed, the method is simple to use, and the image generation efficiency is improved.

Description

Image generation method and device and electronic equipment
Technical Field
The present disclosure relates to the technical field of artificial intelligence, in particular to the technical fields of deep learning, natural language processing, and computer vision, and more particularly to an image generation method, an image generation apparatus, and an electronic device.
Background
Currently, image generation is mainly performed by drawing an image with drawing software, and image editing and modification are mainly performed with image editing software.
Both drawing software and image editing software require the user to have a drawing foundation and image design capability; they place high demands on the user, are complex to use, and make image generation or editing inefficient.
Disclosure of Invention
The disclosure provides an image generation method, an image generation device and electronic equipment.
According to an aspect of the present disclosure, there is provided an image generation method including: acquiring an image superposition request, the image superposition request including: a base image; displaying an image superposition interface according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; determining the superposition image and superposition parameters set by an object according to the operation of the object in the setting area; and, when an image generation request is received, performing image generation processing according to the base image, the superposition image, and the superposition parameters to obtain a generated image.
According to another aspect of the present disclosure, there is provided an image generating apparatus including: the acquisition module is used for acquiring an image superposition request, and the image superposition request comprises: a base image; the first display module is used for displaying an image superposition interface according to the image superposition request, wherein the image superposition interface comprises a setting area, and the setting area is used for setting the basic image, the superposition image and the superposition parameters; the first determining module is used for determining a superposition image and superposition parameters set by the object according to the operation of the object in the setting area; the first generation module is used for carrying out image generation processing according to the basic image, the superposition image and the superposition parameters under the condition of receiving an image generation request to obtain a generated image.
According to another aspect of the present disclosure, there is provided an electronic device including: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to enable the at least one processor to perform the image generation method set forth above in the present disclosure.
According to another aspect of the present disclosure, there is provided a non-transitory computer-readable storage medium storing computer instructions for causing a computer to execute the image generation method set forth above in the present disclosure.
According to another aspect of the present disclosure, there is provided a computer program product including a computer program which, when executed by a processor, implements the steps of the image generation method set forth above in the present disclosure.
It should be understood that the description in this section is not intended to identify key or critical features of the embodiments of the disclosure, nor is it intended to be used to limit the scope of the disclosure. Other features of the present disclosure will become apparent from the following specification.
Drawings
The drawings are for a better understanding of the present solution and are not to be construed as limiting the present disclosure. Wherein:
FIG. 1 is a schematic diagram according to a first embodiment of the present disclosure;
FIG. 2 is a schematic diagram of an image editing interface;
FIG. 3 is a schematic illustration of an image overlay interface;
FIG. 4 is a schematic illustration of selection of images in an image library;
FIG. 5 is a schematic diagram of an image creation interface;
FIG. 6 is an image overlay interface based on a first history generated image;
FIG. 7 is a schematic illustration of an image overlay interface based on a first history generated image and provided with an overlay image;
FIG. 8 is a schematic diagram of an image overlay interface during generation of a generated image;
FIG. 9 is a schematic diagram of an image overlay interface after completion of generating an image;
FIG. 10 is a schematic diagram according to a second embodiment of the present disclosure;
FIG. 11 is a schematic diagram according to a third embodiment of the present disclosure;
FIG. 12 is a schematic diagram according to a fourth embodiment of the present disclosure;
Fig. 13 is a block diagram of an electronic device used to implement an image generation method of an embodiment of the present disclosure.
Detailed Description
Exemplary embodiments of the present disclosure are described below in conjunction with the accompanying drawings, which include various details of the embodiments of the present disclosure to facilitate understanding, and should be considered as merely exemplary. Accordingly, one of ordinary skill in the art will recognize that various changes and modifications of the embodiments described herein can be made without departing from the scope and spirit of the present disclosure. Also, descriptions of well-known functions and constructions are omitted in the following description for clarity and conciseness.
Currently, image generation is mainly performed by drawing an image with drawing software, and image editing and modification are mainly performed with image editing software.
Both drawing software and image editing software require the user to have a drawing foundation and image design capability; they place high demands on the user, are complex to use, and make image generation or editing inefficient.
In view of the above, the present disclosure provides an image generating method, an image generating device, and an electronic device.
Fig. 1 is a schematic diagram of a first embodiment of the present disclosure. It should be noted that the image generation method of the embodiment of the present disclosure may be applied to an image generation apparatus, and the apparatus may be configured in an electronic device, so that the electronic device can perform the image generation function. In the following embodiments, the description takes an electronic device as an example of the execution subject.
The electronic device may be any device with computing capability, for example, may be a personal computer (Personal Computer, abbreviated as PC), a mobile terminal, a server, and the like, and the mobile terminal may be, for example, a vehicle-mounted device, a mobile phone, a tablet computer, a personal digital assistant, a wearable device, a smart speaker, and other hardware devices with various operating systems, touch screens, and/or display screens.
As shown in fig. 1, the image generation method may include the steps of:
Step 101, obtaining an image superposition request, where the image superposition request includes: a base image.
In the embodiment of the disclosure, the base image may be an original image, or a history generated image in a history image generation task. Where the base image is an original image, the electronic device may generate an image based on the original image and other images, and at least one element in the original image may be considered in generating the image. Where the base image is a history-generated image, the electronic device may generate an image based on the history-generated image and other images, and may consider at least one element in the history-generated image when generating the image.
The elements include, for example, style, color, and subject. The style may be set according to actual needs, for example a fresh, literary, ink-wash, or light-and-shadow style. The subject is, for example, an animal, a plant, a person, or an object.
Because the base image may be either an original image or a history generated image, the object can flexibly select different images for image generation as required, which improves the flexibility of image generation; performing image generation based on a history generated image also enables continued editing or continued generation, which improves image generation efficiency.
In one example of the embodiment of the present disclosure, the electronic device may obtain the image superposition request as follows: an image editing interface is displayed, the image editing interface including an image superposition control; when a selection operation on the image superposition control is detected, the image superposition interface is displayed; and when it is determined, according to the operation of the object in the setting area of the image superposition interface, that the object has set a base image, it is determined that the image superposition request is acquired.
A schematic diagram of the image editing interface may be as shown in fig. 2. In fig. 2, the image editing interface is the interface displayed after an image editing control (for example, AI editing) in the image processing interface is selected by the object. The image editing interface in fig. 2 includes an image superposition control (picture superposition) and a smear editing control (smear editing).
A schematic diagram of the image superposition interface may be as shown in fig. 3, where the setting area is shown as the left-hand list. The setting area includes: a base map setting sub-area, an overlay map setting sub-area, a description text setting sub-area, a size information setting sub-area, a quantity information setting sub-area, an image generation control (immediate generation), and the like.
In fig. 3, the operations of the object in the setting area of the image superposition interface include, for example, a base image selection operation, a superposition image selection operation, a description text input operation, a size information selection operation, and a quantity information selection operation.
In this way, the electronic device displays an image editing interface and, when a selection operation on the image superposition control is detected, displays the image superposition interface so that a base image can be selected and the image superposition function triggered. This provides the object with one way of triggering the image superposition function and improves the efficiency of triggering it.
In the embodiment of the present disclosure, the electronic device may determine, according to the operation of the object in the setting area of the image superposition interface, that the object has set a base image in the following way, for example: when a base image selection operation of the object in the setting area is detected, a local image library and a generated image library are displayed, the local image library including local original images and the generated image library including history generated images; and the image selected by the object in the local image library or the generated image library is taken as the base image.
The selection of images from the image libraries may be as shown in fig. 4. In fig. 4, after the "upload local photo" control is selected, the original images in the local image library may be displayed; after the "one-grid sample" control is selected, sample images may be displayed; and after the "My work" control is selected, the generated images in the generated image library may be displayed.
The electronic device provides a plurality of image libraries, each containing a plurality of images, so that the object can conveniently select a base image and thereby trigger the image superposition function; this improves the flexibility of base image selection and, in turn, the image generation efficiency.
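By way of non-limiting illustration only, the flow described above (the superposition control is selected, then a base image is chosen from either the local image library or the generated image library) might be organized roughly as in the following Python sketch; all class and function names here are hypothetical assumptions and are not taken from the disclosed implementation.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class LibraryImage:
        path: str     # local file path or storage key of the image
        source: str   # "local" for an original image, "generated" for a history generated image

    @dataclass
    class ImageSuperpositionRequest:
        base_image: LibraryImage   # the base image set by the object (user)

    class ImageLibraries:
        """Wraps the local image library and the generated image library."""

        def __init__(self, local_images: List[LibraryImage],
                     generated_images: List[LibraryImage]) -> None:
            self.local_images = local_images          # local original images
            self.generated_images = generated_images  # history generated images

        def pick(self, source: str, index: int) -> LibraryImage:
            """Return the image the object selected from the chosen library."""
            images = self.local_images if source == "local" else self.generated_images
            return images[index]

    def on_base_image_selected(libraries: ImageLibraries,
                               source: str, index: int) -> ImageSuperpositionRequest:
        """Called after the image superposition control has been selected and the
        object picks a base image; the pick is treated as acquiring the request."""
        return ImageSuperpositionRequest(base_image=libraries.pick(source, index))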
In another example of the embodiment of the present disclosure, the electronic device may obtain the image superposition request as follows: an image processing interface is displayed, the image processing interface including an image editing control, an image creation control, and an image generation task list, the image generation task list including at least one history image generation task; when a selection operation on a first history image generation task in the image generation task list is detected, an image creation interface is displayed and a first history generated image in the first history image generation task is displayed in the image creation interface; and when a selection operation on an edit-this-image control displayed around the first history generated image is detected, it is determined that an image superposition request based on the first history generated image is acquired.
A schematic diagram of the image creation interface may be as shown in fig. 5, where the image generation task list is shown as the right-hand list. The right-hand list displays the history generated images of each history image generation task in the image generation task list. In fig. 5, the history generated image of the first history image generation task in the image generation task list is displayed in the image creation interface.
Below the first history generated image displayed in fig. 5, an "edit present picture" control, that is, an edit-this-image control, is displayed. After detecting a selection operation on this control, the electronic device determines that an image superposition request based on the first history generated image is acquired, and then displays an image superposition interface based on the first history generated image, as shown in fig. 6. In fig. 6, the base image, that is, the first history generated image shown in fig. 5, is displayed.
That is, when the electronic device detects a selection operation on the edit-this-image control displayed around the first history generated image, it determines that an image superposition request taking the first history generated image as the base image is acquired; the image superposition function is triggered, and an image superposition interface taking the first history generated image as the base image is displayed. This provides the object with another way of triggering the image superposition function and improves the efficiency of triggering it.
In this example, the setting area of the image superposition interface in fig. 6 displays the first history superposition parameters used when the first history generated image was generated, for example size information of 1:1 and quantity information of 1. This makes it convenient for the object to adjust the superposition parameters on the basis of the first history superposition parameters, avoids setting the superposition parameters repeatedly, facilitates reuse of the first history superposition parameters, and further improves the image generation efficiency.
Step 102, displaying an image superposition interface according to the image superposition request, wherein the image superposition interface comprises a setting area used for setting a basic image, a superposition image and superposition parameters.
Step 103, determining the superposition image and the superposition parameters set by the object according to the operation of the object in the setting area.
Step 104, when receiving the image generation request, performing image generation processing according to the base image, the superimposed image and the superimposed parameter to obtain a generated image.
In the embodiment of the present disclosure, the number of superimposed images is one or more. Wherein the superimposition parameters may include at least one of: descriptive text for describing the generated image, influence weight of the base image, influence weight of the superimposed image, influence weight of the descriptive text, size information of the generated image, and the number of images of the generated image.
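For concreteness only, the superposition parameters listed above could be held in a simple record such as the sketch below; the field names and default values are illustrative assumptions rather than part of the disclosure (the 1:1 size and the quantity of 1 merely echo the example values shown in fig. 6).

    from dataclasses import dataclass, field
    from typing import List, Optional

    @dataclass
    class SuperpositionParameters:
        description_text: Optional[str] = None    # descriptive text describing the generated image
        base_image_weight: float = 0.5            # influence weight of the base image (assumed default)
        overlay_image_weights: List[float] = field(default_factory=list)  # one influence weight per superposition image
        text_weight: float = 0.5                  # influence weight of the descriptive text (assumed default)
        size_info: str = "1:1"                    # size information of the generated image
        num_images: int = 1                       # number of images to generate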
Fig. 7 shows the image superposition interface of fig. 6 after a superposition image has been set and the first history superposition parameters have been adjusted; that is, fig. 7 is a schematic diagram of the image superposition interface in which the first history generated image is taken as the base image and a superposition image is set.
Compared with fig. 6, in fig. 7 the size information and the quantity information are unchanged, the description text "mechanical dog" is newly added, and a superposition image, namely a robot image, is newly added.
When the electronic device detects a superposition image selection operation of the object in the setting area, it may display the local image library and the generated image library, so that the object can conveniently select the superposition image.
A schematic diagram of the image superposition interface during generation of the generated image may be as shown in fig. 8, and a schematic diagram of the image superposition interface after the generated image has been generated may be as shown in fig. 9. In fig. 9, after the generated image has been generated, the setting area may be placed in a locked state, and a "re-edit" control, that is, a re-editing control, is displayed, making it convenient to adjust the base image, the superposition image, or the superposition parameters in the setting area and generate an image again.
The object can select proper superimposed images and superimposed parameters according to requirements, so that the flexibility of the selection of the superimposed images and the superimposed parameters can be improved, and the image generation efficiency is further improved.
In the embodiment of the present disclosure, in order to facilitate viewing of the generated image by the object and subsequent image generation processing based on the generated image, the electronic device may further perform the following after step 104: updating the image generation task list according to the base image, the superposition parameters, and the generated image.
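As an illustration of what updating the image generation task list could involve (again a hypothetical sketch, not the disclosed implementation), each finished task might be recorded together with the inputs that produced it, so that it can later be reused as a history image generation task:

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ImageGenerationTask:
        base_image: object              # base image used for this task
        params: object                  # superposition parameters used for this task
        generated_images: List[object]  # the generated image(s) produced by this task

    def update_task_list(task_list: List[ImageGenerationTask],
                         base_image: object,
                         params: object,
                         generated_images: List[object]) -> None:
        """Record the newest task; inserting at the front is an assumption chosen so
        that the most recent history generated image is listed first (cf. fig. 5)."""
        task_list.insert(0, ImageGenerationTask(base_image, params, generated_images))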
According to the image generation method of the embodiment of the present disclosure, an image superposition request is acquired, the image superposition request including a base image; an image superposition interface is displayed according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; the superposition image and superposition parameters set by the object are determined according to the operation of the object in the setting area; and, when an image generation request is received, image generation processing is performed according to the base image, the superposition image, and the superposition parameters to obtain a generated image. The requirements on the user are low: the user is not required to have a drawing foundation or image design capability, the method is simple to use, and the image generation efficiency is improved.
If the generated image does not meet the requirement of the object, the basic image, the superimposed parameters and the like adopted in the process of generating the image can be adjusted, so that the regenerated generated image meets the requirement of the object, and the image generation efficiency is improved. As shown in fig. 10, fig. 10 is a schematic diagram of a second embodiment according to the present disclosure, and the embodiment shown in fig. 10 may include the following steps:
In step 1001, an image overlay request is acquired, where the image overlay request includes: a base image.
Step 1002, displaying an image superposition interface according to the image superposition request, where the image superposition interface includes a setting area, and the setting area is used for setting a base image, a superposition image and superposition parameters.
In step 1003, according to the operation of the object in the setting area, the superimposition image and superimposition parameters set by the object are determined.
In step 1004, when an image generation request is received, image generation processing is performed according to the base image, the superimposed image, and the superimposed parameter, so as to obtain a generated image.
Step 1005, displaying the generated image, locking the setting area of the image superposition interface, and displaying a re-editing control in the image superposition interface.
In the embodiment of the disclosure, as shown in fig. 9, the generated image, that is, the generated mechanical dog image, is displayed. In fig. 9, the setting area is in a locked state, that is, a non-editable state, which avoids the situation in which the base image, the superposition image, or the superposition parameters in the setting area are adjusted and no longer match those actually used when the image was generated. A re-editing control is also provided so that, when the generated image does not meet the requirements of the object, at least one of the base image, the superposition image, or the superposition parameters can be conveniently adjusted and the image generation process performed again.
In step 1006, in the event that a selected operation for the re-edit control is detected, it is determined that a re-edit request for the generated image is received.
Step 1007, unlock the setup area in the image overlay interface.
In the embodiment of the disclosure, when the electronic device detects the selection operation on the re-editing control, this indicates that the generated image does not meet the requirements of the object and that at least one of the base image, the superposition image, and the superposition parameters in the setting area needs to be adjusted. The electronic device therefore unlocks the setting area in the image superposition interface, that is, places it in an editable state, to facilitate adjustment by the object.
At step 1008, at least one of the base image, the superimposed image, and the superimposed parameter in the setting area is adjusted according to the operation of the object in the setting area.
In step 1009, when the image generation request is received, image generation processing is performed according to the adjusted base image, the adjusted superimposed image, and the adjusted superimposed parameter, and a re-edited generated image is obtained.
In embodiments of the present disclosure, an image generation control, such as the "immediately generated" control in fig. 9, may be displayed in the image overlay interface. The electronic device determines that an image generation request is received when detecting a selected operation of the object for the control.
In an example of the embodiment of the present disclosure, according to the operation of the object in the setting area, the base image in the setting area is adjusted, and the corresponding process of the electronic device executing step 1009 may be, for example, performing, when the image generation request is received, image generation processing according to the adjusted base image, the superimposed image, and the superimposed parameter, to obtain a re-edited generated image.
In another example, the base image and the superimposed image in the setting area are adjusted according to the operation of the object in the setting area, and the corresponding process of the electronic device in step 1009 may be, for example, when the image generation request is received, performing image generation processing according to the adjusted base image, the adjusted superimposed image, and the superimposed parameter, to obtain a re-edited generated image.
In another example, the superimposition parameters in the setting area are adjusted according to the operation of the object in the setting area, and the corresponding process of the electronic device executing step 1009 may be, for example, when the image generation request is received, performing image generation processing according to the base image, the superimposition image, and the adjusted superimposition parameters, so as to obtain a generated image after being edited again.
It should be noted that, as an alternative to steps 1008 to 1009, after step 1007, the electronic device may further perform the following procedures: when an image generation request is received, image generation processing is performed according to the basic image, the superimposed image and the superimposed parameter in the setting area, and a re-edited generated image is obtained.
In the alternative scheme, the electronic device does not adjust the basic image, the superimposed image and the superimposed parameter, and re-performs image generation processing based on the basic image, the superimposed image and the superimposed parameter displayed in the setting area to obtain a new generated image. Wherein the new generated image may be different from the generated image in step 1005, and may satisfy the requirement of the object.
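The locking and re-editing behaviour of steps 1005 to 1009, together with the alternative just described, can be summarized in a small sketch. The state handling below is a hypothetical illustration; generate_image stands in for the image generation processing of step 1004 and is passed in as a callable.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class SettingArea:
        base_image: object            # base image currently shown in the setting area
        overlay_images: List[object]  # superposition images currently shown
        params: object                # superposition parameters currently shown
        locked: bool = False          # True after a generated image has been displayed

    def on_generation_finished(area: SettingArea) -> None:
        """Step 1005: after the generated image is displayed, lock the setting area so
        its contents cannot drift away from what was actually used for generation."""
        area.locked = True            # the re-editing control is displayed alongside

    def on_reedit_selected(area: SettingArea) -> None:
        """Steps 1006-1007: a re-editing request is received; unlock the setting area so
        the object can adjust the base image, superposition images, or parameters."""
        area.locked = False

    def on_generate_request(area: SettingArea,
                            generate_image: Callable[[object, List[object], object], object]) -> object:
        """Steps 1008-1009 (or the alternative without adjustment): generate again with
        whatever the setting area currently holds."""
        if area.locked:
            raise RuntimeError("setting area is locked; select the re-editing control first")
        return generate_image(area.base_image, area.overlay_images, area.params)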
It should be noted that, for details of steps 1001 to 1004, reference may be made to steps 101 to 104 in the embodiment shown in fig. 1, and detailed description thereof will not be given here.
According to the image generation method of the embodiment of the present disclosure, an image superposition request is acquired, the image superposition request including a base image; an image superposition interface is displayed according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; the superposition image and superposition parameters set by the object are determined according to the operation of the object in the setting area; when an image generation request is received, image generation processing is performed according to the base image, the superposition image, and the superposition parameters to obtain a generated image; the generated image is displayed, the setting area of the image superposition interface is locked, and a re-editing control is displayed in the image superposition interface; when a selection operation on the re-editing control is detected, it is determined that a re-editing request for the generated image is received; the setting area in the image superposition interface is unlocked; at least one of the base image, the superposition image, and the superposition parameters in the setting area is adjusted according to the operation of the object in the setting area; and, when an image generation request is received, image generation processing is performed according to the adjusted base image, superposition image, and superposition parameters to obtain a re-edited generated image. The requirements on the user are low: the user is not required to have a drawing foundation or image design capability, the method is simple to use, a continued editing or continued generation function can be provided for the user, and the image generation efficiency is improved.
If the generated image does not meet the requirements of the object, for example it lacks elements the object requires, the object may wish to continue editing the generated image, that is, to perform image generation processing again with the generated image as the base image, so that the regenerated image meets the requirements of the object and the image generation efficiency is improved. As shown in fig. 11, fig. 11 is a schematic diagram of a third embodiment according to the present disclosure, and the embodiment shown in fig. 11 may include the following steps:
Step 1101, obtaining an image overlay request, where the image overlay request includes: a base image.
Step 1102, displaying an image superposition interface according to the image superposition request, wherein the image superposition interface comprises a setting area, and the setting area is used for setting a basic image, a superposition image and superposition parameters.
In step 1103, the superimposed image and the superimposed parameters set by the object are determined according to the operation of the object in the setting area.
In step 1104, when an image generation request is received, image generation processing is performed according to the base image, the superimposed image, and the superimposed parameter, to obtain a generated image.
In the embodiment of the present disclosure, the image generation processing performed by the electronic device may be, for example, inputting the base image, the superimposed image, and the superimposed parameter into the image generation model, and obtaining the generated image output by the image generation model.
Where the superposition parameters include descriptive text for describing the generated image, the influence weight of the base image, the influence weight of the superposition image, and the influence weight of the descriptive text, the image generation model may perform image generation according to the base image, the superposition image, and the superposition parameters as follows: image feature extraction is performed on the base image and the superposition image to obtain image features of the base image and image features of the superposition image; text feature extraction is performed on the descriptive text to obtain text features of the descriptive text; the image features and the text features are weighted according to their influence weights and spliced to obtain spliced features; and noise image features are denoised and decoded according to the spliced features to obtain the generated image.
The basic image, the superimposed image and the superimposed parameters are input into the image generation model, and the generated image output by the image generation model is acquired.
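The weighting-and-splicing procedure described above can be illustrated by the following sketch. It is a minimal illustration under stated assumptions: image_encoder, text_encoder, denoise, and decode are placeholder callables standing in for the (unspecified) components of the image generation model, and numpy concatenation stands in for the splicing of the weighted features.

    import numpy as np
    from typing import Callable, Sequence

    def generate_image(base_image, overlay_images: Sequence, description_text: str,
                       base_weight: float, overlay_weights: Sequence[float],
                       text_weight: float,
                       image_encoder: Callable, text_encoder: Callable,
                       denoise: Callable, decode: Callable,
                       noise_features: np.ndarray):
        """Sketch of the fusion described in the text: extract image and text features,
        weight them by their influence weights, splice them, then denoise and decode."""
        # 1) image feature extraction for the base image and each superposition image
        base_feat = image_encoder(base_image)
        overlay_feats = [image_encoder(img) for img in overlay_images]
        # 2) text feature extraction for the descriptive text
        text_feat = text_encoder(description_text)
        # 3) weight each feature by its influence weight and splice (concatenate) them
        weighted = [base_weight * base_feat]
        weighted += [w * f for w, f in zip(overlay_weights, overlay_feats)]
        weighted += [text_weight * text_feat]
        conditioning = np.concatenate(weighted)
        # 4) denoise the noise image features conditioned on the spliced features,
        #    then decode the result into the generated image
        return decode(denoise(noise_features, conditioning))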
Step 1105, displaying the generated image, and displaying, around the generated image, an edit-this-image control indicating that an image can be regenerated by taking the generated image as the base image.
In the embodiment of the disclosure, as shown in fig. 9, the generated image, that is, the generated mechanical dog image, is displayed. In fig. 9, an edit-this-image control, that is, an "edit image" control, is displayed below the generated image; when the generated image does not meet the requirements of the object, for example an element is missing or needs to be adjusted, this makes it convenient to take the generated image as the base image and perform further image generation processing.
In step 1106, when a selection operation on the edit-this-image control is detected, it is determined that an image superposition request taking the generated image as the base image is received.
Step 1107, displaying an image superposition interface according to the image superposition request, and displaying a generated image serving as a basic image and superposition parameters used when the generated image is generated in a setting area of the image superposition interface.
In the embodiment of the present disclosure, optionally, a superimposed image used when the generated image is generated may also be displayed in the setting area of the image superimposition interface, so that reference is facilitated.
Step 1108, setting the superimposed image or setting the superimposed image and performing the superimposed parameter adjustment according to the operation of the object in the setting area.
In an example of the embodiment of the present disclosure, the electronic device may set the superimposition image according to an operation of the object in the setting area, and multiplex superimposition parameters when generating the generated image. In another example, the electronic device may set the superimposed image according to an operation of the object in the setting area, and perform adjustment processing on the superimposed parameter displayed in the image superimposition interface.
In step 1109, when an image generation request is received, image generation processing is performed based on the generated image as the base image, the set superimposed image, and the adjusted superimposed parameter, and a new generated image is obtained.
It should be noted that, for the details of steps 1101 to 1104, reference may be made to steps 101 to 104 in the embodiment shown in fig. 1, and the details will not be described here.
According to the image generation method of the embodiment of the present disclosure, an image superposition request is acquired, the image superposition request including a base image; an image superposition interface is displayed according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; the superposition image and superposition parameters set by the object are determined according to the operation of the object in the setting area; when an image generation request is received, image generation processing is performed according to the base image, the superposition image, and the superposition parameters to obtain a generated image; the generated image is displayed, and an edit-this-image control is displayed around the generated image, indicating that an image can be regenerated by taking the generated image as the base image; when a selection operation on the edit-this-image control is detected, it is determined that an image superposition request based on the generated image is received; an image superposition interface is displayed according to the image superposition request, and the generated image serving as the base image and the superposition parameters used when the generated image was generated are displayed in the setting area of the image superposition interface; a superposition image is set, or a superposition image is set and the superposition parameters are adjusted, according to the operation of the object in the setting area; and, when an image generation request is received, image generation processing is performed according to the generated image serving as the base image, the set superposition image, and the adjusted superposition parameters to obtain a new generated image. The requirements on the user are low: the user is not required to have a drawing foundation or image design capability, the method is simple to use, a continued editing or continued generation function can be provided for the user, and the image generation efficiency is improved.
In order to achieve the above embodiments, the present disclosure also provides an image generating apparatus. As shown in fig. 12, fig. 12 is a schematic diagram according to a fourth embodiment of the present disclosure. The image generating apparatus 120 may include: an acquisition module 1201, a first display module 1202, a first determination module 1203, and a first generation module 1204.
The obtaining module 1201 is configured to obtain an image stacking request, where the image stacking request includes: a base image;
a first display module 1202, configured to display an image overlay interface according to the image overlay request, where the image overlay interface includes a setting area, and the setting area is used to set the base image, the overlay image, and the overlay parameter;
a first determining module 1203 configured to determine, according to an operation of an object in the setting area, a superimposed image and a superimposed parameter set by the object;
The first generating module 1204 is configured to, when receiving an image generating request, perform image generating processing according to the base image, the superimposed image, and the superimposed parameter, to obtain a generated image.
As one possible implementation of the embodiments of the present disclosure, the base image is an original image, or a history generated image in a history image generation task.
As one possible implementation manner of the embodiments of the present disclosure, the obtaining module 1201 is specifically configured to display an image editing interface, where the image editing interface includes an image overlay control; displaying the image overlaying interface under the condition that the selected operation for the image overlaying control is detected; and determining that the image superposition request is acquired when the object is determined to be provided with the basic image according to the operation of the object in the setting area of the image superposition interface.
As one possible implementation manner of the embodiment of the present disclosure, the manner of determining, according to the operation of the object in the setting area of the image superposition interface, that the object has set a base image includes: displaying a local image library and a generated image library when a base image selection operation of the object in the setting area is detected, the local image library including local original images and the generated image library including history generated images; and taking the image selected by the object in the local image library or the generated image library as the base image.
As one possible implementation manner of the embodiment of the present disclosure, the obtaining module 1201 is specifically configured to display an image processing interface, where the image processing interface includes an image editing control, an image creation control, and an image generation task list; the image generation task list comprises at least one historical image generation task; displaying an image creation interface and displaying a first history generation image in a first history image generation task in the image creation interface when a selection operation for the first history image generation task in the image generation task list is detected; and determining to acquire an image superposition request taking the first history generation image as a basic image under the condition that the selection operation of the image control for editing the periphery of the first history generation image is detected.
As one possible implementation manner of the embodiment of the present disclosure, in the setting area of the image superimposition interface, a first history superimposition parameter used when the first history generated image is generated is displayed.
As one possible implementation manner of the embodiments of the present disclosure, the apparatus further includes: the second display module and the third display module; the second display module is used for displaying the generated image, displaying an edit image control around the generated image and indicating to regenerate the image by taking the generated image as a basic image; and the third display module is used for locking the setting area of the image superposition interface and displaying a re-editing control in the image superposition interface.
As one possible implementation manner of the embodiments of the present disclosure, the apparatus further includes: the device comprises a second determining module, a first unlocking module, a first adjusting module and a second generating module; the second determining module is used for determining that a re-editing request for the generated image is received under the condition that the selected operation for the re-editing control is detected; the first unlocking module is used for unlocking the setting area in the image superposition interface; the first adjusting module is used for adjusting at least one of a basic image, a superposition image and a superposition parameter in the setting area according to the operation of the object in the setting area; and the second generation module is used for carrying out image generation processing according to the adjusted basic image, the adjusted superposition image and the adjusted superposition parameters under the condition of receiving the image generation request, so as to obtain a re-edited generated image.
As one possible implementation manner of the embodiments of the present disclosure, the apparatus further includes: the device comprises a third determining module, a second unlocking module and a third generating module; the third determining module is used for determining that a re-editing request for the generated image is received under the condition that the selected operation for the re-editing control is detected; the second unlocking module is used for unlocking the setting area in the image superposition interface; and the third generation module is used for carrying out image generation processing according to the basic image, the superposition image and the superposition parameters in the setting area under the condition of receiving an image generation request, so as to obtain a re-edited generated image.
As one possible implementation manner of the embodiments of the present disclosure, the apparatus further includes: a fourth determining module, a fourth display module, a second adjusting module, and a fourth generating module; the fourth determining module is configured to determine, when a selection operation on the edit-this-image control is detected, that an image superposition request taking the generated image as the base image is received; the fourth display module is configured to display an image superposition interface according to the image superposition request, where the setting area of the image superposition interface displays the generated image serving as the base image and the superposition parameters used when the generated image was generated; the second adjusting module is configured to set a superposition image, or set a superposition image and adjust the superposition parameters, according to the operation of the object in the setting area; and the fourth generating module is configured to, when an image generation request is received, perform image generation processing according to the generated image serving as the base image, the set superposition image, and the adjusted superposition parameters to obtain a new generated image.
As one possible implementation manner of the embodiments of the present disclosure, the apparatus further includes: and the updating module is used for updating the image generation task list according to the basic image, the superposition parameters and the generated image.
As one possible implementation of the embodiments of the present disclosure, the superimposition parameters include at least one of: descriptive text for describing the generated image, influence weight of the base image, influence weight of the superimposed image, influence weight of the descriptive text, size information of the generated image, and image number of the generated image.
As one possible implementation of the embodiments of the present disclosure, the number of superimposed images is one or more.
As one possible implementation manner of the embodiments of the present disclosure, the first generating module is specifically configured to input the base image, the superimposed image, and the superimposition parameter into an image generating model, and obtain the generated image output by the image generating model.
As one possible implementation of the embodiments of the present disclosure, the superposition parameters include: descriptive text for describing the generated image, influence weights of the base image, influence weights of the superimposed image, influence weights of the descriptive text; the image generation model performs image generation according to the basic image, the superposition image and the superposition parameters, namely, performs image feature extraction processing on the basic image and the superposition image to obtain image features of the basic image and image features of the superposition image; extracting text features of the descriptive text to obtain the text features of the descriptive text; according to the influence weight, weighting and splicing the image features and the text features to obtain spliced features; and according to the splicing characteristics, denoising and decoding the noise image characteristics to obtain the generated image.
According to the image generation apparatus of the embodiment of the present disclosure, an image superposition request is acquired, the image superposition request including a base image; an image superposition interface is displayed according to the image superposition request, the image superposition interface including a setting area for setting the base image, a superposition image, and superposition parameters; the superposition image and superposition parameters set by the object are determined according to the operation of the object in the setting area; and, when an image generation request is received, image generation processing is performed according to the base image, the superposition image, and the superposition parameters to obtain a generated image. The requirements on the user are low: the user is not required to have a drawing foundation or image design capability, the apparatus is simple to use, and the image generation efficiency is improved.
In the technical solution of the present disclosure, the collection, storage, use, processing, transmission, provision, disclosure, and other handling of the user's personal information are all performed with the consent of the user, comply with the provisions of the relevant laws and regulations, and do not violate public order and good morals.
According to embodiments of the present disclosure, the present disclosure also provides an electronic device, a readable storage medium and a computer program product.
Fig. 13 illustrates a schematic block diagram of an example electronic device 1300 that can be used to implement embodiments of the present disclosure. Electronic devices are intended to represent various forms of digital computers, such as laptops, desktops, workstations, personal digital assistants, servers, blade servers, mainframes, and other appropriate computers. The electronic device may also represent various forms of mobile devices, such as personal digital processing, cellular telephones, smartphones, wearable devices, and other similar computing devices. The components shown herein, their connections and relationships, and their functions, are meant to be exemplary only, and are not meant to limit implementations of the disclosure described and/or claimed herein.
As shown in fig. 13, the apparatus 1300 includes a computing unit 1301 that can perform various appropriate actions and processes according to a computer program stored in a Read Only Memory (ROM) 1302 or a computer program loaded from a storage unit 1308 into a Random Access Memory (RAM) 1303. In the RAM 1303, various programs and data required for the operation of the device 1300 can also be stored. The computing unit 1301, the ROM 1302, and the RAM 1303 are connected to each other through a bus 1304. An input/output (I/O) interface 1305 is also connected to bus 1304.
Various components in device 1300 are connected to I/O interface 1305, including: an input unit 1306 such as a keyboard, a mouse, or the like; an output unit 1307 such as various types of displays, speakers, and the like; storage unit 1308, such as a magnetic disk, optical disk, etc.; and a communication unit 1309 such as a network card, a modem, a wireless communication transceiver, or the like. The communication unit 1309 allows the device 1300 to exchange information/data with other devices through a computer network such as the internet and/or various telecommunication networks.
The computing unit 1301 may be a variety of general and/or special purpose processing components having processing and computing capabilities. Some examples of computing unit 1301 include, but are not limited to, a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), various specialized Artificial Intelligence (AI) computing chips, various computing units running machine learning model algorithms, a Digital Signal Processor (DSP), and any suitable processor, controller, microcontroller, etc. The computing unit 1301 performs the respective methods and processes described above, for example, an image generation method. For example, in some embodiments, the image generation method may be implemented as a computer software program tangibly embodied on a machine-readable medium, such as storage unit 1308. In some embodiments, part or all of the computer program may be loaded and/or installed onto the device 1300 via the ROM 1302 and/or the communication unit 1309. When the computer program is loaded into the RAM 1303 and executed by the computing unit 1301, one or more steps of the image generating method described above may be performed. Alternatively, in other embodiments, the computing unit 1301 may be configured to perform the image generation method by any other suitable means (e.g. by means of firmware).
Various implementations of the systems and techniques described above may be realized in digital electronic circuitry, integrated circuit systems, field-programmable gate arrays (FPGAs), application-specific integrated circuits (ASICs), application-specific standard products (ASSPs), systems on chip (SOCs), complex programmable logic devices (CPLDs), computer hardware, firmware, software, and/or combinations thereof. These various implementations may include being implemented in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which may be a special-purpose or general-purpose programmable processor that can receive data and instructions from, and transmit data and instructions to, a storage system, at least one input device, and at least one output device.
Program code for carrying out the methods of the present disclosure may be written in any combination of one or more programming languages. The program code may be provided to a processor or controller of a general-purpose computer, a special-purpose computer, or other programmable data processing apparatus, such that the program code, when executed by the processor or controller, causes the functions/operations specified in the flowcharts and/or block diagrams to be implemented. The program code may execute entirely on the machine, partly on the machine, as a stand-alone software package, partly on the machine and partly on a remote machine, or entirely on the remote machine or server.
In the context of this disclosure, a machine-readable medium may be a tangible medium that can contain or store a program for use by or in connection with an instruction execution system, apparatus, or device. The machine-readable medium may be a machine-readable signal medium or a machine-readable storage medium. The machine-readable medium may include, but is not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples of a machine-readable storage medium would include an electrical connection based on one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
To provide for interaction with a user, the systems and techniques described here can be implemented on a computer having: a display device (e.g., a CRT (cathode ray tube) or LCD (liquid crystal display) monitor) for displaying information to a user; and a keyboard and pointing device (e.g., a mouse or trackball) by which a user can provide input to the computer. Other kinds of devices may also be used to provide for interaction with a user; for example, feedback provided to the user may be any form of sensory feedback (e.g., visual feedback, auditory feedback, or tactile feedback); and input from the user may be received in any form, including acoustic input, speech input, or tactile input.
The systems and techniques described here can be implemented in a computing system that includes a back-end component (e.g., a data server), or that includes a middleware component (e.g., an application server), or that includes a front-end component (e.g., a user computer having a graphical user interface or a web browser through which a user can interact with an implementation of the systems and techniques described here), or any combination of such back-end, middleware, or front-end components. The components of the system can be interconnected by any form or medium of digital data communication (e.g., a communication network). Examples of communication networks include local area networks (LANs), wide area networks (WANs), and the Internet.
The computer system may include a client and a server. The client and server are typically remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other. The server may be a cloud server, a server of a distributed system, or a server incorporating a blockchain.
It should be appreciated that steps may be reordered, added, or deleted using the various forms of flows shown above. For example, the steps recited in the present disclosure may be performed in parallel, sequentially, or in a different order, provided that the desired results of the technical solutions of the present disclosure are achieved, which is not limited herein.
The above detailed description should not be taken as limiting the scope of the present disclosure. It will be apparent to those skilled in the art that various modifications, combinations, sub-combinations and alternatives are possible, depending on design requirements and other factors. Any modifications, equivalent substitutions and improvements made within the spirit and principles of the present disclosure are intended to be included within the scope of the present disclosure.

Claims (29)

1. An image generation method, the method comprising:
acquiring an image superposition request, the image superposition request comprising a base image;
displaying an image superposition interface according to the image superposition request, wherein the image superposition interface comprises a setting area for setting the base image, a superimposed image, and superposition parameters, and the superposition parameters comprise at least one of the following: a description text for describing the generated image, an influence weight of the base image, an influence weight of the superimposed image, an influence weight of the description text, size information of the generated image, and a number of images to be generated;
determining, according to an operation of an object in the setting area, the superimposed image and the superposition parameters set by the object, wherein the superimposed image is selected from a local image library or a generated image library;
when an image generation request is received, performing image generation processing according to the base image, the superimposed image, and the superposition parameters to obtain a generated image;
wherein the acquiring the image superposition request includes:
displaying an image processing interface, wherein the image processing interface comprises an image editing control, an image creation control, and an image generation task list, the image generation task list comprising at least one historical image generation task;
when a selection operation on a first historical image generation task in the image generation task list is detected, displaying an image creation interface and displaying, in the image creation interface, a first historical generated image in the first historical image generation task;
and when a selection operation on an edit image control displayed around the first historical generated image is detected, determining that an image superposition request taking the first historical generated image as the base image is acquired.
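[Editorial illustration, not part of the claims] The flow of claim 1 can be paraphrased as a small Python sketch under the assumption of a simple request/settings data model. All identifiers (SuperpositionSettings, ImageSuperpositionRequest, handle_generation, model.generate) are hypothetical and do not come from the disclosure.

```python
from dataclasses import dataclass, field
from typing import Any, List, Optional, Tuple

@dataclass
class SuperpositionSettings:
    # Superposition parameters named in claim 1; all fields are optional ("at least one of").
    description_text: Optional[str] = None     # text describing the desired generated image
    base_weight: float = 0.5                   # influence weight of the base image
    overlay_weight: float = 0.5                # influence weight of the superimposed image
    text_weight: float = 0.5                   # influence weight of the description text
    output_size: Tuple[int, int] = (512, 512)  # size information of the generated image
    num_images: int = 1                        # number of images to generate

@dataclass
class ImageSuperpositionRequest:
    base_image: Any                                    # original image or a historical generated image
    overlay_images: List[Any] = field(default_factory=list)
    settings: SuperpositionSettings = field(default_factory=SuperpositionSettings)

def handle_generation(request: ImageSuperpositionRequest, model: Any) -> List[Any]:
    """Once an image generation request arrives, run image generation from the
    base image, the superimposed image(s), and the superposition parameters."""
    return model.generate(
        base=request.base_image,
        overlays=request.overlay_images,
        params=request.settings,
    )
```

In this sketch, a user interface corresponding to the claimed setting area would populate an ImageSuperpositionRequest before calling handle_generation.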
2. The method of claim 1, wherein the base image is an original image or a historical generated image in a historical image generation task.
3. The method according to claim 1 or 2, wherein the acquiring an image superimposition request includes:
displaying an image editing interface, wherein the image editing interface comprises an image superposition control;
displaying the image superposition interface when a selection operation on the image superposition control is detected;
and determining that the image superposition request is acquired when it is determined, according to an operation of the object in the setting area of the image superposition interface, that the object has set the base image.
4. The method according to claim 3, wherein determining, according to the operation of the object in the setting area of the image superposition interface, that the object has set the base image comprises:
displaying a local image library and a generated image library when a base image selection operation of the object in the setting area is detected, wherein the local image library comprises local original images and the generated image library comprises historical generated images;
and taking the image selected by the object in the local image library or the generated image library as the base image.
5. The method according to claim 1 or 2, wherein a first historical superposition parameter used in generating the first historical generated image is displayed in the setting area of the image superposition interface.
6. The method of claim 1, wherein the method further comprises:
displaying the generated image and displaying an edit image control around the generated image, the edit image control indicating that an image is to be regenerated by taking the generated image as a base image;
and locking the setting area of the image superposition interface and displaying a re-edit control in the image superposition interface.
7. The method of claim 6, wherein the method further comprises:
determining that a re-edit request for the generated image is received when a selection operation on the re-edit control is detected;
unlocking the setting area in the image superposition interface;
adjusting, according to an operation of the object in the setting area, at least one of the base image, the superimposed image, and the superposition parameters in the setting area;
and when an image generation request is received, performing image generation processing according to the adjusted base image, the adjusted superimposed image, and the adjusted superposition parameters to obtain a re-edited generated image.
8. The method of claim 6, wherein the method further comprises:
determining that a re-edit request for the generated image is received when a selection operation on the re-edit control is detected;
unlocking the setting area in the image superposition interface;
and when an image generation request is received, performing image generation processing according to the base image, the superimposed image, and the superposition parameters in the setting area to obtain a re-edited generated image.
9. The method of claim 6, wherein the method further comprises:
determining that an image superposition request taking the generated image as a base image is received when a selection operation on the edit image control is detected;
displaying an image superposition interface according to the image superposition request, wherein the generated image serving as the base image and the superposition parameters used when the generated image was generated are displayed in a setting area of the image superposition interface;
setting a superimposed image, or setting a superimposed image and adjusting the superposition parameters, according to the operation of the object in the setting area;
and when an image generation request is received, performing image generation processing according to the generated image serving as the base image, the set superimposed image, and the adjusted superposition parameters to obtain a new generated image.
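[Editorial illustration, not part of the claims] Claims 6 to 9 describe an iterative loop in which a freshly generated image becomes the base image of the next superposition round. The sketch below, again purely illustrative and reusing the hypothetical names from the sketch after claim 1, shows that loop; it is not the disclosed implementation.

```python
def iterate_generation(previous_result, new_overlays, adjusted_settings, model):
    """Reuse a generated image as the base image of a new superposition request
    (claim 9), optionally with newly set overlays and adjusted parameters,
    then regenerate to obtain a new generated image."""
    request = ImageSuperpositionRequest(
        base_image=previous_result,     # the previously generated image becomes the base image
        overlay_images=new_overlays,    # newly set superimposed image(s)
        settings=adjusted_settings,     # superposition parameters carried over or adjusted
    )
    return handle_generation(request, model)
```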
10. The method of claim 1, wherein the method further comprises:
And updating an image generation task list according to the basic image, the superposition parameters and the generated image.
11. The method of claim 1, wherein the number of superimposed images is one or more.
12. The method of claim 1, wherein the performing image generation processing according to the base image, the superimposed image, and the superposition parameters to obtain the generated image comprises:
inputting the base image, the superimposed image, and the superposition parameters into an image generation model, and obtaining the generated image output by the image generation model.
13. The method of claim 12, wherein the superposition parameters comprise: a description text for describing the generated image, an influence weight of the base image, an influence weight of the superimposed image, and an influence weight of the description text;
and the image generation model generates the generated image according to the base image, the superimposed image, and the superposition parameters by:
performing image feature extraction on the base image and the superimposed image to obtain image features of the base image and image features of the superimposed image;
performing text feature extraction on the description text to obtain text features of the description text;
weighting and splicing the image features and the text features according to the influence weights to obtain spliced features;
and denoising and decoding noise image features according to the spliced features to obtain the generated image.
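[Editorial illustration, not part of the claims] Claim 13 describes a text- and image-conditioned generation pipeline: feature extraction, weighted splicing, and denoising plus decoding. The sketch below is a minimal PyTorch-style rendering of that pipeline, assuming a latent-diffusion-style generator; the encoders, denoiser, and decoder are placeholder callables, and nothing here should be read as the actual disclosed model.

```python
import torch

def generate_from_features(base_img, overlay_img, text, weights,
                           image_encoder, text_encoder, denoiser, decoder, steps=50):
    """Sketch of the claim 13 pipeline (assumed latent-diffusion style):
    extract image and text features, weight and splice them, then denoise
    noise features conditioned on the spliced features and decode the result."""
    f_base = image_encoder(base_img)      # image features of the base image
    f_over = image_encoder(overlay_img)   # image features of the superimposed image
    f_text = text_encoder(text)           # text features of the description text

    # Weighted splicing: scale each feature by its influence weight, then concatenate.
    spliced = torch.cat(
        [weights["base"] * f_base,
         weights["overlay"] * f_over,
         weights["text"] * f_text],
        dim=-1,
    )

    # Denoise random noise image features step by step, conditioned on the spliced features.
    x = torch.randn_like(f_base)
    for t in reversed(range(steps)):
        x = denoiser(x, t, cond=spliced)

    return decoder(x)                     # decode the denoised features into the generated image
```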
14. An image generation apparatus, the apparatus comprising:
The acquisition module is used for acquiring an image superposition request, and the image superposition request comprises: a base image;
The first display module is used for displaying an image superposition interface according to the image superposition request, wherein the image superposition interface comprises a setting area for setting the base image, a superimposed image, and superposition parameters, and the superposition parameters comprise at least one of the following: a description text for describing the generated image, an influence weight of the base image, an influence weight of the superimposed image, an influence weight of the description text, size information of the generated image, and a number of images to be generated;
The first determining module is used for determining a superposition image and superposition parameters set by an object according to the operation of the object in the setting area, wherein the superposition image is selected from a local image library or a generated image library;
the first generation module is used for carrying out image generation processing according to the basic image, the superposition image and the superposition parameters under the condition of receiving an image generation request to obtain a generated image;
wherein the acquisition module is specifically used for:
displaying an image processing interface, wherein the image processing interface comprises an image editing control, an image creation control, and an image generation task list, the image generation task list comprising at least one historical image generation task;
when a selection operation on a first historical image generation task in the image generation task list is detected, displaying an image creation interface and displaying, in the image creation interface, a first historical generated image in the first historical image generation task;
and when a selection operation on an edit image control displayed around the first historical generated image is detected, determining that an image superposition request taking the first historical generated image as the base image is acquired.
15. The apparatus of claim 14, wherein the base image is an original image or a historical generated image in a historical image generation task.
16. The apparatus of claim 14 or 15, wherein the acquisition module is specifically configured to,
Displaying an image editing interface, wherein the image editing interface comprises an image superposition control;
displaying the image superposition interface when a selection operation on the image superposition control is detected;
and determining that the image superposition request is acquired when it is determined, according to an operation of the object in the setting area of the image superposition interface, that the object has set the base image.
17. The apparatus of claim 16, wherein determining, according to the operation of the object in the setting area of the image superposition interface, that the object has set the base image comprises:
displaying a local image library and a generated image library when a base image selection operation of the object in the setting area is detected, wherein the local image library comprises local original images and the generated image library comprises historical generated images;
and taking the image selected by the object in the local image library or the generated image library as the base image.
18. The apparatus according to claim 14 or 15, wherein a first historical superposition parameter used in generating the first historical generated image is displayed in the setting area of the image superposition interface.
19. The apparatus of claim 14, wherein the apparatus further comprises: the second display module and the third display module;
The second display module is used for displaying the generated image and displaying an edit image control around the generated image, the edit image control indicating that an image is to be regenerated by taking the generated image as a base image;
and the third display module is used for locking the setting area of the image superposition interface and displaying a re-editing control in the image superposition interface.
20. The apparatus of claim 19, wherein the apparatus further comprises: the device comprises a second determining module, a first unlocking module, a first adjusting module and a second generating module;
the second determining module is used for determining that a re-editing request for the generated image is received under the condition that the selected operation for the re-editing control is detected;
the first unlocking module is used for unlocking the setting area in the image superposition interface;
The first adjusting module is used for adjusting at least one of a basic image, a superposition image and a superposition parameter in the setting area according to the operation of the object in the setting area;
And the second generation module is used for carrying out image generation processing according to the adjusted basic image, the adjusted superposition image and the adjusted superposition parameters under the condition of receiving the image generation request, so as to obtain a re-edited generated image.
21. The apparatus of claim 19, wherein the apparatus further comprises: the device comprises a third determining module, a second unlocking module and a third generating module;
The third determining module is used for determining that a re-editing request for the generated image is received under the condition that the selected operation for the re-editing control is detected;
the second unlocking module is used for unlocking the setting area in the image superposition interface;
And the third generation module is used for carrying out image generation processing according to the basic image, the superposition image and the superposition parameters in the setting area under the condition of receiving an image generation request, so as to obtain a re-edited generated image.
22. The apparatus of claim 19, wherein the apparatus further comprises: the device comprises a fourth determining module, a fourth display module, a second adjusting module and a fourth generating module;
The fourth determining module is used for determining that an image superposition request taking the generated image as a base image is received when a selection operation on the edit image control is detected;
The fourth display module is used for displaying an image superposition interface according to the image superposition request, wherein the generated image serving as the base image and the superposition parameters used when the generated image was generated are displayed in the setting area of the image superposition interface;
the second adjustment module is used for setting a superposition image or setting the superposition image and adjusting superposition parameters according to the operation of the object in the setting area;
The fourth generation module is configured to, when receiving an image generation request, perform image generation processing according to the generated image serving as a base image, the set superimposed image, and the adjusted superimposed parameter, to obtain a new generated image.
23. The apparatus of claim 14, wherein the apparatus further comprises: and the updating module is used for updating the image generation task list according to the basic image, the superposition parameters and the generated image.
24. The apparatus of claim 14, wherein the number of superimposed images is one or more.
25. The apparatus of claim 14, wherein the first generation module is configured to,
And inputting the basic image, the superposition image and the superposition parameters into an image generation model, and obtaining the generated image output by the image generation model.
26. The apparatus of claim 25, wherein the superposition parameters comprise: a description text for describing the generated image, an influence weight of the base image, an influence weight of the superimposed image, and an influence weight of the description text;
and the image generation model generates the generated image according to the base image, the superimposed image, and the superposition parameters by:
performing image feature extraction on the base image and the superimposed image to obtain image features of the base image and image features of the superimposed image;
performing text feature extraction on the description text to obtain text features of the description text;
weighting and splicing the image features and the text features according to the influence weights to obtain spliced features;
and denoising and decoding noise image features according to the spliced features to obtain the generated image.
27. An electronic device, comprising:
at least one processor; and
A memory communicatively coupled to the at least one processor; wherein,
The memory stores instructions executable by the at least one processor to enable the at least one processor to perform the method of any one of claims 1 to 13.
28. A non-transitory computer-readable storage medium storing computer instructions, wherein the computer instructions are for causing a computer to perform the method according to any one of claims 1 to 13.
29. A computer program product comprising a computer program which, when executed by a processor, implements the method according to any one of claims 1 to 13.
CN202310343529.2A 2023-03-31 2023-03-31 Image generation method and device and electronic equipment Active CN116363260B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310343529.2A CN116363260B (en) 2023-03-31 2023-03-31 Image generation method and device and electronic equipment

Publications (2)

Publication Number Publication Date
CN116363260A CN116363260A (en) 2023-06-30
CN116363260B (en) 2024-05-17

Family

ID=86936336

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310343529.2A Active CN116363260B (en) 2023-03-31 2023-03-31 Image generation method and device and electronic equipment

Country Status (1)

Country Link
CN (1) CN116363260B (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105243119A (en) * 2015-09-29 2016-01-13 百度在线网络技术(北京)有限公司 Determination of to-be-superimposed region of image, superimposition of images and image display method and apparatus
CN106899878A (en) * 2017-03-21 2017-06-27 电子科技大学 A kind of adjustable video and graph compound method and system of transparency based on OMAP chips
CN110365907A (en) * 2019-07-26 2019-10-22 维沃移动通信有限公司 A kind of photographic method, device and electronic equipment
CN115408562A (en) * 2021-05-26 2022-11-29 阿里巴巴新加坡控股有限公司 Target object searching method and image searching method
CN113822784A (en) * 2021-07-06 2021-12-21 腾讯科技(深圳)有限公司 Image processing method and device

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
A Prototype Imaging and Visualization System for Robotic Infrastructure Inspection; David A. Lattanzi et al.; Structures Congress 2013; 2013-05-10; pp. 1-12 *
Design and implementation of an airborne video overlay unit; Zhao Weina et al.; Electronic Design Engineering; Vol. 22, No. 21; pp. 145-148 *
Guo Chunyan. Office Automation Applications. China Central Radio & TV University Press, 2009, pp. 250-252. *

Also Published As

Publication number Publication date
CN116363260A (en) 2023-06-30

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant