CN114663327A - Fusion method, fusion device, computer equipment and storage medium - Google Patents
- Publication number
- CN114663327A (application number CN202210389597.8A)
- Authority
- CN
- China
- Prior art keywords
- image
- fusion
- filled
- information
- current
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/20—Processor architectures; Processor configuration, e.g. pipelining
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20212—Image combination
- G06T2207/20221—Image fusion; Image merging
Abstract
The embodiments of this application provide a fusion method, a fusion apparatus, computer equipment, and a storage medium, relating to the field of computers. The fusion method determines the attribute category of the current fusion operation according to received fusion configuration information; if the attribute category of the current fusion operation is an image fusion operation, it determines the to-be-filled image information of the image to be filled; and it performs a pixel filling operation in the image to be fused corresponding to the current fusion operation based on that information, obtaining a target fused image containing the filled image. This solves the technical problem of the low execution efficiency of existing fusion operations and achieves the technical effect of improving the execution efficiency of fusion operations.
Description
Technical Field
The present application relates to the field of computer technologies, and in particular to a fusion method, a fusion apparatus, a computer device, and a storage medium.
Background
The composite operation is one of the EXA (an X.Org graphics hardware acceleration architecture) operations. It is generally an operation that takes image content corresponding to the size of the Mask (mask image) out of the Source (source image) and draws or fills it into the Destination (destination image).
In a fusion operation, special attributes are sometimes set on the source image and the mask image for the user's convenience, and the GPU (graphics processing unit) performs the corresponding fusion operation based on those attributes. However, graphics processor versions differ, and a lower-version graphics processor may be unable to handle some of the operations corresponding to the more complex special attributes. Those operations must then be handed by the graphics processor to the central processing unit (CPU) to draw the corresponding graphics, and this interaction takes a long time.
Therefore, the execution efficiency of the fusion operation is low at present.
Disclosure of Invention
The embodiment of the application provides a fusion method, a fusion device, computer equipment and a storage medium, so as to improve the execution efficiency of the current fusion operation.
In a first aspect of the embodiments of the present application, there is provided a fusion method, including:
determining the attribute type of the current fusion operation according to the received fusion configuration information;
if the attribute type of the current fusion operation is the image fusion operation, determining the information of the image to be filled;
and performing pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled to obtain a target fusion image containing the image to be filled.
In an optional embodiment of the present application, determining an attribute category of a current fusion operation according to received fusion configuration information includes:
acquiring operation object information and operation content information of the current fusion operation;
and determining whether the attribute type of the current fusion operation is the image fusion operation or not according to the operation object information and the operation content information.
In an optional embodiment of the present application, if the attribute type of the current fusion operation is an image fusion operation, determining image information to be filled of an image to be filled includes:
and if the attribute type of the current fusion operation is the image fusion operation, analyzing the fusion configuration information to obtain the image information to be filled of the image to be filled.
In an optional embodiment of the present application, if the attribute type of the current fusion operation is an image fusion operation, analyzing the fusion configuration information to obtain image information to be filled of the image to be filled, including:
and if the attribute type of the current fusion operation is the color filling operation, determining pixel information for performing color filling on the image to be fused from the fusion configuration information.
In an alternative embodiment of the present application, the image fusion operation is at least one of a source image fill operation and a color fill operation.
In an optional embodiment of the present application, performing a pixel filling operation in an image to be fused corresponding to a current fusion operation based on information of the image to be filled to obtain a target fusion image including the image to be filled, includes:
determining an initial pixel unit with the shape consistent with that of the image to be filled in a pixel memory based on the information of the image to be filled;
adjusting the color of each pixel in the initial pixel unit to be matched with the color of each pixel in the image to be filled to obtain a target pixel unit;
and replacing the image in the region to be filled in the image to be fused with a target pixel unit to obtain a target fused image.
In an optional embodiment of the present application, after replacing the image in the region to be filled in the image to be fused with the target pixel unit to obtain the target fused image, the method further includes:
and initializing the color of each pixel in the target pixel unit.
In a second aspect of the embodiments of the present application, there is provided a fusion apparatus, including:
the first determining module is used for determining the attribute category of the current fusion operation according to the received fusion configuration information;
the second determining module is used for determining the image information to be filled of the image to be filled if the attribute type of the current fusion operation is the image fusion operation;
and the fusion module is used for carrying out pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled to obtain a target fusion image containing the image to be filled.
In a third aspect of the embodiments of the present application, there is provided a computer device comprising a memory storing a computer program and a processor that implements the steps of any of the methods above when executing the computer program.
In a fourth aspect of the embodiments of the present application, there is provided a computer-readable storage medium on which a computer program is stored, wherein the computer program, when executed by a processor, implements the steps of the method as in any one of the above.
The fusion method provided by the embodiments of this application determines the attribute category of the current fusion operation from the received fusion configuration information; if that category is an image fusion operation, it determines the to-be-filled image information of the image to be filled, and then performs a pixel filling operation in the corresponding image to be fused based on that information, obtaining a target fused image containing the filled image. A graphics processor is a microprocessor dedicated to image- and graphics-related work on personal computers, workstations, game consoles, and some mobile devices (such as tablets and smartphones), and pixel processing is one of its basic functions. By converting a fusion operation that some graphics processor versions cannot execute into a pixel filling operation that any version can execute, the method greatly extends the compatibility and adaptability of image fusion. At the same time, the graphics processor does not need to interact with the CPU, the CPU does not need to perform software drawing for the image fusion operation, and the graphics processor can complete the entire image fusion operation on its own. This solves the technical problem of the low execution efficiency of current fusion operations and achieves the technical effect of improving the execution efficiency of the image fusion operation.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this application, illustrate embodiment(s) of the application and together with the description serve to explain the application and not to limit the application. In the drawings:
fig. 1 is a schematic diagram illustrating an image fusion process in a fusion method according to an embodiment of the present application;
FIG. 2 is a flow chart of a fusion method provided in an embodiment of the present application;
FIG. 3 is a flow chart of a fusion method provided by an embodiment of the present application;
FIG. 4 is a flow chart of a fusion method provided by an embodiment of the present application;
FIG. 5 is a schematic view of a fusion device according to an embodiment of the present application;
fig. 6 is a schematic structural diagram of a computer device according to an embodiment of the present application.
Detailed Description
In the process of implementing the present application, the inventors found that the execution efficiency of the fusion operation is low at present. In view of the above problems, the embodiments of the present application provide a fusion method to improve the efficiency of fusion operations.
The scheme in the embodiments of this application can be implemented in various computer languages, such as the object-oriented programming language Java and the scripting language JavaScript.
To make the technical solutions and advantages of the embodiments of this application clearer, the exemplary embodiments are described in further detail below with reference to the accompanying drawings. Clearly, the described embodiments are only a part of the embodiments of this application, not an exhaustive list. It should be noted that, where there is no conflict, the embodiments in this application and the features of the embodiments may be combined with each other.
The application environment of the fusion method provided by the embodiment of the present application is briefly described as follows:
referring to fig. 1, image fusion refers to an operation of filling an image to be filled into an image to be fused according to a certain mask shape to obtain a target fused image. The fusion method provided by the embodiment of the application is applied to a graphics processor, and the graphics processor sets some special attributes for a to-be-filled image and a mask image sometimes for user convenience in a fusion operation, and then performs corresponding fusion operations based on the special attributes, such as a repeat operation and a source type operation. When the operation object is set to repeat operation, for example, the size of the image to be filled is w1 × h1, and the size of the mask image actually needed to be drawn is w2 × h2, there are w1< w2 and h1< h2 in some cases, and the graph of the graph is smaller than the size needed to be drawn, then repeat operation is needed, and the operation type and the corresponding operation content of the repeat operation are as in the following table (1):
Table (1)

| Operation type | Operation content |
| --- | --- |
| RepeatNone | Do not repeat |
| RepeatPad | Fill the blank area with the color value of the nearest pixel |
| RepeatNormal | Fill the blank area with repetitions of the source picture |
| RepeatReflect | Fill the blank area with mirrored repetitions of the source picture |
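The Repeat modes above map closely onto NumPy's padding modes, which allows a compact illustration. The sketch below is illustrative only and is not the patent's implementation; the function name, the 2-D grayscale representation, and the grow-right/grow-bottom border behavior are assumptions.

```python
import numpy as np

def expand_source(src: np.ndarray, w2: int, h2: int, repeat: str) -> np.ndarray:
    """Expand an (h1, w1) source to (h2, w2) according to the Repeat type.

    Sketch of Table (1); the numpy pad modes approximate the behavior:
    RepeatPad -> 'edge', RepeatNormal -> 'wrap', RepeatReflect -> 'symmetric'.
    """
    h1, w1 = src.shape[:2]
    pad = ((0, h2 - h1), (0, w2 - w1))  # grow toward the right and bottom
    if repeat == "RepeatNone":
        out = np.zeros((h2, w2), dtype=src.dtype)  # blank area stays blank
        out[:h1, :w1] = src
        return out
    modes = {
        "RepeatPad": "edge",           # color value of the nearest pixel
        "RepeatNormal": "wrap",        # tile the source picture
        "RepeatReflect": "symmetric",  # mirrored tiling
    }
    return np.pad(src, pad, mode=modes[repeat])

src = np.array([[1, 2],
                [3, 4]])
assert expand_source(src, 4, 4, "RepeatNormal").tolist() == [
    [1, 2, 1, 2], [3, 4, 3, 4], [1, 2, 1, 2], [3, 4, 3, 4]]
```

Running the other modes on the same 2 × 2 source shows the nearest-pixel and mirrored variants side by side.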
A SourceType operation performs a specific transformation on a given source picture before acceleration, generally to obtain a special display effect such as a linear gradient, a radial gradient, a conical gradient, or a color change. The operation types of the SourceType operation and their contents are shown in Table (2):
Table (2)

| Operation type | Operation content |
| --- | --- |
| SourcePictTypeSolidFill | Generate a solid-color picture of a given color |
| SourcePictTypeLinear | Generate a linear-gradient version of the original picture |
| SourcePictTypeRadial | Generate a radial-gradient version of the original picture |
| SourcePictTypeConical | Generate a conical-gradient version of the original picture |
When a color change is required, as in Table (2) above, the specific type is SourcePictTypeSolidFill: the operation supplies only a specific color and no actual picture, and in practice it is equivalent to a solid-color picture of the specified width and height.
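A SourcePictTypeSolidFill source can therefore be modeled as nothing more than a constant-color buffer. A minimal sketch (the function name is assumed, not from the patent):

```python
import numpy as np

def solid_fill(width: int, height: int, rgb: tuple) -> np.ndarray:
    """SourcePictTypeSolidFill: no real source picture is given, only a
    color, so the source is equivalent to a solid-color picture of the
    specified width and height."""
    return np.full((height, width, 3), rgb, dtype=np.uint8)

green = solid_fill(4, 2, (0, 255, 0))
assert green.shape == (2, 4, 3)
assert (green[..., 1] == 255).all() and (green[..., 0] == 0).all()
```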
Referring to fig. 2, the following embodiment takes the graphics processor as the execution subject and describes how the fusion method provided by this application, applied in a graphics processor, fuses an image to be filled into an image to be fused to obtain a target fused image. The fusion method provided by the embodiments of this application includes the following steps 201 to 203:
Step 201, the graphics processor determines the attribute category of the current fusion operation according to the received fusion configuration information.

The fusion configuration information characterizes the operation object, the operation content, and so on of the current fusion operation, and the attribute category characterizes the specific operation content of the current fusion operation, such as a data fusion operation, an image fusion operation, or an information fusion operation. For example, if the operation object in the fusion configuration information is an image and the operation content is image filling, the corresponding attribute category is an image fusion operation. The specific operation content of the fusion operation may be as described in Table (1) and Table (2) above: do not repeat, fill the blank area with the color value of the nearest pixel, fill the blank area with repetitions of the source picture, fill the blank area with mirrored repetitions of the source picture, and so on, which is not repeated here.
Step 202, if the attribute category of the current fusion operation is an image fusion operation, the graphics processor determines the to-be-filled image information of the image to be filled.

If the attribute category of the current fusion operation is an image fusion operation, some lower-version graphics processors cannot execute it, that is, they cannot perform the corresponding hardware acceleration. In this case, the graphics processor first determines the to-be-filled image information of the image to be filled, where the image to be filled is the image element to be filled into the image to be fused (the image to be filled in fig. 1) and may be an image, an element, a color, and so on. The image to be filled may be configured on the graphics processor directly by the CPU, or obtained by the graphics processor by analyzing the fusion configuration information; this embodiment places no specific limitation on it.
And step 203, the graphics processor performs pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled, so as to obtain a target fusion image containing the image to be filled.
After obtaining the to-be-filled image information, the graphics processor converts the fusion filling operation into an ordinary pixel filling operation: for example, after obtaining the image information of the image to be filled in fig. 1, it performs pixel filling in the image to be fused based on that information and thus obtains the corresponding target fused image.
The fusion method provided by the embodiments of this application determines the attribute category of the current fusion operation from the received fusion configuration information; if that category is an image fusion operation, it determines the to-be-filled image information of the image to be filled, and then performs a pixel filling operation in the corresponding image to be fused based on that information, obtaining a target fused image containing the filled image. A graphics processor is a microprocessor dedicated to image- and graphics-related work on personal computers, workstations, game consoles, and some mobile devices (such as tablets and smartphones), and pixel processing is one of its basic functions. By converting a fusion operation that some graphics processor versions cannot execute into a pixel filling operation that any version can execute, the method greatly extends the compatibility and adaptability of image fusion. At the same time, the graphics processor does not need to interact with the CPU, the CPU does not need to perform software drawing for the image fusion operation, and the graphics processor can complete the entire image fusion operation on its own. This solves the technical problem of the low execution efficiency of current fusion operations and achieves the technical effect of improving the execution efficiency of the image fusion operation.
Referring to fig. 3, in an optional embodiment of the present application, the step 201, where the graphics processor determines the attribute type of the current fusion operation according to the received fusion configuration information, includes the following steps 301 to 302:
Step 301, the graphics processor acquires the operation object information and the operation content information of the current fusion operation.

The operation object information indicates the object of the current fusion operation, for example information, data, or an image; the operation content information covers the range, shape, region, position, and so on of the current fusion operation. For example, if the current fusion operation is information fusion, the corresponding operation object information is information, and the corresponding operation content includes the source information content, the source information address, the destination information address, the insertion position, and so on. If the current fusion operation is image fusion, the corresponding operation object information is an image, and the corresponding operation content includes the address of the image to be filled, the address of the image to be fused, and the fusion size or shape (that is, the shape or size of the mask).
Step 302, the graphics processor determines whether the attribute category of the current fusion operation is an image fusion operation according to the operation object information and the operation content information.

After obtaining the operation object information and the operation content information in step 301, the graphics processor can decide the category with a simple judgment: if the object of the current fusion operation is an image, or the operation content information includes the address of an image to be filled, the address of an image to be fused, a fusion size, a fusion shape, or the like, the current fusion operation is determined to be an image fusion operation; otherwise it is a non-image fusion operation.
In the embodiments of this application, the operation object information and the operation content information of the current fusion operation are obtained first, and whether the attribute category of the current fusion operation is an image fusion operation is then determined from them. This makes the judgment of the operation category more accurate and further improves the reliability of the image fusion method of the embodiments of this application.
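The judgment of steps 301 and 302 can be sketched as a simple predicate. The configuration field names below are hypothetical; the patent only says that operation object information and operation content information are carried in the fusion configuration.

```python
from dataclasses import dataclass

@dataclass
class FusionConfig:
    # Hypothetical field names: the patent names only the kinds of
    # information (operation object, operation content), not a format.
    operation_object: str    # e.g. "image", "data", "information"
    operation_content: dict  # e.g. addresses, mask size/shape

def is_image_fusion(cfg: FusionConfig) -> bool:
    """Steps 301-302 sketched: decide the attribute category from the
    operation object and the operation content with a simple judgment."""
    image_keys = {"fill_image_addr", "fused_image_addr", "mask_shape"}
    return cfg.operation_object == "image" or bool(
        image_keys & cfg.operation_content.keys())

cfg = FusionConfig("image", {"fill_image_addr": 0x1000, "mask_shape": "triangle"})
assert is_image_fusion(cfg)
assert not is_image_fusion(FusionConfig("information", {"src_addr": 0}))
```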
In an optional embodiment of the present application, in the step 202, if the attribute type of the current fusion operation is an image fusion operation, the graphics processor determines image information to be filled of an image to be filled, including the following step a:
Step A: if the attribute category of the current fusion operation is an image fusion operation, the graphics processor analyzes the fusion configuration information to obtain the to-be-filled image information of the image to be filled.

The fusion configuration information may be configured directly by the user or sent to the graphics processor by the CPU. When the attribute category of the current fusion operation is an image fusion operation, the graphics processor analyzes the fusion configuration information to obtain the to-be-filled image information of the image to be filled; the information to be filled may be a frame of image or a designated color, and this embodiment places no particular limitation on it.

In the embodiments of this application, when the attribute category of the current fusion operation is an image fusion operation, the graphics processor analyzes the fusion configuration information to obtain the to-be-filled image information of the image to be filled, ensuring that the conversion from image fusion to a pixel filling operation proceeds normally. Meanwhile, when the attribute category of the current fusion operation is a non-image fusion operation, no information analysis is performed, which avoids wasting computing resources and further improves the execution efficiency of the image fusion operation.
In an optional embodiment of the present application, in the step a, if the attribute type of the current fusion operation is an image fusion operation, the graphics processor analyzes the fusion configuration information to obtain image information to be filled of the image to be filled, including the following step B:
Step B: if the attribute category of the current fusion operation is a color filling operation, the graphics processor determines, from the fusion configuration information, the pixel information for color-filling the image to be fused.

If the attribute category of the current fusion operation is a color filling operation, for example the SourcePictTypeSolidFill operation in Table (2) above, whose operation content is to generate a solid-color picture of a specified color, the graphics processor only needs to determine the specified color from the fusion configuration information, that is, only the pixel information for color-filling the image to be fused; it does not need to go on to determine information such as an image to be filled. This greatly improves information processing efficiency and further improves the execution efficiency of the image fusion operation in the embodiments of this application. The pixel information may be a color, such as red, green, or blue, or a color value (an RGB value) such as (0, 255, 0); this embodiment places no particular limitation on it.
In an alternative embodiment of the present application, the image fusion operation is at least one of a source image fill operation and a color fill operation.
The color filling operation refers to SourcePictTypeSolidFill in Table (2), whose operation content is to generate a solid-color picture of a specified color. The source image filling operation refers to the RepeatNormal operation in Table (1), whose operation content is to fill the blank area with repetitions of the source picture: after the image information of the source image, that is, its pixel display information, is obtained, the blank area can be divided into a number of unit areas, and the specific operation can be converted into filling each of those unit areas with the designated colors.

The image fusion operation in the embodiments of this application is at least one of a source image filling operation and a color filling operation. Common graphics processors do not support these two operations natively, so they are generally handed to the CPU (central processing unit) for image drawing.
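Under the description above, a source image filling operation can be converted into per-unit-area fills of the blank region. A minimal sketch of that tiling, with the function name and border-clipping behavior assumed for illustration:

```python
import numpy as np

def source_fill(dst: np.ndarray, src: np.ndarray) -> np.ndarray:
    """RepeatNormal-style source image fill: split the blank destination
    into unit areas the size of the source and fill each unit with the
    source's pixel display information."""
    h2, w2 = dst.shape[:2]
    h1, w1 = src.shape[:2]
    out = dst.copy()
    for y in range(0, h2, h1):
        for x in range(0, w2, w1):
            tile = src[:h2 - y, :w2 - x]  # clip the last partial unit
            out[y:y + tile.shape[0], x:x + tile.shape[1]] = tile
    return out

dst = np.zeros((4, 4), dtype=int)
src = np.array([[1, 2], [3, 4]])
assert source_fill(dst, src).tolist() == [
    [1, 2, 1, 2], [3, 4, 3, 4], [1, 2, 1, 2], [3, 4, 3, 4]]
```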
Referring to fig. 4, in an optional embodiment of the present application, in step 203, the graphics processor performs a pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled, so as to obtain a target fusion image including the image to be filled, including the following steps 401 to 403:
Step 401, the graphics processor determines, in the pixel memory, an initial pixel unit whose shape is consistent with that of the image to be filled, based on the to-be-filled image information.

After determining the to-be-filled image information of the image to be filled, the graphics processor applies to the pixel storage module inside the graphics processor for a corresponding pixel frame, that is, it obtains from the pixel storage module a pixel unit whose size and shape are consistent with the size and shape of the filling content, thereby obtaining the corresponding initial pixel unit.
Step 402, the graphics processor adjusts the color of each pixel in the initial pixel unit to match the color of each pixel in the image to be filled, obtaining the target pixel unit.

For example, suppose the image to be filled is a green triangle and the graphics processor has applied for a triangular initial pixel unit in step 401. In this step, the graphics processor adjusts each pixel in the initial pixel unit to green, with an RGB value of (0, 255, 0), obtaining the corresponding target pixel unit.
And step 403, replacing the image in the region to be filled in the image to be fused with a target pixel unit by the graphics processor to obtain a target fused image.
After obtaining the target pixel unit, the graphics processor replaces the triangular region to be filled in the image to be fused (for example, a rectangular image) with the green-triangle target pixel unit, obtaining a rectangular image containing a green triangle, that is, the target fused image.
In the embodiments of this application, an initial pixel unit consistent with the shape of the image to be filled is first determined in the pixel memory based on the to-be-filled image information; the color of each pixel in the initial pixel unit is then adjusted to match the colors of the image to be filled, yielding the target pixel unit; finally, the image in the region to be filled in the image to be fused is replaced with the target pixel unit. The image fusion is thus completed without any CPU image drawing: the graphics processor finishes the whole process independently by applying for a pixel unit and filling colors, which greatly improves the execution efficiency of image filling.
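Steps 401 to 403 can be sketched end to end as follows. The boolean-mask representation of the pixel unit's shape and the function name are assumptions made for illustration; the patent describes the operations, not a data layout.

```python
import numpy as np

def fuse(dst: np.ndarray, mask: np.ndarray, color: tuple) -> np.ndarray:
    """Sketch of steps 401-403: allocate an initial pixel unit shaped
    like the image to be filled, set every pixel in it to the fill
    color, then replace the masked region of the image to be fused."""
    # Step 401: initial pixel unit with the same shape as the fill content
    unit = np.zeros(mask.shape + (3,), dtype=np.uint8)
    # Step 402: adjust each pixel's color -> target pixel unit
    unit[mask] = color
    # Step 403: replace the region to be filled in the image to be fused
    out = dst.copy()
    out[mask] = unit[mask]
    return out

dst = np.full((3, 3, 3), 255, dtype=np.uint8)   # white rectangular image
mask = np.tril(np.ones((3, 3), dtype=bool))     # triangular region to fill
result = fuse(dst, mask, (0, 255, 0))           # fill it with green
assert result[2, 0].tolist() == [0, 255, 0]     # inside the triangle
assert result[0, 2].tolist() == [255, 255, 255] # outside stays white
```

This mirrors the green-triangle example in the text: the mask supplies the shape, the color supplies the pixel values, and no CPU drawing is involved.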
In an optional embodiment of the present application, after the step 403, replacing, by the graphics processor, the image in the region to be filled in the image to be fused with the target pixel unit to obtain the target fused image, the fusing method further includes the following steps:
the graphics processor initializes the color of each pixel in the target pixel unit.
After the target fused image is obtained, the graphics processor initializes the colors of the pixels in the target pixel unit, so that the pixel colors can be adjusted conveniently the next time color filling is performed, which greatly improves the execution efficiency of image filling.
It should be understood that, although the steps in the flowcharts are shown in the order indicated by the arrows, they are not necessarily performed in that order. Unless explicitly stated otherwise, there is no strict restriction on the order in which these steps are performed, and they may be performed in other orders. Moreover, at least some of the steps in the figures may include multiple sub-steps or stages that are not necessarily performed at the same moment but may be performed at different moments, and their order of execution is not necessarily sequential; they may be performed in turn or alternately with other steps, or with at least some of the sub-steps or stages of other steps.
Referring to fig. 5, one embodiment of the present application provides a fusion device 500, including: a first determining module 510, a second determining module 520, and a fusing module 530, wherein:
the first determining module 510 is configured to determine an attribute category of a current fusion operation according to the received fusion configuration information;
the second determining module 520 is configured to determine image information to be filled of the image to be filled if the attribute type of the current fusion operation is the image fusion operation;
the fusion module 530 is configured to perform a pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled, so as to obtain a target fusion image including the image to be filled.
In an optional embodiment of the present application, the first determining module 510 is specifically configured to obtain operation object information and operation content information of the current fusion operation, and to determine, according to the operation object information and the operation content information, whether the attribute category of the current fusion operation is the image fusion operation.
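The first determining module's decision can be sketched as a small classifier over the fusion configuration information. The field names and category labels below are illustrative assumptions, not defined by the patent:

```python
def classify_fusion_operation(config):
    """Decide the attribute category of the current fusion operation
    from operation-object and operation-content info (hypothetical schema)."""
    obj = config.get("operation_object")
    content = config.get("operation_content")
    # In this sketch, a source-image fill or color fill applied to an
    # image object counts as an image fusion operation.
    if obj == "image" and content in ("source_image_fill", "color_fill"):
        return "image_fusion"
    return "other"
```

The second determining module would then branch on the returned category to decide whether to parse image information or pixel information from the configuration.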
In an optional embodiment of the present application, the second determining module 520 is specifically configured to, if the attribute type of the current fusion operation is an image fusion operation, analyze the fusion configuration information to obtain image information to be filled of the image to be filled.
In an optional embodiment of the present application, the second determining module 520 is specifically configured to determine, if the attribute type of the current fusion operation is a color filling operation, pixel information for performing color filling on the image to be fused from the fusion configuration information.
In an alternative embodiment of the present application, the image fusion operation is at least one of a source image fill operation and a color fill operation.
In an optional embodiment of the present application, the fusion module 530 is specifically configured to determine, in the pixel memory, an initial pixel unit having a shape consistent with that of the image to be filled based on the information of the image to be filled; adjusting the color of each pixel in the initial pixel unit to be matched with the color of each pixel in the image to be filled to obtain a target pixel unit; and replacing the image in the region to be filled in the image to be fused with a target pixel unit to obtain a target fused image.
In an optional embodiment of the present application, the fusion module 530 is further configured to initialize the color of each pixel in the target pixel unit.
For the specific limitations of the fusion device 500, reference may be made to the limitations of the fusion method above, which are not repeated here. The modules in the fusion device 500 may be implemented in whole or in part by software, hardware, or a combination thereof. The modules may be embedded in hardware in, or independent of, a processor in the computer device, or may be stored in software in a memory of the computer device, so that the processor can invoke them to perform the operations corresponding to each module.
In one embodiment, a computer device is provided, the internal structure of which may be as shown in fig. 6. The computer device includes a processor, a memory, a network interface, and a database connected by a system bus. The processor of the computer device provides computing and control capabilities. The memory of the computer device comprises a non-volatile storage medium and an internal memory; the non-volatile storage medium stores an operating system, a computer program, and a database, and the internal memory provides an environment for running the operating system and the computer program in the non-volatile storage medium. The database of the computer device is used for storing data, and the network interface of the computer device is used for communicating with external terminals through a network connection. In other words, the computer device comprises a memory and a processor, the memory storing a computer program, and the processor implements the steps of any of the above fusion methods when executing the computer program.
In an embodiment, a computer-readable storage medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out any of the steps of the above fusion method.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems), and computer program products according to embodiments of the application. It will be understood that each flow and/or block of the flowchart illustrations and/or block diagrams, and combinations of flows and/or blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
While the preferred embodiments of the present application have been described, additional variations and modifications in those embodiments may occur to those skilled in the art once they learn of the basic inventive concepts. Therefore, it is intended that the appended claims be interpreted as including preferred embodiments and all alterations and modifications as fall within the scope of the application.
It will be apparent to those skilled in the art that various changes and modifications may be made in the present application without departing from the spirit and scope of the application. Thus, if such modifications and variations of the present application fall within the scope of the claims of the present application and their equivalents, the present application is intended to include such modifications and variations as well.
Claims (10)
1. A fusion method, comprising:
determining the attribute type of the current fusion operation according to the received fusion configuration information;
if the attribute type of the current fusion operation is the image fusion operation, determining image information to be filled of the image to be filled;
and performing pixel filling operation in the image to be fused corresponding to the current fusion operation based on the image information to be filled to obtain a target fusion image containing the image to be filled.
2. The fusion method according to claim 1, wherein the determining the attribute category of the current fusion operation according to the received fusion configuration information comprises:
acquiring operation object information and operation content information of the current fusion operation;
and determining whether the attribute category of the current fusion operation is the image fusion operation or not according to the operation object information and the operation content information.
3. The fusion method according to claim 1, wherein determining the image information to be filled of the image to be filled if the attribute type of the current fusion operation is an image fusion operation comprises:
and if the attribute type of the current fusion operation is the image fusion operation, analyzing the fusion configuration information to obtain the image information to be filled of the image to be filled.
4. The fusion method according to claim 3, wherein, if the attribute type of the current fusion operation is the image fusion operation, analyzing the fusion configuration information to obtain the image information to be filled of the image to be filled comprises:
and if the attribute type of the current fusion operation is color filling operation, determining pixel information for performing color filling on the image to be fused from the fusion configuration information.
5. The fusion method of claim 1, wherein the image fusion operation is at least one of a source image fill operation and a color fill operation.
6. The fusion method according to claim 1, wherein the performing a pixel filling operation in the image to be fused corresponding to the current fusion operation based on the image information to be filled to obtain a target fusion image including the image to be filled comprises:
determining an initial pixel unit with the shape consistent with that of the image to be filled in a pixel memory based on the image information to be filled;
adjusting the color of each pixel in the initial pixel unit to be matched with the color of each pixel in the image to be filled to obtain a target pixel unit;
and replacing the image in the region to be filled in the image to be fused with the target pixel unit to obtain the target fused image.
7. The fusion method according to claim 6, wherein after the replacing the image in the region to be filled in the image to be fused with the target pixel unit to obtain the target fused image, the method further comprises:
and initializing the color of each pixel in the target pixel unit.
8. A fusion device, comprising:
the first determining module is used for determining the attribute category of the current fusion operation according to the received fusion configuration information;
the second determining module is used for determining the image information to be filled of the image to be filled if the attribute type of the current fusion operation is the image fusion operation;
and the fusion module is used for carrying out pixel filling operation in the image to be fused corresponding to the current fusion operation based on the information of the image to be filled to obtain a target fusion image containing the image to be filled.
9. A computer device comprising a memory and a processor, the memory storing a computer program, wherein the processor implements the steps of the method of any one of claims 1 to 7 when executing the computer program.
10. A computer-readable storage medium, on which a computer program is stored, which, when being executed by a processor, carries out the steps of the method of any one of claims 1 to 7.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202210389597.8A CN114663327A (en) | 2022-04-14 | 2022-04-14 | Fusion method, fusion device, computer equipment and storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
CN114663327A true CN114663327A (en) | 2022-06-24 |
Family
ID=82035693
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202210389597.8A Pending CN114663327A (en) | 2022-04-14 | 2022-04-14 | Fusion method, fusion device, computer equipment and storage medium |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN114663327A (en) |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US9558542B2 (en) | Method and device for image processing | |
US20150222814A1 (en) | Image Acquisition Method and Apparatus | |
CN108021671B (en) | Page transparent processing method and device | |
CN107450897B (en) | Cross-platform migration method and device for graphic engine | |
US10319068B2 (en) | Texture not backed by real mapping | |
CN112651475B (en) | Two-dimensional code display method, device, equipment and medium | |
CN109472852A (en) | Display methods and device, the equipment and storage medium of point cloud chart picture | |
CN115439609B (en) | Three-dimensional model rendering method, system, equipment and medium based on map service | |
CN114072760B (en) | Cutting method, distribution method, medium, server, and system | |
CN111833417A (en) | Method and system for realizing black and white mode of android application program | |
CN114416056A (en) | Page generation method, system, computer equipment and readable storage medium | |
CN110806847A (en) | Distributed multi-screen display method, device, equipment and system | |
CN112714357A (en) | Video playing method, video playing device, electronic equipment and storage medium | |
US20190043249A1 (en) | Method and apparatus for blending layers within a graphics display component | |
CN111787240B (en) | Video generation method, apparatus and computer readable storage medium | |
CN111223155A (en) | Image data processing method, image data processing device, computer equipment and storage medium | |
CN112396610A (en) | Image processing method, computer equipment and storage medium | |
CN117710549A (en) | Rendering method and device | |
CN114663327A (en) | Fusion method, fusion device, computer equipment and storage medium | |
CN108010095B (en) | Texture synthesis method, device and equipment | |
CN111787081B (en) | Information processing method based on Internet of things interaction and intelligent communication and cloud computing platform | |
CN114463400A (en) | Texture sampling method and system based on texture object segmentation | |
US9069905B2 (en) | Tool-based testing for composited systems | |
CN112381905A (en) | Color superposition method and device and electronic equipment | |
CN110852936B (en) | Method and device for mixing pictures |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||