CA2931695A1 - Picture fusion method and apparatus

Publication number: CA2931695A1 (granted as CA2931695C)
Authority: CA (Canada)
Inventor: Yulong Wang
Assignee: Tencent Technology (Shenzhen) Co., Ltd.
Legal status: Granted; Active
Prior art keywords: picture, clipping, fusion, carrier, clipped
Classification: G06T 11/00 - 2D [Two Dimensional] image generation
Abstract

A picture fusion method and apparatus are described. A clipping carrier is created; a first picture is added to the clipping carrier and set to a clipping template; a second picture is added to the clipping carrier and set to to-be-clipped content; a fusion parameter of the first picture and the second picture is set; and the clipping carrier is added to a corresponding layer of an application background. When a picture fusion instruction is received, the second picture is clipped along an outline of a graph in the first picture, the parts of the second picture that are out of the outline surrounding range are filtered out, and, according to the fusion parameter, pixel data of a fusion picture of the reserved part of the second picture and the first picture is calculated.

Description

PICTURE FUSION METHOD AND APPARATUS
FIELD OF THE TECHNOLOGY
[0001] The present disclosure relates to picture processing technologies, and in particular, to a picture fusion method and apparatus.
BACKGROUND OF THE DISCLOSURE
[0002] The Open Graphics Library (OpenGL) is a professional graphics program interface and a powerful, easy-to-invoke underlying graphics library used for processing two-dimensional and three-dimensional images. OpenGL defines a standard for a cross-programming-language, cross-platform programming interface, is independent of Windows or other operating systems, and is also network-transparent. Therefore, software supporting OpenGL has good portability. OpenGL for Embedded Systems (OpenGL ES) is a subset of the OpenGL three-dimensional graphics API that is specially designed for embedded devices, such as mobile phones, PDAs, and game consoles, and various embedded systems, and creates a flexible and powerful underlying interaction interface between software and graphics acceleration. OpenGL ES 2.0 can greatly improve the 3D graphics rendering speed of different consumer electronic devices and achieve fully programmable 3D graphics in an embedded system.
[0003] Picture fusion in the OpenGL is a common technology, that is, according to different fusion parameters, fusion calculation is performed on pixel data of pictures, so as to obtain a fusion picture of a special effect.
[0004] Because of hardware performance limitations of a mobile terminal, applications in the mobile terminal cannot use as many high-definition pictures and flash animation effects as applications in a PC terminal do. Applications in the mobile terminal supporting OpenGL achieve rich visual effects by fusing pictures and then adding some geometric and color changes, and thereby occupy little memory.
SUMMARY
[0005] Based on the above, it is necessary to provide a picture fusion method for fusing pictures based on an outline of a graph in a picture.
[0006] A picture fusion method includes:

creating a clipping carrier;
adding a first picture to the clipping carrier and setting the first picture to a clipping template on the clipping carrier;
adding a second picture to the clipping carrier and setting the second picture to to-be-clipped content on the clipping carrier;
setting a fusion parameter of the first picture and the second picture;
adding the clipping carrier to a corresponding layer of an application background;
and clipping the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, a clipped second picture including parts that are out of an outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture, the picture within the outline surrounding range being a fusion part of the first picture and the second picture; filtering out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculating pixel data of the picture within the outline surrounding range according to the fusion parameter.
[0007] In addition, it is further necessary to provide a picture fusion apparatus for fusing pictures based on an outline of a graph in a picture.
[0008] A picture fusion apparatus includes:
a carrier creating module, configured to create a clipping carrier;
a clipping template setting module, configured to add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier;
a clipped object setting module, configured to add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier;
a fusion parameter setting module, configured to set a fusion parameter of the first picture and the second picture;
a carrier adding module, configured to add the clipping carrier to a corresponding layer of an application background; and a fusion module, configured to clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture that are out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
[0009] In the foregoing picture fusion method and apparatus, a clipping carrier is created, a first picture is added to the clipping carrier and set to a clipping template, and a second picture is added to the clipping carrier so that the second picture becomes a to-be-clipped picture. When the first picture and the second picture are fused, the second picture is clipped along an outline of a graph in the first picture, the parts of the second picture that are out of an outline surrounding range are filtered out, and the reserved part of the second picture and the first picture are fused. Because the two pictures are fused on the basis of the outline of the graph in the first picture, the obtained fusion picture has no redundant parts outside that outline. The effect of visually mixed colors, produced when redundant parts overlap other pictures in the application background, therefore does not occur, and the display of the fusion picture is not affected by it.
BRIEF DESCRIPTION OF THE DRAWINGS
[0010] FIG. 1 is a schematic diagram of a picture fusion effect in the existing technology;
[0011] FIG. 2 is a schematic flowchart of a picture fusion method in an embodiment;
[0012] FIG. 3 is a schematic flowchart of a picture fusion method in another embodiment;
[0013] FIG. 4 is a schematic diagram of an effect of a fusion picture obtained by fusing picture a and picture b in FIG. 1 according to a picture fusion method in an embodiment;
[0014] FIG. 5 is a schematic structural diagram of a picture fusion apparatus in an embodiment;
[0015] FIG. 6 is a schematic structural diagram of a picture fusion apparatus in another embodiment; and
[0016] FIG. 7 is a schematic structural diagram of a picture fusion apparatus in still another embodiment.
DESCRIPTION OF EMBODIMENTS
[0017] To make the objectives, technical solutions, and advantages of the present disclosure clearer, the following further describes the present disclosure in detail with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely for illustrating the present disclosure and are not intended to limit it.
[0018] It may be understood that terms such as "first" and "second" used in the present disclosure may be used for describing elements herein, but the elements are not limited by these terms. The terms are used only to distinguish one element from another. For example, without departing from the scope of the present disclosure, a first picture may be referred to as a second picture, and similarly, a second picture may be referred to as a first picture. The first picture and the second picture are both pictures but are not the same picture.
[0019] When two pictures are fused, it is usually required to use an outline of a graph in a first picture as a basis to fuse a lighting effect of a second picture into the first picture; the parts of the second picture that are out of the outline surrounding range of the graph need to be filtered out and are no longer displayed. However, currently, a picture is clipped based on a rectangle and cannot be clipped along irregular graphs. Therefore, after the two pictures are fused in an application background, the redundant, unwanted parts overlap other pictures in the application background, which produces an effect of visually mixed colors and may affect the display of the fusion picture.
[0020] As shown in FIG. 1, picture a and picture b are in picture formats having an alpha channel and include a transparent part and a non-transparent part. Both pictures are rectangular. The non-transparent part in picture a is elliptical and has a high lighting effect. The non-transparent part in picture b is in the shape of a Chinese character. The objective of fusing picture a and picture b is to give the Chinese character in picture b a high lighting effect. However, after picture a and picture b are fused and the fusion picture is placed in an application background, as shown in picture c, the redundant parts of picture a outside the Chinese character overlap other pictures in the application background, which produces an effect of visually mixed colors and affects the display of the high lighting effect of the Chinese character.
[0021] As shown in FIG. 2, in an embodiment, a picture fusion method includes the following steps:
[0022] Step S201: Create a clipping carrier.
[0023] The clipping carrier, as the carrier of a clipping template and of to-be-clipped content, may be used for setting one picture to a clipping template and another picture to to-be-clipped content.

[0024] Step S202: Add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier.
[0025] Step S203: Add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier.
[0026] The first picture and the second picture are pictures having an alpha channel, for example, pictures in the png format, and both include a transparent part and a non-transparent part. Setting the first picture to a clipping template on the clipping carrier may trigger an operation of setting the non-transparent part of the first picture to a clipping area. Therefore, the picture fusion method further includes a step of setting the non-transparent part of the first picture to a clipping area after the first picture is set to a clipping template on the clipping carrier. The clipping area of the clipping template is used as a clipping mold for the to-be-clipped content, and the to-be-clipped content is clipped according to an outline of the clipping mold, so that a part within the range of the clipping mold and parts out of that range may be obtained.
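The clipping-mold behavior described above can be sketched as follows. This is a minimal illustration operating on assumed RGBA pixel buffers, not the patent's cocos2d-x implementation: a content pixel is reserved only where the template's alpha is non-zero, i.e. inside the non-transparent clipping area.

```cpp
#include <cstdint>
#include <vector>

// Hypothetical RGBA pixel type for this sketch.
struct Rgba { uint8_t r, g, b, a; };

// Sketch of clipping by a template's non-transparent area: a content pixel is
// kept only where the template pixel's alpha is non-zero, i.e. within the
// outline of the graph in the template picture.
std::vector<Rgba> clipByAlphaMask(const std::vector<Rgba>& templatePixels,
                                  const std::vector<Rgba>& contentPixels) {
    std::vector<Rgba> out(contentPixels.size(), Rgba{0, 0, 0, 0});
    for (std::size_t i = 0; i < contentPixels.size() && i < templatePixels.size(); ++i) {
        if (templatePixels[i].a > 0) {
            out[i] = contentPixels[i];  // inside the clipping area: reserved
        }
        // outside the clipping area: stays fully transparent (filtered out)
    }
    return out;
}
```

Applied to whole pictures, this yields the part within the range of the clipping mold (kept) and the parts out of it (discarded).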
[0027] Step S204: Set a fusion parameter of the first picture and the second picture.
[0028] The fusion parameter governs the fusion calculation performed on pixel data of the first picture and the second picture. Calculating the pixel data of the two pictures according to different fusion parameters yields different fusion values to serve as pixel data of a fusion picture, thereby producing different fusion effects. The number of available fusion parameters depends on the underlying rendering engine: if the engine provides n fusion parameters, there are n*n combinations of fusion parameters for the first picture and the second picture. The respective fusion parameters of the first picture and the second picture may be set according to the required fusion effect.
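As a hedged illustration of how a pair of fusion parameters governs the per-pixel calculation, the sketch below models OpenGL-style source/destination blend factors; the factor set is a small assumed subset, not any engine's full list:

```cpp
#include <algorithm>

// Assumed subset of blend factors an underlying rendering engine might offer;
// with n factors there are n*n source/destination combinations.
enum class Factor { One, Zero, SrcAlpha, OneMinusSrcAlpha };

// Weight contributed by one factor, given the source pixel's alpha.
static float weight(Factor f, float srcAlpha) {
    switch (f) {
        case Factor::One:              return 1.0f;
        case Factor::Zero:             return 0.0f;
        case Factor::SrcAlpha:         return srcAlpha;
        case Factor::OneMinusSrcAlpha: return 1.0f - srcAlpha;
    }
    return 0.0f;
}

// Per-channel fusion value: src * srcFactor + dst * dstFactor, clamped to
// [0, 1]. Applying this to every overlapped pixel yields the fusion picture.
float fuseChannel(float src, float dst, float srcAlpha,
                  Factor srcFactor, Factor dstFactor) {
    float v = src * weight(srcFactor, srcAlpha) + dst * weight(dstFactor, srcAlpha);
    return std::min(1.0f, std::max(0.0f, v));
}
```

For example, the combination (SrcAlpha, OneMinusSrcAlpha) gives classic alpha blending, while (One, Zero) simply keeps the source picture; each combination produces a different fusion effect.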
[0029] Step S205: Add the clipping carrier to a corresponding layer of an application background.
[0030] Multiple pictures in an application background belong to different levels and the pictures are overlapped according to corresponding levels.
[0031] Step S206: Clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, a clipped second picture including parts that are out of an outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture, the picture within the outline surrounding range being a fusion part of the first picture and the second picture; filter out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculate pixel data of the picture within the outline surrounding range according to the fusion parameter.
[0032] Specifically, a clipped second picture includes parts that are out of an outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture; the picture within the outline surrounding range is a fusion part of the first picture and the second picture, which may also be referred to as a superposed or overlapped part of the two pictures. The filtering out refers to filtering out the parts of the clipped second picture that are out of the outline surrounding range of the first picture, so as to reserve the part of the second picture that is within the outline surrounding range of the first picture.
[0033] The non-transparent part in the first picture is presented in a form of a graph, and the graph in the first picture is the non-transparent part of the first picture.
[0034] A fusion value of the pixel data (that is, the data of overlapped pixels) of the reserved part of the second picture and the first picture at a same position may be calculated according to the fusion parameter, so as to obtain the pixel data of the fusion picture at that position. Specifically, pixel data of the part of the second picture within the outline surrounding range of the first picture is calculated.
[0035] In an embodiment, the foregoing clipping carrier is a CCClippingNode object. As shown in FIG. 3, a picture fusion method in this embodiment includes the following steps:
[0036] Step S301: Create a CCClippingNode object.
[0037] The CCClippingNode object may be used for clipping a UI (user interface) control and is inherited from the node class CCNode. CCNode is the parent class of scene, layer, menu, sprite, and the like in cocos2d-x. cocos2d-x is an open-source mobile 2D game framework supporting OpenGL ES.
[0038] For example, program code for creating a CCClippingNode object includes:
    CCClippingNode *clipper = CCClippingNode::create();

where the name of the created CCClippingNode object is clipper.
[0039] Step S302: Set a clipping template of the CCClippingNode object to a first picture.
[0040] For example, program code for setting a clipping template of the foregoing clipper to a first picture includes:
    CCSprite *word = CCSprite::createWithSpriteFrameName("firstpicture.png"); // create a CCSprite object word corresponding to firstpicture.png
    clipper->setStencil(word);  // set word to a clipping template of clipper
    clipper->addChild(word);    // add word to serve as a child of clipper

where firstpicture.png is the first picture; CCSprite is the sprite class in cocos2d-x; createWithSpriteFrameName is a function for creating a sprite on the basis of input parameters, and the first line creates a sprite object word corresponding to the first picture; setStencil is a function of the CCClippingNode class for setting a clipping template, and the second line sets word to a clipping template of clipper, that is, sets the first picture to a clipping template of clipper; further, word needs to be added as a child of clipper, which the third line does.
[0041] Step S303: Set to-be-clipped content of the CCClippingNode object to a second picture.
[0042] For example, program code for setting to-be-clipped content of the foregoing clipper to a second picture includes:
    CCSprite* silderShine = CCSprite::createWithSpriteFrameName("secondpicture.png"); // create a CCSprite object silderShine corresponding to secondpicture.png
    clipper->addChild(silderShine); // add silderShine to serve as a child of clipper

where secondpicture.png is the second picture; the first line creates a sprite object silderShine corresponding to the second picture, and the second line adds silderShine as a child of clipper, so as to set the to-be-clipped content of the CCClippingNode object to the second picture.
[0043] The picture fusion method in this embodiment may further include a step of setting the part of the to-be-clipped content of the CCClippingNode object that is reserved after clipping to the part surrounded by the clipping outline. This may be done by setting the isInverted attribute of the CCClippingNode object to true.
[0044] Step S304: Set a fusion parameter of the first picture and the second picture.
[0045] Step S305: Add the CCClippingNode object to a corresponding layer of an application background.
[0046] Step S306: Clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture that are out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
[0047] Steps S301 to S303 and S305 in this embodiment correspond respectively to steps S201 to S203 and S205 in the foregoing embodiment, and are specific implementation manners of those steps.
[0048] In an embodiment, a picture fusion instruction is triggered by a screen refresh instruction, or a picture fusion instruction is a screen refresh instruction; receiving a screen refresh instruction is equivalent to receiving a picture fusion instruction. Each time the screen is refreshed, the first picture and the second picture are fused once, that is, a fusion value of the pixel data of the reserved part of the second picture and the pixel data of the first picture is calculated once according to the fusion parameter, so as to obtain pixel data of a fusion picture.
[0049] The foregoing picture fusion method further includes a step of displaying the fusion picture obtained by means of calculation.
[0050] If the first picture includes a dynamic effect, that is, the first picture is composed of a series of image frames, fusion calculation is performed on the second picture and the image frames selected sequentially from the first picture, so as to obtain a series of fusion image frames. The multiple fusion image frames are displayed sequentially, which produces a dynamic effect. The processing of a second picture that includes a dynamic effect is the same as the processing of a first picture that includes a dynamic effect, and is not described herein again. If both the first picture and the second picture include a dynamic effect, that is, both are composed of a series of image frames, the image frames of the first picture and of the second picture may be selected sequentially, and fusion calculation is performed on each pair of selected image frames, so as to obtain a series of fusion image frames.
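The frame-selection loop described above can be sketched as follows, with frames modeled as plain strings and a placeholder fuse step standing in for the per-pixel fusion calculation (both names are illustrative, not from the patent):

```cpp
#include <string>
#include <vector>

// Stand-in for the per-pixel fusion calculation on one pair of frames.
static std::string fusePair(const std::string& a, const std::string& b) {
    return a + "+" + b;
}

// Sketch: when both pictures are frame sequences, frames are selected
// sequentially and fused pairwise, yielding a sequence of fusion frames that
// is then displayed in order to produce the dynamic effect.
std::vector<std::string> fuseFrameSequences(const std::vector<std::string>& first,
                                            const std::vector<std::string>& second) {
    std::vector<std::string> fused;
    const std::size_t n = first.size() < second.size() ? first.size() : second.size();
    fused.reserve(n);
    for (std::size_t i = 0; i < n; ++i) {
        fused.push_back(fusePair(first[i], second[i]));
    }
    return fused;
}
```

If only one picture is animated, the same loop applies with the static picture repeated for every frame of the animated one.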
[0051] FIG. 4 is a schematic diagram of a fusion picture obtained by fusing picture a and picture b in FIG. 1 according to a picture fusion method in an embodiment. In the fusion process, picture a is clipped along the outline of the Chinese character in picture b, and the parts of picture a that are out of the outline surrounding range of the Chinese character are filtered out, so that the effect of visually mixed colors, produced when redundant parts overlap other pictures in an application background, does not occur.

[0052] As shown in FIG. 5, in an embodiment, a picture fusion apparatus includes a carrier creating module 10, a clipping template setting module 20, a clipped object setting module 30, a fusion parameter setting module 40, a carrier adding module 50, and a fusion module 60.
[0053] The carrier creating module 10 is configured to create a clipping carrier.
[0054] The clipping carrier, as the carrier of a clipping template and of to-be-clipped content, may be used for setting one picture to a clipping template and another picture to to-be-clipped content.
[0055] The clipping template setting module 20 is configured to add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier.
[0056] Setting the first picture to a clipping template on the clipping carrier may trigger an operation of setting a non-transparent part of the first picture to a clipping area. Therefore, the clipping template setting module 20 is further configured to set a non-transparent part of the first picture to a clipping area after the first picture is set to a clipping template on the clipping carrier. The clipping area is used as a clipping mold for the to-be-clipped content, and the to-be-clipped content is clipped according to an outline of the clipping mold, so that a part within the range of the clipping mold and parts out of that range may be obtained.
[0057] The clipped object setting module 30 is configured to add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier.
[0058] The fusion parameter setting module 40 is configured to set a fusion parameter of the first picture and the second picture.
[0059] The fusion parameter governs the fusion calculation performed on pixel data of the first picture and the second picture. Calculating the pixel data of the two pictures according to different fusion parameters yields different fusion values to serve as pixel data of a fusion picture, thereby producing different fusion effects. The number of available fusion parameters depends on the underlying rendering engine: if the engine provides n fusion parameters, there are n*n combinations of fusion parameters for the first picture and the second picture. The respective fusion parameters of the first picture and the second picture may be set according to the required fusion effect.
[0060] The carrier adding module 50 is configured to add the clipping carrier to a corresponding layer of an application background.
[0061] Multiple pictures in an application background belong to different levels and the pictures are overlapped according to corresponding levels.

[0062] The fusion module 60 is configured to clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture that are out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
[0063] The non-transparent part in the first picture is presented in a form of a graph, and the graph in the first picture is the non-transparent part of the first picture.
[0064] The fusion module 60 may calculate a fusion value of pixel data (that is, data of overlapped pixels) of the reserved part of the second picture and the first picture at a same position according to the fusion parameter, so as to obtain pixel data of the fusion picture at the same position.
[0065] In an embodiment, the foregoing clipping carrier is a CCClippingNode object. In this embodiment, the carrier creating module 10 is configured to create a CCClippingNode object.
[0066] The CCClippingNode object may be used for clipping a UI (user interface) control and is inherited from the node class CCNode. CCNode is the parent class of scene, layer, menu, sprite, and the like in cocos2d-x. cocos2d-x is an open-source mobile 2D game framework supporting OpenGL ES.
[0067] For example, program code for the carrier creating module 10 to create a CCClippingNode object includes:

    CCClippingNode *clipper = CCClippingNode::create();

where the name of the created CCClippingNode object is clipper.
[0068] The clipping template setting module 20 is configured to set a clipping template of the CCClippingNode object to a first picture.
[0069] For example, program code for the clipping template setting module 20 to set a clipping template of the foregoing clipper to a first picture includes:
    CCSprite *word = CCSprite::createWithSpriteFrameName("firstpicture.png"); // create a CCSprite object word corresponding to firstpicture.png
    clipper->setStencil(word);  // set word to a clipping template of clipper
    clipper->addChild(word);    // add word to serve as a child of clipper

where firstpicture.png is the first picture; CCSprite is the sprite class in cocos2d-x; createWithSpriteFrameName is a function for creating a sprite on the basis of input parameters, and the first line creates a sprite object word corresponding to the first picture; setStencil is a function of the CCClippingNode class for setting a clipping template, and the second line sets word to a clipping template of clipper, that is, sets the first picture to a clipping template of clipper; further, word needs to be added as a child of clipper, which the third line does.
[0070] The clipped object setting module 30 is configured to set to-be-clipped content of the CCClippingNode object to a second picture.
[0071] For example, program code for the clipped object setting module 30 to set to-be-clipped content of the foregoing clipper to a second picture includes:
    CCSprite* silderShine = CCSprite::createWithSpriteFrameName("secondpicture.png"); // create a CCSprite object silderShine corresponding to secondpicture.png
    clipper->addChild(silderShine); // add silderShine to serve as a child of clipper

where secondpicture.png is the second picture; the first line creates a sprite object silderShine corresponding to the second picture, and the second line adds silderShine as a child of clipper, so as to set the to-be-clipped content of the CCClippingNode object to the second picture.
[0072] As shown in FIG. 6, the picture fusion apparatus in this embodiment may further include a reserved part setting module 70 that is configured to set the part of the to-be-clipped content of the CCClippingNode object that is reserved after clipping to the part surrounded by the clipping outline. To achieve this, the reserved part setting module 70 may set the isInverted attribute of the CCClippingNode object to true.
[0073] The carrier adding module 50 is configured to add the CCClippingNode object to a corresponding layer of an application background.
[0074] In an embodiment, a picture fusion instruction is triggered by a screen refresh instruction, or a picture fusion instruction is a screen refresh instruction; receiving a screen refresh instruction is equivalent to receiving a picture fusion instruction. Each time the screen is refreshed, the fusion module 60 fuses the first picture and the second picture once, that is, calculates a fusion value of the pixel data of the reserved part of the second picture and the pixel data of the first picture once according to the fusion parameter, so as to obtain pixel data of a fusion picture.
[0075] As shown in FIG. 7, the foregoing picture fusion apparatus may further include a display module 80 that is configured to display the fusion picture obtained by means of calculation.

[0076] If the first picture includes a dynamic effect, that is, the first picture is composed of a series of image frames, the fusion module 60 may perform fusion calculation on the second picture and the image frames that are sequentially selected from the first picture, so as to obtain a series of fusion image frames. A processing process of the second picture including a dynamic effect is the same as the processing process of the first picture including a dynamic effect, which is not described herein again. If both the first picture and the second picture include a dynamic effect, that is, both the first picture and the second picture are composed of a series of image frames, the fusion module 60 may sequentially select the image frames included in the first picture and the image frames included in the second picture, and perform fusion calculation on two selected image frames, so as to obtain a series of fusion image frames. The display module 80 may sequentially display the multiple fusion image frames, which produces a dynamic effect.
[0077] In the foregoing picture fusion method and apparatus, a clipping carrier is created, a first picture is added to the clipping carrier and set to a clipping template, and a second picture is added to the clipping carrier so that the second picture becomes a to-be-clipped picture. When the first picture and the second picture are fused, the second picture is clipped along an outline of a graph in the first picture, the parts of the second picture that are out of an outline surrounding range are filtered out, and the reserved part of the second picture and the first picture are fused. Because the two pictures are fused on the basis of the outline of the graph in the first picture, the obtained fusion picture has no redundant parts outside that outline. The effect of visually mixed colors, produced when redundant parts overlap other pictures in the application background, therefore does not occur, and the display of the fusion picture is not affected by it.
[0078] The foregoing embodiments describe only several implementation manners of the present disclosure, and their description is specific and detailed, but they cannot therefore be understood as a limitation on the patent scope of the present disclosure. It should be noted that a person of ordinary skill in the art may further make variations and improvements without departing from the conception of the present disclosure, and these all fall within the protection scope of the present disclosure. Therefore, the patent protection scope of the present disclosure should be subject to the appended claims.

Claims (10)

What is claimed is:
1. A picture fusion method, comprising:
at a computing device having one or more processors and a memory storing programs executed by the one or more processors:
creating a clipping carrier;
adding a first picture to the clipping carrier and setting the first picture to a clipping template on the clipping carrier;
adding a second picture to the clipping carrier and setting the second picture to to-be-clipped content on the clipping carrier;
setting a fusion parameter of the first picture and the second picture;
adding the clipping carrier to a corresponding layer of an application background;
clipping the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, the clipped second picture comprising parts that are out of an outline surrounding range of the first picture and a picture that is within the outline surrounding range of the first picture, the picture within the outline surrounding range being a fusion part of the first picture and the second picture;
filtering out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculating pixel data of the picture within the outline surrounding range according to the fusion parameter.
2. The method according to claim 1, wherein both the first picture and the second picture comprise a transparent part and a non-transparent part.
3. The method according to claim 2, wherein the non-transparent part of the first picture is presented in a form of a graph; and the setting the first picture to a clipping template on the clipping carrier triggers a clipping area setting operation: setting the non-transparent part of the first picture to a clipping area, wherein the clipping area is a clipping mold for the to-be-clipped content and the to-be-clipped content is clipped according to an outline of the clipping mold.
4. The method according to claim 1, wherein the picture fusion instruction is triggered by a screen refresh instruction, or the picture fusion instruction is a screen refresh instruction.
5. The method according to claim 1, wherein the calculating pixel data of the picture within the outline surrounding range according to the fusion parameter comprises:
calculating, according to the fusion parameter, a fusion value of the pixel data at a same position in the part of the second picture within the outline surrounding range of the first picture and in the first picture, so as to obtain pixel data of the picture within the outline surrounding range.
6. A picture fusion apparatus, comprising:
one or more processors;
a memory; and one or more program modules stored in the memory and executed by the one or more processors, the one or more program modules comprising:
a carrier creating module, configured to create a clipping carrier;
a clipping template setting module, configured to add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier;
a clipped object setting module, configured to add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier;
a fusion parameter setting module, configured to set a fusion parameter of the first picture and the second picture;
a carrier adding module, configured to add the clipping carrier to a corresponding layer of an application background;
a fusion module, configured to: clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, the clipped second picture comprising parts that are out of an outline surrounding range of the first picture and a picture that is within the outline surrounding range of the first picture, the picture within the outline surrounding range being an overlapped part of the first picture and the second picture; filter out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculate pixel data of the picture within the outline surrounding range according to the fusion parameter.
7. The apparatus according to claim 6, wherein both the first picture and the second picture comprise a transparent part and a non-transparent part.
8. The apparatus according to claim 7, wherein the non-transparent part of the first picture is presented in a form of a graph; and the clipped object setting module is further configured to set the first picture to the clipping template on the clipping carrier to trigger a clipping area setting operation:
setting the non-transparent part of the first picture to a clipping area, wherein the clipping area is a clipping mold for the to-be-clipped content and the to-be-clipped content is clipped according to an outline of the clipping mold.
9. The apparatus according to claim 6, wherein the picture fusion instruction is triggered by a screen refresh instruction, or the picture fusion instruction is a screen refresh instruction.
10. The apparatus according to claim 6, wherein the calculating, by the fusion module, the pixel data of the picture within the outline surrounding range according to the fusion parameter comprises:
calculating, by the fusion module according to the fusion parameter, a fusion value of the pixel data at a same position in the part of the second picture within the outline surrounding range of the first picture and in the first picture, so as to obtain pixel data of the picture within the outline surrounding range.
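The steps recited in the claims can be gathered into one sketch. This is an illustrative reading of the claims, not the claimed apparatus itself: the class name `ClippingCarrier`, its method names, the RGBA-array picture model, and the weighted-blend fusion calculation are all assumptions introduced for the example.

```python
import numpy as np

class ClippingCarrier:
    """Sketch of the claimed flow: the carrier holds a clipping template
    (first picture), to-be-clipped content (second picture), and a fusion
    parameter; fusion runs when an instruction (e.g. a screen refresh,
    per claim 4) arrives."""

    def __init__(self):
        self.template = None   # first picture, RGBA array
        self.content = None    # second picture, RGBA array
        self.alpha = 0.5       # fusion parameter

    def set_clipping_template(self, pic):
        self.template = pic

    def set_clipped_content(self, pic):
        self.content = pic

    def set_fusion_parameter(self, alpha):
        self.alpha = alpha

    def on_fusion_instruction(self):
        # Clipping area = non-transparent part of the template (claim 3).
        mask = self.template[..., 3] > 0
        fused = np.zeros_like(self.template)  # parts outside stay transparent
        # Fusion value of same-position pixel data (claims 5 and 10),
        # here assumed to be a blend weighted by the fusion parameter.
        mix = (self.alpha * self.template[..., :3]
               + (1.0 - self.alpha) * self.content[..., :3])
        fused[..., :3][mask] = mix[mask]
        fused[..., 3][mask] = self.template[..., 3][mask]
        return fused
```

A host application would add such a carrier to the appropriate layer of its background and call `on_fusion_instruction` from its refresh path.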
CA2931695A 2014-04-18 2015-04-15 Picture fusion method and apparatus Active CA2931695C (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
CN201410159269.4A CN105023259B (en) 2014-04-18 2014-04-18 Picture fusion method, device, terminal and computer readable storage medium
CN201410159269.4 2014-04-18
PCT/CN2015/076597 WO2015158255A1 (en) 2014-04-18 2015-04-15 Picture fusion method and apparatus

Publications (2)

Publication Number Publication Date
CA2931695A1 (en) 2015-10-22
CA2931695C CA2931695C (en) 2018-04-24

Family

ID=54323485

Family Applications (1)

Application Number Title Priority Date Filing Date
CA2931695A Active CA2931695C (en) 2014-04-18 2015-04-15 Picture fusion method and apparatus

Country Status (4)

Country Link
CN (1) CN105023259B (en)
CA (1) CA2931695C (en)
MY (1) MY174549A (en)
WO (1) WO2015158255A1 (en)

Also Published As

Publication number Publication date
CN105023259A (en) 2015-11-04
MY174549A (en) 2020-04-24
WO2015158255A1 (en) 2015-10-22
CN105023259B (en) 2019-06-25
CA2931695C (en) 2018-04-24


Legal Events

Date Code Title Description
EEER Examination request

Effective date: 20160526