WO2015158255A1 - Picture fusion method and apparatus - Google Patents

Picture fusion method and apparatus

Info

Publication number
WO2015158255A1
Authority
WO
WIPO (PCT)
Prior art keywords
picture
clipping
fusion
carrier
clipped
Prior art date
Application number
PCT/CN2015/076597
Other languages
English (en)
French (fr)
Inventor
Yulong Wang
Original Assignee
Tencent Technology (Shenzhen) Company Limited
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology (Shenzhen) Company Limited
Priority to CA2931695A (CA2931695C)
Publication of WO2015158255A1

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 2D [Two Dimensional] image generation

Definitions

  • the present disclosure relates to picture processing technologies, and in particular, to a picture fusion method and apparatus.
  • OpenGL (Open Graphics Library) is a professional graphics programming interface and an underlying graphics library that is powerful in function and easy to invoke, and is used for processing two-dimensional or three-dimensional images.
  • the OpenGL defines a standard for a cross-programming-language and cross-platform programming interface, is independent of Windows operating systems or other operating systems, and is also network-transparent. Therefore, software supporting the OpenGL has good portability.
  • OpenGL for Embedded Systems (OpenGL ES) is a subset of the OpenGL three-dimensional graphics API. It is specially designed for embedded devices, such as mobile phones, PDAs, game consoles, and other embedded systems, and creates a flexible and powerful underlying interaction interface between software and graphics acceleration.
  • OpenGL ES 2.0 may greatly improve the 3D graphics rendering speeds of different consumer electronic devices and achieve fully programmable 3D graphics in an embedded system.
  • Picture fusion is a common technique in the OpenGL; that is, fusion calculation is performed on the pixel data of pictures according to different fusion parameters, so as to obtain a fusion picture with a special effect.
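  • As an illustration of such a fusion calculation (this uses the standard OpenGL blending model and is only an example, not a parameter choice specified by this disclosure): with a source factor of GL_SRC_ALPHA and a destination factor of GL_ONE_MINUS_SRC_ALPHA, the fused pixel is C_out = a_src * C_src + (1 - a_src) * C_dst, where C_src and a_src are the color and alpha of the pixel being drawn and C_dst is the pixel already in the frame buffer.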
  • a picture fusion method includes:
  • creating a clipping carrier; adding a first picture to the clipping carrier and setting the first picture to a clipping template on the clipping carrier; adding a second picture to the clipping carrier and setting the second picture to to-be-clipped content on the clipping carrier; setting a fusion parameter of the first picture and the second picture; adding the clipping carrier to a corresponding layer of an application background; and clipping the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, the clipped second picture including parts that are out of an outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture, the picture within the outline surrounding range being a fusion part of the first picture and the second picture; filtering out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculating pixel data of the picture within the outline surrounding range according to the fusion parameter.
  • a picture fusion apparatus includes:
  • a carrier creating module configured to create a clipping carrier
  • a clipping template setting module configured to add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier;
  • a clipped object setting module configured to add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier
  • a fusion parameter setting module configured to set a fusion parameter of the first picture and the second picture
  • a carrier adding module configured to add the clipping carrier to a corresponding layer of an application background
  • a fusion module configured to clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture that are out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
  • A clipping carrier is created, a first picture is added to the clipping carrier and set to a clipping template, and a second picture is added to the clipping carrier so that the second picture becomes a to-be-clipped picture; when the first picture and the second picture are fused, the second picture is clipped along an outline of a graph in the first picture, the parts of the second picture that are out of an outline surrounding range are filtered out, and the reserved part of the second picture is fused with the first picture.
  • Because the first picture and the second picture are fused on the basis of the outline of the graph in the first picture, the obtained fusion picture has no redundant parts outside the outline of the graph in the first picture. The visual color-mixing effect that such redundant parts would otherwise produce by overlapping other pictures in the application background therefore does not occur, so the display effect of the fusion picture is not affected by color mixing.
  • FIG. 1 is a schematic diagram of a picture fusion effect in the existing technology
  • FIG. 2 is a schematic flowchart of a picture fusion method in an embodiment
  • FIG. 3 is a schematic flowchart of a picture fusion method in another embodiment
  • FIG. 4 is a schematic diagram of an effect of a fusion picture obtained by fusing picture a and picture b in FIG. 1 according to a picture fusion method in an embodiment
  • FIG. 5 is a schematic structural diagram of a picture fusion apparatus in an embodiment.
  • FIG. 6 is a schematic structural diagram of a picture fusion apparatus in another embodiment.
  • FIG. 7 is a schematic structural diagram of a picture fusion apparatus in still another embodiment.
  • The terms "first" and "second" used in the present disclosure may be used for describing elements herein, but the elements are not limited by these terms. The terms are used only to differentiate one element from another.
  • a first picture may be referred to as a second picture, and similarly, a second picture may be referred to as a first picture.
  • a first picture and a second picture are pictures but are not a same picture.
  • picture a and picture b are in picture formats having an alpha channel, and include a transparent part and a non-transparent part. Both of the pictures are rectangular.
  • the non-transparent part in picture a is elliptical and has a high lighting effect.
  • the non-transparent part in picture b is in the shape of a Chinese character " ⁇ " .
  • the objective of fusing picture a and picture b is to achieve a high lighting effect of the Chinese character " ⁇ " in picture b.
  • a picture fusion method includes the following steps:
  • Step S201 Create a clipping carrier.
  • The clipping carrier, as a carrier of a clipping template and to-be-clipped content, may be used for setting one picture to a clipping template and another picture to to-be-clipped content.
  • Step S202 Add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier.
  • Step S203 Add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier.
  • the first picture and the second picture are pictures having an alpha channel, for example, a picture in a png format, and both the first picture and the second picture include a transparent part and a non-transparent part.
  • Setting the first picture to a clipping template on the clipping carrier may trigger an operation of setting the non-transparent part of the first picture to a clipping area. Therefore, the picture fusion method further includes a step: setting the non-transparent part of the first picture to a clipping area after the first picture is set to a clipping template on the clipping carrier.
  • the clipping area of the clipping template is used as a clipping mold for the to-be-clipped content and the to-be-clipped content is clipped according to an outline of the clipping mold, so that a part within a range of the clipping mold and parts out of the range of the clipping mold may be obtained.
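  • A minimal sketch of this behaviour, assuming the clipping carrier is a cocos2d-x CCClippingNode as in the later embodiments and assuming the threshold value below (only stencil pixels whose alpha exceeds the threshold form the clipping area, so the transparent part of the first picture is excluded from the clipping mold):
  • clipper->setAlphaThreshold(0.05f); // treat only the non-transparent part of the clipping template as the clipping area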
  • Step S204 Set a fusion parameter of the first picture and the second picture.
  • the fusion parameter is used to limit fusion calculation of pixel data of the first picture and the second picture.
  • Pixel data of the first picture and the second picture is calculated according to different fusion parameters, so that different fusion values may be obtained to serve as pixel data of a fusion picture, thereby obtaining different fusion effects of the fusion picture.
  • The number of fusion parameters depends on the underlying rendering engine. If the underlying rendering engine provides n fusion parameters, there are n*n combinations in total of the fusion parameters of the first picture and the second picture. The respective fusion parameters of the first picture and the second picture may be set according to the required fusion effect.
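  • A minimal sketch of setting one such combination, assuming the underlying rendering engine is cocos2d-x/OpenGL ES as in the later embodiments; secondSprite is a hypothetical name for the sprite holding the second picture, and the additive factor pair is only an illustrative choice among the n*n combinations:
  • ccBlendFunc blend; // fusion parameters of the two pictures
  • blend.src = GL_SRC_ALPHA; // factor applied to the second picture's pixel data
  • blend.dst = GL_ONE; // factor applied to the first picture's pixel data
  • secondSprite->setBlendFunc(blend); // set the fusion parameter on the to-be-clipped sprite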
  • Step S205 Add the clipping carrier to a corresponding layer of an application background.
  • Multiple pictures in an application background belong to different levels and the pictures are overlapped according to corresponding levels.
  • Step S206 Clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, the clipped second picture including parts that are out of an outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture, the picture within the outline surrounding range being a fusion part of the first picture and the second picture; filter out the parts of the clipped second picture that are out of the outline surrounding range of the first picture; and calculate pixel data of the picture within the outline surrounding range according to the fusion parameter.
  • The clipped second picture includes parts that are out of the outline surrounding range of the first picture and a picture within the outline surrounding range of the first picture; the picture within the outline surrounding range is the fusion part of the first picture and the second picture, and may also be referred to as the superposed part or overlapped part of the first picture and the second picture.
  • The filtering out refers to removing the parts of the clipped second picture that are out of the outline surrounding range of the first picture, so as to reserve the part of the second picture within the outline surrounding range of the first picture.
  • the non-transparent part in the first picture is presented in a form of a graph, and the graph in the first picture is the non-transparent part of the first picture.
  • A fusion value of the pixel data (that is, data of overlapped pixels) of the reserved part of the second picture and the first picture at the same position may be calculated according to the fusion parameter, so as to obtain the pixel data of the fusion picture at that position. Specifically, the pixel data that is calculated is that of the part of the second picture lying within the outline surrounding range of the first picture.
  • the foregoing clipping carrier is a CCClippingNode object.
  • a picture fusion method in this embodiment includes the following steps:
  • Step S301 Create a CCClippingNode object.
  • the CCClippingNode object may be used for clipping a UI (user interface) control and is inherited from a node class CCNode.
  • the CCNode is a parent class of scene, layer, menu, sprite, and the like in a cocos2d-x.
  • the cocos2d-x is an open-source mobile 2D game framework supporting an OpenGL ES.
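  • A minimal sketch of step S301, assuming the cocos2d-x 2.x API; clipper is the variable referenced by the code fragments below:
  • CCClippingNode *clipper = CCClippingNode::create(); // create the clipping carrier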
  • Step S302 Set a clipping template of the CCClippingNode object to a first picture.
  • program code for setting a clipping template of the foregoing clipper to a first picture includes:
  • CCSprite *word = CCSprite::createWithSpriteFrameName("firstpicture.png"); // create a CCSprite object word corresponding to firstpicture.png
  • clipper->setStencil(word); // set word, that is, the first picture, to the clipping template of clipper
  • clipper->addChild(word); // add word to serve as a child of clipper
  • firstpicture. png is a first picture
  • CCSprite is a sprite class in the cocos2d-x
  • createWithSpriteFrameName is a function for creating a sprite on the basis of input parameters, and the foregoing code in the first line is used for creating a sprite object word corresponding to the first picture
  • setStencil is a function of a CCClippingNode class for setting a clipping template and the foregoing code in the second line is used for setting word to a clipping template of clipper, that is, setting the first picture to a clipping template of clipper; and further, word further needs to be added to serve as a child of clipper and the third line implements the function.
  • Step S303 Set to-be-clipped content of the CCClippingNode object to a second picture.
  • program code for setting to-be-clipped content of the foregoing clipper to a second picture includes:
  • CCSprite *silderShine = CCSprite::createWithSpriteFrameName("secondpicture.png"); // create a CCSprite object silderShine corresponding to secondpicture.png
  • clipper->addChild(silderShine); // add silderShine to serve as a child of clipper
  • secondpicture. png is a second picture
  • code in the first line is used for creating a sprite object silderShine corresponding to the second picture; code in the second line is used for adding silderShine to serve as a child of clipper, so as to set to-be-clipped content of the CCClippingNode object to a second picture.
  • the picture fusion method in this embodiment may further include a step of setting a part, reserved after clipping, of the to-be-clipped content of the CCClippingNode object to a part surrounded by a clipping outline.
  • An attribute value of isInverted of the CCClippingNode object may be set to true.
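  • A minimal sketch, assuming the cocos2d-x 2.x setter for this attribute:
  • clipper->setInverted(true); // reserve the part of the to-be-clipped content described above after clipping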
  • Step S304 Set a fusion parameter of the first picture and the second picture.
  • Step S305 Add the CCClippingNode object to a corresponding layer of an application background.
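  • A minimal sketch of step S305; backgroundLayer and the z-order value are illustrative assumptions standing for the layer of the application background at the corresponding level:
  • backgroundLayer->addChild(clipper, 2); // add the clipping carrier at the level of the fusion picture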
  • Step S306 Clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture and out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
  • Steps S301 to S303 and S305 in this embodiment correspond respectively to steps S201 to S203 and S205 in the foregoing embodiment, and are specific implementation manners of those steps.
  • A picture fusion instruction is triggered by a screen refresh instruction, or a picture fusion instruction is a screen refresh instruction. Receiving a screen refresh instruction is equivalent to receiving a picture fusion instruction.
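  • For context, in cocos2d-x 2.x the screen refresh interval, and therefore how often the fusion is recalculated, is configured on the director; the 60 frames-per-second value here is an illustrative assumption:
  • CCDirector::sharedDirector()->setAnimationInterval(1.0f / 60); // one screen refresh, and hence one picture fusion, every 1/60 s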
  • the foregoing picture fusion method further includes a step of displaying the fusion picture obtained by means of calculation.
  • the first picture includes a dynamic effect, that is, the first picture is composed of a series of image frames
  • fusion calculation is performed on the second picture and the image frames selected sequentially from the first picture, so as to obtain a series of fusion image frames.
  • the multiple fusion image frames are displayed sequentially, which produces a dynamic effect.
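  • One possible way to drive such a dynamic effect in cocos2d-x 2.x is to run a frame animation on the sprite used as the clipping template; the frame file names and timing below are illustrative assumptions:
  • CCAnimation *animation = CCAnimation::create(); // animation built from the image frames of the first picture
  • animation->addSpriteFrameWithFileName("firstpicture_frame1.png");
  • animation->addSpriteFrameWithFileName("firstpicture_frame2.png");
  • animation->addSpriteFrameWithFileName("firstpicture_frame3.png");
  • animation->setDelayPerUnit(0.1f); // 0.1 s per frame
  • word->runAction(CCRepeatForever::create(CCAnimate::create(animation))); // each refresh then fuses the second picture with the current frame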
  • a processing process of the second picture including a dynamic effect is the same as the processing process of the first picture including a dynamic effect, which is not described herein again.
  • both the first picture and the second picture include a dynamic effect, that is, both the first picture and the second picture are composed of a series of image frames
  • the image frames included in the first picture and the image frames included in the second picture may be selected sequentially, and fusion calculation is performed on two selected image frames, so as to obtain a series of fusion image frames.
  • FIG. 4 is a schematic diagram of a fusion picture obtained by fusing picture a and picture b in FIG. 1 according to a picture fusion method in an embodiment.
  • Picture a is clipped along the outline of the Chinese character " ⁇ " in picture b, and the parts of picture a that are out of the outline surrounding range of the Chinese character " ⁇ " are filtered out, so the visual color-mixing effect that would be produced by redundant parts overlapping other pictures in the application background does not occur.
  • a picture fusion apparatus includes a carrier creating module 10, a clipping template setting module 20, a clipped object setting module 30, a fusion parameter setting module 40, a carrier adding module 50, and a fusion module 60.
  • the carrier creating module 10 is configured to create a clipping carrier.
  • The clipping carrier, as a carrier of a clipping template and to-be-clipped content, may be used for setting one picture to a clipping template and another picture to to-be-clipped content.
  • the clipping template setting module 20 is configured to add a first picture to the clipping carrier and set the first picture to a clipping template on the clipping carrier.
  • the clipping template setting module 20 is further configured to set a non-transparent part of the first picture to a clipping area after the first picture is set to a clipping template on the clipping carrier.
  • the clipping area is used as a clipping mold for the to-be-clipped content and the to-be-clipped content is clipped according to an outline of the clipping mold, so that a part within a range of the clipping mold and parts out of the range of the clipping mold may be obtained.
  • the clipped object setting module 30 is configured to add a second picture to the clipping carrier and set the second picture to to-be-clipped content on the clipping carrier.
  • the fusion parameter setting module 40 is configured to set a fusion parameter of the first picture and the second picture.
  • the fusion parameter is used to limit fusion calculation of pixel data of the first picture and the second picture.
  • the pixel data of the first picture and the second picture is calculated according to different fusion parameters, so that different fusion values may be obtained to serve as pixel data of a fusion picture, thereby obtaining different fusion effects of the fusion picture.
  • The number of fusion parameters depends on the underlying rendering engine. If the underlying rendering engine provides n fusion parameters, there are n*n combinations in total of the fusion parameters of the first picture and the second picture. The respective fusion parameters of the first picture and the second picture may be set according to the required fusion effect.
  • the carrier adding module 50 is configured to add the clipping carrier to a corresponding layer of an application background.
  • Multiple pictures in an application background belong to different levels and the pictures are overlapped according to corresponding levels.
  • The fusion module 60 is configured to clip the second picture along an outline of a graph in the first picture when a picture fusion instruction is received, filter out parts of the second picture that are out of an outline surrounding range, and calculate pixel data of a fusion picture of a reserved part of the second picture and the first picture according to the fusion parameter.
  • the non-transparent part in the first picture is presented in a form of a graph, and the graph in the first picture is the non-transparent part of the first picture.
  • the fusion module 60 may calculate a fusion value of pixel data (that is, data of overlapped pixels) of the reserved part of the second picture and the first picture at a same position according to the fusion parameter, so as to obtain pixel data of the fusion picture at the same position.
  • the foregoing clipping carrier is a CCClippingNode object.
  • the carrier creating module 10 is configured to create a CCClippingNode object.
  • the CCClippingNode object may be used for clipping a UI (user interface) control and is inherited from a node class CCNode.
  • the CCNode is a parent class of scene, layer, menu, sprite, and the like in a cocos2d-x.
  • the cocos2d-x is an open-source mobile 2D game framework supporting an OpenGL ES.
  • the clipping template setting module 20 is configured to set a clipping template of the CCClippingNode object to a first picture.
  • program code for the clipping template setting module 20 to set a clipping template of the foregoing clipper to a first picture includes:
  • CCSprite *word = CCSprite::createWithSpriteFrameName("firstpicture.png"); // create a CCSprite object word corresponding to firstpicture.png
  • clipper->setStencil(word); // set word, that is, the first picture, to the clipping template of clipper
  • clipper->addChild(word); // add word to serve as a child of clipper
  • firstpicture. png is a first picture
  • CCSprite is a sprite class in the cocos2d-x
  • createWithSpriteFrameName is a function for creating a sprite on the basis of input parameters, and the foregoing code in the first line is used for creating a sprite object word corresponding to the first picture
  • setStencil is a function of a CCClippingNode class for setting a clipping template and the foregoing code in the second line is used for setting word to a clipping template of clipper, that is, setting the first picture to a clipping template of clipper; and further, word further needs to be added to serve as a child of clipper and the third line implements the function.
  • the clipped object setting module 30 is configured to set to-be-clipped content of the CCClippingNode object to a second picture.
  • program code for the clipped object setting module 30 to set to-be-clipped content of the foregoing clipper to a second picture includes:
  • CCSprite *silderShine = CCSprite::createWithSpriteFrameName("secondpicture.png"); // create a CCSprite object silderShine corresponding to secondpicture.png
  • clipper->addChild(silderShine); // add silderShine to serve as a child of clipper
  • secondpicture. png is a second picture
  • code in the first line is used for creating a sprite object silderShine corresponding to the second picture; code in the second line is used for adding silderShine to serve as a child of clipper, so as to set to-be-clipped content of the CCClippingNode object to a second picture.
  • the picture fusion apparatus in this embodiment may further include a reserved part setting module 70 that is configured to set a part, reserved after clipping, of the to-be-clipped content of the CCClippingNode object to a part surrounded by a clipping outline.
  • The reserved part setting module 70 may set the attribute value of isInverted of the CCClippingNode object to true, so as to set the part, reserved after clipping, of the to-be-clipped content of the CCClippingNode object to the part surrounded by the clipping outline.
  • the carrier adding module 50 is configured to add the CCClippingNode object to a corresponding layer of an application background.
  • a picture fusion instruction is triggered by a screen refresh instruction, or a picture fusion instruction is a screen refresh instruction.
  • Receiving a screen refresh instruction is equivalent to receiving a picture fusion instruction.
  • The fusion module 60 fuses the first picture and the second picture once each time a picture fusion instruction is received; that is, a fusion value of the pixel data of the reserved part of the second picture and the pixel data of the first picture is calculated once according to the fusion parameter, so as to obtain the pixel data of a fusion picture.
  • the foregoing picture fusion apparatus may further include a display module 80 that is configured to display the fusion picture obtained by means of calculation.
  • When the first picture includes a dynamic effect, that is, the first picture is composed of a series of image frames, the fusion module 60 may perform fusion calculation on the second picture and the image frames that are sequentially selected from the first picture, so as to obtain a series of fusion image frames.
  • a processing process of the second picture including a dynamic effect is the same as the processing process of the first picture including a dynamic effect, which is not described herein again.
  • both the first picture and the second picture include a dynamic effect, that is, both the first picture and the second picture are composed of a series of image frames
  • the fusion module 60 may sequentially select the image frames included in the first picture and the image frames included in the second picture, and perform fusion calculation on two selected image frames, so as to obtain a series of fusion image frames.
  • the display module 80 may sequentially display the multiple fusion image frames, which produces a dynamic effect.
  • A clipping carrier is created, a first picture is added to the clipping carrier and set to a clipping template, and a second picture is added to the clipping carrier so that the second picture becomes a to-be-clipped picture; when the first picture and the second picture are fused, the second picture is clipped along an outline of a graph in the first picture, the parts of the second picture that are out of an outline surrounding range are filtered out, and the reserved part of the second picture is fused with the first picture.
  • Because the first picture and the second picture are fused on the basis of the outline of the graph in the first picture, the obtained fusion picture has no redundant parts outside the outline of the graph in the first picture. The visual color-mixing effect that such redundant parts would otherwise produce by overlapping other pictures in the application background therefore does not occur, so the display effect of the fusion picture is not affected by color mixing.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Circuits (AREA)
  • Image Processing (AREA)
  • Processing Or Creating Images (AREA)
  • Controls And Circuits For Display Device (AREA)
PCT/CN2015/076597 2014-04-18 2015-04-15 Picture fusion method and apparatus WO2015158255A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CA2931695A CA2931695C (en) 2014-04-18 2015-04-15 Picture fusion method and apparatus

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201410159269.4 2014-04-18
CN201410159269.4A CN105023259B (zh) Picture fusion method, apparatus, terminal and computer-readable storage medium

Publications (1)

Publication Number Publication Date
WO2015158255A1 true WO2015158255A1 (en) 2015-10-22

Family

ID=54323485

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/076597 WO2015158255A1 (en) 2014-04-18 2015-04-15 Picture fusion method and apparatus

Country Status (4)

Country Link
CN (1) CN105023259B (zh)
CA (1) CA2931695C (zh)
MY (1) MY174549A (zh)
WO (1) WO2015158255A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101093580A (zh) * 2007-08-29 2007-12-26 华中科技大学 Image fusion method based on non-subsampled contourlet transform
CN101551904A (zh) * 2009-05-19 2009-10-07 清华大学 Image synthesis method and apparatus based on mixed gradient fields and mixed boundary conditions
WO2012011028A1 (en) * 2010-07-22 2012-01-26 Koninklijke Philips Electronics N.V. Fusion of multiple images
CN102496180A (zh) * 2011-12-15 2012-06-13 李大锦 Method for automatically generating ink-wash landscape painting images
US20120169722A1 (en) * 2011-01-03 2012-07-05 Samsung Electronics Co., Ltd. Method and apparatus generating multi-view images for three-dimensional display

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102737394A (zh) * 2012-06-20 2012-10-17 北京市网讯财通科技有限公司 Method for drawing irregular skins of Windows system software
CN103632355A (zh) * 2012-08-29 2014-03-12 郭昊 Automatic image synthesis processing method and apparatus
CN103139439B (zh) * 2013-01-24 2015-12-23 厦门美图网科技有限公司 Picture synthesis method based on tile templates with addable decoration materials

Also Published As

Publication number Publication date
CA2931695C (en) 2018-04-24
MY174549A (en) 2020-04-24
CN105023259B (zh) 2019-06-25
CA2931695A1 (en) 2015-10-22
CN105023259A (zh) 2015-11-04

Similar Documents

Publication Publication Date Title
US11012740B2 (en) Method, device, and storage medium for displaying a dynamic special effect
TWI698841B (zh) 地圖區域合併的資料處理方法及裝置
KR20220030263A (ko) 텍스처 메시 빌딩
US9558542B2 (en) Method and device for image processing
US10554803B2 (en) Method and apparatus for generating unlocking interface, and electronic device
CN105447898A (zh) 一种虚拟现实设备中显示2d应用界面的方法和装置
CN104038807A (zh) 一种基于OpenGL的图层混合方法及装置
CN106658139B (zh) 一种焦点控制方法及装置
CN106657757B (zh) 一种相机应用的图像预览方法、装置及相机应用系统
CN106569700B (zh) 一种截图方法以及截图装置
CN105528207A (zh) 一种虚拟现实系统及其中显示安卓应用图像的方法和装置
WO2017032233A1 (zh) 一种图像生成方法及装置
CN106648508B (zh) 一种图像绘制方法和装置
CN111295693B (zh) 影像处理方法及装置
CN107027068A (zh) 渲染方法、解码方法、播放多媒体数据流的方法及装置
US20140333669A1 (en) System, method, and computer program product for implementing smooth user interface animation using motion blur
CN104731855A (zh) 一种显示微信朋友圈图片资源的方法及装置
CN110570501B (zh) 一种线条动画绘制方法及其设备、存储介质、电子设备
CN112541960A (zh) 三维场景的渲染方法、装置及电子设备
US20140325404A1 (en) Generating Screen Data
CN108093245B (zh) 一种多屏融合方法、系统、装置和计算机可读存储介质
CN105719335A (zh) 一种地图图像渲染方法、装置以及车载终端
CN106648623B (zh) 一种安卓系统中字符的显示方法及装置
CN109766530B (zh) 图表边框的生成方法、装置、存储介质和电子设备
CA2931695C (en) Picture fusion method and apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15780123

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2931695

Country of ref document: CA

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the adressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 27.03.2017)

122 Ep: pct application non-entry in european phase

Ref document number: 15780123

Country of ref document: EP

Kind code of ref document: A1