WO2021155688A1 - 图片处理方法、装置、存储介质及电子设备 - Google Patents
- Publication number
- WO2021155688A1 (PCT/CN2020/126344)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- picture
- data
- component
- target mask
- light
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/90—Determination of colour characteristics
- G06T15/00—3D [Three Dimensional] image rendering
- G06T15/005—General purpose rendering architectures
- G06T15/50—Lighting effects
- G06T15/503—Blending, e.g. for anti-aliasing
- G06T15/506—Illumination models
- G06T15/80—Shading
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10141—Special mode during image acquisition
- G06T2207/10152—Varying illumination
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
Definitions
- This application relates to the field of image processing, and in particular to an image processing method, device, storage medium, and electronic equipment.
- In the related art, the usual way is to use a 3D model to represent the objects in the picture.
- the embodiments of the present application provide an image processing method, device, storage medium, and electronic device, which can avoid the stalling of a mobile terminal device caused by the increased power consumption of image rendering on that device.
- An embodiment of the present application provides an image processing method, including: obtaining a first picture to be rendered; and obtaining a target mask picture corresponding to the first picture, where
- the g component of the target mask picture stores first light change data
- the b component of the target mask picture stores second light change data
- the first light change data is the change data of the light on the left side of the first picture used when the first picture is rendered, and the second light change data is the change data of the light on the right side of the first picture used when the first picture is rendered;
- the first picture is then rendered using the target mask picture to obtain a second picture.
- An embodiment of the application provides an image processing device, including:
- the first obtaining unit is configured to obtain the first picture to be rendered
- the second acquiring unit is configured to acquire a target mask picture corresponding to the first picture
- the g component of the target mask picture stores first light change data
- the b component of the target mask picture stores second light change data
- the first light change data is the change data of the light on the left side of the first picture used when the first picture is rendered, and the second light change data is the change data of the light on the right side of the first picture used when the first picture is rendered;
- the rendering unit is configured to render the first picture using the target mask picture to obtain a second picture.
- the embodiment of the present application provides a computer-readable storage medium in which a computer program is stored; the above-mentioned image processing method is executed when the computer program runs.
- the embodiment of the present application provides an electronic device including a memory, a processor, and a computer program stored in the memory and capable of running on the processor.
- the processor is configured to execute the above-mentioned picture processing method when the computer program is run.
- The purpose of rendering the picture is achieved through the light change data stored in the target mask picture, thereby avoiding both the use of 3D models and the need to realize light projection by laying out real-time light sources or baked scenes in the scene, and thus avoiding the stalling of a mobile terminal device caused by the increased power consumption of image rendering.
- FIG. 1 is a schematic diagram of an application environment of a picture processing method provided by an embodiment of the present application
- Fig. 2 is a flowchart of a picture processing method provided by an embodiment of the present application.
- FIG. 3 is a flowchart of using a custom shader for the first picture to complete rendering according to an embodiment of the present application
- FIG. 4 is an engineering flowchart of GPU one-time rendering provided by an embodiment of the present application.
- FIG. 5 is a comparison diagram of rendering results of the first image provided by an embodiment of the present application.
- FIG. 6 is a first picture provided by an embodiment of the present application and a mask diagram corresponding to the first picture;
- FIG. 7 is a schematic structural diagram of a picture processing apparatus provided by an embodiment of the present application.
- FIG. 8 is a schematic structural diagram of an electronic device of a picture processing method provided by an embodiment of the present application.
- the embodiment of the present application provides a picture processing method, which can be but not limited to be applied to a picture processing system in a hardware environment as shown in FIG. 1, wherein the picture processing system may include but is not limited to a terminal device 102.
- the terminal device 102 is configured to display the first picture to be rendered and the second picture obtained by rendering the first picture.
- the terminal device 102 may include, but is not limited to: a display 108, a processor 106, and a memory 104.
- the display 108 is configured to obtain a human-computer interaction instruction through a human-computer interaction interface, and is also configured to present a first picture to be rendered;
- the processor 106 is configured to respond to the human-computer interaction instruction and use the target mask picture corresponding to the first picture to compare the first picture The picture is rendered.
- the memory 104 is configured to store the first picture to be rendered and the attribute information of the target mask picture corresponding to the first picture.
- the server 112 may include, but is not limited to: a database 114 and a processing engine 116.
- the processing engine 116 is configured to call the target mask picture corresponding to the first picture stored in the database 114 and use it to render the first picture to obtain the second picture. Rendering is thus achieved through the light change data stored in the target mask picture, which avoids the use of 3D models and the need to realize light projection by laying out real-time light sources or baked scenes in the scene, and thereby avoids the stalling of a mobile terminal device caused by the increased power consumption of image rendering.
- the display 108 in the terminal device 102 is configured to display the first picture to be rendered (as shown in FIG. 1, in the game screen of a shooting game the target virtual character is sniping a target object in the distance).
- S102 The terminal device 102 obtains the first picture to be rendered and sends the first picture to the server 112 via the network 110;
- S104 The server 112 obtains the target mask picture corresponding to the first picture, where:
- the g component of the target mask picture stores the first light change data
- the b component of the target mask picture stores the second light change data;
- the first light change data is the light change data on the left side of the first picture used when rendering the first picture, and the second light change data is the light change data on the right side of the first picture used when rendering the first picture;
- S106 The server 112 renders the first picture using the target mask picture to obtain the second picture;
- S108 The server 112 returns the second picture obtained above to the terminal device 102 via the network 110.
- The terminal device 102 obtains the first picture to be rendered and the target mask picture corresponding to the first picture, wherein the g component of the target mask picture stores the first light change data and the b component stores the second light change data; the first light change data is the light change data on the left side of the first picture used when rendering the first picture, and the second light change data is the light change data on the right side. The target mask picture is then used to render the first picture into the second picture. Rendering is thus achieved through the light change data stored in the target mask picture, which avoids the use of 3D models and the need to realize light projection by laying out real-time light sources or baked scenes in the scene, and thereby avoids the stalling of a mobile terminal device caused by the increased power consumption of image rendering.
- the above-mentioned image processing method can be, but not limited to, applied to the server 112, and is also used to assist the application client to render the first image using the mask image corresponding to the first image.
- the above-mentioned application client can be but not limited to running in the terminal device 102
- the terminal device 102 can be, but not limited to, a mobile phone, a tablet computer, a notebook computer, a PC, and other terminal devices that support the running of the application client.
- the foregoing server 112 and the terminal device 102 may, but are not limited to, implement data interaction through a network, and the foregoing network may include, but is not limited to, a wireless network or a wired network.
- the wireless network includes: Bluetooth, WIFI and other networks that realize wireless communication.
- the aforementioned wired network may include, but is not limited to: wide area network, metropolitan area network, and local area network.
- the aforementioned server 112 may include, but is not limited to, any hardware device that can perform computing, such as an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, and big data and artificial intelligence platforms.
- the foregoing image processing method includes:
- Step 202 Obtain a first picture to be rendered.
- Step 204 Obtain a target mask picture corresponding to the first picture, where the g component of the target mask picture stores the first light change data and the b component stores the second light change data; the first light change data is the light change data on the left side of the first picture used when the first picture is rendered, and the second light change data is the light change data on the right side of the first picture used when the first picture is rendered.
- Step 206 Render the first picture using the target mask picture to obtain the second picture.
- the first picture to be rendered may include, but is not limited to: a picture containing a character object, and a picture containing a static object.
- For example, a character is represented by a 2D real-life picture, and the position of the screen light is fixed; the artist outputs a light mask picture (that is, the target mask picture) according to the light direction.
- The mask picture is an opaque picture whose r, g, and b components carry different data. Because the color changes on the two sides of the character must be controlled independently, the g component of the mask picture stores the light change data for the left side and the b component stores the light change data for the right side.
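The component layout described above can be sketched as follows. This is a minimal illustration of the assumed data layout (the helper name build_mask_pixel is hypothetical, not part of the patent or any tool): the r component carries the original picture's alpha, while g and b carry the left-side and right-side light change data.

```python
def build_mask_pixel(alpha, left_light, right_light):
    """Return one (r, g, b) mask pixel.

    r: alpha of the original picture, 0..255
    g: light change data for the left side of the character
    b: light change data for the right side of the character
    """
    return (alpha, left_light, right_light)

# Two sample mask pixels: one opaque and lit from the left,
# one half-transparent and lit from the right.
mask = [build_mask_pixel(255, 200, 0),
        build_mask_pixel(128, 0, 180)]
```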
- Step 2 Load the shader to the graphics processing unit (GPU);
- Step 3 Call the Update function
- Step 4 Determine whether the lighting parameters have changed; if yes, go to step 5, otherwise go to step 6;
- Step 5 Modify the light input parameters of the fragment shader
- Step 6 Submit UI vertex data (including mask texture, first image texture) to the GPU, and set the rendering state (shader, start mixing)
- Step 7 Submit rendering commands to the GPU
- Step 8 Determine whether the rendering is finished; if yes, go to step 9, otherwise go to step 3;
- The CPU calls the Update function once every frame. The Update function judges whether the lighting parameters of the screen have changed; if so, it modifies the left and right light color input parameters of the shader. All rendering data, including the custom shader, is submitted to the GPU every frame, and the GPU is responsible for rendering the data submitted by the CPU.
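The per-frame CPU loop described above can be sketched as follows. This is an illustrative simulation, not engine code: the class name LightState, the update method, and the shader_params dictionary are all assumptions standing in for the engine's Update callback and shader uniforms.

```python
class LightState:
    """Tracks the screen's left/right light colors across frames."""

    def __init__(self):
        self._last = None  # light colors seen on the previous frame

    def update(self, left_color, right_color, shader_params):
        """Called once per frame: push new light colors to the shader
        parameters only when they actually changed (steps 4 and 5)."""
        current = (left_color, right_color)
        if current != self._last:
            shader_params["left_light"] = left_color
            shader_params["right_light"] = right_color
            self._last = current
            return True   # parameters were modified this frame
        return False      # nothing changed; skip the shader update
```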
- The engineering flow chart of one GPU rendering pass is shown in FIG. 4.
- Step 1 GPU one-time rendering processing starts
- Step 2 GPU rendering pipeline
- Step 3 Fragment shader processing: use the first picture and the mask map corresponding to the first picture, mix the light color, and return the fragment color value;
- Step 4 GPU rendering pipeline
- Step 5 Modify the display cache data;
- Step 6 the end.
- When the GPU renders the player picture, it calls the custom shader code. In the fragment function of the shader, the player picture and the mask picture are sampled. The left and right light colors have already been passed to shader variables as parameters, and the rgba value of the fragment is obtained by mixing and superimposing the light colors with the sampled picture data.
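The per-fragment mixing can be sketched as follows. This is a hedged illustration, assuming a simple additive superposition weighted by the mask's g and b components; the patent does not fix the exact blend formula, and shade_fragment is a hypothetical name.

```python
def shade_fragment(pic_rgb, mask_rgb, left_light, right_light):
    """Mix the sampled picture color with the left/right light colors.

    pic_rgb     : sampled first-picture color, components in 0..1
    mask_rgb    : sampled mask color (r = alpha, g = left weight, b = right weight)
    left_light, right_light : light colors, components in 0..1
    Returns an (r, g, b, a) fragment value.
    """
    r_mask, g_mask, b_mask = mask_rgb
    # Superimpose each light color onto the picture color, weighted by
    # the mask's g (left) and b (right) components; clamp to 1.0.
    out = tuple(
        min(1.0, c + g_mask * l + b_mask * rl)
        for c, l, rl in zip(pic_rgb, left_light, right_light)
    )
    return out + (r_mask,)  # alpha restored from the mask's r component

# A dark gray fragment, fully opaque, half-weighted left light in red.
frag = shade_fragment((0.2, 0.2, 0.2), (1.0, 0.5, 0.0),
                      (1.0, 0.0, 0.0), (0.0, 0.0, 1.0))
```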
- In summary, the first picture to be rendered is acquired together with its corresponding target mask picture, wherein the g component of the target mask picture stores the first light change data and the b component stores the second light change data; the first light change data is the light change data on the left side of the first picture used when rendering it, and the second light change data is the light change data on the right side. The target mask picture is then used to render the first picture into the second picture. Rendering is thus achieved through the light change data stored in the target mask picture, which avoids the use of 3D models and the need to realize light projection by laying out real-time light sources or baked scenes in the scene, and thereby avoids the stalling of a mobile terminal device caused by the increased power consumption of image rendering.
- Rendering the first picture using the target mask picture to obtain the second picture includes: when the r component of the target mask picture stores the alpha value of the third picture, rendering the first picture using the r component, g component, and b component of the target mask picture to obtain the second picture.
- Here the first picture is a picture obtained by compressing the third picture, the third picture is a picture with transparency, and the alpha value indicates the transparency of the third picture.
- The picture compression algorithm in this application uses ETC compression. Because ETC compression does not support pictures with transparency, the alpha value of the picture needs to be saved to another picture: the alpha value of the original character picture is saved to the r component of the mask picture.
- FIG. 6 shows the first picture and the mask picture corresponding to it: the left picture is the first picture and the right picture is the mask picture.
- The mask picture generally presents color 1, which corresponds to the r component representing the alpha value; the data on the left is the r+g components, showing color 2, and the data on the right is the r+b components, showing color 3.
- That is, the alpha value of the third picture is stored in the r component of the target mask picture.
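The alpha-separation step can be sketched as follows. ETC compression, as used here, carries no alpha channel, so before the character picture is compressed its alpha plane is copied into the mask's r component. split_alpha is an illustrative helper, not a real compression API.

```python
def split_alpha(rgba_pixels):
    """Split RGBA pixels into an RGB picture (destined for ETC
    compression) and per-pixel alpha values destined for the
    r component of the mask picture."""
    rgb = [(r, g, b) for (r, g, b, a) in rgba_pixels]
    mask_r = [a for (_, _, _, a) in rgba_pixels]
    return rgb, mask_r

# One opaque pixel and one fully transparent pixel.
rgb, mask_r = split_alpha([(10, 20, 30, 255), (40, 50, 60, 0)])
```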
- Rendering the first picture using the target mask picture to obtain the second picture includes: collecting the original data in the first picture, obtaining the first light change amount data stored in the g component of the target mask picture and the second light change amount data stored in the b component, and superimposing the first and second light change amount data with the original data in the first picture to obtain the second picture.
- Optionally, the method further includes: overwriting the data to be rendered in the first picture with the data after the first light change and the data after the second light change to obtain the second picture.
- rendering the first picture using the target mask picture to obtain the second picture includes:
- the original data of the first picture is superimposed, by calling a target function, with the g component and b component of the target mask picture corresponding to the first picture to obtain the second picture.
- When a judgment function detects that the data corresponding to the g component and b component of the target mask picture of the third picture has changed, the third picture is determined to be the first picture to be rendered.
- the embodiment of the present application also provides a picture processing device for implementing the above picture processing method. As shown in FIG. 7, the device includes: a first acquiring unit 71, a second acquiring unit 73, and a rendering unit 75.
- the first obtaining unit 71 is configured to obtain the first picture to be rendered.
- the second acquiring unit 73 is configured to acquire a target mask picture corresponding to the first picture, wherein the g component of the target mask picture stores the first light change data and the b component stores the second light change data; the first light change data is the light change data on the left side of the first picture used when rendering the first picture, and the second light change data is the light change data on the right side of the first picture used when rendering the first picture.
- the rendering unit 75 is configured to render the first picture using the target mask picture to obtain the second picture.
- the rendering unit 75 includes:
- the rendering module is configured to, when the r component of the target mask picture stores the alpha value of the third picture, render the first picture using the r component, g component, and b component of the target mask picture to obtain the second picture;
- the first picture is a picture obtained by compressing the third picture
- the third picture is a picture with transparency
- the alpha value is used to indicate the transparency of the third picture.
- Through the above modules, the first obtaining unit 71 obtains the first picture to be rendered; the second obtaining unit 73 obtains the target mask picture corresponding to the first picture, wherein the g component of the target mask picture stores the first light change data and the b component stores the second light change data; the first light change data is the light change data on the left side of the first picture used when rendering it, and the second light change data is the light change data on the right side; and the rendering unit 75 uses the target mask picture to render the first picture to obtain the second picture.
- the above-mentioned device further includes:
- the storage module is configured to, before the target mask picture is used to render the first picture, store the alpha value of the third picture in the r component of the target mask picture when the first picture is a picture obtained by compressing the third picture and the third picture is a picture with transparency.
- the aforementioned rendering unit 75 includes:
- the collection module is configured to collect the original data in the first picture
- An acquiring module configured to acquire the first light variation data stored in the g component of the target mask picture, and the second light variation data stored in the b component of the target mask picture;
- the superimposing module is configured to superimpose the first light change amount data and the second light change amount data with the original data in the first picture to obtain a second picture.
- the above-mentioned device further includes:
- the third obtaining unit is configured to, after the raw data to be rendered in the first picture is obtained, obtain the data after the first light change stored in the g component of the target mask picture and the data after the second light change stored in the b component of the target mask picture;
- the processing unit is configured to overwrite the data to be rendered in the first picture with the data after the first light change and the data after the second light change to obtain the second picture.
- the above device further includes:
- the superimposing module is configured to superimpose the original data of the first picture and the g component and b component in the target mask picture corresponding to the first picture by calling the target function to obtain the second picture.
- the embodiment of the present application also provides an electronic device for implementing the foregoing image processing method.
- the electronic device includes a memory 802 and a processor 804.
- the memory 802 stores a computer program, and the processor 804 is configured to execute the steps in any one of the above-mentioned method embodiments through the computer program.
- the above-mentioned electronic device may be located in at least one network device among a plurality of network devices in a computer network.
- the above-mentioned processor may be configured to execute the following steps through a computer program:
- S1 Obtain the first picture to be rendered;
- S2 Obtain the target mask picture corresponding to the first picture;
- S3 Render the first picture using the target mask picture to obtain the second picture.
- FIG. 8 is only for illustration; the electronic device may also be a terminal device such as a smart phone (for example, an Android phone or an iOS phone), a tablet computer, a palmtop computer, a mobile Internet device (MID), or a PAD, or it may be any of various types of servers.
- FIG. 8 does not limit the structure of the above-mentioned electronic device.
- the electronic device may also include more or fewer components (such as a network interface, etc.) than shown in FIG. 8, or have a configuration different from that shown in FIG. 8.
- the memory 802 may be configured to store software programs and modules, such as program instructions/modules corresponding to the image processing method and apparatus in the embodiments of the present application.
- the processor 804 executes the software programs and modules stored in the memory 802 to run various functional applications and perform data processing, that is, to implement the above-mentioned image processing method.
- the memory 802 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic storage devices, flash memory, or other non-volatile solid-state memory.
- the memory 802 may further include a memory remotely provided with respect to the processor 804, and these remote memories may be connected to the terminal through a network.
- the memory 802 may be, but is not limited to, configured to store information such as the first picture to be rendered, the target mask picture corresponding to the first picture, and the second picture.
- the foregoing memory 802 may, but is not limited to, include the first obtaining unit 71, the second obtaining unit 73, and the rendering unit 75 in the foregoing image processing apparatus.
- it may also include, but is not limited to, other module units in the above-mentioned picture processing apparatus, which will not be repeated in this example.
- the aforementioned transmission device 806 is configured to receive or send data via a network.
- the aforementioned network examples may include wired networks and wireless networks.
- the transmission device 806 includes a network adapter (Network Interface Controller, NIC), which can be connected to other network devices and routers via a network cable so as to communicate with the Internet or a local area network.
- In another example, the transmission device 806 is a radio frequency (RF) module configured to communicate with the Internet wirelessly.
- the aforementioned electronic device further includes: a display 808 configured to display the aforementioned first picture to be rendered and a second picture; and a connection bus 810 configured to connect each module component in the aforementioned electronic device.
- the embodiment of the present application provides a computer-readable storage medium in which a computer program is stored, and the computer program is configured to execute the image processing method provided in the embodiment of the present application when the computer program is run.
- the aforementioned computer-readable storage medium may be configured to store a computer program configured to perform the following steps:
- S1 Obtain the first picture to be rendered;
- S2 Obtain the target mask picture corresponding to the first picture;
- S3 Render the first picture using the target mask picture to obtain the second picture.
- the storage medium may include: flash disk, read-only memory (ROM, Read-Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disk, etc.
- If the integrated unit in the foregoing embodiments is implemented in the form of a software functional unit and sold or used as an independent product, it may be stored in the foregoing computer-readable storage medium.
- Based on this understanding, the technical solution of the present application, in essence, or the part contributing to the existing technology, or all or part of the technical solution, can be embodied in the form of a software product, and the computer software product is stored in a storage medium.
- a number of instructions are included to enable one or more computer devices (which may be personal computers, servers, or network devices, etc.) to execute all or part of the steps of the methods described in the various embodiments of the present application.
- the disclosed client can be implemented in other ways.
- The device embodiments described above are only illustrative; the division of the units is only a logical function division, and there may be other division methods in actual implementation. For example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
- The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
- the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
Abstract
Description
Claims (15)
- A picture processing method, the method being executed by an electronic device and comprising: obtaining a first picture to be rendered; obtaining a target mask picture corresponding to the first picture, wherein the g component of the target mask picture stores first light change data and the b component of the target mask picture stores second light change data, the first light change data being the change data of the light on the left side of the first picture used when rendering the first picture and the second light change data being the change data of the light on the right side of the first picture used when rendering the first picture; and rendering the first picture using the target mask picture to obtain a second picture.
- The method according to claim 1, wherein rendering the first picture using the target mask picture to obtain the second picture comprises: in a case where the r component of the target mask picture stores an alpha value of a third picture, rendering the first picture using the r component, g component, and b component of the target mask picture to obtain the second picture, wherein the first picture is a picture obtained by compressing the third picture, the third picture is a picture with transparency, and the alpha value represents the transparency of the third picture.
- The method according to claim 2, wherein before rendering the first picture using the target mask picture, the method further comprises: in a case where the first picture is a picture obtained by compressing the third picture and the third picture is a picture with transparency, storing the alpha value of the third picture in the r component of the target mask picture.
- The method according to any one of claims 1 to 3, wherein rendering the first picture using the target mask picture to obtain the second picture comprises: collecting original data in the first picture; obtaining first light change amount data stored in the g component of the target mask picture and second light change amount data stored in the b component of the target mask picture; and superimposing the first light change amount data and the second light change amount data with the original data in the first picture to obtain the second picture.
- The method according to claim 4, wherein after obtaining the original data to be rendered in the first picture, the method further comprises: obtaining data after a first light change stored in the g component of the target mask picture and data after a second light change stored in the b component of the target mask picture; and overwriting the data to be rendered in the first picture with the data after the first light change and the data after the second light change to obtain the second picture.
- The method according to claim 1, wherein rendering the first picture using the target mask picture to obtain the second picture comprises: superimposing, by calling a target function, the original data of the first picture with the g component and b component of the target mask picture corresponding to the first picture to obtain the second picture.
- The method according to claim 1, wherein before obtaining the first picture to be rendered, the method further comprises: in a case where a judgment function detects that the data corresponding to the g component and b component of a target mask picture corresponding to a third picture has changed, determining the third picture to be the first picture to be rendered.
- A picture processing apparatus, comprising: a first acquiring unit configured to acquire a first picture to be rendered; a second acquiring unit configured to acquire a target mask picture corresponding to the first picture, wherein the g component of the target mask picture stores first light change data and the b component of the target mask picture stores second light change data, the first light change data being the change data of the light on the left side of the first picture used when rendering the first picture and the second light change data being the change data of the light on the right side of the first picture used when rendering the first picture; and a rendering unit configured to render the first picture using the target mask picture to obtain a second picture.
- The apparatus according to claim 8, wherein the rendering unit comprises: a rendering module configured to, in a case where the r component of the target mask picture stores an alpha value of a third picture, render the first picture using the r component, g component, and b component of the target mask picture to obtain the second picture, wherein the first picture is a picture obtained by compressing the third picture, the third picture is a picture with transparency, and the alpha value represents the transparency of the third picture.
- The apparatus according to claim 9, wherein the apparatus further comprises: a storage module configured to, before the target mask picture is used to render the first picture, store the alpha value of the third picture in the r component of the target mask picture in a case where the first picture is a picture obtained by compressing the third picture and the third picture is a picture with transparency.
- The apparatus according to any one of claims 9 to 10, wherein the rendering unit comprises: a collection module configured to collect original data in the first picture; an acquiring module configured to acquire first light change amount data stored in the g component of the target mask picture and second light change amount data stored in the b component of the target mask picture; and a superimposing module configured to superimpose the first light change amount data and the second light change amount data with the original data in the first picture to obtain the second picture.
- The apparatus according to claim 11, wherein the apparatus comprises: a third acquiring unit configured to, after the original data to be rendered in the first picture is acquired, acquire data after a first light change stored in the g component of the target mask picture and data after a second light change stored in the b component of the target mask picture; and a processing unit configured to overwrite the data to be rendered in the first picture with the data after the first light change and the data after the second light change to obtain the second picture.
- The apparatus according to claim 8, wherein the rendering unit comprises: a superimposing module configured to superimpose, by calling a target function, the original data of the first picture with the g component and b component of the target mask picture corresponding to the first picture to obtain the second picture.
- A computer-readable storage medium, comprising a stored program which, when run, executes the picture processing method according to any one of claims 1 to 7.
- An electronic device, comprising a memory and a processor, wherein a computer program is stored in the memory and the processor is configured to execute the picture processing method according to any one of claims 1 to 7 when running the computer program.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP20917554.6A EP4002289A4 (en) | 2020-02-07 | 2020-11-04 | METHOD AND DEVICE FOR IMAGE PROCESSING, INFORMATION CARRYING AND ELECTRONIC APPARATUS |
JP2022523027A JP7301453B2 (ja) | 2020-02-07 | 2020-11-04 | 画像処理方法、画像処理装置、コンピュータプログラム、及び電子機器 |
KR1020227008760A KR102617789B1 (ko) | 2020-02-07 | 2020-11-04 | 픽처 처리 방법 및 디바이스, 저장 매체 그리고 전자 장치 |
US17/588,536 US11983900B2 (en) | 2020-02-07 | 2022-01-31 | Image processing method and apparatus, storage medium, and electronic device |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010082717.0 | 2020-02-07 | ||
CN202010082717.0A CN111260768B (zh) | 2020-02-07 | 2020-02-07 | Picture processing method and apparatus, storage medium and electronic apparatus |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/588,536 Continuation US11983900B2 (en) | 2020-02-07 | 2022-01-31 | Image processing method and apparatus, storage medium, and electronic device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021155688A1 true WO2021155688A1 (zh) | 2021-08-12 |
Family
ID=70954883
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/126344 WO2021155688A1 (zh) | 2020-02-07 | 2020-11-04 | Picture processing method, apparatus, storage medium and electronic device |
Country Status (5)
Country | Link |
---|---|
EP (1) | EP4002289A4 (zh) |
JP (1) | JP7301453B2 (zh) |
KR (1) | KR102617789B1 (zh) |
CN (1) | CN111260768B (zh) |
WO (1) | WO2021155688A1 (zh) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111260768B (zh) * | 2020-02-07 | 2022-04-26 | Tencent Technology (Shenzhen) Company Limited | Picture processing method and apparatus, storage medium and electronic apparatus |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP1258837A1 (en) * | 2001-05-14 | 2002-11-20 | Thomson Licensing S.A. | Method to generate mutual photometric effects |
US7583264B2 (en) * | 2005-08-31 | 2009-09-01 | Sega Corporation | Apparatus and program for image generation |
CN104392479A (zh) * | 2014-10-24 | 2015-03-04 | 无锡梵天信息技术股份有限公司 | Method for performing illumination coloring on pixels by using light index numbers |
CN108520551A (zh) * | 2018-03-30 | 2018-09-11 | Suzhou Snail Digital Technology Co., Ltd. | Method for implementing dynamic lighting of light maps, storage medium and computing device |
CN109887066A (zh) * | 2019-02-25 | 2019-06-14 | NetEase (Hangzhou) Network Co., Ltd. | Lighting effect processing method and apparatus, electronic device, and storage medium |
CN111260768A (zh) * | 2020-02-07 | 2020-06-09 | Tencent Technology (Shenzhen) Company Limited | Picture processing method and apparatus, storage medium and electronic apparatus |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6016150A (en) * | 1995-08-04 | 2000-01-18 | Microsoft Corporation | Sprite compositor and method for performing lighting and shading operations using a compositor to combine factored image layers |
JP3777288B2 (ja) | 2000-05-10 | 2006-05-24 | Namco Ltd. | Game system and information storage medium |
US20070070082A1 (en) * | 2005-09-27 | 2007-03-29 | Ati Technologies, Inc. | Sample-level screen-door transparency using programmable transparency sample masks |
US20130265306A1 (en) * | 2012-04-06 | 2013-10-10 | Penguin Digital, Inc. | Real-Time 2D/3D Object Image Composition System and Method |
JP5904281B2 (ja) * | 2012-08-10 | 2016-04-13 | Nikon Corporation | Image processing method, image processing apparatus, imaging apparatus, and image processing program |
CN104134230B (zh) * | 2014-01-22 | 2015-10-28 | Tencent Technology (Shenzhen) Company Limited | Image processing method and apparatus, and computer device |
JP6727816B2 (ja) | 2016-01-19 | 2020-07-22 | Canon Inc. | Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium |
CN108492339B (zh) * | 2018-03-28 | 2021-06-01 | Tencent Technology (Shenzhen) Company Limited | Method and apparatus for acquiring resource compression package, electronic device, and storage medium |
CN110473280B (zh) * | 2018-05-09 | 2024-02-23 | NetEase (Hangzhou) Network Co., Ltd. | Multi-light-source picture rendering method and apparatus, storage medium, processor and terminal |
2020
- 2020-02-07 CN CN202010082717.0A patent/CN111260768B/zh active Active
- 2020-11-04 KR KR1020227008760A patent/KR102617789B1/ko active IP Right Grant
- 2020-11-04 JP JP2022523027A patent/JP7301453B2/ja active Active
- 2020-11-04 EP EP20917554.6A patent/EP4002289A4/en active Pending
- 2020-11-04 WO PCT/CN2020/126344 patent/WO2021155688A1/zh unknown
Non-Patent Citations (1)
Title |
---|
See also references of EP4002289A4 * |
Also Published As
Publication number | Publication date |
---|---|
US20220164991A1 (en) | 2022-05-26 |
JP7301453B2 (ja) | 2023-07-03 |
KR20220046665A (ko) | 2022-04-14 |
EP4002289A1 (en) | 2022-05-25 |
EP4002289A4 (en) | 2022-11-30 |
JP2022553251A (ja) | 2022-12-22 |
CN111260768A (zh) | 2020-06-09 |
CN111260768B (zh) | 2022-04-26 |
KR102617789B1 (ko) | 2023-12-22 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2021129044A1 (zh) | Object rendering method and apparatus, storage medium, and electronic apparatus | |
CN112215934A (zh) | Game model rendering method and apparatus, storage medium, and electronic apparatus | |
CN112316433B (zh) | Game picture rendering method and apparatus, server, and storage medium | |
US20240096007A1 (en) | Rendering Method, Device, and System | |
CN111583378B (zh) | Virtual asset processing method and apparatus, electronic device, and storage medium | |
CN114119818A (zh) | Scene model rendering method, apparatus, and device | |
CN111583379B (zh) | Virtual model rendering method and apparatus, storage medium, and electronic device | |
US10237563B2 (en) | System and method for controlling video encoding using content information | |
CN116091672A (zh) | Image rendering method, computer device, and medium thereof | |
WO2021155688A1 (zh) | Picture processing method and apparatus, storage medium, and electronic device | |
JP7160495B2 (ja) | Image preprocessing method and apparatus, electronic device, and storage medium | |
US20230343021A1 (en) | Visible element determination method and apparatus, storage medium, and electronic device | |
WO2023202254A1 (zh) | Image rendering method and apparatus, electronic device, computer-readable storage medium, and computer program product | |
CN116758201A (zh) | Rendering processing method, device, and system for three-dimensional scene, and computer storage medium | |
CN114428573B (zh) | Special effect image processing method and apparatus, electronic device, and storage medium | |
EP4231243A1 (en) | Data storage management method, object rendering method, and device | |
US11983900B2 (en) | Image processing method and apparatus, storage medium, and electronic device | |
CN113313796B (zh) | Scene generation method and apparatus, computer device, and storage medium | |
WO2022033162A1 (zh) | Model loading method and related apparatus | |
CN111462007B (zh) | Image processing method, apparatus, device, and computer storage medium | |
CN114119831A (zh) | Snow model rendering method and apparatus, electronic device, and readable medium | |
CN113192173A (zh) | Image processing method and apparatus for three-dimensional scene, and electronic device | |
WO2023197729A1 (zh) | Object rendering method and apparatus, electronic device, and storage medium | |
WO2022135050A1 (zh) | Rendering method, device, and system | |
WO2023216771A1 (zh) | Virtual weather interaction method and apparatus, electronic device, computer-readable storage medium, and computer program product | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20917554 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2020917554 Country of ref document: EP Effective date: 20220218 |
|
ENP | Entry into the national phase |
Ref document number: 20227008760 Country of ref document: KR Kind code of ref document: A |
|
ENP | Entry into the national phase |
Ref document number: 2022523027 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |