CN117152171A - Image processing method, apparatus, device, storage medium, and program product - Google Patents

Image processing method, apparatus, device, storage medium, and program product

Info

Publication number
CN117152171A
CN117152171A (application CN202210571068.XA)
Authority
CN
China
Prior art keywords: image, matting, saturation, processed, hue
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210571068.XA
Other languages
Chinese (zh)
Inventor
陈法圣
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN202210571068.XA
Publication of CN117152171A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/194 Segmentation involving foreground-background segmentation
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G06T2207/10024 Color image

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The application provides an image processing method, an image processing apparatus, an electronic device, a computer-readable storage medium, and a computer program product. The method comprises the following steps: acquiring an image to be processed, wherein the image to be processed comprises a background object and a target object; constructing a one-dimensional transparency lookup table for each matting parameter based on at least one matting parameter of the background object; querying the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed; and performing matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object. The application can improve matting efficiency.

Description

Image processing method, apparatus, device, storage medium, and program product
Technical Field
The present application relates to computer technology, and more particularly, to an image processing method, apparatus, electronic device, computer readable storage medium, and computer program product.
Background
With the development of computer technology, matting has been widely applied in areas such as virtual production, interactive games, and virtual live streaming. For example, in current virtual production, a target object is extracted from an original video and fused with the background in a material video to produce a new video that matches the intended shooting scene. This avoids repeatedly building different physical sets during shooting and reduces the cost of scene arrangement.
However, matting schemes in the related art mainly rely on a preset multi-dimensional transparency lookup table for the original image, from which a transparent channel image is obtained and then used for matting. Computing the transparent channel image in this way consumes a large amount of computing resources, resulting in poor matting efficiency.
Disclosure of Invention
The embodiments of the application provide an image processing method, an image processing apparatus, an electronic device, a computer-readable storage medium, and a computer program product, which can fully and effectively utilize a one-dimensional transparency lookup table for each matting parameter to perform matting and thereby improve matting efficiency.
The technical scheme of the embodiment of the application is realized as follows:
the embodiment of the application provides an image processing method, which comprises the following steps:
acquiring an image to be processed, wherein the image to be processed comprises a background object and a target object;
constructing a one-dimensional transparency lookup table of the matting parameters based on at least one matting parameter of the background object;
querying the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed;
and performing matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object.
An embodiment of the present application provides an image processing apparatus including:
an acquisition module, configured to acquire an image to be processed, wherein the image to be processed comprises a background object and a target object;
a construction module, configured to construct a one-dimensional transparency lookup table for the matting parameter based on at least one matting parameter of the background object;
a query module, configured to query the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed;
and a processing module, configured to perform matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object.
In the above technical solution, the construction module is further configured to determine a matting range corresponding to the background object based on matting parameters of the background object;
the transparency corresponding to the color parameter value included in the matting range is completely transparent;
and constructing a one-dimensional transparency lookup table of the matting parameters based on the matting range corresponding to the background object, wherein the one-dimensional transparency lookup table comprises the corresponding relation between the color parameter values and the transparency.
In the above technical solution, when the matting parameter is a hue matting parameter, the matting range corresponding to the background object includes a hue matting range; when the matting parameters are saturation matting parameters, the matting range corresponding to the background object comprises a saturation matting range;
the construction module is further used for determining a hue minimum value and a hue maximum value of the background object based on hue matting parameters of the background object, and taking the hue minimum value and the hue maximum value as endpoint values of the hue matting range;
and determining a saturation minimum value and a saturation maximum value of the background object based on the saturation matting parameter of the background object, and taking the saturation minimum value and the saturation maximum value as endpoint values of the saturation matting range.
In the above technical solution, the construction module is further configured to determine an initial hue minimum value, an initial hue maximum value, and a hue gradient width of the background object based on the hue matting parameter of the background object;
increasing the initial hue minimum value based on the hue gradient width to obtain a hue minimum value of the background object;
reducing the initial hue maximum value based on the hue gradient width to obtain a hue maximum value of the background object;
determining an initial saturation minimum value, an initial saturation maximum value and a saturation gradient width of the background object based on the saturation matting parameters of the background object;
and increasing the initial saturation minimum value based on the saturation gradient width to obtain a saturation minimum value of the background object, and taking the initial saturation maximum value as the saturation maximum value of the background object.
In the above technical solution, the construction module is further configured to obtain a hue gradient width of the background object, and determine a hue gradient range based on the hue matting range and the hue gradient width, where the transparency corresponding to a color parameter value included in the hue gradient range is neither completely transparent nor completely opaque;
constructing a one-dimensional transparency lookup table of the hue matting parameter based on the hue matting range and the hue gradient range;
acquiring a saturation gradient width of the background object, and determining a saturation gradient range based on the saturation matting range and the saturation gradient width, where the transparency corresponding to a color parameter value included in the saturation gradient range is neither completely transparent nor completely opaque;
and constructing a one-dimensional transparency lookup table of the saturation matting parameter based on the saturation matting range and the saturation gradient range.
In the above technical solution, the query module is further configured to perform, for any pixel in the image to be processed, the following processing:
querying the one-dimensional transparency lookup table of the matting parameter based on the color parameter of the pixel to obtain the transparency of the pixel;
and combining the transparency of a plurality of pixels according to the position relation of the pixels in the image to be processed to obtain a transparent channel image corresponding to the image to be processed.
In the above technical solution, when there are a plurality of matting parameters, the query module is further configured to query the one-dimensional transparency lookup tables of the plurality of matting parameters respectively, based on the color parameter of the pixel, to obtain the transparencies corresponding to the plurality of matting parameters;
and taking the maximum value of the transparency corresponding to the matting parameters as the transparency of the pixel.
In the above technical solution, the image to be processed is a color image; the construction module is further configured to, before the one-dimensional transparency lookup table of the matting parameter is queried based on the color parameter value of each pixel in the image to be processed to obtain the transparent channel image corresponding to the image to be processed, convert the image to be processed into a candidate color space to obtain a first converted image of the image to be processed in the candidate color space;
convert the image to be processed into a black-and-white space to obtain a second converted image of the image to be processed in the black-and-white space;
and fuse the first converted image and the second converted image to obtain a third converted image corresponding to the image to be processed;
the query module is further configured to query the one-dimensional transparency lookup table of the matting parameter based on the color parameter of each pixel in the third converted image to obtain the transparent channel image corresponding to the image to be processed.
In the above technical solution, the color parameter of each pixel in the third converted image includes a hue parameter, a saturation parameter, and a brightness parameter; the construction module is further configured to determine a saturation scaling factor based on a luminance parameter of each pixel in the second converted image;
scaling the saturation parameter of each pixel in the first converted image based on the saturation scaling factor to obtain the scaled saturation parameter;
and combining the brightness parameter of each pixel in the second conversion image, the saturation parameter after scaling and the hue parameter of each pixel in the first conversion image to obtain a third conversion image corresponding to the image to be processed.
In the above technical solution, the processing module is further configured to perform morphological processing on the transparent channel image, to obtain the transparent channel image after morphological processing;
performing interference color removal processing on the image to be processed to obtain the image to be processed with the interference color removed;
and performing fusion processing on the morphologically processed transparent channel image and the image to be processed after interference color removal, to obtain a target image from which the background object is removed and which comprises the target object.
In the above technical solution, the pixel value of each pixel in the image to be processed includes a first color value, a second color value, and an interference color value; the processing module is further configured to perform the following processing for any pixel in the image to be processed:
determining an interference color reference value for the pixel based on the first color value and the second color value of the pixel;
taking the minimum value of the interference color value of the pixel and the interference color reference value of the pixel, and combining it with the first color value and the second color value of the pixel to obtain the pixel with the interference color removed;
and combining the pixels with the interference color removed according to the positional relationship of the pixels in the image to be processed to obtain the image to be processed with the interference color removed.
In the above technical solution, the processing module is further configured to take the absolute value of the difference between the first color value and the second color value of the pixel as the difference value between the first color value and the second color value;
determine the minimum value of the first color value and the second color value of the pixel;
and take the sum of the minimum value and the product of the difference value and an interference color coefficient as the interference color reference value of the pixel.
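For a green screen, this rule can be sketched as follows. This is a minimal illustration in Python with NumPy (the patent prescribes no language), assuming the interference color is the green channel, the first and second color values are red and blue, and an illustrative coefficient of 0.5; none of these bindings are fixed by the patent.

```python
import numpy as np

def despill_green(image: np.ndarray, coeff: float = 0.5) -> np.ndarray:
    """Clamp the interference (green) channel of an RGB image to
    min(G, |R - B| * coeff + min(R, B)), i.e., the reference value above.
    The channel roles and `coeff` are illustrative assumptions."""
    rgb = image.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    reference = np.abs(r - b) * coeff + np.minimum(r, b)  # interference color reference value
    rgb[..., 1] = np.minimum(g, reference)                # keep the smaller of G and the reference
    return rgb.astype(image.dtype)
```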
An embodiment of the present application provides an electronic device for image processing, including:
a memory for storing executable instructions;
and the processor is used for realizing the image processing method provided by the embodiment of the application when executing the executable instructions stored in the memory.
The embodiment of the application provides a computer readable storage medium which stores executable instructions for realizing the image processing method provided by the embodiment of the application when being executed by a processor.
An embodiment of the present application provides a computer program product including executable instructions that when executed by a processor implement the image processing method provided by the embodiment of the present application.
The embodiment of the application has the following beneficial effects:
a one-dimensional transparency lookup table is constructed for each matting parameter from at least one matting parameter of the background object in the image to be processed; the one-dimensional transparency lookup tables are then queried rapidly for each pixel in the image to be processed, and the image to be processed is matted based on the resulting transparent channel image. The one-dimensional transparency lookup tables are thus fully and effectively utilized to achieve fast matting, which improves matting efficiency and saves a large amount of computing resources.
Drawings
Fig. 1 is a schematic view of an application scenario of an image processing system according to an embodiment of the present application;
Fig. 2 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Figs. 3A to 3C are schematic flow diagrams of an image processing method according to an embodiment of the present application;
Figs. 3D to 3E are diagrams illustrating a one-dimensional transparency lookup table according to an embodiment of the present application;
Fig. 4A is a schematic diagram of a matting effect provided by an embodiment of the present application;
Fig. 4B is a schematic view of a virtual production scene provided by an embodiment of the present application;
Fig. 5 is a flow chart of a high-quality, fast green-screen matting algorithm based on separate lookup tables according to an embodiment of the present application;
Fig. 6 is a flow chart of color space conversion provided by an embodiment of the present application;
Fig. 7 is a schematic diagram of the table-building process of a one-dimensional <hue-transparency> table according to an embodiment of the present application;
Fig. 8 is a schematic diagram of a <hue-transparency> table provided by an embodiment of the present application;
Fig. 9 is a schematic diagram of the table-building process of a one-dimensional <saturation-transparency> table according to an embodiment of the present application;
Fig. 10 is a schematic diagram of a <saturation-transparency> table provided by an embodiment of the present application;
Fig. 11 is a schematic flow chart of calculating a transparent channel according to an embodiment of the present application;
Fig. 12A is a schematic diagram of a two-dimensional table provided by an embodiment of the present application;
Fig. 12B is a schematic diagram of a <hue-transparency> table provided by an embodiment of the present application;
Fig. 12C is a schematic diagram of a <saturation-transparency> table provided by an embodiment of the present application;
Fig. 13A is a schematic illustration of pixels of black pants tinged green by the green screen according to an embodiment of the present application;
Fig. 13B is a schematic illustration of skin pixels that are not darkened by the green-removal process according to an embodiment of the present application;
Fig. 14 is a schematic flow chart of post-processing provided by an embodiment of the present application;
Fig. 15A is a schematic diagram of a matting result provided by an embodiment of the present application;
Fig. 15B is a schematic diagram of another matting result provided by an embodiment of the present application.
Detailed Description
To make the objects, technical solutions, and advantages of the present application clearer, the present application is described in further detail below with reference to the accompanying drawings. The described embodiments should not be construed as limiting the present application, and all other embodiments obtained by those skilled in the art without inventive effort fall within the scope of the present application.
In the following description, the terms "first", "second", and the like are used merely to distinguish similar objects and do not imply a particular ordering of the objects. It should be understood that, where permitted, "first", "second", and the like may be interchanged so that the embodiments of the application described herein can be practiced in an order other than that illustrated or described herein.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
Before the embodiments of the present application are described in further detail, the terms involved in the embodiments of the present application are explained as follows.
1) Video: consists of a series of image frames, i.e., a sequence of images. The fluency of a video can be expressed by the number of frames transmitted per second (FPS, Frames Per Second): the more frames per second, the smoother the displayed motion. FPS is a term from the image field, referring to the number of pictures in an animation or video. Each image frame is a still image; when the frames are played in sequence, a moving image is created. For example, 30 FPS indicates that 30 "still images" are played per second.
2) Matting parameters: parameters that determine whether a pixel in the image to be processed belongs to the background; when a pixel in the image to be processed belongs to the background, the color parameter (or pixel value) corresponding to the pixel matches a matting parameter. For example, the color parameters (including saturation and hue) corresponding to a background object (such as a green screen) in the image to be processed are the matting parameters of the background object.
3) Two-dimensional code green screen: a green screen printed with two-dimensional codes. The green screen comprises three faces: a left two-dimensional code plane, a right two-dimensional code plane, and a bottom two-dimensional code plane. Every two-dimensional code on the screen has a unique pattern and number, and a corresponding two-dimensional code detection algorithm can detect all unoccluded codes and accurately acquire the corner coordinates of each code in the captured image.
4) Color parameters: the parameter values corresponding to each pixel in a color space. For example, in the RGB color space, red (R), green (G), and blue (B) each represent a color parameter; in the RGBA color space, red (R), green (G), blue (B), and transparency (A, alpha) each represent a color parameter; in the HSY color space, hue (H), saturation (S), and brightness (Y) each represent a color parameter.
Hue is the primary characteristic of a color and the most accurate standard for distinguishing colors. Under illumination of different wavelengths, the human eye perceives different colors, such as blue and red; the appearance of these colors is known as hue.
Saturation, also referred to as "purity", refers to the vividness of a color. The higher the saturation, the purer and more vivid the color. When mixed with other colors, the saturation of a color decreases and the color darkens and fades. When the saturation drops to its lowest, the hue is lost and the color becomes achromatic (black, white, or gray).
Lightness refers to the degree of brightness of a color; all colors have some degree of brightness. Among the achromatic colors, white is the brightest, gray is intermediate, and black is the darkest. It should be noted that a change in brightness often affects purity: for example, adding white to red increases brightness but decreases purity.
5) Morphology: extracts from an image the components that are meaningful for expressing and delineating the shape of a region, so that subsequent recognition can grasp the most essential shape features of the target object, such as boundaries and connected regions. Morphology is the fundamental theory of mathematical morphological image processing; its basic operations include binary erosion and dilation, binary opening and closing operations, skeleton extraction, ultimate erosion, hit-or-miss transform, morphological gradient, top-hat transform, particle analysis, watershed transform, grayscale erosion and dilation, grayscale opening and closing operations, grayscale morphological gradient, and the like.
6) Shader: an editable program used to implement image rendering in place of the fixed rendering pipeline. In a shader, the processing flow for pixels can be defined, so shaders can be used to accelerate computation while the graphics processing unit (GPU, Graphics Processing Unit) processes the image in parallel, offloading work from the central processing unit (CPU, Central Processing Unit). Because shaders replace the fixed rendering pipeline, they can implement the relevant computations in 3D graphics and, owing to their editability, achieve various image effects without being limited by the graphics card's fixed rendering pipeline.
7) Client side: applications running in the terminal for providing various services.
The embodiment of the application provides an image processing method, an image processing device, electronic equipment, a computer readable storage medium and a computer program product, which can fully and effectively utilize a one-dimensional transparency lookup table of each matting parameter to realize matting and improve matting efficiency.
The image processing method provided by the embodiments of the application may be implemented by a terminal alone, by a server alone, or by a terminal and a server cooperatively. For example, the terminal alone carries out the image processing method described below; or the terminal sends a matting request for an image to be processed to the server, and the server executes the image processing method according to the received request: it constructs a one-dimensional transparency lookup table of the matting parameter based on at least one matting parameter of the background object, queries the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed, performs matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object, and executes applications such as virtual production, interactive games, and virtual live streaming based on the target image.
The electronic device for image processing provided by the embodiments of the application may be any of various types of terminals or servers. The server may be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (CDN, Content Delivery Network), big data, and artificial intelligence platforms. The terminal may be, but is not limited to, a smartphone, a tablet computer, a notebook computer, a desktop computer, a smart speaker, a smart watch, a smart television, or a smart in-vehicle device. The terminal and the server may be directly or indirectly connected through wired or wireless communication, which is not limited in the present application.
Referring to fig. 1, fig. 1 is a schematic view of an application scenario of an image processing system 10 according to an embodiment of the present application, a terminal 200 is connected to a server 100 through a network 300, and the network 300 may be a wide area network or a local area network, or a combination of the two.
The terminal 200 (running a client, for example a matting client) may be used to obtain a matting request for an image to be processed. For example, when the user inputs an image to be processed after opening the client running on the terminal, the terminal automatically obtains a matting request for the image to be processed (including the image to be processed).
In some embodiments, an image processing plug-in may be embedded in the client running in the terminal 200 to implement the image processing method locally on the client. For example, the terminal 200 invokes the image processing plug-in to implement the image processing method: it constructs a one-dimensional transparency lookup table of the matting parameter based on at least one matting parameter of the background object in the image to be processed, queries the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed, and performs matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object. The one-dimensional transparency lookup table of each matting parameter is thus fully and effectively utilized to implement matting and improve matting efficiency, and applications such as virtual production, interactive games, and virtual live streaming are executed based on the target image.
In some embodiments, after the terminal 200 obtains a matting request for an image to be processed, it invokes the image processing interface of the server 100 (which may be provided in the form of a cloud service, i.e., an image processing service). The server 100 implements the image processing method based on the matting request: it constructs a one-dimensional transparency lookup table of the matting parameter based on at least one matting parameter of the background object in the image to be processed, queries the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed, and performs matting processing based on the transparent channel image to obtain a target image from which the background object is removed and which comprises the target object. The one-dimensional transparency lookup table of each matting parameter is thus fully and effectively utilized to implement matting and improve matting efficiency, and applications such as virtual production, interactive games, and virtual live streaming are executed based on the target image.
In some embodiments, the terminal or the server may implement the image processing method provided by the embodiments of the present application by running a computer program (i.e., executable instructions). For example, the computer program may be a native program or a software module in an operating system; a native application (APP), i.e., a program that must be installed in the operating system to run, such as a video application (e.g., a video client running on a terminal); an applet, i.e., a program that only needs to be downloaded into a browser environment to run; or an applet that can be embedded in any APP. In general, the computer program may be any form of application, module, or plug-in.
In some embodiments, multiple servers may be organized into a blockchain, and server 100 may be nodes on the blockchain, where there may be an information connection between each node in the blockchain, and where information may be transferred between nodes via the information connection. The data (such as logic of image processing and target image) related to the image processing method provided by the embodiment of the application can be stored on a blockchain.
Referring to fig. 2, fig. 2 is a schematic structural diagram of an electronic device 500 according to an embodiment of the present application, and the electronic device 500 shown in fig. 2 includes: at least one processor 510, a memory 550, at least one network interface 520, and a user interface 530. The various components in electronic device 500 are coupled together by bus system 540. It is appreciated that the bus system 540 is used to enable connected communications between these components. The bus system 540 includes a power bus, a control bus, and a status signal bus in addition to the data bus. The various buses are labeled as bus system 540 in fig. 2 for clarity of illustration.
The processor 510 may be an integrated circuit chip with signal processing capabilities, such as a general-purpose processor (for example, a microprocessor or any conventional processor), a digital signal processor (DSP, Digital Signal Processor), another programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Memory 550 includes volatile memory or nonvolatile memory, and may also include both volatile and nonvolatile memory. The nonvolatile memory may be a read-only memory (ROM, Read-Only Memory), and the volatile memory may be a random access memory (RAM, Random Access Memory). The memory 550 described in the embodiments of the present application is intended to comprise any suitable type of memory. Memory 550 may optionally include one or more storage devices physically located remote from processor 510.
In some embodiments, memory 550 is capable of storing data to support various operations, examples of which include programs, modules and data structures, or subsets or supersets thereof, as exemplified below.
An operating system 551 including system programs for handling various basic system services and performing hardware-related tasks, such as a framework layer, a core library layer, a driver layer, etc., for implementing various basic services and handling hardware-based tasks;
Network communication module 552 is used to reach other electronic devices via one or more (wired or wireless) network interfaces 520; exemplary network interfaces 520 include Bluetooth, wireless fidelity (WiFi), universal serial bus (USB, Universal Serial Bus), and the like;
in some embodiments, the image processing apparatus provided in the embodiments of the present application may be implemented in software, and fig. 2 shows an image processing apparatus 555 stored in a memory 550, which may be software in the form of a program, a plug-in, or the like, including the following software modules: the acquisition module 5551, the construction module 5552, the query module 5553, the processing module 5554 are logical, and thus may be arbitrarily combined or further split according to the implemented functions. The functions of the respective modules will be described hereinafter.
As described above, the image processing method provided by the embodiment of the present application may be implemented by various types of electronic devices. Referring to fig. 3A, fig. 3A is a schematic flow chart of an image processing method according to an embodiment of the present application, and the description is made with reference to the steps shown in fig. 3A.
In step 101, an image to be processed is acquired, wherein the image to be processed includes a background object and a target object.
It should be noted that the image to be processed is the image to be matted, from which the target object needs to be extracted. The image to be processed may be an image captured against a background object (for example, a green screen or a two-dimensional code green screen); that is, the image to be processed includes the background object and a target object (also referred to as a subject) that occludes part of the background object. The background object in the image to be processed is the part to be matted out. The target object extracted from the image to be processed is fused with material to produce an image that matches the intended shooting scene, which avoids repeatedly building different physical sets during shooting and reduces the cost of scene arrangement. For example, if the material is a news studio, the target object is fused with the news studio to produce an image of a news hosting scene.
In step 102, a one-dimensional transparency lookup table of matting parameters is constructed based on at least one matting parameter of a background object.
The one-dimensional transparency lookup table contains a one-dimensional correspondence between the color parameter values of pixels and transparency. A one-dimensional transparency lookup table is constructed for each matting parameter of the background object, and the tables are then queried separately and rapidly for each pixel in the image to be processed; this avoids querying a multi-dimensional transparency lookup table and improves query efficiency. It should be noted that the one-dimensional transparency lookup table may be a continuous table or a discrete table.
Referring to fig. 3B, fig. 3B is a flowchart of an image processing method according to an embodiment of the present application, and fig. 3B illustrates that step 102 of fig. 3A may be implemented by steps 1021-1022: in step 1021, a matting range corresponding to the background object is determined based on the matting parameters of the background object; the transparency corresponding to the color parameter value included in the matting range is completely transparent; in step 1022, a one-dimensional transparency lookup table of the matting parameters is constructed based on the matting range corresponding to the background object, where the one-dimensional transparency lookup table includes a correspondence between color parameter values and transparency.
For example, based on the matting parameters of the background object, the matting range of the background object under each matting parameter can be determined; that is, if the color parameter value of any pixel falls within the matting range, the pixel is completely transparent. It should be noted that the one-dimensional transparency lookup table constructed from the matting range corresponding to the background object includes a matting range and a non-matting range, where the transparency corresponding to color parameter values in the non-matting range is not completely transparent; the transparency of a pixel can then be obtained by querying the one-dimensional transparency lookup table with the color parameter value of the pixel.
For example, when the matting parameter is a hue matting parameter, a hue matting range corresponding to the background object is determined based on the hue matting parameter of the background object, where the transparency corresponding to a hue value included in the hue matting range is completely transparent, and a one-dimensional transparency lookup table of the hue matting parameter is constructed based on the hue matting range corresponding to the background object, the table including the correspondence between hue values and transparency. When the matting parameter is a saturation matting parameter, a saturation matting range corresponding to the background object is determined based on the saturation matting parameter of the background object, where the transparency corresponding to a saturation value included in the saturation matting range is completely transparent, and a one-dimensional transparency lookup table of the saturation matting parameter is constructed based on the saturation matting range corresponding to the background object, the table including the correspondence between saturation values and transparency.
In some embodiments, when the matting parameter is a hue matting parameter, the matting range corresponding to the background object includes a hue matting range; when the matting parameters are saturation matting parameters, the matting range corresponding to the background object comprises a saturation matting range; based on the matting parameters of the background object, determining the matting range corresponding to the background object comprises the following steps: based on hue matting parameters of the background object, determining hue minimum and hue maximum of the background object, and taking the hue minimum and hue maximum as endpoint values of a hue matting range; and determining a saturation minimum value and a saturation maximum value of the background object based on the saturation matting parameter of the background object, and taking the saturation minimum value and the saturation maximum value as end point values of the saturation matting range.
For example, when the matting parameter is a hue matting parameter, a hue minimum value h1 and a hue maximum value h2 corresponding to the background object are determined according to actual requirements, and the hue minimum value h1 and the hue maximum value h2 are used as the endpoint values of the hue matting range, yielding the hue matting range [h1, h2] corresponding to the background object. When the hue value of a pixel is in the hue matting range [h1, h2], the transparency of the pixel is completely transparent.
As shown in fig. 3D, in the one-dimensional transparency lookup table of the hue matting parameter (also referred to as a one-dimensional <hue-transparency> table), the horizontal axis represents the hue value and the vertical axis represents the transparency. The transparency corresponding to the hue matting range [h1, h2] is completely transparent, i.e., f(x) = 0, and the transparency corresponding to the other ranges [0, h1) and (h2, 255] is completely opaque, i.e., f(x) = 255.
For example, when the matting parameter is a saturation matting parameter, a saturation minimum value s1 and a saturation maximum value s2 corresponding to the background object are determined according to actual requirements, and the saturation minimum value s1 and the saturation maximum value s2 are used as the endpoint values of the saturation matting range, yielding the saturation matting range [s1, s2] corresponding to the background object. When the saturation of a pixel is in the saturation matting range [s1, s2], the transparency of the pixel is completely transparent.
As shown in fig. 3E, in the one-dimensional transparency lookup table of the saturation matting parameter (also referred to as a one-dimensional <saturation-transparency> table), the horizontal axis represents the saturation and the vertical axis represents the transparency. The transparency corresponding to the saturation matting range [s1, 255] is completely transparent, i.e., g(x) = 0, and the transparency corresponding to the other range [0, s1) is completely opaque, i.e., g(x) = 255.
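To make the construction concrete, the following is a minimal sketch of the discrete hard-edged tables of fig. 3D and fig. 3E, written in Python with NumPy (the patent prescribes no language); the 256-entry size assumes 8-bit color parameters, and the endpoint values h1, h2, s1 are illustrative choices for a green screen, not values from the patent.

```python
import numpy as np

def build_hard_lut(lo: int, hi: int, size: int = 256) -> np.ndarray:
    """Discrete one-dimensional transparency lookup table: 0 (completely
    transparent) inside the matting range [lo, hi], 255 (completely
    opaque) elsewhere."""
    lut = np.full(size, 255, dtype=np.uint8)
    lut[lo:hi + 1] = 0
    return lut

# Illustrative endpoints for a green screen (assumptions, not patent values)
hue_lut = build_hard_lut(60, 100)   # fig. 3D: f(x) = 0 on [h1, h2] = [60, 100]
sat_lut = build_hard_lut(80, 255)   # fig. 3E: g(x) = 0 on [s1, 255] = [80, 255]
```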
In some embodiments, determining the hue minimum value and the hue maximum value of the background object based on the hue matting parameter of the background object comprises: based on the hue matting parameters of the background object, determining an initial hue minimum value, an initial hue maximum value and a hue gradient width of the background object; increasing an initial hue minimum value based on the hue gradient width to obtain a hue minimum value of the background object; reducing the initial hue maximum value based on the hue gradient width to obtain the hue maximum value of the background object; based on saturation matting parameters of the background object, determining a saturation minimum value and a saturation maximum value of the background object comprises: determining an initial saturation minimum value, an initial saturation maximum value and a saturation gradient width of the background object based on the saturation matting parameters of the background object; and increasing the initial saturation minimum value based on the saturation gradient width to obtain the saturation minimum value of the background object, and taking the initial saturation maximum value as the saturation maximum value of the background object.
Transparencies within the hue gradient width are neither completely transparent nor completely opaque, and the same holds for the saturation gradient width. The hue matting range is narrowed by the hue gradient width, and the saturation matting range by the saturation gradient width, which makes the matting ranges more accurate and improves the matting effect.
With the above example, the background object has an initial hue minimum value of h1, an initial hue maximum value of h2, and a hue gradient width of r1, the initial hue minimum value is increased based on the hue gradient width r1 to obtain a hue minimum value h1+r1/2 of the background object, and the initial hue maximum value is decreased based on the hue gradient width r1 to obtain a hue maximum value h2-r1/2 of the background object.
With the above example in mind, the background object has an initial saturation minimum value s1, an initial saturation maximum value s2, and a saturation gradient width r2, the initial saturation minimum value is increased based on the saturation gradient width r2 to obtain a saturation minimum value s1+r2/2 of the background object, and the initial saturation maximum value is reduced based on the saturation gradient width r2 to obtain a saturation maximum value s2-r2/2 of the background object.
In some embodiments, constructing a one-dimensional transparency lookup table of matting parameters based on matting ranges corresponding to background objects includes: acquiring the hue gradient width of a background object, and determining the hue gradient range based on the hue matting range and the hue gradient width, wherein the transparency corresponding to the color parameter value included in the hue gradient range is not completely transparent and not completely opaque; based on the hue matting range and the hue gradient range, constructing a one-dimensional transparency lookup table of hue matting parameters; acquiring a saturation gradient width of a background object, and determining a saturation gradient range based on the saturation matting range and the saturation gradient width, wherein the transparency corresponding to a color parameter value included in the saturation gradient range is not completely transparent and not completely opaque; and constructing a one-dimensional transparency lookup table of the saturation matting parameter based on the saturation matting range and the saturation gradient range.
For example, the hue gradient range and the saturation gradient range prevent a pixel from jumping directly from completely transparent to completely opaque, or from completely opaque to completely transparent; instead, the pixel transitions smoothly between the two, which softens the matting, improves the matting effect, and avoids abrupt results.
As shown in fig. 8, in the one-dimensional transparency lookup table of the hue matting parameter (also called a one-dimensional <hue-transparency> table), the horizontal axis represents the hue value x and the vertical axis represents the transparency. The transparency corresponding to the hue matting range [h1+r1/2, h2-r1/2] is completely transparent, i.e., f(x) = 0; the transparency corresponding to the hue gradient ranges (h1-r1/2, h1+r1/2) and (h2-r1/2, h2+r1/2) is neither completely transparent nor completely opaque; and the transparency corresponding to the other ranges [0, h1-r1/2] and [h2+r1/2, 255] is completely opaque, i.e., f(x) = 255. Taking linear transitions over the gradient ranges, consistent with fig. 8, f(x) is defined as follows:

f(x) = 255,                        x in [0, h1-r1/2]
f(x) = 255*(h1+r1/2-x)/r1,         x in (h1-r1/2, h1+r1/2)
f(x) = 0,                          x in [h1+r1/2, h2-r1/2]
f(x) = 255*(x-(h2-r1/2))/r1,       x in (h2-r1/2, h2+r1/2)
f(x) = 255,                        x in [h2+r1/2, 255]
as shown in fig. 10, a one-dimensional transparency lookup table (also called a one-dimensional < saturation-transparency > table) of saturation matting parameters, wherein a horizontal axis in the one-dimensional transparency lookup table of saturation matting parameters represents saturation values, a vertical axis represents transparency, wherein transparency corresponding to a saturation matting range [ s2+r2/2,255] is completely transparent, that is, g (x) =0, x represents saturation, transparency corresponding to a saturation gradient range (s 1-r2/2, s1+r2/2) is not completely transparent and not completely opaque, and transparency corresponding to other ranges [0, s1-r2/2] is completely opaque, that is, g (x) =255. Wherein, the specific definition of the formula g (x) is as follows:
/>
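The gradient-aware tables can be sketched as follows, implementing the piecewise definitions of f(x) and g(x) above in the same Python/NumPy style; the linear ramps match the formulas, while the example endpoints and widths remain illustrative assumptions.

```python
import numpy as np

def build_soft_lut(lo: float, hi: float, width: float, size: int = 256,
                   open_ended: bool = False) -> np.ndarray:
    """One-dimensional transparency LUT with linear ramps of `width`
    centred on the range endpoints: 0 inside [lo+width/2, hi-width/2],
    255 outside (lo-width/2, hi+width/2), linear in between. With
    `open_ended`, the range extends to the top of the table (the
    saturation case, g(x))."""
    x = np.arange(size, dtype=np.float32)
    down = np.clip((lo + width / 2 - x) / width, 0.0, 1.0) * 255.0  # 255 -> 0 around lo
    if open_ended:
        return down.astype(np.uint8)
    up = np.clip((x - (hi - width / 2)) / width, 0.0, 1.0) * 255.0  # 0 -> 255 around hi
    return np.maximum(down, up).astype(np.uint8)

hue_lut = build_soft_lut(lo=60, hi=100, width=10)                   # f(x), with r1 = 10
sat_lut = build_soft_lut(lo=80, hi=255, width=10, open_ended=True)  # g(x), with r2 = 10
```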
In step 103, a one-dimensional transparency lookup table of the matting parameters is queried based on each pixel in the image to be processed, and a transparent channel image corresponding to the image to be processed is obtained.
For example, the one-dimensional transparency lookup tables are queried separately and rapidly with the color parameters of each pixel in the image to be processed to obtain the transparency of each pixel; this avoids querying a multi-dimensional transparency lookup table and improves query efficiency.
In some embodiments, querying the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain the transparent channel image corresponding to the image to be processed includes: performing the following processing for any pixel in the image to be processed: querying the one-dimensional transparency lookup table of the matting parameter based on the color parameter of the pixel to obtain the transparency of the pixel; and combining the transparencies of the pixels according to the positional relationship of the pixels in the image to be processed to obtain the transparent channel image corresponding to the image to be processed.
For example, when the matting parameter is a hue matting parameter, the following processing is performed for any pixel in the image to be processed: the one-dimensional transparency lookup table of the hue matting parameter is queried based on the hue value of the pixel to obtain the transparency of the pixel, and the transparencies of the pixels are combined according to the positional relationship of the pixels in the image to be processed to obtain the transparent channel image corresponding to the image to be processed.
For example, when the matting parameter is a saturation matting parameter, the following processing is performed for any pixel in the image to be processed: the one-dimensional transparency lookup table of the saturation matting parameter is queried based on the saturation of the pixel to obtain the transparency of the pixel, and the transparencies of the pixels are combined according to the positional relationship of the pixels in the image to be processed to obtain the transparent channel image corresponding to the image to be processed.
In some embodiments, when there are a plurality of matting parameters, querying the one-dimensional transparency lookup table of the matting parameter based on the color parameter of the pixel to obtain the transparency of the pixel includes: querying the one-dimensional transparency lookup tables of the plurality of matting parameters respectively based on the color parameters of the pixel to obtain the transparencies corresponding to the plurality of matting parameters; and taking the maximum value of the transparencies corresponding to the plurality of matting parameters as the transparency of the pixel.
For example, when the matting parameters of the background object include a hue matting parameter and a saturation matting parameter, a one-dimensional transparency lookup table of the hue matting parameter is queried based on a hue value of a pixel to obtain a transparency a1 of the hue matting parameter, a one-dimensional transparency lookup table of the saturation matting parameter is queried based on a saturation of the pixel to obtain a transparency a2 of the saturation matting parameter, and a maximum value in the transparency a1 of the hue matting parameter and the transparency a2 of the saturation matting parameter is used as the transparency a of the pixel.
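Continuing the sketch above, the per-pixel queries and the maximum-value combination can be vectorized as follows; `hue` and `sat` are assumed to be uint8 images of the corresponding color parameters.

```python
import numpy as np

def alpha_channel(hue: np.ndarray, sat: np.ndarray,
                  hue_lut: np.ndarray, sat_lut: np.ndarray) -> np.ndarray:
    """Query each one-dimensional table separately with the pixel's color
    parameter, then keep the maximum (most opaque) transparency."""
    a1 = hue_lut[hue]          # one table lookup per pixel via array indexing
    a2 = sat_lut[sat]
    return np.maximum(a1, a2)  # transparency a = max(a1, a2)
```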
In some embodiments, the image to be processed is a color image. Before the one-dimensional transparency lookup table of the matting parameter is queried based on the color parameter value of each pixel in the image to be processed to obtain the transparent channel image corresponding to the image to be processed, the image to be processed is converted into a candidate color space to obtain a first converted image of the image to be processed in the candidate color space; the image to be processed is converted into a black-and-white space to obtain a second converted image of the image to be processed in the black-and-white space; and the first converted image and the second converted image are fused to obtain a third converted image corresponding to the image to be processed. Querying the one-dimensional transparency lookup table of the matting parameter for each pixel in the image to be processed to obtain the transparent channel image corresponding to the image to be processed then includes: querying the one-dimensional transparency lookup table of the matting parameter based on the color parameter of each pixel in the third converted image to obtain the transparent channel image corresponding to the image to be processed.
For example, the image to be processed is an RGB color image and the matting parameters are a saturation matting parameter and a hue matting parameter. The image to be processed is converted into a candidate color space (i.e., the HLS color space) to obtain a first converted image (i.e., an HLS image) of the image to be processed in the candidate color space; the image to be processed is converted into a black-and-white space to obtain a second converted image (i.e., a black-and-white image) of the image to be processed in the black-and-white space; the first converted image and the second converted image are fused to obtain a third converted image (i.e., an HSY image) corresponding to the image to be processed; and finally the one-dimensional transparency lookup tables are queried based on the color parameters (e.g., saturation and hue value) of each pixel in the third converted image to obtain the transparent channel image corresponding to the image to be processed.
In some embodiments, the color parameters of each pixel in the third converted image include a hue parameter, a saturation parameter, and a brightness parameter; fusing the first conversion image and the second conversion image to obtain a third conversion image corresponding to the image to be processed, wherein the method comprises the following steps: determining a saturation scaling factor based on a luminance parameter of each pixel in the second converted image; scaling the saturation parameter of each pixel in the first converted image based on the saturation scaling factor to obtain a scaled saturation parameter; and combining the brightness parameter of each pixel in the second conversion image, the saturation parameter after scaling and the hue parameter of each pixel in the first conversion image to obtain a third conversion image corresponding to the image to be processed.
Continuing the above example, the following processing is performed for each pixel of the RGB color image to convert it into an HSY image: convert the image to be processed into the candidate color space using an RGB-to-HLS algorithm to obtain the hue h and saturation s of each pixel; convert the image to be processed into a black-and-white space using an RGB-to-grayscale algorithm to obtain the brightness Y (i.e., the brightness parameter) of each pixel; determine the saturation scaling factor from the brightness parameter of each pixel in the second converted image, namely compute the minimum value m between Y and 255 - Y, where m/255 is the saturation scaling factor; and scale the saturation parameter s by m/255, the scaled s being the saturation value in the HSY color space. The hue parameter h of each pixel in the first converted image is used as the hue H in the HSY color space, the scaled s as the saturation S, and Y as the Y component, yielding the converted HSY pixels; all converted HSY pixels are combined to obtain the third converted image (i.e., the HSY image) corresponding to the image to be processed.
In step 104, a matting process is performed based on the transparent channel image, so as to obtain a target image from which the background object is removed and which includes the target object.
For example, the one-dimensional transparency lookup tables are queried quickly for each pixel in the image to be processed, and the matting is performed based on the resulting transparent channel image corresponding to the image to be processed; the one-dimensional transparency lookup tables are thereby used fully and effectively to realize fast matting, improving matting efficiency and saving a large amount of computing resources.
Referring to fig. 3C, fig. 3C is a flowchart of an image processing method according to an embodiment of the present application, and fig. 3C illustrates that step 104 of fig. 3A may be implemented by steps 1041 to 1043: in step 1041, performing morphological processing on the transparent channel image to obtain a transparent channel image after morphological processing; in step 1042, performing interference color removal processing on the image to be processed to obtain an image to be processed with interference color removed; in step 1043, fusion processing is performed on the transparent channel image after morphological processing and the image to be processed after interference color removal, so as to obtain a target image including the target object and the background object removed.
It should be noted that an interference color arises when the color of the background object affects the original colors of the target object. For example, when the background object is a green curtain, some pixels of the target object in the image to be processed may be tinted green, so the image to be processed needs to undergo green-removal processing to obtain an image with the green cast removed.
For example, morphological processing such as expansion, erosion, and blurring is performed on the transparent channel image to obtain a morphologically processed transparent channel image, which is then fused with the image to be processed after interference color removal to obtain a target image that removes the background object and includes the target object. In this way, noise, holes, and the like on the transparent channel image can be removed, realizing accurate matting.
In some embodiments, the pixel value of each pixel in the image to be processed includes a first color value, a second color value, and an interference color value. Performing interference color removal on the image to be processed to obtain the image to be processed with the interference color removed includes, for any pixel in the image to be processed: determining an interference color reference value for the pixel based on its first color value and second color value; combining the minimum of the pixel's interference color value and its interference color reference value with the pixel's first color value and second color value to obtain a pixel with the interference color removed; and combining the pixels with the interference color removed, according to their positional relationship in the image to be processed, to obtain the image to be processed with the interference color removed.
The interference color reference value serves as a reference bound for the interference color of the pixel. It is determined as follows: the absolute value of the difference between the first color value and the second color value of the pixel is taken as the difference between the two; the minimum of the first color value and the second color value of the pixel is determined; and the product of the difference and an interference color coefficient is added to the minimum to obtain the interference color reference value of the pixel.
Taking green as the interference color in the following, the first color value is the gray value of the pixel's red channel (R) and the second color value is the gray value of its blue channel (B). If the gray value of a pixel's green channel (G) exceeds the average of its red and blue channels, the pixel appears greenish; if the green channel value of such a pixel is reduced to that average, it no longer appears greenish. The green-removal process (i.e., interference color removal) is as follows. For each RGB pixel (i.e., each pixel in the image to be processed): compute the minimum and maximum of the R and B channels, denoted j and k respectively; compute d = k - j, where d represents the difference between the first color value and the second color value; compute the threshold T = j + t·d, where t is the de-greening coefficient (which can be set to 0.5) and T represents the interference color reference value; take the minimum of the G channel and T as the de-greened G channel value, keep the original B and R channel values as the de-greened B and R channel values, and combine the three to obtain the de-greened RGB pixel color, i.e., the pixel with the interference color removed.
In the following, an exemplary application of the embodiment of the present application in a practical application scenario will be described.
The embodiment of the application can be applied to various matting scenarios, such as live broadcast, star-accompaniment scenes, virtual production, video clip services, Virtual Reality (VR) video production, and interactive games (enabling a real person to interact with a virtual scene to provide an immersive picture effect).
Regarding live broadcast and star-accompaniment scenes, the green screen matting technology can achieve the picture effect shown in fig. 4A, in which the matted real person 401 is implanted into the virtual scene 402 for a better viewing experience.
With respect to virtual production, virtual production is a technique of mixing a real person with a virtual scene, which requires green screen matting. It should be noted that good green screen matting preserves the shadows cast by the human body on the green screen, achieving a better picture effect. Virtual production is used in variety shows, large-scale live broadcasts, and similar scenes to obtain striking and flexible picture effects, bringing a distinctive special-effects experience to live and on-demand broadcasting. As shown in fig. 4B, the matted real person 403 is implanted into the live scene 404.
Regarding video editing services, the green screen matting technology can be deployed as an online service for video and image matting and applied to various post-production edits.
In the related art, matting algorithms have various limitations. Although one related-art matting algorithm can accurately matte out a two-dimensional code green curtain and process 4K video in real time, it runs on a central processing unit (CPU, Central Processing Unit) and places a very high load on the CPU when processing 4K video. When applied to virtual-real fusion, other modules in the virtual-real fusion also need the CPU, so CPU resources are preempted and stuttering results. Other related-art green screen matting algorithms, and the matting algorithms in commercial software, require a very high-quality green screen and cannot matte out a two-dimensional code green curtain.
In order to solve the above problems, an embodiment of the present application provides a high-quality fast green screen matting algorithm (i.e., an image processing method) based on separated table lookup, which is easy to port to a shader, can process video at 4K or higher resolution in real time while occupying only a small amount of graphics processor (GPU, Graphics Processing Unit) resources, and is highly robust: it can matte out not only an ordinary green curtain but also a two-dimensional code green curtain, and can matte 1080P video in real time on a single CPU core.
It should be noted that the two-dimensional code green curtain is needed in virtual-real fusion, where it brings many advantages in convenience and cost; virtual-real fusion implants a real person into a virtual scene. Obtaining a high-quality virtual-real fusion effect requires support for camera movement. The two-dimensional codes on the green curtain provide accurate feature points in the real video, so the camera-movement information of the real video can be computed by a computer vision algorithm. Otherwise, a hardware locator is needed to acquire the camera-movement information, which adds hardware cost and requires a dedicated algorithm to synchronize the time difference between the real video and the hardware locator, degrading the virtual-real fusion effect.
The key point of the high-quality fast green screen matting algorithm based on separated table lookup provided by the embodiment of the present application is that acquiring the transparent channel image is accelerated by two one-dimensional table lookups, which are easy to implement in a GPU shader. It should be noted that, for ultra-high-resolution video such as 4K, 5K, 6K, or 8K, performing the operation in a shader instead of on the CPU avoids excessive CPU occupation.
The high-quality fast green screen matting algorithm based on separated table lookup according to an embodiment of the present application is described below with reference to fig. 5 and comprises 4 parts: color space conversion, one-dimensional table construction, table lookup to obtain the transparent channel, and post-processing. These 4 parts are detailed below:
1. Color space conversion.
The embodiment of the application adopts an HSY color space, which is obtained by conversion based on an HLS algorithm. For each pixel (i.e., RGB pixel to be processed) of the RGB image to be matted (i.e., the image to be processed), the following processing is performed to convert the RGB image into an image in the HSY color space; the conversion process is described with reference to fig. 6:
Step 11. Obtain the hue h and saturation s of each pixel using an RGB-to-HLS algorithm (for example, the RGB-to-HLS conversion in opencv).
Step 12. Obtain the brightness Y of each pixel using an RGB-to-black-and-white (grayscale) algorithm.
Step 13. Calculate the minimum value m between the brightness Y and 255 - Y.
Step 14. Scale the saturation s using m: multiply s by m and then divide by 255; the scaled s is the saturation value in the HSY color space.
Step 15. Use the obtained hue h of each pixel as the hue H in the HSY color space, the scaled saturation s as the saturation S, and Y as the Y in the HSY color space, to obtain the converted HSY pixel.
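As a minimal sketch of steps 11 to 15 (an illustration only, not the implementation of the embodiment itself), the conversion can be written in Python with OpenCV and NumPy; it assumes OpenCV's 8-bit conventions, in particular that cv2.COLOR_BGR2HLS_FULL maps hue to the range 0-255 so it can later index a table of length 256:

```python
import cv2
import numpy as np

def rgb_to_hsy(bgr):
    # Step 11: RGB-to-HLS gives hue h and saturation s per pixel.
    hls = cv2.cvtColor(bgr, cv2.COLOR_BGR2HLS_FULL)  # H in 0..255 with _FULL
    h = hls[..., 0]
    s = hls[..., 2].astype(np.float32)
    # Step 12: RGB-to-grayscale gives brightness Y per pixel.
    y = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY).astype(np.float32)
    # Step 13: minimum m between Y and 255 - Y.
    m = np.minimum(y, 255.0 - y)
    # Step 14: scale s by m, then divide by 255.
    s_scaled = s * m / 255.0
    # Step 15: assemble the (H, S, Y) pixels into the HSY image.
    return np.dstack([h, s_scaled.astype(np.uint8), y.astype(np.uint8)])
```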
2. One-dimensional table construction.
In this step, a one-dimensional < hue-transparency > table and a one-dimensional < saturation-transparency > table need to be created from the background hue range [h1, h2], the saturation minimum s0, the hue gradient width r1, and the saturation gradient width r2.
The table-building flow of the one-dimensional < hue-transparency > table is described below with reference to fig. 7:
Step 21. Initialize a one-dimensional table of length 256.
Step 22. For each element in the table, calculate its value according to formula f(x) to obtain the < hue-transparency > one-dimensional table: the element's subscript i (i.e., the hue value of a pixel) is substituted into f(x) of formula (1) to compute the element's value. Formula f(x) uses the matting parameters: the background hue range [h1, h2] and the hue gradient width r1. Here 0 represents fully transparent and 255 fully opaque.
The specific definition of the formula f(x) is given in formula (1).
As shown in fig. 8, the < hue-transparency > one-dimensional table is created by the formula f(x).
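Formula (1) itself is not reproduced in this text; assuming a linear ramp of width r1 on each side of the background hue range [h1, h2], consistent with the gradient effect of fig. 8, the table construction can be sketched as follows (0 fully transparent, 255 fully opaque):

```python
import numpy as np

def build_hue_table(h1, h2, r1):
    # Step 21: initialize a one-dimensional table of length 256.
    i = np.arange(256, dtype=np.float32)
    # Step 22: distance of each hue index from the background range [h1, h2];
    # zero inside the range (fully transparent), ramping to 255 over width r1.
    dist = np.maximum(np.maximum(h1 - i, i - h2), 0.0)
    return np.clip(dist * 255.0 / r1, 0.0, 255.0).astype(np.uint8)
```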
The table-building flow of the one-dimensional < saturation-transparency > table is described below in conjunction with fig. 9:
step 31, initializing a one-dimensional table with length of 256.
Step 32, for each element in the table, calculating a table look-up value according to formula g (x), to obtain a < saturation-transparency > one-dimensional table:
the subscript of the element is i (i.e. the saturation of the pixel), i is brought into g (x) using formula (2), the table look-up value of the element is calculated, and formula g (x) uses the matting parameters: saturation minimum s1 and saturation fade width r2. Where 0 represents transparency and 255 represents opacity.
The specific definition of the formula g(x) is given in formula (2).
As shown in fig. 10, the < saturation-transparency > one-dimensional table is created by the formula g(x).
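Formula (2) is likewise not reproduced here; the sketch below assumes that saturations at or above the minimum s0 are treated as background (fully transparent) and that the value ramps to fully opaque over the gradient width r2 below s0, consistent with fig. 10:

```python
import numpy as np

def build_sat_table(s0, r2):
    # Step 31: initialize a one-dimensional table of length 256.
    i = np.arange(256, dtype=np.float32)
    # Step 32: saturations >= s0 map to 0 (transparent background);
    # below s0 the value ramps up to 255 (opaque) over width r2.
    dist = np.maximum(s0 - i, 0.0)
    return np.clip(dist * 255.0 / r2, 0.0, 255.0).astype(np.uint8)
```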
3. Table lookup to obtain the transparent channel.
As shown in fig. 11, after the two one-dimensional tables are obtained, a table lookup operation can be performed in real time on each frame of image (i.e., an image in the HSY color space, abbreviated HSY image) to obtain the transparent channel value (i.e., transparency) of each pixel in the image (i.e., each HSY pixel to be processed).
For each HSY pixel to be processed, the following processing is performed to obtain its transparency a:
Step 41. Acquire the hue value h of the HSY pixel to be processed.
Step 42. Look up the one-dimensional < hue-transparency > table with h as the subscript to obtain the transparency a1.
Step 43. Acquire the saturation s of the HSY pixel to be processed.
Step 44. Look up the one-dimensional < saturation-transparency > table with s as the subscript to obtain the transparency a2.
Step 45. Take the maximum of the transparency a1 and the transparency a2 to obtain the transparency a of the HSY pixel to be processed.
It should be noted that the transparencies a of all the HSY pixels to be processed are combined to obtain the transparent channel image of the HSY image, that is, the transparent channel image of the RGB image to be matted.
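Per pixel, steps 41 to 45 reduce to two indexed loads and a maximum; a vectorized sketch, reusing the tables and the HSY image from the sketches above, might look like:

```python
import numpy as np

def alpha_from_hsy(hsy, hue_table, sat_table):
    h = hsy[..., 0]            # step 41: hue value h of each HSY pixel
    s = hsy[..., 1]            # step 43: scaled saturation s of each HSY pixel
    a1 = hue_table[h]          # step 42: look up <hue-transparency> with h
    a2 = sat_table[s]          # step 44: look up <saturation-transparency> with s
    return np.maximum(a1, a2)  # step 45: transparency a = max(a1, a2)
```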
It should be noted that obtaining the final transparency through two table lookups produces the gradient of transparency over hue h and saturation s shown in fig. 12A, which is essentially a two-dimensional table. However, directly looking up a two-dimensional table is slow (building the two-dimensional table requires matrix multiplication, which is time-consuming; and whereas a one-dimensional table is indexed from 0 to 255, a two-dimensional table is indexed from 0 to 65535, so lookups are slower), and the algorithm is hard to implement in a GPU shader (a shader computes per pixel and cannot perform matrix operations for a two-dimensional table). Therefore, looking up the table of fig. 12B for h, then the table of fig. 12C for s, and taking the maximum of the two values is completely equivalent to the two-dimensional lookup; this is the separated table lookup process.
4. Post-processing.
If the gray value of a pixel's green channel (G) is greater than the average of its red and blue channels, that pixel appears greenish. As shown in fig. 13A, the green channel (G) gray value of a pixel of the black trousers tinted green by the green curtain is greater than the average of the red and blue channels; as shown in fig. 13B, the green channel (G) gray value of a pixel of skin not tinted green by the green curtain is smaller than the average of the red and blue channels.
Therefore, if the green channel value of a greenish pixel is reduced to the average of the red and blue channels, it no longer appears greenish. Furthermore, to adjust the strength of the green removal, a parameter t is added to control how the red and blue channels are combined (t = 0.5 yields their average).
The overall flow of post-processing is described below in conjunction with fig. 14:
step 51, removing the main body greenish of the RGB channel caused by reflection of the green curtain: for each RGB pixel, the following processing is performed:
a) And calculating the minimum value and the maximum value of the R channel and the B channel, and respectively marking as j and k.
b) D, d=k-j is calculated.
c) The threshold T, t=j+t·d is calculated, where T is the de-greening coefficient, which can be set to 0.5.
d) The minimum of T and G channels is taken as the G channel value after green removal.
e) And combining the original B channel value serving as a B channel value after green removal and the original R channel value serving as an R channel value after green removal with the G channel value after green removal to obtain the RGB pixel color after green removal.
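A sketch of step 51 operating on a whole image at once (the BGR channel order is an assumption carried over from the OpenCV-based sketches above):

```python
import numpy as np

def remove_green_spill(bgr, t=0.5):
    b = bgr[..., 0].astype(np.float32)
    g = bgr[..., 1].astype(np.float32)
    r = bgr[..., 2].astype(np.float32)
    j = np.minimum(r, b)           # a) minimum of R and B
    k = np.maximum(r, b)           # a) maximum of R and B
    d = k - j                      # b) d = k - j
    T = j + t * d                  # c) threshold T = j + t*d
    out = bgr.copy()
    out[..., 1] = np.minimum(g, T).astype(np.uint8)  # d) clamp G to T
    return out                     # e) R and B keep their original values
```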
Step 52. Post-process the transparent channel.
Perform morphological processing such as expansion, erosion, and blurring on the transparent channel image to obtain the post-processed transparent channel image.
Step 53. Combine the de-greened RGB pixel colors with the post-processed transparent channel image to obtain an ARGB image.
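Steps 52 and 53 can be sketched with OpenCV morphology and channel merging; the kernel size and blur radius below are illustrative choices, not values specified by the embodiment:

```python
import cv2
import numpy as np

def postprocess_and_merge(alpha, bgr_despilled):
    # Step 52: morphological clean-up of the transparent channel.
    kernel = np.ones((3, 3), np.uint8)
    a = cv2.dilate(alpha, kernel)          # expansion
    a = cv2.erode(a, kernel)               # erosion
    a = cv2.GaussianBlur(a, (5, 5), 0)     # blurring
    # Step 53: merge the de-greened colors with the alpha (BGRA layout).
    b, g, r = cv2.split(bgr_despilled)
    return cv2.merge([b, g, r, a])
```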
It should be noted that the image used to compute the transparent channel need not be in the HSY color space; for example, three background parameters in YUV and a distance coefficient may be used instead.
As shown in fig. 15A, an embodiment of the present application extracts a target object 1502 from a green curtain image 1501; as shown in fig. 15B, the embodiment of the present application extracts the target object 1504 from the two-dimensional code green curtain image 1503.
In summary, the high-quality fast green screen matting algorithm based on separated table lookup provided by the embodiment of the application has the following beneficial effects: 1) avoiding the two-dimensional table lookup provides acceleration and is very convenient to implement in a GPU shader, so the matting algorithm can process 4K video in real time without excessive CPU or GPU occupation; when applied to virtual-real fusion it does not preempt hardware resources from the other algorithms that virtual-real fusion requires, avoiding stuttering, improving rendering quality, and improving the user experience; 2) after acceleration by the GPU shader, even 5K or 8K video can be processed, enabling an even higher-definition image quality experience.
The image processing method provided by the embodiment of the present application has been described so far in connection with exemplary applications and implementations of the electronic device provided by the embodiment of the present application. The embodiment of the application also provides an image processing device. In practical applications, the functional modules of the image processing device can be realized cooperatively by the hardware resources of an electronic device (such as a terminal or server), including computing resources such as a processor, communication resources (such as optical cable and cellular communication), and memory. Fig. 2 shows an image processing device 555 stored in a memory 550, which may be software in the form of programs and plug-ins, for example software modules designed in a programming language such as C/C++ or Java, application software designed in such languages, or dedicated software modules, application program interfaces, plug-ins, or cloud services in a large software system; the different implementations are exemplified below.
The image processing apparatus 555 includes a series of modules, including an acquisition module 5551, a construction module 5552, a query module 5553, and a processing module 5554. The following further describes an implementation scheme of the image processing device 555 according to the embodiment of the present application.
An obtaining module 5551, configured to obtain an image to be processed, where the image to be processed includes a background object and a target object; a construction module 5552, configured to construct a one-dimensional transparency lookup table of the matting parameters based on at least one matting parameter of the background object; the query module 5553 is configured to query a one-dimensional transparency query table of the matting parameter based on each pixel in the image to be processed, so as to obtain a transparent channel image corresponding to the image to be processed; and a processing module 5554, configured to perform matting processing based on the transparent channel image, so as to obtain a target image that removes the background object and includes the target object.
In some embodiments, the constructing module 5552 is further configured to determine a matting range corresponding to the background object based on the matting parameter of the background object; the transparency corresponding to the color parameter value included in the matting range is completely transparent; and constructing a one-dimensional transparency lookup table of the matting parameters based on the matting range corresponding to the background object, wherein the one-dimensional transparency lookup table comprises the corresponding relation between the color parameter values and the transparency.
In some embodiments, when the matting parameter is a hue matting parameter, the matting range corresponding to the background object includes a hue matting range; when the matting parameters are saturation matting parameters, the matting range corresponding to the background object comprises a saturation matting range; the construction module 5552 is further configured to determine a hue minimum value and a hue maximum value of the background object based on a hue matting parameter of the background object, and use the hue minimum value and the hue maximum value as endpoint values of the hue matting range; and determining a saturation minimum value and a saturation maximum value of the background object based on the saturation matting parameter of the background object, and taking the saturation minimum value and the saturation maximum value as endpoint values of the saturation matting range.
In some embodiments, the constructing module 5552 is further configured to determine an initial hue minimum value, an initial hue maximum value, and a hue gradient width of the background object based on the hue matting parameter of the background object; increasing the initial hue minimum value based on the hue gradient width to obtain a hue minimum value of the background object; reducing the initial hue maximum value based on the hue gradient width to obtain a hue maximum value of the background object; determining an initial saturation minimum value, an initial saturation maximum value and a saturation gradient width of the background object based on the saturation matting parameters of the background object; and increasing the initial saturation minimum value based on the saturation gradient width to obtain a saturation minimum value of the background object, and taking the initial saturation maximum value as the saturation maximum value of the background object.
In some embodiments, the building module 5552 is further configured to obtain a hue gradient width of the background object, and determine a hue gradient range based on the hue matting range and the hue gradient width, where a transparency corresponding to a color parameter value included in the hue gradient range is not completely transparent and not completely opaque; constructing a one-dimensional transparency lookup table of the hue matting parameters based on the hue matting range and the hue gradient range; acquiring a saturation gradient width of the background object, and determining a saturation gradient range based on the saturation matting range and the saturation gradient width, wherein the saturation gradient range comprises a transparency which is not completely transparent and not completely opaque and corresponds to a color parameter value; and constructing a one-dimensional transparency lookup table of the saturation matting parameter based on the saturation matting range and the saturation gradient range.
In some embodiments, the query module 5553 is further configured to perform the following processing for any pixel in the image to be processed: inquiring a one-dimensional transparency inquiry table of the matting parameter based on the color parameter of the pixel to obtain the transparency of the pixel; and combining the transparency of a plurality of pixels according to the position relation of the pixels in the image to be processed to obtain a transparent channel image corresponding to the image to be processed.
In some embodiments, when the matting parameters are plural, the query module 5553 is further configured to query a one-dimensional transparency query table of the matting parameters based on the color parameters of the pixels, to obtain transparency corresponding to the matting parameters; and taking the maximum value of the transparency corresponding to the matting parameters as the transparency of the pixel.
In some embodiments, the image to be processed is a color image; the construction module 5552 is further configured to convert the image to be processed into a candidate color space before the one-dimensional transparency lookup table of the matting parameter is queried based on the color parameter value of each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed, so as to obtain a first converted image of the image to be processed in the candidate color space; converting the image to be processed into a black-and-white space to obtain a second converted image of the image to be processed in the black-and-white space; fusing the first conversion image and the second conversion image to obtain a third conversion image corresponding to the image to be processed; the query module 5553 is further configured to query a one-dimensional transparency query table of the matting parameter based on a color parameter of each pixel in the third conversion image, so as to obtain a transparent channel image corresponding to the image to be processed.
In some embodiments, the color parameter of each pixel in the third converted image includes a hue parameter, a saturation parameter, and a brightness parameter; the constructing module 5552 is further configured to determine a saturation scaling factor based on the luminance parameter of each pixel in the second converted image; scaling the saturation parameter of each pixel in the first converted image based on the saturation scaling factor to obtain the scaled saturation parameter; and combining the brightness parameter of each pixel in the second conversion image, the saturation parameter after scaling and the hue parameter of each pixel in the first conversion image to obtain a third conversion image corresponding to the image to be processed.
In some embodiments, the processing module 5554 is further configured to perform morphological processing on the clear channel image to obtain a morphologically processed clear channel image; performing interference color removal processing on the image to be processed to obtain the image to be processed with the interference color removed; and carrying out fusion processing on the transparent channel image subjected to morphological processing and the image to be processed subjected to interference color removal to obtain a target image which is used for removing the background object and comprises the target object.
In some embodiments, the pixel value of each pixel in the image to be processed includes a first color value, a second color value, and an interference color value; the processing module 5554 is further configured to perform the following processing for any pixel in the image to be processed: determining an interference color reference value for the pixel based on the first color value and the second color value of the pixel; combining the interference color of the pixel with the minimum value of the interference color reference values of the pixel, the first color value of the pixel and the second color value to obtain the pixel with the interference color removed; and combining the pixels with the removed interference colors according to the position relation of the pixels in the image to be processed to obtain the image to be processed with the removed interference colors.
In some embodiments, the processing module 5554 is further configured to use an absolute value of a difference between the first color value and the second color value of the pixel as the difference between the first color value and the second color value; determining a minimum of the first color value and the second color value of the pixel; and adding the product of the difference and the interference color coefficient and the minimum value as an interference color reference value of the pixel.
In summary, the image processing device provided by the embodiment of the application can construct a one-dimensional transparency lookup table of each matting parameter through at least one matting parameter of a background object in an image to be processed, quickly inquire the one-dimensional transparency lookup table based on each pixel in the image to be processed, and perform matting processing on the image to be processed based on a transparent channel image corresponding to the image to be processed, thereby fully and effectively utilizing the one-dimensional transparency lookup table to realize quick matting, improving matting efficiency and saving a large amount of computing resources.
Embodiments of the present application provide a computer program product including executable instructions stored on a computer-readable storage medium. The processor of an electronic device reads the executable instructions from the computer-readable storage medium and executes them, causing the electronic device to execute the image processing method according to the embodiment of the present application.
Embodiments of the present application provide a computer-readable storage medium storing executable instructions that, when executed by a processor, cause the processor to perform an image processing method provided by embodiments of the present application, for example, the image processing method shown in fig. 3A to 3C.
In some embodiments, the computer-readable storage medium may be FRAM, ROM, PROM, EPROM, EEPROM, flash memory, magnetic surface memory, optical disk, or CD-ROM, or may be any device including one or any combination of the above memories.
In some embodiments, the executable instructions may be in the form of programs, software modules, scripts, or code, written in any form of programming language (including compiled or interpreted languages, or declarative or procedural languages), and they may be deployed in any form, including as stand-alone programs or as modules, components, subroutines, or other units suitable for use in a computing environment.
As an example, the executable instructions may, but need not, correspond to files in a file system, may be stored as part of a file that holds other programs or data, for example, in one or more scripts in a hypertext markup language (HTML, hyper Text Markup Language) document, in a single file dedicated to the program in question, or in multiple coordinated files (e.g., files that store one or more modules, sub-programs, or portions of code).
As an example, executable instructions may be deployed to be executed on one electronic device or on multiple electronic devices located at one site or, alternatively, on multiple electronic devices distributed across multiple sites and interconnected by a communication network.
The foregoing is merely exemplary embodiments of the present application and is not intended to limit the scope of the present application. Any modification, equivalent replacement, improvement, etc. made within the spirit and scope of the present application are included in the protection scope of the present application.

Claims (16)

1. An image processing method, the method comprising:
acquiring an image to be processed, wherein the image to be processed comprises a background object and a target object;
constructing a one-dimensional transparency lookup table of the matting parameters based on at least one matting parameter of the background object;
inquiring a one-dimensional transparency inquiry table of the matting parameter based on each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed;
and carrying out matting processing based on the transparent channel image to obtain a target image which is used for removing the background object and comprises the target object.
2. A method as in claim 1 wherein constructing a one-dimensional transparency look-up table of matting parameters based on at least one matting parameter of the background object comprises:
Determining a matting range corresponding to the background object based on the matting parameters of the background object;
the transparency corresponding to the color parameter value included in the matting range is completely transparent;
and constructing a one-dimensional transparency lookup table of the matting parameters based on the matting range corresponding to the background object, wherein the one-dimensional transparency lookup table comprises the corresponding relation between the color parameter values and the transparency.
3. The method of claim 2, wherein
when the matting parameters are hue matting parameters, the matting range corresponding to the background object comprises a hue matting range;
when the matting parameters are saturation matting parameters, the matting range corresponding to the background object comprises a saturation matting range;
the determining the matting range corresponding to the background object based on the matting parameters of the background object comprises the following steps:
based on the hue matting parameters of the background object, determining hue minimum and hue maximum of the background object, and taking the hue minimum and hue maximum as endpoint values of the hue matting range;
and determining a saturation minimum value and a saturation maximum value of the background object based on the saturation matting parameter of the background object, and taking the saturation minimum value and the saturation maximum value as endpoint values of the saturation matting range.
4. The method of claim 3, wherein
the determining the hue minimum value and the hue maximum value of the background object based on the hue matting parameter of the background object comprises the following steps:
determining an initial hue minimum value, an initial hue maximum value and a hue gradient width of the background object based on the hue matting parameters of the background object;
increasing the initial hue minimum value based on the hue gradient width to obtain a hue minimum value of the background object;
reducing the initial hue maximum value based on the hue gradient width to obtain a hue maximum value of the background object;
the determining the saturation minimum value and the saturation maximum value of the background object based on the saturation matting parameter of the background object comprises the following steps:
determining an initial saturation minimum value, an initial saturation maximum value and a saturation gradient width of the background object based on the saturation matting parameters of the background object;
and increasing the initial saturation minimum value based on the saturation gradient width to obtain a saturation minimum value of the background object, and taking the initial saturation maximum value as the saturation maximum value of the background object.
5. A method according to claim 3, wherein constructing a one-dimensional transparency lookup table of the matting parameters based on the matting range corresponding to the background object comprises:
acquiring the hue gradient width of the background object, and determining a hue gradient range based on the hue matting range and the hue gradient width, wherein the hue gradient range comprises transparency corresponding to a color parameter value which is not completely transparent and not completely opaque;
constructing a one-dimensional transparency lookup table of the hue matting parameters based on the hue matting range and the hue gradient range;
acquiring a saturation gradient width of the background object, and determining a saturation gradient range based on the saturation matting range and the saturation gradient width, wherein the saturation gradient range comprises a transparency which is not completely transparent and not completely opaque and corresponds to a color parameter value;
and constructing a one-dimensional transparency lookup table of the saturation matting parameter based on the saturation matting range and the saturation gradient range.
6. The method of claim 1, wherein the querying the one-dimensional transparency lookup table of the matting parameter based on each pixel in the image to be processed to obtain the transparent channel image corresponding to the image to be processed includes:
The following processing is performed for any pixel in the image to be processed:
inquiring a one-dimensional transparency inquiry table of the matting parameter based on the color parameter of the pixel to obtain the transparency of the pixel;
and combining the transparency of a plurality of pixels according to the position relation of the pixels in the image to be processed to obtain a transparent channel image corresponding to the image to be processed.
7. A method as in claim 6 wherein when the matting parameter is plural, the querying a one-dimensional transparency look-up table of the matting parameter based on the color parameters of the pixel to obtain the transparency of the pixel comprises:
inquiring one-dimensional transparency inquiry tables of a plurality of the matting parameters based on the color parameters of the pixels respectively to obtain transparency corresponding to the matting parameters;
and taking the maximum value of the transparency corresponding to the matting parameters as the transparency of the pixel.
8. The method of claim 1, wherein
the image to be processed is a color image;
the method further comprises the steps of:
Converting the image to be processed into a candidate color space to obtain a first converted image of the image to be processed in the candidate color space;
converting the image to be processed into a black-and-white space to obtain a second converted image of the image to be processed in the black-and-white space;
fusing the first conversion image and the second conversion image to obtain a third conversion image corresponding to the image to be processed;
the step of inquiring the one-dimensional transparency inquiry table of the matting parameter based on each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed comprises the following steps:
and inquiring a one-dimensional transparency inquiry table of the matting parameter based on the color parameter of each pixel in the third conversion image to obtain a transparent channel image corresponding to the image to be processed.
9. The method of claim 8, wherein
the color parameters of each pixel in the third conversion image comprise hue parameters, saturation parameters and brightness parameters;
the fusing the first conversion image and the second conversion image to obtain a third conversion image corresponding to the image to be processed comprises the following steps:
Determining a saturation scaling factor based on a luminance parameter of each pixel in the second converted image;
scaling the saturation parameter of each pixel in the first converted image based on the saturation scaling factor to obtain the scaled saturation parameter;
and combining the brightness parameter of each pixel in the second conversion image, the saturation parameter after scaling and the hue parameter of each pixel in the first conversion image to obtain a third conversion image corresponding to the image to be processed.
10. The method of claim 1, wherein the performing matting based on the transparent channel image to obtain a target image that removes the background object and includes the target object includes:
carrying out morphological processing on the transparent channel image to obtain the transparent channel image after morphological processing;
performing interference color removal processing on the image to be processed to obtain the image to be processed with the interference color removed;
and carrying out fusion processing on the transparent channel image subjected to morphological processing and the image to be processed subjected to interference color removal to obtain a target image which is used for removing the background object and comprises the target object.
11. The method of claim 10, wherein
the pixel value of each pixel in the image to be processed comprises a first color value, a second color value and an interference color value;
the step of performing interference color removal processing on the image to be processed to obtain the image to be processed with the interference color removed, includes:
the following processing is performed for any pixel in the image to be processed:
determining an interference color reference value for the pixel based on the first color value and the second color value of the pixel;
combining the interference color of the pixel with the minimum value of the interference color reference values of the pixel, the first color value of the pixel and the second color value to obtain the pixel with the interference color removed;
and combining the pixels with the removed interference colors according to the position relation of the pixels in the image to be processed to obtain the image to be processed with the removed interference colors.
12. The method of claim 11, wherein the determining the interference color reference value for the pixel based on the first color value and the second color value of the pixel comprises:
taking the absolute value of the difference between the first color value and the second color value of the pixel as the difference between the first color value and the second color value;
Determining a minimum of the first color value and the second color value of the pixel;
and adding the product of the difference and the interference color coefficient and the minimum value as an interference color reference value of the pixel.
13. An image processing apparatus, characterized in that the apparatus comprises:
the device comprises an acquisition module, a processing module and a processing module, wherein the acquisition module is used for acquiring an image to be processed, and the image to be processed comprises a background object and a target object;
the construction module is used for constructing a one-dimensional transparency lookup table of the matting parameters based on at least one matting parameter of the background object;
the query module is used for querying a one-dimensional transparency query table of the matting parameter based on each pixel in the image to be processed to obtain a transparent channel image corresponding to the image to be processed;
and the processing module is used for carrying out matting processing based on the transparent channel image to obtain a target image which is used for removing the background object and comprises the target object.
14. An electronic device, the electronic device comprising:
a memory for storing executable instructions;
a processor for implementing the image processing method of any one of claims 1 to 12 when executing executable instructions stored in said memory.
15. A computer readable storage medium storing executable instructions for implementing the image processing method of any one of claims 1 to 12 when executed by a processor.
16. A computer program product comprising executable instructions which when executed by a processor implement the image processing method of any of claims 1 to 12.
CN202210571068.XA 2022-05-24 2022-05-24 Image processing method, apparatus, device, storage medium, and program product Pending CN117152171A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210571068.XA CN117152171A (en) 2022-05-24 2022-05-24 Image processing method, apparatus, device, storage medium, and program product


Publications (1)

Publication Number Publication Date
CN117152171A true CN117152171A (en) 2023-12-01

Family

ID=88897386



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination