CN113240599A - Image toning method and device, computer-readable storage medium and electronic equipment


Info

Publication number: CN113240599A
Application number: CN202110505184.7A
Authority: CN (China)
Prior art keywords: image, processed, information, pixel value, feature
Other languages: Chinese (zh)
Inventor: 汪路超
Current Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee: Guangdong Oppo Mobile Telecommunications Corp Ltd
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority application: CN202110505184.7A
Legal status: Pending

Classifications

    • G06T5/90: Image enhancement or restoration; dynamic range modification of images or parts thereof
    • G06T7/90: Image analysis; determination of colour characteristics
    • G06V10/56: Extraction of image or video features relating to colour
    • G06T2207/10024: Indexing scheme for image analysis or image enhancement; image acquisition modality; color image

Abstract

The disclosure relates to the technical field of image processing, and provides an image toning method and device, a computer-readable storage medium and an electronic device, wherein the method comprises the following steps: acquiring the statistical characteristics of an image to be processed and the statistical characteristics of a reference image; generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image; and performing toning processing on the image to be processed according to the target mapping relation. Because the target mapping relation is dynamically generated from the statistical characteristics of the image to be processed and the reference image, the generated target mapping relation is specific to those images, and toning the image to be processed according to the target mapping relation improves the accuracy of image processing.

Description

Image toning method and device, computer-readable storage medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image toning method, an image toning apparatus, a computer-readable storage medium, and an electronic device.
Background
With the development of image processing technology, a single image can be adjusted into multiple styles, so that an image with the same content can be presented in a variety of styles.
In the prior art, toning is often performed with a look-up table (LUT). However, LUT tables are made manually by designers, usually one LUT table per style, and different images to be processed are toned with the same LUT table. This method of toning with a static LUT table is not targeted to the individual image, so the toning effect is poor.
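For context, such a static 3D LUT is applied as a pure table lookup, identical for every input image. The following is a minimal NumPy sketch; the 33-node grid size and nearest-node lookup are illustrative assumptions, not taken from this disclosure:

```python
import numpy as np

def apply_static_lut(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a 33 x 33 x 33 x 3 LUT to an H x W x 3 uint8 image (nearest node)."""
    # Map each 0..255 channel value to the nearest of the 33 LUT grid nodes.
    idx = np.rint(image.astype(np.float32) / 255.0 * 32).astype(np.int64)
    return lut[idx[..., 0], idx[..., 1], idx[..., 2]].astype(np.uint8)

# An "identity" LUT: every grid node maps to its own RGB value, so the output
# approximates the input, quantized onto the 33-node grid.
grid = np.linspace(0, 255, 33)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
identity_lut = np.stack([r, g, b], axis=-1)

image = np.random.randint(0, 256, (360, 640, 3), dtype=np.uint8)
toned = apply_static_lut(image, identity_lut)
```

Whatever style the designer baked into the table, the same lookup is applied to every image, which is precisely the lack of targeting discussed above.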
Disclosure of Invention
The present disclosure is directed to an image toning method, an image toning apparatus, a computer-readable storage medium, and an electronic device, so as to overcome, at least to some extent, the lack of image-specific targeting in the prior art.
According to a first aspect of the present disclosure, there is provided an image toning method including: acquiring the statistical characteristics of an image to be processed and the statistical characteristics of a reference image; generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image; and performing toning processing on the image to be processed according to the target mapping relation.
According to a second aspect of the present disclosure, there is provided an image toning device including: a feature acquisition module for acquiring the statistical characteristics of an image to be processed and the statistical characteristics of a reference image; a mapping generation module for generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image; and a toning processing module for performing toning processing on the image to be processed according to the target mapping relation.
According to a third aspect of the present disclosure, there is provided a computer readable medium, on which a computer program is stored, which when executed by a processor, implements the image toning method as described in the above embodiments.
According to a fourth aspect of the present disclosure, there is provided an electronic device comprising: one or more processors; a storage device for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image toning method as described in the above embodiments.
According to the above technical solutions, the image toning method, the image toning apparatus, the computer-readable storage medium, and the electronic device in the exemplary embodiments of the disclosure have at least the following advantages and positive effects:
the image toning method identifies text information corresponding to voice information and acquires an entity fragment corresponding to the text information; firstly, acquiring the statistical characteristics of an image to be processed and the statistical characteristics of a reference image; then, generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image; and finally, carrying out color matching processing on the image to be processed according to the target mapping relation. According to the image toning method, the target mapping relation is dynamically generated through the statistical characteristics of the image to be processed and the reference image, the generated target mapping relation is targeted, toning processing is carried out on the image to be processed according to the target mapping relation, and the accuracy of image processing is improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty.
FIG. 1 schematically illustrates a schematic diagram of a system architecture of the present exemplary embodiment;
fig. 2 schematically shows a schematic view of an electronic device of the present exemplary embodiment;
FIG. 3 schematically shows a flow diagram of an image toning method according to an embodiment of the present disclosure;
FIG. 4 schematically shows a flowchart of a method for obtaining statistical characteristics of an image to be processed according to an embodiment of the present disclosure;
FIG. 5 schematically illustrates a flowchart of a method of generating a target mapping relationship according to an embodiment of the present disclosure;
FIG. 6 is a schematic diagram illustrating a flowchart of a method for obtaining feature information corresponding to an input pixel value according to an embodiment of the disclosure;
FIG. 7 schematically illustrates a flowchart of a method of generating a target mapping relationship according to an embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow diagram of an image toning method according to a specific embodiment of the present disclosure;
fig. 9 schematically shows a block diagram of an image toning device according to an embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to give a thorough understanding of embodiments of the disclosure. One skilled in the relevant art will recognize, however, that the subject matter of the present disclosure can be practiced without one or more of the specific details, or with other methods, components, devices, steps, and so forth. In other instances, well-known methods, devices, implementations, or operations have not been shown or described in detail to avoid obscuring aspects of the disclosure.
The block diagrams shown in the figures are functional entities only and do not necessarily correspond to physically separate entities. I.e. these functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor means and/or microcontroller means.
The flow charts shown in the drawings are merely illustrative and do not necessarily include all of the contents and operations/steps, nor do they necessarily have to be performed in the order described. For example, some operations/steps may be decomposed, and some operations/steps may be combined or partially combined, so that the actual execution sequence may be changed according to the actual situation.
In the related art in this field, a designer manually tunes LUT tables of different styles; a LUT table of a target style is selected to perform LUT mapping on the image to be processed, so as to obtain an output image of the target style corresponding to the image to be processed. However, a manually tuned LUT table is static and not extensible, can only be used as a fixed filter, and is poorly targeted when toning different images to be processed.
Based on the problems in the related art, the embodiments of the present disclosure provide an image toning method, which is applied to the system architecture of the exemplary embodiments of the present disclosure. Fig. 1 shows a schematic diagram of a system architecture of an exemplary embodiment of the present disclosure. As shown in fig. 1, the system architecture 100 may include: terminal 110, network 120, and server 130. The terminal 110 may be any of various electronic devices with image acquisition functions, including but not limited to a mobile phone, a tablet computer, a personal computer, a smart wearable device, and the like. The medium used by network 120 to provide a communication link between terminal 110 and server 130 may include various connection types, such as wired or wireless communication links, or fiber optic cables. It should be understood that the numbers of terminals, networks, and servers in fig. 1 are merely illustrative; there may be any number of each, as required by an implementation. For example, the server 130 may be a server cluster composed of a plurality of servers.
The image toning method provided by the embodiment of the disclosure can be executed by the terminal 110, for example, the terminal 110 acquires the image to be processed and the reference image, generates the target mapping relationship according to the statistical characteristics of the image to be processed and the reference image, and performs toning on the image to be processed according to the target mapping relationship.
In addition, the image toning method provided by the embodiment of the disclosure may also be executed by the server 130, for example, after the terminal 110 acquires the image to be processed and the reference image, the image to be processed and the reference image are uploaded to the server 130, so that the server 130 generates a target mapping relationship according to the statistical characteristics of the acquired image to be processed and the reference image, performs toning on the image to be processed according to the target mapping relationship, and returns the image to be processed after toning to the terminal 110, which is not limited by the disclosure.
An exemplary embodiment of the present disclosure provides an electronic device for implementing an image toning method, which may be the terminal 110 or the server 130 in fig. 1. The electronic device includes at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image toning method via execution of the executable instructions.
The electronic device may be implemented in various forms, and may include, for example, mobile devices such as a mobile phone, a tablet computer, a notebook computer, a Personal Digital Assistant (PDA), a navigation device, a wearable device, and an unmanned aerial vehicle, as well as stationary devices such as a desktop computer and a smart television.
The following takes the mobile terminal 200 in fig. 2 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 2 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also interface differently than shown in fig. 2, or a combination of multiple interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a USB interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display screen 290, a camera module 291, an indicator 292, a motor 293, a button 294, a Subscriber Identity Module (SIM) card interface 295, and the like. The sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, a barometric pressure sensor 2804, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by the gyro sensor 2803. The gyro sensor 2803 can be used for photographing anti-shake, navigation, somatosensory gaming scenarios, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
The image toning method and the image toning apparatus according to the exemplary embodiments of the present disclosure are specifically described below. Fig. 3 shows a flow diagram of an image toning method, which, as shown in fig. 3, comprises at least the following steps:
step S310: acquiring the statistical characteristics of an image to be processed and the statistical characteristics of a reference image;
step S320: generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image;
step S330: and performing toning processing on the image to be processed according to the target mapping relation.
According to the image toning method, the target mapping relation is dynamically generated through the statistical characteristics of the image to be processed and the reference image, so the generated target mapping relation is specific to those images; performing toning processing on the image to be processed according to the target mapping relation therefore improves the accuracy of the toning processing.
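To make the three steps concrete before they are detailed below, here is a deliberately simplified NumPy sketch in which the feature extraction model is taken to be the identity, so the statistical characteristics reduce to the per-channel mean and standard deviation of the raw RGB values. This identity assumption is purely illustrative; the disclosure extracts features with a convolutional network, as described later:

```python
import numpy as np

def tone_image(image: np.ndarray, reference: np.ndarray) -> np.ndarray:
    """Steps S310-S330 with identity feature extraction (illustrative only)."""
    x = image.astype(np.float32)
    y = reference.astype(np.float32)
    mu_x, sigma_x = x.mean(axis=(0, 1)), x.std(axis=(0, 1)) + 1e-6  # step S310
    mu_y, sigma_y = y.mean(axis=(0, 1)), y.std(axis=(0, 1))         # step S310
    # Steps S320/S330: with identity features the mapping reduces to
    # formula (1) (given later) applied directly to every pixel value.
    out = (x - mu_x) / sigma_x * sigma_y + mu_y
    return np.clip(out, 0, 255).astype(np.uint8)
```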
In order to make the technical solution of the present disclosure clearer, each step of the image toning method is explained next.
In step S310, the statistical features of the image to be processed are acquired, and the statistical features of the reference image are acquired.
In an exemplary embodiment of the present disclosure, the image to be processed refers to the target image on which image toning is performed. An image input by a user may be acquired as the image to be processed; the image to be processed may also be captured by an image acquisition unit, or drawn with image editing software.
In addition, the reference image refers to a source image for providing style information during image toning. For example, the reference image may be an image having a vintage style, or may be an image having a warm tone or a cool tone, which is not particularly limited in the present disclosure.
For example, based on an image A and an image B, the image toning process outputs a target image C having the content of image A and the style of image B. In this case, image A can be considered the image to be processed and image B the reference image.
In an exemplary embodiment of the present disclosure, before obtaining the statistical features of the image to be processed, the image to be processed is scaled to obtain a processed image to be processed, where a resolution of the processed image to be processed is smaller than a resolution of the image to be processed.
Specifically, the image to be processed is scaled by a scaling function to obtain an image to be processed with a smaller resolution; for example, it may be scaled by a resize scaling function. In this way, an image with a resolution of 1080 × 1920 can be scaled to a resolution of 640 × 360. Of course, the scaling factor may be any factor, and this disclosure is not limited in this respect.
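As a concrete illustration, the scaling step might look as follows with OpenCV; the choice of OpenCV and of the INTER_AREA filter are assumptions, since the disclosure only requires some resize function:

```python
import cv2

image = cv2.imread("to_be_processed.jpg")  # e.g. 1920 x 1080 pixels
# Downscale before feature extraction to reduce computation.
small = cv2.resize(image, (640, 360), interpolation=cv2.INTER_AREA)
```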
In an exemplary embodiment of the present disclosure, the statistical features include feature mean information and feature standard deviation information. The feature mean information comprises a feature mean corresponding to feature information of the image to be processed, and the feature standard deviation information comprises a feature standard deviation corresponding to the feature information of the image to be processed.
Specifically, fig. 4 is a schematic flowchart of a method for acquiring statistical characteristics of an image to be processed, and as shown in fig. 4, the flow at least includes steps S410 to S420, which are described in detail as follows:
in step S410, the image to be processed is input into the feature extraction model to obtain feature information corresponding to the image to be processed.
In an exemplary embodiment of the present disclosure, the feature extraction model may be a Convolutional Neural Network (CNN). A CNN is a hierarchical model whose input is raw data, such as an RGB image or raw audio data. Through the stacking of a series of operations such as convolution, pooling, and non-linear activation function mapping, the convolutional neural network extracts high-level semantic information from the raw data input layer by layer; this process is a feed-forward operation.
For example, the feature extraction model may be a convolutional neural network with 1 × 1 convolution kernels, which expands the RGB channels of each pixel point in the input image to be processed into multi-channel feature information. For example, the pixel information of the RGB channels may be converted into feature information of 64 channels; the specific structure of the convolutional neural network is not specifically limited by the present disclosure.
Specifically, an image to be processed is input into a convolutional neural network, and feature extraction is performed on the image to be processed through the convolutional neural network, so that feature information of the image to be processed is obtained. In addition, the image to be processed after the scaling processing can be input into a convolutional neural network to obtain the characteristic information of the image to be processed.
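A minimal PyTorch sketch of such an extractor follows; the framework choice and the use of a single 1 × 1 convolution are illustrative assumptions, and only the 64-channel width comes from the example above:

```python
import torch
import torch.nn as nn

# Feature extraction model: a 1 x 1 convolution lifting the 3 RGB channels
# of every pixel to 64 feature channels, independently per pixel.
feature_extractor = nn.Conv2d(in_channels=3, out_channels=64, kernel_size=1)

image = torch.rand(1, 3, 360, 640)    # scaled image to be processed, NCHW
features = feature_extractor(image)   # feature information, shape (1, 64, 360, 640)
```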
In step S420, feature mean information and feature standard deviation information of the image to be processed are calculated according to the feature information of the image to be processed.
In the exemplary embodiment of the present disclosure, after the feature information of the image to be processed is acquired, a mean value and a standard deviation corresponding to the feature information of the image to be processed are calculated, and the mean value and the standard deviation corresponding to the feature information of the image to be processed are configured as statistical features of the image to be processed.
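Continuing the same sketch, step S420 might compute the statistics per feature channel over all spatial positions; per-channel reduction is an assumption, consistent with the instance-normalization-style formula (1) given later:

```python
import torch

features = torch.rand(1, 64, 360, 640)  # feature information of the image
mu_x = features.mean(dim=(2, 3))        # feature mean information, shape (1, 64)
sigma_x = features.std(dim=(2, 3))      # feature standard deviation information
```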
In an exemplary embodiment of the present disclosure, feature extraction may be performed on a reference image through a feature extraction model to obtain feature information of the reference image, a mean value and a standard deviation corresponding to the feature information of the reference image are calculated, and the mean value and the standard deviation corresponding to the feature information of the reference image are configured as statistical features of the reference image. The feature extraction model may be the same as or different from the feature extraction model for obtaining the feature information of the image to be processed, and this disclosure does not specifically limit this.
The feature extraction may be performed on the reference image in advance, the feature information of the reference image may be obtained, and the statistical feature of the reference image may be calculated according to the feature information of the reference image. The statistical characteristics corresponding to the reference image are stored in the database, and the statistical characteristics of the reference image can be directly obtained from the database. For example, if a certain style image is selected to perform toning on an image to be processed, statistical features corresponding to the style image are obtained in a database.
In addition, the feature extraction model can be used for extracting the features of the reference image in real time and acquiring the statistical features of the reference image in real time.
In step S320, a target mapping relationship corresponding to the image to be processed is generated according to the statistical features of the image to be processed and the statistical features of the reference image.
In an exemplary embodiment of the present disclosure, the target mapping relationship may be a LUT table, an index number of the LUT table being an input pixel value, and an index value of the LUT table being an output pixel value. The input pixel values and the output pixel values each include values of RGB channels, specifically including pixel values composed of a red channel R, a green channel G, and a blue channel B.
In an exemplary embodiment of the present disclosure, the statistical features of the image to be processed and the statistical features of the reference image may be stored in the ADAIN module, and the target mapping relationship may be generated by the ADAIN module. Fig. 5 is a schematic flowchart of a method for generating a target mapping relationship, and as shown in fig. 5, the flowchart at least includes steps S510 to S520, which are described in detail as follows:
in step S510, an input pixel value is acquired, and feature extraction is performed on the input pixel value to obtain feature information corresponding to the input pixel value.
In an exemplary embodiment of the present disclosure, the input pixel values may be a set of pixel values set in advance. For example, the red channel R, the green channel G, and the blue channel B may each take pixel values of 0 to 255, giving 256 × 256 × 256 input pixel values such as (0,0,0), (0,0,1), (0,1,0), (1,0,0), ..., (0,0,255), (0,255,0), (255,0,0), ..., (255,255,255). For another example, the input pixel values may be a coarser grid of 33 × 33 × 33 values such as (0,0,0), (0,0,8), (0,8,0), (8,0,0), ..., (0,0,32), (0,32,0), (32,0,0), ..., (0,0,255), (0,255,0), (255,0,0), ..., (255,255,255).
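One way to build the 33 × 33 × 33 grid of preset input pixel values is sketched below; the use of evenly spaced nodes (approximately steps of 8, ending at 255) is an illustrative assumption:

```python
import numpy as np

# 33 evenly spaced node values per channel (about a step of 8, ending at 255).
grid = np.linspace(0, 255, 33)
r, g, b = np.meshgrid(grid, grid, grid, indexing="ij")
input_pixels = np.stack([r, g, b], axis=-1)    # shape: (33, 33, 33, 3)
print(input_pixels.reshape(-1, 3).shape)       # (35937, 3) input pixel values
```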
In addition, the input pixel value can also be used for acquiring a pixel value corresponding to the image to be processed as the input pixel value by traversing the image to be processed. Specifically, fig. 6 is a schematic flow chart of a method for acquiring feature information corresponding to an input pixel value, and as shown in fig. 6, the flow chart at least includes step S610 to step S630, which is described in detail as follows:
in step S610, the image to be processed is traversed, a pixel value corresponding to the image to be processed is obtained, and the pixel value of the image to be processed is configured as an input pixel value.
In the exemplary embodiment of the present disclosure, all pixel points in the image to be processed are traversed, pixel values corresponding to all pixel points are obtained, and the pixel values corresponding to all pixel points are all used as input pixel values.
In addition, the pixel values of all the pixel points of the image to be processed may be traversed to obtain the pixel range of the pixel values in the image to be processed, and the input pixel values may be configured according to the pixel range corresponding to the image to be processed. For example, if the pixel range of the pixel points in the image to be processed in the red channel R, the green channel G and the blue channel B is 132 to 255, the input pixel values are configured as (0,0,132), (0,132,0), (132,0,0), ..., (0,0,255), (0,255,0), (255,0,0), ..., (255,255,255) and the like.
In step S620, the input pixel values are matrix-transformed to obtain image information corresponding to the input pixel values.
In an exemplary embodiment of the present disclosure, the input pixel values are input into a reshape function and matrix-transformed to obtain image information corresponding to the input pixel values. The image information may include W × H RGB channel values, where W and H may be any positive integers whose product equals the number of input pixel values. For example, the 33 × 33 × 33 input pixel values may be input into a reshape function and matrix-transformed into 33 × 1089 image information.
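Read this way, the transformation is a plain reshape of the grid into a two-dimensional "image" that the feature extraction model can consume. NumPy is shown as an illustrative stand-in for the reshape function:

```python
import numpy as np

input_pixels = np.zeros((33, 33, 33, 3), dtype=np.float32)  # the preset grid
# 33 * 33 * 33 = 35937 = 33 * 1089 entries, laid out as a 33 x 1089 RGB "image".
image_info = input_pixels.reshape(33, 1089, 3)
```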
In step S630, the image information is input into the feature extraction model to obtain feature information corresponding to the image information.
In an exemplary embodiment of the present disclosure, feature extraction is performed on image information through a feature extraction model to obtain feature information corresponding to the image information, that is, feature information corresponding to an input pixel value.
Further, an initial LUT table of 33 × 33 × 33 entries may be obtained, whose index numbers are the 33 × 33 × 33 input pixel values, and the 33 × 33 × 33 input pixel values may include: (0,0,0), (0,0,8), (0,8,0), (8,0,0), ..., (0,0,255), (0,255,0), (255,0,0), ..., (255,255,255). The index values of the initial LUT table are 33 × 33 × 33 initial output pixel values, which may be any pixel values; this disclosure does not specifically limit this.
The 33 × 33 × 33 initial LUT table is input into a reshape function for matrix transformation to obtain 33 × 1089 image information, and the image information corresponding to the initial LUT table is input into the feature extraction model to obtain the feature information corresponding to the initial LUT table.
Continuing to refer to fig. 5, in step S520, a target mapping relationship corresponding to the image to be processed is generated according to the feature information of the input pixel values, the statistical features of the image to be processed, and the statistical features of the reference image.
In an exemplary embodiment of the present disclosure, output feature information corresponding to feature information of an input pixel value is calculated according to the feature information corresponding to the input pixel value, the statistical feature of the image to be processed, and the statistical feature of the reference image, and the output feature information is input into a feature mapping model to obtain an output pixel value.
Specifically, fig. 7 is a schematic flowchart of a method for generating a target mapping relationship, and as shown in fig. 7, the flowchart at least includes steps S710 to S730, which are described in detail as follows:
in step S710, output feature information corresponding to the feature information of the input pixel value is acquired according to the feature information of the input pixel value, the statistical feature of the image to be processed, and the statistical feature of the reference image.
In an exemplary embodiment of the present disclosure, the output characteristic information is calculated according to the following formula (1), and the formula (1) is as follows:
P = ((s - μ(x)) / σ(x)) × σ(y) + μ(y)    (1)
where P denotes output feature information, s denotes feature information of an input pixel value, μ (x) denotes feature mean information of an image to be processed, σ (x) denotes feature standard deviation information of the image to be processed, μ (y) denotes feature mean information of a reference image, and σ (y) denotes feature standard deviation information of the reference image.
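Formula (1) is the standard adaptive instance normalization (AdaIN) transform: features are normalized with the statistics of the image to be processed and re-styled with those of the reference image. A short sketch follows; the epsilon term is an added numerical-stability assumption, not part of formula (1):

```python
import torch

def adain(s, mu_x, sigma_x, mu_y, sigma_y, eps=1e-6):
    """Formula (1): P = ((s - mu(x)) / sigma(x)) * sigma(y) + mu(y)."""
    return (s - mu_x) / (sigma_x + eps) * sigma_y + mu_y

# Broadcasting example: features of the input pixel values are (1, 64, 33, 1089),
# while each statistic is a per-channel vector shaped (1, 64, 1, 1).
s = torch.rand(1, 64, 33, 1089)
mu_x, sigma_x, mu_y, sigma_y = (torch.rand(1, 64, 1, 1) for _ in range(4))
p = adain(s, mu_x, sigma_x, mu_y, sigma_y)      # output feature information
```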
In step S720, the output feature information is feature-mapped to obtain an output pixel value corresponding to the output feature information.
In an exemplary embodiment of the present disclosure, the output feature information is input into the feature mapping model to obtain an output pixel value corresponding to the output feature information. The feature mapping model may be a convolutional neural network that projects the multi-channel output feature information back down to the RGB channels, obtaining the output pixel values of the RGB channels.
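Mirroring the extractor sketched earlier, the feature mapping model might again be a single 1 × 1 convolution, now from 64 channels back to 3; this structure is an illustrative assumption:

```python
import torch
import torch.nn as nn

# Feature mapping model: a 1 x 1 convolution projecting the 64 output feature
# channels back down to the 3 RGB channels.
feature_mapper = nn.Conv2d(in_channels=64, out_channels=3, kernel_size=1)

output_features = torch.rand(1, 64, 33, 1089)    # output feature information
output_pixels = feature_mapper(output_features)  # shape (1, 3, 33, 1089)
```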
In step S730, a target mapping relationship is generated from the input pixel values and the output pixel values.
In an exemplary embodiment of the present disclosure, a target mapping relationship is created for an input pixel value and an output pixel value to which the input pixel value corresponds. The target mapping relationship may be in the form of an index, where an input pixel value is used as an index number, and an output pixel value corresponding to the input pixel value is used as an index value corresponding to the index number.
After the feature information corresponding to the initial LUT table is obtained, the output feature information corresponding to the initial LUT table is obtained according to the above formula (1), and feature mapping is performed on the output feature information by the feature mapping model to obtain the output pixel values corresponding to the initial LUT table. The initial output pixel values in the initial LUT table are then replaced with these output pixel values to obtain the target LUT table.
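Putting the pieces together, the construction of the target LUT table might look as follows. Untrained 1 × 1 convolutions and random statistics stand in for the real feature extraction model, feature mapping model, and image statistics; everything here is an illustrative assumption about shapes and data flow, not the disclosed training setup:

```python
import torch
import torch.nn as nn

extract = nn.Conv2d(3, 64, kernel_size=1)   # feature extraction model (sketch)
mapper = nn.Conv2d(64, 3, kernel_size=1)    # feature mapping model (sketch)

# Initial LUT table: the 33 x 33 x 33 input pixel grid, normalized to [0, 1]
# and reshaped into a 33 x 1089 image in NCHW layout.
grid = torch.linspace(0.0, 1.0, 33)
lut_img = torch.stack(torch.meshgrid(grid, grid, grid, indexing="ij"), dim=-1)
lut_img = lut_img.reshape(1, 33, 1089, 3).permute(0, 3, 1, 2)

# Statistics of the image to be processed (x) and the reference image (y);
# random placeholders here, computed from real feature maps in practice.
mu_x, sigma_x = torch.rand(1, 64, 1, 1), torch.rand(1, 64, 1, 1) + 0.5
mu_y, sigma_y = torch.rand(1, 64, 1, 1), torch.rand(1, 64, 1, 1) + 0.5

s = extract(lut_img)                           # feature information
p = (s - mu_x) / sigma_x * sigma_y + mu_y      # formula (1)
target_lut = mapper(p).permute(0, 2, 3, 1).reshape(33, 33, 33, 3)
```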
In step S330, toning processing is performed on the image to be processed according to the target mapping relation.
In an exemplary embodiment of the disclosure, the pixel value of each pixel point in the image to be processed is obtained, the pixel value of each pixel point is matched against the input pixel values in the target mapping relation, the output pixel value corresponding to the matched input pixel value is obtained, and that output pixel value replaces the pixel value of the corresponding pixel point, so as to obtain the target image after toning processing.
If no input pixel value in the target mapping relation equals the pixel value of a pixel point in the image to be processed, two or more input pixel values closest to that pixel value are obtained, and an interpolation operation is performed on them to obtain the output pixel value corresponding to the pixel point.
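With a 33 × 33 × 33 LUT, a common concrete choice for this interpolation is trilinear interpolation over the eight surrounding grid entries. The NumPy sketch below is one such reading; trilinear interpolation is an assumption, since the disclosure only requires some interpolation over the nearest input pixel values:

```python
import numpy as np

def apply_lut_trilinear(image: np.ndarray, lut: np.ndarray) -> np.ndarray:
    """Apply a 33 x 33 x 33 x 3 target LUT, trilinearly interpolating between
    the 8 surrounding grid entries when a pixel value is not a grid node."""
    pos = image.astype(np.float32) / 255.0 * 32           # continuous grid coords
    lo = np.clip(np.floor(pos).astype(np.int64), 0, 31)   # lower grid index
    frac = pos - lo                                       # fractional offset
    out = np.zeros(image.shape, dtype=np.float32)
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = (np.where(dr, frac[..., 0], 1 - frac[..., 0])
                     * np.where(dg, frac[..., 1], 1 - frac[..., 1])
                     * np.where(db, frac[..., 2], 1 - frac[..., 2]))
                out += w[..., None] * lut[lo[..., 0] + dr,
                                          lo[..., 1] + dg,
                                          lo[..., 2] + db]
    return np.clip(out, 0, 255).astype(np.uint8)

lut = np.random.randint(0, 256, (33, 33, 33, 3)).astype(np.float32)
image = np.random.randint(0, 256, (360, 640, 3), dtype=np.uint8)
toned = apply_lut_trilinear(image, lut)
```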
Fig. 8 is a schematic flowchart of an image toning method according to the present embodiment, and as shown in fig. 8, the flowchart at least includes steps S810 to S890, which are described in detail as follows:
in step S810, an image to be processed is obtained, and the image to be processed is scaled to obtain a processed image to be processed.
The resolution of the image to be processed is 1080 × 1920, and it is scaled to obtain an image with a resolution of 640 × 360.
In step S820, feature extraction is performed on the processed image to be processed through the feature extraction model to obtain feature information of the image to be processed, and feature mean information and feature standard deviation information corresponding to the feature information of the image to be processed are obtained.
In step S830, feature extraction is performed on the reference image through the feature extraction model to obtain feature information of the reference image, and feature mean information and feature standard deviation information corresponding to the feature information of the reference image are obtained.
In step S840, the feature mean information and the feature standard deviation information of the image to be processed, and the feature mean information and the feature standard deviation information of the reference image are stored in the ADAIN model.
Here, an ADAIN model is constructed according to formula (1) in the above embodiment, the input of the ADAIN model is the feature information of the input pixel value, and the output of the ADAIN model is the output feature information.
In step S850, an initial LUT table is obtained, the input pixel values in the initial LUT table are subjected to matrix transformation to obtain image information corresponding to the initial LUT table, and feature information corresponding to the image information of the initial LUT table is obtained by a feature extraction model.
The initial LUT table includes input pixel values and initial output pixel values, and the input pixel values are a preset number, 33 × 33 × 33, of RGB pixel values.
In step S860, the feature information corresponding to the initial LUT table is input into the ADAIN model, and the output feature information corresponding to the initial LUT table is obtained according to formula (1) by the ADAIN model.
In step S870, the output feature information corresponding to the initial LUT table is input into the feature mapping model to obtain an output pixel value corresponding to the output feature information, and the initial output pixel value in the initial LUT table is replaced according to the output pixel value to obtain a target LUT table.
In step S880, the pixel value of each pixel point in the image to be processed is obtained, and the pixel value of each pixel point is used as an input pixel value to perform indexing in the target LUT table, so as to obtain an output pixel value corresponding to each pixel point.
In step S890, a target image after toning processing is generated from the output pixel values corresponding to the respective pixel points.
In this specific embodiment of the present disclosure, the resolution of the image to be processed is scaled from 1080 × 1920 down to 640 × 360, so the amount of computation is reduced from 1080 × 1920 = 2,073,600 pixels to 640 × 360 + 33 × 33 × 33 = 230,400 + 35,937 = 266,337, roughly an order of magnitude. In addition, experiments verify that reducing an image of any resolution to 360P causes no obvious loss of precision. Therefore, an image to be processed with a larger resolution obtains a larger speed-up ratio.
In the image toning method of this embodiment, the target mapping relation is generated according to the statistical characteristics of the image to be processed and the reference image, and toning processing is then performed on the image to be processed according to the target mapping relation. On one hand, the method dynamically generates the target mapping relation from the differences between the image to be processed and the reference image; since the generated target mapping relation is specific to the image to be processed and the reference image, the toning accuracy is higher. On the other hand, the resolution of the image to be processed is reduced by scaling, and the target mapping relation is generated from the statistical information of the processed image to be processed, which greatly reduces the amount of computation and the system consumption of generating the target mapping relation.
Those skilled in the art will appreciate that all or part of the steps implementing the above embodiments are implemented as computer programs executed by a CPU. The computer program, when executed by the CPU, performs the functions defined by the method provided by the present invention. The program may be stored in a computer readable storage medium, which may be a read-only memory, a magnetic or optical disk, or the like.
Furthermore, it should be noted that the above-mentioned figures are only schematic illustrations of the processes involved in the method according to exemplary embodiments of the invention, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Embodiments of the disclosed apparatus are described below, which can be used to perform the above-described image toning methods of the present disclosure. For details that are not disclosed in the embodiments of the apparatus of the present disclosure, please refer to the embodiments of the image toning method described above in the present disclosure.
Fig. 9 schematically shows a block diagram of an image toning apparatus according to one embodiment of the present disclosure.
Referring to fig. 9, an image toning device 900 according to an embodiment of the present disclosure includes: a feature acquisition module 901, a mapping generation module 902, and a toning processing module 903. Specifically:
a feature obtaining module 901, configured to obtain statistical features of an image to be processed and obtain statistical features of a reference image;
a mapping generating module 902, configured to generate a target mapping relationship corresponding to the image to be processed according to the statistical features of the image to be processed and the statistical features of the reference image;
and a toning processing module 903, used for performing toning processing on the image to be processed according to the target mapping relation.
In an exemplary embodiment of the present disclosure, the mapping generating module 902 may be further configured to obtain an input pixel value, and perform feature extraction on the input pixel value to obtain feature information corresponding to the input pixel value; and generating a target mapping relation corresponding to the image to be processed according to the characteristic information of the input pixel value, the statistical characteristic of the image to be processed and the statistical characteristic of the reference image.
In an exemplary embodiment of the present disclosure, the mapping generating module 902 may be further configured to obtain output feature information corresponding to the feature information of the input pixel value according to the feature information of the input pixel value, the statistical feature of the image to be processed, and the statistical feature of the reference image; performing feature mapping on the output feature information to obtain an output pixel value corresponding to the output feature information; and generating a target mapping relation according to the input pixel value and the output pixel value.
In an exemplary embodiment of the disclosure, the mapping generation module 902 may be further configured to calculate the output characteristic information according to the following formula (1), where the formula (1) is as follows:
P = ((s - μ(x)) / σ(x)) × σ(y) + μ(y)    (1)
where P denotes output feature information, s denotes feature information of an input pixel value, μ (x) denotes feature mean information of an image to be processed, σ (x) denotes feature standard deviation information of the image to be processed, μ (y) denotes feature mean information of a reference image, and σ (y) denotes feature standard deviation information of the reference image. The statistical characteristics comprise characteristic mean value information and characteristic standard deviation information.
In an exemplary embodiment of the present disclosure, the mapping generating module 902 may be further configured to traverse the image to be processed, obtain a pixel value corresponding to the image to be processed, and configure the pixel value of the image to be processed as an input pixel value; performing matrix transformation on the input pixel values to obtain image information corresponding to the input pixel values; and inputting the image information into the feature extraction model to obtain feature information corresponding to the image information.
In an exemplary embodiment of the present disclosure, the feature obtaining module 901 may be further configured to input the image to be processed into the feature extraction model, so as to obtain feature information corresponding to the image to be processed; and calculating the characteristic mean value information and the characteristic standard deviation information of the image to be processed according to the characteristic information of the image to be processed.
In an exemplary embodiment of the disclosure, the image toning device 900 may further include a scaling module (not shown in the figure) for scaling the image to be processed to obtain a processed image to be processed, where a resolution of the processed image to be processed is smaller than a resolution of the image to be processed.
The specific details of each module in the image toning device have been described in detail in the embodiments of the image toning method; for details not disclosed here, refer to those embodiments, which are therefore not repeated.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 to 8 may be performed.
Exemplary embodiments of the present disclosure also provide a program product for implementing the above method, which may employ a portable compact disc read only memory (CD-ROM) and include program code, and may be run on a terminal device, such as a personal computer. However, the program product of the present disclosure is not limited thereto, and in this document, a readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device.
The readable medium may be a readable signal medium or a readable storage medium. A readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples (a non-exhaustive list) of the readable storage medium include: an electrical connection having one or more wires, a portable disk, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
A computer readable signal medium may include a propagated data signal with readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A readable signal medium may also be any readable medium that is not a readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including object-oriented programming languages such as Java and C++, as well as conventional procedural programming languages such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user's computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (for example, through the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.

Claims (10)

1. An image toning method, comprising:
acquiring the statistical features of an image to be processed and the statistical features of a reference image;
generating a target mapping relation corresponding to the image to be processed according to the statistical features of the image to be processed and the statistical features of the reference image;
and performing color matching processing on the image to be processed according to the target mapping relation.
2. The image toning method according to claim 1, wherein generating a target mapping relation corresponding to the image to be processed according to the statistical features of the image to be processed and the statistical features of the reference image comprises:
acquiring an input pixel value, and performing feature extraction on the input pixel value to acquire feature information corresponding to the input pixel value;
and generating a target mapping relation corresponding to the image to be processed according to the characteristic information of the input pixel value, the statistical characteristic of the image to be processed and the statistical characteristic of the reference image.
3. The image toning method according to claim 2, wherein generating a target mapping relation corresponding to the image to be processed according to the feature information of the input pixel value, the statistical features of the image to be processed, and the statistical features of the reference image comprises:
acquiring output feature information corresponding to the feature information of the input pixel value according to the feature information of the input pixel value, the statistical features of the image to be processed and the statistical features of the reference image;
performing feature mapping on the output feature information to obtain an output pixel value corresponding to the output feature information;
and generating the target mapping relation according to the input pixel value and the output pixel value.
4. The image toning method according to claim 3, wherein the statistical features include feature mean information and feature standard deviation information;
acquiring the output feature information corresponding to the feature information of the input pixel value according to the feature information of the input pixel value, the statistical features of the image to be processed and the statistical features of the reference image comprises:
calculating the output feature information according to the following formula (1):
P = σ(y) · (s − μ(x)) / σ(x) + μ(y)    (1)
where P denotes the output feature information, s denotes the feature information of the input pixel value, μ(x) denotes the feature mean information of the image to be processed, σ(x) denotes the feature standard deviation information of the image to be processed, μ(y) denotes the feature mean information of the reference image, and σ(y) denotes the feature standard deviation information of the reference image.
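Read as plain text, formula (1) is a channel-wise statistic-matching rule: the feature of the input pixel value is normalized by the statistics of the image to be processed and then re-scaled and re-centered with the statistics of the reference image. As a non-limiting sketch, assuming the feature information is held in NumPy arrays (the function name and the epsilon guard against division by zero are choices made here for illustration, not part of the disclosure):

    import numpy as np

    def match_statistics(s, mu_x, sigma_x, mu_y, sigma_y, eps=1e-6):
        # Formula (1): remove the statistics of the image to be processed,
        # then apply the statistics of the reference image.
        return sigma_y * (s - mu_x) / (sigma_x + eps) + mu_y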
5. The image toning method according to claim 2, wherein before acquiring the statistical features of the image to be processed, the method further comprises:
and scaling the image to be processed to obtain a scaled image, wherein the resolution of the scaled image is smaller than that of the original image to be processed.
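A minimal sketch of this pre-scaling step, assuming OpenCV is available; the quarter-resolution factor is illustrative only, not a value taken from the disclosure:

    import cv2

    def downscale(image, factor=0.25):
        # Statistics are computed on a low-resolution copy; the target
        # mapping relation derived from it is later applied to the
        # full-resolution image to be processed.
        return cv2.resize(image, None, fx=factor, fy=factor,
                          interpolation=cv2.INTER_AREA)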
6. The image toning method according to claim 5, wherein acquiring an input pixel value and performing feature extraction on the input pixel value to obtain the feature information corresponding to the input pixel value comprises:
traversing the image to be processed, acquiring the pixel values of the image to be processed, and configuring the pixel values as the input pixel values;
performing matrix transformation on the input pixel values to obtain image information corresponding to the input pixel values;
and inputting the image information into a feature extraction model to obtain feature information corresponding to the image information.
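A hedged sketch of claim 6's two steps, assuming the matrix transformation arranges the traversed pixel values into an N x 3 matrix and that the feature extraction model is any callable mapping that matrix to per-pixel features (both assumptions of this illustration):

    import numpy as np

    def extract_pixel_features(image, model):
        # Matrix transformation: flatten an H x W x 3 image into an
        # (H*W, 3) matrix of input pixel values ("image information").
        pixels = image.reshape(-1, 3).astype(np.float32) / 255.0
        # Feature extraction model: maps (N, 3) pixel values to
        # (N, C) feature information.
        return model(pixels)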
7. The image toning method according to claim 6, wherein acquiring the statistical features of the image to be processed comprises:
inputting the image to be processed into the feature extraction model to obtain feature information corresponding to the image to be processed;
and calculating the feature mean information and the feature standard deviation information of the image to be processed according to the feature information of the image to be processed.
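Assuming the feature extraction model returns a feature map of shape (C, H, W), the feature mean information and feature standard deviation information reduce to per-channel statistics over all spatial positions; a sketch under that assumption:

    import numpy as np

    def feature_statistics(features):
        # features: (C, H, W) output of the feature extraction model.
        flat = features.reshape(features.shape[0], -1)
        # Per-channel mean and standard deviation over all positions.
        return flat.mean(axis=1), flat.std(axis=1)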
8. An image toning apparatus, comprising:
the feature acquisition module is used for acquiring the statistical features of the image to be processed and the statistical features of the reference image;
the mapping generation module is used for generating a target mapping relation corresponding to the image to be processed according to the statistical characteristics of the image to be processed and the statistical characteristics of the reference image;
and the color matching processing module is used for performing color matching processing on the image to be processed according to the target mapping relation.
9. A computer-readable storage medium on which a computer program is stored, wherein the program, when executed by a processor, implements the image toning method according to any one of claims 1 to 7.
10. An electronic device, comprising:
one or more processors;
storage means for storing one or more programs which, when executed by the one or more processors, cause the one or more processors to implement the image toning method as recited in any one of claims 1 to 7.
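As a non-limiting end-to-end illustration of claims 1 to 3: once every possible input pixel value has been mapped to an output pixel value, the target mapping relation is simply a lookup table, and toning the full-resolution image is one table lookup per pixel. In the sketch below, transform stands in for the feature-extraction, statistic-matching and feature-mapping chain of claims 2 to 4; it is an assumption of this illustration, not the disclosed model:

    import numpy as np

    def build_lut(transform):
        # Evaluate the input -> output pixel mapping once for every
        # possible 8-bit value; the table is the target mapping relation.
        inputs = np.arange(256, dtype=np.float32) / 255.0
        outputs = np.asarray([transform(v) for v in inputs])
        return np.clip(outputs * 255.0, 0.0, 255.0).astype(np.uint8)

    def tone_image(image, luts):
        # Apply one lookup table per colour channel to a uint8 image.
        out = np.empty_like(image)
        for c in range(image.shape[-1]):
            out[..., c] = luts[c][image[..., c]]
        return out

Because each table has only 256 entries per channel, the possibly expensive transform runs a fixed number of times regardless of image resolution, which is the practical reason for generating a mapping rather than converting every pixel directly.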
CN202110505184.7A 2021-05-10 2021-05-10 Image toning method and device, computer-readable storage medium and electronic equipment Pending CN113240599A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110505184.7A CN113240599A (en) 2021-05-10 2021-05-10 Image toning method and device, computer-readable storage medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN113240599A 2021-08-10

Family

ID=77133247

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110505184.7A Pending CN113240599A (en) 2021-05-10 2021-05-10 Image toning method and device, computer-readable storage medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN113240599A (en)

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180308269A1 (en) * 2017-04-24 2018-10-25 Intel Corporation Hdr enhancement with temporal multiplex
CN109285112A (en) * 2018-09-25 2019-01-29 京东方科技集团股份有限公司 Image processing method neural network based, image processing apparatus
US20210049468A1 (en) * 2018-11-14 2021-02-18 Nvidia Corporation Generative adversarial neural network assisted video reconstruction
CN109754375A (en) * 2018-12-25 2019-05-14 广州华多网络科技有限公司 Image processing method, system, computer equipment, storage medium and terminal
CN111583165A (en) * 2019-02-19 2020-08-25 京东方科技集团股份有限公司 Image processing method, device, equipment and storage medium
CN110070124A (en) * 2019-04-15 2019-07-30 广州小鹏汽车科技有限公司 A kind of image amplification method and system based on production confrontation network
CN110222722A (en) * 2019-05-14 2019-09-10 华南理工大学 Interactive image stylization processing method, calculates equipment and storage medium at system
CN110473141A (en) * 2019-08-02 2019-11-19 Oppo广东移动通信有限公司 Image processing method, device, storage medium and electronic equipment
CN111489322A (en) * 2020-04-09 2020-08-04 广州光锥元信息科技有限公司 Method and device for adding sky filter to static picture

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114449199A (en) * 2021-08-12 2022-05-06 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium

Similar Documents

Publication Publication Date Title
CN112989904B (en) Method for generating style image, method, device, equipment and medium for training model
CN112562019A (en) Image color adjusting method and device, computer readable medium and electronic equipment
CN111369427B (en) Image processing method, image processing device, readable medium and electronic equipment
WO2022068451A1 (en) Style image generation method and apparatus, model training method and apparatus, device, and medium
CN111950570B (en) Target image extraction method, neural network training method and device
CN111866483B (en) Color restoration method and device, computer readable medium and electronic device
CN112581635B (en) Universal quick face changing method and device, electronic equipment and storage medium
US20240119082A1 (en) Method, apparatus, device, readable storage medium and product for media content processing
JP2022517463A (en) Coding methods and their devices, equipment and computer programs
WO2023072015A1 (en) Method and apparatus for generating character style image, device, and storage medium
CN111833242A (en) Face transformation method and device, electronic equipment and computer readable medium
CN113744286A (en) Virtual hair generation method and device, computer readable medium and electronic equipment
CN112785669B (en) Virtual image synthesis method, device, equipment and storage medium
CN113284206A (en) Information acquisition method and device, computer readable storage medium and electronic equipment
CN113240599A (en) Image toning method and device, computer-readable storage medium and electronic equipment
CN110619602B (en) Image generation method and device, electronic equipment and storage medium
CN112967193A (en) Image calibration method and device, computer readable medium and electronic equipment
WO2023140787A2 (en) Video processing method and apparatus, and electronic device, storage medium and program product
CN114418835A (en) Image processing method, apparatus, device and medium
CN113409204A (en) Method and device for optimizing image to be processed, storage medium and electronic equipment
CN114554226A (en) Image processing method and device, electronic equipment and storage medium
CN113936089A (en) Interface rendering method and device, storage medium and electronic equipment
CN113537470A (en) Model quantization method and device, storage medium and electronic device
CN113362260A (en) Image optimization method and device, storage medium and electronic equipment
CN113010728A (en) Song recommendation method, system, intelligent device and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination