CN112562019A - Image color adjusting method and device, computer readable medium and electronic equipment - Google Patents

Image color adjusting method and device, computer readable medium and electronic equipment

Info

Publication number
CN112562019A
CN112562019A (application CN202011550049.6A)
Authority
CN
China
Prior art keywords
color
image
mapping table
color mapping
original
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011550049.6A
Other languages
Chinese (zh)
Inventor
徐宏
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202011550049.6A
Publication of CN112562019A
Legal status: Pending

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/90 Determination of colour characteristics
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/04 Architecture, e.g. interconnection topology
    • G06N 3/045 Combinations of networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00 Computing arrangements based on biological models
    • G06N 3/02 Neural networks
    • G06N 3/08 Learning methods
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20081 Training; Learning
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/20 Special algorithmic details
    • G06T 2207/20084 Artificial neural networks [ANN]

Abstract

The disclosure provides an image color adjusting method and device, a computer readable medium and an electronic device, and relates to the technical field of image processing. The method comprises the following steps: acquiring an original image and a preset basic color mapping table; inputting the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image; adjusting the basic color mapping table according to the color weight value to obtain a target color mapping table; and adjusting the color value in the original image through the target color mapping table to obtain an enhanced target image. The method and the device can avoid the problem of poor color adjustment effect caused by separation of scene identification and color adjustment, and improve the color display effect of the original image.

Description

Image color adjusting method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an image color adjustment method, an image color adjustment apparatus, a computer-readable medium, and an electronic device.
Background
As living standards continue to improve, the color effect of images shot by portable shooting equipment (such as a smart phone) receives more and more attention.
Currently, in the related art, when the color effect of a captured image is enhanced for display, a professional generally designs different color mapping schemes for images of different scene types; the scene type of the captured image is then identified, and color adjustment is performed according to the color mapping scheme matched with that scene type. However, in this technical scheme, professionals are required to design a different color mapping scheme for each image scene, so the generation efficiency is low and the large number of color mapping schemes occupies considerable memory space; moreover, the identified scene type may fail to match a suitable color mapping scheme, so the resulting color accuracy is poor and the color adjustment effect is poor.
Disclosure of Invention
The present disclosure is directed to an image color adjustment method, an image color adjustment apparatus, a computer-readable medium, and an electronic device, so as to avoid, at least to a certain extent, the problem of poor image color adjustment effect caused by separation of scene recognition and color adjustment.
According to a first aspect of the present disclosure, there is provided an image color adjustment method, including:
acquiring an original image and a preset basic color mapping table;
inputting the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image;
adjusting the basic color mapping table according to the color weight value to obtain a target color mapping table;
and adjusting the color value in the original image through the target color mapping table to obtain an enhanced target image.
According to a second aspect of the present disclosure, there is provided an image color adjustment apparatus comprising:
the image acquisition module is used for acquiring an original image and a preset basic color mapping table;
the color weight value generation module is used for inputting the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image;
the color mapping table adjusting module is used for adjusting the basic color mapping table according to the color weight value to obtain a target color mapping table;
and the image color adjusting module is used for adjusting the color value in the original image through the target color mapping table to obtain an enhanced target image.
According to a third aspect of the present disclosure, a computer-readable medium is provided, on which a computer program is stored, which computer program, when being executed by a processor, is adapted to carry out the above-mentioned method.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
a processor; and
a memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the above-described method.
The image color adjusting method provided by the embodiments of the disclosure obtains an original image and a preset basic color mapping table, and inputs the original image into a pre-trained image adjustment model to obtain color weight values corresponding to the original image; the basic color mapping table is adjusted according to the color weight values to obtain a target color mapping table; and the color values in the original image are adjusted through the target color mapping table to obtain the enhanced target image. On one hand, the original image is identified by the image adjustment model to obtain the color weight values, the initialized basic color mapping table is adjusted according to the color weight values to obtain an adaptive target color mapping table, and the original image is adjusted according to the target color mapping table; since no corresponding color mapping table needs to be designed manually for each different original image, the generation efficiency of the target color mapping table is effectively improved, and the memory space occupied by a large number of manually generated color mapping tables is reduced. On the other hand, because the target color mapping table is obtained through adjustment with the color weight values predicted by the image adjustment model, the match between the target color mapping table and the original image is ensured, the accuracy of the target color mapping table is improved, and the color display effect of the target image is enhanced.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure. It is to be understood that the drawings in the following description are merely exemplary of the disclosure, and that other drawings may be derived from those drawings by one of ordinary skill in the art without the exercise of inventive faculty. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a method of image color adjustment in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart for generating a base color map in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flow chart for training an image adjustment model in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart for training an image adjustment model by way of supervised learning in an exemplary embodiment of the present disclosure;
FIG. 7 is a diagram schematically illustrating a model framework for training an image adjustment model by way of supervised learning in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart for training an image adjustment model by way of unsupervised learning in an exemplary embodiment of the present disclosure;
fig. 9 schematically shows a composition diagram of an image color adjustment apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. Example embodiments may, however, be embodied in many different forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus their repetitive description will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in the form of software, or in one or more hardware modules or integrated circuits, or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 is a schematic diagram illustrating a system architecture of an exemplary application environment to which an image color adjustment method and apparatus according to an embodiment of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few. The terminal devices 101, 102, 103 may be various electronic devices having an image processing function, including but not limited to desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, server 105 may be a server cluster comprised of multiple servers, or the like.
The image color adjustment method provided by the embodiment of the present disclosure is generally executed by the terminal devices 101, 102, 103, and accordingly, the image color adjustment apparatus is generally disposed in the terminal devices 101, 102, 103. However, it is easily understood by those skilled in the art that the image color adjustment method provided in the present disclosure may also be executed by the server 105, and accordingly, the image color adjustment apparatus may also be disposed in the server 105, which is not particularly limited in the present exemplary embodiment. For example, in an exemplary embodiment, the user may upload the captured original image to the server 105 through the terminal devices 101, 102, and 103, and after the server generates the target image by the image color adjustment method provided in the embodiments of the present disclosure, the target image is transmitted to the terminal devices 101, 102, and 103, and the like for display.
An exemplary embodiment of the present disclosure provides an electronic device for implementing an image color adjustment method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the image color adjustment method via execution of the executable instructions.
The following takes the mobile terminal 200 in fig. 2 as an example, and exemplifies the configuration of the electronic device. It will be appreciated by those skilled in the art that the configuration of figure 2 can also be applied to fixed type devices, in addition to components specifically intended for mobile purposes. In other embodiments, mobile terminal 200 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is only schematically illustrated and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also interface differently than shown in fig. 2, or a combination of multiple interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: a processor 210, an internal memory 221, an external memory interface 222, a Universal Serial Bus (USB) interface 230, a charging management module 240, a power management module 241, a battery 242, an antenna 1, an antenna 2, a mobile communication module 250, a wireless communication module 260, an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, a sensor module 280, a display 290, a camera module 291, an indicator 292, a motor 293, a button 294, and a Subscriber Identity Module (SIM) card interface 295. Wherein the sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyroscope sensor 2803, and the like.
Processor 210 may include one or more processing units, such as: the Processor 210 may include an Application Processor (AP), a modem Processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband Processor, and/or a Neural-Network Processing Unit (NPU), and the like. The different processing units may be separate devices or may be integrated into one or more processors.
The NPU is a Neural-Network (NN) computing processor, which processes input information quickly by using a biological Neural Network structure, for example, by using a transfer mode between neurons of a human brain, and can also learn by itself continuously. The NPU can implement applications such as intelligent recognition of the mobile terminal 200, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
A memory is provided in the processor 210. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transmission instructions, and notification instructions, and execution is controlled by processor 210.
The charge management module 240 is configured to receive a charging input from a charger. The power management module 241 is used for connecting the battery 242, the charging management module 240 and the processor 210. The power management module 241 receives the input of the battery 242 and/or the charging management module 240, and supplies power to the processor 210, the internal memory 221, the display screen 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein, the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including wireless communication of 2G/3G/4G/5G, etc. applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the Wireless communication module 260 may provide a solution for Wireless communication including a Wireless Local Area Network (WLAN) (e.g., a Wireless Fidelity (Wi-Fi) network), Bluetooth (BT), and the like, applied to the mobile terminal 200. In some embodiments, antenna 1 of the mobile terminal 200 is coupled to the mobile communication module 250 and antenna 2 is coupled to the wireless communication module 260, such that the mobile terminal 200 may communicate with networks and other devices via wireless communication techniques.
The mobile terminal 200 implements a display function through the GPU, the display screen 290, the application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and an application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or alter display information.
The mobile terminal 200 may implement a photographing function through the ISP, the camera module 291, the video codec, the GPU, the display screen 290, the application processor, and the like. The ISP is used for processing data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; the video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the mobile terminal 200. The external memory card communicates with the processor 210 through the external memory interface 222 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 221 may be used to store computer-executable program code, which includes instructions. The internal memory 221 may include a program storage area and a data storage area. The storage program area may store an operating system, an application program (such as a sound playing function, an image playing function, etc.) required by at least one function, and the like. The storage data area may store data (e.g., audio data, a phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory, such as at least one magnetic disk Storage device, a Flash memory device, a Universal Flash Storage (UFS), and the like. The processor 210 executes various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement an audio function through the audio module 270, the speaker 271, the receiver 272, the microphone 273, the earphone interface 274, the application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided to the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal and convert the pressure signal into an electrical signal. In some embodiments, the pressure sensor 2802 may be disposed on the display screen 290. Pressure sensor 2802 can be of a wide variety, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of the mobile terminal 200 about three axes (i.e., x, y, and z axes) may be determined by the gyroscope sensor 2803. The gyro sensor 2803 can be used to photograph anti-shake, navigation, body-feel game scenes, and the like.
In addition, other functional sensors, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity light sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices for providing auxiliary functions may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, and the like, and a user can generate key signal inputs related to user settings and function control of the mobile terminal 200 through key inputs. Further examples include indicator 292, motor 293, SIM card interface 295, etc.
In the related art, a professional experienced in the field of photography first uses an image design tool to manually adjust color parameter values such as brightness, contrast, basic saturation, and color brightness for representative images of different types of scenes, so as to obtain color mapping schemes for the different scene types. A corresponding color mapping scheme is then matched according to the scene recognition result of an image to realize image color adjustment: the color mapping scheme, such as a color lookup table (3D LUT), converts the input values of the red, green, and blue channels of the image into new output values, thereby transforming the image color effect, and stylized adjustment of each scene is realized through the color mapping. However, in this technical solution, the scene recognition result depends strongly on the image and cannot cover all types of scenes, because similar scenes in some images are not easily distinguished; and because each color mapping scheme targets images of one specific scene category, a large number of color mapping schemes is required to cover images of various scene types, which not only has low generation efficiency but also occupies a large amount of memory resources. In addition, since the training of the scene recognition classifier and the design of the color mapping schemes are relatively independent, if the scene classification is wrong, the selected color mapping scheme may result in poor color performance of the image.
The following specifically describes an image color adjustment method according to an exemplary embodiment of the present disclosure, taking an example in which a terminal device executes the image color adjustment method.
Fig. 3 shows a flowchart of an image color adjustment method in the present exemplary embodiment, which may include the following steps S310 to S340:
in step S310, an original image and a preset basic color mapping table are obtained.
In an exemplary embodiment, the original image may be an image that needs to be color-adjusted, for example, the original image may be an unprocessed image captured by the image capturing unit, or an image that needs to be color-adjusted and received through data transmission, or of course, an image that needs to be color-adjusted and obtained in another manner, such as an unprocessed image obtained from a database, which is not particularly limited in this exemplary embodiment.
The basic color mapping table may refer to an unprocessed initial mapping relationship table; for example, the basic color mapping table may be an initial one-dimensional color lookup table, a 1D LUT (1D Look-Up-Table), or an initial three-dimensional color lookup table, a 3D LUT (3D Look-Up-Table), and of course may also be an initial color mapping table in other forms, which is not particularly limited in this embodiment. Preferably, since the color control offered by combining a 1D LUT with the model has some limitations, while a 3D LUT can describe the exact behavior of all color points in the stereoscopic color space, handle any non-linear property of the display, and accurately handle sudden and large color changes, the present exemplary embodiment is usually biased toward the 3D LUT for precise color control, which enables control over the full stereoscopic color space.
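As a minimal illustration of such a table (the disclosure itself contains no code; the dense-array layout, the [0, 1] value range, and the 33-point grid are all assumptions), a 3D LUT can be stored as an array indexed by quantized R, G, B values:

    import numpy as np

    def identity_3d_lut(size=33):
        """Build an identity 3D LUT of shape (size, size, size, 3).

        Entry [r, g, b] stores the output RGB color for the input color
        (r, g, b) sampled on a uniform grid over [0, 1]; an identity table
        maps every input color to itself, leaving the image unchanged.
        """
        axis = np.linspace(0.0, 1.0, size)
        r, g, b = np.meshgrid(axis, axis, axis, indexing="ij")
        return np.stack([r, g, b], axis=-1)

    lut = identity_3d_lut()
    print(lut[16, 16, 16])  # mid-gray maps to itself: [0.5 0.5 0.5]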
In step S320, the original image is input into a pre-trained image adjustment model, so as to obtain a color weight value corresponding to the original image.
In an exemplary embodiment, the image adjustment model may refer to a learning model based on an artificial intelligence technique for outputting color weight values according to the color content of the original image; for example, the image adjustment model may be a convolutional neural network model or a generative adversarial network model, and of course may also be another learning model based on an artificial intelligence technique, which is not particularly limited in this exemplary embodiment.
The color weight value may refer to a weight value of each color parameter corresponding to the content in the original image, for example, the color weight value may be a brightness weight value, a contrast weight value, a basic saturation weight value, or a color parameter related to a color value in another original image, for example, the color weight value may also be a color brightness weight value, which is not particularly limited in this example embodiment.
In step S330, the basic color mapping table is adjusted according to the color weight value to obtain a target color mapping table.
In an exemplary embodiment, the target color mapping table refers to a color mapping table obtained by adjusting the color values in the base color mapping table according to the color weight values output by the image adjustment model; the color parameters in the original image are then adjusted according to the target color mapping table.
In step S340, the color value in the original image is adjusted through the target color mapping table, so as to obtain an enhanced target image.
In an exemplary embodiment, the color values in the original image may be the color parameters used to represent the content of the original image; for example, they may be brightness parameters, contrast parameters, basic saturation parameters, and the like, and of course may also be other color parameters characterizing the image content, such as a color brightness parameter, which is not particularly limited in this exemplary embodiment.
The following describes steps S310 to S340 in detail.
In an exemplary embodiment, before obtaining the preset base color mapping table, the initialization processing of the base color mapping table may be implemented through the steps in fig. 4, and as shown in fig. 4, the initialization processing specifically includes:
step S410, acquiring a pre-constructed original color mapping table;
step S420, initializing the original color mapping table to generate a basic color mapping table.
The original color mapping table may be a pre-constructed color mapping table, for example, the original color mapping table may be a pre-constructed 1D LUT, or may be a pre-constructed 3D LUT, which is not particularly limited in this example embodiment. The basic color map capable of performing the color adjustment process can be obtained by initializing the original color map.
By initializing the pre-constructed original color mapping table, the obtained basic color mapping table achieves a higher accuracy rate when combined with the color weight values output by the image adjustment model, which effectively guarantees the accuracy of the target color mapping table and improves the color quality of the enhanced target image.
Specifically, when the original color mapping table is initialized, the original color mapping table can be sampled to obtain an original color mapping table with a discrete structure; the original color map of the discrete structure is used as the base color map.
Sampling processing may refer to converting the continuous color data in the original color mapping table into discrete color data. In image processing, the original color mapping table associates an index with an output value, so that a data structure computed at runtime (an array or associative array) can be replaced by a simple query operation, and the new output value corresponding to an input RGB value can be looked up quickly. However, if the original color mapping table were to provide a value combination for every possible input and output value, its data size would be large; therefore, to reduce memory usage, the original color mapping table is generally sampled and represented as a basic color mapping table with a discrete structure of 17x17x17 or 33x33x33. Here 17x17x17 or 33x33x33 indicates that each of the R, G, B coordinate axes has 17 or 33 mapping points from input value to output value; in practical application, the values between these mapping points may be calculated by trilinear interpolation (interpolating between coordinate points on the axes to obtain continuous mapping data) or tetrahedral interpolation (interpolating within the tetrahedra formed by the mapping points to obtain continuous mapping data). The specific interpolation methods are common techniques and will not be described again here.
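A sketch of the table lookup with trilinear interpolation described above might look as follows (a float image in [0, 1] and the dense-array LUT layout from the earlier sketch are assumed; tetrahedral interpolation would instead blend the 4 vertices of the enclosing tetrahedron):

    import numpy as np

    def apply_3d_lut(image, lut):
        """Apply an (S, S, S, 3) color LUT to a float image of shape
        (H, W, 3) in [0, 1], using trilinear interpolation between the
        S sampled grid points per axis."""
        s = lut.shape[0]
        pos = image * (s - 1)                        # continuous grid coordinates
        lo = np.clip(np.floor(pos).astype(int), 0, s - 2)
        frac = pos - lo                              # offset inside the grid cell
        out = np.zeros_like(image)
        # blend the 8 corners of the surrounding grid cell
        for dr in (0, 1):
            for dg in (0, 1):
                for db in (0, 1):
                    w = ((frac[..., 0] if dr else 1 - frac[..., 0])
                         * (frac[..., 1] if dg else 1 - frac[..., 1])
                         * (frac[..., 2] if db else 1 - frac[..., 2]))
                    corner = lut[lo[..., 0] + dr, lo[..., 1] + dg, lo[..., 2] + db]
                    out += w[..., None] * corner
        return out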
The original color mapping table with the discrete structure is obtained by sampling the original color mapping table, so that the memory space occupied by the color mapping table can be greatly reduced, the system performance is improved, and the color searching efficiency is improved.
Further, in order to ensure that all color mapping transformations can be included, preset scene color data can be acquired, and the scene color data is assigned to any one of the basic color mapping tables to obtain the basic color mapping table with the scene color data.
The scene color data may be color data included in an image of a common scene, for example, the scene color data may be color data included in a landscape image including a plurality of types of color values, or color data included in a face image including a plurality of types of color values, or color data included in an image of another common scene including a plurality of types of color values, which is not particularly limited in this example embodiment.
The number of the basic color mapping tables may be set to be one, or may also be set to include at least two, and specifically, the number may be set by a user according to an actual situation, which is not particularly limited in this exemplary embodiment, for example, the number of the basic color mapping tables may be three or four, which is not particularly limited in this exemplary embodiment.
In consideration of the difference of the computing capabilities of various terminal devices (e.g., the difference of the computing capabilities of a smart phone and a computer) and the accuracy of the image display effect, the number of the basic color mapping tables in this exemplary embodiment may be set to three, the preset scene color data is assigned to any one of the basic color mapping tables, and the color values of the other two basic color mapping tables are initialized to 0.
Through assigning the preset scene color data to the basic color mapping table, the completeness and comprehensiveness of the input-to-output color mapping transformation can be ensured, the accuracy of color effect display is improved, and the display effect of the target image is further improved.
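Continuing the earlier sketch, this initialization of three basic color mapping tables might look as follows (the identity mapping stands in for the preset scene color data, which the disclosure does not specify):

    import numpy as np

    N, SIZE = 3, 33  # three base tables on a 33x33x33 grid, as in the example above

    base_luts = np.zeros((N, SIZE, SIZE, SIZE, 3), dtype=np.float32)
    # One table receives the preset scene color data (an identity mapping
    # stands in here, since the disclosure does not specify the data); the
    # color values of the other two tables are initialized to 0, as above.
    base_luts[0] = identity_3d_lut(SIZE)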
In an exemplary embodiment, the image adjustment model may include a convolutional neural network model, and the training of the image adjustment model may be implemented through the steps in fig. 5, and as shown in fig. 5, specifically, the method may include:
step S510, acquiring a preset sample image pair;
and step S520, training a pre-constructed convolutional neural network model according to the sample original image and the sample target image based on a supervised learning mode to obtain a pre-trained image adjustment model.
The sample image pair may include a sample original image and a sample target image obtained by performing color adjustment on the sample original image through a manually designed color mapping table; the pre-trained image adjustment model may be obtained by training a pre-constructed convolutional neural network model in a supervised manner with the sample original images and sample target images in the sample image pairs.
Specifically, the convolutional neural network model may include a weight prediction network layer, and then the training of the pre-constructed convolutional neural network model may be implemented through the steps in fig. 6, and as shown in fig. 6, specifically, the method may include:
step S610, inputting the down-sampled sample original image into the weight prediction network layer to obtain a sample color weight value;
step S620, adjusting the initialized basic color mapping table according to the sample color weight values to obtain a target color mapping table, and adjusting the color values of the sample original image according to the target color mapping table to obtain a model output image;
step S630, inputting the model output image and the sample target image into a target loss function, so as to perform optimization learning on the weight prediction network layer through the adjusted target loss function;
and step S640, constructing the image adjustment model through the trained weight prediction network layer and the adjusted target loss function.
Downsampling refers to sampling an image so that its size meets a requirement, and is also called subsampling. For example, in order to reduce the computation of the model, the input image may be downsampled to a resolution of 256 × 256 by bilinear interpolation; of course, the input image may also be downsampled to a resolution of 128 × 128, which is not limited in this exemplary embodiment. The weight prediction network layer is a network layer based on a convolutional neural network and is mainly used for extracting color weight values from a sample original image (or an original image).
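In PyTorch, for instance, the bilinear downsampling described above is a single call (the batch tensor layout and input size are assumptions):

    import torch
    import torch.nn.functional as F

    # Bilinearly downsample a batch of images (B, 3, H, W) to 256 x 256,
    # matching the example resolution given above.
    image_full = torch.rand(1, 3, 3000, 4000)  # stand-in for a captured image
    image_lr = F.interpolate(image_full, size=(256, 256), mode="bilinear",
                             align_corners=False)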
For example, the convolutional neural network of the weight prediction network layer in this exemplary embodiment may include five convolutional blocks, each of which may include one convolutional layer, a Leaky ReLU activation layer, and a batch normalization layer, where the convolution kernel size may be 3 × 3 and the numbers of channels may be 16, 32, 64, 128, and 256, respectively. Dropout in the convolutional neural network model can effectively alleviate overfitting and achieve a regularization effect to a certain extent, so Dropout may be set to 0.5, or of course to 0.4, which is not particularly limited in this example embodiment. Assuming the number of basic color mapping tables employed is N, the number of fully connected layer output classes is also N.
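A sketch of such a weight prediction network under the stated configuration follows (stride-2 convolutions for spatial reduction and global average pooling before the fully connected layer are assumptions the text does not fix):

    import torch
    import torch.nn as nn

    class WeightPredictor(nn.Module):
        """CNN mapping a downsampled image to N base-LUT weights.

        Follows the layer description above: five conv blocks (3x3 kernels,
        16/32/64/128/256 channels) of conv + Leaky ReLU + batch norm,
        Dropout of 0.5, and a fully connected head with N outputs.
        """

        def __init__(self, n_luts=3):
            super().__init__()
            chans = [3, 16, 32, 64, 128, 256]
            blocks = []
            for c_in, c_out in zip(chans[:-1], chans[1:]):
                blocks += [
                    nn.Conv2d(c_in, c_out, kernel_size=3, stride=2, padding=1),
                    nn.LeakyReLU(0.2, inplace=True),
                    nn.BatchNorm2d(c_out),
                ]
            self.features = nn.Sequential(*blocks)
            self.dropout = nn.Dropout(0.5)
            self.fc = nn.Linear(256, n_luts)

        def forward(self, x):                     # x: (B, 3, 256, 256)
            h = self.features(x)                  # (B, 256, 8, 8)
            h = h.mean(dim=(2, 3))                # global average pooling
            return self.fc(self.dropout(h))       # (B, N) color weight values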
This weight prediction network layer can automatically extract the global context represented by the image characteristics of the sample original image, such as its brightness, color, and tone, so as to output color weight values related to the content of the sample original image and ensure the accuracy of the finally adjusted target image.
The target loss function may be a loss function for optimizing learning of the image adjustment model, for example, the target loss function may be a mean square loss function, may also be a square loss function, and of course, may also be other regression loss functions capable of optimizing learning of the image adjustment model, which is not particularly limited in this exemplary embodiment.
For example, during the training of the image adjustment model, the batch size may be set to 1 and the learning rate fixed at 0.0001, and the image adjustment model may be optimized using a mean square error loss function L and an Adam optimizer, where the mean square error loss function L may be expressed as relation (1):
L = (1/T) Σ_{t=1}^{T} ||q_t - y_t||^2 + R        (1)

where T may represent the number of sample image pairs used for training, q_t may represent the output value of the image adjustment model, y_t may represent the label value in the sample image pair, and R may represent a regularization term, mainly to ensure the local smoothness and monotonicity of the target color mapping table.
Further, the obtained original image is input into the trained image adjustment model to obtain the color weight values corresponding to the N basic color mapping tables, and the output color value q of the finally obtained target color mapping table may be expressed as relation (2):

q = Σ_{n=1}^{N} ω_n · q_n        (2)

where ω_n may represent the adaptive color weight value predicted by the image adjustment model, and q_n may represent the output color value of the n-th basic color mapping table.
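In the numpy sketches above, relation (2) amounts to a weighted sum over the stack of base tables:

    import numpy as np

    def fuse_luts(base_luts, weights):
        """Relation (2): the target table is the weighted sum of the N base
        tables, with weights predicted by the image adjustment model.

        base_luts: (N, S, S, S, 3) array; weights: length-N array of color
        weight values. The result feeds apply_3d_lut from the sketch above.
        """
        return np.tensordot(weights, base_luts, axes=1)  # shape (S, S, S, 3)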
Fig. 7 schematically illustrates a model framework diagram for training an image adjustment model by means of supervised learning in an exemplary embodiment of the present disclosure.
Referring to fig. 7, in step S710, a down-sampling process is performed on the sample original image in an input sample image pair to obtain a down-sampled image of fixed size;
step S720, inputting the down-sampled image into a weight prediction network layer based on a convolutional neural network, and obtaining N color weight values from the weight prediction network layer;
step S730, adjusting the N pre-initialized basic color mapping tables by the N output color weight values to obtain N adjusted basic color mapping tables;
step S740, superimposing and fusing the N adjusted basic color mapping tables to obtain a target color mapping table;
step S750, adjusting the color values in the sample original image through the target color mapping table to obtain an output image;
step S760, inputting the output image and the sample target image of the sample image pair into the mean square error loss function, so as to realize the optimized learning of the image adjustment model.
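Putting steps S710 to S760 together, one supervised training iteration might be sketched as follows (apply_lut_torch and smooth_reg are hypothetical helpers standing in for the LUT application and the regularization term R, neither of which the disclosure spells out; all other names are illustrative):

    import torch
    import torch.nn.functional as F

    def train_step(model, base_luts, sample_lr, sample_full, sample_target,
                   optimizer, smooth_reg, lam=1e-4):
        """One supervised iteration over steps S710-S760 (illustrative names).

        model: the WeightPredictor sketched earlier; base_luts: an
        (N, S, S, S, 3) tensor created with requires_grad=True so the tables
        are learned jointly with the weight predictor.
        """
        weights = model(sample_lr)                            # (1, N), step S720
        target_lut = torch.einsum("n,nijkc->ijkc",
                                  weights[0], base_luts)      # steps S730-S740
        output = apply_lut_torch(sample_full, target_lut)     # hypothetical, S750
        loss = F.mse_loss(output, sample_target) + lam * smooth_reg(target_lut)
        optimizer.zero_grad()
        loss.backward()                                       # step S760
        optimizer.step()
        return loss.item()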
In another exemplary embodiment, because sample image pairs are difficult to obtain, sample generation efficiency is low, and labeling a data set takes time and labor, the image adjustment model can instead be trained on unpaired data sets based on a generative adversarial network; the labeling step can be omitted, and different types of training data sets can be used.
Specifically, the image adjustment model may be obtained through generative adversarial network training via the steps in fig. 8, which, as shown in fig. 8, specifically include:
step S810, constructing a generator according to a basic color mapping table assigned with color weight parameters, and constructing a discriminator according to a preset convolutional neural network;
step S820, constructing a generative adversarial network model through the generator and the discriminator, and using the trained generative adversarial network model as the image adjustment model.
The Generator may be a model for generating an output image according to an input vector; for example, the generator may be a deep convolutional generative adversarial network (DCGAN) model, whose input may be a 1 × 100 vector that is mapped through one fully connected layer to a 4 × 4 × 1024 tensor and then through 4 up-sampling deconvolution layers to generate a 64 × 64 image. Of course, this is only an illustration, and the present exemplary embodiment is not limited thereto. The generator can be constructed from the basic color mapping tables assigned color weight parameters, and by continuously adjusting the color weight parameters of the basic color mapping tables during adversarial training, a generator capable of producing images that pass the discriminator can be obtained. The Discriminator may be a classification network for discriminating whether an image output by the generator belongs to real samples or fake samples; for example, the discriminator may be a convolutional neural network for classification, and the present exemplary embodiment is not limited thereto.
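A rough sketch of this generator and discriminator pairing, reusing WeightPredictor from the earlier sketch (apply_lut_torch remains a hypothetical helper, and the discriminator layout is an illustrative choice, not the disclosure's):

    import torch
    import torch.nn as nn

    class LutGenerator(nn.Module):
        """Generator built from base color mapping tables whose color weight
        parameters are adjusted during adversarial training (sketch only)."""

        def __init__(self, n_luts=3, size=33):
            super().__init__()
            self.base_luts = nn.Parameter(torch.zeros(n_luts, size, size, size, 3))
            self.predictor = WeightPredictor(n_luts)  # from the earlier sketch

        def forward(self, image_lr, image_full):
            weights = self.predictor(image_lr)                       # (B, N)
            lut = torch.einsum("bn,nijkc->bijkc", weights, self.base_luts)
            return apply_lut_torch(image_full, lut)  # hypothetical helper

    # A plain convolutional classifier serves as the discriminator, judging
    # whether an enhanced image is a real sample or a generated (fake) one.
    discriminator = nn.Sequential(
        nn.Conv2d(3, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.Conv2d(64, 128, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
        nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(128, 1),
    )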
In summary, in the present exemplary embodiment, an original image and a preset basic color mapping table are obtained, and the original image is input into a pre-trained image adjustment model to obtain color weight values corresponding to the original image; the basic color mapping table is adjusted according to the color weight values to obtain a target color mapping table; and the color values in the original image are adjusted through the target color mapping table to obtain the enhanced target image. On one hand, the original image is identified by the image adjustment model to obtain the color weight values, the initialized basic color mapping table is adjusted according to the color weight values to obtain an adaptive target color mapping table, and the original image is adjusted according to the target color mapping table; since no corresponding color mapping table needs to be designed manually for each different original image, the generation efficiency of the target color mapping table is effectively improved, and the memory space occupied by a large number of manually generated color mapping tables is reduced. On the other hand, because the target color mapping table is obtained through adjustment with the color weight values predicted by the image adjustment model, the match between the target color mapping table and the original image is ensured, the accuracy of the target color mapping table is improved, and the color display effect of the target image is enhanced.
It is noted that the above-mentioned figures are merely schematic illustrations of processes involved in methods according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily understood that the processes shown in the above figures are not intended to indicate or limit the chronological order of the processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, e.g., in multiple modules.
Further, referring to fig. 9, an image color adjusting apparatus 900 is further provided in the present exemplary embodiment, and includes an image obtaining module 910, a color weight value generating module 920, a color mapping table adjusting module 930, and an image color adjusting module 940. Wherein:
the image obtaining module 910 is configured to obtain an original image and a preset basic color mapping table;
the color weight value generation module 920 is configured to input the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image;
the color mapping table adjusting module 930 is configured to adjust the basic color mapping table according to the color weight value to obtain a target color mapping table;
the image color adjusting module 940 is configured to adjust the color value in the original image through the target color mapping table to obtain an enhanced target image.
In an exemplary embodiment, the image color adjustment apparatus 900 further includes a color map initialization unit, which may be configured to:
acquiring a pre-constructed original color mapping table;
and initializing the original color mapping table to generate a basic color mapping table.
In an exemplary embodiment, the color mapping table initializing unit may be further configured to:
sampling the original color mapping table to obtain an original color mapping table with a discrete structure;
and taking the original color mapping table of the discrete structure as the basic color mapping table.
In an exemplary embodiment, the color mapping table initializing unit may be further configured to:
and acquiring preset scene color data, and assigning the scene color data to any one of the basic color mapping tables to obtain the basic color mapping table with the scene color data.
In an exemplary embodiment, the image color adjustment apparatus 900 further includes an image adjustment model training unit, which may be configured to:
acquiring a preset sample image pair, wherein the sample image pair comprises a sample original image and a sample target image corresponding to the sample original image;
and training the pre-constructed convolutional neural network model according to the sample original image and the sample target image based on a supervised learning mode to obtain a pre-trained image adjustment model.
In an exemplary embodiment, the image adjustment model training unit may be further configured to:
inputting the down-sampled sample original image into the weight prediction network layer to obtain a sample color weight value;
adjusting the initialized basic color mapping table according to the sample color weight values to obtain a target color mapping table, and adjusting the color values of the sample original image according to the target color mapping table to obtain a model output image;
inputting the model output image and the sample target image into a target loss function so as to perform optimization learning on the weight prediction network layer through the adjusted target loss function;
and constructing the image adjustment model through the trained weight prediction network layer and the target loss function.
In an exemplary embodiment, the image adjustment model training unit may be further configured to:
constructing a generator according to a basic color mapping table assigned with color weight parameters, and constructing a discriminator according to a preset convolutional neural network;
and constructing a generative adversarial network model through the generator and the discriminator, and taking the trained generative adversarial network model as the image adjustment model.
The specific details of each module in the above apparatus have been described in detail in the method section, and details that are not disclosed may refer to the method section, and thus are not described again.
As will be appreciated by one skilled in the art, aspects of the present disclosure may be embodied as a system, method or program product. Accordingly, various aspects of the present disclosure may be embodied in the form of: an entirely hardware embodiment, an entirely software embodiment (including firmware, microcode, etc.) or an embodiment combining hardware and software aspects that may all generally be referred to herein as a "circuit," module "or" system.
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the above-described method of the present specification. In some possible embodiments, various aspects of the disclosure may also be implemented in the form of a program product including program code for causing a terminal device to perform the steps according to various exemplary embodiments of the disclosure described in the above-mentioned "exemplary methods" section of this specification, when the program product is run on the terminal device, for example, any one or more of the steps in fig. 3 to 8 may be performed.
It should be noted that the computer readable media shown in the present disclosure may be computer readable signal media or computer readable storage media or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the present disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In contrast, in the present disclosure, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, either in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device and partly on a remote computing device, or entirely on the remote computing device or server. In the case of a remote computing device, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., through the internet using an internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is to be limited only by the terms of the appended claims.

Claims (10)

1. An image color adjustment method, comprising:
acquiring an original image and a preset basic color mapping table;
inputting the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image;
adjusting the basic color mapping table according to the color weight value to obtain a target color mapping table;
and adjusting the color value in the original image through the target color mapping table to obtain an enhanced target image.
2. The method according to claim 1, wherein said obtaining a preset base color mapping table further comprises:
acquiring a pre-constructed original color mapping table;
and initializing the original color mapping table to generate a basic color mapping table.
3. The method of claim 2, wherein initializing the original color map to obtain a base color map comprises:
sampling the original color mapping table to obtain an original color mapping table with a discrete structure;
and taking the original color mapping table of the discrete structure as the basic color mapping table.
4. The method of claim 3, wherein using the discrete-structured original color map as the base color map further comprises:
and acquiring preset scene color data, and assigning the scene color data to any one of the basic color mapping tables to obtain the basic color mapping table with the scene color data.
5. The method of any of claims 1 to 4, wherein the image adjustment model comprises a convolutional neural network model, the method further comprising:
acquiring a preset sample image pair, wherein the sample image pair comprises a sample original image and a sample target image corresponding to the sample original image;
and training the pre-constructed convolutional neural network model according to the sample original image and the sample target image based on a supervised learning mode to obtain a pre-trained image adjustment model.
6. The method of claim 5, wherein the convolutional neural network model comprises a weight prediction network layer, and wherein training the pre-constructed convolutional neural network model according to the sample original image and the sample target image in the supervised learning manner comprises:
inputting the down-sampled sample original image into the weight prediction network layer to obtain a sample color weight value;
adjusting the initialized basic color mapping table according to the sample color weight value to obtain a target color mapping table, and adjusting the color values of the sample original image according to the target color mapping table to obtain a model output image;
inputting the model output image and the sample target image into a target loss function, so as to perform optimization learning on the weight prediction network layer through the target loss function;
and constructing the image adjustment model through the trained weight prediction network layer and the target loss function.
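A minimal supervised training step for this claim might look as follows. The tiny `WeightNet` and the plain L2 target loss are assumptions for illustration; the patent does not fix a particular network architecture or loss function. `apply_lut` is the helper from the claim 1 sketch above.

```python
# Supervised training sketch for claims 5-6: the down-sampled sample original
# image yields color weight values, which adjust the initialized basic tables;
# the resulting model output image is compared against the sample target image.
import torch
import torch.nn as nn
import torch.nn.functional as F

class WeightNet(nn.Module):
    """Hypothetical weight prediction network layer: image -> K weight values."""
    def __init__(self, k: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1))
        self.head = nn.Linear(32, k)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

def train_step(net, basic_luts, optimizer, sample, target):
    small = F.interpolate(sample, size=(256, 256), mode='bilinear',
                          align_corners=False)   # down-sampled sample image
    weights = net(small)                         # sample color weight values
    target_lut = torch.einsum('bk,kdefc->bdefc', weights, basic_luts)
    output = apply_lut(sample, target_lut)       # model output image
    loss = F.mse_loss(output, target)            # assumed L2 target loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```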
7. The method of claim 1, wherein the image adjustment model comprises a generative adversarial network model, the method further comprising:
constructing a generator according to a basic color mapping table assigned with color weight parameters, and constructing a discriminator according to a preset convolutional neural network;
and constructing a generative adversarial network model through the generator and the discriminator, and taking the trained generative adversarial network model as the image adjustment model.
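How the adversarial pair could be assembled is sketched below: the generator is the weight network plus LUT blending from the earlier sketches, while the discriminator is a small CNN that scores whether an image looks like a reference-quality target. The discriminator architecture and the non-saturating binary cross-entropy loss are standard GAN choices assumed here, not prescribed by the patent.

```python
# Sketch of claim 7: a preset CNN discriminator and the generator-side
# adversarial loss; the generator itself reuses WeightNet/apply_lut above.
import torch
import torch.nn as nn
import torch.nn.functional as F

class Discriminator(nn.Module):
    """Small CNN scoring real (target) vs. generated (enhanced) images."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(32, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(64, 1))

    def forward(self, x):
        return self.net(x)   # one real/fake logit per image

def generator_loss(disc, fake):
    # non-saturating GAN loss: the generator tries to make fakes score as real
    logits = disc(fake)
    return F.binary_cross_entropy_with_logits(logits, torch.ones_like(logits))
```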
8. An image color adjustment apparatus, comprising:
the image acquisition module is used for acquiring an original image and a preset basic color mapping table;
the color weight value generation module is used for inputting the original image into a pre-trained image adjustment model to obtain a color weight value corresponding to the original image;
the color mapping table adjusting module is used for adjusting the basic color mapping table according to the color weight value to obtain a target color mapping table;
and the image color adjusting module is used for adjusting the color value in the original image through the target color mapping table to obtain an enhanced target image.
9. A computer-readable medium, on which a computer program is stored which, when executed by a processor, implements the method according to any one of claims 1 to 7.
10. An electronic device, comprising:
a processor; and
a memory for storing executable instructions of the processor;
wherein the processor is configured to perform the method of any of claims 1 to 7 via execution of the executable instructions.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011550049.6A CN112562019A (en) 2020-12-24 2020-12-24 Image color adjusting method and device, computer readable medium and electronic equipment

Publications (1)

Publication Number Publication Date
CN112562019A (en) 2021-03-26

Family

ID=75033351

Family Applications (1)

Application Number Status Publication Priority Date Filing Date Title
CN202011550049.6A Pending CN112562019A (en) 2020-12-24 2020-12-24 Image color adjusting method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112562019A (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070154084A1 (en) * 2006-01-04 2007-07-05 Samsung Electronics Co., Ltd. Apparatus and method for editing optimized color preference
CN111062876A (en) * 2018-10-17 2020-04-24 北京地平线机器人技术研发有限公司 Method and device for correcting model training and image correction and electronic equipment
CN110349107A (en) * 2019-07-10 2019-10-18 北京字节跳动网络技术有限公司 Method, apparatus, electronic equipment and the storage medium of image enhancement
CN111770320A (en) * 2020-07-14 2020-10-13 深圳市洲明科技股份有限公司 Color correction method and device, color correction equipment and storage medium

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113052970A (en) * 2021-04-09 2021-06-29 杭州群核信息技术有限公司 Neural network-based light intensity and color design method, device and system and storage medium
CN113052970B (en) * 2021-04-09 2023-10-13 杭州群核信息技术有限公司 Design method, device and system for light intensity and color of lamplight and storage medium
CN113297937A (en) * 2021-05-17 2021-08-24 杭州朗和科技有限公司 Image processing method, device, equipment and medium
CN113297937B (en) * 2021-05-17 2023-12-15 杭州网易智企科技有限公司 Image processing method, device, equipment and medium
CN113269206A (en) * 2021-05-24 2021-08-17 山东大学 Color-embedded visual exploration method and system
CN113436105A (en) * 2021-06-30 2021-09-24 北京百度网讯科技有限公司 Model training and image optimization method and device, electronic equipment and storage medium
CN114449199A (en) * 2021-08-12 2022-05-06 荣耀终端有限公司 Video processing method and device, electronic equipment and storage medium
WO2023108568A1 (en) * 2021-12-16 2023-06-22 京东方科技集团股份有限公司 Model training method and apparatus for image processing, and storage medium and electronic device
CN114286000A (en) * 2021-12-27 2022-04-05 展讯通信(上海)有限公司 Image color processing method and device and electronic equipment
CN114463196A (en) * 2021-12-28 2022-05-10 浙江大学嘉兴研究院 Image correction method based on deep learning
CN116993619A (en) * 2023-08-29 2023-11-03 荣耀终端有限公司 Image processing method and related equipment
CN116993619B (en) * 2023-08-29 2024-03-12 荣耀终端有限公司 Image processing method and related equipment

Similar Documents

Publication Title
CN112562019A (en) Image color adjusting method and device, computer readable medium and electronic equipment
WO2020173329A1 (en) Image fusion method, model training method, and related device
CN112989904A (en) Method for generating style image, method, device, equipment and medium for training model
CN111476783B (en) Image processing method, device and equipment based on artificial intelligence and storage medium
CN111866483B (en) Color restoration method and device, computer readable medium and electronic device
CN111950570B (en) Target image extraction method, neural network training method and device
CN112927362A (en) Map reconstruction method and device, computer readable medium and electronic device
CN112598780B (en) Instance object model construction method and device, readable medium and electronic equipment
CN108701355A GPU optimization and online single-Gaussian-based skin likelihood estimation
CN112581635B (en) Universal quick face changing method and device, electronic equipment and storage medium
CN112381707B (en) Image generation method, device, equipment and storage medium
CN111967515A (en) Image information extraction method, training method and device, medium and electronic equipment
CN113744286A (en) Virtual hair generation method and device, computer readable medium and electronic equipment
CN113284206A (en) Information acquisition method and device, computer readable storage medium and electronic equipment
CN112785669B (en) Virtual image synthesis method, device, equipment and storage medium
CN112037305B (en) Method, device and storage medium for reconstructing tree-like organization in image
CN113658065A (en) Image noise reduction method and device, computer readable medium and electronic equipment
CN113642359B (en) Face image generation method and device, electronic equipment and storage medium
CN112967193A (en) Image calibration method and device, computer readable medium and electronic equipment
CN113240599A (en) Image toning method and device, computer-readable storage medium and electronic equipment
CN114418835A (en) Image processing method, apparatus, device and medium
CN113362260A (en) Image optimization method and device, storage medium and electronic equipment
CN114119413A (en) Image processing method and device, readable medium and mobile terminal
CN113409204A (en) Method and device for optimizing image to be processed, storage medium and electronic equipment
CN113569052A (en) Knowledge graph representation learning method and device

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination