CN112967194B - Target image generation method and device, computer readable medium and electronic equipment

Target image generation method and device, computer readable medium and electronic equipment

Info

Publication number
CN112967194B
CN112967194B (application CN202110241630.8A)
Authority
CN
China
Prior art keywords
image
color
processed
target
consistent
Prior art date
Legal status
Active
Application number
CN202110241630.8A
Other languages
Chinese (zh)
Other versions
CN112967194A (en)
Inventor
颜海强
Current Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Original Assignee
Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Oppo Mobile Telecommunications Corp Ltd
Priority to CN202110241630.8A
Publication of CN112967194A
Application granted
Publication of CN112967194B


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/73: Deblurring; Sharpening
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10024: Color image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20024: Filtering details

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Image Processing (AREA)

Abstract

The disclosure provides a target image generation method and device, a computer readable medium and electronic equipment, and relates to the technical field of image processing. The method comprises the following steps: acquiring an image to be processed and a reference image; filtering the image to be processed to obtain an image to be processed with consistent color gradation; and performing color mapping processing on the image to be processed with consistent color gradation through the reference image to generate a target image corresponding to the color theme of the reference image. The method and the device can perform color migration on an input image to be processed of any image quality, avoid the color-block artifacts that color migration otherwise produces on images of poor quality, improve the robustness of the output image, and improve the image display effect.

Description

Target image generation method and device, computer readable medium and electronic equipment
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a target image generating method, a target image generating device, a computer readable medium, and an electronic apparatus.
Background
With people's rising standard of living, image processing technology, together with the image processing software and devices that apply it, is used increasingly widely. Image color migration refers to adjusting the hue of a current input image with reference to a designated color image, so that the migrated output image has a hue similar to that of the reference image. For example, a new image C is synthesized from an image A and an image B, so that C carries image information from both, such as the color of A and the shape of B.
At present, in related image color migration schemes, defects in an image of poor quality are amplified by the color migration algorithm, and the colors of two adjacent areas with similar colors may be mapped to two colors with a large tone-scale difference, yielding an output picture with inconsistent color gradation, i.e., a color-block (color lump) phenomenon.
Disclosure of Invention
An object of the present disclosure is to provide a target image generation method, a target image generation apparatus, a computer-readable medium, and an electronic device, so as to avoid, at least to some extent, the problems of inconsistent color gradation or color-block artifacts in the migrated image caused by poor image quality of the image to be processed.
According to a first aspect of the present disclosure, there is provided a target image generation method including:
Acquiring an image to be processed and a reference image;
Filtering the image to be processed to obtain an image to be processed with consistent color gradation;
Performing color mapping processing on the image to be processed with consistent color gradation through the reference image, to generate a target image corresponding to the color theme of the reference image.
According to a second aspect of the present disclosure, there is provided a target image generating apparatus including:
The image acquisition module is used for acquiring an image to be processed and a reference image;
The image filtering module is used for filtering the image to be processed to obtain an image to be processed with consistent color gradation;
The image color migration module is used for performing color mapping processing on the image to be processed with consistent color gradation through the reference image, to generate a target image corresponding to the color theme of the reference image.
According to a third aspect of the present disclosure, there is provided a computer readable medium having stored thereon a computer program which, when executed by a processor, implements the method described above.
According to a fourth aspect of the present disclosure, there is provided an electronic apparatus, comprising:
A processor; and
A memory for storing one or more programs that, when executed by the one or more processors, cause the one or more processors to implement the method described above.
The target image generation method provided by the embodiments of the present disclosure acquires an image to be processed and a reference image; filters the image to be processed to obtain an image to be processed with consistent color gradation; and performs color mapping processing on the image to be processed with consistent color gradation through the reference image to generate a target image corresponding to the color theme of the reference image. On the one hand, filtering the image to be processed before color migration keeps the color gradation consistent even in images of poor quality, which effectively improves the robustness of the output target image. On the other hand, because color migration is performed on the filtered image with consistent color gradation, the problems of inconsistent color gradation or color-block artifacts after migrating a poor-quality image can be effectively avoided, improving the accuracy and display effect of the target image. Furthermore, filtering the image to be processed lowers the quality requirement on the input and broadens the applicable range of color migration.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the disclosure and together with the description, serve to explain the principles of the disclosure. It will be apparent to those of ordinary skill in the art that the drawings in the following description are merely examples of the disclosure and that other drawings may be derived from them without undue effort. In the drawings:
FIG. 1 illustrates a schematic diagram of an exemplary system architecture to which embodiments of the present disclosure may be applied;
FIG. 2 shows a schematic diagram of an electronic device to which embodiments of the present disclosure may be applied;
FIG. 3 schematically illustrates a flow chart of a target image generation method in an exemplary embodiment of the present disclosure;
FIG. 4 schematically illustrates a flow chart of preprocessing an image to be processed and a reference image in an exemplary embodiment of the present disclosure;
FIG. 5 schematically illustrates a flowchart of a filtering process for an image to be processed in an exemplary embodiment of the present disclosure;
FIG. 6 schematically illustrates a flow chart for implementing color mapping in an exemplary embodiment of the present disclosure;
FIG. 7 schematically illustrates a flow chart for determining a migration matrix in an exemplary embodiment of the present disclosure;
FIG. 8 schematically illustrates a flow chart of one method of generating a target image in an exemplary embodiment of the present disclosure;
fig. 9 schematically illustrates a composition diagram of a target image generating apparatus in an exemplary embodiment of the present disclosure.
Detailed Description
Example embodiments will now be described more fully with reference to the accompanying drawings. However, the exemplary embodiments may be embodied in many forms and should not be construed as limited to the examples set forth herein; rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the concept of the example embodiments to those skilled in the art. The described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Furthermore, the drawings are merely schematic illustrations of the present disclosure and are not necessarily drawn to scale. The same reference numerals in the drawings denote the same or similar parts, and thus a repetitive description thereof will be omitted. Some of the block diagrams shown in the figures are functional entities and do not necessarily correspond to physically or logically separate entities. These functional entities may be implemented in software or in one or more hardware modules or integrated circuits or in different networks and/or processor devices and/or microcontroller devices.
Fig. 1 illustrates a schematic diagram of a system architecture of an exemplary application environment to which a target image generation method and apparatus of embodiments of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include one or more of the terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 is used as a medium to provide communication links between the terminal devices 101, 102, 103 and the server 105. The network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, among others. The terminal devices 101, 102, 103 may be various electronic devices having image processing functions including, but not limited to, desktop computers, portable computers, smart phones, tablet computers, and the like. It should be understood that the number of terminal devices, networks and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation. For example, the server 105 may be a server cluster formed by a plurality of servers.
The target image generation method provided by the embodiments of the present disclosure is generally performed in the terminal apparatuses 101, 102, 103, and accordingly, the target image generation apparatus is generally provided in the terminal apparatuses 101, 102, 103. However, it will be readily understood by those skilled in the art that the method for generating a target image provided in the embodiment of the present disclosure may be performed by the server 105, and accordingly, the target image generating apparatus may be provided in the server 105, which is not particularly limited in the present exemplary embodiment. For example, in an exemplary embodiment, the user may upload the image to be processed and the reference image to the server 105 through the terminal devices 101, 102, 103, and the server may transmit the target image to the terminal devices 101, 102, 103 after generating the target image through the target image generating method provided by the embodiment of the present disclosure.
Exemplary embodiments of the present disclosure provide an electronic device for implementing a target image generation method, which may be the terminal device 101, 102, 103 or the server 105 in fig. 1. The electronic device comprises at least a processor and a memory for storing executable instructions of the processor, the processor being configured to perform the target image generation method via execution of the executable instructions.
The configuration of the electronic device will be exemplarily described below using the mobile terminal 200 of fig. 2 as an example. It will be appreciated by those skilled in the art that, apart from components specifically intended for mobile use, the configuration of fig. 2 can also be applied to stationary devices. In other embodiments, the mobile terminal 200 may include more or fewer components than illustrated, may combine certain components, may split certain components, or may arrange the components differently. The illustrated components may be implemented in hardware, software, or a combination of software and hardware. The interfacing relationship between the components is shown schematically only and does not constitute a structural limitation of the mobile terminal 200. In other embodiments, the mobile terminal 200 may also employ an interface different from that of fig. 2, or a combination of interfaces.
As shown in fig. 2, the mobile terminal 200 may specifically include: processor 210, internal memory 221, external memory interface 222, universal serial bus (Universal Serial Bus, USB) interface 230, charge management module 240, power management module 241, battery 242, antenna 1, antenna 2, mobile communication module 250, wireless communication module 260, audio module 270, speaker 271, receiver 272, microphone 273, headset interface 274, sensor module 280, display screen 290, camera module 291, indicator 292, motor 293, keys 294, and subscriber identity module (subscriber identification module, SIM) card interface 295, among others. Wherein the sensor module 280 may include a depth sensor 2801, a pressure sensor 2802, a gyro sensor 2803, and the like.
Processor 210 may include one or more processing units. For example, the processor 210 may include an application processor (Application Processor, AP), a modem processor, a graphics processor (Graphics Processing Unit, GPU), an image signal processor (Image Signal Processor, ISP), a controller, a video codec, a digital signal processor (Digital Signal Processor, DSP), a baseband processor and/or a neural network processor (Neural-Network Processing Unit, NPU), and the like. The different processing units may be separate devices or may be integrated in one or more processors.
The NPU is a neural network (Neural-Network, NN) computing processor that processes input information rapidly by drawing on the structure of biological neural networks, for example the transmission mode among human brain neurons, and can also learn continuously. Applications involving intelligent cognition of the mobile terminal 200, for example image recognition, face recognition, speech recognition, and text understanding, may be implemented by the NPU.
The processor 210 has a memory disposed therein. The memory may store instructions for implementing six modular functions: detection instructions, connection instructions, information management instructions, analysis instructions, data transfer instructions, and notification instructions, and are controlled to be executed by the processor 210.
The charge management module 240 is configured to receive a charge input from a charger. The power management module 241 is used for connecting the battery 242, the charge management module 240 and the processor 210. The power management module 241 receives input from the battery 242 and/or the charge management module 240 and provides power to the processor 210, the internal memory 221, the display 290, the camera module 291, the wireless communication module 260, and the like.
The wireless communication function of the mobile terminal 200 may be implemented by the antenna 1, the antenna 2, the mobile communication module 250, the wireless communication module 260, a modem processor, a baseband processor, and the like. Wherein the antenna 1 and the antenna 2 are used for transmitting and receiving electromagnetic wave signals; the mobile communication module 250 may provide a solution including 2G/3G/4G/5G wireless communication applied to the mobile terminal 200; the modem processor may include a modulator and a demodulator; the wireless communication module 260 may provide solutions for wireless communication including wireless local area network (Wireless Local Area Networks, WLAN), such as wireless fidelity (Wireless Fidelity, Wi-Fi) networks, Bluetooth (BT), etc., as applied on the mobile terminal 200. In some embodiments, antenna 1 and mobile communication module 250 of mobile terminal 200 are coupled, and antenna 2 and wireless communication module 260 are coupled, so that mobile terminal 200 may communicate with a network and other devices through wireless communication techniques.
The mobile terminal 200 implements display functions through a GPU, a display screen 290, an application processor, and the like. The GPU is a microprocessor for image processing, and is connected to the display screen 290 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. Processor 210 may include one or more GPUs that execute program instructions to generate or change display information.
The mobile terminal 200 may implement a photographing function through an ISP, a camera module 291, a video codec, a GPU, a display screen 290, an application processor, and the like. The ISP is used for processing the data fed back by the camera module 291; the camera module 291 is used for capturing still images or videos; the digital signal processor is used for processing digital signals, and can process other digital signals besides digital image signals; video codec is used to compress or decompress digital video, and the mobile terminal 200 may also support one or more video codecs.
The external memory interface 222 may be used to connect an external memory card, such as a Micro SD card, to enable expansion of the memory capabilities of the mobile terminal 200. The external memory card communicates with the processor 210 via an external memory interface 222 to implement data storage functions. For example, files such as music, video, etc. are stored in an external memory card.
The internal memory 221 may be used to store computer executable program code that includes instructions. The internal memory 221 may include a storage program area and a storage data area. The storage program area may store an application program (such as a sound playing function, an image playing function, etc.) required for at least one function of the operating system, etc. The storage data area may store data (e.g., audio data, phonebook, etc.) created during use of the mobile terminal 200, and the like. In addition, the internal memory 221 may include a high-speed random access memory, and may further include a nonvolatile memory such as at least one magnetic disk storage device, a flash memory device, a universal flash storage (Universal Flash Storage, UFS), and the like. The processor 210 performs various functional applications of the mobile terminal 200 and data processing by executing instructions stored in the internal memory 221 and/or instructions stored in a memory provided in the processor.
The mobile terminal 200 may implement audio functions through an audio module 270, a speaker 271, a receiver 272, a microphone 273, an earphone interface 274, an application processor, and the like. Such as music playing, recording, etc.
The depth sensor 2801 is used to acquire depth information of a scene. In some embodiments, a depth sensor may be provided at the camera module 291.
The pressure sensor 2802 is used to sense a pressure signal, and may convert the pressure signal into an electrical signal. In some embodiments, pressure sensor 2802 may be disposed on display 290. The pressure sensor 2802 is of various types, such as a resistive pressure sensor, an inductive pressure sensor, a capacitive pressure sensor, and the like.
The gyro sensor 2803 may be used to determine a motion gesture of the mobile terminal 200. In some embodiments, the angular velocity of mobile terminal 200 about three axes (i.e., the x, y, and z axes) may be determined by gyro sensor 2803. The gyro sensor 2803 can be used in scenarios such as photographing anti-shake, navigation, and motion-sensing games.
In addition, sensors for other functions, such as an air pressure sensor, a magnetic sensor, an acceleration sensor, a distance sensor, a proximity sensor, a fingerprint sensor, a temperature sensor, a touch sensor, an ambient light sensor, a bone conduction sensor, etc., may be provided in the sensor module 280 according to actual needs.
Other devices that provide auxiliary functionality may also be included in mobile terminal 200. For example, the keys 294 include a power-on key, a volume key, etc., by which a user can generate key signal inputs related to user settings and function controls of the mobile terminal 200. As another example, indicator 292, motor 293, SIM card interface 295, and the like.
In the related color migration technical scheme, an intermediate image is first constructed from the color characteristics of the reference image, the input image is then divided into a plurality of grid areas, the average color of each grid area is calculated, and the color closest to each average color is then looked up in the intermediate image to carry out the migration. This scheme can obtain a satisfactory color migration result in a normal scene (for example, when the input image meets the quality requirement), but for an input image with poor image quality it amplifies the JPEG (a standard for continuous-tone still image compression) compression-block artifacts, so that the colors of two originally adjacent and similar color areas are mapped to two colors far apart in tone, producing an output image with inconsistent color gradation, i.e., the color-block phenomenon.
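The grid-averaging and nearest-color lookup steps of the related scheme described above might be sketched as follows. This is an illustrative sketch only: the function names, the even grid division, and the use of a flat color palette in place of the intermediate image are assumptions, not the patent's actual construction.

```python
import numpy as np

def grid_average_colors(img, grid=4):
    """Average color of each cell in a grid x grid partition of an
    (H, W, 3) image (first step of the related scheme; assumes the
    image divides evenly, trimming any remainder)."""
    h, w, _ = img.shape
    gh, gw = h // grid, w // grid
    cells = img[:gh * grid, :gw * grid].reshape(grid, gh, grid, gw, 3)
    return cells.mean(axis=(1, 3))          # shape (grid, grid, 3)

def nearest_palette_color(color, palette):
    """Closest palette entry to `color` by Euclidean distance,
    standing in for the lookup into the intermediate image."""
    d = np.linalg.norm(palette - color, axis=-1)
    return palette[np.argmin(d)]
```

Because the lookup snaps each cell's average to the single nearest color, two adjacent cells whose averages straddle a palette boundary can land on two distant colors, which is exactly the tone-step amplification the scheme suffers on low-quality inputs.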
In view of one or more of the above problems, the present exemplary embodiment proposes a target image generating method and a target image generating apparatus, and a target image generating method and a target image generating apparatus of exemplary embodiments of the present disclosure are specifically described below.
Fig. 3 shows a flow of a target image generation method in the present exemplary embodiment, including the following steps S310 to S330:
In step S310, an image to be processed and a reference image are acquired.
In an exemplary embodiment, the image to be processed refers to the image designated as the target of color migration. For example, the image to be processed may be an image captured by an image capturing unit, an image drawn by image editing software, or another type of image designated as the migration target, which is not particularly limited in this exemplary embodiment.
The reference image refers to the source image designated to provide the colors used for color migration. For example, the reference image may be an image with a gradient color theme or an image with a cool or warm color theme, which is not particularly limited in this exemplary embodiment.
For example, in an image color migration process, a new image C is output based on an image A and an image B, and the color migration causes image C to have both the color of image A and the shape of image B. Here, image A may be considered the reference image (i.e., the source image of the color migration) and image B may be considered the image to be processed (i.e., the target image of the color migration).
In step S320, filtering is performed on the image to be processed, so as to obtain an image to be processed with consistent color gradation.
In an exemplary embodiment, the filtering process (wave filtering) refers to an operation that suppresses frequencies in specific bands of a signal and is an important measure for suppressing and preventing interference. For example, the filtering process may be domain migration filtering (Domain Transform Filter), guided filtering (Guided Filter), or a combination of the two; of course, other types of filtering or combinations of multiple filters may also be used, which is not limited in any way in this exemplary embodiment.
The tone scale is a color index (gray resolution) representing the intensity of the image, and the color fullness and fineness of the image are determined by the tone scale. If the color gradation in the image to be processed is not consistent, for example when the image quality is low or the image has acquired defects through lossy compression during storage, those defects are amplified by the color migration process, so that the color gradation becomes inconsistent, i.e., the color-block problem arises.
In step S330, the reference image is used to perform color mapping processing on the image to be processed with consistent color gradation, so as to generate a target image corresponding to the color theme of the reference image.
In an exemplary embodiment, the color theme refers to a color scheme formed by the distribution rule of the color components in the reference image. For example, the color theme may be a gradient color theme, a cool or warm color theme, or another type of color theme, which is not particularly limited in this exemplary embodiment.
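One classical way to realize a color mapping toward a reference image in a decorrelated color space is per-channel statistics matching in the style of Reinhard et al. The sketch below is a hedged illustration of that general idea, not the patent's specific mapping; the function name and the plain mean/standard-deviation matching are assumptions.

```python
import numpy as np

def transfer_color_stats(target_lab, reference_lab):
    """Match the per-channel mean and standard deviation of the target
    image to those of the reference image (Reinhard-style transfer).

    Both inputs are float arrays of shape (H, W, 3) in a decorrelated
    color space such as Lab, so each channel can be treated independently."""
    result = np.empty_like(target_lab, dtype=np.float64)
    for c in range(3):
        t = target_lab[..., c].astype(np.float64)
        r = reference_lab[..., c].astype(np.float64)
        t_std = t.std() if t.std() > 1e-8 else 1.0   # guard flat channels
        # Scale the target channel's spread to the reference's,
        # then shift it to the reference mean.
        result[..., c] = (t - t.mean()) * (r.std() / t_std) + r.mean()
    return result
```

After this transfer the output keeps the spatial structure of the target while its channel statistics, and hence its overall color theme, follow the reference.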
Steps S310 to S330 are further described below.
In an exemplary embodiment, after the image to be processed and the reference image are acquired, color space conversion may be performed on both before color migration, so as to make them better suited to the color migration algorithm and to improve the efficiency and output quality of the color migration process. As shown in fig. 4, this may specifically include:
Step S410, determining a first color space corresponding to the image to be processed and a second color space corresponding to the reference image;
step S420, performing color space conversion processing on the to-be-processed image in the first color space and the reference image in the second color space, and generating a to-be-processed image in the target color space and the reference image in the target color space.
Wherein a color space refers to a system or manner in which colors are described digitally by means of corresponding mathematical models.
The first color space refers to the color space of the image to be processed. For example, the first color space may be an RGB color space (a color model composed of a red channel R, a green channel G, and a blue channel B), an HSV color space (a color model composed of Hue, Saturation, and Value), or another mixed color space or intensity/saturation/hue color space, such as the CMY/CMYK or HSI/HSL color spaces, which is not particularly limited in this example embodiment.
The second color space refers to the color space of the reference image and may likewise be an RGB color space, an HSV color space, or another mixed or intensity/saturation/hue color space, such as the CMY/CMYK or HSI/HSL color spaces, which is not particularly limited in this example embodiment.
It should be noted that "first" and "second" merely distinguish the color spaces of the image to be processed and the reference image, and impose no particular limitation. For example, the first color space of the image to be processed and the second color space of the reference image may be the same color space, such as both being RGB, or they may differ, such as the first being RGB and the second being HSV.
The target color space refers to a color space whose channels are mutually decorrelated and which is suitable for the color migration algorithm. For example, the target color space may be the Lab/L*a*b* color space (a color model formed by a luminance channel L and two color channels a and b), the YUV color space (a color model formed by a luminance channel Y and chrominance channels U and V), or another nonlinear luminance/chrominance (Luma/Chroma) color space such as L*u*v*, which is not particularly limited in this example embodiment.
In an exemplary embodiment, the first color space and the second color space may be converted into the same target color space, such as the Lab/L*a*b* color space.
Converting the first and second color spaces of the image to be processed and the reference image, which may be unsuitable for the color migration algorithm, into one shared target color space with mutually decorrelated channels can effectively improve the computational efficiency of the color migration algorithm and the display effect of the output image.
In an exemplary embodiment, the target color space may include a luminance channel feature, a first color channel feature, and a second color channel feature. For example, the target color space may be a Lab color space: the L channel may be the luminance channel, whose feature value is the luminance channel feature; the a channel may be the first color channel, whose feature value is the first color channel feature; and the b channel may be the second color channel, whose feature value is the second color channel feature. Of course, the target color space may also be a YUV color space, in which the luminance channel feature may be the feature value corresponding to the Y channel, the first color channel feature may be the feature value corresponding to the chrominance channel U, and the second color channel feature may be the feature value corresponding to the chrominance channel V, which is not particularly limited in this example embodiment.
Specifically, domain migration filtering processing can be performed on the first color channel feature and the second color channel feature corresponding to the image to be processed based on the luminance channel feature, so as to obtain an image to be processed with consistent global-region color gradation. Domain migration filtering (Domain Transform filtering) is a filtering process that performs parallax optimization on the global region of the image to be processed through a recursive domain transform filter. By applying domain migration filtering to the image to be processed, particularly an image of lower quality, global parallax optimization can be achieved, which ensures the consistency of the color gradation over the global region of the image, improves the image effect of the color-migrated output at the source, and avoids the color block phenomenon.
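The recursive domain transform can be sketched in one dimension as follows (a minimal NumPy sketch of the recursive-filtering variant from Gastal & Oliveira's Domain Transform paper, with illustrative parameter names `sigma_s`/`sigma_r`; it is not the patent's implementation — in practice a library routine such as `cv2.ximgproc.dtFilter` would be applied to the full a and b channel images with L as the guide):

```python
import numpy as np

def domain_transform_rf_1d(src, guide, sigma_s=10.0, sigma_r=0.3, iterations=3):
    """1-D recursive-filter variant of the domain transform.
    'guide' plays the role of the luminance (L) channel; 'src' is a chroma channel."""
    src = src.astype(np.float64).copy()
    guide = guide.astype(np.float64)
    # domain-transform derivative: 1 + (sigma_s / sigma_r) * |I'(x)|
    dt = 1.0 + (sigma_s / sigma_r) * np.abs(np.diff(guide))
    for i in range(iterations):
        # progressively narrower filters, as in the original formulation
        sigma_i = sigma_s * np.sqrt(3.0) * 2 ** (iterations - i - 1) \
            / np.sqrt(4 ** iterations - 1)
        a = np.exp(-np.sqrt(2.0) / sigma_i)
        coef = a ** dt                      # per-edge feedback coefficients
        for x in range(1, src.size):        # left-to-right pass
            src[x] += coef[x - 1] * (src[x - 1] - src[x])
        for x in range(src.size - 2, -1, -1):  # right-to-left pass
            src[x] += coef[x] * (src[x + 1] - src[x])
    return src
```

Applied to a chroma channel with the luminance channel as guide, smoothing is strong where the guide is flat and nearly stops across guide edges, which is what keeps the color gradation consistent without bleeding across boundaries.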
In an exemplary embodiment, guided filtering processing may further be performed on the first color channel feature and the second color channel feature corresponding to the image to be processed based on the luminance channel feature, so as to obtain an image to be processed with consistent local-area color gradation. Guided filtering (Guided Filter) refers to a filtering process that performs parallax optimization near boundaries in the image to be processed through a guided filter. By applying guided filtering to the image to be processed, particularly an image of lower quality, parallax optimization of local boundaries can be achieved, which ensures the consistency of color gradation in the boundary regions of the image and further avoids the color block phenomenon.
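The guided filter itself can be sketched in one dimension with box filters (a minimal NumPy sketch of He et al.'s gray-guide guided filter; the radius `r` and regularization `eps` values are illustrative, and real use would apply it to the 2-D a and b channels with L as the guide):

```python
import numpy as np

def box_filter_1d(x, r):
    """Moving average of radius r, computed via cumulative sums."""
    n = x.size
    c = np.cumsum(np.concatenate(([0.0], x)))
    lo = np.maximum(np.arange(n) - r, 0)
    hi = np.minimum(np.arange(n) + r + 1, n)
    return (c[hi] - c[lo]) / (hi - lo)

def guided_filter_1d(p, I, r=8, eps=1e-3):
    """Gray-guide guided filter: q = a*I + b, with a, b obtained from a
    local linear regression of the input p on the guide I."""
    mean_I = box_filter_1d(I, r)
    mean_p = box_filter_1d(p, r)
    corr_Ip = box_filter_1d(I * p, r)
    corr_II = box_filter_1d(I * I, r)
    var_I = corr_II - mean_I * mean_I
    cov_Ip = corr_Ip - mean_I * mean_p
    a = cov_Ip / (var_I + eps)        # a ~ 0 in flat guide regions (smoothing)
    b = mean_p - a * mean_I
    mean_a = box_filter_1d(a, r)      # average the coefficients over windows
    mean_b = box_filter_1d(b, r)
    return mean_a * I + mean_b
```

Where the guide is flat the output is a local average of the input (strong smoothing); where the guide has an edge the output follows the guide, preserving the boundary.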
Specifically, domain migration filtering can first be performed on the image to be processed to obtain an image with consistent global-region color gradation, and guided filtering can then be performed on that result to obtain an image to be processed whose color gradation is consistent in both the global region and the local boundary regions.
In an exemplary embodiment, the filtering process of the image to be processed may be implemented by the following steps, which may specifically include:
Step S510, obtaining brightness channel characteristics corresponding to the image to be processed, and generating a guide image according to the brightness channel characteristics;
and step S520, performing filtering processing on the first color channel feature and the second color channel feature corresponding to the image to be processed according to the guide image, so as to obtain the image to be processed with consistent color gradation.
The guide image refers to the data used by the guided filter when filtering the image to be processed. For example, the target color space may be a Lab color space, the luminance channel feature corresponding to the image to be processed may be the feature value of the L channel, and domain migration filtering may be performed on the a channel and the b channel of the image to be processed using the L channel as the guide image, so that the color channels of the image to be processed are filtered while parallax in the image is eliminated.
In an exemplary embodiment, color mapping processing may be performed on the image to be processed with consistent color gradation, so as to generate a target image corresponding to the color theme of the reference image; as shown in fig. 6, this may specifically include:
Step S610, calculating a migration matrix of the image to be processed with consistent color gradation relative to the reference image;
Step S620, performing color mapping processing on the image to be processed with consistent color gradation according to the migration matrix, and generating a target image corresponding to the color theme of the reference image.
The migration matrix refers to a conversion matrix for migrating the colors of the reference image into the image to be processed. The color migration method in the present exemplary embodiment may be the MKL color migration method (Monge-Kantorovitch Linear Colour Mapping), a linear color mapping method in which the migration matrix of the image to be processed relative to the reference image is first calculated, and color migration is then performed on the image to be processed through that matrix; of course, other types of color migration methods may also be used.
Further, the migration matrix of the image to be processed relative to the reference image may be calculated by the following steps, as shown in fig. 7, which may specifically include:
Step S710, calculating covariance matrixes corresponding to the images to be processed with consistent color gradation and the reference image;
step S720, carrying out eigenvalue decomposition on the covariance matrix to obtain a target eigenvalue and a target eigenvector;
And step S730, determining a migration matrix of the image to be processed with consistent color gradation relative to the reference image according to the target feature value and the target feature vector.
Wherein the covariance matrix (Covariance Matrix) may be used to represent the probability density of the multi-dimensional random variable, such that the multi-dimensional random variable may be characterized by the covariance matrix. Eigenvalue decomposition (Eigendecomposition) refers to a method of decomposing a matrix into products of its eigenvalues and matrices represented by eigenvectors. The target eigenvalue and the target eigenvector may be eigenvalues and eigenvectors obtained by eigenvalue decomposition of the covariance matrix.
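Steps S710 and S720 can be sketched directly with NumPy (a minimal sketch; `pixels` is an assumed name for the N×3 array of channel values in the target color space):

```python
import numpy as np

def cov_and_eig(pixels):
    """pixels: (N, 3) array of channel values in the target color space.
    Returns the 3x3 covariance matrix (step S710), plus the diagonal matrix of
    target eigenvalues and the orthogonal matrix of target eigenvectors from
    its eigenvalue decomposition (step S720)."""
    C = np.cov(pixels, rowvar=False)   # 3x3 covariance matrix
    w, P = np.linalg.eigh(C)           # eigenvalues w, eigenvectors P
    D = np.diag(w)
    return C, D, P                     # C == P @ D @ P.T, used in step S730
```

For a symmetric covariance matrix, `eigh` returns real eigenvalues and an orthogonal eigenvector matrix, so the factorization C = P D Pᵀ can be plugged into the migration matrix computation of step S730.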
Fig. 8 schematically illustrates a flowchart of generating a target image in an exemplary embodiment of the present disclosure.
Referring to fig. 8, step S810 is performed to obtain an image to be processed, convert a first color space corresponding to the image to be processed into a target color space, and perform domain migration filtering and guided filtering on the image to be processed in the target color space to obtain an image to be processed with consistent color gradation;
Step S820, obtaining a reference image, and converting a second color space corresponding to the reference image into the target color space;
Step S830, respectively calculating covariance matrices for the image to be processed with consistent color gradation and the reference image in the target color space; for example, in the Lab color space, 3×3 covariance matrices over the Lab channels are calculated;
Step S840, respectively carrying out eigenvalue decomposition on covariance matrixes corresponding to the to-be-processed image and the reference image with consistent color gradation to obtain a target eigenvalue and a target eigenvector;
Step S850, determining, according to the target feature value and the target feature vector obtained by the eigenvalue decomposition, a migration matrix of the image to be processed with consistent color gradation relative to the reference image; specifically, the migration matrix may be represented as relational expression group (1):
Wherein, T may represent a migration matrix of the image to be processed with consistent color gradation relative to the reference image, D may represent a diagonal matrix composed of target feature values, P may represent an orthogonal matrix composed of target feature vectors, subscript u may represent the image to be processed with consistent color gradation, and subscript v may represent the reference image;
Step S860, implementing color migration of the image to be processed relative to the reference image according to the migration matrix T, generating the target image, and ending the process; for example, color migration may be implemented through relation (2):
img out = T(img in − μ in) + μ ref (2)
Wherein T may represent the migration matrix of the image to be processed with consistent color gradation relative to the reference image, img out may represent a pixel value of the target image, img in may represent a pixel value of the image to be processed, and μ in and μ ref may represent the mean values of the channels in the target color space of the image to be processed and the reference image, respectively.
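The equation image for relational expression group (1) did not survive extraction. In the standard Monge-Kantorovich linear formulation, with each covariance matrix factored through its eigendecomposition as Σ = P D Pᵀ, the migration matrix has the following form (a reconstruction consistent with the symbols defined in step S850, not the patent's verbatim expression):

```latex
\begin{aligned}
\Sigma_u &= P_u D_u P_u^{\mathsf T}, \qquad
\Sigma_v = P_v D_v P_v^{\mathsf T}, \qquad
\Sigma_u^{1/2} = P_u D_u^{1/2} P_u^{\mathsf T}, \\
T &= \Sigma_u^{-1/2}\left(\Sigma_u^{1/2}\,\Sigma_v\,\Sigma_u^{1/2}\right)^{1/2}\Sigma_u^{-1/2}
\end{aligned}
\tag{1}
```

This mapping satisfies T Σ_u Tᵀ = Σ_v, i.e. after relation (2) the mapped pixels carry the reference image's covariance structure.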
In summary, in the present exemplary embodiment, an image to be processed and a reference image are acquired; the image to be processed is filtered to obtain an image with consistent color gradation; and color mapping processing is performed on that image through the reference image to generate a target image corresponding to the color theme of the reference image. On one hand, filtering the image to be processed before color migration keeps the color gradation of a poor-quality image consistent, effectively improving the robustness of the output target image. On another hand, because color migration is performed on the filtered image with consistent color gradation, the problems of inconsistent color gradation and color blocks after color migration of a poor-quality image can be effectively avoided, improving the accuracy and the display effect of the target image. On yet another hand, the filtering reduces the image quality requirements on the input, broadening the application range of color migration.
A filtering-based color migration scheme is thus provided: before color migration is carried out, the image to be processed is preprocessed through domain migration filtering and guided filtering, so that the target image generation method of the present disclosure does not exhibit the color block problem in its output, regardless of the quality of the image to be processed.
In a related color migration scheme, if the image to be processed suffers from severe JPEG compression artifacts, an obvious color block phenomenon appears in the output target image when color migration is performed through a color migration algorithm. In the present scheme, domain migration filtering and guided filtering are performed on the a channel and the b channel, respectively, guided by the L channel information of the image to be processed in the target color space, such as the Lab color space, thereby eliminating the color block problem. In addition, the domain transform filter and the guided filter consume relatively few computational resources, so computational efficiency can be effectively improved.
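Putting steps S830 to S860 together, the MKL mapping can be sketched as follows (a minimal NumPy sketch under the standard Monge-Kantorovich formulation; the function names are illustrative, and `src`/`ref` stand for the flattened pixel arrays of the filtered image to be processed and the reference image in the target color space):

```python
import numpy as np

def sqrtm_psd(C):
    """Matrix square root of a symmetric positive semidefinite matrix via
    eigendecomposition: C = P diag(w) P^T  ->  C^{1/2} = P diag(sqrt(w)) P^T."""
    w, P = np.linalg.eigh(C)
    return P @ np.diag(np.sqrt(np.clip(w, 0.0, None))) @ P.T

def mkl_color_transfer(src, ref):
    """src, ref: (N, 3) pixel arrays in the target color space (e.g. Lab).
    Returns src mapped so that its mean and covariance match ref."""
    mu_in, mu_ref = src.mean(axis=0), ref.mean(axis=0)
    A, B = np.cov(src, rowvar=False), np.cov(ref, rowvar=False)
    A_half = sqrtm_psd(A)
    A_inv_half = np.linalg.inv(A_half)
    # Monge-Kantorovich linear mapping; by construction T A T^T == B
    T = A_inv_half @ sqrtm_psd(A_half @ B @ A_half) @ A_inv_half
    # relation (2): img_out = T (img_in - mu_in) + mu_ref
    return (src - mu_in) @ T.T + mu_ref
```

By construction the mapped pixels take on the reference image's channel means and covariance, which is exactly the color-theme transfer of relation (2); reshaping the result back to the image dimensions and converting back to the first color space would yield the target image.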
It is noted that the above-described figures are merely schematic illustrations of processes involved in a method according to exemplary embodiments of the present disclosure, and are not intended to be limiting. It will be readily appreciated that the processes shown in the above figures do not indicate or limit the temporal order of these processes. In addition, it is also readily understood that these processes may be performed synchronously or asynchronously, for example, among a plurality of modules.
Further, referring to fig. 9, in this exemplary embodiment, there is further provided a target image generating apparatus 900, including an image acquisition module 910, an image filtering module 920, and an image color migration module 930. Wherein:
the image acquisition module 910 may be configured to acquire an image to be processed and a reference image;
the image filtering module 920 may be configured to perform filtering processing on the image to be processed to obtain an image to be processed with consistent color gradation;
The image color migration module 930 may be configured to perform color mapping processing on the image to be processed with the consistent color gradation through the reference image, to generate a target image corresponding to a color theme of the reference image.
In an exemplary embodiment, the target image generating apparatus 900 further includes a color space converting unit, which may be configured to:
Determining a first color space corresponding to the image to be processed and a second color space corresponding to the reference image;
and performing color space conversion processing on the to-be-processed image in the first color space and the reference image in the second color space to generate the to-be-processed image in the target color space and the reference image in the target color space.
In an exemplary embodiment, the image filtering module 920 further includes a domain migration filtering unit that may be configured to:
and performing domain migration filtering processing on the first color channel characteristic and the second color channel characteristic corresponding to the image to be processed based on the brightness channel characteristic to obtain the image to be processed with consistent global regional color gradation.
In an exemplary embodiment, the image filtering module 920 further includes a guided filtering unit that may be used to:
And carrying out guide filtering processing on the first color channel characteristic and the second color channel characteristic corresponding to the image to be processed based on the brightness channel characteristic to obtain the image to be processed with consistent local area color gradation.
In an exemplary embodiment, the image filtering module 920 may also be configured to:
acquiring brightness channel characteristics corresponding to the image to be processed, and generating a guide image according to the brightness channel characteristics;
And carrying out filtering processing on the first color channel characteristics and the second color channel characteristics corresponding to the image to be processed according to the guide image to obtain the image to be processed with consistent color gradation.
In an exemplary embodiment, the image color migration module 930 further includes:
A migration matrix calculating unit, configured to calculate a migration matrix of the image to be processed with consistent color gradation relative to the reference image;
And the color mapping unit is used for carrying out color mapping processing on the image to be processed with consistent color gradation according to the migration matrix, and generating a target image corresponding to the color theme of the reference image.
In an exemplary embodiment, the migration matrix computing unit may be further configured to:
calculating a covariance matrix corresponding to the image to be processed with consistent color gradation and the reference image;
performing eigenvalue decomposition on the covariance matrix to obtain a target eigenvalue and a target eigenvector;
and determining a migration matrix of the image to be processed with consistent color gradation relative to the reference image according to the target characteristic value and the target characteristic vector.
The specific details of each module in the above apparatus are already described in the method section, and the details that are not disclosed can be referred to the embodiment of the method section, so that they will not be described in detail.
Those skilled in the art will appreciate that the various aspects of the present disclosure may be implemented as a system, method, or program product. Accordingly, various aspects of the disclosure may be embodied in the following forms, namely: an entirely hardware embodiment, an entirely software embodiment (including firmware, micro-code, etc.), or an embodiment combining hardware and software aspects, which may be referred to herein as a "circuit," a "module," or a "system."
Exemplary embodiments of the present disclosure also provide a computer-readable storage medium having stored thereon a program product capable of implementing the method described above in the present specification. In some possible implementations, various aspects of the disclosure may also be implemented in the form of a program product comprising program code for causing a terminal device to carry out the steps according to the various exemplary embodiments of the disclosure as described in the "exemplary methods" section of this specification, when the program product is run on the terminal device, e.g. any one or more of the steps of fig. 3 to 8 may be carried out.
It should be noted that the computer readable medium shown in the present disclosure may be a computer readable signal medium or a computer readable storage medium, or any combination of the two. The computer readable storage medium can be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or a combination of any of the foregoing. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing.
In the context of this disclosure, a computer-readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In the present disclosure, however, the computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, with the computer-readable program code embodied therein. Such a propagated data signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination of the foregoing. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: wireless, wire, fiber optic cable, RF, etc., or any suitable combination of the foregoing.
Furthermore, the program code for carrying out operations of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computing device, partly on the user's device, as a stand-alone software package, partly on the user's computing device, partly on a remote computing device, or entirely on the remote computing device or server. In the case of remote computing devices, the remote computing device may be connected to the user computing device through any kind of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or may be connected to an external computing device (e.g., connected via the Internet using an Internet service provider).
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any adaptations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (9)

1. A target image generation method, characterized by comprising:
Acquiring an image to be processed and a reference image;
Determining a first color space corresponding to the image to be processed and a second color space corresponding to the reference image;
performing color space conversion processing on the image to be processed in the first color space and the reference image in the second color space to generate the image to be processed in the target color space and the reference image in the target color space;
Filtering the image to be processed to obtain an image to be processed with consistent color gradation;
And performing color mapping processing on the image to be processed with consistent color gradation through the reference image, and generating a target image corresponding to the color theme of the reference image.
2. The method of claim 1, wherein the target color space comprises a luminance channel feature, a first color channel feature, and a second color channel feature;
The filtering processing is carried out on the image to be processed to obtain the image to be processed with consistent color gradation, which comprises the following steps:
and performing domain migration filtering processing on the first color channel characteristic and the second color channel characteristic corresponding to the image to be processed based on the brightness channel characteristic to obtain the image to be processed with consistent global regional color gradation.
3. The method according to claim 2, wherein the filtering the image to be processed to obtain the image to be processed with consistent color gradation, further comprises:
And carrying out guide filtering processing on the first color channel characteristic and the second color channel characteristic corresponding to the image to be processed based on the brightness channel characteristic to obtain the image to be processed with consistent local area color gradation.
4. A method according to claim 2 or 3, wherein the filtering the image to be processed to obtain the image to be processed with consistent color gradation comprises:
acquiring brightness channel characteristics corresponding to the image to be processed, and generating a guide image according to the brightness channel characteristics;
And carrying out filtering processing on the first color channel characteristics and the second color channel characteristics corresponding to the image to be processed according to the guide image to obtain the image to be processed with consistent color gradation.
5. The method according to claim 1, wherein performing color mapping processing on the image to be processed with the consistent color gradation by the reference image to generate a target image corresponding to a color theme of the reference image, includes:
calculating a migration matrix of the image to be processed with consistent color gradation relative to the reference image;
And performing color mapping processing on the image to be processed with consistent color gradation according to the migration matrix, and generating a target image corresponding to the color theme of the reference image.
6. The method of claim 5, wherein said calculating a migration matrix of said tonescale consistent image to be processed relative to said reference image comprises:
calculating a covariance matrix corresponding to the image to be processed with consistent color gradation and the reference image;
performing eigenvalue decomposition on the covariance matrix to obtain a target eigenvalue and a target eigenvector;
and determining a migration matrix of the image to be processed with consistent color gradation relative to the reference image according to the target characteristic value and the target characteristic vector.
7. An object image generating apparatus, comprising:
The image acquisition module is used for acquiring an image to be processed and a reference image;
a color space conversion module, configured to determine a first color space corresponding to the image to be processed and a second color space corresponding to the reference image, and to perform color space conversion processing on the image to be processed in the first color space and the reference image in the second color space, to generate the image to be processed in the target color space and the reference image in the target color space;
the image filtering module is used for carrying out filtering treatment on the image to be processed to obtain the image to be processed with consistent color gradation;
And the image color migration module is used for carrying out color mapping processing on the image to be processed with consistent color gradation through the reference image, and generating a target image corresponding to the color theme of the reference image.
8. A computer readable medium, on which a computer program is stored, which computer program, when being executed by a processor, implements the method according to any one of claims 1 to 6.
9. An electronic device, comprising:
A processor; and
A memory for storing executable instructions of the processor;
Wherein the processor is configured to perform the method of any one of claims 1 to 6 via execution of the executable instructions.
CN202110241630.8A 2021-03-04 2021-03-04 Target image generation method and device, computer readable medium and electronic equipment Active CN112967194B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110241630.8A CN112967194B (en) 2021-03-04 2021-03-04 Target image generation method and device, computer readable medium and electronic equipment


Publications (2)

Publication Number Publication Date
CN112967194A CN112967194A (en) 2021-06-15
CN112967194B true CN112967194B (en) 2024-05-14

Family

ID=76276537

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110241630.8A Active CN112967194B (en) 2021-03-04 2021-03-04 Target image generation method and device, computer readable medium and electronic equipment

Country Status (1)

Country Link
CN (1) CN112967194B (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955902A (en) * 2014-05-08 2014-07-30 国网上海市电力公司 Weak light image enhancing method based on Retinex and Reinhard color migration
CN105261046A (en) * 2015-09-23 2016-01-20 北京航空航天大学 Scenario-adaptive tone migration method
WO2019023968A1 (en) * 2017-08-02 2019-02-07 深圳传音通讯有限公司 Image color adjustment system and color adjustment method for smart terminal
CN109816755A (en) * 2019-02-02 2019-05-28 珠海金山网络游戏科技有限公司 A kind of production method of ink image, calculates equipment and storage medium at device
CN110866866A (en) * 2019-11-14 2020-03-06 腾讯科技(深圳)有限公司 Image color-matching processing method and device, electronic device and storage medium
CN111563517A (en) * 2020-04-20 2020-08-21 腾讯科技(深圳)有限公司 Image processing method, image processing device, electronic equipment and storage medium


Also Published As

Publication number Publication date
CN112967194A (en) 2021-06-15

Similar Documents

Publication Publication Date Title
CN111598776B (en) Image processing method, image processing device, storage medium and electronic apparatus
CN111179282B (en) Image processing method, image processing device, storage medium and electronic apparatus
CN110069974B (en) Highlight image processing method and device and electronic equipment
CN109255774B (en) Image fusion method, device and equipment
CN112562019A (en) Image color adjusting method and device, computer readable medium and electronic equipment
CN112967193B (en) Image calibration method and device, computer readable medium and electronic equipment
CN112887582A (en) Image color processing method and device and related equipment
CN111696039B (en) Image processing method and device, storage medium and electronic equipment
CN111866483B (en) Color restoration method and device, computer readable medium and electronic device
WO2023284401A1 (en) Image beautification processing method and apparatus, storage medium, and electronic device
CN111062993A (en) Color-merged drawing image processing method, device, equipment and storage medium
WO2024027287A9 (en) Image processing system and method, and computer-readable medium and electronic device
CN113132696A (en) Image tone mapping method, device, electronic equipment and storage medium
CN113610720A (en) Video denoising method and device, computer readable medium and electronic device
CN113455013A (en) Electronic device for processing image and image processing method thereof
CN113902636A (en) Image deblurring method and device, computer readable medium and electronic equipment
CN113112422A (en) Image processing method, image processing device, electronic equipment and computer readable medium
CN113658065A (en) Image noise reduction method and device, computer readable medium and electronic equipment
CN112489144A (en) Image processing method, image processing apparatus, terminal device, and storage medium
CN112967194B (en) Target image generation method and device, computer readable medium and electronic equipment
CN115546858B (en) Face image processing method and electronic equipment
US20210067690A1 (en) Electronic device and method for processing image by electronic device
CN115187487A (en) Image processing method and device, electronic device and storage medium
CN115205159A (en) Image processing method and device, electronic device and storage medium
CN114119413A (en) Image processing method and device, readable medium and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant