CN111311500A - Method and device for carrying out color restoration on image - Google Patents

Method and device for carrying out color restoration on image

Info

Publication number
CN111311500A
CN111311500A (application CN201811517272.3A, also referenced as CN201811517272A)
Authority
CN
China
Prior art keywords
image
connected region
region
pixels
saturation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201811517272.3A
Other languages
Chinese (zh)
Inventor
孙超伟
竺旭东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Huawei Technologies Co Ltd
Original Assignee
Huawei Technologies Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Huawei Technologies Co Ltd filed Critical Huawei Technologies Co Ltd
Priority: CN201811517272.3A
Priority: PCT/CN2019/121126 (published as WO2020119454A1)
Publication: CN111311500A
Legal status: Pending

Classifications

    • G06T5/77: Image enhancement or restoration; retouching, inpainting, scratch removal
    • G06T5/90: Image enhancement or restoration; dynamic range modification of images or parts thereof
    • G06T7/11: Image analysis; region-based segmentation
    • G06T7/136: Image analysis; segmentation or edge detection involving thresholding
    • G06T7/187: Image analysis; segmentation involving region growing, region merging, or connected component labelling
    • H04N1/60: Colour picture communication; processing of colour picture signals; colour correction or control
    • G06T2207/10016: Image acquisition modality; video, image sequence
    • G06T2207/10024: Image acquisition modality; colour image
    • G06T2207/30232: Subject of image; surveillance
    • (The G06T codes fall under G PHYSICS, G06 COMPUTING, G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL; H04N1/60 falls under H ELECTRICITY, H04N PICTORIAL COMMUNICATION, e.g. TELEVISION.)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Image Processing (AREA)

Abstract

The embodiments of the present application provide a method and a device for performing color restoration on an image. They relate to the field of image processing and can accurately restore the color of an overexposed object (such as a traffic light) in an image. The method comprises the following steps: acquiring a first image to be processed, where the first image contains an overexposed first target object; determining a first region of the first image according to the saturation and brightness of the pixels of the first image; binarizing the first region to obtain a binary image corresponding to the first region; determining, in the binary image, at least one connected region whose area is greater than or equal to a first threshold, where the contour of the at least one connected region corresponds to the contour of the first target object; and restoring the color of the at least one connected region. The embodiments of the present application apply to scenarios in which color restoration is performed on overexposed images.

Description

Method and device for carrying out color restoration on image
Technical Field
The present application relates to the field of image processing, and in particular, to a method and an apparatus for performing color restoration on an image.
Background
Image overexposure is a fairly common phenomenon that can cause a series of problems. For example, in a traffic video surveillance system, a camera working in electronic-police mode captures the red light signal and the offending vehicle in the same snapshot, and the snapshot serves as evidence of the violation. However, when the ambient illumination is low (for example, at dusk or at night), the camera often needs to increase its exposure time, gain, and aperture size to photograph the red signal light. If the gain, exposure time, or aperture is increased too much, the captured traffic light may be overexposed (for example, a red signal light may appear yellow or white), so the captured image cannot serve as evidence of the violation.
At present, several methods exist for restoring the color of signal lights; they can be divided into hardware methods and software methods. Hardware method: a camera with an ultra-wide dynamic range can be used to eliminate the signal-light color distortion caused by strong light, because such a camera can detect a large range of brightness and can restore image details in scenes with high brightness contrast. However, ultra-wide-dynamic cameras are expensive, and because the signal-light area occupies a very small proportion of the whole picture, the color-restoration effect of such a camera on an overexposed signal light under low illumination is limited. Software method 1: the signal-light region can be identified, and its color restored, in the red-green-blue (RGB) color space of the image. However, the RGB color space does not reflect the luminance information of the signal light well, which may leave a visible discontinuity (a gradient artifact) between the color-restored signal-light area and the surrounding image. Software method 2: the signal light in the image can be located and color-enhanced with a deep-learning method. However, deep-learning methods do not recognize the contour of the traffic light very accurately, so the color-restored region is again likely to be discontinuous with the surrounding image.
Therefore, a method for more accurately performing color restoration on an overexposed object in an image (e.g., an overexposed traffic light) is needed.
Disclosure of Invention
The embodiments of the present application provide an image processing method that can accurately restore the color of an overexposed object (such as a traffic light) in an image. This in turn addresses a series of problems caused by overexposed objects in images, for example the difficulty of collecting evidence when a signal light is overexposed in current electronic-police surveillance scenarios.
In a first aspect, an embodiment of the present application provides a method for performing color restoration on an image, comprising: acquiring a first image to be processed, where the first image contains an overexposed first target object; determining a first region of the first image from the saturation and brightness of the pixels of the first image, where the saturation of the pixels of the first region is lower than the average saturation of the pixels of the first image and the brightness of the pixels of the first region is higher than the average brightness of the pixels of the first image, the first region corresponding to the region where the first target object is located; binarizing the first region to obtain a binary image corresponding to the first region; determining, in the binary image, at least one connected region whose area is greater than or equal to a first threshold, where the contour of the at least one connected region corresponds to the contour of the first target object; and restoring the color of the at least one connected region.
With the method provided by the embodiments of the present application, after the first image to be processed is obtained, the first region of the first image (which can be treated as the region where the overexposed first target object is located) is determined from the saturation and brightness of the pixels of the first image. The first region is then binarized to obtain a corresponding binary image; at least one connected region whose area is greater than or equal to the first threshold is determined in the binary image (the contour of the at least one connected region corresponds to the contour of the first target object); and the color of the at least one connected region is restored, i.e., the color of the overexposed first target object is restored. The method can therefore accurately restore the color of an overexposed object in an image (the first target object, which may be, for example, a traffic light), and thus address the problems caused by overexposure, such as the difficulty of collecting evidence when a signal light is overexposed in electronic-police surveillance.
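As a concrete illustration, the steps summarized above (saturation/brightness thresholding, binarization, and area-filtered connected-component labelling) can be sketched in pure Python. The helper name, the use of the per-image mean saturation and brightness as thresholds, and the 4-connectivity are illustrative assumptions, not the patent's exact parameters:

```python
import colorsys

def find_overexposed_regions(pixels, w, h, min_area):
    """Hypothetical sketch: pixels is a flat, row-major list of (r, g, b)
    tuples in 0..255; returns connected regions of the binary mask whose
    area is >= min_area (candidate overexposed objects)."""
    hsv = [colorsys.rgb_to_hsv(r / 255, g / 255, b / 255) for r, g, b in pixels]
    mean_s = sum(p[1] for p in hsv) / len(hsv)
    mean_v = sum(p[2] for p in hsv) / len(hsv)
    # "First region": saturation below the image mean AND brightness above it,
    # stored directly as a binary image (1 = candidate overexposed pixel).
    binary = [1 if s < mean_s and v > mean_v else 0 for _, s, v in hsv]
    # 4-connected component labelling with an area filter.
    seen = [False] * len(binary)
    regions = []
    for start in range(len(binary)):
        if binary[start] and not seen[start]:
            seen[start] = True
            stack, comp = [start], []
            while stack:
                i = stack.pop()
                comp.append(i)
                x, y = i % w, i // w
                for nx, ny in ((x - 1, y), (x + 1, y), (x, y - 1), (x, y + 1)):
                    if 0 <= nx < w and 0 <= ny < h:
                        j = ny * w + nx
                        if binary[j] and not seen[j]:
                            seen[j] = True
                            stack.append(j)
            if len(comp) >= min_area:
                regions.append(comp)
    return regions
```

On a toy 4x4 image with a washed-out 2x2 patch in one corner and saturated red pixels elsewhere, the sketch returns that patch as the single surviving connected region.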
In one possible implementation, before the first region of the first image is determined from the saturation and brightness of the first image, the method further includes: converting the first image from a first space to the hue-saturation-value (HSV) space, where the first space is any one of the luminance-chrominance (YUV) space, the RGB space, or the hue-saturation-lightness (HSL) space. That is, if the first image to be processed is in a YUV, RGB, or HSL space, it must first be converted to the HSV space in order to obtain the saturation and brightness components of its pixels.
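For a single pixel, this space conversion can be illustrated with Python's standard-library colorsys module (a real implementation would convert the whole image at once, e.g., with OpenCV's cvtColor). The example also shows why HSV is convenient here: overexposure shows up as low S together with high V:

```python
import colorsys

# A fully saturated red pixel (RGB components in 0..1)
h, s, v = colorsys.rgb_to_hsv(1.0, 0.0, 0.0)

# An overexposed, "washed out" red pixel: the hue is still red,
# but the saturation has collapsed while the value stays near maximum
h2, s2, v2 = colorsys.rgb_to_hsv(1.0, 0.9, 0.9)
```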
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: acquiring the color information of the traffic light; when the traffic light is red, adjusting the hue of the pixels of the at least one connected region into the red range, increasing their saturation, and decreasing their brightness; when the traffic light is yellow, adjusting the hue of the pixels of the at least one connected region into the yellow range, increasing their saturation, and decreasing their brightness; and when the traffic light is green, adjusting the hue of the pixels of the at least one connected region into the green range, increasing their saturation, and decreasing their brightness.
In the embodiments of the present application, the increase in saturation and the decrease in brightness of the pixels of the at least one connected region may each be either linear or non-linear.
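A minimal sketch of the HSV-side adjustment for a red light, using the standard-library colorsys module. The helper name and the gain/scale factors are illustrative assumptions for the linear variant described above, not values from the patent:

```python
import colorsys

def restore_red_hsv(pixels, s_gain=2.0, v_scale=0.6):
    """Hypothetical sketch: pull the hue into the red range, linearly
    raise the saturation, and linearly lower the brightness.
    pixels: list of (r, g, b) tuples with components in 0..1."""
    out = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r, g, b)
        h = 0.0                    # red sits at hue 0 degrees
        s = min(1.0, s * s_gain)   # linear saturation boost, clamped to 1
        v = v * v_scale            # linear brightness reduction
        out.append(colorsys.hsv_to_rgb(h, s, v))
    return out
```

Applied to a washed-out red pixel such as (1.0, 0.9, 0.9), the red channel becomes dominant again while the overall brightness drops.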
In one possible implementation, when the first target object is a traffic light, restoring the color of the at least one connected region includes: converting the at least one connected region to the RGB space; acquiring the color information of the traffic light; when the traffic light is red, adjusting the red component of the pixels of the at least one connected region into a first preset range and reducing their blue and green components; when the traffic light is yellow, adjusting the red and green components of the pixels of the at least one connected region into a second preset range and reducing their blue component; and when the traffic light is green, adjusting the green component of the pixels of the at least one connected region into a third preset range and reducing their red and blue components.
The reduction of the blue and green components of the pixels of the at least one connected region may be linear or non-linear.
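For the RGB-side variant, a minimal sketch of the red-light case. The target value 230 and the linear suppression factor 0.4 are invented for illustration; the patent does not specify its preset ranges:

```python
def restore_red_rgb(pixels, red_target=230, suppress=0.4):
    """Hypothetical sketch for a red light: push R into a preset range and
    linearly reduce G and B. pixels: list of (r, g, b) ints in 0..255."""
    out = []
    for r, g, b in pixels:
        r = max(r, red_target)   # bring the red component into the preset range
        g = int(g * suppress)    # linear reduction of the green component
        b = int(b * suppress)    # linear reduction of the blue component
        out.append((r, g, b))
    return out
```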
In one possible implementation, before the first region of the first image is determined from the saturation and brightness of the first image, the method further includes: filtering the saturation and brightness components of the pixels of the first image. In this way, the contour of the overexposed first target object can be identified more stably and accurately.
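The filtering step is not specified further. As one plausible choice, a 3x3 mean (box) filter applied to a single flat channel looks like this in pure Python; a real system might instead use a median or Gaussian filter:

```python
def mean_filter(channel, w, h):
    """3x3 mean (box) filter on one flat, row-major channel; an
    illustrative stand-in for smoothing the S and V components."""
    out = [0.0] * (w * h)
    for y in range(h):
        for x in range(w):
            acc, n = 0.0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    nx, ny = x + dx, y + dy
                    if 0 <= nx < w and 0 <= ny < h:
                        acc += channel[ny * w + nx]
                        n += 1
            out[y * w + x] = acc / n   # average over the in-bounds window
    return out
```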
In one possible implementation, acquiring the first image to be processed includes: taking a second region selected by the user on a second image as the first image to be processed.
Compared with processing the second image directly, selecting the first image from the second image and processing only the first image reduces the amount of computation in the subsequent steps and improves the accuracy with which the region containing the first target object (e.g., a signal light) is identified.
In a second aspect, an embodiment of the present application provides an image processing apparatus, comprising: an acquisition unit configured to acquire a first image to be processed, the first image containing an overexposed first target object; a determination unit configured to determine a first region of the first image based on the saturation and brightness of the pixels of the first image, the saturation of the pixels of the first region being lower than the average saturation of the pixels of the first image and the brightness of the pixels of the first region being higher than the average brightness of the pixels of the first image, the first region corresponding to the region where the first target object is located; and a processing unit configured to binarize the first region to obtain a binary image corresponding to the first region. The determination unit is further configured to determine, in the binary image, at least one connected region whose area is greater than or equal to a first threshold, the contour of the first target object corresponding to the contour of the at least one connected region; and the processing unit is further configured to restore the color of the at least one connected region.
In one possible implementation, the processing unit is further configured to: converting the first image from a first space to HSV space, wherein the first space is any one of YUV space, RGB space or HSL space.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to: acquire the color information of the traffic light through the acquisition unit; when the traffic light is red, adjust the hue of the pixels of the at least one connected region into the red range, increase their saturation, and decrease their brightness; when the traffic light is yellow, adjust the hue of the pixels of the at least one connected region into the yellow range, increase their saturation, and decrease their brightness; and when the traffic light is green, adjust the hue of the pixels of the at least one connected region into the green range, increase their saturation, and decrease their brightness.
In one possible implementation, when the first target object is a traffic light, the processing unit is configured to: convert the at least one connected region to the RGB space; acquire the color information of the traffic light through the acquisition unit; when the traffic light is red, adjust the red component of the pixels of the at least one connected region into a first preset range and reduce their blue and green components; when the traffic light is yellow, adjust the red and green components of the pixels of the at least one connected region into a second preset range and reduce their blue component; and when the traffic light is green, adjust the green component of the pixels of the at least one connected region into a third preset range and reduce their red and blue components.
In one possible implementation, the processing unit is further configured to: the saturation component and the brightness component of the pixels of the first image are subjected to a filtering process.
In one possible implementation, the acquisition unit is configured to take a second region selected by the user on a second image as the first image to be processed.
For the technical effects of the second aspect and its various possible implementations, reference may be made to those of the first aspect and its corresponding implementations; they are not repeated here.
In a third aspect, the present application provides an apparatus that exists in the form of a chip product. The apparatus includes a processor and a memory; the memory is coupled to the processor and stores the program instructions and data necessary for the apparatus, and the processor is configured to execute the program instructions stored in the memory so that the apparatus performs the functions of the image processing device in the above method.
In a fourth aspect, an embodiment of the present application provides an image processing device that can implement the functions performed by the image processing device in the above method. The functions may be implemented by hardware, or by hardware executing corresponding software; the hardware or software includes one or more modules corresponding to the above functions.
In one possible design, the image processing device includes a processor and a communication interface, and the processor is configured to support the image processing device to execute the corresponding functions of the method. The communication interface is used to support communication between the image processing device and other devices, such as signal light detectors on traffic lights. The image processing device may also include a memory for coupling with the processor that retains program instructions and data necessary for the image processing device.
In a fifth aspect, embodiments of the present application provide a computer-readable storage medium, which includes instructions that, when executed on a computer, cause the computer to perform any one of the methods provided in the first aspect.
In a sixth aspect, embodiments of the present application provide a computer program product containing instructions, which when run on a computer, cause the computer to perform any one of the methods provided in the first aspect.
Drawings
Fig. 1 is a schematic system architecture diagram of a method for color restoration of an image according to an embodiment of the present disclosure;
fig. 2 is a schematic structural diagram of an image processing apparatus according to an embodiment of the present application;
fig. 3 is a schematic flowchart of a method for color restoration of an image according to an embodiment of the present disclosure;
fig. 4 is a schematic diagram illustrating a user selecting a first image to be processed according to an embodiment of the present application;
FIG. 5 is a schematic view of a first region provided in an embodiment of the present application;
fig. 6 is a schematic diagram of a binary image corresponding to a first region according to an embodiment of the present disclosure;
FIG. 7 is a schematic view of a connected region provided in an embodiment of the present application;
fig. 8 is a schematic diagram of a first region, a binary image corresponding to the first region, and a connected region determined according to the binary image, according to an embodiment of the present application;
fig. 9 is a schematic structural diagram of another image processing apparatus according to an embodiment of the present application.
Detailed Description
The embodiments of the present application provide a method and a device for performing color restoration on an image, applied to scenarios in which an overexposed image is color-restored. It can be understood that the embodiments can also be applied to scenarios in which a video containing one or more overexposed frames is color-restored. In particular, the method can be applied to the color restoration of an overexposed object (or overexposed area) in an image, for example an overexposed traffic light in an image or video captured by an electronic-police camera or monitor.
Taking the scenario of a violation image captured by an electronic-police camera as an example, fig. 1 shows a system architecture suitable for the method of performing color restoration on such an image. It includes the electronic-police camera and an image processing device connected to it (the image processing device may also be integrated into the camera); the image processing device performs color restoration on the images or videos captured by the camera. The image processing device may also be connected to a traffic light in order to obtain the (historical) color changes of the signal from a signal detector on the traffic light.
The image processing device in fig. 1 may be implemented as one device or as a functional module within a device; this is not specifically limited in the embodiments of the present application. The above functions may be a network element in a hardware device, a software function running on dedicated hardware, a virtualized function instantiated on a platform (e.g., a cloud platform), or a system-on-chip. In the embodiments of the present application, the chip system may consist of a chip alone, or of a chip together with other discrete devices.
For example, the apparatus 200 in fig. 2 may be used to implement the functions of the image processing device provided in the embodiment of the present application. Fig. 2 is a schematic hardware structure diagram of an apparatus 200 according to an embodiment of the present disclosure. The apparatus 200 includes at least one processor 201 therein, which is used to implement the functions of the image processing device provided in the embodiments of the present application. Also included in the apparatus 200 is a bus 202 and at least one communication interface 204. Memory 203 may also be included in the apparatus 200.
In the embodiment of the present application, the processor may be a Central Processing Unit (CPU), a general purpose processor, a Network Processor (NP), a Digital Signal Processor (DSP), a microprocessor, a microcontroller, a Programmable Logic Device (PLD), or any combination thereof. The processor may also be any other means having a processing function such as a circuit, device or software module.
Bus 202 may be used to transfer information between the above components.
A communication interface 204 for communicating with other devices or communication networks, such as ethernet, Radio Access Network (RAN), Wireless Local Area Network (WLAN), etc. The communication interface 204 may be an interface, a circuit, a transceiver, or other device capable of implementing communication, and is not limited in this application. The communication interface 204 may be coupled with the processor 201. The coupling in the embodiments of the present application is an indirect coupling or a communication connection between devices, units or modules, and may be an electrical, mechanical or other form for information interaction between the devices, units or modules.
In the embodiments of the present application, the memory may be a read-only memory (ROM) or other types of static storage devices that can store static information and instructions, a Random Access Memory (RAM) or other types of dynamic storage devices that can store information and instructions, an electrically erasable programmable read-only memory (EEPROM), a compact disc read-only memory (CD-ROM) or other optical disc storage, optical disc storage (including compact disc, laser disc, optical disc, digital versatile disc, blu-ray disc, etc.), magnetic disk storage media or other magnetic storage devices, or any other medium that can be used to carry or store desired program code in the form of instructions or data structures and that can be accessed by a computer, but is not limited to such. The memory may be stand alone or may be coupled to the processor, such as via bus 202. The memory may also be integral to the processor.
The memory 203 is used for storing program instructions and can be controlled by the processor 201 to execute, so as to implement the method for color restoration of an image provided by the following embodiments of the present application. The processor 201 is configured to call and execute the instructions stored in the memory 203, so as to implement the method for color restoration of an image according to the following embodiments of the present application.
Optionally, the computer-executable instructions in the embodiments of the present application may also be referred to as application program codes, which are not specifically limited in the embodiments of the present application.
Optionally, the memory 203 may be included in the processor 201.
In particular implementations, processor 201 may include one or more CPUs such as CPU0 and CPU1 in fig. 2, for example, as one embodiment.
In particular implementations, apparatus 200 may include multiple processors, such as processor 201 and processor 207 in FIG. 2, for example, as an example. Each of these processors may be a single-core (single-CPU) processor or a multi-core (multi-CPU) processor. A processor herein may refer to one or more devices, circuits, and/or processing cores for processing data (e.g., computer program instructions).
In one implementation, the apparatus 200 may further include an output device 205 and an input device 206, as an example. An output device 205 is coupled to the processor 201 and may display information in a variety of ways. For example, the output device 205 may be a Liquid Crystal Display (LCD), a Light Emitting Diode (LED) display device, a Cathode Ray Tube (CRT) display device, a projector (projector), or the like. The input device 206 is coupled to the processor 201 and may receive user input in a variety of ways. For example, the input device 206 may be a camera, a mouse, a keyboard, a touch screen device, a sensing device, or the like.
The apparatus 200 may be a general-purpose device or a special-purpose device. In a specific implementation, the image processing device 200 may be a video camera, a monitor, a video display device, a desktop computer, a laptop computer, a web server, a Personal Digital Assistant (PDA), a mobile phone, a tablet computer, a wireless terminal device, an embedded device, or a device with a similar structure as in fig. 2. The embodiment of the present application does not limit the type of the apparatus 200.
For clarity and conciseness of the following description of the various embodiments, a brief introduction to related concepts or technologies is first presented:
color space: color is usually described by three independent attributes, and three independent variables are combined to form a space coordinate, namely a color space. The color space may include RGB (color) space, YUV (color) space, HSV (color) space, HSL (color) space, etc., and different color spaces may measure the color of the same object from different angles. In different processing procedures, the emphasis on color processing is different, so that various color spaces can be mutually converted to meet different processing requirements. Wherein:
RGB space: r represents a red component (red channel), G represents a green component (green channel), and B represents a blue component (blue channel). The colors are obtained by variation of the three color channels of red, green and blue and their superposition with each other. In each component, the smaller the value, the lower the luminance, and the larger the value, the higher the luminance. When the components are mixed, the mixed luminance is equal to the sum of the luminances of the components.
YUV space: y represents brightness, i.e., a gray scale value. U and V represent chroma, which is used to describe the color and saturation of the image, and may specify the color of the pixel. If there is only a Y signal component and no U, V component, then the image so represented is a black and white grayscale image. The YUV space is mainly used to optimize the transmission of color video signals to be backward compatible with legacy black and white televisions.
HSV space: HSV is a method of representing points in the RGB color space in an inverted cone. Wherein, H represents the color information, namely the position of the spectral color, and can be measured by angle, and the value range is 0-360 degrees. Red is 0 °, green is 120 °, and blue is 240 °. S is expressed as the ratio between the saturation of the selected color and the maximum saturation of that color, ranging from 0 to 1. When S is 0, only the gray scale is present. V represents the brightness of the color, ranging from 0 to 1.
HSL space: HSL is similar to HSV, and the first two parameters of this model are the same as in HSV. L represents the lightness of a color and can be used to control how light or dark the color is. L ranges from 0 to 100%: the smaller the value, the darker the color and the closer to black; the larger the value, the brighter the color and the closer to white.
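As an illustration of the mutual conversion between color spaces described above, the Python standard library's `colorsys` module converts between normalized RGB and HSV; this is a minimal sketch for understanding, not part of the claimed method:

```python
import colorsys

# A pure red pixel, with R, G, B normalized to the range 0-1.
r, g, b = 1.0, 0.0, 0.0

# RGB -> HSV: red sits at 0 degrees on the hue circle (colorsys returns
# hue as a fraction of a full turn), with full saturation and full value.
h, s, v = colorsys.rgb_to_hsv(r, g, b)

# HSV -> RGB: the conversion round-trips back to the original pixel.
r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
```

Multiplying `h` by 360 gives the hue angle used in the description (red = 0°, green = 120°, blue = 240°).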
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. In the description of the present application, unless otherwise specified, "at least one" means one or more, "a plurality" means two or more. In addition, in order to facilitate clear description of technical solutions of the embodiments of the present application, in the embodiments of the present application, terms such as "first" and "second" are used to distinguish the same items or similar items having substantially the same functions and actions. Those skilled in the art will appreciate that the terms "first," "second," etc. do not denote any order or quantity, nor do the terms "first," "second," etc. denote any order or importance.
For convenience of understanding, the method for performing color restoration on an image provided by the embodiments of the present application is specifically described below with reference to the accompanying drawings.
As shown in fig. 3, an embodiment of the present application provides a method for performing color restoration on an image, including:
301. a first image to be processed is acquired, wherein the first image comprises an overexposed first target object.
The first target object refers to one or more objects (or areas where the objects are located) that are overexposed in the first image. The object may be, for example, a traffic light (of various shapes and colors), a vehicle, a traffic sign, etc.
In one possible design, the second area framed by the user on the second image may be used as the first image to be processed. For example, as shown in fig. 4, when the user considers that a traffic signal (a first target object) in a certain image (a second image) is overexposed, the user may perform a first operation on the second image through an input device (e.g., a mouse or a touch screen) of the image processing device, where the first operation is used to frame the area (a second area) where the overexposed traffic signal is located. After the image processing device identifies the first operation of the user on the second image, it can cut the second area from the second image according to the coordinate information of the second area framed by the user, obtaining a third image and the first image to be processed. The first image to be processed comprises the second area, and the third image comprises the remaining areas other than the second area. That is, the first image to be processed includes the second area selected by the user on the second image.
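The split of the second image into the framed first image and the remaining third image can be sketched as follows; a rectangular selection and NumPy arrays are assumptions here, as the embodiment does not mandate either:

```python
import numpy as np

def split_by_selection(second_image, top, left, height, width):
    """Cut the user-framed rectangle (second area) out of the second image.

    Returns the cropped first image and a copy of the second image with
    the cropped region zeroed out, standing in for the third image.
    """
    first_image = second_image[top:top + height, left:left + width].copy()
    third_image = second_image.copy()
    third_image[top:top + height, left:left + width] = 0
    return first_image, third_image

# Example: a 10x10 grayscale image, with a 4x5 selection at row 2, col 3.
img = np.arange(100).reshape(10, 10)
first, third = split_by_selection(img, top=2, left=3, height=4, width=5)
```

The same slicing works unchanged for H×W×3 color arrays, since the channel axis is untouched.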
In this way, compared with processing the second image directly, selecting the first image from the second image and processing only the first image reduces the amount of calculation in the subsequent steps on the one hand, and improves the accuracy of identifying the area where the first target object (for example, a traffic signal lamp) is located on the other hand.
302. Optionally, the saturation component and the brightness component of the pixel of the first image are subjected to filtering processing.
It should be noted that, if the second image is an image in HSV format, that is, the second image is in the HSV space, then the first image to be processed is also in the HSV space, and the saturation component and the brightness component of the pixels of the first image may be filtered directly, so that the contour of the overexposed first target object (for example, the overexposed traffic signal lamp) can be identified more stably and accurately in the subsequent steps.
If the second image is in a first space, which may be the YUV space, the RGB space, or the HSL space, the first image to be processed is also in the first space. In this case, in order to obtain the saturation component and the brightness component of the pixels of the first image, the first image needs to be converted from the first space to the HSV space. The saturation component and the brightness component of the pixels of the first image are then filtered, so that the contour of the overexposed first target object can be identified more stably and accurately in the subsequent steps.
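The embodiment does not fix a particular filter for the saturation and brightness components; one common choice, shown here as a hedged sketch, is a 3×3 median filter that suppresses isolated noise before contour detection:

```python
import numpy as np

def median_filter_3x3(channel):
    """Apply a 3x3 median filter (edge-padded) to one channel, e.g. the
    S or V component of the first image. The choice of a median filter
    is an illustrative assumption, not mandated by the embodiment."""
    h, w = channel.shape
    padded = np.pad(channel, 1, mode="edge")
    # Stack the nine shifted views of the image and take the per-pixel median.
    windows = np.stack(
        [padded[i:i + h, j:j + w] for i in range(3) for j in range(3)],
        axis=0,
    )
    return np.median(windows, axis=0)

# A single-pixel brightness spike is removed by the filter.
v = np.zeros((5, 5))
v[2, 2] = 1.0
v_filtered = median_filter_3x3(v)
```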
303. A first region of the first image is determined from the saturation and brightness of the pixels of the first image.
In the embodiment of the present application, the first area corresponds to the area where the first target object is located. The region in the first image where the overexposed first target object (e.g., the overexposed traffic signal light) is located may be determined from the saturation and the brightness of the pixels of the first image.
In one possible design, the saturation component and the brightness component of the pixel of the first image after the filtering process may be analyzed, and an area where the saturation component is lower than the average value of the saturation of the pixel of the first image and the brightness component is higher than the average value of the brightness of the pixel of the first image is determined as the area where the first target object is located. I.e. the saturation of the pixels of the first area is lower than the average of the saturation of the pixels of the first image and the brightness of the pixels of the first area is higher than the average of the brightness of the pixels of the first image. Fig. 5 is a schematic diagram of a first region.
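The rule in this design — saturation below the image mean and brightness above the image mean — can be sketched as a boolean mask over the S and V channels (assuming NumPy arrays; any consistent value range works):

```python
import numpy as np

def first_region_mask(s, v):
    """Mark pixels whose saturation is below the mean saturation of the
    first image AND whose brightness is above the mean brightness of the
    first image, i.e. the candidate overexposed first region."""
    return (s < s.mean()) & (v > v.mean())

# One desaturated, bright pixel among saturated, dark ones.
s = np.array([[0.1, 0.9], [0.9, 0.9]])
v = np.array([[0.9, 0.1], [0.1, 0.1]])
mask = first_region_mask(s, v)
```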
304. And carrying out binarization processing on the first area to obtain a binary image corresponding to the first area.
For example, as shown in fig. 6, after the first region shown in fig. 5 is subjected to binarization processing, a binary image corresponding to the first region can be obtained. The binary image corresponding to the first region may include three connected regions (a, b, and c, respectively). The connected area a is the area where the signal lamp is located, and the connected areas b and c may be the areas where the halo generated by the signal lamp is located.
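A minimal binarization sketch follows; using the region's mean value as the threshold is an assumption for illustration, since the embodiment does not specify the thresholding rule:

```python
import numpy as np

def binarize(region, threshold=None):
    """Binarize the first region: pixels above the threshold become 255,
    the rest become 0, yielding the binary image of step 304. The default
    mean-value threshold is an illustrative assumption."""
    if threshold is None:
        threshold = region.mean()
    return np.where(region > threshold, 255, 0).astype(np.uint8)

region = np.array([[10, 200], [30, 250]])
binary = binarize(region)
```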
305. At least one connected region in the binary image having an area greater than or equal to a first threshold is determined.
In one possible design, the contour of at least one connected region in the binary image whose area is greater than or equal to the first threshold corresponds to the contour of the first target object. The contour of at least one connected region in the binary image having an area greater than or equal to the first threshold may be used as the contour of the first target object. It should be noted that the outline of the at least one connected region having an area greater than or equal to the first threshold may be larger or smaller than the outline of the actual (real) first target object. For example, when the first target object is a signal lamp, since the (lit) signal lamp may generate a halo, the at least one connected region may include not only the region where the signal lamp itself is located but also the region where the halo generated by the signal lamp is located, and therefore the outline of the at least one connected region having an area greater than or equal to the first threshold may be larger than the outline of the actual signal lamp. For another example, if a portion of the signal lamp is damaged and cannot emit light, the at least one connected region may include only the area where the signal lamp emits light, and in this case the outline of the at least one connected region is smaller than the outline of the actual signal lamp.
Wherein the first threshold value may be determined according to an area of a connected region having a largest area in the binary image. For example, the first threshold may be equal to the area of the connected region having the largest area in the binary image, or the first threshold may be N% (e.g., 30%) of the area of the connected region having the largest area in the binary image, where N is a positive number.
A plurality of connected regions with different areas may exist in the binary image corresponding to the first region. For example, as shown in fig. 6, the binary image corresponding to the first region may include three connected regions (a, b, and c, respectively). All connected regions in the binary image are searched based on the binary image, and all the connected regions found are sorted by area (for example, in order of area from large to small). Among the connected regions of different sizes, the connected regions with smaller areas may be filtered out, the connected regions with larger areas retained, and at least one connected region with a larger area taken as the region of the first target object (for example, a signal lamp) (for example, the connected region with the largest area may be taken as the region where the signal lamp is located, or a plurality of connected regions with larger areas may be taken as the regions where the signal lamp is located).
For example, the connected region with the largest area in the binary image may be found; assuming that its area is max, the first threshold may be set to max/(x + 2), so as to filter out the connected regions with an area smaller than max/(x + 2). For example, as shown in fig. 7, connected regions b and c may be filtered out, leaving connected region a. The outline of the connected region a corresponds to the outline of the first target object.
The parameter x may range from 0 to 9 and may be used to adjust the size of the first threshold. The larger the value of x, the fewer small connected regions are filtered out, that is, the more small connected regions are retained, so the contour of the first target object corresponds to the contours of more connected regions. For example, if the first target object is a signal lamp, the identified area diffuses further outward (i.e., the area where the signal lamp is located includes not only the signal lamp but also the halo generated by the signal lamp). The smaller the value of x, the more small connected regions are filtered out, that is, the fewer small connected regions are retained, so the contour of the first target object corresponds to the contours of fewer connected regions. For example, if the first target object is a signal lamp, the identified area shrinks further inward (i.e., the area where the signal lamp is located includes only the signal lamp and does not include the halo generated by the signal lamp).
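Connected-region extraction and the max/(x + 2) threshold described above can be sketched as follows, using a simple 4-connected breadth-first labeling; the embodiment does not prescribe a particular labeling algorithm:

```python
from collections import deque
import numpy as np

def connected_regions(binary):
    """4-connected component labeling on a 0/255 binary image.
    Returns a list of regions, each a list of (row, col) pixels."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    regions = []
    for y in range(h):
        for x in range(w):
            if binary[y, x] and not seen[y, x]:
                queue, pixels = deque([(y, x)]), []
                seen[y, x] = True
                while queue:
                    cy, cx = queue.popleft()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy - 1, cx), (cy + 1, cx),
                                   (cy, cx - 1), (cy, cx + 1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and binary[ny, nx] and not seen[ny, nx]):
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                regions.append(pixels)
    return regions

def keep_large_regions(binary, x=0):
    """Keep regions whose area is at least max_area / (x + 2), mirroring
    the first-threshold rule max/(x + 2) described above; x is 0..9."""
    regions = connected_regions(binary)
    if not regions:
        return []
    max_area = max(len(r) for r in regions)
    return [r for r in regions if len(r) >= max_area / (x + 2)]

# A 3x3 lamp region (area 9) plus an isolated halo pixel (area 1).
binary = np.zeros((6, 6), dtype=np.uint8)
binary[0:3, 0:3] = 255
binary[5, 5] = 255
kept_strict = keep_large_regions(binary, x=0)  # threshold 9/2 = 4.5
kept_loose = keep_large_regions(binary, x=9)   # threshold 9/11 < 1
```

With x = 0 only the large region survives; with x = 9 the threshold drops below one pixel, so both regions are retained, matching the diffuse/shrink behavior described above.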
For another example, if the first image to be processed includes an arrow-shaped indicator light, the image processing apparatus determines that the first region of the first image is as shown in (a) in fig. 8, and after performing binarization processing on the first region, as shown in (b) in fig. 8, the binary image corresponding to the first region may include four connected regions (d, e, f, and g, respectively), and as shown in (c) in fig. 8, at least one connected region having an area greater than or equal to the first threshold value in the binary image may be connected regions d and e. I.e., filtering out connected regions (f and g), leaving connected regions d and e. The outline of the connected regions d and e may be taken as the outline of the first target object.
306. Restoring the color of the at least one connected region.
In the embodiment of the present application, restoring the color of the at least one connected region may also be regarded as correcting the color of the at least one connected region. Restoring or correcting the color of the at least one connected region means restoring or correcting the color of the overexposed first target object. For example, as shown in fig. 7, the color of the connected region a, that is, the color of the overexposed first target object, is restored or corrected.
For example, when the first target object is a traffic light, the color information of the traffic light may first be acquired from a detector externally connected to the traffic light, or by recognizing the state of the light from the image.
In one possible design, the color of the at least one connected region can be restored in the HSV space. Specifically, when it is determined that the traffic signal lamp is red, the hue of the pixels of the at least one connected region is adjusted to a red range, and the saturation and the brightness of the pixels of the at least one connected region are respectively increased and decreased; when it is determined that the traffic signal lamp is yellow, the hue of the pixels of the at least one connected region is adjusted to a yellow range, and the saturation and the brightness of the pixels of the at least one connected region are respectively increased and decreased; when it is determined that the traffic signal lamp is green, the hue of the pixels of the at least one connected region is adjusted to a green range, and the saturation and the brightness of the pixels of the at least one connected region are respectively increased and decreased.
In the embodiment of the present application, increasing the saturation and decreasing the brightness of the pixels of the at least one connected region may be done linearly or non-linearly. Linearly increasing the saturation increases the saturation of each pixel in the at least one connected region by the same amount (value); non-linearly increasing the saturation increases the saturation of different pixels in the at least one connected region by different amounts (values). Linearly decreasing the brightness decreases the brightness of each pixel in the at least one connected region by the same amount (value); non-linearly decreasing the brightness decreases the brightness of different pixels in the at least one connected region by different amounts (values). It should be noted that after the saturation and the brightness of the pixels of the at least one connected region are respectively increased and decreased, the saturation and the brightness of the at least one connected region have a strong correlation; before the saturation and the brightness are respectively increased and decreased, they do not have a strong correlation.
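A hedged sketch of the HSV restoration for the red case: hue is pulled to the red range, saturation linearly increased, brightness linearly decreased. The gain and cut amounts (and the function name) are illustrative assumptions, not values given in the embodiment:

```python
import numpy as np

RED_HUE = 0.0  # red sits at 0 degrees on the hue circle (normalized hue 0.0)

def restore_red_hsv(h, s, v, s_gain=0.3, v_cut=0.2):
    """Restore an overexposed red light inside the connected region:
    set the hue to the red range, linearly raise saturation, and
    linearly lower brightness, clamping both to [0, 1]."""
    h = np.full_like(h, RED_HUE)
    s = np.clip(s + s_gain, 0.0, 1.0)  # linear saturation increase
    v = np.clip(v - v_cut, 0.0, 1.0)   # linear brightness decrease
    return h, s, v

# A washed-out pixel: wrong hue, low saturation, near-maximum brightness.
h, s, v = np.array([0.5]), np.array([0.2]), np.array([0.95])
h2, s2, v2 = restore_red_hsv(h, s, v)
```

A non-linear variant would replace the constant `s_gain`/`v_cut` with per-pixel amounts, e.g. proportional to each pixel's distance from the region mean.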
In another possible design, the color of at least one connected region may be restored in RGB space. Specifically, when the traffic signal light is red, the red component of the pixel of the at least one connected region may be adjusted to a first preset range, and the blue and green components of the pixel of the at least one connected region may be reduced.
For example, assuming that an 8-bit encoded RGB image is color-restored, if the red component (R value) of a pixel is Rx, the R value of that pixel may be adjusted to (200 + 55 × x/9 + Rx)/2 (a first preset range). The parameter x may range from 0 to 9 and may be used to adjust the resulting R value.
Reducing the blue and green components of the pixels of the at least one connected region may be a linear or a non-linear reduction. A linear reduction reduces the blue and green components of each pixel in the at least one connected region by the same amount (value); a non-linear reduction reduces the blue and green components of different pixels in the at least one connected region by different amounts (values).
When it is determined that the traffic light is yellow, the red and green components of the pixels of the at least one connected region are increased and the blue component is decreased; when it is determined that the traffic light is green, the green component of the pixels of the at least one connected region is increased and the red and blue components are decreased. For the specific process, reference may be made to the related processing described above for the case where the traffic signal light is red, which is not repeated here.
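The red-component adjustment from the 8-bit example above can be written directly; the function name is an assumption, and no clipping is needed because for Rx in 0..255 and x in 0..9 the formula already yields values in 100..255:

```python
def restore_red_component(r_value, x=9):
    """Adjust the red component of an 8-bit pixel toward the first
    preset range using the example formula (200 + 55*x/9 + Rx) / 2,
    where x in 0..9 tunes how strongly R is pushed toward 255."""
    return (200 + 55 * x / 9 + r_value) / 2

# With x = 9 a fully saturated red stays at 255; with x = 0 even a
# zero red component is lifted to 100.
full_red = restore_red_component(255, x=9)
dark_red = restore_red_component(0, x=0)
```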
307. The first image and the third image are merged.
If the first image was converted from the first space to the HSV space in step 302, the color-restored first image may be converted from the HSV space back to the first space, so that the first image and the third image (i.e., the portion remaining after the first image was cut from the second image) can be merged to obtain a complete image (which may be regarded as the color-restored second image). If the operation of converting the first image from the first space to the HSV space was not performed in step 302, the color-restored first image and the third image can be merged directly to obtain a complete image.
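Merging the color-restored first image back into the third image at the coordinates of the second area can be sketched as follows (rectangular coordinates and NumPy arrays are assumptions, matching the cropping sketch rather than any mandated representation):

```python
import numpy as np

def merge_back(third_image, first_image, top, left):
    """Paste the color-restored first image back into the third image at
    the coordinates of the original second area, yielding the complete
    (color-restored) second image."""
    merged = third_image.copy()
    h, w = first_image.shape[:2]
    merged[top:top + h, left:left + w] = first_image
    return merged

third = np.zeros((4, 4), dtype=np.uint8)
first = np.full((2, 2), 7, dtype=np.uint8)  # stand-in for the restored region
merged = merge_back(third, first, top=1, left=1)
```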
In a possible design, in the process of performing color restoration on an overexposed first target object in a video, a user may select the overexposed object before the video starts or when the video is paused, and in the process of playing a subsequent video, the method of the embodiment of the present application may be adopted to perform corresponding processing on each frame of image of the video, so as to restore the color of the overexposed first target object in the entire video.
Based on the method provided by the embodiment of the application, after a first image to be processed is obtained, a first area of the first image (the first area can be used as an area where a first target object subjected to overexposure is located) can be determined according to the saturation and brightness of pixels of the first image, and then binarization processing is performed on the first area to obtain a binary image corresponding to the first area; then, at least one connected region with an area larger than or equal to the first threshold in the binary image is determined (the contour of the at least one connected region can be used as the contour of the first target object), and then, the color of the at least one connected region is restored (i.e., the color of the overexposed first target object is restored). Therefore, the method provided by the embodiment of the application can accurately restore the color of the overexposed object (such as a traffic light) in the image. Furthermore, a series of problems (for example, the problem of difficult forensics caused by signal lamp overexposure in the current electric police monitoring scene) caused by overexposure of objects in the image can be solved.
In the prior art, a camera with an ultra-wide dynamic range can be used to eliminate the color distortion of a signal lamp caused by strong light, but such a camera is expensive and has a limited color restoration effect on an overexposed signal lamp under low illumination. The method provided by the embodiment of the present application requires no additional equipment, is inexpensive, achieves the color restoration effect, has strong environmental adaptability, and can restore the color of an overexposed signal lamp under low illumination. In the prior art, color restoration of a signal lamp may also be performed based on the RGB space of an image, or the region of the signal lamp in the image may be identified and color enhancement performed based on a deep learning method, but a gradient phenomenon may occur in which the color-restored signal lamp region is discontinuous with the surrounding image. The method provided by the embodiment of the present application is performed based on the HSV space, which better reflects the brightness information of the physical signal lamp, so that the identification of the signal lamp region is more stable and accurate, the color-restored signal lamp region is more continuous with the surrounding image, and the effect to the naked eye is more realistic. In addition, the method provided by the embodiment of the present application is efficient and reliable, and can achieve the color restoration effect for signal lamps in video.
The scheme provided by the embodiment of the application is mainly described from the perspective of the image processing device. It is to be understood that the image processing apparatus includes, in order to implement the above-described functions, a corresponding hardware structure and/or software modules that perform the respective functions. Those skilled in the art will readily appreciate that the algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or a combination of hardware and software. Whether a function is performed as hardware or software-driven hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
The embodiment of the present application may perform division of functional modules on the image processing apparatus according to the above method example, for example, each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module. The integrated module can be realized in a hardware mode, and can also be realized in a software functional module mode. It should be noted that, in the embodiment of the present application, the division of the module is schematic, and is only one logic function division, and there may be another division manner in actual implementation.
In the case of dividing the respective functional modules by corresponding respective functions, fig. 9 shows a schematic diagram of a possible configuration of the image processing apparatus 9 involved in the above-described embodiment, the image processing apparatus including: an acquisition unit 901, a determination unit 902 and a processing unit 903. The acquisition unit 901 is used to support the image processing apparatus to execute the process 301 in fig. 3. The determination unit 902 is used to support the image processing apparatus to perform the processes 303 and 305 in fig. 3. The processing unit 903 is used to support the image processing apparatus to execute 302, 304, and 306 in fig. 3. All relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The steps of a method or algorithm described in connection with the disclosure herein may be embodied in hardware or in software instructions executed by a processor. The software instructions may consist of corresponding software modules that may be stored in RAM, flash memory, ROM, EPROM, EEPROM, registers, a hard disk, a removable hard disk, a compact disk, or any other form of storage medium known in the art. An exemplary storage medium is coupled to the processor such that the processor can read information from, and write information to, the storage medium. Of course, the storage medium may also be integral to the processor. The processor and the storage medium may reside in an Application Specific Integrated Circuit (ASIC). Additionally, the ASIC may reside in a core network interface device. Of course, the processor and the storage medium may also reside as discrete components in a core network interface device.
Those skilled in the art will recognize that in one or more of the examples described above, the functions described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on an image processing device-readable medium. Image processing device readable media includes image processing device storage media and communication media including any medium that facilitates transfer of an image processing device program from one place to another. A storage media may be any available media that can be accessed by a general purpose or special purpose image processing device.
The above-mentioned embodiments, objects, technical solutions and advantages of the present application are further described in detail, it should be understood that the above-mentioned embodiments are only examples of the present application, and are not intended to limit the scope of the present application, and any modifications, equivalent substitutions, improvements and the like made on the basis of the technical solutions of the present application should be included in the scope of the present application.

Claims (13)

1. A method of color restoration of an image, comprising:
acquiring a first image to be processed, wherein the first image comprises an overexposed first target object;
determining a first region of the first image from a saturation and a brightness of pixels of the first image, the saturation of pixels of the first region being lower than an average of the saturation of pixels of the first image and the brightness of pixels of the first region being higher than an average of the brightness of pixels of the first image; the first region corresponds to a region of the first target object;
carrying out binarization processing on the first area to obtain a binary image corresponding to the first area;
determining at least one connected region with an area larger than or equal to a first threshold value in the binary image; the contour of the at least one connected region corresponds to the contour of the first target object;
restoring the color of the at least one connected region.
2. A method of color reducing an image as defined in claim 1, wherein prior to determining the first region of the first image as a function of saturation and brightness of the first image, the method further comprises:
and converting the first image from a first space into a hue saturation value HSV space, wherein the first space is any one of a luminance chrominance YUV space, a red green blue RGB space or a hue saturation luminance HSL space.
3. The method for color restoration of an image according to claim 1 or 2, wherein when the first target object is a traffic light, the restoring the color of the at least one connected region comprises:
acquiring color information of the traffic signal lamp;
when the traffic signal lamp is red, adjusting the hue of the pixels of the at least one connected region to a red range, and respectively increasing and decreasing the saturation and the brightness of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjusting the hue of the pixels of the at least one connected region to a yellow range, and respectively increasing and decreasing the saturation and the brightness of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjusting the hue of the pixels of the at least one connected region to a green range, and respectively increasing and decreasing the saturation and the brightness of the pixels of the at least one connected region.
4. The method for color restoration of an image according to claim 1 or 2, wherein when the first target object is a traffic light, the restoring the color of the at least one connected region comprises:
converting the at least one connected region to RGB space;
acquiring color information of the traffic signal lamp;
when the traffic signal lamp is red, adjusting the red component of the pixel of the at least one connected region to a first preset range, and reducing the blue and green components of the pixel of the at least one connected region;
when the traffic signal lamp is yellow, adjusting the red and green components of the pixels of the at least one connected region to a second preset range, and reducing the blue component of the pixels of the at least one connected region;
and when the traffic signal lamp is green, adjusting the green component of the pixel of the at least one connected region to a third preset range, and reducing the red and blue components of the pixel of the at least one connected region.
5. A method of color reducing an image according to any of claims 1-4, wherein before determining the first region of the first image in dependence on the saturation and brightness of the first image, the method further comprises:
performing a filtering process on a saturation component and a brightness component of a pixel of the first image.
6. The method for color restoration of an image according to any one of claims 1-5, wherein the obtaining the first image to be processed comprises:
and taking the second area selected by the user on the second image as the first image to be processed.
7. An image processing apparatus characterized by comprising:
the device comprises an acquisition unit, a processing unit and a processing unit, wherein the acquisition unit is used for acquiring a first image to be processed, and the first image comprises an overexposed first target object;
a determination unit configured to determine a first region of the first image according to a saturation and a brightness of a pixel of the first image, the saturation of the pixel of the first region being lower than an average of the saturations of the pixel of the first image, and the brightness of the pixel of the first region being higher than an average of the brightness of the pixel of the first image; the first region corresponds to a region of the first target object;
the processing unit is used for carrying out binarization processing on the first area to obtain a binary image corresponding to the first area;
the determining unit is further used for determining at least one connected region with the area larger than or equal to a first threshold value in the binary image; the contour of the at least one connected region corresponds to the contour of the first target object;
the processing unit is further used for restoring the color of the at least one connected region.
8. The image processing apparatus of claim 7, wherein the processing unit is further configured to:
and converting the first image from a first space into a hue saturation value HSV space, wherein the first space is any one of a luminance chrominance YUV space, a red green blue RGB space or a hue saturation luminance HSL space.
9. The image processing apparatus according to claim 7 or 8, wherein when the first target object is a traffic signal lamp, the processing unit is configured to:
acquire color information of the traffic signal lamp through the acquisition unit;
when the traffic signal lamp is red, adjust the hue of the pixels of the at least one connected region into the red range, and increase the saturation and decrease the brightness of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjust the hue of the pixels of the at least one connected region into the yellow range, and increase the saturation and decrease the brightness of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjust the hue of the pixels of the at least one connected region into the green range, and increase the saturation and decrease the brightness of the pixels of the at least one connected region.
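The HSV-domain adjustment of claim 9 can be sketched as follows. The hue targets (degrees), the gain factors, and forcing the hue to a single value rather than clamping it into a range are all illustrative assumptions, not values from the patent.

```python
import numpy as np

# Assumed hue targets in degrees for the three lamp colors.
HUE_TARGETS = {"red": 0.0, "yellow": 60.0, "green": 120.0}

def restore_region_hsv(h, s, v, light_color, sat_gain=1.3, val_gain=0.8):
    """Pull the region's hue to the target color, raise saturation,
    and lower brightness (an overexposed lamp is bright and washed out)."""
    h = np.full_like(h, HUE_TARGETS[light_color])  # hue into the target range
    s = np.clip(s * sat_gain, 0.0, 1.0)            # increase saturation
    v = np.clip(v * val_gain, 0.0, 1.0)            # decrease brightness
    return h, s, v
```

The function would be applied only to the pixels of the connected region(s) found in claim 7, not to the whole image.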
10. The image processing apparatus according to claim 7 or 8, wherein when the first target object is a traffic signal lamp, the processing unit is configured to:
convert the at least one connected region into RGB space;
acquire color information of the traffic signal lamp through the acquisition unit;
when the traffic signal lamp is red, adjust the red component of the pixels of the at least one connected region into a first preset range, and reduce the blue and green components of the pixels of the at least one connected region;
when the traffic signal lamp is yellow, adjust the red and green components of the pixels of the at least one connected region into a second preset range, and reduce the blue component of the pixels of the at least one connected region;
when the traffic signal lamp is green, adjust the green component of the pixels of the at least one connected region into a third preset range, and reduce the red and blue components of the pixels of the at least one connected region.
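The RGB-domain variant of claim 10 can be sketched as below. The preset ranges (here 200-255) and the 0.5 reduction factor are illustrative assumptions; the patent leaves the actual ranges unspecified.

```python
import numpy as np

def restore_region_rgb(rgb, light_color):
    """Push the lamp-colored channel(s) into an assumed preset range and
    attenuate the remaining channels, per the claim-10 case analysis."""
    out = rgb.astype(float).copy()
    r, g, b = out[..., 0], out[..., 1], out[..., 2]
    if light_color == "red":
        out[..., 0] = np.clip(r, 200, 255)   # red component into preset range
        out[..., 1] *= 0.5                   # reduce green
        out[..., 2] *= 0.5                   # reduce blue
    elif light_color == "yellow":
        out[..., 0] = np.clip(r, 200, 255)   # red and green into preset range
        out[..., 1] = np.clip(g, 200, 255)
        out[..., 2] *= 0.5                   # reduce blue
    elif light_color == "green":
        out[..., 1] = np.clip(g, 200, 255)   # green component into preset range
        out[..., 0] *= 0.5                   # reduce red
        out[..., 2] *= 0.5                   # reduce blue
    return np.clip(out, 0, 255).astype(np.uint8)
```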
11. The image processing apparatus according to any one of claims 7 to 10, wherein the processing unit is further configured to:
filter the saturation component and the brightness component of the pixels of the first image.
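Claim 11 does not specify the filter; one simple choice is a 3x3 mean (box) filter applied to the saturation plane and the brightness plane separately, sketched here with the kernel size and edge-replication padding as assumptions.

```python
import numpy as np

def box_filter3(plane):
    """3x3 mean filter with edge replication, applied to a single
    channel (e.g. the S or V plane) to suppress pixel-level noise."""
    padded = np.pad(plane.astype(float), 1, mode="edge")
    out = np.zeros_like(plane, dtype=float)
    for dy in (0, 1, 2):                     # accumulate the 9 shifted copies
        for dx in (0, 1, 2):
            out += padded[dy:dy + plane.shape[0], dx:dx + plane.shape[1]]
    return out / 9.0
```

Smoothing S and V before the mean-based thresholding of claim 7 makes the candidate mask less sensitive to isolated noisy pixels.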
12. The apparatus according to any one of claims 7 to 11, wherein the acquisition unit is configured to:
take the second region selected by the user on the second image as the first image to be processed.
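The acquisition step of claim 12 amounts to cropping the user-selected rectangle out of the larger image; the (x, y, w, h) rectangle convention below is an assumption.

```python
import numpy as np

def crop_selection(second_image, x, y, w, h):
    """Claim-12 sketch: the user-selected rectangle on the second image
    becomes the first image handed to the color-restoration pipeline."""
    return second_image[y:y + h, x:x + w]
```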
13. A computer-readable storage medium comprising instructions that, when executed on a computer, cause the computer to perform the method of color-restoring an image of any of claims 1 to 6.
CN201811517272.3A 2018-12-12 2018-12-12 Method and device for carrying out color restoration on image Pending CN111311500A (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811517272.3A CN111311500A (en) 2018-12-12 2018-12-12 Method and device for carrying out color restoration on image
PCT/CN2019/121126 WO2020119454A1 (en) 2018-12-12 2019-11-27 Method and apparatus for color reproduction of image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811517272.3A CN111311500A (en) 2018-12-12 2018-12-12 Method and device for carrying out color restoration on image

Publications (1)

Publication Number Publication Date
CN111311500A true CN111311500A (en) 2020-06-19

Family

ID=71076231

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811517272.3A Pending CN111311500A (en) 2018-12-12 2018-12-12 Method and device for carrying out color restoration on image

Country Status (2)

Country Link
CN (1) CN111311500A (en)
WO (1) WO2020119454A1 (en)


Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116883542A (en) * 2022-11-22 2023-10-13 广州开得联软件技术有限公司 Image processing method, device, electronic equipment and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8867830B2 (en) * 2011-12-06 2014-10-21 Michael Donvig Joensson Image processing method for recovering details in overexposed digital video footage or digital still images
JP2016001782A (en) * 2014-06-11 2016-01-07 キヤノン株式会社 Image processing system, imaging device, image processing method, program, and storage medium
CN104301621B (en) * 2014-09-28 2017-09-05 北京凌云光技术有限责任公司 image processing method, device and terminal
CN106507079B (en) * 2016-11-03 2019-08-27 浙江宇视科技有限公司 A kind of color rendition method and device
CN106504217B (en) * 2016-11-29 2019-03-15 Oppo广东移动通信有限公司 Image processing method, image processing apparatus, imaging device and electronic device

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112200747A (en) * 2020-10-16 2021-01-08 展讯通信(上海)有限公司 Image processing method and device and computer readable storage medium
CN112200747B (en) * 2020-10-16 2022-06-21 展讯通信(上海)有限公司 Image processing method and device and computer readable storage medium
CN113436126A (en) * 2021-07-13 2021-09-24 上海艾为电子技术股份有限公司 Image saturation enhancement method and system and electronic equipment
CN113436126B (en) * 2021-07-13 2022-06-10 上海艾为电子技术股份有限公司 Image saturation enhancement method and system and electronic equipment
CN114783192A (en) * 2022-03-24 2022-07-22 杭州海康威视数字技术股份有限公司 Signal lamp color processing method and device

Also Published As

Publication number Publication date
WO2020119454A1 (en) 2020-06-18

Similar Documents

Publication Publication Date Title
CN110544258B (en) Image segmentation method and device, electronic equipment and storage medium
CN111311500A (en) Method and device for carrying out color restoration on image
WO2017121018A1 (en) Method and apparatus for processing two-dimensional code image, and terminal and storage medium
CN103646392B (en) Backlighting detecting and equipment
CN109274985B (en) Video transcoding method and device, computer equipment and storage medium
WO2018149253A1 (en) Image processing method and device
JP6553624B2 (en) Measurement equipment and system
CN108876742B (en) Image color enhancement method and device
US20180130188A1 (en) Image highlight detection and rendering
US10438376B2 (en) Image processing apparatus replacing color of portion in image into single color, image processing method, and storage medium
CN108230256A (en) Image processing method, device, computer installation and computer readable storage medium
CN111368819B (en) Light spot detection method and device
CN108806638B (en) Image display method and device
CN111712021A (en) Intelligent adjusting method, device and system for lamplight of art gallery
CN111626967A (en) Image enhancement method, image enhancement device, computer device and readable storage medium
KR102311367B1 (en) Image processing apparatus, image processing method, and storage medium
US20230049656A1 (en) Method of processing image, electronic device, and medium
CN111523551A (en) Binarization method, device and equipment for blue object
US20130155254A1 (en) Imaging apparatus, image processing apparatus, and image processing method
CN110880164A (en) Image processing method, device and equipment and computer storage medium
CN110310341B (en) Method, device, equipment and storage medium for generating default parameters in color algorithm
WO2015189369A1 (en) Methods and systems for color processing of digital images
CN112215237B (en) Image processing method and device, electronic equipment and computer readable storage medium
EP3038059A1 (en) Methods and systems for color processing of digital images
CN116612146B (en) Image processing method, device, electronic equipment and computer storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination